WorldWideScience

Sample records for analysis geographic information

  1. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is ...... airborne data. The simulation study shows that canonical information analysis is as accurate as and much faster than algorithms presented in previous work, especially for large sample sizes. URL: http://www.imm.dtu.dk/pubdb/p.php?6270...
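
    A toy rendition may help readers see the idea: replace Pearson correlation with an estimate of mutual information and maximize it over projection directions. The sketch below is a rough illustration under assumptions of my own (a histogram MI estimator, bivariate data, brute-force angle search), not the authors' algorithm, whose efficiency is precisely the point of the paper.

        import numpy as np

        def mutual_info(x, y, bins=16):
            # Histogram (plug-in) estimate of mutual information I(X;Y) in nats.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        def cia_1d(X, Y, n_angles=90):
            # Brute-force search over unit projection directions for two (n, 2)
            # data sets; returns the best MI and the two directions.
            best = (-np.inf, None, None)
            angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
            for a in angles:
                u = np.array([np.cos(a), np.sin(a)])
                for b in angles:
                    v = np.array([np.cos(b), np.sin(b)])
                    mi = mutual_info(X @ u, Y @ v)
                    if mi > best[0]:
                        best = (mi, u, v)
            return best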

  2. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  3. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine if those threats pose a real risk. It is suitable for industry and academia professionals.

  4. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire...... information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...

  5. Epistasis analysis using information theory.

    Science.gov (United States)

    Moore, Jason H; Hu, Ting

    2015-01-01

    Here we introduce entropy-based measures derived from information theory for detecting and characterizing epistasis in genetic association studies. We provide a general overview of the methods and highlight some of the modifications that have greatly improved their power for genetic analysis. We end with a few published studies of complex human diseases that have used these measures.
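
    As a concrete illustration of the entropy-based approach (a minimal sketch, not the authors' software): for two SNPs A and B and a phenotype C, epistasis is often quantified by the interaction information IG(A;B;C) = I(A,B;C) - I(A;C) - I(B;C), with positive values suggesting synergy between the loci beyond their individual effects. The genotype coding and toy data below are assumptions.

        import math
        from collections import Counter

        def entropy(labels):
            # Shannon entropy (bits) of a sequence of discrete labels.
            n = len(labels)
            return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

        def mutual_info(x, y):
            # I(X;Y) = H(X) + H(Y) - H(X,Y)
            return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

        def interaction_information(a, b, c):
            # IG(A;B;C) = I(A,B;C) - I(A;C) - I(B;C); positive => synergy.
            ab = list(zip(a, b))
            return mutual_info(ab, c) - mutual_info(a, c) - mutual_info(b, c)

        a = [0, 1, 2, 0, 1, 2, 0, 1]   # genotypes at SNP A (hypothetical)
        b = [0, 0, 1, 1, 2, 2, 0, 1]   # genotypes at SNP B (hypothetical)
        c = [0, 0, 1, 0, 1, 1, 0, 0]   # binary phenotype (hypothetical)
        print(interaction_information(a, b, c))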

  6. Information Analysis of DNA Sequences

    CERN Document Server

    Mohammed, Riyazuddin

    2010-01-01

    The problem of differentiating the informational content of coding (exons) and non-coding (introns) regions of a DNA sequence is one of the central problems of genomics. The introns are estimated to be nearly 95% of the DNA and since they do not seem to participate in the process of transcription of amino-acids, they have been termed "junk DNA." Although it is believed that the non-coding regions in genomes have no role in cell growth and evolution, demonstration that these regions carry useful information would tend to falsify this belief. In this paper, we consider entropy as a measure of information by modifying the entropy expression to take into account the varying length of these sequences. Exons are usually much shorter in length than introns; therefore the comparison of the entropy values needs to be normalized. A length correction strategy was employed using randomly generated nucleotide base strings, built out of the same alphabet and of the same length as the exons in question. Our analysis shows that intron...
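
    The length-correction strategy lends itself to a short sketch (an illustration with invented function names and parameters, not the paper's code): divide the entropy of a sequence by the mean entropy of random strings of the same length over the same alphabet, making short exons and long introns comparable.

        import math, random
        from collections import Counter

        def shannon_entropy(seq):
            # First-order Shannon entropy of a string, in bits per symbol.
            n = len(seq)
            return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

        def normalized_entropy(seq, alphabet="ACGT", trials=200, seed=0):
            # Entropy of seq divided by the mean entropy of random strings of
            # the same length: a simple length correction for comparing short
            # exons with long introns.
            rng = random.Random(seed)
            baseline = sum(
                shannon_entropy("".join(rng.choices(alphabet, k=len(seq))))
                for _ in range(trials)) / trials
            return shannon_entropy(seq) / baseline

        print(normalized_entropy("ATGGCGTACGTAGCTAGC"))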

  7. [Information analysis of spinal ganglia].

    Science.gov (United States)

    Lobko, P I; Kovaleva, D V; Kovalchuk, I E; Pivchenko, P G; Rudenok, V V; Davydova, L A

    2000-01-01

    Information parameters (entropy and redundancy) of cervical and thoracic spinal ganglia of albino rat foetuses, mature animals (cat and dog) and human subjects were analysed. Information characteristics of spinal ganglia were shown to be level-specific and to depend on their functional peculiarities. Information parameters of the thoracic spinal ganglia of man and different animals are species-specific and may be used in the assessment of morphological structures as information systems.

  8. INFORMATION SYSTEM OF THE FINANCIAL ANALYSIS

    OpenAIRE

    MIRELA MONEA

    2013-01-01

    Financial analysis provides the information necessary for decision making, and also helps both the external and internal users of these. The results of the financial analysis work are dependent on the quality, accuracy, relevance and effectiveness of the information collected, and processed. Essential sources of information for financial analysis are financial statements, which are considered the raw material of financial analysis. One of the financial statements -the balance sheet - provi...

  9. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  10. Textual Analysis of Intangible Information

    NARCIS (Netherlands)

    A.J. Moniz (Andy)

    2016-01-01

    Traditionally, equity investors have relied upon the information reported in firms’ financial accounts to make their investment decisions. Due to the conservative nature of accounting standards, firms cannot value their intangible assets such as corporate culture, brand value and rep

  11. Generalized Full-Information Item Bifactor Analysis

    Science.gov (United States)

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single-group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of…

  12. Shape design sensitivity analysis using domain information

    Science.gov (United States)

    Seong, Hwal-Gyeong; Choi, Kyung K.

    1985-01-01

    A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.

  13. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit involving the integration of the accounting field into the information systems field. Technological advancements can provide improvements to the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a high-importance private institution of higher education. The applied method is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative ones with the matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures to work in favor of the attainment of the objectives of the organization.

  14. ANALYSIS APPROACHES TO EVALUATION OF INFORMATION PROTECTION

    Directory of Open Access Journals (Sweden)

    Zyuzin A. S.

    2015-03-01

    The article is devoted to the topical problem of information systems’ security assessment and the importance of obtaining objective quantitative assessment results. The author proposes creating a complex information security system with a system approach, to be used at each stage of an information system’s life cycle. On the basis of this approach the author formulates a general scheme for the information security assessment of an information system, as well as principles for choosing the assessment method. The existing methods of quantitative assessment based on object-oriented methods of system analysis are considered, along with the objectivity of the estimates obtained with this approach. On the basis of this analysis, serious shortcomings of the modern techniques in use for information systems’ security assessment are identified, and the necessity of creating a scientific and methodical apparatus that increases the objectivity and completeness of information security assessment by formalizing expert data is formulated. The applicability of this approach for rapidly obtaining a quantitative information security assessment under changing security threat dynamics and during the functioning and development of an information system is considered. The problem of automated information systems’ security assessment is defined, and a general technique for information protection means in systems of this type is formulated

  15. INFORMATION SYSTEM OF THE FINANCIAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    MIRELA MONEA

    2013-12-01

    Financial analysis provides the information necessary for decision making, and also helps both the external and internal users of it. The results of the financial analysis work are dependent on the quality, accuracy, relevance and effectiveness of the information collected and processed. Essential sources of information for financial analysis are financial statements, which are considered the raw material of financial analysis. One of the financial statements - the balance sheet - provides information about assets, liabilities, equity, liquidity, solvency, risk, financial flexibility. The profit and loss account is a synthesis accounting document, part of the financial statements, reporting enterprise financial performance during a specified accounting period; it summarizes all revenues earned and all expenses of the accounting period and reports the result.

  16. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can...... illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other...... approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies....

  17. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  18. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show...

  19. Value of Information Analysis in Structural Safety

    DEFF Research Database (Denmark)

    Konakli, Katerina; Faber, Michael Havbro

    2014-01-01

    Pre-posterior analysis can be used to assess the potential of an experiment to enhance decision making by providing information on parameters characterized by uncertainty. The present paper describes a framework for pre-posterior analysis for support of decisions related to maintenance of structural systems. In this context, experiments may refer to inspections or techniques of structural health monitoring. The Value of Information concept provides a powerful tool for determining whether the experimental cost is justified by the expected benefit and for identifying the optimal among different...

  20. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    We generalize the point information gain (PIG) and derived quantities, i.e., point information gain entropy (PIE) and point information gain entropy density (PIED), for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for real data with examples from several images and discuss further possible utilizations in other fields of data processing.
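
    A minimal sketch of the quantities involved, with the binning, the choice alpha = 2, and the aggregation step all being assumptions rather than the authors' exact definitions: the Rényi entropy of a histogram is H_a = ln(sum_i p_i^a) / (1 - a), and the point information gain of a data point is the change in H_a when that single point is discounted from the histogram; PIE then aggregates the per-point gains.

        import numpy as np

        def renyi_entropy(counts, alpha=2.0):
            # Renyi entropy (nats) of a histogram: ln(sum p^alpha) / (1 - alpha).
            p = counts / counts.sum()
            p = p[p > 0]
            return float(np.log((p ** alpha).sum()) / (1.0 - alpha))

        def point_information_gain(data, bins=32, alpha=2.0):
            # PIG of each point: entropy of the histogram without that point
            # minus entropy with it (sign conventions differ across papers).
            counts, edges = np.histogram(data, bins=bins)
            counts = counts.astype(float)
            h_all = renyi_entropy(counts, alpha)
            idx = np.digitize(data, edges[1:-1])      # bin index of each point
            pig = np.empty(len(data))
            for i, b in enumerate(idx):
                c = counts.copy()
                c[b] -= 1.0
                pig[i] = renyi_entropy(c, alpha) - h_all
            return pig

        data = np.random.default_rng(0).normal(size=1000)
        pig = point_information_gain(data)
        print(pig.sum())   # one plausible aggregate; the paper's PIE may differ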

  1. Information theory applications for biological sequence analysis.

    Science.gov (United States)

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.

  2. Enhancing genomics information retrieval through dimensional analysis.

    Science.gov (United States)

    Hu, Qinmin; Huang, Jimmy Xiangji

    2013-06-01

    We propose a novel dimensional analysis approach to employing meta information in order to find the relationships within the unstructured or semi-structured document/passages for improving genomics information retrieval performance. First, we make use of the auxiliary information as three basic dimensions, namely "temporal", "journal", and "author". The reference section is treated as a commensurable quantity of the three basic dimensions. Then, the sample space and subspaces are built up and a set of events are defined to meet the basic requirement of dimensional homogeneity to be commensurable quantities. After that, the classic graph analysis algorithm in the Web environments is applied on each dimension respectively to calculate the importance of each dimension. Finally, we integrate all the dimension networks and re-rank the outputs for evaluation. Our experimental results show the proposed approach is superior and promising.

  3. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than that of betweenness when a large amount of noisy data is added to an interactome. Finally, we
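
    The circuit analogy admits a compact sketch (a toy version, not the authors' pipeline; the network and the scoring convention are assumptions): with interaction confidences as conductances, node potentials follow from the graph Laplacian, and each protein is scored by the current passing through it for a chosen source/sink pair, typically aggregated over many pairs.

        import numpy as np

        def information_flow_scores(W, source, sink, current=1.0):
            # W[i, j]: confidence of interaction i-j, used as an electrical
            # conductance. Inject unit current at `source`, extract at `sink`,
            # solve the Laplacian system L v = i for node potentials, and
            # score each node by the current flowing through it.
            n = W.shape[0]
            L = np.diag(W.sum(axis=1)) - W
            i = np.zeros(n)
            i[source], i[sink] = current, -current
            v = np.linalg.pinv(L) @ i                 # pinv: L is singular
            edge_current = W * np.abs(v[:, None] - v[None, :])
            return edge_current.sum(axis=1) / 2.0     # half in, half out (interior nodes)

        # Toy 4-protein network (confidences are invented).
        W = np.array([[0.0, 0.9, 0.1, 0.0],
                      [0.9, 0.0, 0.8, 0.2],
                      [0.1, 0.8, 0.0, 0.7],
                      [0.0, 0.2, 0.7, 0.0]])
        print(information_flow_scores(W, source=0, sink=3))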

  4. Information analysis for modeling and representation of meaning

    OpenAIRE

    Uda, Norihiko

    1994-01-01

    In this dissertation, information analysis and an information model called the Semantic Structure Model based on information analysis are explained for semantic processing. Methods for the self-organization of information are also described. In addition, Information-Base Systems for thinking support of research and development in nonlinear optical materials are explained. As a result of information analysis, general properties of information and structural properties of concepts become clear. Ge...

  5. Improving information retrieval in functional analysis.

    Science.gov (United States)

    Rodriguez, Juan C; González, Germán A; Fresno, Cristóbal; Llera, Andrea S; Fernández, Elmer A

    2016-12-01

    Transcriptome analysis is essential to understand the mechanisms regulating key biological processes and functions. The first step usually consists of identifying candidate genes; to find out which pathways are affected by those genes, however, functional analysis (FA) is mandatory. The most frequently used strategies for this purpose are Gene Set and Singular Enrichment Analysis (GSEA and SEA) over Gene Ontology. Several statistical methods have been developed and compared in terms of computational efficiency and/or statistical appropriateness. However, whether their results are similar or complementary, the sensitivity to parameter settings, or possible bias in the analyzed terms has not been addressed so far. Here, two GSEA and four SEA methods and their parameter combinations were evaluated in six datasets by comparing two breast cancer subtypes with well-known differences in genetic background and patient outcomes. We show that GSEA and SEA lead to different results depending on the chosen statistic, model and/or parameters. Both approaches provide complementary results from a biological perspective. Hence, an Integrative Functional Analysis (IFA) tool is proposed to improve information retrieval in FA. It provides a common gene expression analytic framework that grants a comprehensive and coherent analysis. Only a minimal user parameter setting is required, since the best SEA/GSEA alternatives are integrated. IFA utility was demonstrated by evaluating four prostate cancer and the TCGA breast cancer microarray datasets, which showed its biological generalization capabilities.

  6. Exploiting salient semantic analysis for information retrieval

    Science.gov (United States)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representation for words or documents. However, its feasibility and effectiveness in information retrieval is mostly unknown. In this paper, we study how to efficiently use SSA to improve the information retrieval performance, and propose a SSA-based retrieval method under the language model framework. First, SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard text retrieval conference (TREC) collections. Experiment results on standard TREC collections show the proposed models consistently outperform the existing Wikipedia-based retrieval methods.

  7. Information-Theoretic Analysis for the Difficulty of Extracting Hidden Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wei-ming; LI Shi-qu; CAO Jia; LIU Jiu-fen

    2005-01-01

    The difficulty of extracting hidden information, which is essentially a kind of secrecy, is analyzed by an information-theoretic method. The relations between key rate, message rate, hiding capacity and the difficulty of extraction are studied in terms of the unicity distance of the stego-key, and the theoretical conclusion is used to analyze actual extraction attacks on Least Significant Bit (LSB) steganographic algorithms.
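
    The central quantity here is classical: Shannon's unicity distance n0 ≈ H(K)/D, the amount of redundant material an attacker must observe before the key is information-theoretically determined; the paper adapts this notion to stego-keys. A back-of-the-envelope sketch with illustrative numbers only:

        import math

        def unicity_distance(key_entropy_bits, redundancy_bits_per_symbol):
            # Symbols an attacker must observe before the key is, in the
            # information-theoretic sense, uniquely determined.
            return key_entropy_bits / redundancy_bits_per_symbol

        # Illustrative numbers: a 56-bit stego-key and English-like cover text
        # with redundancy D = log2(26) - 1.5 bits per letter.
        D = math.log2(26) - 1.5
        print(unicity_distance(56, D))   # about 17.5 letters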

  8. 78 FR 38096 - Fatality Analysis Reporting System Information Collection

    Science.gov (United States)

    2013-06-25

    ... National Highway Traffic Safety Administration Fatality Analysis Reporting System Information Collection... Reporting System (FARS) is a major system that acquires national fatality information directly from existing...: Request for public comment on proposed collection of information. SUMMARY: Before a Federal agency...

  9. Formal Concept Analysis for Information Retrieval

    CERN Document Server

    Qadi, Abderrahim El; Ennouary, Yassine

    2010-01-01

    In this paper we describe a mechanism to improve Information Retrieval (IR) on the web. The method is based on Formal Concept Analysis (FCA), which establishes semantic relations during the queries and allows the answers provided by a search engine to be reorganized into a lattice of concepts. For IR we propose an incremental algorithm based on the Galois lattice. This algorithm allows a formal clustering of the data sources, and the results it returns are ranked by relevance. The control of relevance is exploited in the clustering; we improved the results by using an ontology in the field of image processing and by reformulating the user queries, which makes it possible to return more relevant documents.
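
    The Galois connection at the heart of FCA is easy to state: for a set of objects A, A' is the set of attributes shared by all of them; for a set of attributes B, B' is the set of objects having all of them; formal concepts are the pairs (A, B) with A' = B and B' = A. The brute-force enumeration below is a didactic sketch over an invented document-term context, not the incremental algorithm proposed in the paper.

        from itertools import chain, combinations

        def intent(objs, I):
            # Attributes shared by every object in objs (the "prime" operator).
            return {a for a in range(len(I[0])) if all(I[o][a] for o in objs)}

        def extent(attrs, I):
            # Objects possessing every attribute in attrs.
            return {o for o in range(len(I)) if all(I[o][a] for a in attrs)}

        def concepts(I):
            # Enumerate all formal concepts by closing every attribute subset
            # (exponential; fine for a toy context).
            seen = set()
            for attrs in chain.from_iterable(
                    combinations(range(len(I[0])), k) for k in range(len(I[0]) + 1)):
                e = frozenset(extent(set(attrs), I))
                if e not in seen:
                    seen.add(e)
                    yield sorted(e), sorted(intent(e, I))

        # Toy context: rows = documents, columns = index terms (invented).
        I = [[1, 1, 0],
             [1, 0, 1],
             [1, 1, 1]]
        for extent_objs, intent_attrs in concepts(I):
            print(extent_objs, intent_attrs)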

  10. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  11. Astrophysical data analysis with information field theory

    CERN Document Server

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  12. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography as well as spatial analysis. In the past two decades, much effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are set out: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  13. Comprehensive analysis of information dissemination in disasters

    Science.gov (United States)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided if an effective pre-warning system had been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) in disasters in urban areas, plays a critical role in increasing response time and reducing the number of deaths and economic losses. In this paper we have developed a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. This model can also be used for disseminating information to evacuees making real-time evacuation plans. We analyzed individual information dissemination models for pre-warning in disasters by considering 14 media: short message service (SMS), phone, television, radio, news portals, Wechat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  14. Modeling and Analysis of Information Product Maps

    Science.gov (United States)

    Heien, Christopher Harris

    2012-01-01

    Information Product Maps are visual diagrams used to represent the inputs, processing, and outputs of data within an Information Manufacturing System. A data unit, drawn as an edge, symbolizes a grouping of raw data as it travels through this system. Processes, drawn as vertices, transform each data unit input into various forms prior to delivery…

  15. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

    The article deals with data analysis in information systems. The author discloses the possibility of using Galois compliance to identify the characteristics of the information system structure, and reveals the specificity of applying Galois compliance to the analysis of information system content with the use of invariants of graph theory. Aspects of introducing the mathematical apparatus of Galois compliance for researching the interrelations between elements of an adaptive training information system for individual testing are analyzed.

  16. Exploring health information technology education: an analysis of the research.

    Science.gov (United States)

    Virgona, Thomas

    2012-01-01

    This article is an analysis of published research on Health Information Technology Education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper has identified intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing concepts and domains of health information systems, technology and management to contemporary issues such as education, outsourcing, web services and security. The research findings have implications for educators, researchers, journal...

  17. Function analysis for waste information systems

    Energy Technology Data Exchange (ETDEWEB)

    Sexton, J.L.; Neal, C.T.; Heath, T.C.; Starling, C.D.

    1996-04-01

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study.

  18. A Mathematical Analysis of Conflict Prevention Information

    Science.gov (United States)

    Maddalon, Jeffrey M.; Butler, Ricky W.; Munoz, Cesar A.; Dowek, Gilles

    2009-01-01

    In air traffic management, conflict prevention information refers to the guidance maneuvers, which if taken, ensure that an aircraft's path is conflict-free. These guidance maneuvers take the form of changes to track angle or ground speed. Conflict prevention information may be assembled into prevention bands that advise the crew on maneuvers that should not be taken. Unlike conflict resolution systems, which presume that the aircraft already has a conflict, conflict prevention systems show conflicts for any maneuver, giving the pilot confidence that if a maneuver is made, then no near-term conflicts will result. Because near-term conflicts can lead to safety concerns, strong verification of information correctness is required. This paper presents a mathematical framework to analyze the correctness of algorithms that produce conflict prevention information incorporating an arbitrary number of traffic aircraft and with both a near-term and intermediate-term lookahead times. The framework is illustrated with a formally verified algorithm for 2-dimensional track angle prevention bands.
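
    The geometric core of track-angle bands can be sketched briefly. This is a 2-D toy under straight-line, constant-speed assumptions; the separation minimum, lookahead, and whole-degree discretization are invented here, and the paper's formally verified algorithm is exact rather than sampled.

        import math

        def loses_separation(own_pos, own_spd, trk, traffic, lookahead, sep=5.0):
            # True if flying heading `trk` (radians from north) comes within
            # `sep` of any traffic aircraft (position, velocity) within
            # `lookahead` time, assuming straight-line trajectories.
            vx, vy = own_spd * math.sin(trk), own_spd * math.cos(trk)
            for (px, py), (tvx, tvy) in traffic:
                rx, ry = px - own_pos[0], py - own_pos[1]   # relative position
                dvx, dvy = tvx - vx, tvy - vy               # relative velocity
                dv2 = dvx * dvx + dvy * dvy
                # time of closest approach, clamped to [0, lookahead]
                t = 0.0 if dv2 == 0 else max(0.0, min(lookahead,
                                                      -(rx * dvx + ry * dvy) / dv2))
                if math.hypot(rx + dvx * t, ry + dvy * t) < sep:
                    return True
            return False

        def track_bands(own_pos, own_spd, traffic, lookahead=300.0):
            # "Red" headings (whole degrees) that should not be flown.
            return {d for d in range(360)
                    if loses_separation(own_pos, own_spd, math.radians(d),
                                        traffic, lookahead)}

        # One traffic aircraft 30 units north of us, flying south at equal speed.
        print(sorted(track_bands((0.0, 0.0), 1.0, [((0.0, 30.0), (0.0, -1.0))])))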

  19. Analysis to Inform Defense Planning Despite Austerity

    Science.gov (United States)

    2014-01-01

    The approach envisions going to policymakers (the top yellow diamond in the process diagram) to discuss what capabilities they wish to pursue further, given the results of the first-cut analysis; see also the related discussion from a larger workshop held by OSD (Acquisition, Technology, and Logistics) (Porter, Bracken, and Kneece, 2007).

  20. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  1. Synthesis and analysis of three-dimensional video information

    Science.gov (United States)

    Katys, P. G.; Katys, Georgy P.

    2005-02-01

    The principles of design, the basis of functioning, and the characteristics of systems for the synthesis and analysis of three-dimensional (3D) visual information are analyzed. In the first part of the paper, the modern state of development of 3D video information synthesis and reproduction systems is considered, including stereoscopic, auto-stereoscopic, and holographic systems. In the second part, the principles of machine-vision systems that realize the analysis of 3D video information are considered.

  2. Neurodynamics analysis of brain information transmission

    Institute of Scientific and Technical Information of China (English)

    Ru-bin WANG; Zhi-kang ZHANG; Chi K. Tse

    2009-01-01

    This paper proposes a model of neural networks consisting of populations of perceptive neurons, inter-neurons, and motor neurons according to the theory of stochastic phase resetting dynamics. According to this model, the dynamical characteristics of neural networks are studied in three coupling cases, namely, series and parallel coupling, series coupling, and unilateral coupling. The results show that the identified structure of neural networks enables the basic characteristics of neural information processing to be described in terms of the actions of both the optional motor and the reflected motor. The excitation of local neural networks is caused by the action of the optional motor. In particular, the excitation of the neural population caused by the action of the optional motor in the motor cortex is larger than that caused by the action of the reflected motor. This phenomenon indicates that there are more neurons participating in the neural information processing and the excited synchronization motion under the action of the optional motor.

  3. Hydrogen Technical Analysis -- Dissemination of Information

    Energy Technology Data Exchange (ETDEWEB)

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination is critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction, and will help to overcome the perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--Brochures that target a certain segment of the population and will be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--With industries identified to have a stake in hydrogen energy. The three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational Forums--The final audience is students--the future engineers, technicians, and energy consumers. SENTECH will expand on its previous educational work in this area. The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  4. Army Information Operations Officer Needs Analysis Report

    Science.gov (United States)

    2016-03-01

    network analysis tool . . . it could really lead to some really negative second and third order effects. Another personnel issue that was...

  5. Contaminant remediation decision analysis using information gap theory

    CERN Document Server

    Harp, Dylan R

    2011-01-01

    Decision making under severe lack of information is a ubiquitous situation in nearly every applied field of engineering, policy, and science. A severe lack of information precludes our ability to determine a frequency of occurrence of events or conditions that impact the decision; therefore, decision uncertainties due to a severe lack of information cannot be characterized probabilistically. To circumvent this problem, information gap (info-gap) theory has been developed to explicitly recognize and quantify the implications of information gaps in decision making. This paper presents a decision analysis based on info-gap theory developed for a contaminant remediation scenario. The analysis provides decision support in determining the fraction of contaminant mass to remove from the environment in the presence of a lack of information related to the contaminant mass flux into an aquifer. An info-gap uncertainty model is developed to characterize uncertainty due to a lack of information concerning the contaminant...
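
    The flavor of an info-gap robustness calculation fits in a few lines. Everything concrete below (the linear concentration model, the fractional uncertainty set, the numbers) is an invented stand-in for the paper's site-specific analysis: robustness is the largest horizon of uncertainty alpha for which the worst-case contaminant flux still meets the performance requirement.

        def concentration(flux, remove_fraction, k=2.0):
            # Illustrative (assumed) model: aquifer concentration scales with
            # the contaminant mass flux that remediation leaves in place.
            return k * flux * (1.0 - remove_fraction)

        def robustness(remove_fraction, nominal_flux, limit, d_alpha=1e-3):
            # Info-gap robustness: the largest horizon alpha such that every
            # flux up to nominal*(1 + alpha) keeps concentration within limit.
            alpha = 0.0
            while concentration(nominal_flux * (1.0 + alpha + d_alpha),
                                remove_fraction) <= limit:
                alpha += d_alpha
                if alpha > 100.0:                 # effectively unbounded
                    return float("inf")
            return alpha

        # Removing more mass buys more robustness to an unknown flux:
        for fraction in (0.5, 0.8, 0.9):
            print(fraction, round(robustness(fraction, nominal_flux=1.0,
                                             limit=1.0), 2))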

  6. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32…

  7. Information Systems Vulnerability: A Systems Analysis Perspective

    Science.gov (United States)

    1996-06-01

    D.F. Haasl and F.F. Goldberg, “Fault Tree Handbook,” NUREG-0492, U.S. Nuclear Regulatory Commission, Washington, D.C., January 1981. J.B. Dugan... Symposium, 1996. G.B. Varnado, W.H. Horton, and P.R. Lobner, “Modular Fault Tree Analysis Procedures Guide,” SAND83-0963, NUREG/CR-3268, Prepared by... D.W. Whitehead, “Microcomputer Applications of and Modifications to the Modular Fault Trees,” SAND89-1887, NUREG/CR-4838, Prepared by Sandia National...

  8. Fundamental procedures of geographic information analysis

    Science.gov (United States)

    Berry, J. K.; Tomlin, C. D.

    1981-01-01

    Analytical procedures common to most computer-oriented geographic information systems are composed of fundamental map processing operations. A conceptual framework for such procedures is developed and basic operations common to a broad range of applications are described. Among the major classes of primitive operations identified are those associated with: reclassifying map categories as a function of the initial classification, the shape, the position, or the size of the spatial configuration associated with each category; overlaying maps on a point-by-point, a category-wide, or a map-wide basis; measuring distance; establishing visual or optimal path connectivity; and characterizing cartographic neighborhoods based on the thematic or spatial attributes of the data values within each neighborhood. By organizing such operations in a coherent manner, the basis for a generalized cartographic modeling structure can be developed which accommodates a variety of needs in a common, flexible and intuitive manner. The use of each is limited only by the general thematic and spatial nature of the data to which it is applied.

  9. Medical Image Analysis by Cognitive Information Systems - a Review.

    Science.gov (United States)

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, and the semantic processes are described as they apply to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information - here, medical images - applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of data; meaning is carried by information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images presenting selected human organs with different pathologies. Those images were analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks. This process is very important, for example, in diagnostic and therapy processes and in the selection of semantic aspects/features from the analyzed data sets; those features allow a new kind of analysis to be created.

  10. An Investigation Of Organizational Information Security Risk Analysis

    Directory of Open Access Journals (Sweden)

    Zack Jourdan

    2010-12-01

    Despite a growing number and variety of information security threats, many organizations continue to neglect implementing information security policies and procedures. The likelihood that an organization’s information systems can fall victim to these threats is known as information systems risk (Straub & Welke, 1998). To combat these threats, an organization must undergo a rigorous process of self-analysis. To better understand the current state of this information security risk analysis (ISRA) process, this study deployed a questionnaire using both open-ended and closed-ended questions administered to a group of information security professionals (N=32). The qualitative and quantitative results of this study show that organizations are beginning to conduct regularly scheduled ISRA processes. However, the results also show that organizations still have room for improvement to create idyllic ISRA processes.

  11. Complexity and information flow analysis for multi-threaded programs

    Science.gov (United States)

    Ngo, Tri Minh; Huisman, Marieke

    2017-01-01

    This paper studies the security of multi-threaded programs. We combine two methods, i.e., qualitative and quantitative security analysis, to check whether a multi-threaded program is secure or not. In this paper, besides reviewing classical analysis models, we present a novel model of quantitative analysis where the attacker is able to select the scheduling policy. This model does not follow the traditional information-theoretic channel setting. Our analysis first studies what extra information an attacker can get if he knows the scheduler's choices, and then integrates this information into the transition system modeling the program execution. Via a case study, we compare this approach with the traditional information-theoretic models, and show that this approach gives more intuitive-matching results.
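
    The quantitative side of such an analysis can be made concrete with a generic Shannon-leakage computation (not the paper's scheduler-specific model; the two toy distributions are invented): leakage is H(S) - H(S|O), the reduction in uncertainty about a secret S given an observable O, and a scheduler choice changes the joint distribution of S and O and hence the leakage.

        import math
        from collections import defaultdict

        def shannon(dist):
            # Shannon entropy (bits) of a {outcome: probability} dict.
            return -sum(p * math.log2(p) for p in dist.values() if p > 0)

        def leakage(joint):
            # Leakage = H(S) - H(S|O) for a dict {(secret, obs): prob}.
            ps, po = defaultdict(float), defaultdict(float)
            for (s, o), p in joint.items():
                ps[s] += p
                po[o] += p
            h_s_given_o = 0.0
            for o, q in po.items():
                cond = {s: joint.get((s, o), 0.0) / q for s in ps}
                h_s_given_o += q * shannon(cond)
            return shannon(ps) - h_s_given_o

        # Toy two-threaded program: under scheduler A the output order reveals
        # the secret bit; under scheduler B it does not (assumed distributions).
        sched_a = {(0, "xy"): 0.5, (1, "yx"): 0.5}
        sched_b = {(0, "xy"): 0.25, (0, "yx"): 0.25,
                   (1, "xy"): 0.25, (1, "yx"): 0.25}
        print(leakage(sched_a), leakage(sched_b))    # 1.0 vs 0.0 bits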

  12. Water Information Management & Analysis System (WIMAS) v 4.0

    Data.gov (United States)

    Kansas Data Access and Support Center — The Water Information Management and Analysis System (WIMAS) is an ArcView based GIS application that allows users to query Kansas water right data maintained by the...

  13. Analysis of safeguards information treatment system at the facility level

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byung Doo; Song, Dae Yong; Kwack, Eun Ho

    2000-12-01

    Safeguards Information Treatment System (SITS) at the facility level is required to efficiently implement the obligations under the Korea-IAEA Safeguards Agreement, bilateral agreements with other countries, and domestic law. In this report, the analysis of the information that the SITS treats and the operating environment of the SITS, including a review of the relationships between items of safeguards information, are described. SITS will be developed to cover the different accounting procedures and methods applied at the various facilities under IAEA safeguards.

  14. Bayesian imperfect information analysis for clinical recurrent data

    OpenAIRE

    Chang CK; Chang CC

    2014-01-01

    Chih-Kuang Chang (Department of Cardiology, Jen-Ai Hospital, Dali District, Taichung, Taiwan); Chi-Chang Chang (School of Medical Informatics, Chung Shan Medical University, Information Technology Office of Chung Shan Medical University Hospital, Taichung, Taiwan). Abstract: In medical research, clinical practice must often be undertaken with imperfect information from limited resources. This study applied Bayesian imperfect information-value analysis to realistic situations to produce likelih...

  15. A time sequence analysis on the informal information flow mechanism of microblogging

    Institute of Scientific and Technical Information of China (English)

    Yuan HU; Xiaoli LIAO; Andong WU

    2011-01-01

    Microblogging is a new Internet-featured product, which has seen rapid development in recent years. Researchers from different countries are making various technical analyses of microblogging applications. In this study, using natural language processing (NLP) and data mining, we analyzed the information content transmitted via a microblog, users' social networks and their interactions, and carried out an empirical analysis of the dissemination process of one particular piece of information via Sina Weibo. Based on the results of these analyses, we attempt to develop a better understanding of the rules and mechanisms of the informal information flow in microblogging.

  16. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Over the last three decades, China’s agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT-based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China’s agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT-based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China’s experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  17. Informativeness of the CODIS STR loci for admixture analysis.

    Science.gov (United States)

    Barnholtz-Sloan, Jill S; Pfaff, Carrie L; Chakraborty, Ranajit; Long, Jeffrey C

    2005-11-01

    Population admixture (or ancestry) is used as an approach to gene discovery in complex diseases, particularly when the disease prevalence varies widely across geographic populations. Admixture analysis could be useful for forensics because an indication of a perpetrator's ancestry would narrow the pool of suspects for a particular crime. The purpose of this study was to use Fisher's information to identify informative sets of markers for admixture analysis. Using published founding population allele frequencies we test three marker sets for efficacy for estimating admixture: the FBI CODIS Core STR loci, the HGDP-CEPH Human Genome Diversity Cell Line Panel and the set of 39 ancestry informative SNPS from the Shriver lab at Pennsylvania State University. We conclude that the FBI CODIS Core STR set is valid for admixture analysis, but not the most precise. We recommend using a combination of the most informative markers from the HGDP-CEPH and Shriver loci sets.
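
    For a single allele drawn at a locus with frequencies p1j and p2j in the two founding populations, the Fisher information about the admixture proportion m takes the standard form I(m) = sum_j (p1j - p2j)^2 / (m*p1j + (1-m)*p2j), and markers can be ranked by this quantity. The sketch below uses invented frequencies and is not the authors' code.

        def fisher_information(m, p1, p2):
            # Information about the admixture proportion m carried by one
            # allele drawn at a locus; p1 and p2 are allele-frequency lists
            # in the two founding populations (each summing to 1).
            return sum((a - b) ** 2 / (m * a + (1 - m) * b)
                       for a, b in zip(p1, p2))

        # Two invented loci at m = 0.5: strongly vs weakly differentiated.
        print(fisher_information(0.5, [0.9, 0.1], [0.1, 0.9]))      # high (2.56)
        print(fisher_information(0.5, [0.55, 0.45], [0.45, 0.55]))  # low (0.04)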

  18. Information Security Analysis Using Game Theory and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Abercrombie, Robert K [ORNL

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified against the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to also address previous limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players’ actions are always synchronous; moreover, most such models are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  19. Content Analysis in Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce; Reser, David

    1990-01-01

    Describes ways in which content analysis is being used in library and information science research. Methodological concerns are addressed, including selection of target documents, selection of samples to be analyzed, selection of categories for analysis, and elimination of researcher bias to assure reliability. (35 references) (LRW)

  20. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend

  1. A DSRPCL-SVM Approach to Informative Gene Analysis

    Institute of Scientific and Technical Information of China (English)

    Wei Xiong; Zhibin Cai; Jinwen Ma

    2008-01-01

    Microarray data based tumor diagnosis is a very interesting topic in bioinformatics. One of the key problems is the discovery and analysis of informative genes of a tumor. Although there are many elaborate approaches to this problem, it is still difficult to select a reasonable set of informative genes for tumor diagnosis only with microarray data. In this paper, we classify the genes expressed through microarray data into a number of clusters via the distance sensitive rival penalized competitive learning (DSRPCL) algorithm and then detect the informative gene cluster or set with the help of support vector machine (SVM). Moreover, the critical or powerful informative genes can be found through further classifications and detections on the obtained informative gene clusters. It is well demonstrated by experiments on the colon, leukemia, and breast cancer datasets that our proposed DSRPCL-SVM approach leads to a reasonable selection of informative genes for tumor diagnosis.
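
    A hedged sketch of the two-stage idea on synthetic data, with scikit-learn's KMeans standing in for the DSRPCL algorithm (which has no standard library implementation): cluster the genes, then score each cluster's genes as features for an SVM.

    ```python
    # Cluster genes (columns), then test each gene cluster as an SVM feature
    # set; the cluster with the best cross-validated accuracy is the candidate
    # informative gene set. Data are synthetic, for illustration only.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))          # 60 samples x 200 genes (synthetic)
    y = rng.integers(0, 2, size=60)         # tumor / normal labels (synthetic)
    X[y == 1, :10] += 1.5                   # make the first 10 genes informative

    gene_clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X.T)
    for c in range(5):
        genes = np.where(gene_clusters == c)[0]
        score = cross_val_score(SVC(kernel="linear"), X[:, genes], y, cv=5).mean()
        print(f"cluster {c}: {len(genes)} genes, CV accuracy {score:.2f}")
    ```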

  2. An information flow analysis of a distributed information system for space medical support.

    Science.gov (United States)

    Zhang, Tao; Aranzamendez, Gina; Rinkus, Susan; Gong, Yang; Rukab, Jamie; Johnson-Throop, Kathy A; Malin, Jane T; Zhang, Jiajie

    2004-01-01

    In this study, we applied the methodology grounded in human-centered distributed cognition principles to the information flow analysis of a highly intensive, distributed and complex environment--the Biomedical Engineer (BME) console system at NASA Johnson Space Center. This system contains disparate human and artificial agents and artifacts. Users and tasks of this system were analyzed. An ethnographic study and a detailed communication pattern analysis were conducted to gain deeper insight and better understanding of the information flow patterns and the organizational memory of the current BME console system. From this study, we identified some major problems and offered recommendations to improve the efficiency and effectiveness of this system. We believe that this analysis methodology can be used in other distributed information systems, such as a healthcare environment.

  3. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
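
    The paper targets a subset of C via Frama-C; the toy Python sketch below only mirrors the summarization idea it describes, where a single taint set summarizes a whole non-pointer array while flows through the subscript are still tracked:

    ```python
    # Toy dynamic flow tracker: every value carries a taint set; array writes
    # join the index's and value's taints into one summary taint for the array,
    # a coarse but sound summarization for non-pointer arrays.
    class Tainted:
        def __init__(self, value, taint=frozenset()):
            self.value, self.taint = value, frozenset(taint)

    class TaintedArray:
        def __init__(self, size):
            self.data = [0] * size
            self.taint = frozenset()      # one summary taint for all elements
        def store(self, index: Tainted, val: Tainted):
            self.data[index.value] = val.value
            self.taint |= index.taint | val.taint   # subscript flows count too
        def load(self, index: Tainted) -> Tainted:
            return Tainted(self.data[index.value], self.taint | index.taint)

    secret = Tainted(3, {"secret"})
    arr = TaintedArray(8)
    arr.store(secret, Tainted(1))         # the index depends on the secret
    leaked = arr.load(Tainted(0))
    assert "secret" in leaked.taint       # the flow is detected
    ```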

  4. Informational Analysis for Compressive Sampling in Radar Imaging

    Directory of Open Access Journals (Sweden)

    Jingxiong Zhang

    2015-03-01

    Full Text Available Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, and operates with optimization-based algorithms for signal reconstruction. It is thus able to compress data while acquiring them, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic oriented CS-radar system analysis and performance evaluation.
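
    A minimal demonstration of the sub-Nyquist setting discussed: a sparse scene, a random measurement matrix, and reconstruction by orthogonal matching pursuit; the sizes, sparsity, and noiseless measurements are illustrative rather than radar-calibrated.

    ```python
    # Recover a k-sparse scene of length n from m << n random projections.
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 5                      # scene size, measurements, sparsity
    scene = np.zeros(n)
    scene[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # measurement matrix
    y = Phi @ scene                              # sub-Nyquist measurements

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(Phi, y)
    print("reconstruction error:", np.linalg.norm(omp.coef_ - scene))
    ```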

  5. Performance of the Carbon Dioxide Information Analysis Center (CDIAC)

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Univ. of Tennessee, Knoxville, TN (United States). Environment, Energy, and Resources Center; Jones, S.B. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    The Carbon Dioxide Information Analysis Center (CDIAC) provides information and data resources in support of the US Department of Energy's Global Change Research Program. CDIAC also serves as a resource of global change information for a broader international community of researchers, policymakers, managers, educators, and students. The number of requests for CDIAC's data products, information services, and publications has grown over the years and represents multidisciplinary interests in the physical, life, and social sciences and diverse work settings in government, business, and academia. CDIAC's staff addresses thousands of requests yearly for data and information resources. In response to these requests, CDIAC has distributed tens of thousands of data products, technical reports, newsletters, and other information resources worldwide since 1982. This paper describes CDIAC, examines CDIAC's user community, and describes CDIAC's response to requests for information. The CDIAC Information System, which serves as a comprehensive PC-based inventory and information management tracking system, is also described.

  6. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

    Full Text Available The Information Service is a fundamental component in a grid environment. It has to meet many requirements, such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, and fault tolerance, in order to achieve better performance, scalability, security and extensibility. Currently there are two major approaches: one based on a directory infrastructure, and a novel one that exploits a relational DBMS. In this paper we present a performance comparison analysis between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), providing also information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  7. Stationary Time Series Analysis Using Information and Spectral Analysis

    Science.gov (United States)

    1992-09-01

    spectral density function of the time series. The spectral density function f(w), 0 < w < 1, is defined as the Fourier transform of... series with spectral density function f(w). An important result of Pinsker [(1964), p. 196] can be interpreted as providing a formula for asymptotic... Analysis Papers, Holden-Day, San Francisco, California. Parzen, E. (1958), "On asymptotically efficient consistent estimates of the spectral density function".

  8. Information delivery manuals to facilitate it supported energy analysis

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    In response to continuing Building Information Modeling (BIM) progress, building performance simulation tools such as IESVE are being utilized to explore construction projects and influence design decisions with increasing frequency. To maximize the potential of these tools, a specification of information exchange and digital workflows is required. This paper presents the preliminary findings of an ongoing study aimed at developing an Information Delivery Manual (IDM) for IT supported energy analysis at concept design phase. The IDM development is based on: (1) a review of current approaches (2...

  9. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available Complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, the use of the absolute spectrum alone in the informational spectrum analysis of protein sequences has proven insufficient. Therefore, CISAPS is developed to consider and provide results in three forms: the absolute, real, and imaginary spectrum. Biologically relevant features in the analysis of influenza A subtypes, presented here as a case study, can also appear individually in either the real or the imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted by the CISAPS web server. These associations are probably related to the protein property that the specific amino acid index represents. In addition, various technical issues, such as zero-padding and windowing, that may affect the analysis are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, where each one represents a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply complex informational spectrum analysis in their work.
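
    A sketch of the underlying informational-spectrum computation (map residues to a numerical amino acid index, remove the mean, zero-pad, take the FFT, and keep the absolute, real, and imaginary parts); the index values below are illustrative placeholders, not one of the 611 published indices:

    ```python
    # Informational spectrum of a protein sequence via FFT of an index-mapped
    # signal; CISAPS inspects the real and imaginary parts as well as the
    # absolute spectrum.
    import numpy as np

    AA_INDEX = {"A": 0.037, "C": 0.083, "D": 0.126, "E": 0.006, "F": 0.095,
                "G": 0.005, "H": 0.023, "I": 0.000, "K": 0.037, "L": 0.000,
                "M": 0.082, "N": 0.004, "P": 0.019, "Q": 0.076, "R": 0.096,
                "S": 0.085, "T": 0.094, "V": 0.006, "W": 0.054, "Y": 0.051}

    def informational_spectrum(sequence, n_fft=512):
        signal = np.array([AA_INDEX[aa] for aa in sequence])
        signal = signal - signal.mean()          # remove the DC component
        spectrum = np.fft.rfft(signal, n=n_fft)  # zero-padded FFT
        return np.abs(spectrum), spectrum.real, spectrum.imag

    absolute, real, imag = informational_spectrum("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
    ```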

  10. Similarity Measures, Author Cocitation Analysis, and Information Theory

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    The use of Pearson's correlation coefficient in Author Cocitation Analysis was compared with Salton's cosine measure in a number of recent contributions. Unlike the Pearson correlation, the cosine is insensitive to the number of zeros. However, one has the option of applying a logarithmic transformation in correlation analysis. Information calculus is based on the logarithmic transformation and provides non-parametric statistics. Using this methodology one can cluster a document set in a precise way and express the differences in terms of bits of information. The algorithm is explained and applied to the data set which was made the subject of this discussion.
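
    A quick numeric illustration of the point about zeros, using toy cocitation profiles: padding two vectors with matched zeros leaves the cosine unchanged but shifts the Pearson correlation.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    a = np.array([4.0, 2.0, 1.0, 3.0])           # toy cocitation counts
    b = np.array([3.0, 1.0, 2.0, 4.0])
    a0 = np.concatenate([a, np.zeros(6)])        # same profiles, matched zeros
    b0 = np.concatenate([b, np.zeros(6)])

    print(cosine(a, b), cosine(a0, b0))          # identical
    print(pearsonr(a, b)[0], pearsonr(a0, b0)[0])  # padding changes Pearson
    ```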

  11. Webometric Analysis of Departments of Librarianship and Information Science.

    Science.gov (United States)

    Thomas, Owen; Willett, Peter

    2000-01-01

    Describes a webometric analysis of linkages to library and information science (LIS) department Web sites in United Kingdom universities. Concludes that situation data are not well suited to evaluation of LIS departments and that departments can boost Web site visibility by hosting a wide range of materials. (Author/LRW)

  12. Need for information metrics: with examples from document analysis

    Science.gov (United States)

    Nartker, Thomas A.

    1994-03-01

    We present an argument that progress in Information Science is inhibited by our incomplete perception of the nature of the field. An agenda for research is proposed which, we believe, will lead to more rapid progress. Specific examples are given from the field of Document Analysis.

  13. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  14. Environmental Quality Information Analysis Center multi-year plan

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, R.G. [RDG, Inc. (United States); Das, S. [Oak Ridge National Lab., TN (United States); Walsh, T.E. [Florida Univ., Gainesville, FL (United States)

    1992-09-01

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network.

  15. Multiscale Analysis of Information Dynamics for Linear Multivariate Processes

    CERN Document Server

    Faes, Luca; Stramaglia, Sebastiano; Nollo, Giandomenico; Stramaglia, Sebastiano

    2016-01-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale infor...
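
    For the simplest Gaussian case, the flavor of such analytical measures can be shown directly: for an AR(1) process, information storage reduces to -0.5 ln(1 - rho^2), with rho the lag-1 autocorrelation (at coarser scales, where averaging introduces the MA component the abstract mentions, this lag-1 formula is only a first-order approximation):

    ```python
    # Simulate a Gaussian AR(1) process and compare information storage at
    # scale 1 and at scale 2 (averaging non-overlapping pairs of samples).
    import numpy as np

    rng = np.random.default_rng(0)
    a = 0.8
    x = np.zeros(100_000)
    for t in range(1, x.size):
        x[t] = a * x[t - 1] + rng.normal()

    def storage(series):
        rho = np.corrcoef(series[:-1], series[1:])[0, 1]
        return -0.5 * np.log(1 - rho ** 2)

    x2 = x.reshape(-1, 2).mean(axis=1)   # scale 2: average consecutive pairs
    print(storage(x), storage(x2))       # storage changes with the time scale
    ```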

  16. Comparative Analysis of Splice Site Regions by Information Content

    Institute of Scientific and Technical Information of China (English)

    T. Shashi Rekha; Chanchal K. Mitra

    2006-01-01

    We have applied concepts from information theory to a comparative analysis of donor (gt) and acceptor (ag) splice site regions in the genes of five different organisms by calculating their mutual information content (relative entropy) over a selected block of nucleotides. A similar pattern, in which the information content decreases as the block size increases, was observed for both regions in all the organisms studied. This result suggests that the information required for splicing might be contained in the consensus of ~6-8 nt at both regions. We conclude from our study that even though the nucleotides show some degree of conservation in the flanking regions of the splice sites, a certain level of variability is still tolerated, which allows the splicing process to occur normally even if the extent of base pairing is not fully satisfied. We also suggest that this variability can be compensated by recognizing different splice sites with different spliceosomal factors.
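
    A small sketch of the position-wise computation described, using made-up aligned donor-site sequences and a uniform nucleotide background; the relative entropy is reported in bits.

    ```python
    # Position-wise relative entropy (information content) of aligned splice
    # sites against a uniform background. The sequences are toy examples.
    import math
    from collections import Counter

    sites = ["CAGGTAAGT", "AAGGTGAGT", "CAGGTAAGG", "CAGGTGAGT", "TAGGTAAGA"]
    background = 0.25  # uniform nucleotide background

    for pos in range(len(sites[0])):
        counts = Counter(seq[pos] for seq in sites)
        total = sum(counts.values())
        d_kl = sum((c / total) * math.log2((c / total) / background)
                   for c in counts.values())
        print(f"position {pos}: {d_kl:.2f} bits")
    ```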

  17. ERISTAR: Earth Resources Information Storage, Transformation, Analysis, and Retrieval

    Science.gov (United States)

    1972-01-01

    The National Aeronautics and Space Administration (NASA) and the American Society for Engineering Education (ASEE) have sponsored faculty fellowship programs in systems engineering design for the past several years. During the summer of 1972 four such programs were conducted by NASA, with Auburn University cooperating with Marshall Space Flight Center (MSFC). The subject for the Auburn-MSFC design group was ERISTAR, an acronym for Earth Resources Information Storage, Transformation, Analysis and Retrieval, which represents an earth resources information management network of state information centers administered by the respective states and linked to federally administered regional centers and a national center. The considerations for serving the users and the considerations that must be given to processing data from a variety of sources are described. The combination of these elements into a national network is discussed and an implementation plan is proposed for a prototype state information center. The compatibility of the proposed plan with the Department of Interior plan, RALI, is indicated.

  18. Sentiment analysis using common-sense and context information.

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal

    2015-01-01

    Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific concepts which in turn produced the domain specific important features. Further, the polarities of the extracted concepts are determined using the contextual polarity lexicon which we developed by considering the context information of a word. Finally, semantic orientations of domain specific features of the review document are aggregated based on the importance of a feature with respect to the domain. The importance of the feature is determined by the depth of the feature in the ontology. Experimental results show the effectiveness of the proposed methods.

  19. Large-scale temporal analysis of computer and information science

    Science.gov (United States)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered more than three quarters of a century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  20. Implementation of an information security management system under ISO 27001: information risk analysis

    Directory of Open Access Journals (Sweden)

    José Gregorio Arévalo Ascanio

    2015-11-01

    Full Text Available In this article the business structure of the city of Ocaña is explored with the aim of expanding information and knowledge about the main variables of the municipality's productive activity, its entrepreneurial spirit, technological development and productive structure. For this, a descriptive study was performed to identify economic activity in its various forms and promote the implementation of administrative practices consistent with national and international standards. The results made it possible to identify business weaknesses, including in information handling, which once identified can be used to design training programs, skill acquisition, and management practices consistent with the challenges of competitiveness and staying in the market. From the results, information was collected regarding the technological components of the companies in the city's productive fabric, for which the application of tools for the analysis of information systems is proposed using ISO 27001:2005, employing the most appropriate technologies, so that organizations protect their most important asset: information.

  1. Online nutrition information for pregnant women: a content analysis.

    Science.gov (United States)

    Storr, Tayla; Maher, Judith; Swanepoel, Elizabeth

    2016-06-29

    Pregnant women actively seek health information online, including nutrition and food-related topics. However, the accuracy and readability of this information have not been evaluated. The aim of this study was to describe and evaluate pregnancy-related food and nutrition information available online. Four search engines were used to search for pregnancy-related nutrition web pages. Content analysis of web pages was performed. Web pages were assessed against the 2013 Australian Dietary Guidelines to determine accuracy. Flesch-Kincaid (F-K), Simple Measure of Gobbledygook (SMOG), Gunning Fog Index (FOG) and Flesch reading ease (FRE) formulas were used to assess readability. Data were analysed descriptively. Spearman's correlation was used to assess the relationship between web page characteristics. The Kruskal-Wallis test was used to check for differences among readability and other web page characteristics. A total of 693 web pages were included. Web page types included commercial (n = 340), not-for-profit (n = 113), blogs (n = 112), government (n = 89), personal (n = 36) and educational (n = 3). The accuracy of online nutrition information varied, with 39.7% of web pages containing accurate information, 22.8% containing mixed information and 37.5% containing inaccurate information. The average reading grade of all pages analysed, as measured by F-K, SMOG and FOG, was 11.8. The mean FRE was 51.6, a 'fairly difficult to read' score. Only 0.5% of web pages were written at or below grade 6 according to F-K, SMOG and FOG. The findings suggest that accuracy of pregnancy-related nutrition information is a problem on the internet. Web page readability is generally difficult and means that the information may not be accessible to those who cannot read at a sophisticated level. © 2016 John Wiley & Sons Ltd.
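
    The readability formulas applied in the study are standard; a rough sketch with a heuristic syllable counter (dedicated tools count syllables more carefully):

    ```python
    # Flesch reading ease and Flesch-Kincaid grade from word, sentence, and
    # syllable counts; the syllable counter is a crude vowel-group heuristic.
    import re

    def count_syllables(word):
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        w, s = len(words), sentences
        fre = 206.835 - 1.015 * (w / s) - 84.6 * (syllables / w)
        fk_grade = 0.39 * (w / s) + 11.8 * (syllables / w) - 15.59
        return fre, fk_grade

    print(readability("Pregnant women actively seek health information online."))
    ```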

  2. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
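
    Two of the supervised criteria listed, information gain and symmetrical uncertainty, can be sketched directly for a discretized feature; this illustrates the definitions, not IMMAN's implementation:

    ```python
    # Information gain IG(Y;F) = H(Y) - H(Y|F), and symmetrical uncertainty
    # SU = 2*IG / (H(F) + H(Y)), for discrete feature and class labels.
    import numpy as np
    from collections import Counter

    def entropy(labels):
        counts = np.array(list(Counter(labels).values()), dtype=float)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def information_gain(feature, target):
        h_cond = 0.0
        for v in set(feature):
            subset = [t for f, t in zip(feature, target) if f == v]
            h_cond += len(subset) / len(target) * entropy(subset)
        return entropy(target) - h_cond

    def symmetrical_uncertainty(feature, target):
        ig = information_gain(feature, target)
        return 2 * ig / (entropy(feature) + entropy(target))

    f = [0, 0, 1, 1, 2, 2, 0, 1]
    y = [0, 0, 1, 1, 1, 1, 0, 1]
    print(information_gain(f, y), symmetrical_uncertainty(f, y))
    ```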

  3. Latent morpho-semantic analysis : multilingual information retrieval with character n-grams and mutual information.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Chew, Peter A.; Abdelali, Ahmed (New Mexico State University)

    2008-08-01

    We describe an entirely statistics-based, unsupervised, and language-independent approach to multilingual information retrieval, which we call Latent Morpho-Semantic Analysis (LMSA). LMSA overcomes some of the shortcomings of related previous approaches such as Latent Semantic Analysis (LSA). LMSA has an important theoretical advantage over LSA: it combines well-known techniques in a novel way to break the terms of LSA down into units which correspond more closely to morphemes. Thus, it has a particular appeal for use with morphologically complex languages such as Arabic. We show through empirical results that the theoretical advantages of LMSA can translate into significant gains in precision in multilingual information retrieval tests. These gains are not matched either when a standard stemmer is used with LSA, or when terms are indiscriminately broken down into n-grams.
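
    The general recipe, latent semantic analysis over character n-grams, can be sketched with scikit-learn; LMSA additionally selects units using mutual information, which this sketch omits:

    ```python
    # LSA over character n-grams: TF-IDF weight the n-gram/document matrix,
    # then reduce it with a truncated SVD to a shared latent space.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    docs = ["the cat sat on the mat", "le chat est sur le tapis",
            "dogs and cats are pets", "les chiens et les chats"]
    vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 4))
    X = vec.fit_transform(docs)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
    print(lsa)   # documents embedded in a shared latent space
    ```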

  4. COMPILATION AND ANALYSIS OF LEXICAL RESOURCES IN INFORMATION SCIENCE.

    Science.gov (United States)

    Information Science --thesauri, vocabularies, term lists and classification schemes. These resources were analysed with the purpose of tabulating the occurrence of terms. The study, therefore, was aimed at a quantitative review of lexical aids to information science language with the objective of determining its content and the frequency with which concepts have been recorded. This analysis, accomplished from a matrix using the IBM S 360/40 has enabled the researchers to quantify the terminology and to prepare a series of tables illustrating the

  5. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers with respect to innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised in focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  6. Analysis of Annotation on Documents for Recycling Information

    Science.gov (United States)

    Nakai, Tomohiro; Kondo, Nobuyuki; Kise, Koichi; Matsumoto, Keinosuke

    In order to make collaborative business activities fruitful, it is essential to know the characteristics of organizations and persons in more detail and to gather information relevant to the activities. In this paper, we describe the notion of "information recycling", which realizes these requirements by analyzing documents. The key to recycling information is to utilize annotations on documents as clues for generating users' profiles and for weighting contents in the context of the activities. We also propose a method of extracting annotations from paper documents just by pressing one button, with the help of techniques of camera-based document image analysis. Experimental results demonstrate that it is fundamentally capable of acquiring annotations on paper documents on condition that their electronic versions without annotations are available for the processing.

  7. Solving Reality Problems by Using Mutual Information Analysis

    Directory of Open Access Journals (Sweden)

    Chia-Ju Liu

    2014-01-01

    Full Text Available Cross-mutual information (CMI) can be calculated for time series of thousands of sampled points to characterize corticocortical connections among different functional states of the brain in Alzheimer's disease (AD) patients. The aim of this study was to use mutual information analysis of multichannel EEG to predict the probability of AD. Considering the correlation between AD and the ageing effect, the participants were 9 AD patients and 45 normal cases including teenagers, young people and elders. The data revealed differences between normal and AD participants in both the right frontal and temporo-parietal regions. Besides, this study found that the theta band is the main frequency band for separating AD patients from all participants. Furthermore, this study suggests a more discriminative method based on mutual information to predict the possibility of AD.
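
    A minimal (cross-)mutual-information estimate between two signals using a histogram plug-in estimator; actual EEG studies would add careful binning, delays, and surrogate testing:

    ```python
    # Plug-in MI estimate from a 2D histogram of two (synthetic) channels.
    import numpy as np

    def mutual_information(x, y, bins=16):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return (pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum()

    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = 0.6 * x + 0.8 * rng.normal(size=5000)   # partially coupled channel
    print(mutual_information(x, y))
    ```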

  8. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage, which still affects the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is the model created from historic data, the second a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  9. Crime Mapping and Geographical Information Systems in Crime Analysis

    Directory of Open Access Journals (Sweden)

    Murat Dağlar

    2016-04-01

    Full Text Available As essential apparatus in crime analysis, crime mapping and Geographical Information Systems (GIS) are being progressively more accepted by police agencies. Developments in technology and the accessibility of geographic data sources make it feasible for police departments to use GIS and crime mapping. GIS and crime mapping can be utilized as devices to discover the factors contributing to crime, and hence let law enforcement agencies proactively take action against crime problems before they become challenging. The purpose of this study is to conduct a literature review of Geographical Information Systems and crime mapping in crime analysis and to propose policy recommendations regarding the implementation of crime mapping and GIS. To achieve this purpose, first a historical evaluation of GIS and crime mapping is presented, and then the importance of place is explained in terms of assessing crime problems accurately.

  10. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a network of related documents connected by social links. We call the network constructed from documents and the social information provided by each of them a Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
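
    A sketch of PageRank by power iteration over a small directed graph of documents of the DGD kind; the adjacency below is a toy, not the INEX data:

    ```python
    # PageRank with damping d over a 4-document graph; ranks converge by
    # repeated multiplication with the column-stochastic link matrix.
    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # doc -> linked docs
    n, d = 4, 0.85
    M = np.zeros((n, n))
    for src, dsts in links.items():
        for dst in dsts:
            M[dst, src] = 1.0 / len(dsts)

    rank = np.full(n, 1.0 / n)
    for _ in range(100):
        rank = (1 - d) / n + d * M @ rank
    print(rank / rank.sum())
    ```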

  11. Information technology portfolio in supply chain management using factor analysis

    Directory of Open Access Journals (Sweden)

    Ahmad Jaafarnejad

    2013-11-01

    Full Text Available The adoption of information technology (IT) along with supply chain management (SCM) has increasingly become a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal component analysis (PCA) of factor analysis (FA), a number of related criteria are divided into smaller groups. Finally, SC managers can develop an IT portfolio in SCM using mean values of the few extracted components on the relevance-emergency matrix. A numerical example is provided to explain the details of the proposed method.

  12. A Brief Analysis to Informal Chatting on the Phone

    Institute of Scientific and Technical Information of China (English)

    金晓瑜

    2005-01-01

    An analysis of telephone chat is often neglected by analysts because it is very common and very colloquial. In my opinion, however, it differs from other spoken language in some respects. For example, it involves two parties connected by a line, and the speakers talk in different places and may be doing different things. The paper therefore applies discourse notions such as field, tenor, and mode, skipping and coherence, and reference and presupposition to analyze informal chatting on the phone. It also mentions some practical applications of the analysis of this style of discourse in the concluding part.

  13. Cognitive Dimensions Analysis of Interfaces for Information Seeking

    CERN Document Server

    Golovchinsky, Gene

    2009-01-01

    Cognitive Dimensions is a framework for analyzing human-computer interaction. It is used for meta-analysis, that is, for talking about characteristics of systems without getting bogged down in details of a particular implementation. In this paper, I discuss some of the dimensions of this theory and how they can be applied to analyze information seeking interfaces. The goal of this analysis is to introduce a useful vocabulary that practitioners and researchers can use to describe systems, and to guide interface design toward more usable and useful systems

  14. Fusion of Multimodal Information in Music Content Analysis

    OpenAIRE

    Essid, Slim; Richard, Gaël

    2012-01-01

    Music is often processed through its acoustic realization. This is restrictive in the sense that music is clearly a highly multimodal concept where various types of heterogeneous information can be associated to a given piece of music (a musical score, musicians' gestures, lyrics, user-generated metadata, etc.). This has recently led researchers to apprehend music through its various facets, giving rise to "multimodal music analysis" studies. This article gives a synthetic overview of methods...

  15. Carbon Dioxide Information Analysis Center: FY 1991 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Stoss, F.W.

    1992-06-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications - numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books - in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1990 to September 30, 1991. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, factsheets, specialty publications, and reprints is provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  17. Carbon Dioxide Information Analysis Center: FY 1992 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Stoss, F.W. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center

    1993-03-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications - numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books - in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1991 to September 30, 1992. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  18. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  19. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao Wang; Yajuan Zhao; Jing Zhang; Ping Zhao

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate, and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, countries' or regions' technological capabilities, and the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization to spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a

  20. Information-Theoretical Complexity Analysis of Selected Elementary Chemical Reactions

    Science.gov (United States)

    Molina-Espíritu, M.; Esquivel, R. O.; Dehesa, J. S.

    We investigate the complexity of selected elementary chemical reactions (namely, the hydrogenic-abstraction reaction and the identity SN2 exchange reaction) by means of the following single and composite information-theoretic measures: disequilibrium (D), exponential entropy (L), Fisher information (I), power entropy (J), I-D, D-L and I-J planes, and Fisher-Shannon (FS) and Lopez-Mancini-Calbet (LMC) shape complexities. These quantities, which are functionals of the one-particle density, are computed in both position (r) and momentum (p) spaces. The analysis revealed that the chemically significant regions of these reactions can be identified through most of the single information-theoretic measures and the two-component planes: not only the ones commonly revealed by the energy, such as the reactant/product (R/P) and the transition state (TS), but also those not present in the energy profile, such as the bond cleavage energy region (BCER), the bond breaking/forming regions (B-B/F) and the charge transfer process (CT). The analysis of the complexities shows that the energy profile of the abstraction reaction bears the same information-theoretical features as the LMC and FS measures; the identity SN2 exchange reaction, however, does not show simple behavior with respect to the LMC and FS measures. Most of the chemical features of interest (BCER, B-B/F and CT) are only revealed when particular information-theoretic aspects of localizability (L or J), uniformity (D) and disorder (I) are considered.
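
    For reference, the usual definitions of the two composite measures named here, in terms of the entropy S, disequilibrium D, and Fisher information I of the one-particle density (conventions for the power entropy J vary slightly across papers; the three-dimensional form is shown):

    ```latex
    % LMC shape complexity: disequilibrium times exponential entropy
    C_{\mathrm{LMC}} = D \cdot L, \qquad L = e^{S}
    % Fisher-Shannon complexity: Fisher information times power entropy (3D)
    C_{\mathrm{FS}} = I \cdot J, \qquad J = \frac{1}{2\pi e}\, e^{2S/3}
    ```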

  1. Clinical decision support tools: analysis of online drug information databases

    Directory of Open Access Journals (Sweden)

    Seamon Matthew J

    2007-03-01

    Full Text Available Abstract. Background: Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information databases. Methods: Five commercially available and two freely available online drug information databases were evaluated according to scope (presence or absence of an answer), completeness (the comprehensiveness of the answers), and ease of use. Additionally, a composite score integrating all three criteria was utilized. Fifteen weighted categories comprising 158 questions were used to conduct the analysis. Descriptive statistics and Chi-square tests were used to summarize the evaluation components and make comparisons between databases. Scheffe's multiple comparison procedure was used to determine statistically different scope and completeness scores. The composite score was subjected to sensitivity analysis to investigate the effect of the choice of percentages for scope and completeness. Results: The rankings for the databases from highest to lowest, based on composite scores, were Clinical Pharmacology, Micromedex, Lexi-Comp Online, Facts & Comparisons 4.0, Epocrates Online Premium, RxList.com, and Epocrates Online Free. Differences in scope produced three statistical groupings, with Group 1 (best performers) being Clinical Pharmacology, Micromedex, Facts & Comparisons 4.0 and Lexi-Comp Online; Group 2: Epocrates Premium and RxList.com; and Group 3: Epocrates Free. Conclusion: Online drug information databases, which belong to clinical decision support, vary in their ability to answer questions across a range of categories.

  2. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve a marketing problem a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a number of variables that could be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
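
    A minimal PCA run of the kind the paper walks through: standardize the observed variables, extract components, and inspect the explained variance (synthetic collinear data):

    ```python
    # Eight collinear survey items driven by two latent factors, reduced to
    # two uncorrelated principal components.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 2))               # two underlying factors
    X = latent @ rng.normal(size=(2, 8)) + 0.3 * rng.normal(size=(200, 8))

    Z = StandardScaler().fit_transform(X)            # standardize first
    pca = PCA(n_components=2).fit(Z)
    print(pca.explained_variance_ratio_)             # variance per component
    ```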

  3. Information Problem Solving: Analysis of a Complex Cognitive Skill

    NARCIS (Netherlands)

    S. Brand-Gruwel; I. Wopereis; Y. Vermetten

    2004-01-01

    In (higher) education students are often faced with information problems: tasks or assignments which require the student to identify information needs, locate corresponding information sources, extract and organize relevant information from each source, and synthesize information from a

  4. Analysis of Information Leakage in Quantum Key Agreement

    Institute of Scientific and Technical Information of China (English)

    LIU Sheng-li; ZHENG Dong; CHENG Ke-fei

    2006-01-01

    Quantum key agreement is one of the approaches to unconditional security. Since the 1980s, different protocols for quantum key agreement have been proposed and analyzed. A new quantum key agreement protocol was presented in 2004, and a detailed analysis of the protocol was given. The possible game played between legitimate users and the enemy was described: sitting in the middle, an adversary can mount a "man-in-the-middle" attack to cheat the sender and receiver. The information leaked to the adversary is essential to the length of the final quantum secret key. It was shown how to determine the amount of information leaked to the enemy and the amount of uncertainty between the legitimate sender and receiver.

  5. Environmental Quality Information Analysis Center (EQIAC) operating procedures handbook

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, T.E. (Florida Univ., Gainesville, FL (United States)); Das, S. (Oak Ridge National Lab., TN (United States))

    1992-08-01

    The Operating Procedures Handbook of the Environmental Quality Information Analysis Center (EQIAC) is intended to be kept current as EQIAC develops and evolves. Its purpose is to provide a comprehensive guide to the mission, infrastructure, functions, and operational procedures of EQIAC. The handbook is a training tool for new personnel and a reference manual for existing personnel. The handbook will be distributed throughout EQIAC and maintained in binders containing current dated editions of the individual sections. The handbook will be revised at least annually to reflect the current structure and operational procedures of EQIAC. The EQIAC provides information on environmental issues such as compliance, restoration, and environmental monitoring to the Air Force and DOD contractors.

  6. Use of historical information in extreme storm surges frequency analysis

    Science.gov (United States)

    Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent

    2013-04-01

    The prevention of storm surge flood risks is critical for the protection and design of coastal facilities to very low probabilities of failure. Effective protection requires the use of a statistical analysis approach with a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation to estimate storm surges corresponding to high return periods is seriously contaminated by sampling and model uncertainty if data are available for only a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data from before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in the statistical modeling of historical information is that a perception threshold exists and that, during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high water marks left by extreme surges on coastal areas. It can also be retrieved from archives, old books, the earliest newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula for computing empirical probabilities based on systematic and historical data is used in this paper. The objective of the present work is to examine the potential gain in estimation accuracy with the
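
    A deliberately simplified sketch of the plotting-position idea: events above the perception threshold from the gauged and historical periods are ranked together over the combined record length (the formulas actually used for historical information, of the Hirsch-Stedinger type, weight the two data sources more carefully):

    ```python
    # Weibull-type empirical exceedance probabilities over a combined record:
    # n gauged years plus h historical years known to contain the archive
    # events above the perception threshold. All values are made up.
    import numpy as np

    systematic = np.array([1.1, 0.9, 1.6, 1.3, 0.8])   # gauged surges (m)
    historical = np.array([2.1, 1.9])                  # archive events (m)
    n_years_total = 50 + systematic.size               # historical + gauged span

    events = np.sort(np.concatenate([systematic, historical]))[::-1]
    p_exceed = np.arange(1, events.size + 1) / (n_years_total + 1)
    for v, p in zip(events, p_exceed):
        print(f"surge {v:.1f} m: return period {1 / p:.0f} years")
    ```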

  7. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, H; Astakhov, S A; Grassberger, P; St\\"ogbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of 'independent' component analysis (ICA), some of which are based on crude approximations of MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest-neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
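
    The use of MI as a post-hoc quality check on ICA output can be sketched with scikit-learn, whose mutual_info_regression uses a k-nearest-neighbor estimator in the spirit of the estimator the authors adopt:

    ```python
    # Unmix a synthetic two-channel mixture with FastICA, then estimate the
    # residual MI between the recovered components (ideally near zero).
    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.feature_selection import mutual_info_regression

    rng = np.random.default_rng(0)
    s = np.c_[rng.laplace(size=2000), rng.uniform(-1, 1, size=2000)]  # sources
    x = s @ np.array([[1.0, 0.5], [0.4, 1.0]])                        # mixtures

    y = FastICA(n_components=2, random_state=0).fit_transform(x)
    mi = mutual_info_regression(y[:, [0]], y[:, 1])
    print("residual MI between components:", mi[0])
    ```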

  8. Probabilistic analysis of the human transcriptome with side information

    CERN Document Server

    Lahti, Leo

    2011-01-01

    Understanding functional organization of genetic information is a major challenge in modern biology. Following the initial publication of the human genome sequence in 2001, advances in high-throughput measurement technologies and efficient sharing of research material through community databases have opened up new views to the study of living organisms and the structure of life. In this thesis, novel computational strategies have been developed to investigate a key functional layer of genetic information, the human transcriptome, which regulates the function of living cells through protein synthesis. The key contributions of the thesis are general exploratory tools for high-throughput data analysis that have provided new insights to cell-biological networks, cancer mechanisms and other aspects of genome function. A central challenge in functional genomics is that high-dimensional genomic observations are associated with high levels of complex and largely unknown sources of variation. By combining statistical ...

  9. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and that conventional frequency analysis without considering these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate informed low flow frequency analysis and the divided and combined method are employed to account for the impacts from related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, have been tested to find the best fit, in which the local likelihood method is used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which accounts for the dynamic features relevant to reservoir management and provides more accurate and reliable designs for infrastructure and water supply.
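
    As a hedged illustration of the stationary baseline step in such an analysis, the sketch below fits the six candidate distributions named above by maximum likelihood and ranks them by AIC; the paper itself uses local likelihood with climate covariates, which this simplified sketch does not attempt, and all names here are illustrative.

        import numpy as np
        from scipy import stats

        CANDIDATES = {
            "GEV": stats.genextreme, "Pearson III": stats.pearson3,
            "Gumbel": stats.gumbel_r, "Gamma": stats.gamma,
            "Lognormal": stats.lognorm, "Weibull": stats.weibull_min,
        }

        def best_fit(annual_low_flows):
            """Fit each candidate by maximum likelihood and rank by AIC."""
            scores = {}
            for name, dist in CANDIDATES.items():
                params = dist.fit(annual_low_flows)
                loglik = np.sum(dist.logpdf(annual_low_flows, *params))
                scores[name] = (2 * len(params) - 2 * loglik, params)
            best = min(scores, key=lambda n: scores[n][0])
            return best, scores[best][1]

        def design_low_flow(dist, params, T):
            # for low flows, the T-year event is the quantile with
            # non-exceedance probability 1/T
            return dist.ppf(1.0 / T, *params)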

  10. Information findability: An informal study to explore options for improving information findability for the systems analysis group

    Energy Technology Data Exchange (ETDEWEB)

    Stoecker, Nora Kathleen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats, contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge: how can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  11. Analysis of Internet Information on Lateral Lumbar Interbody Fusion.

    Science.gov (United States)

    Belayneh, Rebekah; Mesfin, Addisu

    2016-07-01

    Lateral lumbar interbody fusion (LLIF) is a surgical technique that is being increasingly used. The authors' objective was to examine information on the Internet pertaining to the LLIF technique. An analysis was conducted of publicly accessible websites pertaining to LLIF. The following search engines were used: Google (www.google.com), Bing (www.bing.com), and Yahoo (www.yahoo.com). DuckDuckGo (www.duckduckgo.com) was an additional search engine used due to its emphasis on generating accurate and consistent results while protecting searchers' privacy and reducing advertisements. The top 35 websites providing information on LLIF from each of the 4 search engines were identified, for a total of 140 websites evaluated. Each website was categorized based on authorship (academic, private, medical industry, insurance company, other) and content of information. Using the search term lateral lumbar interbody fusion, 174,000 Google results, 112,000 Yahoo results, and 112,000 Bing results were obtained. DuckDuckGo does not display the number of results found for a search. From the 140 websites collected across the search engines, 78 unique websites were identified. Websites were authored by a private medical group in 46.2% of the cases, an academic medical group in 26.9% of the cases, and the biomedical industry in 5.1% of the cases. Sixty-eight percent of websites reported indications, and 24.4% reported contraindications. Benefits of LLIF were reported by 69.2% of websites. Thirty-six percent of websites reported complications of LLIF. Overall, the quality of information regarding LLIF on the Internet is poor. Spine surgeons and spine societies can assist in improving the quality of the information on the Internet regarding LLIF. [Orthopedics. 2016; 39(4):e701-e707.].

  12. A Brief Analysis to Informal Chatting on the Phone

    Institute of Scientific and Technical Information of China (English)

    金晓瑜

    2005-01-01

    Analysis of telephone chat is often neglected by analysts because it is very common and very colloquial. In my opinion, however, it differs from everyday spoken language in some respects. For example, it has two parties connected by a line, and the speakers talk in different places and may be doing different things. So in analyzing the discourse, such factors as field, tenor, and mode, skipping and coherence, and reference and presupposition in studying the context are to be emphasized. The third part of this paper gives an example to show one method of analyzing informal chat on the phone. The conclusion also mentions some practical applications of analyzing this style of discourse.

  13. Analysis of system trustworthiness based on information flow noninterference theory

    Institute of Scientific and Technical Information of China (English)

    Xiangying Kong; Yanhui Chen; Yi Zhuang

    2015-01-01

    The trustworthiness analysis and evaluation are the bases of the trust chain transfer. In this paper the formal method of trustworthiness analysis of a system based on the noninterference (NI) theory of the information flow is studied. Firstly, existing methods cannot analyze the impact of the system states on the trustworthiness of software during the process of trust chain transfer. To solve this problem, the impact of the system state on trustworthiness of software is investigated, the run-time mutual interference behavior of software entities is described and an interference model of the access control automaton of a system is established. Secondly, based on the intransitive noninterference (INI) theory, a formal analytic method of trustworthiness for trust chain transfer is proposed, providing a theoretical basis for the analysis of dynamic trustworthiness of software during the trust chain transfer process. Thirdly, a prototype system with dynamic trustworthiness on a platform with dual core architecture is constructed and a verification algorithm of the system trustworthiness is provided. Finally, the monitor hypothesis is extended to the dynamic monitor hypothesis, and a theorem of static judgment rule of system trustworthiness is provided, which is useful to prove dynamic trustworthiness of a system at the beginning of system construction. Compared with previous work in this field, this research proposes not only a formal analytic method for the determination of system trustworthiness, but also a modeling method and an analysis algorithm that are feasible for practical implementation.

  14. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    This study evaluates the concept of success in project management that is applicable to the IT universe, from the classical theory associated with the techniques of project management. It applies the theoretical analysis associated with the context of information technology in enterprises, as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature developed in the first part of the study, four propositions were prepared which formed the basis for the development of field research with three large companies that develop Information Technology projects. The methodology used in the study was a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature fits the management environment of IT projects. The study also showed that it is possible to create a model of standard IT projects in order to replicate it in future derivative projects, which depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in its merger into the company culture.

  15. ASPECTS OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION AND NONFINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Popa Dorina

    2012-07-01

    The main objective of our work is the conceptual description of the performance of an economic entity in financial and non-financial terms. During our approach we have shown that it is not sufficient to analyze the performance of a company only in financial terms, as the performance reflected in financial reports sometimes does not coincide with the real situation of the company. In this case the cause of the differences has to be found among the influences of other non-financial information. Mainly following the great financial scandals, trust in the reliability of financial-accounting information has eroded strongly, and thus business performance measurement cannot be the exclusive domain of the criteria of financial analysis, but must be done in a comprehensive way, based both on financial criteria and on non-financial ones (intangible assets, the social responsibility of the company). The use of non-financial criteria has led to the emergence of new types of analysis, namely extra-financial analysis. Thus, enterprise performance is not subject only to the material and financial resources managed and controlled by the entities, but to the complex of intangible resources that companies have created by their previous work. Extra-financial analysis has to face difficulties arising mainly from the existence of non-financial indicators that are very little normalized, and from the lack of uniformity of practice in the field. In determining the extra-financial performance indicators one has to observe the manifestation and the evolution of the company's relationships with its partners and environment. In order to analyze performance measurement by financial and non-financial indicators we chose as a case study a company in Bihor county, listed on the Bucharest Stock Exchange. The results of our study show that Romanian entities are increasingly interested in measuring performance, and after the extra-financial analysis we concluded that the company had set ...

  16. An Information Theory Analysis of Spatial Decisions in Cognitive Development

    Directory of Open Access Journals (Sweden)

    Nicole M Scott

    2015-02-01

    Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages five to ten years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of cognitive entropy were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured chunking of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework.

  17. An information theory analysis of spatial decisions in cognitive development.

    Science.gov (United States)

    Scott, Nicole M; Sera, Maria D; Georgopoulos, Apostolos P

    2015-01-01

    Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages 5 to 10 years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of "cognitive entropy" were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured "chunking" of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework.
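
    A small sketch of how the two cluster-based entropy measures described above might be computed, assuming hierarchical (Ward) clustering of per-task decision profiles; the paper's precise definitions may differ, so treat the names and formulas as illustrative.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def cognitive_entropies(task_profiles, n_clusters):
            """Entropy (1) from the number of clusters formed and (2) from
            the distribution of tasks across those clusters, in bits."""
            labels = fcluster(linkage(task_profiles, method="ward"),
                              t=n_clusters, criterion="maxclust")
            counts = np.bincount(labels)[1:].astype(float)  # labels start at 1
            p = counts[counts > 0] / counts.sum()
            h_count = np.log2(len(p))              # measure (1): cluster count
            h_spread = -np.sum(p * np.log2(p))     # measure (2): task spread
            return h_count, h_spread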

  18. Information Infrastructure Development Recommendations through Analysis of Current Information Technology Infrastructure, Plans and Policies

    Science.gov (United States)

    2007-11-02

    ... information society, and the military influence on information and communication technologies development; a review of the policy, objectives, concepts and methods, and the resources outlined in the Information Technology Management Strategic Plan, the Defense Information Infrastructure Master Plan, and the Global and National Information Infrastructure ...

  19. From paragraph to graph: latent semantic analysis for information visualization.

    Science.gov (United States)

    Landauer, Thomas K; Laham, Darrell; Derr, Marcia

    2004-04-01

    Most techniques for relating textual information rely on intellectually created links such as author-chosen keywords and titles, authority indexing terms, or bibliographic citations. Similarity of the semantic content of whole documents, rather than just titles, abstracts, or overlap of keywords, offers an attractive alternative. Latent semantic analysis provides an effective dimension reduction method for the purpose that reflects synonymy and the sense of arbitrary word combinations. However, latent semantic analysis correlations with human text-to-text similarity judgments are often empirically highest at approximately 300 dimensions. Thus, two- or three-dimensional visualizations are severely limited in what they can show, and the first and/or second automatically discovered principal component, or any three such for that matter, rarely capture all of the relations that might be of interest. It is our conjecture that linguistic meaning is intrinsically and irreducibly very high dimensional. Thus, some method to explore a high dimensional similarity space is needed. But the 2.7 × 10^7 projections and infinite rotations of, for example, a 300-dimensional pattern are impossible to examine. We suggest, however, that the use of a high dimensional dynamic viewer with an effective projection pursuit routine and user control, coupled with the exquisite abilities of the human visual system to extract information about objects and from moving patterns, can often succeed in discovering multiple revealing views that are missed by current computational algorithms. We show some examples of the use of latent semantic analysis to support such visualizations and offer views on future needs.
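
    A minimal sketch of the LSA pipeline the abstract builds on (term weighting, truncated SVD to a few hundred dimensions, cosine similarities for visualization), assuming scikit-learn; the corpus and the dimension cap are placeholders.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["first placeholder document", "second placeholder document"]
        X = TfidfVectorizer().fit_transform(docs)       # term-by-document weights
        k = min(300, X.shape[1] - 1)                    # ~300 dims where the corpus allows
        doc_vecs = TruncatedSVD(n_components=k).fit_transform(X)
        sims = cosine_similarity(doc_vecs)              # text-to-text similarities to visualize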

  20. 76 FR 37371 - Agency Information Collection: Comment Request for National Gap Analysis Program Evaluation

    Science.gov (United States)

    2011-06-27

    U.S. Geological Survey Agency Information Collection: Comment Request for National Gap Analysis Program Evaluation. AGENCY: United States Geological Survey (USGS), Interior. ACTION: Notice of a request (ICR) for a new collection of information: National Gap Analysis Program Evaluation. ...

  1. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  2. Economic Efficiency Analysis for Information Technology in Developing Countries

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2009-01-01

    Problem statement: The introduction of Information Technology (IT) to government institutions in developing countries bears a great deal of risk of failure. The lack of qualified personnel, lack of financial support and lack of planning and proper justification are just a few of the causes of project failure. The study presented here focused on the justification issue of IT projects through the application of Cost Benefit Analysis (CBA) as part of a comprehensive Economic Efficiency Analysis (EEA) of IT projects, thus providing management with a decision-making tool which highlights existing and future problems and reduces the risk of failure. Approach: Cost-Benefit Analysis (CBA) based on Economic Efficiency Analysis (EEA) was performed on selected IT projects from ministries and key institutions in the government of Jordan, using a well-established approach employed by the Federal Government of Germany (the KBSt approach). The approach was then modified and refined to suit the needs of developing countries, so that it captured all the relevant elements of cost and benefits both quantitatively and qualitatively and included a set of guidelines for data collection strategy. Results: When IT projects were evaluated using CBA, most cases yielded a negative Net Present Value (NPV), even though some cases showed some reduction in operation cost starting from the third year of project life. However, when the CBA was applied as part of a comprehensive EEA by introducing qualitative aspects and urgency criteria, proper justification for new projects became feasible. Conclusion: The modified EEA represented a systematic approach which was well suited for the government of Jordan as a developing country. This approach was capable of dealing with the justification issue, the evaluation of existing systems and the urgency of replacing legacy systems. This study explored many of the challenges and inherited problems existing in the public sectors of developing ...

  3. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults' collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  4. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    Science.gov (United States)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

    In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on the use of subset selection techniques developed in the linear algebra and statistics community to study linear dependencies in high-dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications for band selection in sensor design and for the number and location of levels in the analysis process.

  5. Applications of Geographic Information System (GIS) analysis of Lake Uluabat.

    Science.gov (United States)

    Hacısalihoğlu, Saadet; Karaer, Feza; Katip, Aslıhan

    2016-06-01

    Lake Uluabat is one of the most important wetlands in Turkey because of its rich biodiversity, lying on a migratory bird route with almost all its shores covered by submerged plants. The lake has been protected by the Ramsar Convention since 1998. However, the lake is threatened by natural and anthropogenic stressors as a consequence of its location. Geographic Information System (GIS) analysis is a tool that has been widely used in recent years, especially for water quality management. This study aimed to investigate the water quality and determine the most polluted points using GIS analysis of the lake. Temperature, pH, dissolved oxygen, chemical oxygen demand, Kjeldahl nitrogen, total phosphorus, chlorophyll-a, arsenic, boron, iron, and manganese were monitored monthly from June 2008 to May 2009, with samples taken from 8 points in the lake. The effect of pH and the relations of temperature and Chl-a with other water quality parameters and metals were found to be statistically significant. Data were mapped using ArcGIS 9.1 software and were assessed according to the Turkish Water Pollution Control Regulations (TWPCR). The research also focused on classifying and mapping the water quality in the lake by using the spatial analysis functions of GIS. As a result, it was determined that Lake Uluabat belonged to the 4th class, i.e., highly polluted water, including any water of lower quality. A remarkable portion of the pollution in the water basin was attributed to domestic wastewater discharges, industrial and agricultural activities, and mining.

  6. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goal of having both the customer and the analyst completely understand the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  7. Longitudinal analysis of new information types in clinical notes.

    Science.gov (United States)

    Zhang, Rui; Pakhomov, Serguei; Melton, Genevieve B

    2014-01-01

    It is increasingly recognized that redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous, significant, and may negatively impact the secondary use of these notes for research and patient care. We investigated several automated methods to identify redundant versus relevant new information in clinical reports. These methods may provide a valuable approach to extract clinically pertinent information and further improve the accuracy of clinical information extraction systems. In this study, we used UMLS semantic types to extract several types of new information, including problems, medications, and laboratory information. Automatically identified new information highly correlated with manual reference standard annotations. Methods to identify different types of new information can potentially help to build up more robust information extraction systems for clinical researchers as well as aid clinicians and researchers in navigating clinical notes more effectively and quickly identify information pertaining to changes in health states.

  8. ANALYSIS ON AGRICULTURAL INFORMATION DEMAND --Case of Jilin Province

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui-min; JIANG Hui-ming

    2005-01-01

    With the rapid development of agricultural informatization in the world, the demand for agricultural information has become a focus in the international agriculture and information fields. Based on the investigation, this paper presents four characteristics of the demand for agricultural information in China: regionality, seasonality, great potential demand, and variation in kind and level. The factors influencing the demand for agricultural information were analyzed by the Ordinary Least Squares (OLS) method. The result shows that, of all factors influencing agricultural information demand, the most important one is economy; the second is the facility of information transmission; and the knowledge and education of the user, the credibility of the agricultural information service system and the production situation follow. Taking Jilin Province as an example, this article also elaborates the status of agricultural information demand, deduces a regression model of agricultural information demand and verifies it with the survey in rural Jilin.

  9. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    The current problem for managers in logistic and trading companies is the task of improving operational business performance and developing the logistics support of sales. The development of logistics sales supposes the development and implementation of a set of works for the development of the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculation of the required number of loading-unloading places, development of storage structures, development of pre-sales preparation zones, development of specifications of storage types, selection of loading-unloading equipment, detailed planning of the warehouse logistics system, creation of architectural-planning decisions, selection of information-processing equipment, etc. The currently used ERP and WMS systems do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products taking into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, seems to be a current task. In this paper we suggest a system for the statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales. The system is based on methods of statistical data processing, methods of assessment and prediction of logistics performance, methods for the determination and calculation of the data required for the registration, storage and processing of metal products, as well as methods for planning reconstruction and development ...

  10. Tracing cattle breeds with principal components analysis ancestry informative SNPs.

    Directory of Open Access Journals (Sweden)

    Jamey Lewis

    The recent release of the Bovine HapMap dataset represents the most detailed survey of bovine genetic diversity to date, providing an important resource for the design and development of livestock production. We studied this dataset, comprising more than 30,000 Single Nucleotide Polymorphisms (SNPs) for 19 breeds (13 taurine, three zebu, and three hybrid breeds), seeking to identify small panels of genetic markers that can be used to trace the breed of unknown cattle samples. Taking advantage of the power of Principal Components Analysis and algorithms that we have recently described for the selection of Ancestry Informative Markers from genome-wide datasets, we present a decision tree which can be used to accurately infer the origin of individual cattle. In doing so, we present a thorough examination of population genetic structure in modern bovine breeds. Performing extensive cross-validation experiments, we demonstrate that 250-500 carefully selected SNPs suffice in order to achieve close to 100% prediction accuracy of individual ancestry when this particular set of 19 breeds is considered. Our methods, coupled with the dense genotypic data that is becoming increasingly available, have the potential to become a valuable tool and have considerable impact in worldwide livestock production. They can be used to inform the design of studies of the genetic basis of economically important traits in cattle, as well as breeding programs and efforts to conserve biodiversity. Furthermore, the SNPs that we have identified can provide a reliable solution for the traceability of breed-specific branded products.
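
    As a hedged illustration of the PCA-based tracing idea (not the paper's actual decision tree over selected AIM panels), the sketch below projects genotypes onto principal components and assigns breed by nearest neighbours in that space; names and parameters are illustrative.

        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        def train_breed_tracer(genotypes, breeds, n_components=10, k=5):
            """genotypes: (individuals x SNPs) matrix coded 0/1/2."""
            pca = PCA(n_components=n_components).fit(genotypes)
            clf = KNeighborsClassifier(n_neighbors=k).fit(
                pca.transform(genotypes), breeds)
            return pca, clf

        def predict_breed(pca, clf, new_genotypes):
            return clf.predict(pca.transform(new_genotypes))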

  11. Situated student learning and spatial informational analysis for environmental problems

    Science.gov (United States)

    Olsen, Timothy Paul

    Ninth and tenth grade high school Biology student research teams used spatial information analysis tools to site a prairie restoration plot on a 55 acre campus during a four-week environment unit. Students made use of innovative technological practices by applying geographic information systems (GIS) approaches to solving environmental and land use problems. Student learning was facilitated by starting with the students' initial conceptions of computing, local landscape and biological environment, and then by guiding them through a problem-based science project process. The project curriculum was framed by the perspective of legitimate peripheral participation (Lave & Wenger, 1991) where students were provided with learning opportunities designed to allow them to act like GIS practitioners. Sociocultural lenses for learning were employed to create accounts of human mental processes that recognize the essential relationship between these processes and their cultural, historical, and institutional settings (Jacob, 1997; Wertsch, 1991). This research investigated how student groups' meaning-making actions were mediated by GIS tools on the periphery of a scientific community of practice. Research observations focused on supporting interpretations of learners' socially constructed actions and the iterative building of assertions from multiple sources. These included the artifacts students produced, the tools they used, the cultural contexts that constrained their activity, and how people begin to adopt ways of speaking (speech genres) of the referent community to negotiate meanings and roles. Students gathered field observations and interpreted attributes of landscape entities from the GIS data to advocate for an environmental decision. However, even while gaining proficiencies with GIS tools, most students did not begin to appropriate roles from the GIS community of practice. Students continued to negotiate their project actions simply as school exercises motivated by

  12. Time course of information representation of macaque AIP neurons in hand manipulation task revealed by information analysis.

    Science.gov (United States)

    Sakaguchi, Yutaka; Ishida, Fumihiko; Shimizu, Takashi; Murata, Akira

    2010-12-01

    We used mutual information analysis of neuronal activity in the macaque anterior intraparietal area (AIP) to examine information processing during a hand manipulation task. The task was to reach-to-grasp a three-dimensional (3D) object after presentation of a go signal. Mutual information was calculated between the spike counts of individual neurons in 50-ms-wide time bins and six unique shape classifications or 15 one-versus-one classifications of these shapes. The spatiotemporal distribution of mutual information was visualized as a two-dimensional image ("information map") to better observe global profiles of information representation. In addition, a nonnegative matrix factorization technique was applied for extracting its structure. Our major finding was that the time course of mutual information differed significantly according to different classes of task-related neurons. This strongly suggests that different classes of neurons were engaged in different information processing stages in executing the hand manipulation task. On the other hand, our analysis revealed the heterogeneous nature of information representation of AIP neurons. For example, "information latency" (or information onset) varied among individual neurons even in the same neuron class and the same shape classification. Further, some neurons changed "information preference" (i.e., shape classification with the largest amount of information) across different task periods. These suggest that neurons encode different information in the different task periods. Taking the present result together with previous findings, we used a Gantt chart to propose a hypothetical scheme of the dynamic interactions between different types of AIP neurons.
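
    A minimal sketch of the kind of bin-wise mutual information computation the abstract describes, assuming discrete spike counts and a plug-in MI estimator from scikit-learn; the paper's exact estimator and any bias corrections are not specified in the record, and the function name is illustrative.

        import numpy as np
        from sklearn.metrics import mutual_info_score

        def mi_map(spike_counts, shape_labels):
            """MI (bits) between spike counts and shape class, per 50 ms bin.

            spike_counts: (n_trials, n_bins) integer array for one neuron
            shape_labels: (n_trials,) array of shape classifications
            Returns one row of the 'information map'."""
            n_bins = spike_counts.shape[1]
            mi = np.empty(n_bins)
            for b in range(n_bins):
                # plug-in estimate from the contingency table, nats -> bits
                mi[b] = mutual_info_score(shape_labels, spike_counts[:, b]) / np.log(2)
            return mi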

  13. An Analysis of the Information Behaviour of Geography Teachers in a Developing African Country–Lesotho

    Directory of Open Access Journals (Sweden)

    Constance BITSO

    2012-08-01

    Full Text Available Information behaviour studies have the potential to inform the design of effective information services that incorporate the information needs, information-seeking and preferences for information sources of target users; hence a doctoral study was conducted on the information behaviour of geography teachers in Lesotho with the aim of guiding the design and implementation of an information service model for these teachers. This paper focuses on the analysis of the information behaviour of geography teachers in Lesotho as a contribution of original knowledge on geography teachers’ information behaviour. The analysis established the information behaviour of geography teachers using the information behaviour concept that encompasses information needs, information-seeking and information sources. Data were collected and analyzed through focus group discussions and conceptual content analysis respectively.The analysis reveals that these geography teachers need current and accurate information covering a variety of aspects in teaching and learning, such as content, pedagogy, classroom management and learners’ assessment. Owing to the increasing number of orphans in schools as a result of the HIV and AIDS pandemic, most teachers expressed the need for information on social assistance for orphans and vulnerable children. Recommendations include information literacy training for teachers and access to the Internet in schools, including the use of open access journals on the Internet by the teachers.

  14. Catchment delineation and morphometric analysis using geographical information system.

    Science.gov (United States)

    Kumar, Manoj; Kumar, Rohitashw; Singh, P K; Singh, Manjeet; Yadav, K K; Mittal, H K

    2015-01-01

    The geographical information system (GIS) has emerged as an efficient tool in the delineation of drainage patterns for watershed planning and management. The morphometric parameters of basins can address linear, areal and relief aspects. The study deals with the integrated watershed management of the Baliya micro-watershed, located in the Udaipur district of Rajasthan, India. Morphometric analysis is an important aspect of hydrological investigation and is inevitable in the development and management of drainage basins. The determination of linear, areal and relief parameters indicates fairly good significance. The low value of the bifurcation ratio (4.19) revealed that the drainage pattern has not been distorted by structural disturbance. The high value of the elongation ratio (0.68) compared to the circulatory ratio (0.27) indicates an elongated shape of the watershed. The high values of drainage density (5.39 km/km^2) and stream frequency (12.32) show that the region has impermeable subsoil material under poor vegetative cover with a low relief factor. The morphometric parameters of relief ratio (0.041) and relative relief (0.99%), together with the dendritic drainage pattern, show that the watershed can be treated using GIS techniques with a view to selecting soil and water conservation measures and water harvesting.
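
    The indices quoted above follow standard textbook definitions (Horton, Schumm, Miller); a short sketch of those formulas, with illustrative variable names, is given below.

        import math

        def morphometric_indices(area_km2, perimeter_km, basin_length_km,
                                 total_stream_length_km, n_streams,
                                 n_streams_order_u, n_streams_order_u_plus_1):
            dd = total_stream_length_km / area_km2              # drainage density (km/km^2)
            fs = n_streams / area_km2                           # stream frequency
            rb = n_streams_order_u / n_streams_order_u_plus_1   # bifurcation ratio
            rc = 4 * math.pi * area_km2 / perimeter_km ** 2     # circulatory ratio
            re = 2 * math.sqrt(area_km2 / math.pi) / basin_length_km  # elongation ratio
            return dd, fs, rb, rc, re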

  15. Enhancing multilingual latent semantic analysis with term alignment information.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.; Bader, Brett William

    2008-08-01

    Latent Semantic Analysis (LSA) is based on the Singular Value Decomposition (SVD) of a term-by-document matrix for identifying relationships among terms and documents from co-occurrence patterns. Among the multiple ways of computing the SVD of a rectangular matrix X, one approach is to compute the eigenvalue decomposition (EVD) of a square 2 × 2 block composite matrix consisting of four blocks, with X and X^T in the off-diagonal blocks and zero matrices in the diagonal blocks. We point out that significant value can be added to LSA by filling in some of the values in the diagonal blocks (corresponding to explicit term-to-term or document-to-document associations) and computing a term-by-concept matrix from the EVD. For the case of multilingual LSA, we incorporate information on cross-language term alignments of the same sort used in Statistical Machine Translation (SMT). Since all elements of the proposed EVD-based approach can rely entirely on lexical statistics, hardly any price is paid for the improved empirical results. In particular, the approach, like LSA or SMT, can still be generalized to virtually any language(s); computation of the EVD takes similar resources to that of the SVD since all the blocks are sparse; and the results of the EVD are just as economical as those of the SVD.
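
    A small numpy sketch of the composite-matrix EVD described above, assuming a symmetric term-to-term block S (e.g., cross-language alignment statistics) and a zero document block; with S = 0 this reduces to the SVD-equivalent setup, and all names here are illustrative.

        import numpy as np

        def composite_evd(X, S=None, k=50):
            """EVD of [[S, X], [X^T, 0]] for an m-terms x n-docs matrix X."""
            m, n = X.shape
            S = np.zeros((m, m)) if S is None else S    # term-to-term associations
            B = np.block([[S, X], [X.T, np.zeros((n, n))]])
            w, V = np.linalg.eigh(B)                    # symmetric, so the EVD is real
            idx = np.argsort(-np.abs(w))[:k]            # keep the k strongest concepts
            return w[idx], V[:m, idx]                   # eigenvalues, term-by-concept matrix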

  16. Understanding Informal Group Learning in Online Communities through Discourse Analysis

    Science.gov (United States)

    Ziegler, Mary F.; Paulus, Trena; Woodside, Marianne

    2014-01-01

    Since informal learning occurs outside of formal learning environments, describing informal learning and how it takes place can be a challenge for researchers. Past studies have typically oriented to informal learning as an individual, reflective process that can best be understood through the learners' retrospective accounts about their…

  17. The informed society: an analysis of the public's information-seeking behavior regarding coastal flood risks

    OpenAIRE

    Kellens, W.; Zaalberg, R.; P. De Maeyer

    2012-01-01

    Recent flood risk management puts an increasing emphasis on the public's risk perception and its preferences. It is now widely recognized that a better knowledge of the public's awareness and concern about risks is of vital importance to outline effective risk communication strategies. Models such as Risk Information Seeking and Processing address this evolution by considering the public's needs and its information-seeking behavior with regard to risk information. This study builds upon earli...

  18. Analysis of Information Quality in event triggered Smart Grid Control

    DEFF Research Database (Denmark)

    Kristensen, Thomas le Fevre; Olsen, Rasmus Løvenstein; Rasmussen, Jakob Gulddahl

    2015-01-01

    The integration of renewable energy sources into the power grid requires added control intelligence, which imposes new communication requirements onto the future power grid. Since large-scale implementation of new communication infrastructure is infeasible, we consider methods of increasing the dependability of existing networks. We develop models for network delays and information dynamics, and use these to model information quality for three given information access schemes in an event-triggered control scenario. We analyse the impact of model parameters, and show how the optimal choice of information access scheme depends on network conditions as well as trade-offs between information quality, network resources and control reactivity.

  19. ANALYSIS OF THE COHERENCE OF LOGISTIC SYSTEMS FROM URBAN ENVIROMNENTS USING THE INFORMATIONAL INDICES

    OpenAIRE

    Gheorghe BASANU; Victor TELEASA

    2010-01-01

    The paper introduces a series of analysis models of the flows of materials and products existing inside a logistic system, elaborated according to the entropic and informational indices introduced in the first part of the paper, which are: informational entropy, the quantity of information, the degree of organization, the mutual information, the informational energy and the coefficient of informational correlation. The theoretical elements are used in case studies in the second part of the paper...

  20. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    ... combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second-order statistics. Where CCA is ideal for Gaussian data, CIA facilitates...

  1. Analysis on Uncertain Information and Actions for Preventing Collision

    Institute of Scientific and Technical Information of China (English)

    胡甚平; FANG Quan-gen

    2007-01-01

    Discusses and analyzes the causes and characteristics of the uncertainties in the information and actions for preventing collision at sea, on the basis of basic collision avoidance knowledge. Describes the ways and functions of investigations into the uncertainties of the information and actions of collision avoidance with navigation simulators. Puts forward some suggestions to help officers master the skills of recognizing these uncertainties in information and actions through simulator training during the MET course.

  2. Information and Motion Pattern Learning and Analysis Using Neural Techniques

    Science.gov (United States)

    2009-02-28

    development and application of neural models for higher-level information fusion at Levels 2+/3 according to the JDL Data Fusion Group Process Model (Figure 1) [1, 2]. We explored several ...

  3. Clinical decision support tools: analysis of online drug information databases

    OpenAIRE

    Seamon Matthew J; Polen Hyla H; Marsh Wallace A; Clauson Kevin A; Ortiz Blanca I

    2007-01-01

    Background: Online drug information databases are used to assist in enhancing clinical decision support. However, the choice of which online database to consult, purchase or subscribe to is likely made based on subjective elements such as history of use, familiarity, or availability during professional training. The purpose of this study was to evaluate clinical decision support tools for drug information by systematically comparing the most commonly used online drug information databases ...

  4. INFORMATION ECONOMICS, INSTRUMENT OF ANALYSIS IN NEW MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Maria Zenovia GRIGORE

    2009-10-01

    In the New Microeconomics the Walrasian postulate of perfect information is replaced by two theorems concerning the production of information: 1. the acquisition and dissemination of information raise production costs; 2. specialisation in information activity is efficient; there are specialists in the production or use of information. Information economics, or the economics of information, studies decisions in transactions where one party has more or better information than the other. Incomplete and asymmetric information can generate two types of risks: adverse selection, which can be reduced with signaling games and screening games, and moral hazard, studied within the framework of agency theory by the principal-agent model. The principal-agent model treats the difficulties that arise when a principal hires an agent to pursue the interests of the former. There are mechanisms that align the interests of the agent with those of the principal, such as commissions, promotions, profit sharing, efficiency wages, deferred compensation, fear of firing and so on.

  5. A segmentation analysis of consumer uses of health information.

    Science.gov (United States)

    Risker, D C

    1995-01-01

    Public and private health data organizations are receiving increased pressure to produce consumer-level health information. In addition, the proposed health care reforms imply that health care networks will have to market their health plans. However, little attention has been given to what format the information should have and what the consumers' information needs are. This article discusses the health services marketing literature published to date on the subject, compares it to general marketing literature, and suggests some general guidelines for the effective publication and distribution of health information.

  6. 40 CFR 1400.8 - Access to off-site consequence analysis information by Federal government officials.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment. Part 1400: DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION; Access to Off-Site Consequence Analysis Information by Government Officials. § 1400.8 Access to off-site consequence analysis information by Federal government officials.

  7. Analysis of Information Systems Based on Automatic Control Theory,

    Science.gov (United States)

    The development of information systems in the USSR has given rise to increased interest in studying them. In this report basic attention is given ... to the problems of optimizing the operation and control of information systems. It appears that this problem can be solved most fruitfully on the basis ...

  8. Analysis of information systems for the enterprises marketing activities management

    Directory of Open Access Journals (Sweden)

    A.O. Natorina

    2014-06-01

    The article deals with the role of computer information systems in the strategic management of enterprise marketing activities, and with the marketing management information systems of enterprises. The stages of developing and launching a new product into the market within its life cycle are analyzed, exemplified by the fat and oil industry.

  9. Analysis Report Project: Audience, E-writing, and Information Design.

    Science.gov (United States)

    Lawrence, Sally F.

    2003-01-01

    Presents students with the opportunity to evaluate e-writing techniques and information design components of two financial and investment Websites. Discusses strategies for teaching e-writing. Uses a three-part information design model developed by Carliner (2000). Discusses how the physical, cognitive, and affective aspects all work…

  10. The threat nets approach to information system security risk analysis

    NARCIS (Netherlands)

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients with the available resources. Though the use of information systems in outpatient case management raises patient data security concerns ...

  11. Analysis of the Interdisciplinary Nature of Library and Information Science

    Science.gov (United States)

    Prebor, Gila

    2010-01-01

    Library and information science (LIS) is highly interdisciplinary by nature and is affected by the incessant evolution of technologies. A recent study surveying research trends in the years 2002-6 at various information science departments worldwide found a clear trend in Master's theses and doctoral dissertations of social…

  12. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    Science.gov (United States)

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  13. Trading in markets with noisy information: an evolutionary analysis

    Science.gov (United States)

    Bloembergen, Daan; Hennes, Daniel; McBurney, Peter; Tuyls, Karl

    2015-07-01

    We analyse the value of information in a stock market where information can be noisy and costly, using techniques from empirical game theory. Previous work has shown that the value of information follows a J-curve, where averagely informed traders perform below market average, and only insiders prevail. Here we show that both noise and cost can change this picture, in several cases leading to opposite results where insiders perform below market average, and averagely informed traders prevail. Moreover, we investigate the effect of random explorative actions on the market dynamics, showing how these lead to a mix of traders being sustained in equilibrium. These results provide insight into the complexity of real marketplaces, and show under which conditions a broad mix of different trading strategies might be sustainable.
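
    The evolutionary analysis referred to here is typically carried out with the replicator dynamic applied to an empirically estimated payoff table; a bare-bones sketch of that dynamic follows, with the payoff matrix left as a placeholder since the paper's heuristic payoffs are not reproduced in this record.

        import numpy as np

        def replicator_step(x, A, dt=0.01):
            """One Euler step of the replicator dynamic:
            x_i' = x_i * ((A x)_i - x . A x), for strategy mix x."""
            fitness = A @ x
            avg = x @ fitness
            x_next = x + dt * x * (fitness - avg)
            return x_next / x_next.sum()   # guard against numerical drift

        # illustrative 3-strategy payoff table (e.g., uninformed / average / insider)
        A = np.array([[0.0, -0.1, -0.3],
                      [0.1,  0.0, -0.2],
                      [0.3,  0.2,  0.0]])
        x = np.array([1/3, 1/3, 1/3])
        for _ in range(1000):
            x = replicator_step(x, A)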

  14. Integrating Information and Communication Technology for Health Information System Strengthening: A Policy Analysis.

    Science.gov (United States)

    Marzuki, Nuraidah; Ismail, Saimy; Al-Sadat, Nabilla; Ehsan, Fauziah Z; Chan, Chee-Khoon; Ng, Chiu-Wan

    2015-11-01

    Despite the high costs involved and the lack of definitive evidence of sustained effectiveness, many low- and middle-income countries have begun to strengthen their health information systems using information and communication technology in the past few decades. Following this international trend, the Malaysian Ministry of Health has been incorporating Telehealth (National Telehealth initiatives) into national health policies since the 1990s. Employing qualitative approaches, including key informant interviews and document review, this study examines the agenda-setting processes of the Telehealth policy using Kingdon's framework. The findings suggest that Telehealth policies emerged through the actions of policy entrepreneurs within the Ministry of Health, who took advantage of several simultaneously occurring opportunities: official recognition of problems within the existing health information system, the availability of information and communication technology to strengthen the health information system, and political interests surrounding the national Multimedia Super Corridor initiative being developed at the time. The last was achieved by the inclusion of Telehealth as a component of the Multimedia Super Corridor.

  15. 40 CFR 1400.9 - Access to off-site consequence analysis information by State and local government officials.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment. Part 1400: DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION; Access to Off-Site Consequence Analysis Information by Government Officials. § 1400.9 Access to off-site consequence analysis information by State and local government officials.

  16. Simple LED spectrophotometer for analysis of color information.

    Science.gov (United States)

    Kim, Ji-Sun; Kim, A-Hee; Oh, Han-Byeol; Goh, Bong-Jun; Lee, Eun-Suk; Kim, Jun-Sik; Jung, Gu-In; Baek, Jin-Young; Jun, Jae-Hoon

    2015-01-01

    A spectrophotometer is basic measuring equipment essential to most research fields requiring samples to be measured, such as physics, biotechnology and food engineering. This paper proposes a system that is able to detect sample concentration and color information by using an LED and a color sensor. Purity and wavelength information can be detected from the CIE diagram, and the concentration can be estimated from the purity information. This method is more economical and efficient than existing spectrophotometry, and can also be used by ordinary persons. This contribution is applicable to a number of fields because it can be used as a colorimeter to detect the wavelength and purity of samples.

  17. Independent component analysis of edge information for face recognition

    CERN Document Server

    Karande, Kailash Jagannath

    2013-01-01

    The book presents research work on face recognition using edge information as features with ICA algorithms. The independent components are extracted from edge information. These independent components are used with classifiers to match facial images for recognition purposes. In their study, the authors have explored Canny and LoG edge detectors as standard edge detection methods. The Oriented Laplacian of Gaussian (OLOG) method is explored to extract edge information with different orientations of the Laplacian pyramid. A multiscale wavelet model for edge detection is also proposed ...

  18. Information Flow in the Launch Vehicle Design/Analysis Process

    Science.gov (United States)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  19. 3-D acquisition geometry analysis: Incorporating information from multiples

    NARCIS (Netherlands)

    Kumar, A.; Blacquiere, G.; Verschuur, D.J.

    2014-01-01

    Recent advances in survey design have led to conventional common-midpoint-based analysis being replaced by subsurface-based seismic acquisition analysis and design, with the emphasis on advanced techniques of illumination analysis. Amongst these are wave-equation-based seismic illumination analyses.

  20. Analysis of the Effect of Information System Quality to Intention to Reuse of Employee Management Information System (Simpeg Based on Information Systems Success Model

    Directory of Open Access Journals (Sweden)

    Suryanto Tri Lathif Mardi

    2016-01-01

    This study examines the effect of Information Quality, System Quality and Service Quality on the user intention to reuse the Employee Management Information System (SIMPEG) at a university in the city of Surabaya, based on the theoretical foundation of the DeLone and McLean Information Systems Success (ISS) model. Questionnaires were distributed to 120 employees of different universities by means of stratified random sampling. The results showed that: (1) there is a significant positive effect of System Quality on Information Quality; (2) there is a significant positive effect of Information Quality on the Intention to Reuse, information related to the fulfillment of the user's needs; (3) there is a significant positive effect of System Quality on the Intention to Reuse, the system related to the fulfillment of the needs of users; (4) there is no effect of Service Quality on the Intention to Reuse. Finally, the results of this study provide analysis and advice to university officials that can be used when considering Information Technology/Information System investment and development in accordance with the Information System Success and Intention to Reuse model.

  1. Applying Meta-Analysis to Library and Information Science Research.

    Science.gov (United States)

    Trahan, Eric

    1993-01-01

    Describes and discusses meta-analysis and criticisms of the methodology. Reports on a pilot study that tested the feasibility of meta-analytic methods in library science research using the literature on paper- or computer-based information retrieval. (28 references) (EA)

  2. Design and Analysis: Payroll of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Suryanto Suryanto

    2011-05-01

    The purposes of the research are to analyze, design, and recommend a payroll accounting information system that supports internal control to solve the problem. Research methods used are literature studies, field studies, and design studies. Field studies were done by survey and interview. The expected results are to give a review of the payroll accounting information system in the company's ongoing business process and to solve all the weaknesses in the payroll system, so the company can use an integrated information system in calculating payroll. The conclusions that can be drawn from the research are that there is a manipulation risk in attendance data, that documentation forms still use a manual system with simple data backup, and that there is also a manipulation risk in the allowance cash system and in all reports included in the payroll. Index Terms - Accounting Information System, Payroll

  3. Analysis of co-occurrence networks with clique occurrence information

    Science.gov (United States)

    Shen, Bin; Li, Yixiao

    2014-12-01

    Most co-occurrence networks record only co-occurrence relationships between two entities, ignoring the weights of co-occurrence cliques whose size is bigger than two. However, this ignored information may help us gain insight into the co-occurrence phenomena of systems. In this paper, we analyze co-occurrence networks with clique occurrence information (CNCI) thoroughly. First, we describe the components of CNCIs and discuss the generation of clique occurrence information. Then, to illustrate the importance and usefulness of clique occurrence information, several metrics, i.e. the single occurrence rate, the average size of maximal co-occurrence cliques, and four types of co-occurrence coefficients, are given. Moreover, some applications, such as combining co-occurrence frequency with structure-oriented centrality measures, are discussed.
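
    A small sketch of the core idea, counting occurrences of whole co-occurrence cliques rather than only pairs. The category labels follow the abstract loosely; the transaction data and the single-occurrence-rate computation are illustrative assumptions.

```python
from itertools import combinations
from collections import Counter

def clique_counts(transactions, max_size=4):
    """Count how often each entity subset (clique) co-occurs."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for k in range(1, min(max_size, len(items)) + 1):
            counts.update(combinations(items, k))
    return counts

transactions = [("a", "b", "c"), ("a", "b"), ("b", "c"), ("a", "b", "c")]
counts = clique_counts(transactions)
# single occurrence rate of "a": fraction of a's occurrences where it appears alone
alone = sum(1 for t in transactions if set(t) == {"a"})
print(alone / counts[("a",)], counts[("a", "b", "c")])
```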

  4. Army Information Technology Procurement: A Business Process Analysis

    Science.gov (United States)

    2015-03-27

    Drawing on the Information Technology Infrastructure Library (ITIL), an IT industry standard set of practices that focuses primarily on providing and managing IT services, this research considers how ITIL best practices for knowledge management and IT operations management processes may provide value to the Army construct.

  5. Information technology portfolio in supply chain management using factor analysis

    OpenAIRE

    Ahmad Jaafarnejad; Davood Rafierad; Masoumeh Gardeshi

    2013-01-01

    The adoption of information technology (IT) along with supply chain management (SCM) has become increasingly a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal component analysis...

  6. Analysis of online information searching for cardiovascular diseases on a consumer health information portal.

    Science.gov (United States)

    Jadhav, Ashutosh; Sheth, Amit; Pathak, Jyotishman

    2014-01-01

    Since the early 2000s, Internet usage for health information searching has increased significantly. Studying search queries can help us understand users' information needs and how they formulate search queries (the "expression of information need"). Although cardiovascular diseases (CVD) affect a large percentage of the population, few studies have investigated how and what users search for regarding CVD. We address this knowledge gap by analyzing a large corpus of 10 million CVD-related search queries from MayoClinic.com. Using UMLS MetaMap and UMLS semantic types/concepts, we developed a rule-based approach to categorize the queries into 14 health categories. We analyzed the structural properties, types (keyword-based/Wh-questions/Yes-No questions) and linguistic structure of the queries. Our results show that the most searched health categories are 'Diseases/Conditions', 'Vital-Signs', 'Symptoms' and 'Living-with'. CVD queries are longer and are predominantly keyword-based. This study extends our knowledge about online health information searching and provides useful insights for Web search engines and health websites.
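
    A toy stand-in for the rule-based categorizer described above: the real system used UMLS MetaMap and semantic types, whereas the keyword lists here are invented placeholders; only the category labels come from the abstract.

```python
CATEGORIES = {
    "Symptoms": {"pain", "chest", "shortness", "dizziness", "fatigue"},
    "Diseases/Conditions": {"hypertension", "arrhythmia", "stroke", "angina"},
    "Living-with": {"diet", "exercise", "lifestyle", "stress"},
}

def categorize(query):
    """Assign a query to every category whose keyword list it touches."""
    tokens = set(query.lower().split())
    hits = [c for c, kws in CATEGORIES.items() if tokens & kws]
    return hits or ["Uncategorized"]

def query_type(query):
    """Crude split into keyword-based / Wh-question / Yes-No question."""
    first = query.lower().split()[0]
    if first in {"what", "why", "how", "when", "who", "where"}:
        return "Wh-question"
    if first in {"is", "are", "can", "does", "do", "should"}:
        return "Yes-No question"
    return "keyword-based"

print(categorize("chest pain after exercise"), query_type("can stress cause angina"))
```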

  7. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system may be obtained. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the obtained results, with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of the system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative survivability analysis, and there is a good application prospect in practice.
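
    The Markov step of such an analysis can be sketched numerically: given an overall state-transition matrix over feasible situations, power iteration yields each situation's long-run occurrence probability. The 3x3 matrix below is invented for illustration.

```python
import numpy as np

P = np.array([[0.6, 0.3, 0.1],    # rows: current situation
              [0.2, 0.5, 0.3],    # cols: next situation
              [0.1, 0.2, 0.7]])

dist = np.full(3, 1 / 3)          # uninformative initial distribution
for _ in range(200):              # power iteration to the stationary vector
    dist = dist @ P

print(dist)  # long-run probability of each feasible situation
```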

  8. Structural Analysis: Shape Information via Points-To Computation

    CERN Document Server

    Marron, Mark

    2012-01-01

    This paper introduces a new hybrid memory analysis, Structural Analysis, which combines an expressive shape-analysis-style abstract domain with efficient and simple points-to-style transfer functions. Using data from empirical studies on runtime heap structures and the programmatic idioms used in modern object-oriented languages, we construct a heap analysis with the following characteristics: (1) it can express a rich set of structural, shape, and sharing properties which are not provided by a classic points-to analysis and that are useful for optimization and error detection applications; (2) it uses efficient, weakly-updating, set-based transfer functions which enable the analysis to be more robust and scalable than a shape analysis; and (3) it can be used as the basis for a scalable interprocedural analysis that produces precise results in practice. The analysis has been implemented for .Net bytecode, and using this implementation we evaluate both the runtime cost and the precision of the results on a number of programs.

  9. A quantum information theoretic analysis of three flavor neutrino oscillations

    CERN Document Server

    Banerjee, Subhashish; Srikanth, R; Hiesmayr, Beatrix C

    2015-01-01

    Correlations exhibited by neutrino oscillations are studied via quantum information theoretic quantities. We show that the strongest type of entanglement, genuine multipartite entanglement, is persistent in the flavour changing states. We prove the existence of Bell-type nonlocal features, in both its absolute and genuine avatars. Finally, we show that a measure of nonclassicality, dissension, which is a generalization of quantum discord to the tripartite case, is nonzero for almost the entire range of time in the evolution of an initial electron-neutrino. Via these quantum information theoretic quantities capturing different aspects of quantum correlations, we elucidate the differences between the flavour types, shedding light on the quantum-information theoretic aspects of the weak force.

  10. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model and algorithm described are based on the idea of an information-theoretic basis for decisions in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search algorithms used in leading computer chess programs, the fractional ply scheme. Despite its success in leading computer chess applications, little has been published about this method until now. The article creates a fundamental basis for this method in the axioms of information theory, then derives the principles used in programming the search and describes mathematically the form of the coefficients. One of the most important parameters of the fractional ply search is derived from fundamental principles; until now this coefficient has usually been handcrafted.
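
    A hedged sketch of the fractional-ply idea the article formalizes: each move consumes search budget proportional to its information content, -log2 p(move), so probable moves are searched deeper. The game-model interface (legal_moves, move_probs, play, evaluate) is an assumed abstraction, not the article's implementation.

```python
import math

def search(state, budget, model):
    """Negamax over a fractional-depth budget instead of integer plies."""
    moves = model.legal_moves(state)
    if budget <= 0 or not moves:
        return model.evaluate(state), None   # score from side-to-move's view
    best, best_move = -math.inf, None
    for move, p in model.move_probs(state, moves):   # p from a policy model
        cost = -math.log2(max(p, 1e-6))              # improbable => expensive
        value, _ = search(model.play(state, move), budget - cost, model)
        if -value > best:                            # negate: opponent's view
            best, best_move = -value, move
    return best, best_move
```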

  11. Information-theoretic analysis of electronic and printed document authentication

    Science.gov (United States)

    Voloshynovskiy, Sviatoslav; Koval, Oleksiy; Villan, Renato; Topak, Emre; Vila Forcén, José Emilio; Deguillaume, Frederic; Rytsar, Yuriy; Pun, Thierry

    2006-02-01

    In this paper we consider the problem of document authentication in electronic and printed forms. We formulate this problem from the information-theoretic perspectives and present the joint source-channel coding theorems showing the performance limits in such protocols. We analyze the security of document authentication methods and present the optimal attacking strategies with corresponding complexity estimates that, contrarily to the existing studies, crucially rely on the information leaked by the authentication protocol. Finally, we present the results of experimental validation of the developed concept that justifies the practical efficiency of the elaborated framework.

  12. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    Many kinds of fires occur under different conditions, and for a specific site it is difficult to collect sufficient data for analyzing fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and its surroundings. Using winter data from Shanghai, we show how to use the technique to analyze fire risk.
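
    A minimal sketch of information diffusion under a Gaussian kernel, assuming a one-dimensional observation space: each crisp observation spreads unit mass over a grid of monitoring points, giving a smooth frequency estimate from a small sample. The sample values and the bandwidth h are invented.

```python
import numpy as np

def diffuse(observations, grid, h):
    obs = np.asarray(observations, dtype=float)[:, None]
    weights = np.exp(-((obs - grid[None, :]) ** 2) / (2 * h ** 2))
    weights /= weights.sum(axis=1, keepdims=True)  # each observation spreads mass 1
    return weights.sum(axis=0) / len(observations) # estimated probability per point

fires_per_year = [2, 3, 3, 5, 8]                   # small invented sample
grid = np.arange(0, 13)
print(diffuse(fires_per_year, grid, h=1.5).round(3))
```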

  13. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    Science.gov (United States)

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)

  14. Information technology and ethics: An exploratory factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Conger, S. [Southern Methodist Univ., Dallas, TX (United States); Loch, K.D. [Georgia State Univ., Atlanta, GA (United States); Helft, B.L. [Baruch College, New York, NY (United States)

    1994-12-31

    Ethical dilemmas are situations in which a decision results in unpleasant consequences. The unpleasant consequences are treated as a zero-sum game in which someone always loses. Introducing information technology (IT) to a situation makes the recognition of a potential loser more abstract and difficult, so an ethical dilemma may go unrecognized. The computer mediates the human relationship, which causes a lost sense of contact with the person at the other end of the computer connection. In 1986, Richard O. Mason published an essay identifying privacy, accuracy, property, and access (PAPA) as the four main ethical issues of the information age. Anecdotes for each issue describe the injured party's perspective to identify consequences resulting from unethical use of information and information technology. This research sought to validate Mason's social issues empirically, but with distinct differences. Mason defined issues to raise awareness and initiate debate on the need for a social agenda; our focus is on individual computer users and the attitudes they hold about ethical behavior in computer use. This study examined the attitudes of the computer users who experience the ethical dilemma, to determine the extent to which ethical components are recognized and whether Mason's issues form recognizable constructs.

  15. Towards an Information Theoretic Analysis of Searchable Encryption

    NARCIS (Netherlands)

    Sedghi, S.; Doumen, J.M.; Hartel, P.H.; Jonker, W.

    2008-01-01

    Searchable encryption is a technique that allows a client to store data in encrypted form on a curious server, such that data can be retrieved while leaking a minimal amount of information to the server. Many searchable encryption schemes have been proposed and proved secure in their own computational model.

  16. Towards an Information Theoretic Analysis of Searchable Encryption (Extended Version)

    NARCIS (Netherlands)

    Sedghi, S.; Doumen, J.M.; Hartel, P.H.; Jonker, W.

    2008-01-01

    Searchable encryption is a technique that allows a client to store data in encrypted form on a curious server, such that data can be retrieved while leaking a minimal amount of information to the server. Many searchable encryption schemes have been proposed and proved secure in their own computational model.

  17. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of the most prosperous technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  18. Chromatic Information and Feature Detection in Fast Visual Analysis

    Science.gov (United States)

    Del Viva, Maria M.; Punzi, Giovanni; Shevell, Steven K.

    2016-01-01

    The visual system is able to recognize a scene based on a sketch made of very simple features. This ability is likely crucial for survival, when fast image recognition is necessary, and it is believed that a primal sketch is extracted very early in visual processing. Such highly simplified representations can be sufficient for accurate object discrimination, but an open question is the role played by color in this process. Rich color information is available in natural scenes, yet artists' sketches are usually monochromatic, and black-and-white movies provide compelling representations of real-world scenes. Also, the contrast sensitivity to color is low at fine spatial scales. We approach the question from the perspective of optimal information processing by a system endowed with limited computational resources. We show that when such limitations are taken into account, the intrinsic statistical properties of natural scenes imply that the most effective strategy is to ignore fine-scale color features and devote most of the bandwidth to gray-scale information. We find confirmation of these information-based predictions in psychophysical measurements of fast-viewing discrimination of natural scenes. We conclude that the lack of colored features in our visual representation, and our overall low sensitivity to high-frequency color components, are a consequence of an adaptation process, optimizing the size and power consumption of our brain for the visual world we live in. PMID:27478891

  19. ANKH: Information Threat Analysis with Actor-NetworK Hypergraphs

    NARCIS (Netherlands)

    Pieters, Wolter

    2010-01-01

    Traditional information security modelling approaches often focus on containment of assets within boundaries. Due to what is called de-perimeterisation, such boundaries, for example in the form of clearly separated company networks, disappear. This paper argues that in a de-perimeterised situation a different modelling approach is needed, and proposes actor-network hypergraphs for information threat analysis.

  20. Demand Analysis of Informational Shipboard Gun Weapon System Interface Design

    Directory of Open Access Journals (Sweden)

    WANG Hui-chuan

    2013-04-01

    According to the development demands of informational shipboard gun weapon systems, a design concept for the shipboard gun weapon system interface is proposed. The system composition is put forward, and the functional demands of the interface are analyzed from the combat, training and detection aspects. General principles that need to be followed in the design process are given. A new concept is provided for the development of shipboard gun weapon system interfaces.

  1. Cross-Language Information Retrieval: An Analysis of Errors.

    Science.gov (United States)

    Ruiz, Miguel E.; Srinivasan, Padmini

    1998-01-01

    Investigates an automatic method for Cross Language Information Retrieval (CLIR) that utilizes the multilingual Unified Medical Language System (UMLS) Metathesaurus to translate Spanish natural-language queries into English. Results indicate that for Spanish, the UMLS Metathesaurus-based CLIR method is at least equivalent to if not better than…

  2. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  3. Information Technology: A Community of Practice. A Workplace Analysis

    Science.gov (United States)

    Guerrero, Tony

    2014-01-01

    Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt et al., 2008). The...

  4. C. elegans locomotion analysis using algorithmic information theory.

    Science.gov (United States)

    Skandari, Roghieh; Le Bihan, Nicolas; Manton, Jonathan H

    2015-01-01

    This article investigates the use of algorithmic information theory to analyse C. elegans datasets. The ability of complexity measures to detect similarity in animals' behaviours is demonstrated and their strengths are compared to methods such as histograms. Introduced quantities are illustrated on a couple of real two-dimensional C. elegans datasets to investigate the thermotaxis and chemotaxis behaviours.

  5. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  6. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-01-30

    The project uses system engineering principles to address the need of Boy Scout leaders for an integrated system to facilitate advancement and awards records, leader training, and planning for meetings and activities. Existing products for Scout leaders and relevant stakeholders support record keeping and some communication functions, but an opportunity exists for a better system that fully integrates these functions with training delivery and recording, activity planning, and feedback and information gathering from stakeholders. Key stakeholders for the system include Scouts and their families, leaders, training providers, sellers of supplies and awards, content generators, and facilities that serve Scout activities. Key performance parameters for the system are protection of personal information, availability of current information, information accuracy, and information content that has depth. Implementation concepts considered for the system include (1) owned and operated by the Boy Scouts of America, (2) contracted out to a vendor, and (3) a distributed system that functions with BSA-managed interfaces. The selected concept is to contract out to a vendor, to maximize the likelihood of successful integration and take advantage of the best technology. Development of requirements considers three key use cases: (1) the system facilitates planning a hike, with needed training satisfied in advance and advancement recorded in real time; (2) scheduling and documenting in-person training; (3) a family interested in Scouting receives information and can request follow-up. Non-functional requirements are analyzed with the Quality Function Deployment tool. Requirements addressing backup frequency, compatibility with legacy and new technology, language support, and software updates are developed to address system reliability and an intuitive interface. System functions analyzed include update of the activity database, maintenance of advancement status, and archiving of documents.

  7. BGI-RIS: an integrated information resource and comparative analysis workbench for rice genomics

    DEFF Research Database (Denmark)

    Zhao, Wenming; Wang, Jing; He, Ximiao

    2004-01-01

    To facilitate the application of the rice genomic information and to provide a foundation for functional and evolutionary studies of other important cereal crops, we implemented our Rice Information System (BGI-RIS), the most up-to-date integrated information resource as well as a workbench for comparative genomic analysis.

  8. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Science.gov (United States)

    2010-09-24

    ... AGENCY 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System) AGENCY... Decision Information System (CADDIS). This Web site was developed to help scientists find, develop... information useful for causal evaluations in aquatic systems. CADDIS is based on EPA's Stressor...

  9. 75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)

    Science.gov (United States)

    2010-06-22

    ... AGENCY Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY... investigators find, access, organize, and share information useful for causal evaluations in aquatic systems... ``anonymous access'' system, which means that EPA will not know your identity or contact information...

  10. MIToS.jl: mutual information tools for protein sequence analysis in the Julia language

    DEFF Research Database (Denmark)

    Zea, Diego J.; Anfossi, Diego; Nielsen, Morten

    2017-01-01

    Motivation: MIToS is an environment for mutual information analysis and a framework for managing protein multiple sequence alignments (MSAs) and protein structures (PDB) in the Julia language. It integrates sequence and structural information through SIFTS, making Pfam MSA analysis straightforward...

  11. Demand Analysis of Logistics Information Matching Platform: A Survey from Highway Freight Market in Zhejiang Province

    Science.gov (United States)

    Chen, Daqiang; Shen, Xiahong; Tong, Bing; Zhu, Xiaoxiao; Feng, Tao

    With increasing competition in the logistics industry and pressure for lower logistics costs, the construction of a logistics information matching platform for highway transportation plays an important role, and the accuracy of the platform design is key to successful operation. Based on survey results from logistics service providers, customers and regulatory authorities on access to information, and on an in-depth analysis of the information demands of a logistics information matching platform for highway transportation in Zhejiang province, a survey-based analysis of the framework for such a platform is provided.

  12. Analysis on Recommended System for Web Information Retrieval Using HMM

    Directory of Open Access Journals (Sweden)

    Himangni Rathore

    2014-11-01

    The Web is a rich domain of data and knowledge, spread over the world in an unstructured manner. A growing number of users continuously access this information over the Internet. Web mining is an application of data mining in which web-related data is extracted and manipulated to extract knowledge; it is further divided into three major domains: web usage mining, web content mining and web structure mining. The proposed work is concerned with web usage mining. The aim is to improve user feedback and user navigation pattern discovery for a CRM system. Finally, an HMM-based algorithm is used for finding patterns in the data, a method that promises much more accurate recommendations.
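
    As a sketch of the HMM step, the Viterbi decoder below recovers the most likely hidden state sequence from observed page views. The two intent states, three page types and all probabilities are invented for illustration.

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Most likely hidden-state path for an observation sequence."""
    delta = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = delta[:, None] + np.log(trans)   # best path into each state
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(delta.argmax())]
    for ptr in reversed(back):                    # trace pointers backwards
        path.append(int(ptr[path[-1]]))
    return path[::-1]

start = np.array([0.6, 0.4])                         # browse, buy-intent
trans = np.array([[0.7, 0.3], [0.2, 0.8]])
emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])  # home, product, checkout
print(viterbi([0, 1, 2, 2], start, trans, emit))
```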

  13. Economic analysis of e-waste market under imperfect information

    OpenAIRE

    Dato, Prudence

    2015-01-01

    Despite international regulations that prohibit the trans-boundary movement of electronic and electric waste (e-waste), non-reusable e-waste is often illegally mixed with reusable e-waste and ends up being sent to developing countries. As developing countries are not well prepared to properly manage e-waste, this illegal trade has important negative externalities and creates 'environmental injustice'. The two main information problems in the e-waste market are imperfect monitoring and imperfect...

  14. Analysis of consumer information brochures on osteoporosis prevention and treatment

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-01-01

    Purpose: Evidence-based consumer information is a prerequisite for informed decision making. So far, there are no reports on the quality of consumer information brochures on osteoporosis. In the present study we analysed brochures on osteoporosis available in Germany. Method: All printed brochures from patient and consumer advocacy groups, physician and governmental organisations, health insurances, and pharmaceutical companies were initially collected in 2001 and updated in December 2004. Brochures were analysed by two independent researchers using 37 internationally proposed criteria addressing evidence-based content, risk communication, transparency of the development process, and layout and design. Results: A total of 165 brochures were identified; 59 were included as they specifically targeted osteoporosis prevention and treatment. Most brochures were provided by pharmaceutical companies (n=25), followed by health insurances (n=11) and patient and consumer advocacy groups (n=11). The quality of the brochures did not differ between providers. Only 1 brochure presented a lifetime risk estimate; 4 mentioned the natural course of osteoporosis. A balanced report on benefit versus lack of benefit was presented in 2 brochures, and on benefit versus adverse effects in 8 brochures. Four brochures mentioned relative risk reduction; 1 reported absolute risk reduction through hormone replacement therapy (HRT). Of the 28 brochures accessed in 2004, 10 still recommended HRT without discussing adverse effects. Transparency of the development process was limited: 25 brochures reported the publication date, 26 cited the author, and only 1 cited references. In contrast, readability and design were generally good. Conclusion: The quality of consumer brochures on osteoporosis in Germany is utterly inadequate. They fail to give evidence-based data on diagnosis and treatment options, and the material is therefore not useful for enhancing informed consumer choice.

  15. Benetton and Zara information systems:a comparative analysis

    OpenAIRE

    Pirone, Chiara

    2010-01-01

    Supply chain management has emerged as one of the major areas for companies to gain a competitive edge. Managing supply chains effectively is a complex and challenging task, due to the current business trends of expanding product variety, short product life cycles, increasing outsourcing, globalization of businesses, and continuous advances in information technology. Because of shorter and shorter product life cycles, the pressure for dynamically adjusting and adapting a company's supply chain...

  16. Sources of referral information: a marketing analysis of physician behavior.

    Science.gov (United States)

    Powers, T L; Swan, J E; Taylor, J A; Bendall, D

    1998-01-01

    The referral process is an important means of obtaining patients and it is necessary to determine ways of influencing the referral process to increase the patient base. This article reports research based on a survey of the referral habits of 806 primary care physicians. The results are examined in the context of physician receptivity to marketer-controlled versus health services sources of referral information.

  17. Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.

    Science.gov (United States)

    Ferrari, Alberto

    2017-02-16

    Shannon entropy is increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns and animal vocalizations. Yet the distributional properties of information entropy as a random variable have seldom been the object of study, leading researchers mainly to use linear models or simulation-based analytical approaches to assess differences in information content when entropy is measured repeatedly in different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results from studies in the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study, the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a dataset from a real experiment on animal communication.
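
    To make the estimation issue concrete, the sketch below contrasts the naive plug-in entropy with the entropy of the posterior-mean distribution under a symmetric Dirichlet prior, the prior family the proposed model builds on. This is a simplified stand-in for full Bayesian entropy estimation, and the concentration parameter is an assumption.

```python
import numpy as np

def plugin_entropy(counts):
    """Naive maximum-likelihood (plug-in) Shannon entropy, in nats."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def dirichlet_posterior_mean_entropy(counts, alpha=1.0):
    """Entropy of the posterior-mean distribution under Dirichlet(alpha)."""
    p = (counts + alpha) / (counts.sum() + alpha * len(counts))
    return -(p * np.log(p)).sum()

counts = np.array([12, 3, 0, 1])   # invented symbol counts from one condition
print(plugin_entropy(counts), dirichlet_posterior_mean_entropy(counts))
```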

  18. SynBlast: Assisting the analysis of conserved synteny information

    Directory of Open Access Journals (Sweden)

    Stadler Peter F

    2008-08-01

    Motivation: In recent years more than 20 vertebrate genomes have been sequenced, and the rate at which genomic DNA information becomes available is rapidly accelerating. Gene duplication and gene loss events inherently limit the accuracy of orthology detection based on sequence similarity alone. Fully automated methods for orthology annotation do exist but often fail to identify individual members in cases of large gene families, or to distinguish missing data from traceable gene losses. This situation can be improved in many cases by including conserved synteny information. Results: Here we present the SynBlast pipeline, which is designed to construct and evaluate local synteny information. SynBlast uses the genomic region around a focal reference gene to retrieve candidates for homologous regions from a collection of target genomes and ranks them according to the available evidence for homology. The pipeline is intended as a tool to aid high-quality manual annotation, in particular in those cases where automatic procedures fail. We demonstrate how SynBlast is applied to retrieving orthologous and paralogous clusters using the vertebrate Hox and ParaHox clusters as examples. Software: The SynBlast package, written in Perl, is available under the GNU General Public License at http://www.bioinf.uni-leipzig.de/Software/SynBlast/.

  19. MIPS: analysis and annotation of genome information in 2007.

    Science.gov (United States)

    Mewes, H W; Dietmann, S; Frishman, D; Gregory, R; Mannhaupt, G; Mayer, K F X; Münsterkötter, M; Ruepp, A; Spannagl, M; Stümpflen, V; Rattei, T

    2008-01-01

    The Munich Information Center for Protein Sequences (MIPS-GSF, Neuherberg, Germany) combines automatic processing of large amounts of sequences with manual annotation of selected model genomes. Due to the massive growth of the available data, the depth of annotation varies widely between independent databases. Also, the criteria for the transfer of information from known to orthologous sequences are diverse, and coping with the task of global in-depth genome annotation has become unfeasible. Therefore, our efforts are dedicated to three levels of annotation: (i) the curation of selected genomes, in particular from fungal and plant taxa (e.g. CYGD, MNCDB, MatDB), (ii) comprehensive, consistent, automatic annotation employing exhaustive methods for the computation of sequence similarities and sequence-related attributes as well as the classification of individual sequences (SIMAP, PEDANT and FunCat) and (iii) the compilation of manually curated databases for protein interactions based on scrutinized information from the literature to serve as an accepted set of reliable annotated interaction data (MPACT, MPPI, CORUM). All databases and tools described, as well as detailed descriptions of our projects, can be accessed through the MIPS web server (http://mips.gsf.de).

  20. The Readability of Electronic Cigarette Health Information and Advice: A Quantitative Analysis of Web-Based Information

    Science.gov (United States)

    Zhu, Shu-Hong; Conway, Mike

    2017-01-01

    Background: The popularity and use of electronic cigarettes (e-cigarettes) have increased across all demographic groups in recent years. However, little is currently known about the readability of health information and advice aimed at the general public regarding the use of e-cigarettes. Objective: The objective of our study was to examine the readability of publicly available health information as well as advice on e-cigarettes. We compared information and advice available from US government agencies, nongovernment organizations, English-speaking government agencies outside the United States, and for-profit entities. Methods: A systematic search for health information and advice on e-cigarettes was conducted using search engines. We manually verified the search results and converted them to plain text for analysis. We then assessed the readability of the collected documents using 4 readability metrics, followed by pairwise comparisons of groups with adjustment for multiple comparisons. Results: A total of 54 documents were collected for this study. All 4 readability metrics indicate that all information and advice on e-cigarette use is written at a level higher than that recommended for the general public by National Institutes of Health (NIH) communication guidelines. However, health information and advice written by for-profit entities, many of which were promoting e-cigarettes, were significantly easier to read. Conclusions: A substantial proportion of potential and current e-cigarette users are likely to have difficulty in fully comprehending Web-based health information regarding e-cigarettes, potentially hindering effective health-seeking behaviors. To comply with NIH communication guidelines, government entities and nongovernment organizations would benefit from improving the readability of e-cigarette information and advice. PMID:28062390
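
    The abstract does not name the four readability metrics it used, so as one hedged example the sketch below implements the Flesch-Kincaid grade level with a crude vowel-group syllable counter; the sample advice text is invented.

```python
import re

def syllables(word):
    """Very rough syllable count: runs of vowels (including y)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

advice = ("E-cigarette aerosol is not harmless. "
          "Nicotine exposure can harm adolescent brain development.")
print(round(flesch_kincaid_grade(advice), 1))
```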

  1. AN ANALYSIS OF SALES INFORMATION SYSTEM AND COMPETITIVE ADVANTAGE (Study Case of UD. Citra Helmet

    Directory of Open Access Journals (Sweden)

    Hendra Alianto

    2012-10-01

    Business development in this era of globalization leads companies to use information systems in running their business relationships, changing the traditional way of working with non-integrated information systems into integrated information systems. The intended use of an integrated information system improves effectiveness and efficiency, such as the availability of real-time, accurate and informative data for decision making, both for operational activities and for strategic interests and the company's business development. The application of a sales information system in particular improves the company's performance and affects the competitiveness of the company, which can ultimately maximize profit. In reality, however, it is not easy to implement an integrated information system, because implementation is influenced by the customs, culture and mindset of the company's users. It is therefore necessary to carry out systems analysis and build an integrated information system that addresses the needs of users, management, customers and stakeholders. The implementation of an integrated information system increases productivity and helps achieve effective and efficient company operations; the analysis of the sales information system will affect the competitiveness of the company. Keywords: Sales Information System Analysis

  2. Infodynamics: Analogical Analysis of States of Matter and Information

    Science.gov (United States)

    2007-01-01

  3. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  4. The analysis of network transmission method for welding robot information

    Science.gov (United States)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    On the basis of the User Datagram Protocol (UDP), we improve and design a welding robot network communication protocol (WRNCP), working at the transport and application layers of the TCP/IP protocol stack. According to the characteristics of video data, we design the broadcast push model (BPM) transmission method, improving the efficiency and stability of video transmission, and we design the network information transmission system used for real-time network control of the welding robot.
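
    A minimal sketch of a broadcast push transmission over UDP sockets in the spirit of the BPM method described here; the 4-byte sequence-number framing and the port are assumptions, not the WRNCP specification.

```python
import socket

PORT = 50007  # assumed port, not part of any published protocol

def push_frame(payload: bytes, seq: int):
    """Broadcast one sequenced video frame to all listeners on the subnet."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(seq.to_bytes(4, "big") + payload, ("<broadcast>", PORT))

def listen():
    """Yield (sequence number, payload) pairs; the caller handles gaps."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        while True:
            datagram, _addr = s.recvfrom(65507)   # max UDP payload size
            yield int.from_bytes(datagram[:4], "big"), datagram[4:]
```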

  5. Information Analysis Centers in the Department of Defense. Revision

    Science.gov (United States)

    1987-07-01

    ...electric rockets, ramjets, space vehicles and gun propulsion systems. This includes chemical synthesis; thermochemistry; combustion phenomena; physical... equipment testability, and worst case analysis, and currently teaches over 1000 government/industry personnel each year. RAC is recognized as a leading...

  6. Data Centric Integration and Analysis of Information Technology Architectures

    Science.gov (United States)

    2007-09-01

    Covers the systems engineering analysis process and the DoDAF architecture development process, along with the federated architecture databases and tools constructed as a product of this thesis. For source data collection, the NPS Library was used to query EBSCOhost and BOSUN...

  7. Information

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    There are unstructured abstracts (no more than 256 words) and structured abstracts (no more than 480 words). The specific requirements for structured abstracts are as follows: an informative, structured abstract of no more than 480 words should accompany each manuscript. Abstracts for original contributions should be structured into the following sections. AIM (no more than 20 words): only the purpose should be included; please phrase the aim in the form "To investigate/study/...". MATERIALS AND METHODS (no more than 140 words). RESULTS (no more than 294 words): present P values where appropriate and provide the relevant data to illustrate how they were obtained, e.g. 6.92 ± 3.86 vs 3.61 ± 1.67, P < 0.001. CONCLUSION (no more than 26 words).

  8. Comparative Analysis of Transcription Start Sites Using Mutual Information

    Institute of Scientific and Technical Information of China (English)

    D.Ashok Reddy; Chanchal K.Mitra

    2006-01-01

    The transcription start site (TSS) region shows greater variability compared with other promoter elements. We investigate this variability using information content as a measure. We note in this study that the variability is significant in the block of 5 nucleotides (nt) surrounding the TSS region compared with a block of 15 nt, suggesting that the actual region involved is in the range of 5-10 nt in size. For Escherichia coli, the information content from dinucleotide substitution matrices clearly shows better discrimination, suggesting the presence of some correlations. For human, however, this effect is much weaker, and for mouse it is practically absent. We conclude that the presence of short-range correlations within the TSS region is species-dependent and not universal. We further observe that there are other variable regions in the mitochondrial control element apart from the TSS. It is also noted that effective comparisons can only be made on blocks, while single-nucleotide comparisons do not give any detectable signals.
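
    A small sketch of the block-level measure this kind of study relies on: per-position information content, 2 - H in bits, over an aligned window around the TSS. The toy alignment is invented.

```python
import math
from collections import Counter

def information_content(aligned_seqs):
    """Per-column information content (bits) of an aligned nucleotide block."""
    out = []
    for i in range(len(aligned_seqs[0])):
        column = Counter(seq[i] for seq in aligned_seqs)
        n = sum(column.values())
        h = -sum((c / n) * math.log2(c / n) for c in column.values())
        out.append(2.0 - h)   # 2 bits is the maximum for 4 nucleotides
    return out

seqs = ["TATAAT", "TATGAT", "TACAAT", "TATAAT"]   # invented toy alignment
print([round(v, 2) for v in information_content(seqs)])
```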

  9. [Italian physician's needs for medical information. Retrospective analysis of the medical information service provided by Novartis Pharma to clinicians].

    Science.gov (United States)

    Speroni, Elisabetta; Poggi, Susanna; Vinaccia, Vincenza

    2013-10-01

    The physician's need for medical information updates has been studied extensively in recent years, but the point of view of the pharmaceutical industry on this need has rarely been considered. This paper reports the results of a retrospective analysis of the medical information service provided to Italian physicians by an important pharmaceutical company, Novartis Pharma, from 2004 to 2012. The results confirm clinicians' appreciation of a service that gives them access to tailored scientific documentation: the number of requests made to the network of medical representatives has been rising steadily, peaking whenever new drugs become available. The analysis confirms what other international studies have ascertained, namely that most queries are about how to use the drugs and what their properties are. The results highlight some differences between medical specialties; for example, neurologists seem, proportionally, to be the most curious. This, as well as other interesting details, is worth further exploration. Despite its limits in terms of representativeness, the study points to a real unmet need for information by healthcare institutions, for which the support offered by the pharmaceutical industry could be invaluable; the industry's role could go well beyond that of a mere supplier to national healthcare systems, to that of an active partner in the process of ensuring balanced and evidence-based information. At the same time, a closer appraisal of clinicians' needs could help the pharmaceutical industry improve its communication and educational strategies in presenting its latest clinical research and its own products.

  10. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data was also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing "cognitive signatures" of complex activities and can facilitate the development of technology aids for information-intensive tasks.
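
    The dwell-time reduction described above can be sketched directly: millisecond timestamps are differenced into event durations and bucketed at the thresholds reported in the study (under 2 s for search/filtering, over 10 s for evaluation). The timestamp list is invented.

```python
def dwell_buckets(timestamps_ms):
    """Difference timestamps into durations and bucket them by threshold."""
    durations = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    buckets = {"search/filter (<2s)": 0, "intermediate": 0, "evaluate (>10s)": 0}
    for d in durations:
        if d < 2_000:
            buckets["search/filter (<2s)"] += 1
        elif d > 10_000:
            buckets["evaluate (>10s)"] += 1
        else:
            buckets["intermediate"] += 1
    return buckets

print(dwell_buckets([0, 450, 900, 14_200, 15_000, 40_000]))
```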

  11. Modal interval analysis new tools for numerical information

    CERN Document Server

    Sainz, Miguel A; Calm, Remei; Herrero, Pau; Jorba, Lambert; Vehi, Josep

    2014-01-01

    This book presents an innovative new approach to interval analysis. Modal Interval Analysis (MIA) is an attempt to go beyond the limitations of classic intervals in terms of their structural, algebraic and logical features. The starting point of MIA is quite simple: It consists in defining a modal interval that attaches a quantifier to a classical interval and in introducing the basic relation of inclusion between modal intervals by means of the inclusion of the sets of predicates they accept. This modal approach introduces interval extensions of the real continuous functions, identifies equivalences between logical formulas and interval inclusions, and provides the semantic theorems that justify these equivalences, along with guidelines for arriving at these inclusions. Applications of these equivalences in different areas illustrate the obtained results. The book also presents a new interval object: marks, which aspire to be a new form of numerical treatment of errors in measurements and computations.

  12. Sentiment Analysis of Feedback Information in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2014-06-01

    Sentiment analysis is the study of a person's opinions and emotions towards events or entities, which makes it possible to rate that event or entity for decision making by prospective buyers or users. In this research paper I demonstrate the use of automatic opinion mining/sentiment analysis to rate a hotel and its services based on guest feedback data. We used a semantic resource for the feature vector and a Naïve Bayes classifier for review classification, after reducing the feature sets for better accuracy and efficiency. An improvement in classification accuracy was also observed after the use of bi-gram and tri-gram language models.
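
    A hedged reconstruction of the classifier described above, using unigram-to-trigram counts feeding a multinomial Naive Bayes model. The tiny labelled sample stands in for real guest feedback, and the paper's exact semantic resource and feature reduction are not reproduced.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["room was clean and staff friendly",       # invented sample data
           "terrible service and a dirty room",
           "loved the breakfast, will come back",
           "noisy, rude staff, never again"]
labels = ["pos", "neg", "pos", "neg"]

model = make_pipeline(
    CountVectorizer(ngram_range=(1, 3)),   # uni-, bi- and tri-gram features
    MultinomialNB(),
).fit(reviews, labels)

print(model.predict(["staff was friendly and the room clean"]))
```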

  14. Directory of Federally Supported Information Analysis Centers, 1979. Fourth Edition.

    Science.gov (United States)

    1979-01-01

    Holdings include computerized data bases and reports such as SPR/Salt Dome Storage: Analysis of Brine Disposal in the Gulf of Mexico (1977), covering a monitoring program for brine disposal in the Gulf of Mexico from leaching of salt domes, together with journal articles. The subject index covers topics such as electrochemistry, glaciology, electroluminescence, glass, electrolysis, graduate study, joining, and electrolyte solutions.

  15. Computational Analysis of Perfect-Information Position Auctions

    OpenAIRE

    Thompson, David R. M; Leyton-Brown, Kevin

    2014-01-01

    After experimentation with other designs, the major search engines converged on the weighted, generalized second-price auction (wGSP) for selling keyword advertisements. Notably, this convergence occurred before position auctions were well understood (or, indeed, widely studied) theoretically. While much progress has been made since, theoretical analysis is still not able to settle the question of why search engines found wGSP preferable to other position auctions. We approach this question...

  16. Quantum information analysis of electronic states at different molecular structures

    CERN Document Server

    Barcza, G; Marti, K H; Reiher, M

    2010-01-01

    We have studied transition metal clusters from a quantum information theory perspective using the density-matrix renormalization group (DMRG) method. We demonstrate the competition between entanglement and interaction localization. We also discuss the application of the configuration-interaction-based dynamically extended active space procedure, which significantly reduces the effective system size and greatly accelerates convergence for complicated molecular electronic structures. Our results indicate the importance of taking entanglement among molecular orbitals into account in order to devise an optimal orbital ordering and carry out efficient calculations on transition metal clusters. We propose a recipe to perform DMRG calculations in a black-box fashion, and we point out the connections of our work to other tensor network state approaches.

  17. DEACTIVATION AND DECOMMISSIONING PLANNING AND ANALYSIS WITH GEOGRAPHIC INFORMATION SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    Bollinger, J; William Austin, W; Larry Koffman, L

    2007-09-17

    From the mid-1950's through the 1980's, the U.S. Department of Energy's Savannah River Site produced nuclear materials for the weapons stockpile, for medical and industrial applications, and for space exploration. Although SRS has a continuing defense-related mission, the overall site mission is now oriented toward environmental restoration and management of legacy chemical and nuclear waste. With the change in mission, SRS no longer has a need for much of the infrastructure developed to support the weapons program. This excess infrastructure, which includes over 1000 facilities, will be decommissioned and demolished over the forthcoming years. Dispositioning facilities for decommissioning and deactivation requires significant resources to determine hazards, structure type, and a rough-order-of-magnitude estimate for the decommissioning and demolition cost. Geographic information systems (GIS) technology was used to help manage the process of dispositioning infrastructure and for reporting the future status of impacted facilities.

  18. Transit Light Curves with Finite Integration Time: Fisher Information Analysis

    CERN Document Server

    Price, Ellen M

    2014-01-01

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite (TESS) will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal-to-noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances of the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances of the inferred parameters when comparing scenarios with constant total signal-to-noise (constant total integration time in the absence of read noise). Uncertainties on the transit parameters...
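
    A generic numerical sketch of the Fisher-matrix machinery applied in the paper: for a model m(t; theta) with Gaussian noise sigma, F_ij = sum_t (dm/dtheta_i)(dm/dtheta_j)/sigma^2, and parameter variances come from the inverse matrix. The Gaussian "dip" below is a stand-in for a real transit model, and all numbers are invented.

```python
import numpy as np

def fisher_matrix(model, theta, t, sigma, eps=1e-6):
    """Fisher matrix via central finite-difference gradients of the model."""
    grads = []
    for i in range(len(theta)):
        step = np.zeros_like(theta); step[i] = eps
        grads.append((model(t, theta + step) - model(t, theta - step)) / (2 * eps))
    grads = np.array(grads)
    return grads @ grads.T / sigma**2

def dip(t, theta):
    """Toy light-curve 'dip': parameters are depth, center, width."""
    depth, t0, w = theta
    return 1.0 - depth * np.exp(-0.5 * ((t - t0) / w) ** 2)

t = np.linspace(-0.2, 0.2, 200)
F = fisher_matrix(dip, np.array([0.01, 0.0, 0.05]), t, sigma=1e-4)
print(np.sqrt(np.diag(np.linalg.inv(F))))   # 1-sigma parameter uncertainties
```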

  19. 信息分析基础理论研究 (Researches on the Elementary Theory of Information Analysis)

    Institute of Scientific and Technical Information of China (English)

    高柳宾; 孙云川

    2000-01-01

    The emergence and development of the elementary theory of Information Analysis are studied. From a developmental point of view, the elementary theory of Information Analysis can be divided into three stages: a Bibliometrics-based stage, an Informetrics-based stage, and a Microeconomics-of-Information-based stage; the last is a stage of exploration, study and creation of the elementary theory of Information Analysis.

  20. Numerical Investigations into the Value of Information in Lifecycle Analysis of Structural Systems

    DEFF Research Database (Denmark)

    Konakli, Katerina; Sudret, Bruno; Faber, Michael Havbro

    2015-01-01

    Preposterior analysis can be used to assess the potential of an experiment to enhance decision-making by providing information on parameters of the decision problem that are surrounded by epistemic uncertainties. The present paper describes a framework for preposterior analysis for support of decisions related to maintenance of structural systems. In this context, experiments may refer to inspections or structural health monitoring. The value-of-information concept comprises a powerful tool for determining whether the experimental cost is justified by the expected gained benefit during … dependencies between the components of a system. Furthermore, challenges and potentials in value-of-information analysis for structural systems are discussed.

  1. Value of Information Analysis Project Gnome Site, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Greg Pohll; Jenny Chapman

    2010-01-01

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.
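
    The weighing of data-collection cost against expected uncertainty reduction described above is, at its core, a value-of-information calculation. A minimal sketch of the expected value of perfect information (EVPI); all utilities and probabilities below are invented for illustration and are not taken from the Gnome analysis.

    ```python
    import numpy as np

    # Hypothetical two-action decision under uncertain contaminant migration.
    theta_probs = np.array([0.7, 0.3])            # P(low), P(high migration)
    # utility[action, theta]: rows = {proceed now, expand characterization}
    utility = np.array([[  0.0, -100.0],
                        [-20.0,  -30.0]])

    prior_value = utility @ theta_probs           # expected utility per action
    value_no_info = prior_value.max()
    value_perfect_info = (utility.max(axis=0) * theta_probs).sum()
    evpi = value_perfect_info - value_no_info
    print(f"EVPI = {evpi:.1f}  (data collection is worthwhile only below this cost)")
    ```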

  2. ERROR ANALYSIS ON INFORMATION AND TECHNOLOGY STUDENTS’ SENTENCE WRITING ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Rentauli Mariah Silalahi

    2015-03-01

    Students' error analysis is very important for helping EFL teachers develop their teaching materials, assessments and methods. However, it takes much time and effort for teachers to carry out such an error analysis of their students' language. This study seeks to identify the common errors made by one class of 28 freshman students studying English in their first semester at an IT university. The data were collected from their writing assignments over eight consecutive weeks. The errors found were classified into 24 types, and the ten most common errors committed by the students involved articles, prepositions, spelling, word choice, subject-verb agreement, auxiliary verbs, plural forms, verb forms, capital letters, and meaningless sentences. The findings about the students' frequency of committing errors were then contrasted with their midterm test results, and, to find out the reasons behind the error recurrence, the students were given some questions to answer in a questionnaire format. Most of the students admitted that carelessness was the major reason for their errors, with lack of understanding coming next. This study suggests that EFL teachers devote time to continuously checking their students' language by giving corrections so that the students can learn from their errors and stop committing the same ones.

  3. Hierarchical models and the analysis of bird survey information

    Science.gov (United States)

    Sauer, J.R.; Link, W.A.

    2003-01-01

    Management of birds often requires analysis of collections of estimates. We describe a hierarchical modeling approach to the analysis of these data, in which parameters associated with the individual species estimates are treated as random variables, and probability statements are made about the species parameters conditioned on the data. A Markov-Chain Monte Carlo (MCMC) procedure is used to fit the hierarchical model. This approach is computer intensive, and is based upon simulation. MCMC allows for estimation both of parameters and of derived statistics. To illustrate the application of this method, we use the case in which we are interested in attributes of a collection of estimates of population change. Using data for 28 species of grassland-breeding birds from the North American Breeding Bird Survey, we estimate the number of species with increasing populations, provide precision-adjusted rankings of species trends, and describe a measure of population stability as the probability that the trend for a species is within a certain interval. Hierarchical models can be applied to a variety of bird survey applications, and we are investigating their use in estimation of population change from survey data.
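
    A minimal sketch of the approach under simplifying assumptions: a normal-normal hierarchy with known sampling variances, simulated trend estimates standing in for the actual Breeding Bird Survey data, a Gibbs sampler as the MCMC scheme, and the derived statistic "number of increasing species" computed per draw.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical inputs: estimated trend and sampling variance per species.
    b  = rng.normal(0.5, 1.0, size=28)     # observed trend estimates (e.g. %/yr)
    s2 = np.full(28, 0.4**2)               # known sampling variances

    n, draws = len(b), 5000
    beta, mu, tau2 = b.copy(), b.mean(), b.var()
    n_increasing = []

    for _ in range(draws):                 # Gibbs sampler, normal-normal model
        prec = 1.0/s2 + 1.0/tau2
        beta = rng.normal((b/s2 + mu/tau2)/prec, np.sqrt(1.0/prec))
        mu   = rng.normal(beta.mean(), np.sqrt(tau2/n))
        # conjugate inverse-gamma update for tau^2 with a vague prior
        tau2 = 1.0 / rng.gamma(0.01 + n/2, 1.0/(0.01 + 0.5*np.sum((beta - mu)**2)))
        n_increasing.append(int(np.sum(beta > 0)))   # derived statistic per draw

    print("posterior mean # increasing species:", np.mean(n_increasing[500:]))
    ```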

  4. Difference image analysis: Automatic kernel design using information criteria

    CERN Document Server

    Bramich, D M; Alsubai, K A; Bachelet, E; Mislis, D; Parley, N

    2015-01-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components: a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially-invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularisation. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unreg...

  5. Processes That Inform Multicultural Supervision: A meta-analysis.

    Science.gov (United States)

    Tohidian, Nilou B; Quek, Karen Mui-Teng

    2017-03-16

    As the fields of counseling and psychotherapy have become more cognizant that individuals, couples, and families bring with them a myriad of diversity factors into therapy, multicultural competency has also become a crucial component in the development of clinicians during clinical supervision and training. We employed a qualitative meta-analysis to provide a detailed and comprehensive description of similar themes identified in primary qualitative studies that have investigated supervisory practices with an emphasis on diversity. Findings revealed six meta-categories, namely: (a) Supervisor's Multicultural Stances; (b) Supervisee's Multicultural Encounters; (c) Competency-Based Content in Supervision; (d) Processes Surrounding Multicultural Supervision; (e) Culturally Attuned Interventions; and (f) Multicultural Supervisory Alliance. Implications for practice are discussed.

  6. Radionuclide Data Analysis and Evaluation: More Information From Fewer Isotopes

    Science.gov (United States)

    Prinke, A.; McIntyre, J.; Cooper, M.; Haas, D.; Lowrey, J.; Miley, H.; Schrom, B.; Suckow, T.

    2013-12-01

    The analysis of the International Monitoring System radionuclide data sets provides daily concentrations for both particulate and radioxenon isotopes. These isotopes can come from many potential sources such as nuclear reactors, nuclear physics experiments, and medical isotope production. These interesting but irrelevant sources share several radio-isotopic signatures with above-ground or underground nuclear explosions and must be ruled out as part of determining whether an event originated as a nuclear explosion. Several methods that aid in this determination are under development, and this poster briefly covers each: radio-isotopic ratios and parent-daughter relationships, co-detection of radioxenon and isotopes found on particulates, and past detection history.

  7. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part II, Risk-informed approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Ahn, Sang Kyu; Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    Technical insights and findings from a critical review of deterministic approaches typically applied to ensure design safety of nuclear power plants were presented in the companion paper of Part I included in this issue. In this paper we discuss the risk-informed approaches that have been proposed to make a safety case for advanced reactors, including Generation-IV reactors such as the Modular High-Temperature Gas-cooled Reactor (MHTGR), the Pebble Bed Modular Reactor (PBMR), or the Sodium-cooled Fast Reactor (SFR). Also considered herein are a risk-informed safety analysis approach suggested by Westinghouse as a means to improve the conventional accident analysis, together with the Technology Neutral Framework recently developed by the US Nuclear Regulatory Commission as a high-level regulatory infrastructure for safety evaluation of any type of reactor design. The insights from a comparative review of various deterministic and risk-informed approaches could prove useful in developing a new licensing architecture for enhanced safety of evolutionary or advanced plants.

  8. Transit light curves with finite integration time: Fisher information analysis

    Energy Technology Data Exchange (ETDEWEB)

    Price, Ellen M. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Rogers, Leslie A. [California Institute of Technology, MC249-17, 1200 East California Boulevard, Pasadena, CA 91125 (United States)

    2014-10-10

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
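
    A minimal sketch of the Fisher-matrix machinery under Gaussian noise, with a hypothetical trapezoidal transit model box-averaged over a 30 minute cadence. The paper derives the variances analytically; here the Jacobian is estimated numerically, which is enough to reproduce the qualitative effect of binning.

    ```python
    import numpy as np

    def fisher_covariance(model, p0, t, sigma, eps=1e-6):
        """Gaussian-noise Fisher matrix F = J^T J / sigma^2; its inverse is the
        Cramer-Rao bound on the fitted parameters. J via central differences."""
        p0 = np.asarray(p0, float)
        J = np.empty((len(t), len(p0)))
        for k in range(len(p0)):
            dp = np.zeros_like(p0); dp[k] = eps * max(1.0, abs(p0[k]))
            J[:, k] = (model(p0 + dp, t) - model(p0 - dp, t)) / (2 * dp[k])
        return np.linalg.inv(J.T @ J / sigma**2)

    def binned_transit(p, t, t_int=30/1440.0, n_sub=30):
        """Hypothetical trapezoidal transit, box-averaged over a finite
        integration time t_int (here 30 minutes, in days)."""
        depth, t0, T, tau = p          # depth, mid-time, duration, ingress time
        ts = t[:, None] + (np.arange(n_sub)/n_sub - 0.5) * t_int
        lam = np.clip((T/2 + tau/2 - np.abs(ts - t0)) / tau, 0.0, 1.0)
        return 1.0 - depth * lam.mean(axis=1)

    t = np.linspace(-0.2, 0.2, 500)
    cov = fisher_covariance(binned_transit, [0.01, 0.0, 0.12, 0.01], t, sigma=1e-4)
    print(np.sqrt(np.diag(cov)))       # sigma for each transit parameter
    ```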

  9. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
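
    The core computation in such toolboxes is an entropy or mutual-information estimate from discrete response data. A minimal plug-in estimator sketch on simulated stimulus/spike-count pairs; real toolboxes add bias corrections that this sketch omits.

    ```python
    import numpy as np

    def mutual_information(x, y):
        """Plug-in estimate of I(X;Y) in bits from paired discrete samples.
        (Biased upward for small samples; toolboxes correct for this.)"""
        xs, x_idx = np.unique(x, return_inverse=True)
        ys, y_idx = np.unique(y, return_inverse=True)
        joint = np.zeros((len(xs), len(ys)))
        np.add.at(joint, (x_idx, y_idx), 1.0)     # count co-occurrences
        p = joint / joint.sum()
        px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

    # Hypothetical session: 4 stimuli, spike counts weakly tuned to stimulus.
    rng = np.random.default_rng(1)
    stim = rng.integers(0, 4, size=2000)
    counts = rng.poisson(2 + stim)                # rate grows with stimulus index
    print(mutual_information(stim, counts), "bits")
    ```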

  10. Comparative Functional Analysis of Air Force and Commercially Available Transportation Information Management Systems

    Science.gov (United States)

    2004-03-01

    [Fragmentary abstract in the source record. Recoverable content: the thesis compares Air Force and commercially available transportation information management systems; Foster (2001) describes a gap analysis as "a term associated with the SERVQUAL survey instrument; gap analysis is a..."; case-study tactics for validity are listed (pattern-matching, explanation-building, addressing rival explanations, use of logic models); and the Joint Warfighting System (JWARS) is described as a closed-form, constructive simulation relevant to deployment and redeployment modeling and simulation requirements.]

  11. Information-theoretic analysis of MIMO channel sounding

    CERN Document Server

    Baum, Daniel S

    2007-01-01

    The large majority of commercially available multiple-input multiple-output (MIMO) radio channel measurement devices (sounders) is based on time-division multiplexed switching (TDMS) of a single transmit/receive radio-frequency chain into the elements of a transmit/receive antenna array. While being cost-effective, such a solution can cause significant measurement errors due to phase noise and frequency offset in the local oscillators. In this paper, we systematically analyze the resulting errors and show that, in practice, overestimation of channel capacity by several hundred percent can occur. Overestimation is caused by phase noise (and to a lesser extent frequency offset) leading to an increase of the MIMO channel rank. Our analysis furthermore reveals that the impact of phase errors is, in general, most pronounced if the physical channel has low rank (typical for line-of-sight or poor scattering scenarios). The extreme case of a rank-1 physical channel is analyzed in detail. Finally, we present measureme...

  12. Information extraction from topographic map using colour and shape analysis

    Indian Academy of Sciences (India)

    Nikam Gitanjali Ganpatrao; Jayanta Kumar Ghosh

    2014-10-01

    The work presented in this paper relates to symbol and toponym understanding, with application to scanned Indian topographic maps. The proposed algorithm performs colour layer separation of the enhanced topographic map using k-means colour segmentation, followed by outline detection and chaining. Outline detection is performed through linear filtering using the Canny edge detector. Each outline is then encoded as a Freeman chain code, and the x-y offsets are used to obtain a complex representation of the outlines. Final matching of shapes is done by computing Fourier descriptors from the chain codes; comparison of descriptors having the same colour index is embedded in a normalized scalar product of descriptors. As this matching process is not rotation invariant (owing to starting point selection), an interrelation function has been proposed to make the method shift invariant. The recognition rates for symbols, letters and numbers are 84.68, 91.73 and 92.19%, respectively. The core contribution is a shape analysis method based on contouring and Fourier descriptors. Obtaining the most optimal segmentation solution for complex topographic maps, to improve the recognition rate, is the future scope of this work.
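
    A sketch of shape matching with Fourier descriptors in the same spirit. Here the harmonic magnitudes are used, which, unlike the paper's raw descriptors plus interrelation function, are inherently invariant to starting point, rotation and translation. The circle contour below is invented for illustration.

    ```python
    import numpy as np

    def fourier_descriptors(contour, n_desc=16):
        """Invariant Fourier descriptors of a closed contour given as complex
        points x + iy (e.g. decoded from a Freeman chain code)."""
        z = np.fft.fft(np.asarray(contour, complex))
        mag = np.abs(z[1:n_desc + 1])     # drop DC (translation), keep magnitudes
        return mag / mag[0]               # normalise by first harmonic (scale)

    def shape_similarity(c1, c2, n_desc=16):
        """Normalised scalar product of descriptors, as in shape matching."""
        d1, d2 = fourier_descriptors(c1, n_desc), fourier_descriptors(c2, n_desc)
        return float(d1 @ d2 / (np.linalg.norm(d1) * np.linalg.norm(d2)))

    # Example: a circle matched against a scaled, rotated, shifted copy.
    t = np.linspace(0, 2*np.pi, 128, endpoint=False)
    circle = np.exp(1j*t)
    moved = 3.0 * np.exp(1j*(t + 0.7)) + (5 + 2j)
    print(shape_similarity(circle, moved))    # ~1.0
    ```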

  13. Graph analysis of dream reports is especially informative about psychosis.

    Science.gov (United States)

    Mota, Natália B; Furtado, Raimundo; Maia, Pedro P C; Copelli, Mauro; Ribeiro, Sidarta

    2014-01-01

    Early psychiatry investigated dreams to understand psychopathologies. Contemporary psychiatry, which neglects dreams, has been criticized for lack of objectivity. In search of quantitative insight into the structure of psychotic speech, we investigated speech graph attributes (SGA) in patients with schizophrenia, bipolar disorder type I, and non-psychotic controls as they reported waking and dream contents. Schizophrenic subjects spoke with reduced connectivity, in tight correlation with negative and cognitive symptoms measured by standard psychometric scales. Bipolar and control subjects were undistinguishable by waking reports, but in dream reports bipolar subjects showed significantly less connectivity. Dream-related SGA outperformed psychometric scores or waking-related data for group sorting. Altogether, the results indicate that online and offline processing, the two most fundamental modes of brain operation, produce nearly opposite effects on recollections: While dreaming exposes differences in the mnemonic records across individuals, waking dampens distinctions. The results also demonstrate the feasibility of the differential diagnosis of psychosis based on the analysis of dream graphs, pointing to a fast, low-cost and language-invariant tool for psychiatric diagnosis and the objective search for biomarkers. The Freudian notion that "dreams are the royal road to the unconscious" is clinically useful, after all.
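
    A minimal sketch of the speech-graph idea: words become nodes, consecutive word pairs become directed edges, and connectedness attributes summarize the report. The toy "report" is invented, and the attribute set used by the authors is richer than the two computed here.

    ```python
    import networkx as nx

    def speech_graph_attributes(words):
        """Word-adjacency digraph of a report plus two connectedness attributes
        (repeated edges collapse here; a MultiDiGraph would keep them)."""
        g = nx.DiGraph()
        g.add_edges_from(zip(words, words[1:]))   # edge per consecutive pair
        lcc = max(nx.weakly_connected_components(g), key=len)
        lsc = max(nx.strongly_connected_components(g), key=len)
        return {"LCC": len(lcc), "LSC": len(lsc),
                "nodes": g.number_of_nodes(), "edges": g.number_of_edges()}

    report = "i was walking in a garden and the garden was full of light".split()
    print(speech_graph_attributes(report))
    ```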

  14. Graph analysis of dream reports is especially informative about psychosis

    Science.gov (United States)

    Mota, Natália B.; Furtado, Raimundo; Maia, Pedro P. C.; Copelli, Mauro; Ribeiro, Sidarta

    2014-01-01

    Early psychiatry investigated dreams to understand psychopathologies. Contemporary psychiatry, which neglects dreams, has been criticized for lack of objectivity. In search of quantitative insight into the structure of psychotic speech, we investigated speech graph attributes (SGA) in patients with schizophrenia, bipolar disorder type I, and non-psychotic controls as they reported waking and dream contents. Schizophrenic subjects spoke with reduced connectivity, in tight correlation with negative and cognitive symptoms measured by standard psychometric scales. Bipolar and control subjects were undistinguishable by waking reports, but in dream reports bipolar subjects showed significantly less connectivity. Dream-related SGA outperformed psychometric scores or waking-related data for group sorting. Altogether, the results indicate that online and offline processing, the two most fundamental modes of brain operation, produce nearly opposite effects on recollections: While dreaming exposes differences in the mnemonic records across individuals, waking dampens distinctions. The results also demonstrate the feasibility of the differential diagnosis of psychosis based on the analysis of dream graphs, pointing to a fast, low-cost and language-invariant tool for psychiatric diagnosis and the objective search for biomarkers. The Freudian notion that "dreams are the royal road to the unconscious" is clinically useful, after all.

  15. Taiwan's Information Security Policy Enhancement: an Analysis of Patent Indicators and Patent Documents

    Science.gov (United States)

    Hsu, Nai-Wen; Liang-Shiuan, Jr.; Chen, Yi-Chang

    2007-12-01

    Information security policy in Taiwan faces a gap between expectation and reality. To address this, the paper presents an analysis of patent indicators and patent documents in an attempt to draw an overview of information security technology development. The paper also identifies the leading countries and cutting-edge areas with potential trends. Finally, several practicable and valuable strategies are derived from this work to achieve the goals of Taiwan's information security policy.

  16. Optimization based on retention prediction and information theory for liquid-chromatographic analysis of alkylbenzenes

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Rieko; Hayashi Yuzuru; Suzuki Takashi; Saito Yukio (National Inst. of Hygienic Sciences, Tokyo (Japan)); Jinno Kiyokatsu (Toyohashi Univ. of Technology (Japan))

    1991-11-01

    The mobile phase composition and column length are optimized for analyses of six alkylbenzenes in reversed-phase liquid chromatography with the aid of retention prediction and information theory. Optimal conditions selected according to the resolution Rs and information theory are evaluated from the viewpoint of the precision and analytical efficiency (rapidity) of the chromatography. Combining the information-theoretical optimization with retention prediction will accelerate development in the automation of liquid-chromatographic analysis.

  17. User Information Fusion Decision Making Analysis with the C-OODA Model

    Science.gov (United States)

    2011-07-01

    [Fragmentary abstract in the source record. Recoverable content: the report examines the C-OODA (Observe-Orient-Decide-Act) model as a method of user and team analysis in the context of the Data Fusion Information Group (DFIG) Information Fusion Model; from the DFIG model (an update to the Joint Directors of the Laboratories (JDL) model), it looks at Level 5 Fusion, "user refinement"; and C-OODA comparisons to the DFIG model support systems evaluation and analysis as well as coordinating the time interval of interaction between the machine and the user.]

  18. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    Science.gov (United States)

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  19. Experiments in Discourse Analysis Impact on Information Classification and Retrieval Algorithms.

    Science.gov (United States)

    Morato, Jorge; Llorens, J.; Genova, G.; Moreiro, J. A.

    2003-01-01

    Discusses the inclusion of contextual information in indexing and retrieval systems to improve results and the ability to carry out text analysis by means of linguistic knowledge. Presents research that investigated whether discourse variables have an impact on information and retrieval and classification algorithms. (Author/LRW)

  20. Self-Informant Agreement in Well-Being Ratings: A Meta-Analysis

    Science.gov (United States)

    Schneider, Leann; Schimmack, Ulrich

    2009-01-01

    A meta-analysis of published studies that reported correlations between self-ratings and informant ratings of well-being (life-satisfaction, happiness, positive affect, negative affect) was performed. The average self-informant correlation based on 44 independent samples and 81 correlations for a total of 8,897 participants was r = 0.42 [99%…

  1. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
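
    A minimal sketch of the central computation on simulated realisations: plug-in estimates of H(A), H(A|B) = H(A,B) - H(B), and I(A;B) for a discretised property at two locations. The lithology example below is invented.

    ```python
    import numpy as np
    from collections import Counter

    def entropies(a, b):
        """Plug-in H(A), H(A|B) and I(A;B) in bits for two discretised model
        properties sampled over many simulated realisations."""
        n = len(a)
        pa  = np.array(list(Counter(a).values())) / n
        pb  = np.array(list(Counter(b).values())) / n
        pab = np.array(list(Counter(zip(a, b)).values())) / n
        H = lambda p: -(p * np.log2(p)).sum()
        H_a, H_b, H_ab = H(pa), H(pb), H(pab)
        return H_a, H_ab - H_b, H_a - (H_ab - H_b)   # H(A), H(A|B), I(A;B)

    # Hypothetical: lithology indicator at two locations, 10,000 realisations.
    rng = np.random.default_rng(2)
    layer  = rng.integers(0, 3, 10_000)
    nearby = np.where(rng.random(10_000) < 0.8, layer, rng.integers(0, 3, 10_000))
    H_a, H_a_given_b, mi = entropies(layer, nearby)
    print(f"H(A)={H_a:.2f}  H(A|B)={H_a_given_b:.2f}  I(A;B)={mi:.2f} bits")
    ```

    The drop from H(A) to H(A|B) is exactly the uncertainty reduction one could expect at location A from observing location B.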

  2. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    Science.gov (United States)

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  3. Bayesian Meta-Analysis of Cronbach's Coefficient Alpha to Evaluate Informative Hypotheses

    Science.gov (United States)

    Okada, Kensuke

    2015-01-01

    This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as "alpha of…

  4. 48 CFR 1415.404-2 - Information to support proposal analysis.

    Science.gov (United States)

    2010-10-01

    [Regulatory index entry, fragmentary in the source record: 48 CFR 1415.404-2, Federal Acquisition Regulations System, Department of the Interior: information to support proposal analysis, including referral to the Inspector General for auditing for information.]

  5. Sheltering Children from the Whole Truth: A Critical Analysis of an Informational Picture Book.

    Science.gov (United States)

    Lamme, Linda; Fu, Danling

    2001-01-01

    Uses Orbis Pictus Award Committee criteria (accuracy, organization, design, and style) to examine an informational book, "Rice Is Life," by Rita Golden Gelman. Subjects the book to a deeper critical analysis. Suggests that it is important to help students become critical thinkers about everything they read, including informational books.…

  6. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  7. Caucasus Seismic Information Network: Data and Analysis Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Randolph Martin; Mary Krasovec; Spring Romer; Timothy O'Connor; Emanuel G. Bombolakis; Youshun Sun; Nafi Toksoz

    2007-02-22

    The geology and tectonics of the Caucasus region (Armenia, Azerbaijan, and Georgia) are highly variable. Consequently, generating a structural model and characterizing seismic wave propagation in the region require data from local seismic networks. As of eight years ago, there was only one broadband digital station operating in the region – an IRIS station at Garni, Armenia – and few analog stations. The Caucasus Seismic Information Network (CauSIN) project is part of a multi-national effort to build a knowledge base of seismicity and tectonics in the region. During this project, three major tasks were completed: 1) collection of seismic data, both event catalogs and phase arrival time picks; 2) development of a 3-D P-wave velocity model of the region obtained through crustal tomography; 3) advances in geological and tectonic models of the region. The first two tasks are interrelated. A large suite of historical and recent seismic data were collected for the Caucasus. These data were mainly analog prior to 2000; more recently, in Georgia and Azerbaijan, the data are digital. Based on the most reliable data from regional networks, a crustal model was developed using 3-D tomographic inversion. The results of the inversion are presented, and the supporting seismic data are reported. The third task was carried out on several fronts. Geologically, the goal of obtaining an integrated geological map of the Caucasus on a scale of 1:500,000 was initiated. The map for Georgia has been completed. This map serves as a guide for the final incorporation of the data from Armenia and Azerbaijan. Description of the geological units across borders has been worked out and formation boundaries across borders have been agreed upon. Currently, Armenia and Azerbaijan are working with scientists in Georgia to complete this task. The successful integration of the geologic data also required addressing and mapping active faults throughout the greater Caucasus. Each of the major

  8. A Real-Time and Dynamic Biological Information Retrieval and Analysis System (BIRAS)

    Institute of Scientific and Technical Information of China (English)

    Qi Zhou; Hong Zhang; Meiying Geng; Chenggang Zhang

    2003-01-01

    The aim of this study is to design a biological information retrieval and analysis system (BIRAS) based on the Internet. Using a specific network protocol, the BIRAS system can send information to and receive information from the Entrez search and retrieval system maintained by the National Center for Biotechnology Information (NCBI) in the USA. Literature, nucleotide sequences, protein sequences, and other resources matching a user-defined term can then be retrieved and delivered to the user by pop-up message or automatically by e-mail. All the information retrieval and analysis processes are done in real time. As a robust system for intelligently and dynamically retrieving and analyzing user-defined information, it is believed that BIRAS will be extensively used for retrieving specific information from the large number of biological databases available today. The program is available on request from the corresponding author.

  9. A Real-Time and Dynamic Biological Information Retrieval and Analysis System (BIRAS)

    Institute of Scientific and Technical Information of China (English)

    Qi Zhou; Hong Zhang; Meiying Geng; Chenggang Zhang

    2003-01-01

    The aim of this study is to design a biological information retrieval and analysis system (BIRAS) based on the Internet. Using a specific network protocol, the BIRAS system can send information to and receive information from the Entrez search and retrieval system maintained by the National Center for Biotechnology Information (NCBI) in the USA. Literature, nucleotide sequences, protein sequences, and other resources matching a user-defined term can then be retrieved and delivered to the user by pop-up message or automatically by e-mail. All the information retrieval and analysis processes are done in real time. As a robust system for intelligently and dynamically retrieving and analyzing user-defined information, it is believed that BIRAS will be extensively used for retrieving specific information from the large number of biological databases available today. The program is available on request from the corresponding author.

  10. Citation analysis in journal rankings: medical informatics in the library and information science literature.

    Science.gov (United States)

    Vishwanatham, R

    1998-10-01

    Medical informatics is an interdisciplinary field. Medical informatics articles will be found in the literature of various disciplines including library and information science publications. The purpose of this study was to provide an objectively ranked list of journals that publish medical informatics articles relevant to library and information science. Library Literature, Library and Information Science Abstracts, and Social Science Citation Index were used to identify articles published on the topic of medical informatics and to identify a ranked list of journals. This study also used citation analysis to identify the most frequently cited journals relevant to library and information science.

  11. Analysis Of Factors Affecting The Success Of The Application Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Deni Iskandar

    2015-02-01

    The purpose of this study was to find solutions to problems related to the quality of accounting information systems and the quality of accounting information, as connected with management commitment, user competency and organizational culture. The research was conducted through deductive analysis supported by the phenomenon, with evidence then sought through empirical facts, especially concerning the effect of management commitment, user competency and organizational culture on the quality of accounting information systems and their impact on the quality of accounting information. The research was conducted at State-Owned Enterprises (SOEs).

  12. ID201202961, DOE S-124,539, Information Security Analysis Using Game Theory and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K. [ORNL]; Schlicher, Bob G. [ORNL]

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified against the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which also allows us to address previous limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players' actions are always synchronous; moreover, most models are not scalable to the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provide a quantitative measure for realistic information systems and their related security scenarios.
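
    A deliberately tiny sketch of the simulation idea (not the authors' ORNL model): one attacker and one defender interact asynchronously over an information asset with imperfect detection, and realised losses are tallied. All probabilities and payoffs are invented.

    ```python
    import random

    random.seed(42)

    DETECT_PROB = 0.6      # defender detects an attack (imperfect information)
    ATTACK_PROB = 0.3      # attacker acts in a given tick (asynchronous play)
    defender_loss = 0.0

    for tick in range(1000):
        if random.random() < ATTACK_PROB:          # attacker moves this tick
            if random.random() < DETECT_PROB:      # detected: attack repelled
                defender_loss += 1.0               # response cost only
            else:
                defender_loss += 10.0              # undetected compromise
    print(f"mean loss per tick: {defender_loss / 1000:.2f}")
    ```

    Sweeping DETECT_PROB against the cost of better detection turns this into the kind of quantitative trade-off the abstract describes.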

  13. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; Jong, de Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  14. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    Science.gov (United States)

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper: "Content Analysis in Information Retrieval" by S. F. Weiss presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  15. Efficient Use of Prior Information to Calibrate the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) Hydrology Model

    Science.gov (United States)

    2014-09-01

    [Fragmentary abstract in the source record. Recoverable content: the purpose of this note, by Brian E. Skahill and Charles W. Downer, is to describe new capabilities for calibrating the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) hydrology model; the new capabilities enable the incorporation of soft data, or prior information (i.e., extra observations), into calibration, in contrast with traditional hydrologic simulation models (viz., lumped and semidistributed model structures); such models have the potential to predict with greater...]

  16. Flipping the analytical coin : closing the information flow loop in high speed (real time) analysis

    NARCIS (Netherlands)

    Shahroudi, K.E.

    1997-01-01

    Analysis modules tend to be set up as a one-way flow of information, i.e. with a clear distinction between cause and effect, or input and output. However, as the speed of analysis approaches real time (or exceeds movie rate), it becomes increasingly difficult for an external user to distinguish between cause and effect.

  17. The information value of early career productivity in mathematics: a ROC analysis of prediction errors in bibliometricly informed decision making.

    Science.gov (United States)

    Lindahl, Jonas; Danell, Rickard

    2016-01-01

    The aim of this study was to provide a framework to evaluate bibliometric indicators as decision support tools from a decision making perspective and to examine the information value of early career publication rate as a predictor of future productivity. We used ROC analysis to evaluate a bibliometric indicator as a tool for binary decision making. The dataset consisted of 451 early career researchers in the mathematical sub-field of number theory. We investigated the effect of three different definitions of top performance groups (top 10, top 25, and top 50%); the consequences of using different thresholds in the prediction models; and the added prediction value of information on early career research collaboration and publications in prestige journals. We conclude that early career productivity has information value in all tested decision scenarios, but future performance is more predictable if the definition of the high performance group is more exclusive. Estimated optimal decision thresholds using the Youden index indicated that the top 10% scenario should use 7 articles, the top 25% scenario should use 7 articles, and the top 50% scenario should use 5 articles to minimize prediction errors. A comparative analysis between the decision thresholds provided by the Youden index, which takes consequences into consideration, and a method commonly used in evaluative bibliometrics, which does not take consequences into consideration when determining decision thresholds, indicated that the differences are trivial for the top 25 and top 50% groups. However, a statistically significant difference between the methods was found for the top 10% group. Information on early career collaboration and publication strategies did not add any prediction value to the bibliometric indicator publication rate in any of the models. The key contributions of this research are the focus on consequences in terms of prediction errors and the notion of transforming uncertainty
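
    A minimal sketch of threshold selection via ROC analysis and the Youden index, on invented data standing in for the 451 number-theory researchers (scikit-learn's roc_curve is used for brevity; the invented data will not reproduce the paper's 7/7/5-article thresholds).

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve

    # Hypothetical data: early-career publication counts and an indicator of
    # later membership in a top-performance group.
    rng = np.random.default_rng(3)
    top = rng.random(451) < 0.25                    # "top 25%" scenario
    pubs = rng.poisson(np.where(top, 8, 4))         # early publication rate

    fpr, tpr, thresholds = roc_curve(top, pubs)
    youden = tpr - fpr                              # Youden's J statistic
    best = np.argmax(youden)
    print(f"optimal threshold ~ {thresholds[best]} articles, J = {youden[best]:.2f}")
    ```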

  18. A comparative study of information diffusion in weblogs and microblogs based on social network analysis

    Institute of Scientific and Technical Information of China (English)

    Yang ZHANG; Wanyang LING

    2012-01-01

    Purpose: This paper intends to explore a quantitative method for investigating the characteristics of information diffusion through social media like weblogs and microblogs. By using social network analysis methods, we attempt to analyze the different characteristics of information diffusion in weblogs and microblogs as well as the possible reasons for these differences. Design/methodology/approach: Using social network analysis methods, this paper carries out an empirical study by taking the Chinese weblogs and microblogs in the field of Library and Information Science (LIS) as the research sample and employing measures such as network density, core/peripheral structure and centrality. Findings: Firstly, both bloggers and microbloggers maintain weak ties, and both of their social networks display a small-world effect. Secondly, compared with weblog users, microblog users are more interconnected, more equal and more capable of developing relationships with people outside their own social networks. Thirdly, the microblogging social network is more conducive to information diffusion than the blogging network, because of their differences in functions and the information flow mechanism. Finally, the communication mode that has emerged with microblogging, with the characteristics of micro-content, multi-channel information dissemination, dense and decentralized social networks and content aggregation, will be one of the trends in the development of information exchange platforms in the future. Research limitations: The sample size needs to be increased so that samples are more representative. Errors may exist during the data collection. Moreover, the individual-level characteristics of the samples as well as the types of information exchanged need to be further studied. Practical implications: This preliminary study explores the characteristics of information diffusion in the network environment and verifies the feasibility of conducting a quantitative analysis of information diffusion through social
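
    A minimal sketch of the measures named above on a toy directed network; the edge list is invented, whereas the study's networks were built from actual blog and microblog links.

    ```python
    import networkx as nx

    # Toy follower/comment network standing in for the LIS blog/microblog data.
    edges = [("A", "B"), ("B", "C"), ("C", "A"), ("D", "A"),
             ("E", "B"), ("E", "C"), ("F", "E")]
    g = nx.DiGraph(edges)

    print("density:", round(nx.density(g), 3))
    print("in-degree centrality:", nx.in_degree_centrality(g))
    print("betweenness:", nx.betweenness_centrality(g))
    ```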

  19. A New Classification Analysis of Customer Requirement Information Based on Quantitative Standardization for Product Configuration

    Directory of Open Access Journals (Sweden)

    Zheng Xiao

    2016-01-01

    Traditional methods used for the classification of customer requirement information are typically based on specific indicators, hierarchical structures, and data formats and involve a qualitative analysis in terms of stationary patterns. Because these methods neither consider the scalability of classification results nor regard subsequent application to product configuration, their classification becomes an isolated operation. However, the transformation of customer requirement information into quantifiable values would lead to a dynamic classification according to specific conditions and would enable an association with product configuration in an enterprise. This paper introduces a classification analysis based on quantitative standardization, which focuses on (i) expressing customer requirement information mathematically and (ii) classifying customer requirement information for product configuration purposes. Our classification analysis treated customer requirement information as follows: first, it was transformed into standardized values mathematically, after which it was classified by calculating its dissimilarity with the general customer requirement information related to the product family. Finally, a case study was used to demonstrate and validate the feasibility and effectiveness of the classification analysis.

  20. A generalized rough set-based information filling technique for failure analysis of thruster experimental data

    Institute of Scientific and Technical Information of China (English)

    Han Shan; Zhu Qiang; Li Jianxun; Chen Lin

    2013-01-01

    Interval-valued data and incomplete data are two key problems in failure analysis of thruster experimental data, and both are essentially solved by the methods proposed in this paper. Firstly, information data acquired from the simulation and evaluation system, formed as an interval-valued information system (IIS), are classified by the interval similarity relation. Then, as an improvement on the classical rough set, a new kind of generalized information entropy called "H0-information entropy" is suggested for the measurement of uncertainty and of the classification ability of an IIS. An innovative information filling technique uses the properties of H0-information entropy to replace missing data with smaller estimation intervals. Finally, an improved method of failure analysis synthesizing the above achievements is presented to classify the thruster experimental data, complete the information, and extract the failure rules. The feasibility and advantage of this method are demonstrated in an actual failure analysis application, whose performance is evaluated by the quantification of E-condition entropy.

  1. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    Directory of Open Access Journals (Sweden)

    Voicu-Dan Dragomir

    2007-01-01

    This paper disseminates the results of the development process for a financial analysis information system. The system has been subject to conceptual design using the Unified Modeling Language (UML and has been implemented in object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature is focused on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this technique on the DuPont System of analysis concerning the Return on Equity ratio, by designing several structural UML diagrams depicting the breakdown and analysis of each financial ratio involved. The resulting computer application offers a flexible approach to the analytical tools: the user is required to introduce the raw data and the system provides both table-style and charted information on the output of computation. User-friendliness is also a key feature of this particular financial analysis application.
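
    A sketch of the chain-substitution computation such a system implements for the DuPont breakdown, ROE = net profit margin × asset turnover × equity multiplier, attributing the prior-to-current change factor by factor. All figures below are hypothetical.

    ```python
    # Prior-year and current-year DuPont factors (invented for illustration).
    prior   = dict(margin=0.08, turnover=1.20, leverage=2.00)
    current = dict(margin=0.10, turnover=1.10, leverage=2.30)

    roe = lambda f: f["margin"] * f["turnover"] * f["leverage"]
    step, effects = dict(prior), {}
    for factor in ("margin", "turnover", "leverage"):  # substitute one at a time
        before = roe(step)
        step[factor] = current[factor]
        effects[factor] = roe(step) - before           # that factor's contribution

    print("ROE change:", round(roe(current) - roe(prior), 4))
    print("attribution:", {k: round(v, 4) for k, v in effects.items()})
    ```

    By construction the per-factor effects telescope, so they sum exactly to the total ROE change.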

  2. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    Science.gov (United States)

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

    This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite-time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both the simulated data and experimental fMRI data, and compare the results with that of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationship, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC to quantify the overall causal relationship.
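
    For contrast with the DI framework, a minimal sketch of the linear-prediction comparison that underlies GC analysis; DI replaces this linear predictor with model-free probability estimates, which is what lets it capture nonlinear coupling. The coupled series below are simulated.

    ```python
    import numpy as np

    def granger_improvement(x, y, p=2):
        """Log-ratio of one-step prediction error variances for y: past y alone
        versus past y plus past x. Values > 0 suggest x helps predict y."""
        n = len(y)
        Y = y[p:]
        own  = np.column_stack([y[p-k:n-k] for k in range(1, p+1)])
        both = np.column_stack([own] + [x[p-k:n-k] for k in range(1, p+1)])
        res_own  = Y - own  @ np.linalg.lstsq(own,  Y, rcond=None)[0]
        res_both = Y - both @ np.linalg.lstsq(both, Y, rcond=None)[0]
        return float(np.log(res_own.var() / res_both.var()))

    rng = np.random.default_rng(4)
    x = rng.standard_normal(5000)
    y = np.zeros(5000)
    for t in range(2, 5000):               # y is driven by lagged x
        y[t] = 0.5*y[t-1] + 0.8*x[t-1] + 0.1*rng.standard_normal()
    print(granger_improvement(x, y))       # clearly positive
    print(granger_improvement(y, x))       # near zero
    ```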

  3. Software and Information Life Cycle (SILC) for the Integrated Information Services Organization. Analysis and implementation phase adaptations of the Sandia software guidelines: Issue A, April 18, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, D.; Cassidy, A.; Cuyler, D. [and others]

    1995-07-01

    This document describes the processes to be used for creating corporate information systems within the scope of the Integrated Information Services (IIS) Center. Issue A describes the Analysis and Implementation phases within the context of the entire life cycle. Appendix A includes a full set of examples of the analysis set deliverables. Subsequent issues will describe the other life cycle processes as we move toward enterprise-level management of information assets, including information meta-models and an integrated corporate information model. The analysis phase as described here, when combined with a specifications repository, will provide the basis for future reusable components and improve traceability of information system specifications to enterprise business rules.

  4. Beyond Categories: A Structural Analysis of the Social Representations of Information Users' Collective Perceptions on 'Relevance'

    Directory of Open Access Journals (Sweden)

    Ju, Boryung

    2013-06-01

    Relevance has a long history of scholarly investigation and discussion in information science. One of its notable concepts is that of 'user-based' relevance. The purpose of this study is to examine how users construct their perspective on the concept of relevance; to analyze what the constituent elements (facets) of relevance are, in terms of core-periphery status; and to compare the difference of constructions of two groups of users (information users vs. information professionals), as applied with a social representations theory perspective. Data were collected from 244 information users and 123 information professionals through use of a free word association method. Three methods were employed to analyze the data: (1) content analysis was used to elicit 26 categories (facets) of the concept of relevance; (2) structural analysis of social representations was used to determine the core-periphery status of those facets in terms of coreness, sum of similarity, and weighted frequency; and (3) maximum tree analysis was used to present and compare the differences between the two groups. Elicited categories in this study overlap with the ones from previous relevance studies, while the findings of a core-periphery analysis show that Topicality, User-needs, Reliability/Credibility, and Importance are configured as core concepts for the information user group, while Topicality, User-needs, Reliability/Credibility, and Currency are core concepts for the information professional group. Differences between the social representations of relevance revealed that Topicality was similar to User-needs and to Importance. Author is closely related to Title while Reliability/Credibility is linked with Currency. Easiness/Clarity is similar to Accuracy. Overall, information users and professionals function with a similar social collective of shared meanings for the concept of relevance. The overall findings identify the core and periphery concepts of relevance and their

  5. Analysis of Transaction Costs in Logistics and the Methodologies for Their Information Reflection for Automotive Companies

    OpenAIRE

    Ol’ga Evgen’evna Kovrizhnykh; Polina Aleksandrovna Nechaeva

    2016-01-01

    Transaction costs emerge in different types of logistics activities and influence the material flow and the accompanying financial and information flows; because of this, their information support and assessment are important tasks for the enterprise. The paper analyzes transaction costs in logistics for automotive manufacturers; according to the analysis, the level of these costs in any functional area of "logistics supply" ranges from 1.5 to 20%. These are only the official figures of transa...

  6. Contractor Past Performance Information: An Analysis of Assessment Narratives and Objective Ratings

    Science.gov (United States)

    2015-05-01

    [Report-form residue removed; fragmentary abstract in the source record. Recoverable content: "Contractor Past Performance Information: An Analysis of Assessment Narratives and Objective Ratings," by Rene G. Rendon, Uday Apte and Michael Dixon; the analysis relates assessment narratives to their associated objective scores and notes that CPARS deficiencies provide less-than-optimal information to the acquisition team that relies on these reports for source...]

  7. User satisfaction-based quality evaluation model and survey analysis of network information service

    Institute of Scientific and Technical Information of China (English)

    LEI Xue; JIAO Yuying

    2009-01-01

    On the basis of user satisfaction, the authors formed research hypotheses by learning from relevant e-service quality evaluation models. A questionnaire survey was then conducted on some content-based websites in terms of their convenience, information quality, personalization and site aesthetics, which may affect the overall satisfaction of users. Statistical analysis was also performed to build a user satisfaction-based quality evaluation system of network information service.

  8. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part I, deterministic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Kyu [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    The objective of this paper and a companion paper in this issue (part II, risk-informed approaches) is to derive technical insights from a critical review of deterministic and risk-informed safety analysis approaches that have been applied to develop licensing requirements for water-cooled reactors, or proposed for safety verification of the advanced reactor design. To this end, a review was made of a number of safety analysis approaches including those specified in regulatory guides and industry standards, as well as novel methodologies proposed for licensing of advanced reactors. This paper and the companion paper present the review insights on the deterministic and risk-informed safety analysis approaches, respectively. These insights could be used in making a safety case or developing a new licensing review infrastructure for advanced reactors including Generation IV reactors.

  9. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth in the volume, sources and accessibility of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  10. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. The factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank lending and the FICO credit-scoring index. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
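
    A minimal sketch of the kind of logistic regression used here, with invented borrower-level covariates (the feature names and figures are placeholders, not the study's variables):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))        # placeholder covariates per loan
        true_beta = np.array([0.8, 1.2, -0.5])
        p_default = 1 / (1 + np.exp(-(X @ true_beta - 0.3)))
        y = rng.binomial(1, p_default)       # 1 = default, 0 = repaid

        model = LogisticRegression().fit(X, y)
        print(model.coef_, model.intercept_)  # signs/sizes indicate each factor's effect

    Positive coefficients correspond to factors that raise the odds of default; in a real replication the covariates would come from the interview data rather than being simulated.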

  11. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

    Full Text Available The World Wide Web has become an enormous repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern Web 2.0 pages is a difficult task: the HTML tree is often modified dynamically by various JavaScript codes, new data are added by asynchronous requests to the web server, and elements are positioned with the help of cascading style sheets. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian.
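
    As an illustration of what such browser-automation tools make possible, here is a hedged Selenium sketch for extracting content that only appears after JavaScript execution; the URL and CSS selector are placeholders, not taken from the article:

        from selenium import webdriver
        from selenium.webdriver.common.by import By
        from selenium.webdriver.support.ui import WebDriverWait
        from selenium.webdriver.support import expected_conditions as EC

        driver = webdriver.Firefox()          # any WebDriver-backed browser works
        try:
            driver.get("https://example.com/listing")   # placeholder URL
            # Wait for content injected by asynchronous requests to appear in the DOM.
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.CSS_SELECTOR, ".result-item")))
            for row in driver.find_elements(By.CSS_SELECTOR, ".result-item"):
                print(row.text)               # the rendered, post-JavaScript text
        finally:
            driver.quit()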

  12. Methods of sports genetics: dermatoglyphic analysis of human palmarprints (information 2

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-01-01

    Full Text Available Information on the dermatoglyphic analysis of human palms is summarized. Quantitative dermatoglyphic indexes of the palms are presented for young men and women of the Podol region of Ukraine, and quantitative palmar dermatoglyphic indexes are shown for young men and women of Ukrainian and Russian nationality in Kharkov. The most informative dermatoglyphic indexes of the palm, which can be used in sports genetics, are identified. Recommendations are formulated on the technology of dermatoglyphic analysis of the human palm in sports genetics.

  13. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    Science.gov (United States)

    Xin, Wang

    Based on an analysis of the inadequacies of existing relevant systems, and after detailed investigation and research, the urban planning and management information system is designed as a three-tier structure, operating over a LAN in C/S (client/server) mode. The system's functions are designed in accordance with the requirements of the architecture, along with the functional relationships between the modules. The relevant interfaces are analyzed and designed, and data storage solutions are proposed. The design provides a viable construction program for small and medium-sized urban planning information systems.

  14. The E-net model for the Risk Analysis and Assessment System for the Information Security of Communication and Information Systems ("Defining" Subsystem)

    CERN Document Server

    Stoianov, Nikolai

    2010-01-01

    This paper presents a proposal that draws together the authors' experience in the development and implementation of systems for information security in the Automated Information Systems of the Bulgarian Armed Forces. The architecture of the risk analysis and assessment system for communication and information system information security (CIS IS) is presented. An E-net model of the "Defining" subsystem is also proposed as a tool for examining the subsystems. This approach can be applied successfully to communication and information systems in the business field as well.

  15. Methods of Sports Genetics: dermatoglyphic analysis of human fingerprints (information 1

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-02-01

    Full Text Available The article provides data on the dermatoglyphic analysis of human fingerprints. The most informative dermatoglyphic traits of fingerprints are defined; they can be used as genetic markers to prognosticate sporting endowment. Recommendations are given for using the technology of dermatoglyphic analysis of human fingerprints in sports genetics. There are certain national and racial differences in the phenotypic expression of fingerprint dermatoglyphics.

  16. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    OpenAIRE

    Voicu-Dan Dragomir

    2007-01-01

    This paper disseminates the results of the development process for a financial analysis information system. The system has been subject to conceptual design using the Unified Modeling Language (UML) and has been implemented in object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature is focused on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this tec...
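
    The chain-substitution method mentioned above can be sketched as follows: the factors of a linked ratio are replaced one at a time, prior-year to current-year, so that the factor contributions sum exactly to the total variation. The DuPont-style factors and figures below are invented for illustration:

        def chain_substitution(prior, current):
            """Decompose the change in a product of factors, substituting
            each factor in turn; contributions sum to the total change."""
            values = dict(prior)
            effects = {}
            def product(d):
                result = 1.0
                for v in d.values():
                    result *= v
                return result
            base = product(values)
            for name in prior:
                values[name] = current[name]
                effects[name] = product(values) - base
                base = product(values)
            return effects

        prior   = {"margin": 0.10, "turnover": 1.8, "leverage": 2.0}  # invented
        current = {"margin": 0.12, "turnover": 1.6, "leverage": 2.2}
        print(chain_substitution(prior, current))

    Note that the decomposition is order-dependent, which is exactly why the classic literature fixes a conventional substitution order.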

  17. Performing meta-analysis with incomplete statistical information in clinical trials

    Directory of Open Access Journals (Sweden)

    Hunter Anthony

    2008-08-01

    Full Text Available Abstract Background Results from clinical trials are usually summarized in the form of sampling distributions. When full information (mean, SEM) about these distributions is given, performing meta-analysis is straightforward. However, when some of the sampling distributions only have mean values, a challenging issue is to decide how to use such distributions in meta-analysis. Currently, the most common approaches are either ignoring such trials or, for each trial with a missing SEM, finding a similar trial and taking its SEM value as the missing SEM. Both approaches have drawbacks. As an alternative, this paper develops and tests two new methods, the first being the prognostic method and the second being the interval method, to estimate any missing SEMs from a set of sampling distributions with full information. A merging method is also proposed to handle clinical trials with partial information to simulate meta-analysis. Methods Both of our methods use the assumption that the samples for which the sampling distributions will be merged are randomly selected from the same population. In the prognostic method, we predict the missing SEMs from the given SEMs. In the interval method, we define intervals that we believe will contain the missing SEMs and then we use these intervals in the merging process. Results Two sets of clinical trials are used to verify our methods. One family of trials is on comparing different drugs for reduction of low density lipoprotein cholesterol (LDL) for Type-2 diabetes, and the other is about the effectiveness of drugs for lowering intraocular pressure (IOP). Both methods are shown to be useful for approximating the conventional meta-analysis including trials with incomplete information. For example, the meta-analysis result of Latanoprost versus Timolol on IOP reduction for six months provided in [1] was 5.05 ± 1.15 (Mean ± SEM) with full information. If the last trial in this study is assumed to be with partial information
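
    A much-simplified sketch of the interval idea (not the authors' exact algorithm): a missing SEM is bracketed by the smallest and largest SEMs observed in the complete trials, and the fixed-effect, inverse-variance pooled mean is computed at both extremes; all numbers are invented:

        import numpy as np

        means = np.array([5.2, 4.8, 5.6])
        sems  = np.array([1.1, 0.9, np.nan])    # last trial reports only a mean

        known = sems[~np.isnan(sems)]
        pooled = []
        for fill in (known.min(), known.max()):
            s = np.where(np.isnan(sems), fill, sems)
            w = 1.0 / s**2                      # inverse-variance weights
            pooled.append(np.sum(w * means) / np.sum(w))
        print("pooled mean lies in [%.3f, %.3f]" % (min(pooled), max(pooled)))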

  18. Information Use in History Research: A Citation Analysis of Master's Level Theses

    Science.gov (United States)

    Sherriff, Graham

    2010-01-01

    This article addresses the need for quantitative investigation into students' use of information resources in historical research. It reports the results of a citation analysis of more than 3,000 citations from master's level history theses submitted between 1998 and 2008 at a mid-sized public university. The study's results support the hypotheses…

  19. Developing Information Skills Test for Malaysian Youth Students Using Rasch Analysis

    Science.gov (United States)

    Karim, Aidah Abdul; Shah, Parilah M.; Din, Rosseni; Ahmad, Mazalah; Lubis, Maimun Aqhsa

    2014-01-01

    This study explored the psychometric properties of a locally developed information skills test for youth students in Malaysia using Rasch analysis. The test was a combination of 24 structured and multiple choice items with a 4-point grading scale. The test was administered to 72 technical college students and 139 secondary school students. The…
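
    For context, the dichotomous Rasch model underlying such an analysis gives the probability of a correct response from a person ability theta and an item difficulty b; the values below are invented, and a real analysis estimates both jointly from the response matrix:

        import math

        def rasch_probability(theta: float, b: float) -> float:
            """P(correct) under the dichotomous Rasch model."""
            return 1.0 / (1.0 + math.exp(-(theta - b)))

        print(rasch_probability(theta=0.5, b=-0.2))   # ~0.67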

  20. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  1. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  2. Sensitivity Analysis of Multiple Informant Models When Data Are Not Missing at Random

    Science.gov (United States)

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae M.; Scaramella, Laura V.; Leve, Leslie D.; Reiss, David

    2013-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups can be retained for analysis even if only 1 member of a group contributes…

  3. Probabilistic aspects of analysis and information operation of the system in situations of conflict

    Directory of Open Access Journals (Sweden)

    S. V. Gluschenko

    2012-01-01

    Full Text Available The problems of investigating the structure of parameter interactions and the formation of conflict in stochastic systems are considered using general system theory. The mathematical aspects of the relations of conflict, promotion and indifference are analyzed, and an information approach to the analysis of conflict is considered.

  4. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  5. 10 CFR 52.79 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), revised as of 2010-01-01: Contents of applications; technical information in final safety analysis report. Section 52.79, NUCLEAR REGULATORY COMMISSION (CONTINUED), LICENSES, CERTIFICATIONS, AND APPROVALS FOR NUCLEAR POWER PLANTS, Combined Licenses. § 52.79 Contents of...

  6. Determinants of Microenterprise Success in the Urban Informal Sector of Addis Ababa: A Multidimensional Analysis

    NARCIS (Netherlands)

    B.F. Garoma

    2012-01-01

    This study analyzes determinants of microenterprise success in the urban informal sector of Addis Ababa. The study uses a multidimensional analysis of success factors whereby internal and external factors of success are analyzed simultaneously. Success is represented by three indicators,

  7. Design and Implementation of Marine Information System, and Analysis of Learners' Intention toward

    Science.gov (United States)

    Pan, Yu-Jen; Kao, Jui-Chung; Yu, Te-Cheng

    2016-01-01

    The goal of this study is to conduct further research and discussion on applying the Internet to marine education, utilizing existing technologies such as cloud services, social networks, and data collection and analysis to construct a marine environmental education information system. The content to be explored includes marine education information…

  8. The intellectual core of enterprise information systems: a co-citation analysis

    Science.gov (United States)

    Shiau, Wen-Lung

    2016-10-01

    Enterprise information systems (EISs) have evolved in the past 20 years, attracting the attention of international practitioners and scholars. Although literature reviews and analyses have been conducted to examine the multiple dimensions of EISs, no co-citation analysis has been conducted to examine the knowledge structures involved in EIS studies; thus, the current study fills this research gap. This study investigated the intellectual structures of EISs. All data source documents (1083 articles and 24,090 citations) were obtained from the Institute for Scientific Information Web of Knowledge database. A co-citation analysis was used to analyse EIS data. By using factor analysis, we identified eight critical factors: (a) factors affecting the implementation and success of information systems (ISs); (b) the successful implementation of enterprise resource planning (ERP); (c) IS evaluation and success, (d) system science studies; (e) factors influencing ERP success; (f) case research and theoretical models; (g) user acceptance of information technology; and (h) IS frameworks. Multidimensional scaling and cluster analysis were used to visually map the resultant EIS knowledge. It is difficult to implement an EIS in an enterprise and each organisation exhibits specific considerations. The current findings indicate that managers must focus on ameliorating inferior project performance levels, enabling a transition from 'vicious' to 'virtuous' projects. Successful EIS implementation yields substantial organisational advantages.
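
    A toy sketch of this analysis pipeline (random counts stand in for the real co-citation matrix): co-citation counts are converted to correlations between citation profiles, factored in a PCA-style reduction, and projected to two dimensions for visual mapping:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.manifold import MDS

        rng = np.random.default_rng(1)
        A = rng.integers(0, 20, size=(10, 10))
        C = (A + A.T) // 2                       # symmetric co-citation counts
        R = np.corrcoef(C)                       # correlations of citation profiles
        pca = PCA(n_components=3).fit(R)         # factor-analysis-style reduction
        print("explained variance:", pca.explained_variance_ratio_)
        coords = MDS(n_components=2, random_state=0).fit_transform(R)
        print(coords[:3])                        # 2-D map coordinates for clustering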

  9. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    Science.gov (United States)

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and…

  10. How Human Information Behaviour Researchers Use Each Other's Work: A Basic Citation Analysis Study

    Science.gov (United States)

    McKechnie, Lynne E. F.; Goodall, George R.; Lajoie-Paquette, Darian; Julien, Heidi

    2005-01-01

    Introduction: The purpose of this study was to determine if and how human information behaviour (HIB) research is used by others. Method: Using ISI Web of Knowledge, a citation analysis was conducted on 155 English-language HIB articles published from 1993 to 2000 in six prominent LIS journals. The bibliometric core of 12 papers was identified.…

  11. The dynamic of information-driven coordination phenomena: a transfer entropy analysis

    CERN Document Server

    Borge-Holthoefer, Javier; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2015-01-01

    Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time-scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social sub-units. In the absence of ...

  12. The Correspondence Analysis Platform for Uncovering Deep Structure in Data and Information

    CERN Document Server

    Murtagh, Fionn

    2008-01-01

    We study two aspects of information semantics: (i) the collection of all relationships, (ii) tracking and spotting anomaly and change. The first is implemented by endowing all relevant information spaces with a Euclidean metric in a common projected space. The second is modelled by an induced ultrametric. A very general way to achieve a Euclidean embedding of different information spaces based on cross-tabulation counts (and from other input data formats) is provided by Correspondence Analysis. From there, the induced ultrametric that we are particularly interested in takes a sequential - e.g. temporal - ordering of the data into account. We employ such a perspective to look at narrative, "the flow of thought and the flow of language" (Chafe). In application to policy decision making, we show how we can focus analysis in a small number of dimensions.
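
    The Euclidean embedding via correspondence analysis can be sketched compactly from a cross-tabulation: form the correspondence matrix, standardize the residuals by row and column masses, and read the principal coordinates off the SVD. The contingency table below is invented:

        import numpy as np

        N = np.array([[20.,  5.,  3.],
                      [ 4., 18.,  6.],
                      [ 2.,  7., 15.]])          # invented cross-tabulation counts
        P = N / N.sum()                          # correspondence matrix
        r, c = P.sum(axis=1), P.sum(axis=0)      # row and column masses
        S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
        U, sv, Vt = np.linalg.svd(S, full_matrices=False)
        row_coords = (U * sv) / np.sqrt(r)[:, None]          # principal row coordinates
        print(row_coords[:, :2])                 # Euclidean embedding of row profiles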

  13. An empirical methodology derived from the analysis of information remaining on second hand hard disks

    Science.gov (United States)

    Fragkos, Grigorios; Mee, Vivienne; Xynos, Konstantinos; Angelopoulou, Olga

    In this paper we present the findings of an analysis of approximately 260 second-hand disks that was conducted in 2006. A third-party organisation bought the disks from the second-hand market, providing a degree of anonymity. This paper will demonstrate the quantitative outcomes of the analysis and the overall experiences. It will look at how analysts can expand their tools and techniques in order to achieve faster results, how one can organise the analysis based on the way information is found, and finally how a holistic picture of the case can be generated following the proposed methodology.

  14. Information Flow Through Stages of Complex Engineering Design Projects: A Dynamic Network Analysis Approach

    DEFF Research Database (Denmark)

    Parraguez, Pedro; Eppinger, Steven D.; Maier, Anja

    2015-01-01

    The pattern of information flow through the network of interdependent design activities is thought to be an important determinant of engineering design process results. A previously unexplored aspect of such patterns relates to the temporal dynamics of information transfer between activities as those activities are implemented through the network of people executing the project. To address this gap, we develop a dynamic modeling method that integrates both the network of people and the network of activities in the project. We then employ a large dataset collected from an industrial setting … information flows between activities in complex engineering design projects; 2) we show how the network of information flows in a large-scale engineering project evolved over time and how network analysis yields several managerial insights; and 3) we provide a useful new representation of the engineering…

  15. An Analysis of Risk and Function Information in Early Stage Design

    Science.gov (United States)

    Barrientos, Francesca; Tumer, Irem; Grantham, Katie; VanWie, Michael; Stone, Robert

    2005-01-01

    The concept of function offers a high potential for thinking and reasoning about designs as well as providing a common thread for relating together other design information. This paper focuses specifically on the relation between function and risk by examining how this information is addressed for a design team conducting early stage design for space missions. Risk information is decomposed into a set of key attributes which are then used to scrutinize the risk information using three approaches from the pragmatics sub-field of linguistics: i) Gricean, ii) Relevance Theory, and iii) Functional Analysis. Results of this linguistics-based approach descriptively account for the context of designer communication with respect to function and risk, and offer prescriptive guidelines for improving designer communication.

  16. Getting and giving information: analysis of a family-interview strategy.

    Science.gov (United States)

    Viaro, M; Leonardi, P

    1983-03-01

    This paper reports on a videotape study of particular aspects of the two-part interview developed by Selvini-Palazzoli et al. (8, 9). The first segment is a "search for information," the second part the application of an intervention based on the information gathered in the first part. The study focused on the strategies of information retrieval on the premise that they are significant for the quality of information gathered and for the criteria implicitly conveyed by the therapist that in turn have their own substantial impact on the system. We have employed theories of communication, particularly conversational analysis, that are a departure from the epistemological premises of systems theory and communication pragmatics proposed by Selvini-Palazzoli et al. as the theoretical underpinning of their interview technique.

  17. Radiological accidents: analysis of the information disseminated by media and public acceptance of nuclear technology

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, Jose Ubiratan; Tauhata, Luiz [Instituto de Radioprotecao e Dosimetria (IRD), Rio de Janeiro, RJ (Brazil); Garcia, Marcia Maria [Fundacao Inst. Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, RJ (Brazil). Dept. de Virologia

    1995-12-31

    A methodology for quantitatively treating information disseminated by the media concerning a nuclear or radiological accident is presented. It allows information to be classified, according to its amount, importance and manner of presentation, into one indicator named the Information Equivalent. This establishes a procedure for the analysis of released information and includes: the number of headlines, illustrations, printed lines, editorials, authorities quoted and so on. Interpretation becomes easier when the evolution and statistical trend of this indicator are observed. The application to evaluating the dissemination of the accident which took place in 1987 in Goiania, Brazil, was satisfactory and allowed us to propose a model. This will aid planning and the decision-making process, and it will improve relationships between technical staff and the media during an emergency. (author). 5 refs., 4 figs., 3 tabs.

  18. Information-theoretic analysis of the dynamics of an executable biological model.

    Directory of Open Access Journals (Sweden)

    Avital Sadot

    Full Text Available To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an algorithmic information theory based approach for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD analysis to the state representations of a model that simulates the immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
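
    The normalized compression distance at the heart of this analysis is easy to state: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(.) is a compressed length. A minimal sketch with zlib and invented state strings:

        import zlib

        def ncd(x: bytes, y: bytes) -> float:
            cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
            cxy = len(zlib.compress(x + y))
            return (cxy - min(cx, cy)) / max(cx, cy)

        # Invented textual encodings of two simulation states.
        state_t0 = b"Tcell:naive|APC:resting|IL2:low" * 10
        state_t1 = b"Tcell:activated|APC:mature|IL2:high" * 10
        print(ncd(state_t0, state_t1))   # near 0 for similar states, near 1 for unrelated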

  19. Analysis of a distributed neural system involved in spatial information, novelty, and memory processing.

    Science.gov (United States)

    Menon, V; White, C D; Eliez, S; Glover, G H; Reiss, A L

    2000-10-01

    Perceiving a complex visual scene and encoding it into memory involves a hierarchical distributed network of brain regions, most notably the hippocampus (HIPP), parahippocampal gyrus (PHG), lingual gyrus (LNG), and inferior frontal gyrus (IFG). Lesion and imaging studies in humans have suggested that these regions are involved in spatial information processing as well as novelty and memory encoding; however, the relative contributions of these regions of interest (ROIs) are poorly understood. This study investigated regional dissociations in spatial information and novelty processing in the context of memory encoding using a 2 x 2 factorial design with factors Novelty (novel vs. repeated) and Stimulus (viewing scenes with rich vs. poor spatial information). Greater activation was observed in the right than in the left hemisphere; however, hemispheric effects did not differ across regions, novelty, or stimulus type. Significant novelty effects were observed in all four regions. A significant ROI x Stimulus interaction was observed: spatial information processing effects were largest in the LNG, significant in the PHG and HIPP, and nonsignificant in the IFG. Novelty processing was stimulus dependent in the LNG and stimulus independent in the PHG, HIPP, and IFG. Analysis of the profile of the Novelty x Stimulus interaction across ROIs provided evidence for a hierarchical independence in novelty processing, characterized by increasing dissociation from spatial information processing. Despite these differences in spatial information processing, memory performance for novel scenes with rich and poor spatial information was not significantly different. Memory performance was inversely correlated with right IFG activation, suggesting the involvement of this region in strategically flawed encoding effort. Stepwise regression analysis revealed that memory encoding accounted for only a small fraction of the variance in temporal lobe activation. The implications of these results for

  20. Carpal tunnel syndrome: Analysis of online patient information with the EQIP tool.

    Science.gov (United States)

    Frueh, F S; Palma, A F; Raptis, D A; Graf, C P; Giovanoli, P; Calcagni, M

    2015-06-01

    Patients suffering from carpal tunnel syndrome (CTS) actively search for medical information on the Internet, and the World Wide Web represents their main source of patient information. The aim of this study was to systematically assess the quality of patient information about CTS on the Internet. A qualitative and quantitative assessment of websites was performed with the modified Ensuring Quality Information for Patients (EQIP) tool, which contains 36 standardized items. Five hundred websites with information on CTS treatment options were identified through Google, Bing, Yahoo, Ask.com and AOL. Duplicates and irrelevant websites were excluded, leaving one hundred and ten websites for inclusion. Only five websites addressed more than 20 items; quality scores were not significantly different between the various provider groups. A median of 15 EQIP items was found, with the top website addressing 26 out of 36 items. Major complications such as median nerve injury were reported in 27% of the websites and their treatment in only 3%. This analysis revealed several critical shortcomings in the quality of the information provided to patients suffering from CTS. There is a collective need to provide interactive, informative and educational websites for standard procedures in hand surgery. These websites should be compatible with international quality standards for hand surgery procedures.

  1. The development of an information criterion for Change-Point Analysis

    CERN Document Server

    Wiggins, Paul A

    2015-01-01

    Change-point analysis is a flexible and computationally tractable tool for the analysis of time series data from systems that transition between discrete states and whose observables are corrupted by noise. The change-point algorithm is used to identify the time indices (change points) at which the system transitions between these discrete states. We present a unified information-based approach to testing for the existence of change points. This new approach reconciles two previously disparate approaches to change-point analysis (frequentist and information-based) for testing transitions between states. The resulting method is statistically principled, parameter- and prior-free, and applicable to a wide range of change-point problems.
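
    To convey the flavor of information-based change-point testing (a generic BIC-style sketch, not the authors' specific criterion): compare the no-change Gaussian fit against the best single split, charging a log(n) penalty for the extra parameters of the two-segment model:

        import numpy as np

        def neg_log_lik(x):
            """Gaussian negative log-likelihood at the MLE."""
            n, var = len(x), x.var() + 1e-12
            return 0.5 * n * (np.log(2 * np.pi * var) + 1)

        def best_change_point(x, min_seg=5):
            n = len(x)
            null = neg_log_lik(x)
            costs = [(neg_log_lik(x[:k]) + neg_log_lik(x[k:]), k)
                     for k in range(min_seg, n - min_seg)]
            best_cost, best_k = min(costs)
            # Charge a BIC-style penalty for the extra mean, variance and
            # change-point index of the two-segment model.
            return best_k if best_cost + 1.5 * np.log(n) < null else None

        rng = np.random.default_rng(2)
        x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
        print(best_change_point(x))   # detected index, close to 100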

  2. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the temporality of interrelations between interlinked variables and, as such, can be applied to a range of datasets. By providing a statistical analysis of the networks' growth, the proposed method allows for the modelling of complex patterns of activity. Throughout, the method is demonstrated with respect to a fully realised example of information seeking activity. The core contribution of the proposed method is in supporting the analysis of activity with respect to both macro and micro level temporal interactions between variables.

  3. Information Communication Technology and Politics: A Synthesized Analysis of the Impacts of Information Technology on Voter Participation in Kenya

    Science.gov (United States)

    Tsuma, Clive Katiba

    2011-01-01

    The availability of political information throughout society made possible by the evolution of contemporary information communication technology has precipitated conflicting debate regarding the effects of technology use on real life political participation. Proponents of technology argue that the use of new information technology stimulates…

  4. Empowering Students to Make Sense of an Information-Saturated World: The Evolution of "Information Searching and Analysis"

    Science.gov (United States)

    Wittebols, James H.

    2016-01-01

    How well students conduct research online is an increasing concern for educators at all levels, especially higher education. This paper describes the evolution of a course that examines confirmation bias, information searching, and the political economy of information as keys to becoming more information and media literate. After a key assignment…

  5. NASA Informal Education: Final Report. A Descriptive Analysis of NASA's Informal Education Portfolio: Preliminary Case Studies

    Science.gov (United States)

    Rulf Fountain, Alyssa; Levy, Abigail Jurist

    2010-01-01

    This report was requested by the National Aeronautics and Space Administration's (NASA), Office of Education in July 2009 to evaluate the Informal Education Program. The goals of the evaluation were twofold: (1) to gain insight into its investment in informal education; and (2) to clarify existing distinctions between its informal education…

  6. The dynamics of information-driven coordination phenomena: A transfer entropy analysis.

    Science.gov (United States)

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-04-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.
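
    A toy version of the symbolic transfer entropy computation described above: the series are symbolized by the rank pattern of sliding windows, and plug-in probabilities are used in TE(Y→X) = Σ p(x', x, y) log2[ p(x'|x, y) / p(x'|x) ]; the coupled series are synthetic:

        import numpy as np
        from collections import Counter

        def symbolize(series, m=3):
            return [tuple(np.argsort(series[i:i + m]))
                    for i in range(len(series) - m + 1)]

        def transfer_entropy(x, y, m=3):
            sx, sy = symbolize(x, m), symbolize(y, m)
            triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))
            pairs_xy = Counter(zip(sx[:-1], sy[:-1]))
            pairs_xx = Counter(zip(sx[1:], sx[:-1]))
            singles = Counter(sx[:-1])
            n = len(sx) - 1
            te = 0.0
            for (x1, x0, y0), count in triples.items():
                p_cond_xy = count / pairs_xy[(x0, y0)]
                p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
                te += (count / n) * np.log2(p_cond_xy / p_cond_x)
            return te

        rng = np.random.default_rng(3)
        y = rng.normal(size=500)
        x = np.roll(y, 1) + 0.3 * rng.normal(size=500)   # X follows Y with lag 1
        print(transfer_entropy(x, y), transfer_entropy(y, x))  # first value is larger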

  7. Information architecture: study and analysis of data Public Medical base (PubMed

    Directory of Open Access Journals (Sweden)

    Odete Máyra Mesquita Sales

    2016-07-01

    Full Text Available Objective. Based on the principles proposed by Rosenfeld and Morville (2006), the present study examined the PubMed database interface, since a well-structured information architecture contributes to good usability in any digital environment. Method. The research was developed through literature review and an empirical study analysing the information architecture in terms of the organization, navigation, labeling and search systems recommended by Rosenfeld and Morville (2006), with regard to the usability of the PubMed database. For a better understanding and description of these principles, the technique of content analysis was used. Results. The results showed that the database interface meets the criteria established by the elements of information architecture, such as an organization based on a hypertext structure, a horizontal menu with local content divided into categories, identification of active links, global navigation, breadcrumbs, textual and iconographic labeling, and a prominent search engine. Conclusions. This research showed that the PubMed database interface is well structured, friendly and objective, with numerous possibilities for search and information retrieval. However, there is a need to adopt accessibility standards on this website, so that it more efficiently reaches its purpose of facilitating access to the information organized and stored in the PubMed database.

  8. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

    Full Text Available In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. The structural content of medical images and object identification play a significant role in image content analysis and information retrieval. There are three fundamental components of content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features: colour, texture and shape. Colour and texture are both important visual features used in content-based image retrieval to improve results. In this paper, we present an effective image retrieval system using texture, shape and colour features, called CBIAIR (Content-Based Image Analysis and Information Retrieval). Firstly, we have developed a new pixel-based texture pattern feature for the CBIAIR system. Subsequently, we have used a semantic colour feature for the colour-based feature, while shape-based feature selection is done using an existing technique. For retrieval, these features are extracted from the query image and matched with the feature library using a feature-weighted distance. All feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose matching distance is less than a predefined threshold are retrieved from the image database using a K-NN classifier.
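
    A stripped-down sketch of feature-weighted retrieval in the spirit of this pipeline; the fused feature vectors, block weights and threshold are all invented:

        import numpy as np

        rng = np.random.default_rng(4)
        database = rng.random((50, 12))            # 50 images, 12-D fused features
        query = rng.random(12)                     # features extracted from the query image
        weights = np.repeat([0.5, 0.3, 0.2], 4)    # colour, texture, shape blocks

        dist = np.sqrt((((database - query) ** 2) * weights).sum(axis=1))
        k, threshold = 5, 1.0
        nearest = np.argsort(dist)[:k]             # k-NN step
        relevant = nearest[dist[nearest] < threshold]
        print("retrieved image ids:", relevant)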

  9. Development of efficient system for collection-analysis-application of information using system for technology and information in field of RI-biomics

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-08-15

    RI-Biomics is a new radiation fusion technology in which the characteristics of radioisotopes are applied to biomics. In order to share and comprehensively analyse data between institutions through the total management of information in the field of RI-Biomics, the RI-Biomics information portal ‘RIBio-Info’ was constructed by KARA (Korean Association for Radiation Application) in February 2015. For the systematic operation of the ‘RIBio-Info’ system, a system for the collection, analysis and application of information is required. In this paper, we therefore summarize the development of document forms for each process of information collection, analysis and application, the systematization of information collection methods, and the establishment of methods for the characteristic analysis of reports such as issue papers, policy reports, global market reports and watch reports. These are expected to improve practical applicability in this field by achieving a circular structure of information collection, analysis and application that invigorates users' technology development.

  10. Technology and Research Requirements for Combating Human Trafficking: Enhancing Communication, Analysis, Reporting, and Information Sharing

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Sean J.; West, Curtis L.; Olson, Jarrod

    2011-03-17

    DHS’ Science & Technology Directorate directed PNNL to conduct an exploratory study on the domain of human trafficking in the Pacific Northwest in order to examine and identify technology and research requirements for enhancing communication, analysis, reporting, and information sharing – activities that directly support efforts to track, identify, deter, and prosecute human trafficking – including identification of potential national threats from smuggling and trafficking networks. This effort was conducted under the Knowledge Management Technologies Portfolio as part of the Integrated Federal, State, and Local/Regional Information Sharing (RISC) and Collaboration Program.

  11. Spike train analysis toolkit: enabling wider application of information-theoretic techniques to neurophysiology.

    Science.gov (United States)

    Goldberg, David H; Victor, Jonathan D; Gardner, Esther P; Gardner, Daniel

    2009-09-01

    Conventional methods widely available for the analysis of spike trains and related neural data include various time- and frequency-domain analyses, such as peri-event and interspike interval histograms, spectral measures, and probability distributions. Information theoretic methods are increasingly recognized as significant tools for the analysis of spike train data. However, developing robust implementations of these methods can be time-consuming, and determining applicability to neural recordings can require expertise. In order to facilitate more widespread adoption of these informative methods by the neuroscience community, we have developed the Spike Train Analysis Toolkit. STAToolkit is a software package which implements, documents, and guides application of several information-theoretic spike train analysis techniques, thus minimizing the effort needed to adopt and use them. This implementation behaves like a typical Matlab toolbox, but the underlying computations are coded in C for portability, optimized for efficiency, and interfaced with Matlab via the MEX framework. STAToolkit runs on any of three major platforms: Windows, Mac OS, and Linux. The toolkit reads input from files with an easy-to-generate text-based, platform-independent format. STAToolkit, including full documentation and test cases, is freely available open source via http://neuroanalysis.org , maintained as a resource for the computational neuroscience and neuroinformatics communities. Use cases drawn from somatosensory and gustatory neurophysiology, and community use of STAToolkit, demonstrate its utility and scope.
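
    As an illustration of the kind of computation such a toolkit supports (a generic Python sketch, not STAToolkit's own Matlab API): the "direct method" estimates entropy from the frequencies of binary words in a discretized spike train:

        import numpy as np
        from collections import Counter

        def word_entropy(spike_times, t_max, bin_ms=3.0, word_len=8):
            """Plug-in entropy rate (bits per bin) of binary spike words."""
            edges = np.arange(0.0, t_max + bin_ms, bin_ms)
            binned = np.histogram(spike_times, bins=edges)[0] > 0
            words = [tuple(binned[i:i + word_len])
                     for i in range(len(binned) - word_len)]
            counts = np.array(list(Counter(words).values()), dtype=float)
            p = counts / counts.sum()
            return -(p * np.log2(p)).sum() / word_len

        spikes = np.sort(np.random.default_rng(5).uniform(0, 1000, 300))  # ms, synthetic
        print(word_entropy(spikes, t_max=1000.0))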

  12. Applying a sociolinguistic model to the analysis of informed consent documents.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  13. Celebrity Health Announcements and Online Health Information Seeking: An Analysis of Angelina Jolie's Preventative Health Decision.

    Science.gov (United States)

    Dean, Marleah

    2016-01-01

    On May 14, 2013, Angelina Jolie disclosed that she carries BRCA1, which means she has an 87% risk of developing breast cancer during her lifetime. Jolie decided to undergo a preventative bilateral mastectomy (PBM), reducing her risk to 5%. The purpose of this study was to analyze the type of information individuals are exposed to when using the Internet to search for health information regarding Jolie's decision. Qualitative content analysis revealed four main themes: information about genetics, information about a PBM, information about health care, and information about Jolie's gender identity. Broadly, the identified websites mention Jolie's high risk of developing cancer due to the genetic mutation BRCA1; describe a PBM, occasionally noting reasons why she had this surgery and providing alternatives to it; discuss issues related to health care services, costs, and insurance surrounding Jolie's health decision; and portray Jolie as a sexual icon, a partner to Brad Pitt, a mother of six children, and an inspirational humanitarian. The websites also depict Jolie's health decision in positive, negative, and/or both ways. Discussion centers on how this actress' health decision impacts the public.

  14. Analysis of Transaction Costs in Logistics and the Methodologies for Their Information Reflection for Automotive Companies

    Directory of Open Access Journals (Sweden)

    Ol’ga Evgen’evna Kovrizhnykh

    2016-12-01

    Full Text Available Transaction costs emerge in different types of logistics activities and influence the material flow and the accompanying financial and information flows; for this reason, their information support and assessment are important tasks for the enterprise. The paper analyzes transaction costs in logistics for automotive manufacturers; according to the analysis, the level of these costs in any functional area of "logistics supply" ranges from 1.5 to 20%. These are only the official figures of enterprises' transaction costs, which do not take implicit costs into consideration. Despite the growing interest in transaction costs in logistics over the last fifteen years, this topic is covered rather poorly in the Russian literature: the definition of "transaction costs" is unclear, and there is no technique for their information reflection and assessment. We have developed methods for the information reflection of transaction costs that can be used by automotive enterprises. Each enterprise will have the opportunity to choose the most suitable technique for the information reflection of transaction costs, or to compare the level of transaction costs under different techniques. Applying techniques for the information reflection of transaction costs allows enterprises to increase profits by optimizing and reducing costs and using their assets more effectively; to identify possible ways to improve the cost parameters of their performance; to improve their efficiency and productivity; to cut out unnecessary or duplicate activities; and to optimize the number of staff involved in a particular activity.

  15. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to the integrated service of geospatial information, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product service to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China already provides a great quantity of geospatial information service to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are increasingly obvious, due to the rapidly emerging requirements of e-government construction, the remarkable development of IT technology, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services, and an API for further development, to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and will surely have great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  16. The 2006 Analysis of Information Remaining on Disks Offered for Sale on the Second Hand Market

    Directory of Open Access Journals (Sweden)

    Andy Jones

    2006-09-01

    Full Text Available All organisations, whether in the public or private sector, use computers for the storage and processing of information relating to their business or services, their employees and their customers. A large proportion of families and individuals in their homes now also use personal computers and, both intentionally and inadvertently, often store on those computers personal information. It is clear that most organisations and individuals continue to be unaware of the information that may be stored on the hard disks that the computers contain, and have not considered what may happen to the information after the disposal of the equipment.In 2005, joint research was carried out by the University of Glamorgan in Wales and Edith Cowan University in Australia to determine whether second hand computer disks that were purchased from a number of sources still contained any information or whether the information had been effectively erased. The research revealed that, for the majority of the disks that were examined, the information had not been effectively removed and as a result, both organisations and individuals were potentially exposed to a range of potential crimes.  It is worthy of note that in the disposal of this equipment, the organisations involved had failed to meet their statutory, regulatory and legal obligations.This paper describes a second research project that was carried out in 2006 which repeated the research carried out the previous year and also extended the scope of the research to include additional countries.  The methodology used was the same as that in the previous year and the disks that were used for the research were again supplied blind by a third party. The research involved the forensic imaging of the disks which was followed by an analysis of the disks to determine what information remained and whether it could be easily recovered using publicly available tools and techniques.

  17. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks

    Science.gov (United States)

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns turn into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér–Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the entire wireless localization system model. The unknown vector of the CRLB then consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map-matching method, and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for
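
    For a concrete special case, the CRLB for 2-D range-based localization with Gaussian range noise has a compact form: the Fisher information is the sum over anchors of outer products of unit vectors toward the target, scaled by 1/sigma^2. The anchor layout and noise level below are invented:

        import numpy as np

        anchors = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
        target = np.array([3., 4.])
        sigma = 0.5                                # range-noise std (m), invented

        diff = target - anchors
        d = np.linalg.norm(diff, axis=1)
        u = diff / d[:, None]                      # unit vectors anchor -> target
        J = (u.T @ u) / sigma**2                   # Fisher information matrix
        crlb = np.linalg.inv(J)                    # covariance lower bound
        print("RMSE lower bound: %.3f m" % np.sqrt(np.trace(crlb)))

    In the paper's framework, auxiliary information (NLOS flags, map constraints, prior states) would enter as additional blocks of the Fisher information matrix.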

  18. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yubin Zhao

    2016-08-01

    Full Text Available In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns turn into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér–Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the entire wireless localization system model. The unknown vector of the CRLB then consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map-matching method, and NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the

  19. Information Gap Analysis: near real-time evaluation of disaster response

    Science.gov (United States)

    Girard, Trevor

    2014-05-01

    Disasters, such as major storm events or earthquakes, trigger an immediate response by the disaster management system of the nation in question. The quality of this response is a large factor in its ability to limit the impacts on the local population. Improving the quality of disaster response therefore reduces disaster impacts. Studying past disasters is a valuable exercise to understand what went wrong, identify measures which could have mitigated these issues, and make recommendations to improve future disaster planning and response. While such ex post evaluations can lead to improvements in the disaster management system, there are limitations. The main limitation that has influenced this research is that ex post evaluations do not have the ability to inform the disaster response being assessed for the obvious reason that they are carried out long after the response phase is over. The result is that lessons learned can only be applied to future disasters. In the field of humanitarian relief, this limitation has led to the development of real time evaluations. The key aspect of real time humanitarian evaluations is that they are completed while the operation is still underway. This results in findings being delivered at a time when they can still make a difference to the humanitarian response. Applying such an approach to the immediate disaster response phase requires an even shorter time-frame, as well as a shift in focus from international actors to the nation in question's government. As such, a pilot study was started and methodology developed, to analyze disaster response in near real-time. The analysis uses the information provided by the disaster management system within the first 0 - 5 days of the response. The data is collected from publicly available sources such as ReliefWeb and sorted under various categories which represent each aspect of disaster response. This process was carried out for 12 disasters. The quantity and timeliness of information

  20. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    Full Text Available This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has been recently introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
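    For quasi-static Rayleigh fading, the WITS literature gives a simple closed form for the Probability of Nonzero Secrecy Capacity in terms of the two average SNRs; the sketch below assumes that standard formulation (the abstract itself does not state which formula the authors evaluated).

    ```python
    def prob_nonzero_secrecy_capacity(avg_snr_main: float, avg_snr_wiretap: float) -> float:
        """P(Cs > 0) for quasi-static Rayleigh-fading main and wiretap channels.

        Arguments are average SNRs on a linear scale (not dB). Secrecy capacity
        is positive exactly when the main channel's instantaneous SNR exceeds
        the eavesdropper's, which for Rayleigh fading reduces to this ratio.
        """
        return avg_snr_main / (avg_snr_main + avg_snr_wiretap)

    # Example: main channel averages 15 dB, eavesdropper's channel 10 dB.
    gamma_m = 10 ** (15 / 10)
    gamma_w = 10 ** (10 / 10)
    print(f"P(Cs > 0) = {prob_nonzero_secrecy_capacity(gamma_m, gamma_w):.3f}")
    ```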

  1. Descriptive analysis of the inequalities of health information resources between Alberta's rural and urban health regions.

    Science.gov (United States)

    Stieda, Vivian; Colvin, Barb

    2009-01-01

    In an effort to understand the extent of the inequalities in health information resources across Alberta, SEARCH Custom, HKN (Health Knowledge Network) and IRREN (Inter-Regional Research and Evaluation Network) conducted a survey in December 2007 to determine what library resources currently existed in Alberta's seven rural health regions and the two urban health regions. Although anecdotal evidence indicated that these gaps existed, the analysis was undertaken to provide empirical evidence of the exact nature of these gaps. The results, coupled with the published literature on the impact, effectiveness and value of information on clinical practice and administrative decisions in healthcare management, will be used to build momentum among relevant stakeholders to support a vision of equitably funded health information for all healthcare practitioners across the province of Alberta.

  2. Local mine production safety supervision game analysis based on incomplete information

    Institute of Scientific and Technical Information of China (English)

    LI Xing-dong; LI Ying; REN Da-wei; LIU Zhao-xia

    2007-01-01

    Using the fundamental theory and analysis methods of incomplete-information repeated games, this paper introduces incomplete information into repeated games and establishes a two-stage dynamic game model of the local authority and the coal mine owner. The analytic result indicates that, as long as the country establishes a corresponding rewards-and-punishments incentive mechanism for the local authority departments responsible for the work, safety accidents in coal mines are reported on time. The conclusion about whether the local government displays cooperative or non-cooperative behavior changes with the introduction of incomplete information. Only if the local authority fulfills its responsibility can unsafe accidents be controlled effectively. Once this kind of cooperation by the local government appears, the country's costs and difficulty of safety supervision decrease greatly.

  3. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Science.gov (United States)

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  4. How geographical information systems analysis influences the continuum of patient care.

    Science.gov (United States)

    Pliskie, Jennifer; Wallenfang, Laura

    2014-01-01

    As the vast repository of data about millions of patients grows, the analysis of this information is changing the provider-patient relationship and influencing the continuum of care for broad swaths of the population. At the same time, while population health management moves from a volume-based model to a value-based one and additional patients seek care due to healthcare reform, hospitals and healthcare networks are evaluating their business models and searching for new revenue streams. Utilizing geographical information systems to model and analyze large amounts of data is helping organizations better understand the characteristics of their patient population, demographic and socioeconomic trends, and shifts in the utilization of healthcare. In turn, organizations can more effectively conduct service line planning, strategic business plans, market growth strategies, and human resource planning. Healthcare organizations that use GIS modeling can set themselves apart by making more informed and objective business strategy decisions.

  5. Analysis of information for cerebrovascular disorders obtained by 3D MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Kohki [Tokyo Univ. (Japan). Inst. of Medical Science; Yoshioka, Naoki; Watanabe, Fumio; Shiono, Takahiro; Sugishita, Morihiro; Umino, Kazunori

    1995-12-01

    Recently, it has become easy to analyze information obtained by 3D MR imaging due to remarkable progress in fast MR imaging techniques and analysis tools. Six patients suffering from aphasia (4 cerebral infarctions and 2 bleedings) underwent 3D MR imaging (3D FLASH-TR/TE/flip angle: 20-50 msec/6-10 msec/20-30 degrees) and their volume information was analyzed by multiple projection reconstruction (MPR), surface rendering 3D reconstruction, and volume rendering 3D reconstruction using Volume Design PRO (Medical Design Co., Ltd.). Four of them were diagnosed clinically as having Broca's aphasia, and their lesions could be detected around the cortices of the left inferior frontal gyrus. The other 2 patients were diagnosed as having Wernicke's aphasia, and the lesions could be detected around the cortices of the left supramarginal gyrus. This technique for 3D volume analysis would provide quite exact locational information about cerebral cortical lesions. (author).

  6. Eye-tracking Information Processing in Choice-based Conjoint Analysis

    DEFF Research Database (Denmark)

    Meissner, Martin; Decker, Reinhold

    2010-01-01

    Choice models are a common tool in market research for quantifying the influence of product attributes on consumer decisions. Process tracing techniques, on the other hand, try to answer the question of how people process information and make decisions in choice tasks. This paper suggests...... a combination of both approaches for in-depth investigations of consumer decision processes in preference measurement by means of choice-based conjoint (CBC) analysis. We discuss different process tracing techniques and propose an attribute-specific strategy measure for the analysis of CBC results. In our...... empirical study we eye-track respondents evaluating CBC choice tasks for single-cup coffee brewers. On the basis of several hypotheses we illustrate the benefits of simultaneously recording eye-tracking information for market research....

  7. The Impact of Informal Economy in the Pension System, Empirical Analysis. The Albanian Case

    Directory of Open Access Journals (Sweden)

    Bernard Dosti

    2015-02-01

    Full Text Available Using a simple model, we analyze the impact that informality has on the amount of workers' consumption over their life cycle. This paper deals with the interconnections of underreported earnings, savings, and old-age pension. The workers sampled for this analysis have been divided into three groups: (1) low-income employees; (2) higher-income employees who declare all their income; (3) employees who underreport their income. The analysis is based on two pension models: the model that calculates the pension in proportion to income, and the basic model, whose objective is poverty reduction for the “third age”. The major result is as follows: given that the basic pension system favors employees who underreport their income, and that the impact of informality is greater in the basic system than in the proportional pension system, the application of the basic pension system in the Albanian case might be problematic.

  8. Information Presentation in Decision and Risk Analysis: Answered, Partly Answered, and Unanswered Questions.

    Science.gov (United States)

    Keller, L Robin; Wang, Yitong

    2016-09-21

    For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions.

  9. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’ is developed which offers an account of structure and agency; social practices...... developing and changing over time and space. The most common application of structuration theory to the IS domain is the analysis of empirical situations using the ‘dimensions of the duality of structure’ model. The best-known attempts to theorize IS concerns using this approach have come from Orlikowski...

  10. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM) such as regression splines or decision tree induction can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA), and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore

  11. Integrating semantic annotation and information visualization for the analysis of multichannel fluorescence micrographs from pancreatic tissue.

    Science.gov (United States)

    Herold, Julia; Zhou, Luxian; Abouna, Sylvie; Pelengaris, Stella; Epstein, David; Khan, Michael; Nattkemper, Tim W

    2010-09-01

    The challenging problem of computational bioimage analysis receives growing attention from the life sciences. Fluorescence microscopy is capable of simultaneously visualizing multiple molecules by staining with different fluorescent dyes. In the analysis of the resulting multichannel images, segmentation of ROIs represents only a first step, which must be followed by a second step: the analysis of the ROIs' signals in the different channels. In this paper we present a system that combines image segmentation and information visualization principles for an integrated analysis of fluorescence micrographs of tissue samples. The analysis aims at the detection and annotation of cells of the Islets of Langerhans and the whole pancreas, which is of great importance in diabetes studies and in the search for new anti-diabetes treatments. The system operates with two modules. The automatic annotation module applies supervised machine learning for cell detection and segmentation. The second, information visualization module can be used for interactive classification and visualization of cell types following the link-and-brush principle for filtering. We compare the results obtained with our system with results obtained manually by an expert, who evaluated a set of example images three times to account for his intra-observer variance. The comparison shows that using our system the images can be evaluated with high accuracy, which allows a considerable speed-up of the time-consuming evaluation process.

  12. Inclusion of Respiratory Frequency Information in Heart Rate Variability Analysis for Stress Assessment.

    Science.gov (United States)

    Hernando, Alberto; Lazaro, Jesus; Gil, Eduardo; Arza, Adriana; Garzon, Jorge Mario; Lopez-Anton, Raul; de la Camara, Concepcion; Laguna, Pablo; Aguilo, Jordi; Bailon, Raquel

    2016-07-01

    Respiratory rate and heart rate variability (HRV) are studied as stress markers in a database of young healthy volunteers subjected to acute emotional stress, induced by a modification of the Trier Social Stress Test. First, instantaneous frequency domain HRV parameters are computed using time-frequency analysis in the classical bands. Then, the respiratory rate is estimated and this information is included in HRV analysis in two ways: 1) redefining the high-frequency (HF) band to be centered at the respiratory frequency; 2) excluding from the analysis those instants where the respiratory frequency falls within the low-frequency (LF) band. Classical frequency domain HRV indices scarcely show statistical differences during stress. However, when including respiratory frequency information in HRV analysis, the normalized LF power as well as the LF/HF ratio significantly increase during stress (p-value < 0.05 according to the Wilcoxon test), revealing higher sympathetic dominance. The LF power increases during stress, only being significantly different in a stress anticipation stage, while the HF power decreases during stress, only being significantly different during the stress task demanding attention. Our results support that joint analysis of respiration and HRV obtains a more reliable characterization of autonomic nervous response to stress. In addition, the respiratory rate is observed to be higher and less stable during stress than during relaxation (p-value < 0.05 according to the Wilcoxon test), being the most discriminative index for stress stratification (AUC = 88.2%).
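    A minimal sketch of the band-redefinition idea above, assuming an evenly resampled RR-interval series and an already-estimated respiratory frequency; the 0.25 Hz band width and the Welch settings are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.signal import welch

    def lf_hf_ratio(rr_resampled, fs, resp_freq):
        """LF/HF ratio with the HF band re-centered at the respiratory frequency.

        rr_resampled : evenly resampled RR-interval series (s)
        fs           : resampling frequency of the series (Hz)
        resp_freq    : estimated respiratory frequency (Hz)
        """
        if resp_freq < 0.15:       # respiration overlaps the LF band:
            return float("nan")    # exclude this instant (method 2 above)
        f, pxx = welch(rr_resampled - np.mean(rr_resampled), fs=fs, nperseg=256)
        lf = (f >= 0.04) & (f < 0.15)                             # classical LF band
        hf = (f >= resp_freq - 0.125) & (f <= resp_freq + 0.125)  # respiration-centered HF
        return np.trapz(pxx[lf], f[lf]) / np.trapz(pxx[hf], f[hf])

    # Toy usage: 5 min of synthetic RR intervals resampled at 4 Hz, with a
    # respiration-driven oscillation at 0.3 Hz.
    rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.3 * np.arange(0, 300, 0.25))
    print(lf_hf_ratio(rr, fs=4.0, resp_freq=0.3))
    ```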

  13. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to partial periods and parts of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate the structural geometry, resistance model, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structure analysis, and construction management.

  14. PASSIVE LOCATION AND ACCURACY ANALYSIS USING TDOA INFORMATION OF MULTI-STATIONS

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new exact, explicit, non-iterative, and computationally efficient solution of the nonlinear equation set for estimation of emitter position based on the time differences of arrival (TDOA) measured by multiple stations is proposed. An accuracy analysis of the location method is also presented. Finally, performance evaluation results for emitter location using TDOA information are illustrated by graphs of the Geometrical Dilution of Precision (GDOP) for various conditions in the specific surveillance region.
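    The closed-form flavor of such solutions can be illustrated with a standard linearized least-squares TDOA estimator (a generic textbook construction, not necessarily this paper's algorithm): treating the unknown reference range as an extra unknown makes the differenced squared-range equations linear in the emitter coordinates.

    ```python
    import numpy as np

    def tdoa_locate(stations, range_diffs):
        """Linearized least-squares TDOA fix (2-D).

        stations    : (N, 2) array of station positions; row 0 is the reference.
        range_diffs : (N,) array of c * (t_i - t_1); entry 0 must be 0.
        Treats the reference range r1 as an extra unknown, so the differenced
        squared-range equations become linear in [x, y, r1].
        """
        s = np.asarray(stations, dtype=float)
        d = np.asarray(range_diffs, dtype=float)
        A = np.hstack([2 * (s[1:] - s[0]), 2 * d[1:, None]])
        b = np.sum(s[1:] ** 2, axis=1) - np.sum(s[0] ** 2) - d[1:] ** 2
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        return sol[:2]

    # Synthetic check: 4 stations, emitter at (300, 400).
    stations = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
    emitter = np.array([300.0, 400.0])
    ranges = np.linalg.norm(stations - emitter, axis=1)
    print(tdoa_locate(stations, ranges - ranges[0]))  # ~ [300, 400]
    ```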

  15. EQUILIBRIUM ANALYSIS OF FINANCIAL COMPANY BASED ON INFORMATION PROVIDED BY THE BALANCE SHEET

    Directory of Open Access Journals (Sweden)

    Ștefăniță ȘUȘU

    2014-06-01

    Full Text Available This article highlights the importance of indicators (such as net working capital, working capital requirements and net cash) by means of which financial balance is assessed, capitalizing on information released by the balance sheet of a tourism-profile entity. Theoretical concepts, presented in a logical sequence, are combined with a practical example based on the Turism Covasna company. The results of the analysis are interpreted while attempting to formulate solutions for the economic and financial viability of the entity.

  16. Comparative Analysis of Informal Caregiver Burden in Advanced Cancer, Dementia and Acquired Brain Injury

    OpenAIRE

    Harding, Richard; Gao, Wei; Jackson, Diana; Pearson, Clare; Murray, Joanna; Higginson, Irene J

    2015-01-01

    CONTEXT: Measurement and improvement of informal caregiver burden are central aims of policy and intervention. Burden itself is a complex construct and total burden can differ by patient diagnosis, although how diagnosis affects different aspects of caregiver subjective burden is unclear. OBJECTIVES: To compare the subjective burden of caregivers across three diagnostic groups using the 22-item Zarit Burden Inventory (ZBI). METHODS: We performed a secondary analysis of pooled cross-sectional da...

  17. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  18. Information search behaviour among new car buyers: A two-step cluster analysis

    Directory of Open Access Journals (Sweden)

    S.M. Satish

    2010-03-01

    Full Text Available A two-step cluster analysis of new car buyers in India was performed to identify taxonomies of search behaviour using personality and situational variables, apart from sources of information. Four distinct groups were found—broad moderate searchers, intense heavy searchers, low broad searchers, and low searchers. Dealers can identify the members of each segment by measuring the variables used for clustering, and can then design appropriate communication strategies.
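    Two-step clustering of the kind used in such segmentation studies is commonly approximated in open-source tooling by a pre-clustering pass followed by hierarchical merging; the sketch below, on random stand-in survey variables, shows that structure (a rough analogue, not the authors' exact SPSS procedure).

    ```python
    import numpy as np
    from sklearn.cluster import KMeans, AgglomerativeClustering
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))   # stand-in for personality, situational,
                                    # and information-source variables

    X_std = StandardScaler().fit_transform(X)

    # Step 1: pre-cluster into many small groups to compress the data.
    pre = KMeans(n_clusters=30, n_init=10, random_state=0).fit(X_std)

    # Step 2: hierarchically merge the pre-cluster centroids into final segments.
    merge = AgglomerativeClustering(n_clusters=4).fit(pre.cluster_centers_)

    # Map each respondent to a final segment via its pre-cluster.
    segments = merge.labels_[pre.labels_]
    print(np.bincount(segments))    # segment sizes, cf. the four groups above
    ```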

  19. An Illumination Invariant Face Detection Based on Human Shape Analysis and Skin Color Information

    Directory of Open Access Journals (Sweden)

    Dibakar Chakraborty

    2012-06-01

    Full Text Available This paper provides a novel approach towards face area localization through analyzing the shape characteristics of the human body. The face region is extracted by determining the sharp increase in body pixels in the shoulder area relative to the neck region. To confirm the face area, skin color information is also analyzed. The experimental analysis shows that the proposed algorithm detects the face area effectively and its performance is found to be quite satisfactory.

  20. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    Science.gov (United States)

    2007-02-05

    Report fragments: 07 planning conference, 14 Dec 06; II Marine Expeditionary Force (MEF) meeting with Major Smith, 14 Dec 06; Gulf of Mexico, Tyndall Air Force Base Missile...; restructured action item spreadsheet; reviewed the following storyboards (functional flow, graphics, and text): 050101 Main Rotor System components. Reliability Information Analysis Center, 6000 Flanagan Road.

  1. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  2. SOCIAL NETWORK ANALYSIS FOR THE MEASUREMENT OF FORMAL AND INFORMAL STRUCTURES

    Directory of Open Access Journals (Sweden)

    Siomara Maria Pierangeli Pascotto,

    2013-03-01

    Full Text Available A discussion around the application of the social network analysis methodology has helped in understanding the social dynamics of organizations, including governments, which are subject to different management structures with a high degree of functional hierarchy. This study was developed in a federal school institution in São Paulo State, Brazil, seeking to identify the links developed between the institution's administrative support staff members and their social networks. The research objective is to compare these link data with social network theories, which hold that the more ties (contacts) exist in an informal group, the more the network promotes an open environment for exchanging knowledge and information. The research sought to identify how the social networks develop, how common objectives and subjects turn people closely connected, how focal points and main contacts emerge from this community, and what their influence is on the rest of the group. Social network measurement methodologies were applied, combined with qualitative instrument analysis. In conclusion, it was found that the informal structure influences the formal structure by providing an environment for exchanging knowledge and information, and that some actors are in fact responsible for the dynamics of the networks, occupying driving and strategic positions that earn them recognition as such by the other agents in the net.

  3. ANALYSIS OF INFORMATION SYSTEM IMPLEMENTATION IN BINUS UNIVERSITY USING DELONE AND MCLEAN INFORMATION SYSTEM SUCCESS MODEL AND COBIT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Johan Muliadi Kerta

    2013-05-01

    Full Text Available The successful implementation of an information system in an organization will support the organization in achieving its goals. A successful information system will support the organization's day-to-day operations, so that problems can be resolved more quickly and easily. It is also necessary to measure the maturity level of the information system which has been developed and implemented, to determine whether the implementation of the information system is in accordance with the goals of the organization. To measure the success of information systems, the DeLone and McLean IS success model was used. To measure the maturity level of information systems, the COBIT (Control Objectives for Information and related Technology) framework, which provides best practices for IT governance and control, was used. The results of this analysis will assist and support the IT team in developing and building information systems that better fit the needs and goals of the organization.

  4. A path analysis on correlates of consumer trust in online health information: evidence from the health information national trends survey.

    Science.gov (United States)

    Ye, Yinjiao

    2010-01-01

    Many people look for health information online, and the Internet is the third most trusted health information source. What implications does this trust have on consumer health? Not much research has been done in this area. This study explored various health-related correlates of consumer trust in online health information, including Internet use for health, self-efficacy belief in managing one's own health, negative emotions, and subjective health status. The 2007 Health Information National Trends Survey data were analyzed. Results showed that controlling for demographics, trust in online health information was directly related to both Internet use for health and the self-efficacy belief, and was indirectly associated with negative emotions; the latter two factors in turn were associated with self-rated health.

  5. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  6. Improving access to health information for older migrants by using grounded theory and social network analysis to understand their information behaviour and digital technology use.

    Science.gov (United States)

    Goodall, K T; Newman, L A; Ward, P R

    2014-11-01

    Migrant well-being can be strongly influenced by the migration experience and subsequent degree of mainstream language acquisition. There is little research on how older Culturally And Linguistically Diverse (CALD) migrants who have 'aged in place' find health information, and the role which digital technology plays in this. Although the research for this paper was not focused on cancer, we draw out implications for providing cancer-related information to this group. We interviewed 54 participants (14 men and 40 women) aged 63-94 years, who were born in Italy or Greece, and who migrated to Australia mostly as young adults after World War II. Constructivist grounded theory and social network analysis were used for data analysis. Participants identified doctors, adult children, local television, spouse, local newspaper and radio as the most important information sources. They did not generally use computers, the Internet or mobile phones to access information. Literacy in their birth language, and the degree of proficiency in understanding and using English, influenced the range of information sources accessed and the means used. The ways in which older CALD migrants seek and access information has important implications for how professionals and policymakers deliver relevant information to them about cancer prevention, screening, support and treatment, particularly as information and resources are moved online as part of e-health.

  7. An Analysis of Information Technology on Data Processing by using Cobit Framework

    Directory of Open Access Journals (Sweden)

    Surni Erniwati

    2015-09-01

    Full Text Available Information technology and processes are interconnected, directing and controlling the company in achieving corporate goals through added value and a balancing of the risks and benefits of information technology. This study is aimed at analyzing the maturity level of the data management process and producing information technology recommendations that can be applied to the management of IT so that it better supports academic services. The maturity level calculation was done by analyzing questionnaires on the state of the information technology. The results of this study show that the governance of information technology in data processing at ASM Mataram is currently quite good. The current maturity value for data processing is 2.69, which means that the organization already has a repeated pattern in managing the activities related to data management processes. Based on the data analysis of the gap between the current and expected conditions, corrective actions can be taken to gradually improve IT governance of the data management process at ASM Mataram.
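    Maturity figures like the 2.69 above are typically produced by averaging per-attribute questionnaire scores on the 0-5 COBIT maturity scale; the abstract does not give the exact formula, so the sketch below, with invented attribute names and values, only illustrates that common convention.

    ```python
    # Hypothetical questionnaire scores (0-5 scale) for the maturity attributes
    # of one COBIT data-management process; names and values are illustrative,
    # chosen so the mean lands near the 2.69 reported above.
    scores = {
        "awareness": 3.1,
        "policies": 2.4,
        "tools_automation": 2.2,
        "skills": 2.8,
        "responsibility": 2.9,
        "goal_measurement": 2.75,
    }

    maturity = sum(scores.values()) / len(scores)
    print(f"Maturity level: {maturity:.2f}")   # ~2.69, the 'repeatable' band
    ```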

  8. Patient information on breast reconstruction in the era of the world wide web. A snapshot analysis of information available on youtube.com.

    Science.gov (United States)

    Tan, M L H; Kok, K; Ganesh, V; Thomas, S S

    2014-02-01

    Breast cancer patients' expectations and choice of reconstruction are increasing, and patients often satisfy their information needs outside clinic time by searching the world wide web. The aim of our study was to analyse the quality of content and extent of information regarding breast reconstruction available in YouTube videos and whether this is an appropriate additional source of information for patients. A snapshot qualitative and quantitative analysis of the first 100 videos was performed after the term 'breast reconstruction' was input into the search window of the video sharing website www.youtube.com on the 1st of September 2011. Qualitative categorical analysis included patient, oncological and reconstruction factors. It was concluded that although videos uploaded onto YouTube do not provide comprehensive information, the site is a useful resource that can be utilised in patient education provided comprehensive and validated videos are made available.

  9. Evidence-based health information from the users’ perspective – a qualitative analysis

    Science.gov (United States)

    2013-01-01

    Background Evidence-based information is a precondition for informed decision-making and participation in health. There are several recommendations and definitions available on the generation and assessment of so-called evidence-based health information for patients and consumers (EBHI). They stress the importance of objectively informing people about benefits and harms and any uncertainties in health-related procedures. There are also studies on the comprehensibility, relevance and user-friendliness of these informational materials. But to date there has been little research on the perceptions and cognitive reactions of users or lay people towards EBHI. The aim of our study is to define the spectrum of consumers' reaction patterns to written EBHI in order to gain a deeper understanding of their comprehension and assumptions, as well as their informational needs and expectations. Methods This study is based on an external user evaluation of EBHI produced by the German Institute for Quality and Efficiency in Health Care (IQWiG), commissioned by the IQWiG. The EBHI were examined within guided group discussions, carried out with lay people. The test readers' first impressions and their appraisal of the informational content, presentation, structure, comprehensibility and effect were gathered. Then a qualitative text analysis of 25 discussion transcripts involving 94 test readers was performed. Results Based on the qualitative text analysis a framework for reaction patterns was developed, comprising eight main categories: (i) interest, (ii) satisfaction, (iii) reassurance and trust, (iv) activation, (v) disinterest, (vi) dissatisfaction and disappointment, (vii) anxiety and worry, (viii) doubt. Conclusions Many lay people are unfamiliar with core characteristics of this special information type. Two particularly critical issues are the description of insufficient evidence and the attendant absence of clear-cut recommendations. Further research is needed to examine

  10. An analysis of water data systems to inform the Open Water Data Initiative

    Science.gov (United States)

    Blodgett, David L.; Read, Emily Kara; Lucido, Jessica M.; Slawecki, Tad; Young, Dwane

    2016-01-01

    Improving access to data and fostering open exchange of water information is foundational to solving water resources issues. In this vein, the Department of the Interior's Assistant Secretary for Water and Science put forward the charge to undertake an Open Water Data Initiative (OWDI) that would prioritize and accelerate work toward better water data infrastructure. The goal of the OWDI is to build out the Open Water Web (OWW). We therefore considered the OWW in terms of four conceptual functions: water data cataloging, water data as a service, enriching water data, and community for water data. To describe the current state of the OWW and identify areas needing improvement, we conducted an analysis of existing systems using a standard model for describing distributed systems and their business requirements. Our analysis considered three OWDI-focused use cases—flooding, drought, and contaminant transport—and then examined the landscape of other existing applications that support the Open Water Web. The analysis, which includes a discussion of observed successful practices of cataloging, serving, enriching, and building community around water resources data, demonstrates that we have made significant progress toward the needed infrastructure, although challenges remain. The further development of the OWW can be greatly informed by the interpretation and findings of our analysis.

  11. Structure analysis of the Polish academic information society using MDS method

    Science.gov (United States)

    Kaliczynska, Malgorzata

    2006-03-01

    The article presents the methodology of webometrics research and analysis, aiming at determining similar features of objects belonging to the Polish academic information society, which uses the Internet and its www resources for communication purposes. In particular, the analysis applies to selected Polish technical universities. The research was carried out in several phases - on different data groups - with regard to changes in the Internet space over time. The results have been presented in the form of two- and three-dimensional topography maps. For the purposes of this analysis, computer methods of multidimensional scaling were used. The research will be continued for a selected group of objects over a longer time frame. Its next stage will be research on more diversified objects, also in a multinational context.
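    Multidimensional scaling of this kind reduces a matrix of pairwise dissimilarities between websites to a low-dimensional map; a minimal scikit-learn sketch, assuming a precomputed dissimilarity matrix between university web domains (the data here are random stand-ins):

    ```python
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(1)
    # Stand-in dissimilarity matrix between, say, 10 university websites
    # (e.g., derived from co-link or co-word counts); symmetric, zero diagonal.
    raw = rng.random((10, 10))
    dissim = (raw + raw.T) / 2
    np.fill_diagonal(dissim, 0.0)

    # Embed into 2-D for a topography-style map of the objects.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissim)
    print(coords[:3])   # 2-D positions of the first three sites
    ```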

  12. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way, and so that they facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. This technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS, and complementing business process diagramming techniques with the specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows systematically deriving a specification of the IS memory (e.g. a UML class diagram), and (ii) it allows reasoning about the user interface design using abstract patterns. Thi...

  13. FACTORS OF INFLUENCE ON THE ENTREPRENEURIAL INTEREST: AN ANALYSIS WITH STUDENTS OF INFORMATION TECHNOLOGY RELATED COURSES

    Directory of Open Access Journals (Sweden)

    Diego Guilherme Bonfim

    2009-10-01

    Full Text Available The purpose of the research was to analyze the entrepreneurial interest of students in information technology related courses. A literature review was performed, from which four hypotheses were announced, affirming that student interest in entrepreneurial activity is influenced by (1) the perceived vocation of the area, (2) the ownership of a company, (3) the perceived social support from friends and family, and (4) entrepreneurial skills mastery. A field study was developed, with data collected from 171 students of higher education institutions in Fortaleza. The data were analyzed by using statistical techniques of descriptive analysis, analysis of variance, and multiple regression analysis. It was found that: (1) students, in general, have a moderate predisposition to engage in entrepreneurial activities; and (2) entrepreneurial interest is influenced by the perceived entrepreneurial vocation of the area, the social support, and the perceived strategic entrepreneurial skills mastery.

  14. Extending hierarchical task analysis to identify cognitive demands and information design requirements.

    Science.gov (United States)

    Phipps, Denham L; Meakin, George H; Beatty, Paul C W

    2011-07-01

    While hierarchical task analysis (HTA) is well established as a general task analysis method, there appears to be a need to make more explicit both the cognitive elements of a task and the design requirements that arise from an analysis. One way of achieving this is to make use of extensions to the standard HTA. The aim of the current study is to evaluate the use of two such extensions--the sub-goal template (SGT) and the skills-rules-knowledge (SRK) framework--to analyse the cognitive activity that takes place during the planning and delivery of anaesthesia. In quantitative terms, the two methods were found to have relatively poor inter-rater reliability; however, qualitative evidence suggests that the two methods were nevertheless of value in generating insights about anaesthetists' information handling and cognitive performance. Implications for the use of an extended HTA to analyse work systems are discussed.

  15. An information-theoretic analysis of return maximization in reinforcement learning.

    Science.gov (United States)

    Iwata, Kazunori

    2011-12-01

    We present a general analysis of return maximization in reinforcement learning. This analysis does not require assumptions of Markovianity, stationarity, and ergodicity for the stochastic sequential decision processes of reinforcement learning. Instead, our analysis assumes the asymptotic equipartition property fundamental to information theory, providing a substantially different view from that in the literature. As our main results, we show that return maximization is achieved by the overlap of typical and best sequence sets, and we present a class of stochastic sequential decision processes with the necessary condition for return maximization. We also describe several examples of best sequences in terms of return maximization in the class of stochastic sequential decision processes, which satisfy the necessary condition.

  16. The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study

    Directory of Open Access Journals (Sweden)

    Sooki

    2016-02-01

    Full Text Available Context Family, especially the mother, has the most important role in the education, transfer of information, and health behaviors of girls so that they have a healthy transition through the critical stage of puberty, but there are different views in this regard. Objectives Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother's role in informing girls about puberty. Data Sources This meta-analysis was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases, searched with predetermined keywords and their MeSH equivalents. Study Selection Quantitative cross-sectional articles were extracted by two independent researchers, and finally 46 articles were selected based on the inclusion criteria. The STROBE checklist was used for evaluation of the studies. Data Extraction The percentage of mothers as the current and preferred source of information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random effects model), and the studies' heterogeneity was analyzed using the I2 index. Variance between studies was analyzed using tau squared (Tau2) and Review Manager 5 software. Results The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, other countries 66%). Conclusions According to the findings of this study, it is essential that health professionals and officials of the ministry of

  17. Combining Global and Local Information for Knowledge-Assisted Image Analysis and Classification

    Directory of Open Access Journals (Sweden)

    Mezaris V

    2007-01-01

    Full Text Available A learning approach to knowledge-assisted image analysis and classification is proposed that combines global and local information with explicitly defined knowledge in the form of an ontology. The ontology specifies the domain of interest, its subdomains, the concepts related to each subdomain, as well as contextual information. Support vector machines (SVMs) are employed in order to provide image classification to the ontology subdomains based on global image descriptions. In parallel, a segmentation algorithm is applied to segment the image into regions and SVMs are again employed, this time for performing an initial mapping between region low-level visual features and the concepts in the ontology. Then, a decision function, that receives as input the computed region-concept associations together with contextual information in the form of concept frequency of appearance, realizes image classification based on local information. A fusion mechanism subsequently combines the intermediate classification results, provided by the local- and global-level information processing, to decide on the final image classification. Once the image subdomain is selected, final region-concept association is performed using again SVMs and a genetic algorithm (GA) for optimizing the mapping between the image regions and the selected subdomain concepts, taking into account contextual information in the form of spatial relations. Application of the proposed approach to images of the selected domain results in their classification (i.e., their assignment to one of the defined subdomains) and the generation of a fine-granularity semantic representation of them (i.e., a segmentation map with semantic concepts attached to each segment). Experiments with images from the personal collection domain, as well as comparative evaluation with other approaches of the literature, demonstrate the performance of the proposed approach.
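    A compressed sketch of the global/local split described above, using scikit-learn SVMs on synthetic features; the probability-product fusion and the concept-to-subdomain matrix are simplified stand-ins for the paper's decision function and ontology context, not its actual pipeline.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)

    # Synthetic stand-ins: global image descriptors and region-level features.
    n_img, n_sub = 120, 3
    X_global = rng.normal(size=(n_img, 16))       # one descriptor per image
    y_sub = rng.integers(0, n_sub, size=n_img)    # subdomain labels
    svm_global = SVC(probability=True).fit(X_global, y_sub)

    X_region = rng.normal(size=(600, 8))          # region low-level features
    y_concept = rng.integers(0, 5, size=600)      # region concept labels
    svm_local = SVC(probability=True).fit(X_region, y_concept)

    def classify_image(g_desc, region_feats, concept_to_sub):
        """Fuse global and local evidence into a subdomain decision.

        concept_to_sub: (n_concepts, n_sub) matrix linking concepts to
        subdomains (stand-in for concept frequency-of-appearance context).
        """
        p_global = svm_global.predict_proba(g_desc[None, :])[0]
        p_concepts = svm_local.predict_proba(region_feats).mean(axis=0)
        p_local = p_concepts @ concept_to_sub     # local subdomain evidence
        fused = p_global * p_local                # simple product fusion
        return int(np.argmax(fused))

    concept_to_sub = rng.dirichlet(np.ones(n_sub), size=5)
    print(classify_image(X_global[0], rng.normal(size=(4, 8)), concept_to_sub))
    ```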

  18. Constructing osteoarthritis through discourse – a qualitative analysis of six patient information leaflets on osteoarthritis

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2007-04-01

    Full Text Available Abstract Background Health service policy in the United Kingdom emphasises the importance of self-care by patients with chronic conditions. Written information for patients about their condition is seen as an important aid to help patients look after themselves. From a discourse analysis perspective, written texts such as patient information leaflets do not simply describe the reality of a medical condition and its management but, by drawing on some sorts of knowledge and evidence rather than others, help construct the reality of that condition. This study explored patient information leaflets on osteoarthritis (OA) to see how OA was constructed and to consider the implications for self-care. Methods Systematic and repeated readings of six patient information leaflets on osteoarthritis to look for similarities and differences across leaflets, contradictions within leaflets and the resources called on to make claims about the nature of OA and its management. Results Biomedical discourse of OA as a joint disease dominated. Only one leaflet included an illness discourse, albeit limited, and was also the only one to feature patient experiences of living with OA. The leaflets had different views on the causes of OA, including the role of lifestyle and ageing. Most emphasised patient responsibility for preventing the progression of OA. Advice about changing behaviour such as diet and exercise was not grounded in lived experience. There were inconsistent messages about using painkillers, exercise and the need to involve professionals when making changes to lifestyle. Conclusion The nature of the discourse impacted on how OA and the respective roles of patients and professionals were depicted. Limited discourse on illness meant that the complexity of living with OA and its consequences was underestimated. Written information needs to shift from joint biology to helping patients live with osteoarthritis. Written information should incorporate patient

  19. Information problem solving by experts and novices: Analysis of a complex cognitive skill.

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Vermetten, Yvonne

    2007-01-01

    In (higher) education students are often faced with information problems: tasks or assignments that require them to identify information needs, locate corresponding information sources, extract and organize relevant information from each source, and synthesize information from a variety of sources.

  20. Application of evidence theory in information fusion of multiple sources in bayesian analysis

    Institute of Scientific and Technical Information of China (English)

    周忠宝; 蒋平; 武小悦

    2004-01-01

    How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information often comes from different sources, and the form of the prior distribution can be known in a certain way while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. By taking the prior distributions which result from the information of multiple sources and converting them into corresponding mass functions, which can be combined by the Dempster-Shafer (D-S) method, we get the combined mass function and the representative points of the prior distribution. These points are used to fit the given distribution form to determine the parameters of the prior distribution. The fused prior distribution is then obtained, and Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to get the representative points of the fused prior distribution are the central questions we address in this paper. A simulation example shows that the proposed method is effective.
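    Dempster's rule of combination, the core operation mentioned above, in a minimal form for mass functions over a common frame of discernment (focal elements as frozensets; the two example masses over discretized parameter bins are illustrative):

    ```python
    from itertools import product

    def combine(m1, m2):
        """Dempster's rule: combine two mass functions given as
        {frozenset: mass} dicts over the same frame of discernment."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb        # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: sources are incompatible")
        return {s: w / (1.0 - conflict) for s, w in combined.items()}

    # Two sources' beliefs about a parameter lying in discretized bins A, B, C.
    m1 = {frozenset("A"): 0.6, frozenset("AB"): 0.3, frozenset("ABC"): 0.1}
    m2 = {frozenset("A"): 0.5, frozenset("BC"): 0.4, frozenset("ABC"): 0.1}
    print(combine(m1, m2))
    ```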

  1. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    Directory of Open Access Journals (Sweden)

    Yan Zhou

    2013-01-01

    Full Text Available We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined by using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment of analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and the existing methods. Results indicated that the proposed method is effective at increasing the robust predictive power of the traditional CLS model against component information loss, and its predictive power is comparable to that of PLS or PCR.
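    The gist of CLS augmentation can be sketched in numpy under the usual bilinear model S = C K (spectra = concentrations times pure-component spectra); the surrogate column below stands in for the low-correlation signals the authors select, so this is a generic illustration rather than their exact procedure.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic calibration data: 20 samples, 2 known analytes, 100 wavenumbers,
    # plus one unmodelled component (the "information loss").
    K_true = rng.random((3, 100))          # rows: pure-component spectra
    C_known = rng.random((20, 2))          # known analyte concentrations
    c_hidden = rng.random((20, 1))         # unknown component concentrations
    S = np.hstack([C_known, c_hidden]) @ K_true + 0.01 * rng.normal(size=(20, 100))

    # Plain CLS ignores the hidden component: K_hat = pinv(C) S.
    K_cls = np.linalg.pinv(C_known) @ S

    # ACLS: augment the concentration matrix with a surrogate column (here one
    # selected spectral signal, standing in for the authors' selection step).
    surrogate = S[:, [7]]
    C_aug = np.hstack([C_known, surrogate])
    K_acls = np.linalg.pinv(C_aug) @ S

    # Prediction for a new spectrum: least-squares concentrations against K.
    s_new = S[0]
    c_hat = np.linalg.lstsq(K_acls.T, s_new, rcond=None)[0]
    print("predicted (known analytes):", c_hat[:2], "true:", C_known[0])
    ```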

  2. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    Science.gov (United States)

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David

    2017-02-08

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including Twitter data, maritime search and rescue events, and syndromic surveillance.
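    One way to make the gravity-model idea concrete is a toy sketch under assumed forms (not the authors' implementation): estimate event densities at two consecutive time windows with a KDE, then let each grid cell's flow vector be pulled toward density gain, with attraction scaled like gravity by gain and inverse squared distance.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(4)

    # Geo-referenced events (lon, lat) in two consecutive time windows;
    # the second window's events drift north-east. No trajectories anywhere.
    pts_t0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(300, 2))
    pts_t1 = rng.normal(loc=[0.3, 0.2], scale=0.5, size=(300, 2))

    # Continuous density estimates over a grid.
    grid = np.stack(np.meshgrid(np.linspace(-2, 2, 25),
                                np.linspace(-2, 2, 25)), axis=-1).reshape(-1, 2)
    rho0 = gaussian_kde(pts_t0.T)(grid.T)
    rho1 = gaussian_kde(pts_t1.T)(grid.T)
    gain = rho1 - rho0                     # where mass appears / disappears

    def flow_at(i):
        """Gravity-style flow vector at cell i: attraction toward density gain."""
        d = grid - grid[i]
        r2 = np.sum(d ** 2, axis=1)
        r2[i] = np.inf                     # no self-attraction
        return np.sum(gain[:, None] * d / r2[:, None], axis=0)

    vectors = np.array([flow_at(i) for i in range(len(grid))])
    print(vectors.mean(axis=0))            # net drift, roughly toward (+x, +y)
    ```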

  3. Comparative Analysis of Postmodern Design for Information Technology in Education in Relation to Modernism

    Directory of Open Access Journals (Sweden)

    Saeid Zarghami Hamrah

    2012-01-01

    Full Text Available Problem statement: The purpose of the present study is a comparative analysis of the philosophical bases of postmodernism in relation to modernism, suggesting the necessities of each base for designing information technology in education. Approach: The research method for the present study was comparative analysis. Results: The first base was the rejection of an objective view of the universe and the acceptance of the "pre-objective universe". In this regard, it was suggested that information technology should be considered in relation to, and as a component of, life. The second base was doing away with totality. The necessity arising from this base was the rejection of universal approaches and designing for specific situations. The third base was uncertainty. Regarding this base, it was suggested that educational software provide a text in which the learner confronts subjects for questioning and interpreting. The fourth base was focusing on the complexities of phenomena. On this basis, it was especially necessary for the design to be integrational. Conclusion/Recommendations: It seems that the postmodern view has been able to provide the possibility of recreating information technology in education by going beyond the basic assumptions of modernism. Finally, in order to escape a metanarrative view of postmodern ideas, we cannot regard the solutions recommended by postmodernists as the definite, final and general solution for educational issues of present and past times, but we can look to them for further illumination of the state of technological education at the present time.

  4. #FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.

    Science.gov (United States)

    Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher

    2014-12-01

    We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented on, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge is rooted in data analysts' capability of discerning the anomalous information behaviors, such as the spreading of rumors or misinformation, from the rest that are more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model.

  5. Parametric sensitivity analysis for biochemical reaction networks based on pathwise information theory

    Science.gov (United States)

    2013-01-01

    Background: Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches, such as variants of the finite difference method, can have an overwhelming computational cost in models with a high-dimensional parameter space. Results: We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on information theory methods and relies on the quantification of information loss due to parameter perturbations between time-series distributions. For this reason, we need to work on path space, i.e., the set consisting of all stochastic trajectories; hence the proposed approach is referred to as “pathwise”. The pathwise sensitivity analysis method is realized by employing the rigorously derived relative entropy rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. Conclusions: As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, the knowledge of the structure of the
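
    A minimal sketch of the pathwise idea described in the abstract: the relative entropy rate between a nominal and a perturbed parameterization is accumulated as an ergodic average along a single SSA trajectory. The two-reaction birth-death network and all parameter values below are illustrative assumptions, not the paper's benchmarks:

```python
# Sketch: pathwise relative entropy rate (RER) between a nominal and a
# perturbed parameterization of a toy birth-death network, estimated as an
# ergodic time average along one Gillespie (SSA) trajectory.
import numpy as np

def propensities(x, k_prod, k_deg):
    return np.array([k_prod, k_deg * x])      # [birth, death]

def rer_estimate(theta, theta_pert, T=2000.0, x0=10, seed=1):
    rng = np.random.default_rng(seed)
    x, t, acc = x0, 0.0, 0.0
    while t < T:
        a = propensities(x, *theta)           # nominal propensities
        b = propensities(x, *theta_pert)      # perturbed propensities
        a0 = a.sum()
        dt = rng.exponential(1.0 / a0)        # SSA holding time
        # RER integrand: sum_j [ a_j log(a_j/b_j) - a_j + b_j ]  (>= 0)
        with np.errstate(divide="ignore", invalid="ignore"):
            term = np.where(a > 0, a * np.log(a / b), 0.0) - a + b
        acc += term.sum() * dt
        j = rng.choice(len(a), p=a / a0)      # fire the next reaction
        x += 1 if j == 0 else -1
        t += dt
    return acc / t

theta = (10.0, 1.0)                           # (k_prod, k_deg), illustrative
print("RER for a +5% perturbation of k_deg:", rer_estimate(theta, (10.0, 1.05)))
```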

  6. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can access many kinds of medical services online via Telecare Medical Information Systems (TMIS), owing to the fast development of computer technology. Security of communication over the network between users and the server is therefore critical. Authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we have shown that Jiang et al.'s scheme suffers from ID uselessness and is vulnerable to off-line password guessing and user impersonation attacks if an attacker compromises the legal user's smart card. Also, it cannot resist DoS attacks in two cases: after a successful impersonation attack, and upon wrong password input in the password-change phase. We then propose an improved mutual authentication scheme for a telecare medical information system. Remote monitoring, checking of patients' past medical history records and medical consultation can be provided in the system, where information is transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.

  7. Analysis of the Characteristics of Extension Workers in Relation to Utilization of Information and Communication Technology

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-08-01

    Science and technology are developing rapidly with the demands of changing times. The development of information and communication technology, especially since the advent of Internet technology, has led to major changes in society. Information technology products are relatively cheap, and their affordability facilitates access to information beyond national borders and cultural boundaries. This condition has penetrated all levels of human life, including farmers in the villages. Therefore, extension workers take on an important role as facilitators in developing the potential of farmers, and consequently they are required to adjust to the changes and demands of the growing community. The objective of the research is the analysis of the characteristics of extension workers in relation to their utilization of information and communication technology in Limapuluh Kota regency, West Sumatera. This study is a descriptive-correlational survey-based study, with a sample consisting of government-employed as well as freelance extension workers in 8 Agencies of Agriculture, Fisheries and Forestry Extension (BP3K) in Limapuluh Kota regency, West Sumatera province. Based on the results obtained, the difference test (t-test) shows that there are significant differences between the characteristics of the civil servants and the THL-TBPP workers, especially in the aspects of age and length of employment.
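
    The difference test reported above is a standard independent two-sample t-test; a minimal sketch with made-up scores for the two groups of extension workers (the real survey data are not reproduced):

```python
# Sketch: independent two-sample t-test comparing civil-servant and freelance
# (THL-TBPP) extension workers on one characteristic, e.g. age.
# The samples below are synthetic illustrative data, not the survey's.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
civil_servants = rng.normal(45, 8, 60)   # e.g. age in years
freelancers = rng.normal(38, 7, 55)

t, p = stats.ttest_ind(civil_servants, freelancers, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")       # p < 0.05 -> significant difference
```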

  8. AN ECONOMIC ANALYSIS OF THE DETERMINANTS OF ENTREPRENEURSHIP: THE CASE OF MASVINGO INFORMAL BUSINESSES

    Directory of Open Access Journals (Sweden)

    Clainos Chidoko

    2013-03-01

    In the past decade, Zimbabwe has been hit by its worst economic performance since its independence in 1980. Capacity utilization shrank to ten percent and the unemployment rate was above eighty percent by 2008, as the private and public sectors witnessed massive retrenchments. As a result, many people are finding themselves engaging in informal businesses to make ends meet. However, not all people have joined the informal sector, as witnessed by the number of people who left the country in droves for neighbouring countries. It is against this background that this research conducted an economic analysis of the determinants of entrepreneurship in Masvingo urban, with an emphasis on informal businesses. The research targeted a sample of 100 informal businesses (30 from the Rujeko light industrial area, 40 from the Mucheke light industrial area and 30 from the Masvingo Central Business District). The businesses included, among others, flea market operators, furniture manufacturers, suppliers and producers of agricultural products, and food vendors. The research found that level of education, gender, age, marital status, number of dependants, type of subjects studied at secondary school and vocational training are the main determinants that influence the type of business an entrepreneur ventures into. The study recommends formal training for the participants, so that the businesses continue in existence, since they fill the gap left vacant by most formal enterprises.

  9. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation, e.g., it is unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure, called the topological information content of a graph, and others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, high uniqueness is an important and desirable property when designing novel topological descriptors that have the potential to be applied to large chemical databases.
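
    The classical partition-based measure named above can be sketched as the Shannon entropy of a vertex partition. The snippet below uses the degree partition as a simple stand-in for the automorphism-orbit partition used in the classical topological information content; the small graph is a toy example:

```python
# Sketch: a partition-based graph entropy in the spirit of the topological
# information content. Degrees stand in for automorphism-group orbits here.
import math
import networkx as nx

def partition_entropy(G):
    n = G.number_of_nodes()
    classes = {}
    for _, d in G.degree():            # partition vertices by degree
        classes[d] = classes.get(d, 0) + 1
    # Shannon entropy of the induced partition of the vertex set
    return -sum((k / n) * math.log2(k / n) for k in classes.values())

G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])  # a tiny "molecule"
print(f"partition entropy: {partition_entropy(G):.3f} bits")
```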

  10. Co-word analysis for non-scientific information: the example of Reuters Business Briefings

    Directory of Open Access Journals (Sweden)

    B Delecroix

    2006-01-01

    Co-word analysis is based on a sociological theory developed by the CSI and the SERPIA (Callon, Courtial, Turner, 1991) in the mid-eighties. This method, originally dedicated to scientific fields, measures the association strength between terms in documents to reveal and visualise the evolution of scientific fields through the construction of clusters and a strategic diagram. The method has since been successfully applied to investigate the structure of many scientific areas. Nowadays it appears in many software systems used by companies to improve their business and define their strategy, but its relevance to this kind of application has not yet been proved. Using the example of economic and marketing information on DSL technologies from Reuters Business Briefing, this presentation gives an interpretation of co-word analysis for this kind of information. After an overview of the software we used (Sampler) and an outline of the experimental protocol, we investigate and explain each step of the co-word analysis process: terminological extraction, computation of clusters, and the strategic diagram. In particular, we explain the meaning of each parameter of the method: the choice of variables and similarity measures is discussed. Finally, we try to give a global interpretation of the method in an economic context. Further studies will be added to this work in order to allow a generalisation of these results.
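
    The association-strength computation at the heart of co-word analysis is commonly the equivalence index E(i,j) = c_ij^2 / (c_i * c_j), where c_i is a term's frequency and c_ij a pair's co-occurrence frequency. A minimal sketch, with invented toy documents in place of the Reuters Business Briefing corpus:

```python
# Sketch: equivalence-index association strengths for co-word analysis.
# The toy "documents" (term sets) are invented for illustration.
from collections import Counter
from itertools import combinations

docs = [
    {"dsl", "broadband", "operator"},
    {"dsl", "broadband", "tariff"},
    {"dsl", "operator"},
    {"tariff", "operator"},
]

term_freq = Counter(t for d in docs for t in d)
pair_freq = Counter(p for d in docs for p in combinations(sorted(d), 2))

for (a, b), c_ab in pair_freq.most_common():
    e = c_ab**2 / (term_freq[a] * term_freq[b])   # E(a,b) in [0, 1]
    print(f"E({a}, {b}) = {e:.2f}")
```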

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
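
    The graph-analysis step in task 3 can be sketched as path enumeration from hazard sources to vulnerable entities over a directed component graph. The miniature model below is a made-up illustration, not an Orion-derived model:

```python
# Sketch: enumerating candidate hazard-propagation paths from sources to
# vulnerable entities in an extracted architecture model (toy graph).
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("thruster_valve", "prop_controller"),
    ("prop_controller", "flight_software"),
    ("sensor_bus", "flight_software"),
    ("flight_software", "crew_display"),
])

hazard_sources = ["thruster_valve", "sensor_bus"]
vulnerable = ["crew_display"]

for src in hazard_sources:
    for dst in vulnerable:
        for path in nx.all_simple_paths(G, src, dst):
            print(" -> ".join(path))   # a candidate scenario for testing
```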

  12. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  13. Research on analysis method for temperature control information of high arch dam construction

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Temperature control, which is directly responsible for project quality and progress, plays an important role in high arch dam construction. The key scientific problem is how to discover rules in the large amount of collected temperature control information in order to guide the adjustment of temperature control measures and prevent cracks on site. In this paper, a mathematical-logical model was first built by means of a coupling analysis of temperature control system decomposition and coordination for a high arch dam. Then, an analysis method for temperature control information was presented based on data mining technology. Furthermore, the data warehouse for temperature control was designed, and an artificial neural network forecasting model for the highest temperature of concrete was also developed. Finally, these methods were applied to a practical project. The results showed that the efficiency and precision of temperature control were improved, and the rationality and scientific soundness of management and decision-making were strengthened. These investigations provide an advanced analysis method for temperature control in the high arch dam construction process.
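
    A minimal sketch of the kind of neural-network forecasting model mentioned above, predicting the highest concrete temperature from a few hypothetical inputs; the features, data-generating rule and architecture are illustrative assumptions, not the paper's model:

```python
# Sketch: a small neural-network regressor forecasting the highest concrete
# temperature. Inputs (placement temperature, ambient temperature,
# cooling-water flow) and the synthetic target rule are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 400
X = np.column_stack([
    rng.uniform(10, 25, n),    # placement temperature (deg C)
    rng.uniform(5, 35, n),     # ambient temperature (deg C)
    rng.uniform(0.5, 2.0, n),  # cooling-water flow (m^3/h)
])
# synthetic target: hydration heat raises the peak above placement temperature
y = X[:, 0] + 0.4 * X[:, 1] - 6.0 * X[:, 2] + 20 + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0)).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```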

  14. Database Semantic Interoperability based on Information Flow Theory and Formal Concept Analysis

    Directory of Open Access Journals (Sweden)

    Guanghui Yang

    2012-07-01

    As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration have been a long-standing challenge for the database community and have now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA) and Information Flow (IF) theories. For our purposes, we first discover knowledge from different databases by using FCA, and then align what is discovered by using IF and FCA. The development of FCA has led to software systems such as TOSCANA and TUPLEWARE, which can be used as tools for discovering knowledge in databases. A prototype based on IF and FCA has been developed, and our method is tested and verified by using this prototype and TUPLEWARE.
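
    The FCA building block used for knowledge discovery can be sketched by naively enumerating the formal concepts (closed extent-intent pairs) of a small binary object-attribute context; the database-flavored context below is a toy example:

```python
# Sketch: naive enumeration of the formal concepts of a tiny object-attribute
# context. Real FCA tools use faster algorithms; this shows the definitions.
from itertools import combinations

objects = {
    "patient_tbl": {"has_id", "has_name"},
    "doctor_tbl": {"has_id", "has_name", "has_dept"},
    "visit_tbl": {"has_id", "has_dept"},
}
attributes = set().union(*objects.values())

def extent(intent_set):              # objects sharing all given attributes
    return {o for o, attrs in objects.items() if intent_set <= attrs}

def intent(extent_set):              # attributes shared by all given objects
    if not extent_set:
        return set(attributes)
    return set.intersection(*(objects[o] for o in extent_set))

concepts = set()
for r in range(len(attributes) + 1):
    for subset in combinations(sorted(attributes), r):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))  # closure

for e, i in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(e), "<->", sorted(i))
```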

  15. Parametric Sensitivity Analysis for Stochastic Molecular Systems using Information Theoretic Metrics

    CERN Document Server

    Tsourtis, Anastasios; Katsoulakis, Markos A; Harmandaris, Vagelis

    2014-01-01

    In this paper we extend the parametric sensitivity analysis (SA) methodology proposed in Ref. [Y. Pantazis and M. A. Katsoulakis, J. Chem. Phys. 138, 054115 (2013)] to continuous-time and continuous-space Markov processes represented by stochastic differential equations and, particularly, stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation, both in equilibrium and in non-equilibrium steady-state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an al...

  16. Quantifying information transfer by protein domains: Analysis of the Fyn SH2 domain structure

    DEFF Research Database (Denmark)

    Lenaerts, Tom; Ferkinghoff-Borg, Jesper; Stricher, Francois

    2008-01-01

    distal communication is achieved. We illustrate the approach by analyzing information transfer by the SH2 domain of Fyn tyrosine kinase, obtained from Monte Carlo dynamics simulations. Our analysis reveals that the Fyn SH2 domain forms a noisy communication channel that couples residues located...... in the phosphopeptide and specificity binding sites and a number of residues at the other side of the domain near the linkers that connect the SH2 domain to the SH3 and kinase domains. We find that for this particular domain, communication is affected by a series of contiguous residues that connect distal sites...... by crossing the core of the SH2 domain. Conclusion: As a result, our method provides a means to directly map the exchange of biological information on the structure of protein domains, making it clear how binding triggers conformational changes in the protein structure. As such it provides a structural road...

  17. ERISTAR: Earth Resources Information Storage, Transformation, Analysis, and Retrieval administrative report

    Science.gov (United States)

    Vachon, R. I.; Obrien, J. F., Jr.; Lueg, R. E.; Cox, J. E.

    1972-01-01

    This report discusses the 1972 Systems Engineering program at Marshall Space Flight Center, in which 15 participants representing 15 U.S. universities, 1 NASA/MSFC employee, and another specially assigned faculty member took part in an 11-week program. The Fellows became acquainted with the philosophy of systems engineering and, as a training exercise, used this approach to produce a conceptual design for an Earth Resources Information Storage, Transformation, Analysis, and Retrieval System. The program was conducted in three phases; approximately 3 weeks were devoted to seminars, tours, and other presentations to expose the participants to technical and other aspects of the information management problem. The second phase, 5 weeks in length, consisted of evaluating alternative solutions to problems, effecting initial trade-offs and performing preliminary design studies and analyses. The last 3 weeks were occupied with final trade-off sessions, final design analyses and preparation of a final report and oral presentation.

  18. The Roles of Conference Papers in IS: An Analysis of the Scandinavian Conference on Information Systems

    DEFF Research Database (Denmark)

    Lanamäki, Arto; Persson, John Stouby

    2016-01-01

    Information Systems (IS) research has both a journal-oriented publication culture and a rich plethora of conferences. It is unclear why IS researchers even bother with conference publishing given the high focus on journals. Against this backdrop, the purpose of this paper is to increase our...... understanding of conference papers in IS and the role they play for the authoring researchers. We present the first analysis of the papers published during the first six years (2010-2015) in the Scandinavian Conference on Information Systems (SCIS). We conducted interviews with ten SCIS authors. Following...... a framework adopted from Åkerlind [1], we identified how SCIS papers have the roles of fulfilling requirements, establishing oneself, developing personally, enabling change, and other roles. This article contributes to the reflection literature on the IS field by applying a practice lens to understand...

  19. Plant-wide process monitoring based on mutual information-multiblock principal component analysis.

    Science.gov (United States)

    Jiang, Qingchao; Yan, Xuefeng

    2014-09-01

    Multiblock principal component analysis (MBPCA) methods are gaining increasing attention in monitoring plant-wide processes. Generally, MBPCA assumes that some process knowledge is incorporated for block division; however, process knowledge is not always available. A new, totally data-driven MBPCA method, which employs mutual information (MI) to divide the blocks automatically, has been proposed. By constructing sub-blocks using MI, the division not only considers linear correlations between variables, but also takes into account non-linear relations, thereby involving more statistical information. The PCA models in the sub-blocks reflect more local behaviors of the process, and the results in all blocks are combined together by support vector data description. The proposed method is implemented on a numerical process and the Tennessee Eastman process. Monitoring results demonstrate the feasibility and efficiency of the method.
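
    A minimal sketch of the data-driven block division idea, assuming a simple histogram estimator of mutual information, an illustrative MI threshold, and connected components as the grouping rule; the final combination by support vector data description is omitted:

```python
# Sketch: MI-based block division followed by per-block PCA. The threshold,
# the histogram MI estimator, and the synthetic data are illustrative.
import numpy as np
import networkx as nx
from sklearn.decomposition import PCA

def mutual_info(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(3)
n = 2000
v0 = rng.normal(size=n)
X = np.column_stack([v0,
                     v0**2 + 0.1 * rng.normal(size=n),  # nonlinearly tied to v0
                     rng.normal(size=n),
                     rng.normal(size=n)])

G = nx.Graph()
G.add_nodes_from(range(X.shape[1]))
for i in range(X.shape[1]):
    for j in range(i + 1, X.shape[1]):
        if mutual_info(X[:, i], X[:, j]) > 0.1:          # illustrative cutoff
            G.add_edge(i, j)

for block in nx.connected_components(G):                 # one PCA per block
    idx = sorted(block)
    pca = PCA(n_components=min(len(idx), 2)).fit(X[:, idx])
    print("block", idx, "explained var:",
          np.round(pca.explained_variance_ratio_, 2))
```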

  20. Multivoxel pattern analysis reveals 3D place information in the human hippocampus.

    Science.gov (United States)

    Kim, Misun; Jeffery, Kate J; Maguire, Eleanor A

    2017-03-20

    The spatial world is three-dimensional (3D), and humans and other animals move both horizontally and vertically within it. Extant neuroscientific studies have typically investigated spatial navigation on a horizontal two-dimensional plane, leaving much unknown about how 3D spatial information is represented in the brain. Specifically, horizontal and vertical information may be encoded in the same or different neural structures, with equal or unequal sensitivity. Here, we investigated these possibilities using functional MRI (fMRI) while participants were passively moved within a 3D lattice structure as if riding a rollercoaster. Multivoxel pattern analysis was used to test for the existence of information relating to where and in which direction participants were heading in this virtual environment. Behaviorally, participants had similarly accurate memory for vertical and horizontal locations, and the right anterior hippocampus expressed place information that was sensitive to changes along both horizontal and vertical axes. This is suggestive of isotropic 3D place encoding. By contrast, participants indicated their heading direction faster and more accurately when they were heading in a tilted-up or tilted-down direction. This direction information was expressed in the right retrosplenial cortex and posterior hippocampus, and was only sensitive to vertical pitch, which could reflect the importance of the vertical (gravity) axis as a reference frame. Overall, our findings extend previous knowledge of how we represent the spatial world and navigate within it by taking into account the important third dimension. SIGNIFICANCE STATEMENT: The spatial world is three-dimensional (3D) -- we can move horizontally across surfaces, but also vertically, going up slopes or stairs. Little is known about how the brain supports representations of 3D space. A key question is whether or not horizontal and vertical information is equally well represented. Here we measured functional MRI
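
    The core multivoxel pattern analysis step -- cross-validated decoding of place from voxel patterns -- can be sketched as follows; the voxel data are simulated, not the study's fMRI patterns:

```python
# Sketch: cross-validated decoding of place (e.g. vertical level in the
# lattice) from multi-voxel patterns. All data here are simulated.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(11)
n_trials, n_voxels, n_places = 120, 50, 4
labels = np.repeat(np.arange(n_places), n_trials // n_places)
signal = rng.normal(size=(n_places, n_voxels))   # place-specific pattern
X = signal[labels] + 1.5 * rng.normal(size=(n_trials, n_voxels))  # noisy trials

scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
print("decoding accuracy:", scores.mean().round(2), "(chance = 0.25)")
```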

  1. Informativeness of minisatellite and microsatellite markers for genetic analysis in papaya.

    Science.gov (United States)

    Oliveira, G A F; Dantas, J L L; Oliveira, E J

    2015-10-01

    The objective of this study was to evaluate the informativeness of minisatellite and microsatellite markers in papaya (Carica papaya L.). Forty minisatellites and 91 microsatellites were used for genotyping 24 papaya accessions, and estimates of genetic diversity, genetic linkage and analyses of population structure were compared. A lower average number of alleles per locus was observed in minisatellites (3.10) compared with microsatellites (3.57), although the minisatellites showed more rare alleles (18.54%) than the microsatellites (13.85%). Greater expected (He = 0.52) and observed (Ho = 0.16) heterozygosity was observed in the microsatellites compared with the minisatellites (He = 0.42 and Ho = 0.11), possibly due to the high number of hermaphroditic accessions, resulting in high rates of self-fertilization. The polymorphic information content and Shannon-Wiener diversity were also higher for microsatellites (0.47 and 1.10, respectively) than for minisatellites (0.38 and 0.85, respectively). The probability of paternity exclusion was high for both marker types (>0.999), and the combined probability of identity was 1.65 x 10^-13 and 4.33 x 10^-38 for mini- and microsatellites, respectively, which indicates that both types of markers are suitable for genetic analysis. The Bayesian analysis indicated the formation of two groups (K = 2) for both markers, although the minisatellites indicated a substructure (K = 4). A greater number of accessions with a low probability of assignment to specific groups was observed for microsatellites. Collectively, the results indicated higher informativeness of microsatellites. However, the lower informative power of minisatellites may be offset by the use of a larger number of loci. Furthermore, minisatellites are subject to less error in genotyping, because there is greater detection power in genotyping systems when larger motifs are used.
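
    Two of the locus-informativeness statistics compared above are directly computable from allele frequencies: expected heterozygosity He = 1 - sum(p_i^2) and the polymorphic information content (PIC) of Botstein et al. (1980). A minimal sketch with illustrative frequencies for one locus:

```python
# Sketch: He and PIC for one locus from allele frequencies (illustrative).
def expected_heterozygosity(p):
    return 1.0 - sum(f * f for f in p)

def pic(p):
    # Botstein et al. (1980): 1 - sum p_i^2 - sum_{i<j} 2 p_i^2 p_j^2
    s = 1.0 - sum(f * f for f in p)
    s -= sum(2 * p[i] ** 2 * p[j] ** 2
             for i in range(len(p)) for j in range(i + 1, len(p)))
    return s

freqs = [0.5, 0.3, 0.2]          # three alleles at a hypothetical locus
print(f"He  = {expected_heterozygosity(freqs):.3f}")
print(f"PIC = {pic(freqs):.3f}")
```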

  2. Information Analysis Methodology for Border Security Deployment Prioritization and Post Deployment Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Paul M.; Maple, Scott A.

    2010-06-08

    Due to international commerce, cross-border conflicts, and corruption, a holistic, information-driven approach to border security is required to best understand how resources should be applied to effect sustainable improvements in border security. The ability to transport goods and people by land, sea, and air across international borders with relative ease for legitimate commercial purposes creates a challenging environment for detecting the illicit smuggling activities that destabilize national-level border security. Smuggling operated for profit, or driven by cross-border conflicts in which militant or terrorist organizations facilitate the transport of materials or extremists to advance a cause, adds complexity to smuggling interdiction efforts. Border security efforts are further hampered when corruption thwarts interdiction or reduces the effectiveness of technology deployed to enhance border security. These issues necessitate a holistic approach to border security that leverages all available data. Large amounts of information found in hundreds of thousands of documents can be compiled to assess national or regional borders and to identify variables that influence border security. Location data associated with border topics of interest may be extracted and plotted to better characterize the current border security environment for a given country or region. This baseline assessment enables further analysis, and also documents the initial state of border security that can be used to evaluate progress after border security improvements are made. Border security threats are then prioritized via a systems analysis approach, and mitigation factors to address risks can be developed and evaluated against inhibiting factors such as corruption. This holistic approach to border security helps address the dynamic smuggling interdiction environment, in which illicit activities divert to new locations that provide less resistance.

  3. Applying information network analysis to fire-prone landscapes: implications for community resilience

    Directory of Open Access Journals (Sweden)

    Derric B. Jacobs

    2017-03-01

    Resilient communities promote trust, have well-developed networks, and can adapt to change. For rural communities in fire-prone landscapes, current resilience strategies may prove insufficient in light of increasing wildfire risks due to climate change. It is argued that, given the complexity of climate change, adaptations are best addressed at local levels, where specific social, cultural, political, and economic conditions are matched with local risks and opportunities. Despite the importance of social networks as key attributes of community resilience, research using social network analysis on coupled human and natural systems is scarce. Furthermore, the extent to which local communities in fire-prone areas understand climate change risks, accept the likelihood of potential changes, and have the capacity to develop collaborative mitigation strategies is underexamined, yet these factors are imperative to community resiliency. We apply a social network framework to examine the information networks that affect perceptions of wildfire and climate change in Central Oregon. Data were collected using a mailed questionnaire. Analysis focused on the residents' information networks that are used to gain awareness of governmental activities, and on measures of community social capital. A two-mode network analysis was used to uncover information exchanges. Results suggest that the general public develops perceptions about climate change based on complex social and cultural systems rather than as patrons of scientific inquiry and understanding. It appears that perceptions about climate change itself may not be the limiting factor in these communities' adaptive capacity, but rather how they perceive local risks. We provide a novel methodological approach to understanding rural community adaptation and resilience in fire-prone landscapes and offer a framework for future studies.
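
    The two-mode analysis step can be sketched with a bipartite resident-to-source network projected onto a one-mode network of information sources weighted by shared audience; the names and ties below are invented for illustration:

```python
# Sketch: two-mode (bipartite) network of residents and information sources,
# projected onto sources. All nodes and edges are invented toy data.
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
residents = ["r1", "r2", "r3", "r4"]
sources = ["county_office", "fire_district", "neighbors", "local_news"]
B.add_nodes_from(residents, bipartite=0)
B.add_nodes_from(sources, bipartite=1)
B.add_edges_from([("r1", "county_office"), ("r1", "neighbors"),
                  ("r2", "neighbors"), ("r2", "local_news"),
                  ("r3", "fire_district"), ("r3", "neighbors"),
                  ("r4", "local_news"), ("r4", "county_office")])

# edge weight = number of residents two sources have in common
P = bipartite.weighted_projected_graph(B, sources)
for u, v, d in P.edges(data=True):
    print(u, "--", v, "shared audience:", d["weight"])
```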

  4. Analysis and design of raptor codes for joint decoding using Information Content evolution

    CERN Document Server

    Venkiah, Auguste; Declercq, David

    2007-01-01

    In this paper, we present an analytical study of the convergence of raptor codes under joint decoding over the binary-input additive white Gaussian noise channel (BIAWGNC), and derive an optimization method. We use information content evolution under a Gaussian approximation, and focus on a new decoding scheme that proves to be more efficient: the joint decoding of the two code components of the raptor code. In our general model, the classical tandem decoding scheme appears as a subcase, and thus the design of LT codes is also possible.
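
    The information-content measure tracked in this kind of EXIT-style analysis is the mutual information J(sigma) between a uniform binary input and a consistent Gaussian LLR message (mean sigma^2/2, variance sigma^2). A minimal Monte Carlo sketch of J, not the paper's full evolution-and-optimization procedure:

```python
# Sketch: Monte Carlo estimate of the J-function used in information content
# evolution under the Gaussian approximation.
import numpy as np

def J(sigma, n=200_000, seed=5):
    rng = np.random.default_rng(seed)
    # consistent Gaussian LLR for the all-zero codeword assumption
    llr = rng.normal(loc=sigma**2 / 2, scale=sigma, size=n)
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))

for s in (0.5, 1.0, 2.0, 4.0):
    print(f"J({s}) = {J(s):.3f}")   # rises from ~0 toward 1 as sigma grows
```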

  5. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  6. Cluster analysis based on dimensional information with applications to feature selection and classification

    Science.gov (United States)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method is presented for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and the configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remotely sensed multispectral scanner data.
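
    The interval-selection heuristic itself is not specified in the abstract; a common stand-in is Sturges' rule, k = 1 + ceil(log2 n), sketched below on synthetic data:

```python
# Sketch: choosing the number of histogram intervals with Sturges' rule,
# a standard stand-in for the paper's own (unspecified) heuristic.
import math
import numpy as np

x = np.random.default_rng(2).normal(size=500)
k = 1 + math.ceil(math.log2(len(x)))        # Sturges' rule
hist, edges = np.histogram(x, bins=k)
print(f"n = {len(x)}, intervals = {k}, counts = {hist.tolist()}")
```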

  7. Automatic generation of stop word lists for information retrieval and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
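
    The described procedure maps directly to code: count each term's frequency and its frequency adjacent to known keywords, drop terms whose adjacency-to-frequency ratio falls below the cutoff, and truncate what remains. The toy corpus, keyword set and cutoff below are illustrative assumptions:

```python
# Sketch of the described stop-word generation: keep terms that frequently
# occur adjacent to known keywords (high adjacency/frequency ratio).
from collections import Counter

corpus = [
    "the keyword extraction of candidate keywords uses stop words",
    "stop words separate the candidate keywords in the text",
]
keywords = {"keyword", "keywords", "extraction", "candidate"}

term_freq, adj_freq = Counter(), Counter()
for doc in corpus:
    tokens = doc.split()
    for i, tok in enumerate(tokens):
        term_freq[tok] += 1
        neighbors = tokens[max(0, i - 1):i] + tokens[i + 1:i + 2]
        if any(n in keywords for n in neighbors):
            adj_freq[tok] += 1

cutoff, max_size = 0.5, 10                   # illustrative parameters
stop_words = [t for t in term_freq
              if t not in keywords
              and adj_freq[t] / term_freq[t] >= cutoff]
# truncate the resulting list by frequency rank
print(sorted(stop_words, key=lambda t: -term_freq[t])[:max_size])
```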

  8. Information Base of Financial Analysis of Educational Institutions of Higher Education

    Directory of Open Access Journals (Sweden)

    Alexander A. Galushkin

    2015-12-01

    In this article the author analyzes issues related to the formation of the information base for analysis of the financial condition of educational institutions of higher education. The author notes that the principles behind the financial (accounting) statements of non-governmental and governmental (budget-funded) institutions of higher professional education differ significantly. In conclusion, the author notes that when analyzing the financial condition of a group of higher professional education institutions, they can be classified into subgroups depending on the type (subtype) of funding and revenue.

  9. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  10. Analysis on health information extracted from an urban professional population in Beijing

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tie-mei; ZHANG Yan; LIU Bin; JIA Hong-bo; LIU Yun-jie; ZHU Ling; LUO Sen-lin; HAN Yi-wen; ZHANG Yan; YANG Shu-wen; LIU An-nan; MA Lan-jun; ZHAO Yan-yan

    2011-01-01

    Background: The assembled data from a population can provide information on health trends within the population. The aim of this research was to extract basic health information from an urban professional population in Beijing. Methods: Data analysis was carried out in a population who underwent a routine medical check-up and were aged >20 years, comprising 30,058 individuals. General information, data from physical examinations and blood samples were collected using the same method. Health status was separated into three groups by the criteria generated in this study: people with common chronic diseases, people in a sub-clinical situation, and healthy people. The proportions of common diseases suffered and the health-risk distributions of different age groups were also analyzed. Results: The proportions of people with common chronic diseases, in the sub-clinical group and in the healthy group were 28.6%, 67.8% and 3.6%, respectively. There were significant differences in health status among different age groups. Hypertension was at the top of the list of self-reported diseases. The proportion of chronic diseases increased significantly in people over 35 years of age, while the proportion of sub-clinical conditions decreased at the same rate. The complex risk factors for health in this population were metabolic disturbances (61.3%), risk for tumor (2.7%), abnormal results of morphological examination (8.2%) and abnormal results of laboratory tests of serum (27.8%). Conclusions: Health information can be extracted from the complex data sets produced by health check-ups of the general population. This information should be applied to support the prevention and control of chronic diseases, as well as to direct interventions for patients with risk factors for disease.

  11. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

    The dependency of today's construction professionals on single commercial applications for design creates the risk of their being dictated to by the language-tools they use. This unwitting conversion to the constraints of a particular computer application's style reduces one's association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability reflects the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process by binding several construction applications through a single repository platform, 'cloud computing', requires further analysis. The Delphi questionnaires described here analysed the information-exchange opportunities of Building Information Modelling (BIM) as a possible solution for the integration of applications on a cloud platform. The survey is structured to: (i) identify the most appropriate applications for advancing interoperability at the early design stage; (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint; (iii) examine the need for standards to address information exchange within the design team; and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  12. Science information to support Missouri River Scaphirhynchus albus (pallid sturgeon) effects analysis

    Science.gov (United States)

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-26

    The Missouri River Pallid Sturgeon Effects Analysis (EA) was commissioned by the U.S. Army Corps of Engineers to develop a foundation of understanding of how pallid sturgeon (Scaphirhynchus albus) population dynamics are linked to management actions in the Missouri River. The EA consists of several steps: (1) development of comprehensive, conceptual ecological models illustrating pallid sturgeon population dynamics and links to management actions and other drivers; (2) compilation and assessment of available scientific literature, databases, and models; (3) development of predictive, quantitative models to explore the system dynamics and population responses to management actions; and (4) analysis and assessment of effects of system operations and actions on species’ habitats and populations. This report addresses the second objective, compilation and assessment of relevant information.

  13. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services into their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations, by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.
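
    Invoking one of the system's operations through an OGC WPS might look like the request below; the endpoint, process identifier and input names are hypothetical placeholders, not the IDEE service's actual interface:

```python
# Sketch: a WPS 1.0.0 Execute request via key-value-pair parameters.
# Endpoint, identifier and datainputs are hypothetical placeholders.
import requests

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "land_cover_summary",                      # hypothetical
    "datainputs": "region=ES24;dataset=corine_land_cover",   # hypothetical
}
resp = requests.get("https://example.org/wps", params=params, timeout=30)
print(resp.status_code)   # the response body would be an XML Execute result
```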

  14. Organizational culture, creative behavior, and information and communication technology (ICT) usage: a facet analysis.

    Science.gov (United States)

    Carmeli, Abraham; Sternberg, Akiva; Elizur, D

    2008-04-01

    Despite the prominence of organizational culture (OC), this concept is controversial and its structure has yet to be systematically analyzed. This study develops a three-pronged formal definitional framework on the basis of facet theory (FT), exploring behavior modality, referent, and object. This facet analysis (FA) of OC successfully accounts for variation in both creative behavior at work and the usage of information and communication technologies (ICTs). An analysis of data collected from 230 employees in the financial industry indicates that a radex structure was obtained for both work and ICT: the behavior-modality facet ordered the space from center to periphery, and the referent facet related to the direction angles away from the origin.

  15. Beyond usage: understanding the use of electronic journals on the basis of information activity analysis

    Directory of Open Access Journals (Sweden)

    Annaïg Mahé

    2004-01-01

    In this article, which reports the second part of a two-part study of the use of electronic journals by researchers in two French research institutions, we attempt to explain the integration of electronic journal use into scientists' information habits, going beyond usage analysis. First, we describe how the development of electronic journal use follows a three-phase innovation process: research and development, first uses, and technical acculturation. Then, we attempt to find more significant explanatory factors, with emphasis placed on the wider context of information activity. Three main information activity types are outlined -- marginal, parallel, and integrated -- and each of these types corresponds to a particular attitude towards scientific information and to a different level of electronic journal use.

  16. Information Crisis

    CERN Document Server

    Losavio, Michael

    2012-01-01

    Information Crisis discusses the scope and types of information available online and teaches readers how to critically assess it and analyze potentially dangerous information, especially when teachers, editors, or other information gatekeepers are not available to assess the information for them. Chapters and topics include: the Internet as an information tool; critical analysis; legal issues, traps, and tricks; protecting personal safety and identity; and types of online information.

  17. The Wind ENergy Data and Information (WENDI) Gateway: New Information and Analysis Tools for Wind Energy Stakeholders

    Science.gov (United States)

    Kaiser, D.; Palanisamy, G.; Santhana Vannan, S.; Wei, Y.; Smith, T.; Starke, M.; Wibking, M.; Pan, Y.; Devarakonda, Ranjeet; Wilson, B. E.; Wind Energy Data and Information (WENDI) Gateway Team

    2010-12-01

    In support of the U.S. Department of Energy’s (DOE) Energy Efficiency and Renewable Energy (EERE) Office, DOE's Oak Ridge National Laboratory (ORNL) has launched the Wind ENergy Data & Information (WENDI) Gateway. The WENDI Gateway is intended to serve a broad range of wind-energy stakeholders by providing easy access to a large amount of wind energy-related data and information through its two main interfaces: the Wind Energy Metadata Clearinghouse and the Wind Energy Geographic Information System (WindGIS). The Metadata Clearinghouse is a powerful, customized search tool for discovering, accessing, and sharing wind energy-related data and information. Its database of metadata records points users to a diverse array of wind energy-related resources: from technical and scientific journal articles to mass media news stories; from annual government and industry reports to downloadable datasets, and much more. Through the WindGIS, users can simultaneously visualize a wide spectrum of United States wind energy-related spatial data, including wind energy power plant locations; wind resource maps; state-level installed wind capacity, generation, and renewable portfolio standards; electric transmission lines; transportation infrastructure; interconnection standards; land ownership, designation, and usage; and various ecological data layers. In addition, WindGIS allows users to download much of the data behind the layers. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Wilson, Bruce E., et al. "Mercury Toolset for Spatiotemporal Metadata." (2010).

  18. Mapping informative clusters in a hierarchical [corrected] framework of FMRI multivariate analysis.

    Directory of Open Access Journals (Sweden)

    Rui Xu

    Pattern recognition methods have become increasingly popular in fMRI data analysis; they are powerful in discriminating between multi-voxel patterns of brain activities associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate analysis that maps informative clusters rather than voxels, to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach was more robust for functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapping for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate analysis is suitable for both pattern classification and brain mapping in fMRI studies.

  19. Value of information analysis for groundwater quality monitoring network design. Case study: Eocene Aquifer, Palestine

    Science.gov (United States)

    Khader, A.; McKee, M.

    2010-12-01

    Value of information (VOI) analysis evaluates the benefit of collecting additional information to reduce or eliminate uncertainty in a specific decision-making context. It makes explicit any expected potential losses from errors in decision making due to uncertainty, and identifies the “best” information collection strategy as the one that leads to the greatest expected net benefit to the decision-maker. This study investigates the willingness to pay for groundwater quality monitoring in the Eocene Aquifer, Palestine, an unconfined aquifer located in the northern part of the West Bank that is used by 128,000 Palestinians to fulfill domestic and agricultural demands. The study takes into account the consequences of pollution and the options the decision-maker might face. Since nitrate is the major pollutant in the aquifer, the consequences of nitrate pollution were analyzed; these mainly consist of the possibility of methemoglobinemia (blue baby syndrome). The value of monitoring was compared to the costs of treating methemoglobinemia and to the costs of other options, such as water treatment, using bottled water, or importing water from outside the aquifer. Finally, an optimal monitoring network is designed that takes into account the uncertainties in recharge (climate), aquifer properties (hydraulic conductivity), and pollutant chemical reaction (decay factor), as well as the value of monitoring, by utilizing a sparse Bayesian modeling algorithm called a relevance vector machine.
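
    The decision-theoretic core of a VOI analysis can be sketched as an expected value of perfect information (EVPI) calculation over pollution states and response actions; the states, actions and losses below are hypothetical, not the study's numbers:

```python
# Sketch: expected value of perfect information (EVPI) for a monitoring
# decision. States, actions and losses are hypothetical illustrative values.
import numpy as np

states = ["nitrate_safe", "nitrate_high"]
prior = np.array([0.7, 0.3])
# loss[action, state]: rows = do_nothing, treat_water, import_water
loss = np.array([
    [0.0, 100.0],   # do nothing: costly if nitrate is actually high
    [20.0, 30.0],   # treat water
    [40.0, 40.0],   # import water
])

expected_loss = loss @ prior              # expected loss of each fixed action
best_without_info = expected_loss.min()
# with perfect information we pick the best action in each state
best_with_info = (loss.min(axis=0) * prior).sum()
evpi = best_without_info - best_with_info
print(f"EVPI = {evpi:.1f}  (an upper bound on what monitoring is worth)")
```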

  20. Information and Emotion in Advertising: A Content Analysis on the Internet in Brazil

    Directory of Open Access Journals (Sweden)

    Melby Karina Zuniga Huertas

    2012-02-01

    The significant increase in the number of users and transactions over the Internet has highlighted the importance of this channel, bringing new opportunities and challenges for advertising. However, the way advertisers use information and emotional appeals in their messages is still a subject little studied. Hence, the objectives of this paper are to: (a) analyze some general features of Internet advertising; and (b) explore the information content and emotional appeal of Internet advertising. We conducted an exploratory study through content analysis of a sample of 156 ads on sites visited by a convenience sample of Brazilian Internet users. The results showed that the advertising on the analyzed sites was for an assortment of products, but the products were concentrated in a few categories. The types of informational content most commonly used are "availability" and "components or content" of products. The advertisements also use only positive emotional appeals, which presents opportunities for campaign creation. This study contributes knowledge about the use of the Internet in marketing communications and suggests directions for future research.

  1. Personal information of adolescents on the Internet: A quantitative content analysis of MySpace.

    Science.gov (United States)

    Hinduja, Sameer; Patchin, Justin W

    2008-02-01

    Many youth have recently embraced online social networking sites such as MySpace (myspace.com) to meet their social and relational needs. While manifold benefits stem from participating in such web-based environments, the popular media have been quick to demonize MySpace, even though only a very small proportion of its users have been victimized due to irresponsible or naive usage of the technology it affords. Major concerns revolve around the possibility of sexual predators and pedophiles finding and then assaulting adolescents who carelessly or unwittingly reveal identifiable information on their personal profile pages. The current study sought to empirically ascertain the type of information youth are publicly posting through an extensive content analysis of randomly sampled MySpace profile pages. Among other findings, 8.8% revealed their full name, 57% included a picture, 27.8% listed their school, and 0.3% provided their telephone number. When considered in its proper context, these results indicate that the problem of personal information disclosure on MySpace may not be as widespread as many assume, and that the overwhelming majority of adolescents are using the web site responsibly. Implications for Internet safety among adolescents and future research regarding adolescent Internet use are discussed.

  2. Implementation Of The ISO/IEC 27005 In Risk Security Analysis Of Management Information System

    Directory of Open Access Journals (Sweden)

    Sri Ariyani

    2016-08-01

    This study explains the results of an analysis of the Information Security Management System (SMKI) at UPT SAMSAT Denpasar. The purpose of the analysis is to determine the level of the SMKI at UPT SAMSAT Denpasar, and the framework used in the analysis process is ISO/IEC 27005. The scope of the analysis covers the main tasks and functions of the Section of Motor Vehicle Tax (PKB) and Motor Vehicle Ownership Transfer Charge (BBNKB) and the service processes performed by the staff of that section, including tax determination, progressive tax data capture, the data involved, the supporting structure and infrastructure and, of course, the stakeholders involved in the process. The analysis was performed by implementing the ISO/IEC 27005 framework with reference to clauses 7 and 8. Clause 7 of ISO/IEC 27005 was applied to the organizational structure, the list of constraints that influence the organization, and the list of legislation and regulations applicable to the organization, whereas clause 8 covers asset identification, asset valuation and impact assessment. The analysis shows that the assets with the highest risk level include the primary assets (the coding selection process for tax determination, the determination of tax, the determination of progressive tax ownership status, the determination of progressive tax ownership order, and the repeat capture of progressive tax data) and the supporting assets (determination staff and progressive data capture staff). The assets with the highest threat level likewise include the primary assets (the coding selection process for tax determination, the determination of progressive tax ownership status, the determination of progressive tax ownership order, and the repeat capture of progressive tax data) and the supporting assets (determination staff and progressive data capture staff).

  3. Managing Returnable Containers Logistics - A Case Study Part I - Physical and Information Flow Analysis

    Directory of Open Access Journals (Sweden)

    Reza A. Maleki

    2011-05-01

    This case study paper is the result of a project conducted on behalf of a company, referred to here as Midwest Assembly and Manufacturing, or MAAN. The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its own facility as well as at supplier sites. The objective of this project was to develop a proposal to enable MAAN to track and manage its returnable containers more effectively. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach to the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.

  4. Isotopically non-stationary metabolic flux analysis: complex yet highly informative.

    Science.gov (United States)

    Wiechert, Wolfgang; Nöh, Katharina

    2013-12-01

    Metabolic flux analysis (MFA) using isotopic tracers aims at the experimental determination of in vivo reaction rates (fluxes). In recent years, the well-established 13C-MFA method based on metabolic and isotopic steady state was extended to INST-MFA (isotopically non-stationary MFA), which is performed in a transient labeling state. INST-MFA offers short-time experiments with a maximal information gain, and can moreover be applied to a wider range of growth conditions or organisms. Some of these conditions are not accessible by conventional methods. This comes at the price of significant methodological complexity involving high-frequency sampling and quenching, precise analysis of many samples and an extraordinary computational effort. This review gives a brief overview of basic principles, experimental workflows, and recent progress in this field. Special emphasis is laid on the trade-off between total effort and information gain, particularly on the suitability of INST-MFA for certain types of biological questions. In order to integrate INST-MFA as a viable method into the toolbox of MFA, some major challenges must be addressed in the coming years. These are discussed in the outlook.

  5. LBL Socio-Economic Environmental-Demographic Information System (SEEDIS). Chart: graphic analysis and display system

    Energy Technology Data Exchange (ETDEWEB)

    Sventek, V.A.

    1978-03-01

    The Chart Graphic Analysis and Display System was developed as part of Lawrence Berkeley Laboratory's Socio-Economic-Environmental-Demographic Information System (SEEDIS) to provide a tool with which users could draw graphs, print tables of numbers, do analysis, and perform basic statistical operations on the same set of data from a terminal in their own office. The Chart system's operation is completely independent of the type of data being entered and has been used for applications ranging from employment to energy data. Users frequently save the data they put into Chart and add to it on a regular basis, thereby creating personal databases which can be blended with information from the formal databases maintained at LBL. Using any computer system requires that the user learn a set of instructions, which at the outset often seems overwhelming. It is the purpose of this workbook to make this initial learning process less traumatic. The typical use of Chart is to enter a set of numbers, and then tell Chart what should be done with them. Chart commands take the form of gruff pidgin English. There are approximately 50 commands available. This workbook illustrates the basic commands. (RWR)

  6. Content Analysis of Papers Submitted to Communications in Information Literacy, 2007-2013

    Directory of Open Access Journals (Sweden)

    Christopher V. Hollister

    2014-07-01

    Full Text Available The author conducted a content analysis of papers submitted to the journal, Communications in Information Literacy, from the years 2007-2013. The purpose was to investigate and report on the overall quality characteristics of a statistically significant sample of papers submitted to a single-topic, open access, library and information science (LIS journal. Characteristics of manuscript submissions, authorship, reviewer evaluations, and editorial decisions were illuminated to provide context; particular emphasis was given to the analysis of major criticisms found in reviewer evaluations of rejected papers. Overall results were compared to previously published research. The findings suggest a trend in favor of collaborative authorship, and a possible trend toward a more practice-based literature. The findings also suggest a possible deterioration in some of the skills that are required of LIS authors relative to the preparation of scholarly papers. The author discusses potential implications for authors and the disciplinary literature, recommends directions for future research, and where possible, provides recommendations for the benefit of the greater community of LIS scholars.

  7. Allocating health care: cost-utility analysis, informed democratic decision making, or the veil of ignorance?

    Science.gov (United States)

    Goold, S D

    1996-01-01

    Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.

  8. Understanding and Managing Our Earth through Integrated Use and Analysis of Geo-Information

    Directory of Open Access Journals (Sweden)

    Wolfgang Kainz

    2011-09-01

    Full Text Available All things in our world are related to some location in space and time, and according to Tobler’s first law of geography “everything is related to everything else, but near things are more related than distant things” [1]. For as long as humans have existed, they have contemplated space and time and tried to depict and manage the geographic space they live in. We know graphic representations of the land from various regions of the world dating back several thousand years. The processing and analysis of spatial data has a long history in the disciplines that deal with spatial data, such as geography, surveying engineering, cartography, photogrammetry, and remote sensing. Until recently, all these activities were analog in nature; only since the invention of the computer in the second half of the 20th century, and the use of computers for the acquisition, storage, analysis, and display of spatial data starting in the 1960s, do we speak of geo-information and geo-information systems. [...]

  9. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Directory of Open Access Journals (Sweden)

    Nicholas Generous

    Full Text Available The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.

  10. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Science.gov (United States)

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
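
    As a concrete illustration of the Multi-Attribute Utility Theory approach the abstract describes, the following sketch ranks hypothetical data streams by a weighted additive utility. The criteria, weights, stream names, and scores are invented for illustration; the paper's actual attribute set is not reproduced here.

```python
# Minimal multi-attribute utility sketch for ranking biosurveillance data
# streams. All criteria, weights, and scores below are hypothetical.

CRITERIA_WEIGHTS = {           # weights sum to 1.0
    "timeliness": 0.30,
    "coverage": 0.25,
    "specificity": 0.25,
    "cost": 0.20,              # utility of cost: cheaper -> higher score
}

def utility(stream_scores: dict) -> float:
    """Weighted additive utility; all scores normalized to [0, 1]."""
    return sum(CRITERIA_WEIGHTS[c] * stream_scores[c] for c in CRITERIA_WEIGHTS)

streams = {
    "emergency_dept_visits": {"timeliness": 0.90, "coverage": 0.60, "specificity": 0.50, "cost": 0.70},
    "lab_confirmations":     {"timeliness": 0.30, "coverage": 0.50, "specificity": 0.95, "cost": 0.40},
    "web_search_trends":     {"timeliness": 0.95, "coverage": 0.80, "specificity": 0.20, "cost": 0.90},
}

# Rank streams by total utility, highest first.
for name, scores in sorted(streams.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name}: {utility(scores):.3f}")
```

    The additive form assumes the criteria are preferentially independent; a full MAUT application would also elicit the weights and single-attribute utility functions from domain experts rather than asserting them.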

  11. PathNet: a tool for pathway analysis using topological information

    Directory of Open Access Journals (Sweden)

    Dutta Bhaskar

    2012-09-01

    Full Text Available Abstract Background Identification of canonical pathways through enrichment of differentially expressed genes in a given pathway is a widely used method for interpreting gene lists generated from high-throughput experimental studies. However, most algorithms treat pathways as sets of genes, disregarding any inter- and intra-pathway connectivity information, and do not provide insights beyond identifying lists of pathways. Results We developed an algorithm (PathNet that utilizes the connectivity information in canonical pathway descriptions to help identify study-relevant pathways and characterize non-obvious dependencies and connections among pathways using gene expression data. PathNet considers both the differential expression of genes and their pathway neighbors to strengthen the evidence that a pathway is implicated in the biological conditions characterizing the experiment. As an adjunct to this analysis, PathNet uses the connectivity of the differentially expressed genes among all pathways to score pathway contextual associations and statistically identify biological relations among pathways. In this study, we used PathNet to identify biologically relevant results in two Alzheimer’s disease microarray datasets, and compared its performance with existing methods. Importantly, PathNet identified de-regulation of the ubiquitin-mediated proteolysis pathway as an important component in Alzheimer’s disease progression, despite the absence of this pathway in the standard enrichment analyses. Conclusions PathNet is a novel method for identifying enrichment and association between canonical pathways in the context of gene expression data. It takes into account topological information present in pathways to reveal biological information. PathNet is available as an R workspace image from http://www.bhsai.org/downloads/pathnet/.

  12. Petrophysical Analysis and Geographic Information System for San Juan Basin Tight Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Martha Cather; Robert Lee; Robert Balch; Tom Engler; Roger Ruan; Shaojie Ma

    2008-10-01

    The primary goal of this project is to increase the availability and ease of access to critical data on the Mesaverde and Dakota tight gas reservoirs of the San Juan Basin. Secondary goals include tuning well log interpretations through integration of core, water chemistry and production analysis data to help identify bypassed pay zones; increased knowledge of permeability ratios and how they affect well drainage and thus infill drilling plans; improved time-depth correlations through regional mapping of sonic logs; and improved understanding of the variability of formation waters within the basin through spatial analysis of water chemistry data. The project will collect, integrate, and analyze a variety of petrophysical and well data concerning the Mesaverde and Dakota reservoirs of the San Juan Basin, with particular emphasis on data available in the areas defined as tight gas areas for the purposes of FERC. A relational, geo-referenced database (a geographic information system, or GIS) will be created to archive this data. The information will be analyzed using neural networks, kriging, and other statistical interpolation/extrapolation techniques to fine-tune regional well log interpretations, improve pay zone recognition from old logs or cased-hole logs, determine permeability ratios, and also to analyze water chemistries and compatibilities within the study area. This single-phase project will be accomplished through four major tasks: Data Collection, Data Integration, Data Analysis, and User Interface Design. Data will be extracted from existing databases as well as paper records, then cleaned and integrated into a single GIS database. Once the data warehouse is built, several methods of data analysis will be used both to improve pay zone recognition in single wells, and to extrapolate a variety of petrophysical properties on a regional basis. A user interface will provide tools to make the data and results of the study accessible and useful. The final deliverable

  13. 企业ERP投资公告信息分析%Information Analysis on ERP Investment

    Institute of Scientific and Technical Information of China (English)

    徐扬; 程媛媛

    2015-01-01

    Investment announcements are a valuable public information resource, and extracting the tacit knowledge they contain and making it explicit is a key task in enterprise information analysis. Enterprise Resource Planning (ERP) systems are widely adopted by enterprises because of their standardization, process-oriented management approach, and high degree of integration; research on the value of ERP investments has drawn attention from academic scholars and enterprise managers alike and supports enterprise decision making. This paper analyzes information gathered from ERP investment announcements and measures ERP value through the reaction of the stock market, so as to make explicit the tacit factors that may influence ERP investment value. Building on earlier studies, the paper collects the ERP investment announcements of publicly traded American enterprises from 1997 to 2013, extracts, codes, and computes nine major variables, and, drawing on theories of the organizational integration of information systems and of option value, analyzes how announcement information affects enterprise market value in order to support investment decisions.
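
    The abstract describes measuring ERP value through stock-market reactions to announcements; the standard tool for this is an event study. Below is a hedged sketch of a market-model event study on simulated returns. The window lengths, the announcement date, and the coefficients are illustrative assumptions, not the paper's actual design.

```python
# Market-model event study sketch: abnormal return = actual return minus the
# return predicted from the market index. All data here are simulated.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0004, 0.01, 280)            # daily market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.008, 280)
event_day = 260                                   # ERP announcement (hypothetical)

est_m, est_s = market[:250], stock[:250]          # estimation window
beta, alpha = np.polyfit(est_m, est_s, 1)         # market-model fit

window = slice(event_day - 2, event_day + 3)      # [-2, +2] event window
abnormal = stock[window] - (alpha + beta * market[window])
car = abnormal.sum()                              # cumulative abnormal return
print(f"alpha={alpha:.5f}  beta={beta:.3f}  CAR[-2,+2]={car:.5f}")
```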

  14. Information seeking for making evidence-informed decisions: a social network analysis on the staff of a public health department in Canada

    Directory of Open Access Journals (Sweden)

    Yousefi-Nooraie Reza

    2012-05-01

    Full Text Available Abstract Background Social network analysis is an approach to studying the interactions and exchange of resources among people. It can help in understanding the underlying structural and behavioral complexities that influence the process of capacity building towards evidence-informed decision making. A social network analysis was conducted to understand if and how the staff of a public health department in Ontario turn to peers to get help incorporating research evidence into practice. Methods The staff were invited to respond to an online questionnaire inquiring about information-seeking behavior, identification of colleague expertise, and friendship status. Three networks were developed based on the 170 participants. The overall shape, key indices, the most central people and brokers, and their characteristics were identified. Results The network analysis showed a low-density and localized information-seeking network. Interpersonal connections were mainly clustered by organizational division, and people tended to limit information-seeking connections to a handful of peers in their division. However, the expertise-recognition and friendship networks showed more cross-divisional connections. Members of the office of the Medical Officer of Health were located at the heart of the department, bridging across divisions. A small group of professional consultants and middle managers were the most central staff in the network, also connecting their divisions to the center of the information-seeking network. In each division there were some locally central staff, mainly practitioners, who connected their neighboring peers but were not necessarily connected to other experts or managers. Conclusions The methods of social network analysis were useful in providing a systems approach to understanding how knowledge might flow in an organization. The findings of this study can be used to identify early adopters of knowledge translation interventions, forming
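
    A minimal sketch of the kind of analysis reported, using networkx: network density plus betweenness centrality to flag likely brokers. The edge list is fabricated for illustration; the study's survey data are not reproduced here.

```python
# Directed information-seeking network: an edge u -> v means
# "u turns to v for help incorporating research evidence".
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("nurse_a", "consultant_x"), ("nurse_b", "consultant_x"),
    ("analyst_c", "manager_y"), ("manager_y", "consultant_x"),
    ("consultant_x", "moh_office"), ("manager_y", "moh_office"),
    ("nurse_a", "nurse_b"),
])

print("density:", round(nx.density(G), 3))        # sparse = low density

# High betweenness = sits on many shortest paths = likely broker.
brokers = nx.betweenness_centrality(G)
for person, score in sorted(brokers.items(), key=lambda kv: -kv[1])[:3]:
    print(person, round(score, 3))
```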

  15. Legal analysis of information displayed on dental material packages: An exploratory research

    Directory of Open Access Journals (Sweden)

    Bhumika Rathore

    2016-01-01

    Full Text Available Introduction: Evidence suggests that some dental materials are associated with occupational hazards, preprocedural errors, and patient allergies. With due consideration to the safety of patients and dental professionals, it is essential that trade in these materials conforms with the law. Aim: To perform a legal analysis of the information displayed on the packaging of dental materials. Materials and Methods: The Bureau of Indian Standards sets guidelines for the packaging and marketing of dental products in India. An exploratory cross-sectional study was performed using various search engines and websites to access the existing laws and regulations pertaining to dental materials packaging. Based on the data obtained, a unique packaging standardization checklist was developed. Dental laboratory and impression plasters, alginates, and endodontic instruments were surveyed across all available brands; this study considered 16 brands of plasters and alginates and 42 brands of endodontic instruments. The legal analysis was performed using a direct-observation checklist, and descriptive statistics were obtained using SPSS version 19. Results: The guidelines set by the Bureau of Indian Standards exist but are not kept up to date and serve as little more than nominal safeguards of marketing standards. Overall compliance with the guidelines was 18.5% for brands of alginates, 4.1% for plaster of Paris, and 11.11% for endodontic instruments. The Wave One™ File showed the highest adherence to the guidelines, at 66.7%. Conclusion: This study found a low rate of adherence to the guidelines, indicating that insufficient information is being disclosed to consumers.

  16. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    IT Corporation Las Vegas

    1999-11-19

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a 52-million dollar group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary.

  17. An Information-Systems Program for the Language Sciences. Final Report on Survey-and-Analysis Stage, 1967-1968.

    Science.gov (United States)

    Freeman, Robert R.; And Others

    The main results of the survey-and-analysis stage include a substantial collection of preliminary data on the language-sciences information user community, its professional specialties and information channels, its indexing tools, and its terminologies. The prospects and techniques for the development of a modern, discipline-based information…

  18. Confirmatory Factor Analysis of IT-based Competency Questionnaire in Information Science & Knowledge Studies, Based on Job Market Analysis

    Directory of Open Access Journals (Sweden)

    Rahim Shahbazi

    2016-03-01

    Full Text Available The main purpose of the present research is to evaluate the validity of an IT-based competency questionnaire in Information Science & Knowledge Studies. The survey method was used, with a researcher-made questionnaire as the data collection tool. The statistical sample of 315 people was chosen purposively from among Iranian faculty members, Ph.D. students, and information center employees. The findings showed that, after eliminating 17 items from the questionnaire, confirmatory factor analysis of the remaining items with varimax rotation revealed 8 factors. The resulting components, and the items that loaded highly on them, were considerably consistent with the classifications in the questionnaire and partly consistent with the findings of other researchers. In all, 76 competency indicators (knowledge, skills, and attitudes) were validated and grouped under 8 main categories: 1. “Computer Basics”; 2. “Database Operation, Collection Development of Digital Resources, & Digital Library Management”; 3. “Basics of Computer Networking”; 4. “Basics of Programming & Database Design”; 5. “Web Design & Web Content Analysis”; 6. “Library Software & Computerized Organizing”; 7. “Archiving of Digital Resources”; and 8. “Attitudes”.
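
    For flavor only, the sketch below fits a factor model with varimax rotation to simulated Likert-style competency ratings (315 respondents, echoing the study's sample size). It is exploratory rather than confirmatory, and the data, item counts, and factor structure are all invented.

```python
# Factor analysis with varimax rotation on simulated questionnaire data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)
latent = rng.normal(size=(315, 2))               # two underlying skill factors
loadings = np.zeros((2, 20))
loadings[0, :10] = 0.8                           # items 0-9 load on factor 1
loadings[1, 10:] = 0.8                           # items 10-19 on factor 2
items = latent @ loadings + rng.normal(0, 0.5, (315, 20))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
print(np.round(fa.components_[:, :12], 2))       # recovered loading pattern
```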

  19. ANALYSIS OF TRAIN SHEET IN THE INFORMATION SYSTEM OF JSC «UKRZALIZNYTSIA»: PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    S. M. Ovcharenko

    2016-04-01

    Full Text Available Purpose. The train sheet analysis (TSA) system in the information system of JSC «Ukrzaliznytsia» covers passenger and suburban trains and has considerable potential; the prospects for the system's development therefore need to be established. Methodology. The assignment of responsible departments and delay causes should be carried out at every station and span where train delays took place. This requires recording deviations of the infrastructure from its normal condition, along with other adverse factors. In the freight transportation sector, analysis of the train schedule alone is insufficient, since it does not account for deviations from delivery terms; the delivery graphs must therefore also be analyzed. The basis for monitoring cargo delivery is the method of control time points (CTP) for the technological operations performed with cargo at railway stations. On the basis of the CTP, the quality of the transport process is assessed through cargo delivery schedule indicators: the performance level of the cargo delivery schedule and the coefficient of ahead-of-schedule/delayed delivery. Findings. The article proposes to develop the TSA system by having transportation service employees enter and display the causes of train delays on-line, by expanding the statistical databases, by processing the entered delay causes during the train sheet analysis of freight trains, and by assessing the quality of delivery schedule fulfillment. Given the appearance of new operator companies, it is also appropriate to amend instruction TSCHU-TSD-0002 regarding the list of departments to which delayed trains are attributed, by adding the department «The fault of operator companies» and the corresponding delay causes. Originality. The scheme of automated TSA in the information system of JSC «Ukrzaliznytsia» was improved. The author proposes to determine the cargo delivery quality on a certain polygon using the

  20. DHLAS: A web-based information system for statistical genetic analysis of HLA population data.

    Science.gov (United States)

    Thriskos, P; Zintzaras, E; Germenis, A

    2007-03-01

    DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using Java and the R system; it runs on a Java Virtual Machine, and its web-based user interface is powered by the servlet engine Tomcat. It utilizes Struts, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing, and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with arithmetic mean (UPGMA) and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimates), (v) haplotype frequencies (estimated using the expectation-maximization algorithm), and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database, so the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
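
    As an illustration of one of the checks DHLAS reports, here is a minimal asymptotic (chi-square) Hardy-Weinberg equilibrium test for a single biallelic locus. The genotype counts are made up, and DHLAS itself is implemented in Java/R rather than Python.

```python
# 1-df chi-square test of Hardy-Weinberg equilibrium at a biallelic locus.
from scipy.stats import chi2

def hwe_chi2(n_aa: int, n_ab: int, n_bb: int) -> float:
    """Return the p-value of the asymptotic HWE test."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)              # allele frequency of A
    expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2]
    observed = [n_aa, n_ab, n_bb]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return chi2.sf(stat, df=1)

print(f"p = {hwe_chi2(298, 489, 213):.4f}")      # hypothetical genotype counts
```

    For rare alleles or small samples the exact test DHLAS also offers is preferable, since the chi-square approximation breaks down when expected counts are small.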

  1. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    Science.gov (United States)

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A common task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Studying the results obtained in solving BP instances of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method for solving BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters-21578), typically used for label categorization.
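
    For readers unfamiliar with the benchmark, the sketch below generates bars-problem data of the kind the seven BFA methods are compared on: each binary image is a superposition of horizontal and vertical bars, each bar switched on independently with a fixed probability. The image size and bar probability are illustrative assumptions, not the paper's exact settings.

```python
# Bars-problem (BP) data generator: the hidden Boolean factors are the
# 2*size individual bars; a BFA method should recover them from the images.
import numpy as np

def bars_dataset(n_images: int, size: int = 8, p_bar: float = 0.125, seed: int = 0):
    rng = np.random.default_rng(seed)
    images = np.zeros((n_images, size, size), dtype=np.uint8)
    for img in images:
        rows = rng.random(size) < p_bar          # which horizontal bars are on
        cols = rng.random(size) < p_bar          # which vertical bars are on
        img[rows, :] = 1
        img[:, cols] = 1
    return images.reshape(n_images, -1)          # one binary row vector per image

X = bars_dataset(1000)
print(X.shape, round(X.mean(), 3))               # (1000, 64) and the bit density
```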

  2. Efficient Techniques of Sparse Signal Analysis for Enhanced Recovery of Information in Biomedical Engineering and Geosciences

    KAUST Repository

    Sana, Furrukh

    2016-11-01

    enhanced imaging of the subsurface earth and result in substantial savings in terms of convergence time, leading to optimized placement of oil wells. This dissertation demonstrates through detailed experimental analysis that the sparse estimation approach not only enables enhanced information recovery in variety of application areas, but also greatly helps in reducing the computational complexities associated with the problems.

  3. Safety analysis and realization of safe information transmission optical LAN on high-speed railway

    Science.gov (United States)

    Tao, Ying; Wu, Chongqing; Li, Zuoyi

    2001-10-01

    Starting from the structure, status, and function of the safe-information-transmission optical LAN, we analyze the main factors that affect the network's safe transmission performance, based on the principles of safe design. Then, building on the elementary fail-safe principle used in the railway signalling field, we discuss the real network's safety and the realization of its software and hardware. The final experiment indicates that the real network works robustly, performs reliably, and achieves the designed aim. When faults occur, whether produced by hardware, software, or the transmission media, the network can easily and reliably deal with them and transfer to a safe mode that ensures the train's operation. The analysis and design of this network is a meaningful step in constructing our country's dedicated railway communication network, and the network can also be used in other, similar conditions that demand extreme safety.

  4. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    Science.gov (United States)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  5. An analysis of narrative nursing documentation in an otherwise structured intensive care clinical information system.

    Science.gov (United States)

    Moss, Jacqueline; Andison, Margot; Sobko, Heather

    2007-10-11

    Most structured nursing documentation systems allow the entry of data in a free text narrative format. Narrative data, while sometimes necessary, cannot easily be analyzed or linked to the structured portion of the record. This study examined the characteristics of free text narrative documentation entered in an otherwise structured record utilized in a cardiovascular intensive care unit. The analysis revealed that nurses documented 31 categories of narrative entries. Approximately 25% of these entries could have been entered into the structured portion of the record through the use of existing documentation codes. Nurses most frequently used the narrative documentation as a means to communicate summarized information for the coordination of healthcare team members. Development of tools to summarize structured data into an 'at a glance' format could enhance the coordination of healthcare team functioning. The authors discuss these results in the context of developing strategies to increase structured documentation and decrease free text in the patient record.

  6. Hidden Markov model analysis of force/torque information in telemanipulation

    Science.gov (United States)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at the task of segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
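
    A minimal sketch of the core idea: a discrete HMM whose hidden states are task phases and whose observations are quantized force levels, with the Viterbi algorithm recovering the phase sequence. The states, probabilities, and observation sequence are all invented for illustration.

```python
# Viterbi segmentation of a quantized force/torque record into task phases.
import numpy as np

states = ["approach", "contact", "insert"]
A = np.array([[0.90, 0.10, 0.00],                # phase transition matrix
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
B = np.array([[0.80, 0.15, 0.05],                # P(force level | phase)
              [0.20, 0.60, 0.20],
              [0.05, 0.25, 0.70]])
pi = np.array([1.0, 0.0, 0.0])                   # tasks start in "approach"

obs = [0, 0, 1, 1, 2, 2, 2]                      # quantized sensor readings

def viterbi(obs, A, B, pi):
    n, T = A.shape[0], len(obs)
    delta = np.zeros((T, n))                     # best log-prob ending in state j
    psi = np.zeros((T, n), dtype=int)            # best predecessor pointers
    delta[0] = np.log(pi + 1e-300) + np.log(B[:, obs[0]] + 1e-300)
    for t in range(1, T):
        scores = delta[t - 1][:, None] + np.log(A + 1e-300)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + np.log(B[:, obs[t]] + 1e-300)
    path = [int(delta[-1].argmax())]             # backtrack the optimal path
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

print([states[s] for s in viterbi(obs, A, B, pi)])
```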

  7. Determination of signal-to-noise ratio on the base of information-entropic analysis

    CERN Document Server

    Zhanabaev, Z Zh; Kozhagulov, E T; Karibayev, B A

    2016-01-01

    In this paper we suggest a new algorithm for determining the signal-to-noise ratio (SNR), a quantitative measure widely used in science and engineering. Generally, methods for determining SNR rely on an experimentally defined noise power level, or on some conditional noise criterion specified for signal processing. In the present work we describe a method for determining the SNR of chaotic and stochastic signals when the power levels of signal and noise are unknown. For this purpose we use information, defined as the difference between unconditional and conditional entropy. Our theoretical results are confirmed by the analysis of signals that can be described by nonlinear maps and represented as superpositions of harmonic and stochastic signals.
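
    The following sketch illustrates the idea only, not the authors' exact algorithm: estimate the information I = H(Y) - H(Y|X) from histograms of a noisy signal, then invert the Gaussian-channel relation I = 0.5·log2(1 + SNR) to recover an SNR estimate. The binning scheme and the Gaussian-channel inversion are simplifying assumptions.

```python
# Histogram-based estimate of I = H(Y) - H(Y|X) and a derived SNR figure.
import numpy as np

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 400 * np.pi, 200_000))   # clean signal, var = 0.5
y = x + rng.normal(0, 0.5, x.size)                 # noisy observation, true SNR = 2

def entropy_bits(samples, edges):
    """Shannon entropy (bits) of samples discretized on fixed bin edges."""
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

edges = np.histogram_bin_edges(y, bins=64)
h_y = entropy_bits(y, edges)

# H(Y|X): entropy of Y within narrow slices of X, weighted by slice probability.
x_bins = np.digitize(x, np.linspace(-1, 1, 33))
h_y_x = sum((x_bins == b).sum() * entropy_bits(y[x_bins == b], edges)
            for b in np.unique(x_bins)) / x.size

mi = h_y - h_y_x                                   # information in bits
snr = 2 ** (2 * mi) - 1                            # Gaussian-channel inversion
print(f"I ~ {mi:.2f} bits -> SNR ~ {snr:.2f} (true 2.0, only roughly recovered)")
```

    The recovered figure is approximate because the sine input is not Gaussian and finite binning biases the entropies; the point is that no explicit noise-power measurement is needed.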

  8. Geographic information analysis: An ecological approach for the management of wildlife on the forest landscape

    Science.gov (United States)

    Ripple, William J.

    1995-01-01

    This document is a summary of the project funded by NAGW-1460 as part of the Earth Observation Commercialization/Applications Program (EOCAP) directed by NASA's Earth Science and Applications Division. The goal was to work with several agencies to focus on forest structure and landscape characterizations for wildlife habitat applications. New analysis techniques in remote sensing and landscape ecology were used with geographic information systems (GIS). The development of GIS and the emergence of the discipline of landscape ecology provided us with an opportunity to study forest and wildlife habitat resources from a new perspective. New techniques were developed to measure forest structure across scales from the canopy to the regional level. This paper describes the project team, technical advances, and technology adoption process that was used. Reprints of related refereed journal articles are in the Appendix.

  9. More data trumps smarter algorithms: comparing pointwise mutual information with latent semantic analysis.

    Science.gov (United States)

    Recchia, Gabriel; Jones, Michael N

    2009-08-01

    Computational models of lexical semantics, such as latent semantic analysis, can automatically generate semantic similarity measures between words from statistical redundancies in text. These measures are useful for experimental stimulus selection and for evaluating a model's cognitive plausibility as a mechanism that people might use to organize meaning in memory. Although humans are exposed to enormous quantities of speech, practical constraints limit the amount of data that many current computational models can learn from. We follow up on previous work evaluating a simple metric of pointwise mutual information. Controlling for confounds in previous work, we demonstrate that this metric benefits from training on extremely large amounts of data and correlates more closely with human semantic similarity ratings than do publicly available implementations of several more complex models. We also present a simple tool for building simple and scalable models from large corpora quickly and efficiently.
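
    A minimal sketch of the pointwise mutual information metric the study scales up, computed from sentence-level co-occurrence counts on a toy corpus. The corpus is invented and the normalization deliberately loose; the paper's point is precisely that this simple metric needs very large corpora to shine.

```python
# PMI(w1, w2) = log2( p(w1, w2) / (p(w1) * p(w2)) ) from co-occurrence counts.
import math
from collections import Counter
from itertools import combinations

corpus = [
    "the doctor treated the patient",
    "the nurse helped the doctor",
    "the pilot flew the plane",
    "the patient thanked the nurse",
]

pair_counts, word_counts, total = Counter(), Counter(), 0
for sentence in corpus:
    tokens = sentence.split()
    word_counts.update(tokens)
    total += len(tokens)
    for w1, w2 in combinations(tokens, 2):       # sentence-level co-occurrence
        pair_counts[frozenset((w1, w2))] += 1

def pmi(w1, w2):
    p_xy = pair_counts[frozenset((w1, w2))] / total
    p_x, p_y = word_counts[w1] / total, word_counts[w2] / total
    return math.log2(p_xy / (p_x * p_y)) if p_xy > 0 else float("-inf")

print(f"PMI(doctor, nurse) = {pmi('doctor', 'nurse'):.2f}")
print(f"PMI(doctor, plane) = {pmi('doctor', 'plane'):.2f}")   # never co-occur
```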

  10. Implementing informative priors for heterogeneity in meta-analysis using meta-regression and pseudo data.

    Science.gov (United States)

    Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T

    2016-12-20

    Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
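
    For reference, the frequentist baseline the paper compares against can be written in a few lines: the DerSimonian and Laird moment estimator of the between-study variance tau^2 and the resulting random-effects pooled estimate. The effect sizes and within-study variances below are invented.

```python
# DerSimonian-Laird random-effects meta-analysis on simulated study results.
import numpy as np

y = np.array([0.30, 0.10, 0.55, 0.25, 0.40])     # study effect estimates
v = np.array([0.04, 0.09, 0.05, 0.06, 0.08])     # within-study variances

w = 1 / v                                        # fixed-effect weights
y_fe = (w * y).sum() / w.sum()
q = (w * (y - y_fe) ** 2).sum()                  # Cochran's Q heterogeneity stat
c = w.sum() - (w ** 2).sum() / w.sum()
tau2 = max(0.0, (q - (len(y) - 1)) / c)          # DL moment estimator

w_re = 1 / (v + tau2)                            # random-effects weights
y_re = (w_re * y).sum() / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"tau^2 = {tau2:.4f}  pooled = {y_re:.3f} (SE {se:.3f})")
```

    With only five studies, tau^2 is exactly the imprecisely estimated quantity the paper's informative priors are designed to stabilize.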

  11. Analysis of phylogenetic signal in protostomial intron patterns using Mutual Information.

    Science.gov (United States)

    Hill, Natascha; Leow, Alexander; Bleidorn, Christoph; Groth, Detlef; Tiedemann, Ralph; Selbig, Joachim; Hartmann, Stefanie

    2013-06-01

    Many deep evolutionary divergences remain unresolved, such as those among major taxa of the Lophotrochozoa. As alternative phylogenetic markers, the intron-exon structure of eukaryotic genomes and the patterns of absence and presence of spliceosomal introns appear promising. However, given the potential homoplasy of intron presence, the phylogenetic analysis of these data using standard evolutionary approaches has remained a challenge. Here, we used Mutual Information (MI) to estimate the phylogeny of Protostomia using gene structure data, and we compared these results with those obtained with Dollo Parsimony. Using full genome sequences from nine Metazoa, we identified 447 groups of orthologous sequences with 21,732 introns in 4,870 unique intron positions. We determined the shared absence and presence of introns in the corresponding sequence alignments and have made these data available in "IntronBase", a web-accessible and downloadable SQLite database. Our results obtained using Dollo Parsimony are clearly misled by systematic errors that arise from multiple intron loss events, although extensive filtering of the data improved the quality of the estimated phylogenies. Mutual Information, in contrast, performs better with larger datasets, but at the same time it requires a complete data set, which is difficult to obtain for orthologs from a large number of taxa. Nevertheless, Mutual Information-based distances proved useful in analyzing this kind of data, also because the estimation of MI-based distances is independent of evolutionary models, so no pre-definitions of ancestral and derived character states are necessary.

  12. Water flows, energy demand, and market analysis of the informal water sector in Kisumu, Kenya.

    Science.gov (United States)

    Sima, Laura C; Kelner-Levine, Evan; Eckelman, Matthew J; McCarty, Kathleen M; Elimelech, Menachem

    2013-03-01

    In rapidly growing urban areas of developing countries, infrastructure has not been able to cope with population growth. Informal water businesses fulfill unmet water supply needs, yet little is understood about this sector. This paper presents data gathered from quantitative interviews with informal water business operators (n=260) in Kisumu, Kenya, collected during the dry season. Sales volume, location, resource use, and cost were analyzed using material flow accounting and spatial analysis tools. Estimates show that over 76% of the city's water is consumed by less than 10% of the population, who have water piped into their dwellings. The remainder of the population relies on a combination of water sources, including water purchased directly from kiosks (1.5 million m³ per day) and delivered by hand-drawn water-carts (0.75 million m³ per day). Energy audits were performed to compare energy use among the various water sources in the city. Water delivery by truck has the highest energy demand per cubic meter (35 MJ/m³), while the city's tap water has the highest energy use overall (21,000 MJ/day). We group kiosks by neighborhood and compare sales volume and cost with neighborhood-level population data. Contrary to popular belief, we do not find evidence of price gouging; the lowest prices are charged in the highest-demand low-income area. We also see that the informal sector is sensitive to demand, as the number of private boreholes that serve as community water collection points is much larger where demand is greatest.

  13. The role of color information on object recognition: a review and meta-analysis.

    Science.gov (United States)

    Bramão, Inês; Reis, Alexandra; Petersson, Karl Magnus; Faísca, Luís

    2011-09-01

    In this study, we systematically review the scientific literature on the effect of color on object recognition. Thirty-five independent experiments, comprising 1535 participants, were included in a meta-analysis. We found a moderate effect of color on object recognition (d=0.28). Specific effects of moderator variables were analyzed and we found that color diagnosticity is the factor with the greatest moderator effect on the influence of color in object recognition; studies using color diagnostic objects showed a significant color effect (d=0.43), whereas a marginal color effect was found in studies that used non-color diagnostic objects (d=0.18). The present study did not permit the drawing of specific conclusions about the moderator effect of the object recognition task; while the meta-analytic review showed that color information improves object recognition mainly in studies using naming tasks (d=0.36), the literature review revealed a large body of evidence showing positive effects of color information on object recognition in studies using a large variety of visual recognition tasks. We also found that color is important for the ability to recognize artifacts and natural objects, to recognize objects presented as types (line-drawings) or as tokens (photographs), and to recognize objects that are presented without surface details, such as texture or shadow. Taken together, the results of the meta-analysis strongly support the contention that color plays a role in object recognition. This suggests that the role of color should be taken into account in models of visual object recognition.

  14. Using occlusal wear information and finite element analysis to investigate stress distributions in human molars.

    Science.gov (United States)

    Benazzi, Stefano; Kullmer, Ottmar; Grosse, Ian R; Weber, Gerhard W

    2011-09-01

    Simulations based on finite element analysis (FEA) have attracted increasing interest in dentistry and dental anthropology for evaluating the stress and strain distribution in teeth under occlusal loading conditions. Nonetheless, FEA is usually applied without considering changes in contacts between antagonistic teeth during the occlusal power stroke. In this contribution we show how occlusal information can be used to investigate the stress distribution with 3D FEA in lower first molars (M(1)). The antagonistic crowns M(1) and P(2)-M(1) of two dried modern human skulls were scanned by μCT in maximum intercuspation (centric occlusion) contact. A virtual analysis of the occlusal power stroke between M(1) and P(2)-M(1) was carried out in the Occlusal Fingerprint Analyser (OFA) software, and the occlusal trajectory path was recorded, while contact areas per time-step were visualized and quantified. Stress distributions in the M(1) at selected occlusal stages were analyzed in Strand7, considering occlusal information taken from the OFA results for the individual loading direction and loading area. Our FEA results show that the stress pattern changes considerably during the power stroke, suggesting that wear facets have a crucial influence on the distribution of stress across the whole tooth. Grooves and fissures on the occlusal surface are seen as critical locations, as tensile stresses concentrate at these features. Properly accounting for the power stroke kinematics of occluding teeth yields quite different results (less tensile stress in the crown) than the usual loading scenarios based on forces parallel to the long axis of the tooth. This leads to the conclusion that functional studies considering the kinematics of teeth are important to understand biomechanics and interpret the morphological adaptation of teeth.

  15. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method to two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  16. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-12-01

    Full Text Available In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. We then explore the relationship between the localization precision and access point (AP) placement, which can provide valuable suggestions for the design of highly precise localization systems. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations in order to approach the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve the WLAN fingerprint-based localization precision, but also reduce the time overhead.

  17. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas.

  18. Social network analysis as a strategy for monitoring the dissemination of information between hospitals

    Directory of Open Access Journals (Sweden)

    Francisco José Aragão Pedroza CUNHA

    Full Text Available Abstract This article explores the structure of connections between the hospitals that are members of a hospital management innovation and learning network. The study was based on the assumption that communication and the diffusion of knowledge between health service organizations are limited if the organizations are not effectively connected through social networks. Social network analysis was used as a strategy for monitoring the dissemination of information between hospitals. Theoretical concepts of the diffusion of knowledge made it possible to emphasize the role of communication and learning processes as driving forces for health service innovation. The results showed weak interactions between hospitals and a lack of cohesion within the network. There is therefore a need for policies that promote the flow of data and information, which requires the network to be open so as to foster the exchange of innovative processes. Interactions between these hospitals in horizontal and distributed structures have yet to be stimulated, established, incorporated, and developed by individuals, institutions, and health service organizations.

  19. Writing wrongs? An analysis of published discourses about the use of patient information leaflets.

    Science.gov (United States)

    Dixon-Woods, M

    2001-05-01

    Much has been written about how to communicate with patients, but there has been little critical scrutiny of this literature. This paper presents an analysis of publications about the use of patient information leaflets. It suggests that two discourses can be distinguished in this literature. The first of these is the larger of the two. It reflects traditional biomedical concerns and it invokes a mechanistic model of communication in which patients are characterised as passive and open to manipulation in the interests of a biomedical agenda. The persistence of the biomedical model in this discourse is contrasted with the second discourse, which is smaller and more recent in origin. This second discourse draws on a political agenda of patient empowerment, and reflects this in its choice of outcomes of interest, its concern with the use of leaflets as a means of democratisation, and its orientation towards patients. It is suggested that the two discourses, though distinct, are not entirely discrete, and may begin to draw closer as they begin to draw on a wider set of resources, including sociological research and theory, to develop a rigorous theoretically grounded approach to patient information leaflets.

  20. Building Information Modelling and Standardised Construction Contracts: a Content Analysis of the GC21 Contract

    Directory of Open Access Journals (Sweden)

    Aaron Manderson

    2015-08-01

    Full Text Available Building Information Modelling (BIM) is seen as a panacea to many of the ills confronting the Architectural, Engineering and Construction (AEC) sector. In spite of its well-documented benefits, the widespread integration of BIM into the project lifecycle is yet to occur. One commonly identified barrier to BIM adoption is the perceived legal risk associated with its integration, coupled with the need for implementation in a collaborative environment. Many existing standardised contracts used in the Australian AEC industry were drafted before the emergence of BIM. As BIM continues to become ingrained in the delivery process, the shortcomings of these existing contracts have become apparent. This paper reports on a study that reviewed and consolidated the contractual and legal concerns associated with BIM implementation. The findings of the review were used to conduct a qualitative content analysis of the GC21 2nd edition, an Australian standardised construction contract, to identify possible changes to facilitate the implementation of BIM in a collaborative environment. The findings identified a number of changes, including the need to adopt a collaborative contract structure with equitable risk and reward mechanisms, recognition of the model as a contract document, and the need for standardisation of communication/information exchange.

  1. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    Full Text Available Abstract This study analyzes the spatial distribution of kindergarten facilities in the study area using geographic information systems (GIS), in order to test the efficiency of GIS technology for redistributing existing kindergartens, choosing the best locations in the future, and applying standard criteria for selecting suitable kindergarten sites. To achieve this goal, data and information were collected through interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. The kindergartens' spatial patterns were then analyzed in terms of proximity to one another and to surrounding land uses such as streets, highways, and factories, and their concentration, dispersion, clustering, and directional distribution were measured. The study showed the effectiveness of GIS for spatial data analysis. Among the most important findings is that most kindergartens established in Mukalla city did not take into account the criteria set by the authorities; furthermore, almost every district suffers from a shortage in the number of kindergartens, and the distribution of the existing ones is dominated by spatial dispersion.
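
    One standard way to quantify the clustering and dispersion the abstract mentions is the average nearest-neighbour ratio, sketched below on synthetic facility coordinates (R < 1 suggests clustering, R > 1 dispersion). Whether the study used this exact statistic is an assumption; the coordinates and study-area extent are invented.

```python
# Average nearest-neighbour ratio for a point pattern of facilities.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
pts = rng.uniform(0, 10_000, size=(40, 2))       # facility coords in metres
area = 10_000 * 10_000                           # study-area extent (m^2)

tree = cKDTree(pts)
d, _ = tree.query(pts, k=2)                      # k=1 is the point itself
observed = d[:, 1].mean()                        # mean nearest-neighbour distance
expected = 0.5 / np.sqrt(len(pts) / area)        # expectation under randomness
print(f"R = {observed / expected:.2f}")          # ~1 for random placement
```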

  2. A Cost-Benefit Analysis of the Legalization of an Informal Health Sector

    Directory of Open Access Journals (Sweden)

    Roger Lee Mendoza

    2010-01-01

    Full Text Available Problem statement: The Philippines--a developing Southeast Asian country--exemplifies the co-existence of Western-oriented, medical science and indigenous, non-allopathic practices collectively known as Complementary and Alternative Medicine (CAM. The purpose of this study is to determine why and how the economics and politics of CAM’s integration with biomedical science could impede the achievement of health care redistribution in developing countries like the Philippines. Approach: Representative case studies of CAM methods and content analysis of related legislation and policy initiatives were undertaken. Results: The study shed light on the problems, challenges and opportunities in addressing the misdistribution of primary and secondary health care in the Philippines. It found that subjective considerations underlie CAM’s legitimacy. These become critical when scientific validity is at issue, information exchanged is asymmetric and political consensus is not readily available. How these considerations were valued from a cost-benefit perspective shaped actual policy outcomes. Conclusion: The study suggested that proper timing, phasing and collaborative strategies are critical to CAM's institutionalization in light of confining economic conditions and political conflicts over health policy. Both objective and subjective costs and benefits of CAM methods and products should be considered in integrating the formal (biomedical and informal (CAM health sectors, particularly in developing countries where health care is largely dependent on individual or household resource-based access and competitive prowess.

  3. Environmental factor analysis of cholera in China using remote sensing and geographical information systems.

    Science.gov (United States)

    Xu, M; Cao, C X; Wang, D C; Kan, B; Xu, Y F; Ni, X L; Zhu, Z C

    2016-04-01

    Cholera is one of a number of infectious diseases that appears to be influenced by climate, geography and other natural environments. This study analysed the environmental factors behind the spatial distribution of cholera in China. It shows that temperature, precipitation, elevation, and distance to the coastline have a significant impact on the distribution of cholera. It also reveals the oceanic environmental factors associated with cholera in Zhejiang, a coastal province of China, using both remote sensing (RS) and geographical information systems (GIS). The analysis validated the correlation between indirect satellite measurements of sea surface temperature (SST), sea surface height (SSH) and ocean chlorophyll concentration (OCC) and the local number of cholera cases, based on monthly data for the 8 years from 2001 to 2008. The results show that the number of cholera cases has been strongly affected by the variables SST, SSH and OCC. Utilizing this information, a cholera prediction model has been established based on the oceanic and climatic environmental factors. The model indicates that RS and GIS have great potential for designing an early warning system for cholera.
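
    A minimal sketch of the kind of association test described above: correlating a monthly remote-sensing series with monthly case counts. Both series below are synthetic; the study's actual analysis used real SST, SSH and OCC satellite products.

    ```python
    # Correlate a monthly environmental series (here a synthetic SST proxy)
    # with synthetic monthly cholera case counts over 8 years of data.
    import numpy as np

    rng = np.random.default_rng(1)
    months = 96                                  # 8 years of monthly records
    sst = 20 + 5 * np.sin(np.arange(months) * 2 * np.pi / 12) \
          + rng.normal(0, 0.5, months)
    cases = np.maximum(0, 2 * (sst - 20) + rng.normal(0, 1, months)).round()

    r = np.corrcoef(sst, cases)[0, 1]            # Pearson correlation SST vs. cases
    print(f"corr(SST, cases) = {r:.2f}")
    ```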

  4. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson’s correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai and, as such, is of interest to both researchers and financial analysts.
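
    The network-construction step can be sketched as follows. Pearson correlation is used here as a stand-in for the partial-mutual-information estimator (which is considerably more involved), the returns are synthetic, and the minimum spanning tree is one common filtering choice for such market graphs.

    ```python
    # Build a filtered market network from a pairwise dependency matrix
    # via the minimum spanning tree of the distance d = sqrt(2 * (1 - rho)).
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(2)
    returns = rng.normal(size=(500, 10))      # 500 days x 10 synthetic stocks
    rho = np.corrcoef(returns.T)              # stand-in for partial MI
    dist = np.sqrt(2.0 * (1.0 - rho))

    G = nx.Graph()
    n = rho.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            G.add_edge(i, j, weight=dist[i, j])
    mst = nx.minimum_spanning_tree(G)         # the filtered market network
    print(sorted(mst.edges()))
    ```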

  5. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger;

    2014-01-01

    realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial....... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on WiFi...... traces collected in the hospital’s WiFi infrastructure over two weeks observing around 18000 different devices recording more than a billion individual WiFi measurements. For the presented analysis methods we present quantitative performance results, e.g., demonstrating over 95% accuracy for correct...

  6. SMALL Savannah : an information system for the integrated analysis of land use change in the Far North of Cameroon

    NARCIS (Netherlands)

    Fotsing, Eric

    2009-01-01

    SMALL Savannah is an Environmental Information System designed for the integrated analysis and sustainable land management in the savannas region of the Far North of Cameroon. This system combines an observation and spatial analysis module for the representation of phenomena from various geographic

  7. Library and Information Science Research Areas: A Content Analysis of Articles from the Top 10 Journals 2007-8

    Science.gov (United States)

    Aharony, Noa

    2012-01-01

    The current study seeks to describe and analyze journal research publications in the top 10 Library and Information Science journals from 2007-8. The paper presents a statistical descriptive analysis of authorship patterns (geographical distribution and affiliation) and keywords. Furthermore, it displays a thorough content analysis of keywords and…

  8. Stakeholder Analysis as a Medium to Aid Change in Information System Reengineering Projects

    Directory of Open Access Journals (Sweden)

    Jean Davison

    2004-04-01

    Full Text Available The importance of involving stakeholders within a change process is well recognised, and successfully managed change is equally important. Information systems development and redesign is a form of change activity involving people and social issues and therefore resistance to change may occur. A stakeholder identification and analysis (SIA) technique has been developed as an enhancement to PISO® (Process Improvement for Strategic Objectives), a method that engages the users of a system in the problem solving and reengineering of their own work-based problem areas. The SIA technique aids the identification and analysis of system stakeholders, and helps view the projected outcome of system changes and their effect on relevant stakeholders, with attention being given to change resistance to ensure smooth negotiation and achieve consensus. A case study is presented here describing the successful implementation of a direct appointment booking system for patients within the National Health Service in the UK, utilising the SIA technique, which resulted in a feeling of empowerment and ownership of the change among those involved.

  9. Application of information theory for the analysis of cogeneration-system performance

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Kazuki; Ishizaka, Tadashi [Tokyo Gas Co. Ltd., Tokyo (Japan)

    1998-11-01

    Successful cogeneration system performance depends critically upon the correct estimation of load variation and the accuracy of demand prediction. We need not only aggregated annual heat and electricity demands, but also hourly and monthly patterns, in order to evaluate a cogeneration system's performance by computer simulation. These data are usually obtained from actual measurements of energy demand in existing buildings. However, it is extremely expensive to collect actual energy demand data and store it over a long period for many buildings, and we therefore face the question of whether it is really necessary to survey hourly demands. This paper provides a sensitivity analysis of the influence of demand-prediction error upon the efficiency of a cogeneration system, so as to evaluate the relative importance of various demand components. These components are annual energy demand, annual heat-to-electricity ratio, daily load factor and so forth. Our approach employs the concept of information theory to construct a mathematical model. This analysis provides an indication of the relative importance of the demand indices, and identifies what may become a good measure for assessing the efficiency of the cogeneration system for planning purposes. (Author)
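
    As a small sketch of the information-theoretic idea the paper builds on, the snippet below computes the Shannon entropy of an hourly demand profile as a compact measure of load-pattern variability; the profile itself is invented.

    ```python
    # Shannon entropy of a normalised 24-hour demand profile, in bits.
    # A flat profile attains the maximum log2(24); peaky profiles score lower.
    import numpy as np

    demand = np.array([30, 28, 27, 27, 30, 45, 70, 90, 95, 92, 90, 88,
                       85, 84, 86, 90, 95, 100, 98, 80, 60, 50, 40, 33], float)
    p = demand / demand.sum()                 # normalise to a distribution
    H = -np.sum(p * np.log2(p))               # entropy in bits
    print(f"H = {H:.3f} bits (max {np.log2(len(p)):.3f})")
    ```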

  10. Weibull Information Fusion Analysis of Semiconductor Quality: Key Technology for Manufacturing Execution System Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and the survival of the enterprise. This study applies a dynamic reliability growth analysis method to the reliability growth of a manufacturing execution system in order to improve product quality. Drawing on the assumptions of the classical Duane model and the TGP growth-tracking and forecasting model, a Weibull distribution model was established from the failure data. Combining the median-rank and average-rank methods with linear regression and least-squares estimation, Weibull information fusion reliability growth curves were fitted. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimates; analysis of the failure data shows that the method is essentially consistent with the test and evaluation modelling process. The median rank is a statistical method for determining the distribution function of a random variable, and is well suited to problems such as complex systems with limited sample sizes. The method therefore has considerable engineering application value.
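
    The median-rank and least-squares step described above can be sketched as follows: fit a Weibull distribution to ordered failure times via Bernard's median-rank approximation and a linearised regression. The failure times are invented for illustration.

    ```python
    # Weibull fit by median-rank regression: F(t) = 1 - exp(-(t/eta)^beta)
    # linearises to ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta).
    import numpy as np

    t = np.sort(np.array([55., 90., 140., 210., 300., 410., 540., 700.]))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)                 # Bernard's median-rank estimate

    x = np.log(t)
    y = np.log(-np.log(1.0 - F))              # Weibull linearisation
    beta, c = np.polyfit(x, y, 1)             # slope = shape parameter
    eta = np.exp(-c / beta)                   # scale parameter
    print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f}")
    ```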

  11. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    Science.gov (United States)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder-driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks. Through the case study, we will demonstrate how a stakeholder-driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically

  12. Analysis of Information Publicity System%信息公开制度探析

    Institute of Scientific and Technical Information of China (English)

    朱庆华; 颜祥林

    2001-01-01

    The information publicity system is a legal guarantee for the exploitation of government information resources. This paper discusses the reasons for establishing such a system and, drawing on the Japanese Freedom of Information Act, discusses the main content of an information publicity system.

  13. The Influence of Place-Based Communities on Information Behavior: A Comparative Grounded Theory Analysis

    Science.gov (United States)

    Gibson, Amelia N.

    2013-01-01

    This study examines the effect of experiential place and local community on information access and behavior for two communities of parents of children with Down syndrome. It uncovers substantive issues associated with health information seeking, government and education-related information access, and information overload and avoidance within the…

  14. On the use of information theory for the analysis of the relationship between neural and imaging signals.

    Science.gov (United States)

    Panzeri, Stefano; Magri, Cesare; Logothetis, Nikos K

    2008-09-01

    Functional magnetic resonance imaging (fMRI) is a widely used method for studying the neural basis of cognition and of sensory function. A potential problem in the interpretation of fMRI data is that fMRI measures neural activity only indirectly, as a local change of deoxyhemoglobin concentration due to the metabolic demands of neural function. To build correct sensory and cognitive maps in the human brain, it is thus crucial to understand whether fMRI and neural activity convey the same type of information about external correlates. While a substantial experimental effort has been devoted to the simultaneous recordings of hemodynamic and neural signals, so far, the development of analysis methods that elucidate how neural and hemodynamic signals represent sensory information has received less attention. In this article, we critically review why the analytical framework of information theory, the mathematical theory of communication, is ideally suited to this purpose. We review the principles of information theory and explain how they could be applied to the analysis of fMRI and neural signals. We show that a critical advantage of information theory over more traditional analysis paradigms commonly used in the fMRI literature is that it can elucidate, within a single framework, whether an empirically observed correlation between neural and fMRI signals reflects either a similar stimulus tuning or a common source of variability unrelated to the external stimuli. In addition, information theory determines the extent to which these shared sources of stimulus signal and of variability lead fMRI and neural signals to convey similar information about external correlates. We then illustrate the formalism by applying it to the analysis of the information carried by different bands of the local field potential. We conclude by discussing the current methodological challenges that need to be addressed to make the information-theoretic approach more robustly applicable to the
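
    As a minimal sketch of the central quantity in this framework, the snippet below estimates the mutual information between a discrete stimulus and a binned response from a joint histogram. This is the plug-in estimator; real analyses add the bias corrections the article discusses, and all data here are synthetic.

    ```python
    # Plug-in mutual information I(S;R) between a discrete stimulus and a
    # continuous response binned into a joint histogram.
    import numpy as np

    def mutual_information(x, y, bins):
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p = joint / joint.sum()
        px = p.sum(axis=1, keepdims=True)     # marginal over responses
        py = p.sum(axis=0, keepdims=True)     # marginal over stimuli
        nz = p > 0                            # avoid log(0) terms
        return float(np.sum(p[nz] * np.log2(p[nz] / (px * py)[nz])))

    rng = np.random.default_rng(3)
    stim = rng.integers(0, 4, 2000)           # 4 stimulus conditions
    resp = stim + rng.normal(0, 0.8, 2000)    # response tuned to the stimulus
    print(f"I(S;R) = {mutual_information(stim, resp, [4, 8]):.2f} bits")
    ```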

  15. Informed consent and placebo effects: a content analysis of information leaflets to identify what clinical trial participants are told about placebos.

    Directory of Open Access Journals (Sweden)

    Felicity L Bishop

    Full Text Available BACKGROUND: Placebo groups are used in randomised clinical trials (RCTs) to control for placebo effects, which can be large. Participants in trials can misunderstand written information, particularly regarding technical aspects of trial design such as randomisation; the adequacy of written information about placebos has not been explored. We aimed to identify what participants in major RCTs in the UK are told about placebos and their effects. METHODS AND FINDINGS: We conducted a content analysis of 45 Participant Information Leaflets (PILs) using quantitative and qualitative methodologies. PILs were obtained from trials on a major registry of current UK clinical trials (the UKCRN database). Eligible leaflets were received from 44 non-commercial trials but only 1 commercial trial. The main limitation is the low response rate (13.5%), but characteristics of included trials were broadly representative of all non-commercial trials on the database. 84% of PILs were for trials with 50:50 randomisation ratios, yet in almost every comparison the target treatments were prioritized over the placebos. Placebos were referred to significantly less frequently than target treatments (7 vs. 27 mentions, p<0.001) and were significantly less likely than target treatments to be described as triggering either beneficial effects (1 vs. 45, p<0.001) or adverse effects (4 vs. 39, p<0.001). 8 PILs (18%) explicitly stated that the placebo treatment was either undesirable or ineffective. CONCLUSIONS: PILs from recent high quality clinical trials emphasise the benefits and adverse effects of the target treatment, while largely ignoring the possible effects of the placebo. Thus they provide incomplete and at times inaccurate information about placebos. Trial participants should be more fully informed about the health changes that they might experience from a placebo. To do otherwise jeopardises informed consent and is inconsistent with not only the science of placebos but also the

  16. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    Full Text Available In this paper we present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents, and various indicators calculated from them, allow us to assess the performance and profitability of different functions and levels of the company's activity. The paper supports the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss account or the cash flow statement that better reflects the performance of a business? Based on the specialty literature studied, we attempt a conceptual, analytical and practical approach to the term performance, reviewing some terminological acceptations of the term as well as the main indicators of performance analysis based on the profit and loss account and the cash flow statement: aggregated indicators (also known as intermediary balances of administration), economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and the rate of generating operating cash out of gross operating result. At the same time we take a comparative approach to the profit and loss account and the cash flow statement, outlining the main advantages and disadvantages of these documents. In order to demonstrate the above theoretical assessments, we analyze these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  17. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. Majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as: Neural Networks, and rapid classification via down sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES) was attempted to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, accuracy of the said depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 to 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).

  18. Extraction of Benthic Cover Information from Video Tows and Photographs Using Object-Based Image Analysis

    Science.gov (United States)

    Estomata, M. T. L.; Blanco, A. C.; Nadaoka, K.; Tomoling, E. C. M.

    2012-07-01

    Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. Majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as: Neural Networks, and rapid classification via down sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES) was attempted to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, accuracy of the said depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 to 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
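
    The area rules quoted in the abstract translate directly into rule-set logic of the following form. The pixel-area thresholds come from the abstract itself; the texture (standard-deviation) cut-off is an invented placeholder, since the abstract does not give its value.

    ```python
    # Rule-set classification of image objects by area (pixels) and spectral
    # standard deviation, in the spirit of the OBIA rules described above.
    def classify_object(area_px: int, spectral_std: float) -> str:
        TEXTURED = 12.0                       # hypothetical texture threshold
        if area_px <= 700 and spectral_std > TEXTURED:
            return "fish"                     # small, textured objects
        if 700 < area_px <= 10_000 and spectral_std > TEXTURED:
            return "rubble"                   # mid-sized, textured objects
        return "coral_or_sand"                # remaining cover types

    print(classify_object(450, 15.2))         # -> fish
    print(classify_object(5200, 14.0))        # -> rubble
    ```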

  19. Biological data analysis as an information theory problem: multivariable dependence measures and the shadows algorithm.

    Science.gov (United States)

    Sakhanenko, Nikita A; Galas, David J

    2015-11-01

    Information theory is valuable in multiple-variable analysis for being model-free and nonparametric, and for its modest sensitivity to undersampling. We previously introduced a general approach to finding multiple dependencies that provides accurate measures of levels of dependency for subsets of variables in a data set, which is significantly nonzero only if the subset of variables is collectively dependent. This is useful, however, only if we can avoid a combinatorial explosion of calculations for increasing numbers of variables. The proposed dependence measure for a subset of variables, τ, differential interaction information, Δ(τ), has the property that for subsets of τ some of the factors of Δ(τ) are significantly nonzero, when the full dependence includes more variables. We use this property to suppress the combinatorial explosion by following the "shadows" of multivariable dependency on smaller subsets. Rather than calculating the marginal entropies of all subsets at each degree level, we need to consider only calculations for subsets of variables with appropriate "shadows." The number of calculations for n variables at a degree level of d grows, therefore, at a much smaller rate than the binomial coefficient C(n, d), but depends on the parameters of the "shadows" calculation. This approach, avoiding a combinatorial explosion, enables the use of our multivariable measures on very large data sets. We demonstrate this method on simulated data sets, and characterize the effects of noise and sample numbers. In addition, we analyze a data set of a few thousand mutant yeast strains interacting with a few thousand chemical compounds.

  20. Opinions of ICT Teachers about Information Technology Course Implementations: A Social Media Analysis

    Directory of Open Access Journals (Sweden)

    Alaattin Parlakkılıç

    2014-01-01

    Full Text Available The use of Information and Communication Technologies (ICT) is increasing in education, and ICT teachers have important roles and responsibilities in the ICT world. In this study, the problems of ICT teachers and the solutions they suggested were evaluated by analyzing their messages and the information they shared on the Internet and social media. Document analysis was used as the qualitative data collection method. The research group consisted of ICT teachers who worked in Turkish secondary schools from July 2012 to July 2013 and used social media. Teachers' opinions and suggested solutions gathered from social media (forums, blogs, Facebook and Twitter) were categorized into areas covering the compulsory course, curriculum, personal rights, job definitions, the Fatih Project, ICT infrastructure, and innovative ideas. The data were evaluated categorically by frequency and percentage. The study found that the suggested solutions are a great asset for innovation and change in education. In this context, problems concerning employees' personal rights (f=61, 31.9%) were the most important issue, and the suggested solutions call for legal arrangements. In second place was the compulsory course (f=49, 29.9%). The inadequacy of the curriculum and the need to update it (f=28, 14.6%) was the third most discussed topic. Progressive applications and renovations (f=23, 12.1%) were in fourth place. In fifth place, it was stated that the probability of success of the Fatih Project (f=21, 11%) was low in the current situation and that ICT teachers must be included in the project. Lastly, it was seen that infrastructure and support (f=18, 9.5%) were required for development.

  1. Measuring child maltreatment using multi-informant survey data: a higher-order confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    Giovanni A. Salum

    2016-03-01

    Full Text Available Objective To investigate the validity and reliability of a multi-informant approach to measuring child maltreatment (CM), comprising seven questions assessing CM administered to children and their parents in a large community sample. Methods Our sample comprised 2,512 children aged 6 to 12 years and their parents. Child maltreatment was assessed with three questions answered by the children and four answered by their parents, covering physical abuse, physical neglect, emotional abuse and sexual abuse. Confirmatory factor analysis was used to compare the fit indices of different models. Convergent and divergent validity were tested using parent-report and teacher-report scores on the Strengths and Difficulties Questionnaire. Discriminant validity was investigated using the Development and Well-Being Assessment to divide subjects into five diagnostic groups: typically developing controls (n = 1,880), fear disorders (n = 108), distress disorders (n = 76), attention deficit hyperactivity disorder (n = 143) and oppositional defiant disorder/conduct disorder (n = 56). Results A higher-order model with one higher-order factor (child maltreatment) encompassing two lower-order factors (child report and parent report) exhibited the best fit to the data and this model's reliability results were acceptable. As expected, child maltreatment was positively associated with measures of psychopathology and negatively associated with prosocial measures. All diagnostic category groups had higher levels of overall child maltreatment than typically developing children. Conclusions We found evidence for the validity and reliability of this brief measure of child maltreatment using data from a large survey combining information from parents and their children.

  2. An information-theoretic machine learning approach to expression QTL analysis.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Full Text Available Expression Quantitative Trait Locus (eQTL) analysis is a powerful tool to study the biological mechanisms linking the genotype with gene expression. Such analyses can identify genomic locations where genotypic variants influence the expression of genes, both in close proximity to the variant (cis-eQTL) and on other chromosomes (trans-eQTL). Many traditional eQTL methods are based on a linear regression model. In this study, we propose a novel method to identify eQTL associations with information theory and machine learning approaches. Mutual Information (MI) is used to describe the association between genetic marker and gene expression. MI can detect both linear and non-linear associations, and it can capture the heterogeneity of the population. Advanced feature selection methods, Maximum Relevance Minimum Redundancy (mRMR) and Incremental Feature Selection (IFS), were applied to optimize the selection of the genes affected by the genetic marker. When we applied our method to a study of apoE-deficient mice, it was found that the cis-acting eQTLs are stronger than trans-acting eQTLs but there are more trans-acting eQTLs than cis-acting eQTLs. We compared our results (mRMR.eQTL) with R/qtl and MatrixEQTL (modelLINEAR and modelANOVA). In female mice, 67.9% of mRMR.eQTL results can be confirmed by at least two other methods while only 14.4% of R/qtl results can be confirmed by at least two other methods. In male mice, 74.1% of mRMR.eQTL results can be confirmed by at least two other methods while only 18.2% of R/qtl results can be confirmed by at least two other methods. Our methods provide a new way to identify the association between genetic markers and gene expression. Our software is available from supporting information.
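
    A compact sketch of the mRMR idea used above: greedily pick markers that maximise relevance, MI(marker; trait), minus mean redundancy with the markers already selected. MI here is a simple plug-in estimate on discrete genotypes, and the data are synthetic; real eQTL pipelines are considerably more careful.

    ```python
    # Greedy mRMR feature selection with plug-in mutual information on
    # discrete data (genotypes coded 0/1/2, a discretised trait).
    import numpy as np

    def mi(x, y):
        joint = np.histogram2d(x, y, bins=[len(np.unique(x)),
                                           len(np.unique(y))])[0]
        p = joint / joint.sum()
        px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
        nz = p > 0
        return float(np.sum(p[nz] * np.log2(p[nz] / (px * py)[nz])))

    def mrmr(X, y, k):
        selected, candidates = [], list(range(X.shape[1]))
        while len(selected) < k:
            def score(j):
                red = (np.mean([mi(X[:, j], X[:, s]) for s in selected])
                       if selected else 0.0)
                return mi(X[:, j], y) - red   # relevance minus redundancy
            best = max(candidates, key=score)
            selected.append(best)
            candidates.remove(best)
        return selected

    rng = np.random.default_rng(4)
    X = rng.integers(0, 3, size=(300, 20))    # 20 markers, genotypes 0/1/2
    y = (X[:, 5] + rng.integers(0, 2, 300)) % 3   # trait driven by marker 5
    print(mrmr(X, y, 3))                      # marker 5 should rank first
    ```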

  3. Geographic information system for fusion and analysis of high-resolution remote sensing and ground data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1993-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements, in the context of a Geographical Information System (GIS), integrated with specialized image processing software. We will use this integrated system to analyze the data from two Case Studies, one at a boreal forest site, the other a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990's, unprecedented amounts of high-resolution images from space of the Earth's surface will become available to the applications scientist from the LANDSAT/TM series, European and Japanese ERS-1 satellites, RADARSAT and SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study his site of interest, may be faced with massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing his results and validating his model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the Case Studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case 1) calibrated DC-8 SAR (Synthetic Aperture Radar) data and extensive ground truth measurement are already at our disposal. The data set shows documented evidence of temporal change. The Belize Forest Experiment (Case 2) will produce calibrated DC-8 SAR

  4. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    Science.gov (United States)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA describes stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation the origin, goal, steps and practical tools available at each step of CRIDA will be explained. In two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al., and "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.), the application of CRIDA to case studies is explained.

  5. Bayesian Information-Gap Decision Analysis Applied to a CO2 Leakage Problem

    Science.gov (United States)

    O'Malley, D.; Vesselinov, V. V.

    2014-12-01

    We describe a decision analysis in the presence of uncertainty that combines a non-probabilistic approach (information-gap decision theory) with a probabilistic approach (Bayes' theorem). Bayes' theorem is one of the most popular techniques for probabilistic uncertainty quantification (UQ). It is effective in many situations, because it updates our understanding of the uncertainties by conditioning on real data using a mathematically rigorous technique. However, the application of Bayes' theorem in science and engineering is not always rigorous. There are two reasons for this: (1) We can enumerate the possible outcomes of dice-rolling, but not the possible outcomes of real-world contamination remediation; (2) We can precisely determine conditional probabilities for coin-tossing, but substantial uncertainty surrounds the conditional probabilities for real-world contamination remediation. Of course, Bayes' theorem is rigorously applicable beyond dice-rolling and coin-tossing, but even in cases that are constructed to be simple with ostensibly good probabilistic models, applying Bayes' theorem to the real world may not work as well as one might expect. Bayes' theorem is rigorously applicable only if all possible events can be described, and their conditional probabilities can be derived rigorously. Outside of this domain, it may still be useful, but its use lacks at least some rigor. The information-gap approach allows us to circumvent some of the highlighted shortcomings of Bayes' theorem. In particular, it provides a way to account for possibilities beyond those described by our models, and a way to deal with uncertainty in the conditional distribution that forms the core of Bayesian analysis. We have developed a three-tiered technique that enables one to make scientifically defensible decisions in the face of severe uncertainty such as is found in many geologic problems. To demonstrate the applicability, we apply the technique to a CO2 leakage problem. The goal is to
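
    A toy sketch of the information-gap robustness idea: given a nominal model and a performance requirement, find the largest uncertainty horizon whose worst case still satisfies the requirement. The leakage model, the structure of the horizon, and all numbers below are invented for illustration and are not the authors' formulation.

    ```python
    # Info-gap robustness: largest horizon alpha such that the worst-case
    # model within that horizon still meets the performance requirement.
    import numpy as np

    q0, k_nominal = 1.0, 0.05                 # nominal leakage model q(a)=q0*exp(k*a)
    threshold = 3.0                           # maximum acceptable leakage

    def worst_case_leakage(alpha, a=10.0):
        # worst case within horizon alpha: k may deviate up to alpha upward
        return q0 * np.exp((k_nominal + alpha) * a)

    alphas = np.linspace(0.0, 0.2, 2001)
    ok = np.array([worst_case_leakage(al) <= threshold for al in alphas])
    idx = np.nonzero(ok)[0]
    robustness = alphas[idx[-1]] if idx.size else 0.0
    print(f"robustness (largest safe horizon) = {robustness:.3f}")
    ```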

  6. Influence analysis of information erupted on social networks based on SIR model

    Science.gov (United States)

    Zhou, Xue; Hu, Yong; Wu, Yue; Xiong, Xi

    2015-07-01

    In this paper, based on the similarity between the chain-reaction principle and the characteristics of information propagation on social networks, we propose the term "information bomb". Dynamical evolution equations are set up based on complex networks and the SIR model. Methods are then given to evaluate the four indexes of bomb power: influence breadth, influence strength, peak time and relaxation time. Finally, the power of the information is ascertained through these indexes. The process of information propagation is simulated to illustrate its spreading characteristics, the parameters that affect the power of the information bomb are analyzed, and some methods for controlling the propagation of information are given.
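
    A minimal sketch of the SIR-style spreading dynamics described above, using scipy's ODE integrator; from the trajectory one can read off three of the "bomb power" indexes: influence breadth (final recovered fraction), influence strength (peak infected fraction) and peak time. The rates are invented.

    ```python
    # SIR dynamics on normalised populations: S' = -beta*S*I,
    # I' = beta*S*I - gamma*I, R' = gamma*I.
    import numpy as np
    from scipy.integrate import odeint

    def sir(y, t, beta, gamma):
        S, I, R = y
        return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

    t = np.linspace(0, 50, 501)
    sol = odeint(sir, [0.999, 0.001, 0.0], t, args=(0.8, 0.2))
    S, I, R = sol.T
    print(f"peak time          = {t[I.argmax()]:.1f}")
    print(f"influence strength = {I.max():.3f}")
    print(f"influence breadth  = {R[-1]:.3f}")
    ```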

  7. Politics and technology in health information systems development: a discourse analysis of conflicts addressed in a systems design group.

    Science.gov (United States)

    Irestig, Magnus; Timpka, Toomas

    2008-02-01

    Different types of disagreements must be managed during the development of health information systems. This study examines the antagonisms discussed during the design of an information system for 175,000 users in a public health context. Discourse analysis methods were used for data collection and analysis. Three hundred and twenty-six conflict events were identified from four design meetings and divided into 16 categories. There were no differences regarding the types of conflicts that the different participants brought into the design discussions. Instead, conflict occurrence was primarily affected by the agendas that set the stage for examinations and debates. The results indicate that the selection of design method and the structure used for the meetings are important factors for the manner in which conflicts are brought into consideration during health information system design. Further studies comparing participatory and non-participatory information system design practices in health service settings are warranted.

  8. A Study Space Analysis and Narrative Review of Trauma-Informed Mediators of Dating Violence.

    Science.gov (United States)

    Cascardi, Michele; Jouriles, Ernest N

    2016-07-28

    Research linking child maltreatment and dating violence in adolescence and emerging adulthood has proliferated in the past two decades; however, the precise mechanisms by which these experiences are related remain elusive. A trauma-informed perspective suggests four particularly promising mediators: maladaptive attachment, emotion regulation difficulties, emotional distress, and hostility. The current article characterizes the status of the empirical literature examining these four mediators using a study space analysis and a narrative review of existing research. An extensive literature search identified 42 papers (44 studies) that met the following criteria: (1) at least one measure of child maltreatment (emotional, physical, sexual, neglect, or exposure to intimate partner violence); (2) a measure of one of the four mediator variables; (3) a measure of dating violence perpetration or victimization; and (4) a sample of adolescents or young adults. The study space analysis suggested several important observations about the research on this topic, including a dearth of studies examining hostility as a mediator and little research using prospective designs or clinical samples. There are also limitations with the conceptualization and measurement of dating violence, child maltreatment, and some of the mediator variables. In addition, few studies examined more than one mediator variable in the same study. The narrative review suggested that maladaptive attachment (specifically insecure attachment styles), emotion regulation difficulties (specifically regulation of the emotion of anger), and emotional distress construed broadly represent promising mediators of the association between child maltreatment and dating violence, but conclusions about mediation must remain tentative given the state of the literature. The discussion offers recommendations for improved theoretical and empirical rigor to advance future research on mechanisms linking child maltreatment and dating

  9. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume synchronized Probability of Informed trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal hours before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, which culminated in $1 trillion of market value disappearing, only to recover those losses twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set up a handful of free parameters. The values of these parameters significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of parameter combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search) by Audet, Le Digabel, and Tribes (2009) and Le Digabel (2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we have conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). Results have shown the dominance of two parameters in the computation of the FPR. Using the outputs from the NOMAD optimization and the sensitivity analysis, we recommend a range of values for each of the free parameters that perform well on a large set of futures trading records.
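
    A hedged sketch of a basic VPIN computation, showing where the free parameters enter. It uses bulk-volume classification and, for simplicity, fixed-length bar buckets rather than true equal-volume buckets; the bucket size, window length and trade data below are all illustrative assumptions.

    ```python
    # Simplified VPIN: classify volume as buy/sell via the normal CDF of
    # standardised price changes, bucket it, and average the order imbalance.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    dp = rng.normal(0, 1, 2000)               # per-bar price changes (synthetic)
    vol = rng.integers(50, 150, 2000).astype(float)

    sigma = dp.std()
    buy_frac = norm.cdf(dp / sigma)           # bulk-volume classification
    buy, sell = vol * buy_frac, vol * (1 - buy_frac)

    bucket = 50                               # bars per bucket (free parameter)
    nb = len(vol) // bucket
    imb = np.array([abs(buy[i*bucket:(i+1)*bucket].sum() -
                        sell[i*bucket:(i+1)*bucket].sum()) for i in range(nb)])
    V = np.array([vol[i*bucket:(i+1)*bucket].sum() for i in range(nb)])

    window = 10                               # buckets per window (free parameter)
    vpin = (np.convolve(imb, np.ones(window), "valid")
            / np.convolve(V, np.ones(window), "valid"))
    print(f"latest VPIN = {vpin[-1]:.3f}")
    ```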

  10. Finite-block-length analysis in classical and quantum information theory.

    Science.gov (United States)

    Hayashi, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.

  11. Effects of information on young consumers' willingness to pay for genetically modified food: experimental auction analysis.

    Science.gov (United States)

    Kajale, Dilip B; Becker, T C

    2014-01-01

    This study examines the effects of information on consumers' willingness to pay (WTP) for genetically modified food (GMF). We used the Vickrey second-price experimental auction method to elicit consumer WTP for GM potato chips and a GM soya-chocolate bar. The sample used in this study was university students from Delhi, India. Four information formats (positive, negative, no information, and combined information about GM technology) were used for the examination. The results show that when students received the combined information they were willing to pay a premium of around 17%-20% for GMF, and when they received the negative information they demanded a discount of around 22% for GMF, while the positive- and no-information formats alone had no considerable effect on consumers' WTP for GMF. Overall, our findings suggest that when marketing GMF in India, the best strategy is to provide combined information about GM technology.

  12. 78 FR 68463 - Notice of Emergency Approval of an Information Collection: Regional Analysis of Impediments...

    Science.gov (United States)

    2013-11-14

    ... FURTHER INFORMATION CONTACT: Lynnette McRae, Grants Management Specialist, Office of Sustainable Housing.... Description of the need for the information and proposed use: HUD's Office of Sustainable Housing...

  13. GENEVIEW and the DNACE data bus: computational tools for analysis, display and exchange of genetic information.

    OpenAIRE

    1986-01-01

    We describe an interactive computational tool, GENEVIEW, that allows the scientist to retrieve, analyze, display and exchange genetic information. The scientist may request a display of information from a GenBank locus, request that a restriction map be computed, stored and superimposed on GenBank information, and interactively view this information. GENEVIEW provides an interface between the GenBank data base and the programs of the Lilly DNA Computing Environment (DNACE). This interface sto...

  14. Research study on analysis/use technologies of genome information; Genome joho kaidoku riyo gijutsu no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    For wide use of genome information in the industrial field, the required R and D was surveyed from the standpoints of biology and information science. To clarify the present state and issues of the international research on genome analysis, the genome map as well as sequence and function information are first surveyed. The current analysis/use technologies of genome information are analyzed, and the following are summarized: prediction and identification of gene regions in genome sequences, techniques for searching and selecting useful genes, and techniques for predicting the expression of gene functions and the gene-product structure and functions. It is recommended that R and D and data collection/interpretation necessary to clarify inter-gene interactions and information networks should be promoted by integrating Japanese advanced know-how and technologies. As examples of the impact of the research results on industry and society, the present state and future expected effect are summarized for medicines, diagnosis/analysis instruments, chemicals, foods, agriculture, fishery, animal husbandry, electronics, environment and information. 278 refs., 42 figs., 5 tabs.

  15. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.

    2013-11-01

    Full Text Available The goal of the article is to identify directions for improving the accounting system at an enterprise so that strategic analysis procedures are supplied with trustworthy information. Historical methods (to study the conditions under which strategic analysis appeared and developed) and logical methods (to identify directions for improving accounting) were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information that characterises such resources of an enterprise as research and development, personnel and the quality of products (services). Among the indicators that provide such information, the article selects the costs of innovation activity and of personnel training, the accounting of which is not sufficiently regulated. To meet the information requirements of analysts, it offers to improve accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by places where expenses arise and by responsibility centres; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and the introduction of innovations. The article offers a form for calculating the savings that result from cost reductions obtained through governmental privileges granted to enterprises that introduce innovations and train personnel.

  16. Spatial analysis of lettuce downy mildew using geostatistics and geographic information systems.

    Science.gov (United States)

    Wu, B M; van Bruggen, A H; Subbarao, K V; Pennings, G G

    2001-02-01

    The epidemiology of lettuce downy mildew has been investigated extensively in coastal California. However, the spatial patterns of the disease and the distance that Bremia lactucae spores can be transported have not been determined. During 1995 to 1998, we conducted several field- and valley-scale surveys to determine spatial patterns of this disease in the Salinas valley. Geostatistical analyses of the survey data at both scales showed that the influence range of downy mildew incidence at one location on incidence at other locations was between 80 and 3,000 m. A linear relationship was detected between semivariance and lag distance at the field scale, although no single statistical model could fit the semivariograms at the valley scale. Spatial interpolation by the inverse distance weighting method with a power of 2 resulted in plausible estimates of incidence throughout the valley. Cluster analysis in geographic information systems on the interpolated disease incidence from different dates demonstrated that the Salinas valley could be divided into two areas, north and south of Salinas City, with high and low disease pressure, respectively. Seasonal and spatial trends along the valley suggested that the distinction between the downy mildew conducive and nonconducive areas might be determined by environmental factors.
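
    The empirical semivariogram underlying the geostatistical analysis above can be sketched as follows: gamma(h) is half the mean squared difference of incidence at paired locations, binned by separation distance. Locations, incidence values and the 250 m lag bins below are synthetic.

    ```python
    # Empirical semivariogram: gamma(h) = 0.5 * mean((z_i - z_j)^2) over
    # location pairs whose separation falls in each distance bin.
    import numpy as np

    rng = np.random.default_rng(6)
    xy = rng.uniform(0, 3000, size=(120, 2))              # field locations (m)
    z = np.sin(xy[:, 0] / 800) + rng.normal(0, 0.3, 120)  # incidence proxy

    i, j = np.triu_indices(len(z), k=1)
    h = np.hypot(*(xy[i] - xy[j]).T)                      # pairwise distances
    sq = 0.5 * (z[i] - z[j]) ** 2                         # semivariance terms

    edges = np.arange(0, 3000, 250)                       # 250 m lag bins
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (h >= lo) & (h < hi)
        if m.any():
            print(f"lag {lo:4.0f}-{hi:4.0f} m: gamma = {sq[m].mean():.3f}")
    ```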

  17. QTL linkage analysis of connected populations using ancestral marker and pedigree information.

    Science.gov (United States)

    Bink, Marco C A M; Totir, L Radu; ter Braak, Cajo J F; Winkler, Christopher R; Boer, Martin P; Smith, Oscar S

    2012-04-01

    The common assumption in quantitative trait locus (QTL) linkage mapping studies that parents of multiple connected populations are unrelated is unrealistic for many plant breeding programs. We remove this assumption and propose a Bayesian approach that clusters the alleles of the parents of the current mapping populations from locus-specific identity by descent (IBD) matrices that capture ancestral marker and pedigree information. Moreover, we demonstrate how the parental IBD data can be incorporated into a QTL linkage analysis framework by using two approaches: a Threshold IBD model (TIBD) and a Latent Ancestral Allele Model (LAAM). The TIBD and LAAM models are empirically tested via numerical simulation based on the structure of a commercial maize breeding program. The simulations included a pilot dataset with closely linked QTL on a single linkage group and 100 replicated datasets with five linkage groups harboring four unlinked QTL. The simulation results show that including parental IBD data (similarly for TIBD and LAAM) significantly improves the power and particularly accuracy of QTL mapping, e.g., position, effect size and individuals' genotype probability without significantly increasing computational demand.

  18. Explanatory Information Analysis for Day-Ahead Price Forecasting in the Iberian Electricity Market

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2015-09-01

    Full Text Available This paper presents an analysis of the importance of a set of explanatory (input) variables for the day-ahead price forecast in the Iberian Electricity Market (MIBEL). The available input variables include extensive hourly time series records of weather forecasts, previous prices, and regional aggregation of power generations and power demands. The paper presents comparisons of the forecasting results achieved with a model that includes all these available input variables (the EMPF model) with those obtained by other forecasting models containing a reduced set of input variables. These comparisons identify the most important variables for forecasting purposes. In addition, a novel Reference Explanatory Model for Price Estimations (REMPE) that achieves hourly price estimations by using the actual power generations and power demands of the day is described in the paper, and offers the lowest limit for the forecasting error of the EMPF model. All the models have been implemented using the same technique (artificial neural networks) and have been satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). The relative importance of each explanatory variable is identified for the day-ahead price forecasts in the MIBEL. The comparisons also allow guidelines to be outlined on the value of the different types of input information.
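
    As a minimal stand-in for the paper's ANN models, the sketch below fits a small multilayer perceptron to invented explanatory variables (demand, wind and lagged price) and reports the test error; scikit-learn's MLPRegressor and the feature choice are assumptions, not the paper's actual implementation.

    ```python
    # Small MLP mapping explanatory variables to the day-ahead price,
    # trained on synthetic data with a known linear-plus-noise structure.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)
    n = 2000
    demand = rng.normal(30, 5, n)             # forecast demand (GW, invented)
    wind = rng.normal(10, 3, n)               # forecast wind generation
    lag_price = rng.normal(50, 8, n)          # previous-day price
    price = 0.8 * demand - 1.2 * wind + 0.5 * lag_price + rng.normal(0, 2, n)

    X = np.column_stack([demand, wind, lag_price])
    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    model.fit(X[:1500], price[:1500])         # train / test split
    mae = np.abs(model.predict(X[1500:]) - price[1500:]).mean()
    print(f"test MAE = {mae:.2f} EUR/MWh")
    ```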

  19. Information retrieval and cross-correlation function analysis of random noise radar signal through dispersive media

    Science.gov (United States)

    Alejos, Ana Vazques; Dawood, Muhammad

    2012-06-01

    In this contribution we examine the propagation of an ultrawideband (UWB) random noise signal through dispersive media such as soil, vegetation, and water, using Fourier-based analysis. In such media the propagated signal undergoes medium-specific impairments which degrade the received signal differently than in non-dispersive propagation media. Theoretically, larger penetration depths into a dispersive medium can be achieved by identifying and detecting the precursors, thereby offering significantly better signal-to-noise ratio and enhanced imaging. For a random noise signal, well-defined precursors in terms of peak amplitude do not occur, so the phenomenon must be studied in terms of energy evolution. Additionally, the distortion undergone by the UWB random noise signal in a dispersive medium can introduce frequency-dependent uncertainty or noise into the received signal. This leads to larger degradation of the cross-correlation function (CCF), mainly in terms of sidelobe levels and main-peak deformation, and consequently makes information retrieval difficult. We further analyze a method to restore the shape and carrier frequency of the input UWB random noise signal, thereby improving the CCF estimation.
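
    The CCF processing discussed above can be sketched in a few lines: correlate the received signal against the transmitted replica and locate the main peak to estimate the propagation delay. The waveform, delay, attenuation and noise level below are all invented, and no dispersion is modelled.

    ```python
    # Delay estimation from the cross-correlation function of a random-noise
    # waveform and its attenuated, noisy echo.
    import numpy as np

    rng = np.random.default_rng(8)
    tx = rng.normal(size=4096)                # UWB random-noise waveform
    delay = 250
    rx = np.roll(tx, delay) * 0.4 + rng.normal(0, 0.5, tx.size)

    ccf = np.correlate(rx, tx, mode="full")
    lag = ccf.argmax() - (tx.size - 1)        # lag of the CCF main peak
    print(f"estimated delay = {lag} samples")
    ```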

  20. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects the calculation of information, to the complexity of the techniques for eliminating the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e., LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms had so far been tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies, and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast
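
    To make the limited-sampling bias concrete, here is a minimal plug-in mutual-information estimator with a shuffle-based bias correction. This is only a sketch of the general idea, not the toolbox's own (more sophisticated) estimators or bias-correction techniques.

        # Plug-in MI estimate from a 2-D histogram, with a shuffle-based bias
        # estimate obtained by destroying the stimulus-response pairing.
        import numpy as np

        def mutual_info(x, y, bins=8):
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0
            return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

        rng = np.random.default_rng(2)
        stim = rng.integers(0, 4, size=500).astype(float)     # stimulus labels
        resp = stim + rng.normal(scale=1.0, size=500)         # noisy analog "LFP" response

        raw = mutual_info(stim, resp)
        bias = np.mean([mutual_info(rng.permutation(stim), resp) for _ in range(20)])
        print("raw: %.3f bits, shuffle-corrected: %.3f bits" % (raw, raw - bias))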

  1. Prototype integration of protein electrophoresis laboratory results in an information warehouse to improve workflow and data analysis.

    Science.gov (United States)

    Liu, Jianhua; Silvey, Scott A; Bissell, Michael G; Saltz, Joel H; Kamal, Jyoti

    2006-01-01

    This poster demonstrates our efforts to enhance the workflow and clinical analysis of protein electrophoresis (PEP) data through integration with the Information Warehouse (IW) at The Ohio State University Medical Center (OSUMC). A new desktop application has been developed to enable more efficient and accurate gel analysis by clinical pathologists. The tool lets pathologists perform their analysis conveniently from anywhere on the OSUMC network, aided by numerical analysis algorithms, image enhancement techniques, and access to historical PEP results for the given patient.

  2. Qualitative and quantitative information flow analysis for multi-thread programs

    NARCIS (Netherlands)

    Ngo, Tri Minh

    2014-01-01

    In today's information-based society, guaranteeing information security plays an important role in all aspects of life: communication between citizens and governments, military, companies, financial information systems, web-based services etc. With the increasing popularity of computer systems with

  3. The Key Roles in the Informal Organization: A Network Analysis Perspective

    Science.gov (United States)

    de Toni, Alberto F.; Nonino, Fabio

    2010-01-01

    Purpose: The purpose of this paper is to identify the key roles embedded in the informal organizational structure (informal networks) and to outline their contribution to company performance. A major objective of the research is to find and characterize a new key informal role that synthesises problem solving, expertise, and accessibility…
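
    Informal key roles of the kind this paper looks for are commonly surfaced with standard social network metrics. The sketch below, using networkx on a made-up edge list, flags a likely broker via betweenness centrality and a hub via degree; it is a generic illustration, not the authors' own method.

        # Generic network-analysis sketch: brokers score high on betweenness,
        # accessible hubs score high on degree. Edge list is invented.
        import networkx as nx

        edges = [("ana", "bo"), ("bo", "cy"), ("cy", "dee"), ("bo", "dee"),
                 ("dee", "ed"), ("ed", "fay"), ("fay", "cy")]
        g = nx.Graph(edges)

        betweenness = nx.betweenness_centrality(g)
        degree = dict(g.degree())

        broker = max(betweenness, key=betweenness.get)
        hub = max(degree, key=degree.get)
        print("likely broker (problem solver/gatekeeper):", broker)
        print("most accessible hub:", hub)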

  4. A critical analysis of Floridi’s theory of semantic information

    NARCIS (Netherlands)

    Adriaans, P.

    2010-01-01

    In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science

  5. Analysis of Influencing Factors of Technical Barriers on Information Product Exports Based on the Fuzzy AHP

    Institute of Scientific and Technical Information of China (English)

    Yuying Wu; Na Li

    2004-01-01

    Considering the actual situation of technical barriers to trade (TBT) affecting information products, and drawing on the concept of the triangular fuzzy number, this paper constructs a fuzzy matrix of the factors affecting information product exports and then applies the fuzzy AHP to analyze and rank these factors. We offer suggestions on how information product enterprises can guard against and overcome technical barriers to trade.
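
    One standard way to carry out the triangular-fuzzy-number AHP step the abstract describes is Buckley's geometric-mean method with centroid defuzzification, sketched below on an invented 3x3 comparison matrix; the paper's own matrix and rating scheme are not reproduced here.

        # Fuzzy AHP sketch: each pairwise comparison is a triangular fuzzy
        # number (l, m, u); weights come from fuzzy geometric means.
        import numpy as np

        A = np.array([
            [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
            [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
            [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
        ])  # shape (3, 3, 3): rows x columns x (l, m, u)

        # Fuzzy geometric mean per row (componentwise over l, m, u)
        r = np.prod(A, axis=1) ** (1.0 / A.shape[0])

        # Fuzzy weights: r_i (x) (sum_j r_j)^(-1); inverting a TFN swaps l and u
        s = r.sum(axis=0)
        w_fuzzy = r * np.array([1 / s[2], 1 / s[1], 1 / s[0]])

        # Centroid defuzzification and normalization
        w = w_fuzzy.mean(axis=1)
        print("crisp factor weights:", np.round(w / w.sum(), 3))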

  6. Analysis of medication information exchange at discharge from a Dutch hospital

    NARCIS (Netherlands)

    van Berlo-van de laar, Inge R. F.; Driessen, Erwin; Merkx, Maria M.; Jansman, Frank G. A.

    2012-01-01

    Background At hospitalisation and discharge the risk of errors in medication information transfer is high. Objective To study the routes by which medication information is transferred during discharge from Deventer Hospital, and to improve medication information transfer. Setting Eight hospital ward

  7. Information systems for mental health in six low and middle income countries : Cross country situation analysis

    NARCIS (Netherlands)

    Upadhaya, Nawaraj; Jordans, Mark J D; Abdulmalik, Jibril; Ahuja, Shalini; Alem, Atalay; Hanlon, Charlotte; Kigozi, Fred; Kizza, Dorothy; Lund, Crick; Semrau, Maya; Shidhaye, Rahul; Thornicroft, Graham; Komproe, Ivan H.; Gureje, Oye

    2016-01-01

    Background: Research on information systems for mental health in low and middle income countries (LMICs) is scarce. As a result, there is a lack of reliable information on mental health service needs, treatment coverage and the quality of services provided. Methods: With the aim of informing the dev

  8. A SWOT analysis on the implementation of Building Information Models within the geospatial environment

    NARCIS (Netherlands)

    Isikdag, U.; Zlatanova, S.

    2009-01-01

    Building Information Models as product models and Building Information Modelling as a process which supports information management throughout the lifecycle of a building are becoming more widely used in the Architecture/Engineering/Construction (AEC) industry. In order to facilitate various urban m

  9. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    Science.gov (United States)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2013-10-01

    Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depends on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into three main different
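
    The composite-indicator arithmetic behind a DVI-style index can be sketched in a few lines: min-max normalize each component, then aggregate with weights. The country values and the equal weights below are illustrative assumptions, not the published DVI inputs.

        # Composite-indicator sketch: normalize component scores per column,
        # then take a weighted sum per country. All numbers are invented.
        import numpy as np

        countries = ["A", "B", "C"]
        # Columns: natural capital, economic capacity, human/civic, infrastructure
        raw = np.array([[0.2, 0.5, 0.4, 0.3],
                        [0.7, 0.1, 0.6, 0.2],
                        [0.4, 0.9, 0.3, 0.8]])
        weights = np.array([0.25, 0.25, 0.25, 0.25])   # equal weights as an assumption

        norm = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))
        dvi = norm @ weights
        for c, v in zip(countries, dvi):
            print(f"country {c}: DVI = {v:.2f}")

    A sensitivity analysis of the kind the paper reports would perturb the weights and re-rank the countries to gauge how stable the classification is.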

  10. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    Directory of Open Access Journals (Sweden)

    G. Naumann

    2013-10-01

    Full Text Available Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depends on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into three

  11. The Channels and Demands Analysis for Chinese Farmers’ Agricultural Information Acquisition

    Directory of Open Access Journals (Sweden)

    Zhensheng Tao

    2012-06-01

    Full Text Available This paper studies the characteristics of information sources and farmers' demand for agricultural information in the process of agricultural informationization in China. We point out that the adoption of modern information and communication technology tools by farmers in rural areas is still uncommon, because farmers continue to rely mainly on traditional channels for information dissemination. To change the status quo, the government should respect farmers' dominant status in the process of agricultural informationization and encourage households with large-scale agricultural operations to use modern IT products in order to spread agricultural information in rural areas.

  12. An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.

    Science.gov (United States)

    Huang, Wei-Min

    2013-01-01

    Existing health information systems largely support only the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other significant effects include those of perceived environmental uncertainty on scan searches; of information retrieval behaviour and focused searches on mental models and perceived efficiency; of scan searches on mental model building; of learning styles and model building on perceived efficiency; and of mental model maintenance on perceived efficiency and effectiveness.

  13. Analysis of the question-answer service of the Emma Children's Hospital information centre.

    Science.gov (United States)

    Kruisinga, Frea H; Heinen, Richard C; Heymans, Hugo S A

    2010-07-01

    The information centre of the Emma Children's Hospital AMC (EKZ AMC) is a specialised information centre where paediatric patients and persons involved with the patient can ask questions about all aspects of disease and its social implications. The aim of the study was to evaluate the question-answer service of this information centre in order to determine the role of a specialised information centre in an academic children's hospital, and to identify the appropriate resources for the service and its potential positive effects. For this purpose, a case management system was developed in MS Access. The characteristics of the requester and the question, the time it took to answer questions, the information sources used, and the extent to which we were able to answer the questions were registered. The costs of the service were determined. We analysed all questions asked in the year 2007. A total of 1,434 questions were asked. Most questions came from parents (23.3%), healthcare workers other than nurses (16.5%), and nurses (15.3%). The most frequently asked questions concerned disease (20.2%) and treatment (13.0%). Printed information was the main information source used. Most questions could be resolved within 15 min. Twelve to 28% of total working hours were spent on the question-answer service. Total costs, including staff salaries, were substantial. In conclusion, taking over the task of providing additional medical information, and supplying readily available, good-quality information that healthcare professionals can use to inform their patients, will reduce the time investment required of these more expensive staff members. A specialised information service can anticipate the information needs of parents and persons involved with the paediatric patient. It improves information provision with relatively simple resources and has the potential to improve patient and parent satisfaction, coping, and medical outcomes. A specialised

  14. Risk-informed criticality analysis as applied to waste packages subject to a subsurface igneous intrusion

    Science.gov (United States)

    Kimball, Darby Suzan

    Practitioners of many branches of nuclear facility safety use probabilistic risk assessment (PRA) methodology, which evaluates the reliability of a system along with the consequences of various failure states. One important exception is nuclear criticality safety, which traditionally produces binary results (critical or subcritical, based upon the value of the effective multiplication factor, keff). For complex systems, criticality safety can benefit from application of the more flexible PRA techniques. A new risk-based technique in criticality safety analysis is detailed. In addition to identifying the most reactive configuration(s) and determining subcriticality, it yields more information about the relative reactivity contributions of various factors. By analyzing a more complete system, confidence that the system will remain subcritical is increased and areas where additional safety features would be most effective are indicated. The first step in the method is to create a criticality event tree (a specialized form of event tree where multiple outcomes stemming from a single event are acceptable). The tree lists events that impact reactivity by changing a system parameter. Next, the value of keff is calculated for the end states using traditional methods like the MCNP code. As calculations progress, the criticality event tree is modified; event branches demonstrated to have little effect on reactivity may be collapsed (thus reducing the total number of criticality runs), and branches may be added if more information is needed to characterize the system. When the criticality event tree is mature, critical limits are determined according to traditional validation techniques. Finally, results are evaluated. Criticality for the system is determined by comparing the value of keff for each end state to the critical limit derived for those cases. The relative contributions of various events to criticality are identified by comparing end states resulting from different
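
    A toy version of the criticality event tree idea, with a stub in place of an MCNP keff calculation: enumerate the branch combinations, score each end state, and compare it against the critical limit. All event names, reactivity increments, and limits below are invented for illustration.

        # Event-tree enumeration sketch; each branch perturbs keff by a fixed
        # increment instead of re-running a transport code.
        from itertools import product

        events = {
            "moderator_intrusion": [("dry", 0.00), ("flooded", 0.12)],
            "geometry_change":     [("intact", 0.00), ("slumped", 0.05)],
            "absorber_loss":       [("present", 0.00), ("degraded", 0.08)],
        }

        BASE_KEFF = 0.78        # stub for an MCNP result on the nominal configuration
        CRITICAL_LIMIT = 0.94   # upper subcritical limit from validation

        for combo in product(*events.values()):
            keff = BASE_KEFF + sum(delta for _, delta in combo)
            state = ", ".join(name for name, _ in combo)
            flag = "OK" if keff < CRITICAL_LIMIT else "EXCEEDS LIMIT"
            print(f"{state:40s} keff={keff:.2f}  {flag}")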

  15. SWOT Analysis of King Abdullah II School for Information Technology at University of Jordan According to Quality Assurance Procedures

    Directory of Open Access Journals (Sweden)

    Lubna Naser Eddeen

    2013-02-01

    Full Text Available Many books and research papers have defined and referred to the term SWOT analysis. SWOT analysis can be defined as a "strategic planning method used to evaluate the Strengths, Weaknesses, Opportunities, and Threats involved in a project or in a business venture". It is used to assess the internal and external environmental factors that affect the business. This paper analyzes the main SWOT factors of King Abdullah II School for Information Technology.

  16. Health-care district management information system plan: Review of operations analysis activities during calendar year 1975 and plan for continued research and analysis activities

    Science.gov (United States)

    Nielson, G. J.; Stevenson, W. G.

    1976-01-01

    Operations research activities developed to identify the information required to manage both the efficiency and effectiveness of the Veterans Administration (VA) health services as these services relate to individual patient care are reported. The clinical concerns and management functions that determine this information requirement are discussed conceptually. Investigations of existing VA data for useful management information are recorded, and a diagnostic index is provided. The age-specific characteristics of diseases and lengths of stay are explored, and recommendations for future analysis activities are articulated. The effect of the introduction of new technology to health care is also discussed.

  17. Information for policy makers 2. Analysis of the EU's energy roadmap 2050 scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Foerster, Hannah; Healy, Sean; Loreck, Charlotte; Matthes, Felix [Oeko-Institut e.V. - Institut fuer Angewandte Oekologie, Freiburg im Breisgau (Germany); Fischedick, Manfred; Lechtenboehmer, Stefan; Samadi, Sascha; Venjakob, Johannes [Wuppertal Institut (Germany)

    2012-05-15

    With growing concerns about climate change, energy import dependency and increasing fuel costs, a political consensus has formed in Europe in recent years about the need to transform the way we supply and consume energy. However, there is less political consensus on the specific steps that need to be taken in order to achieve a future sustainable energy system. Questions about which technologies should be used to what extent, and how fast changes in the energy system should be instituted, are being discussed at the European Union level as well as at the Member State level. Energy scenarios are seen as a helpful tool to guide and inform these discussions. Several scenario studies on the European energy system have been released in recent years by stakeholders like environmental NGOs and industry associations. A number of these studies have recently been analysed by the Oeko-Institut and the Wuppertal Institute within an ongoing project commissioned by the Smart Energy for Europe Platform (SEFEP). The project aims to advance the debate on the decarbonisation of the energy system in the EU as well as its Member States during the course of 2012 and to make contributions to the scientific literature on this topic. Analysis within the project focuses on the development of the electricity system, as this system is today the main source of CO2 emissions and is widely regarded as the key to any future decarbonisation pathway. The paper at hand summarises the analyses carried out on the scenarios of the recently released Energy Roadmap 2050 of the European Union. The Roadmap explores different energy system pathways that are compatible with the EU's long-term climate targets. It is a highly influential publication and will play a significant role in determining what will follow the EU's 2020 energy agenda. The Roadmap's analysis is currently being discussed by EU and Member State policymakers as well as by stakeholders throughout Europe.

  18. Functional MRI Representational Similarity Analysis Reveals a Dissociation between Discriminative and Relative Location Information in the Human Visual System

    Directory of Open Access Journals (Sweden)

    Zvi N Roth

    2016-03-01

    Full Text Available Neural responses in visual cortex are governed by a topographic mapping from retinal locations to cortical responses. Moreover, at the voxel population level early visual cortex (EVC) activity enables accurate decoding of stimuli locations. However, in many cases information enabling one to discriminate between locations (i.e., discriminative information) may be less relevant than information regarding the relative location of two objects (i.e., relative information). For example, when planning to grab a cup, determining whether the cup is located at the same retinal location as the hand is hardly relevant, whereas the location of the cup relative to the hand is crucial for performing the action. We have previously used multivariate pattern analysis techniques to measure discriminative location information, and found the highest levels in early visual cortex, in line with other studies. Here we show, using representational similarity analysis, that availability of discriminative information in fMRI activation patterns does not entail availability of relative information. Specifically, we find that relative location information can be reliably extracted from activity patterns in posterior intraparietal sulcus (pIPS), but not from EVC, where we find the spatial representation to be warped. We further show that this variability in relative information levels between regions can be explained by a computational model based on an array of receptive fields. Moreover, when the model's receptive fields are extended to include inhibitory surround regions, the model can account for the spatial warping in EVC. These results demonstrate how size and shape properties of receptive fields in human visual cortex contribute to the transformation of discriminative spatial representation into relative spatial representation along the visual stream.

  19. Functional MRI Representational Similarity Analysis Reveals a Dissociation between Discriminative and Relative Location Information in the Human Visual System.

    Science.gov (United States)

    Roth, Zvi N

    2016-01-01

    Neural responses in visual cortex are governed by a topographic mapping from retinal locations to cortical responses. Moreover, at the voxel population level early visual cortex (EVC) activity enables accurate decoding of stimuli locations. However, in many cases information enabling one to discriminate between locations (i.e., discriminative information) may be less relevant than information regarding the relative location of two objects (i.e., relative information). For example, when planning to grab a cup, determining whether the cup is located at the same retinal location as the hand is hardly relevant, whereas the location of the cup relative to the hand is crucial for performing the action. We have previously used multivariate pattern analysis techniques to measure discriminative location information, and found the highest levels in EVC, in line with other studies. Here we show, using representational similarity analysis, that availability of discriminative information in fMRI activation patterns does not entail availability of relative information. Specifically, we find that relative location information can be reliably extracted from activity patterns in posterior intraparietal sulcus (pIPS), but not from EVC, where we find the spatial representation to be warped. We further show that this variability in relative information levels between regions can be explained by a computational model based on an array of receptive fields. Moreover, when the model's receptive fields are extended to include inhibitory surround regions, the model can account for the spatial warping in EVC. These results demonstrate how size and shape properties of receptive fields in human visual cortex contribute to the transformation of discriminative spatial representations into relative spatial representations along the visual stream.
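
    A minimal representational similarity analysis sketch in Python: build representational dissimilarity matrices (RDMs) for two regions from voxel patterns and compare them with a rank correlation. The synthetic patterns, region labels, and noise level are assumptions for illustration, not the study's data.

        # RSA sketch: condition-by-voxel patterns -> condensed RDM per region,
        # then Spearman correlation between the two RDMs.
        import numpy as np
        from scipy.spatial.distance import pdist
        from scipy.stats import spearmanr

        rng = np.random.default_rng(3)
        n_conditions, n_voxels = 8, 50          # e.g., 8 stimulus locations

        patterns_evc = rng.standard_normal((n_conditions, n_voxels))
        patterns_pips = patterns_evc + 0.5 * rng.standard_normal((n_conditions, n_voxels))

        rdm_evc = pdist(patterns_evc, metric="correlation")    # condensed RDM
        rdm_pips = pdist(patterns_pips, metric="correlation")

        rho, p = spearmanr(rdm_evc, rdm_pips)
        print(f"RDM similarity between regions: rho={rho:.2f} (p={p:.3f})")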

  20. Retrieval of aerosol microphysical properties from AERONET photopolarimetric measurements: 1. Information content analysis

    Science.gov (United States)

    Xu, Xiaoguang; Wang, Jun

    2015-07-01

    This paper is the first part of a two-part study that aims to retrieve the aerosol particle size distribution (PSD) and refractive index from the multispectral and multiangular polarimetric measurements taken by the new-generation Sun photometer as part of the Aerosol Robotic Network (AERONET). It provides theoretical analysis and guidance for the companion study, in which we develop an inversion algorithm for retrieving 22 aerosol microphysical parameters associated with a bimodal PSD function from real AERONET measurements. Our theoretical analysis starts by generating synthetic measurements at four spectral bands (440, 675, 870, and 1020 nm) with a Unified Linearized Vector Radiative Transfer Model for various types of spherical aerosol particles. Subsequently, the quantitative information content for retrieving aerosol parameters is investigated in four observation scenarios: I1, I2, P1, and P2. Measurements in scenario I1 comprise the solar direct radiances and almucantar radiances used in the current AERONET operational inversion algorithm. The other three scenarios include different additional measurements: (I2) the solar principal-plane radiances, (P1) the solar principal-plane radiances and polarization, and (P2) the solar almucantar polarization. Results indicate that adding polarization measurements can increase the degrees of freedom for signal by 2-5 in scenario P1, while smaller increases are found in scenarios I2 and P2. Correspondingly, the smallest retrieval errors are found in scenario P1: 2.3% (2.9%) for the fine-mode (coarse-mode) aerosol volume concentration, 1.3% (3.5%) for the effective radius, 7.2% (12%) for the effective variance, 0.005 (0.035) for the real part of the refractive index, and 0.019 (0.068) for the single-scattering albedo. These errors represent a reduction from their counterparts in scenario I1 of 79% (57%), 76% (49%), 69% (52%), 66% (46%), and 49% (20%), respectively. We further
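
    The "degrees of freedom for signal" (DFS) diagnostic quoted above comes from optimal-estimation theory: DFS is the trace of the averaging kernel. A worked numerical sketch follows, with a random Jacobian and diagonal covariances standing in for the real forward model and error statistics.

        # DFS sketch: A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K; DFS = trace(A).
        # Jacobian and covariance values are illustrative stand-ins.
        import numpy as np

        rng = np.random.default_rng(4)
        n_obs, n_state = 40, 22                 # e.g., 22 aerosol parameters

        K = rng.standard_normal((n_obs, n_state))          # Jacobian d(obs)/d(state)
        Se = np.diag(np.full(n_obs, 0.05**2))              # measurement-error covariance
        Sa = np.diag(np.full(n_state, 1.0))                # a priori covariance

        Se_inv, Sa_inv = np.linalg.inv(Se), np.linalg.inv(Sa)
        G = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv   # gain matrix
        A = G @ K                                                     # averaging kernel
        print("degrees of freedom for signal:", round(np.trace(A), 2))

    Adding rows to K (e.g., polarization channels) and recomputing trace(A) reproduces the kind of scenario comparison the paper reports.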