WorldWideScience

Sample records for analysis geographic information

  1. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Michelhaugh, R.D.

    2000-04-20

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. In recent years, advances in the development of geographic information systems (GISs) have resulted in increased demands from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful to calculate routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by or have misinterpreted the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route.

  2. HIRENASD analysis Information Package

    Data.gov (United States)

    National Aeronautics and Space Administration — Updated November 2, 2011. Contains summary information and analysis condition details for the Aeroelastic Prediction Workshop. Information plotted in this package is…

  3. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy-based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical…
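
    To make the idea concrete, the following is a minimal, illustrative sketch of the CIA principle, not code from the paper: it scans 1-D projections of two synthetic data sets and keeps the pair that maximizes mutual information. It assumes numpy, and a coarse histogram MI estimator stands in for the fast kernel density estimator the authors describe.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Plug-in mutual information estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))
# Y depends nonlinearly on one direction of X, so correlation-based CCA underrates it.
t = X @ np.array([0.8, 0.6])
Y = np.column_stack([np.abs(t) + 0.1 * rng.normal(size=n), rng.normal(size=n)])

# Scan unit-vector projections of both sets and keep the MI-maximizing pair.
angles = np.linspace(0.0, np.pi, 36, endpoint=False)
best = max(((mutual_information(X @ [np.cos(a), np.sin(a)],
                                Y @ [np.cos(b), np.sin(b)]), a, b)
            for a in angles for b in angles), key=lambda r: r[0])
print(f"max MI = {best[0]:.3f} nats at projection angles ({best[1]:.2f}, {best[2]:.2f})")
```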

  4. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Contents: Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  5. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  6. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields…

  7. Marketing Information: A Competitive Analysis

    OpenAIRE

    Miklos Sarvary; Philip M. Parker

    1997-01-01

    Selling information that is later used in decision making constitutes an increasingly important business in modern economies (Jensen [Jensen, Fred O. 1991. Information services. Congram, Friedman, eds., Chapter 22. AMA-COM, New York, 423–443.]). Information is sold in a large variety of forms: industry reports, consulting services, database access, and/or professional opinions given by medical, engineering, accounting/financial, and legal professionals, among others. This paper is the first…

  8. Using Information from Operating Experience to Inform Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bruce P. Hallbert; David I. Gertman; Julie Marble; Erasmia Lois; Nathan Siu

    2004-06-01

    This paper reports on efforts being sponsored by the U.S. NRC and performed by INEEL to develop a technical basis and perform work to extract information from sources for use in HRA. The objectives of this work are to: 1) develop a method for conducting risk-informed event analysis of human performance information that stems from operating experience at nuclear power plants and for compiling and documenting the results in a structured manner; 2) provide information from these analyses for use in risk-informed and performance-based regulatory activities; 3) create methods for information extraction and a repository for this information that, likewise, support HRA methods and their applications.

  9. Content analysis in information flows

    Science.gov (United States)

    Grusho, Alexander A.; Grusho, Nick A.; Timonina, Elena E.

    2016-06-01

    The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We prove that under certain conditions it is possible to solve a part of the problem correctly with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps: the first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.

  10. Empirical Analysis of Informal Institutions

    OpenAIRE

    Park, Sang-Min

    2013-01-01

    The New Institutional Economics has established itself as a widely accepted extension to the standard neoclassical paradigm. Here, institutions are defined as commonly known rules that structure recurring interaction situations and the corresponding sanctioning mechanism. Formal institutions describe rules with a sanction mechanism that is organized by the state. Informal institutions describe rules with a sanction mechanism that…

  11. Textual Analysis of Intangible Information

    NARCIS (Netherlands)

    A.J. Moniz (Andy)

    2016-01-01

    Traditionally, equity investors have relied upon the information reported in firms’ financial accounts to make their investment decisions. Due to the conservative nature of accounting standards, firms cannot value their intangible assets such as corporate culture, brand value and reputation…

  12. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    …networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can… approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies.

  13. Shape design sensitivity analysis using domain information

    Science.gov (United States)

    Seong, Hwal-Gyeong; Choi, Kyung K.

    1985-01-01

    A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.

  14. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates…

  15. ANALYSIS APPROACHES TO EVALUATION OF INFORMATION PROTECTION

    Directory of Open Access Journals (Sweden)

    Zyuzin A. S.

    2015-03-01

    Full Text Available The article is devoted to the topical problem of information systems’ security assessment and the importance of obtaining objective quantitative assessment results. The author proposes creating a complex information security system using a system approach, to be applied at each stage of the information system’s life cycle. On the basis of this approach, the author formulates a general scheme for the information security assessment of an information system, as well as principles for choosing the assessment method. The existing methods of quantitative assessment based on object-oriented methods of system analysis are considered, along with the objectivity of the estimates obtained with this approach. On the basis of this analysis, serious shortcomings of current techniques for assessing information systems’ security are identified, and the need to create a scientific and methodical apparatus that increases the objectivity and completeness of information security assessment through the formalization of expert data is formulated. The possibility of applying this approach for rapidly obtaining a quantitative information security assessment under changing security threat dynamics and during the functioning and development of the information system is considered. The problem of automated information systems’ security assessment is defined, and a general technique for information protection means in systems of this type is formulated.

  16. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit involving the integration of the accounting field into the information systems field. Technological advancements can provide improvements to the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a highly important Private Institution of Higher Education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative ones with the matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities for its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures to work towards the attainment of the objectives of the organization.

  17. Informed spectral analysis: audio signal parameter estimation using side information

    Science.gov (United States)

    Fourer, Dominique; Marchand, Sylvain

    2013-12-01

    Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach, which consists in directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by the estimation from the signal and may require a larger bitrate and a loss of compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling, which is a well-known model with practical applications and where theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
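
    For context, the sinusoidal model referred to in this record is conventionally written as follows (the standard textbook form, not a formula taken from the paper itself):

\[
x(t) \approx \sum_{p=1}^{P} a_p(t)\,\cos\bigl(\phi_p(t)\bigr), \qquad
\phi_p(t) = \phi_p(0) + \int_0^t \omega_p(u)\,\mathrm{d}u,
\]

    where \(a_p\) and \(\omega_p\) are the slowly varying amplitude and frequency of partial \(p\). In the informed approach, these parameters are estimated from the mixture signal, with the embedded side information correcting the estimates toward the precision the application requires.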

  18. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  19. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali;

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show that the proposed method of crime analysis is efficient and gives a broad picture of the crimes of an area to the analyst without much effort. The analysis is evaluated using a manual approach, which reveals that the results produced by the proposed approach are comparable to the manual analysis, while a great amount…

  20. Risk Analysis of Accounting Information System Infrastructure

    OpenAIRE

    MIHALACHE, Arsenie-Samoil

    2011-01-01

    National economy and security are fully dependent on information technology and infrastructure. At the core of the information infrastructure society relies on, we have the Internet, a system designed initially as a scientists’ forum for unclassified research. The use of communication networks and systems may lead to hazardous situations that generate undesirable effects such as communication systems breakdown, loss of data or taking the wrong decisions. The paper studies the risk analysis of...

  1. Information theory applications for biological sequence analysis.

    Science.gov (United States)

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
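
    As a small illustration of the block-entropy estimation this review mentions (a toy DNA string and plain Shannon entropy over overlapping k-mers; not code from the paper):

```python
from collections import Counter
from math import log2

def block_entropy(seq: str, k: int) -> float:
    """Shannon entropy (bits) of the empirical distribution of overlapping k-mers."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

dna = "ATGCGATACGCTTAGGCTAACGATCGATCGTAGCTAGCATCG"
for k in (1, 2, 3):
    print(f"H_{k} = {block_entropy(dna, k):.3f} bits")
```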

  2. Enhancing genomics information retrieval through dimensional analysis.

    Science.gov (United States)

    Hu, Qinmin; Huang, Jimmy Xiangji

    2013-06-01

    We propose a novel dimensional analysis approach to employing meta information in order to find the relationships within the unstructured or semi-structured document/passages for improving genomics information retrieval performance. First, we make use of the auxiliary information as three basic dimensions, namely "temporal", "journal", and "author". The reference section is treated as a commensurable quantity of the three basic dimensions. Then, the sample space and subspaces are built up and a set of events are defined to meet the basic requirement of dimensional homogeneity to be commensurable quantities. After that, the classic graph analysis algorithm in the Web environments is applied on each dimension respectively to calculate the importance of each dimension. Finally, we integrate all the dimension networks and re-rank the outputs for evaluation. Our experimental results show the proposed approach is superior and promising.

  3. Air Force geographic information and analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Henney, D.A.; Jansing, D.S.; Durfee, R.C.; Margle, S.M.; Till, L.E.

    1987-01-01

    A microcomputer-based geographic information and analysis system (GIAS) was developed to assist Air Force planners with environmental analysis, natural resources management, and facility and land-use planning. The system processes raster image data, topological data structures, and geometric or vector data similar to that produced by computer-aided design and drafting (CADD) systems, integrating the data where appropriate. Data types included Landsat imagery, scanned images of base maps, digitized point and chain features, topographic elevation data, USGS stream course data, highway networks, railroad networks, and land use/land cover information from USGS interpreted aerial photography. The system is also being developed to provide an integrated display and analysis capability with base maps and facility data bases prepared on CADD systems. 3 refs.

  4. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called the Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse for most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems.

  5. Sustainability and information in urban system analysis

    International Nuclear Information System (INIS)

    In the present paper, a possible application of information theory to urban system analysis is shown. The proposed ESM method, based on Shannon's entropy analysis, is useful for evaluating different alternative measures of new energy-saving technology transfer at different programming stages for consumption reduction and environmental impact control. A case study has been conducted in an urban area of Florence (Italy): the action/factor interaction entropy values can provide a scale of intervention priority, and by comparing the results obtained by evaluating conditional entropy, ambiguity and redundancy, it is possible to identify the most energy-sustainable intervention in terms of more or less critical and risky action/factor combinations for the project being carried out. The ESM method, if applied to different urban areas, can provide a rational criterion to compare complex innovative and sustainable technologies for irreversibility reduction and energy efficiency increase.
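
    The abstract names joint entropy, conditional entropy and redundancy but not the ESM formulas themselves; the sketch below shows only how such quantities are computed from a hypothetical action/factor interaction matrix, purely as an illustration of the entropy machinery (assumes numpy; the weights are invented):

```python
import numpy as np

# Hypothetical action/factor interaction weights (rows: actions, cols: factors).
W = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 1.0],
              [2.0, 2.0, 5.0]])
P = W / W.sum()                            # joint distribution p(a, f)

def H(p):
    """Shannon entropy in bits of a distribution given as an array."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

H_joint = H(P.ravel())                     # joint entropy H(A, F)
H_cond = H_joint - H(P.sum(axis=0))        # conditional entropy H(A | F)
redundancy = 1.0 - H_joint / np.log2(P.size)   # 1 - H / H_max
print(f"H(A,F)={H_joint:.3f}  H(A|F)={H_cond:.3f}  redundancy={redundancy:.3f}")
```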

  6. Exploiting salient semantic analysis for information retrieval

    Science.gov (United States)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are mostly unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations can be used in combination to estimate the language models of queries and documents. The proposed method is evaluated on several standard Text REtrieval Conference (TREC) collections, and the experimental results show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
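
    The abstract does not give the estimator itself; the standard way of combining a concept-based language model with a bag-of-words one, and plausibly what is meant here, is linear interpolation (a hedged sketch, not the paper's exact formula):

\[
p(w \mid d) = \lambda\, p_{\mathrm{BOW}}(w \mid d) + (1-\lambda)\, p_{\mathrm{SSA}}(w \mid d), \qquad 0 \le \lambda \le 1,
\]

    with documents ranked by the query likelihood \(\prod_{w \in q} p(w \mid d)\).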

  7. 78 FR 38096 - Fatality Analysis Reporting System Information Collection

    Science.gov (United States)

    2013-06-25

    … National Highway Traffic Safety Administration, Fatality Analysis Reporting System Information Collection… The Fatality Analysis Reporting System (FARS) is a major system that acquires national fatality information directly from existing… ACTION: Request for public comment on proposed collection of information. SUMMARY: Before a Federal agency…

  8. Information- Theoretic Analysis for the Difficulty of Extracting Hidden Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wei-ming; LI Shi-qu; CAO Jia; LIU Jiu-fen

    2005-01-01

    The difficulty of extracting hidden information, which is essentially a kind of secrecy, is analyzed by an information-theoretic method. The relations between key rate, message rate, hiding capacity and difficulty of extraction are studied in terms of the unicity distance of the stego-key, and the theoretic conclusion is used to analyze the actual extracting attack on Least Significant Bit (LSB) steganographic algorithms.
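
    For reference, Shannon's classical unicity distance, which this paper adapts to the stego-key setting (the formula below is the textbook version, not the paper's variant):

\[
n_0 \approx \frac{H(K)}{D}, \qquad D = \log_2 |\mathcal{A}| - H_L,
\]

    where \(H(K)\) is the key entropy, \(|\mathcal{A}|\) the alphabet size, \(H_L\) the per-symbol entropy of the message source, and \(D\) the per-symbol redundancy; roughly \(n_0\) observed symbols suffice to determine the key uniquely.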

  9. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, many efforts have been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are described: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  10. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
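
    The free (Gaussian, linear-response) case of IFT reduces to the generalized Wiener filter; a compact statement of that standard result from the IFT literature generally, not from this talk specifically:

\[
d = Rs + n, \qquad m = \langle s \mid d \rangle = Dj, \qquad
D = \bigl(S^{-1} + R^{\dagger} N^{-1} R\bigr)^{-1}, \qquad j = R^{\dagger} N^{-1} d,
\]

    where \(S\) and \(N\) are the signal and noise covariances, \(R\) the instrument response, \(D\) the information propagator, and \(j\) the information source; \(m\) is the posterior mean reconstruction of the signal field \(s\) from data \(d\).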

  11. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus the multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we considered the Analytical Hierarchy Process (AHP), which is well suited for hierarchical data structures (e.g., those that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
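
    Of the candidate methods named here, TOPSIS is the simplest to state; a minimal, self-contained sketch follows (assumes numpy; the decision matrix, weights and criteria directions are invented for illustration, not taken from the AiG project):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns) with TOPSIS.

    matrix  : (m, n) decision matrix
    weights : (n,) criteria weights, summing to 1
    benefit : (n,) bool, True where larger values are better
    """
    M = np.asarray(matrix, dtype=float)
    V = M / np.linalg.norm(M, axis=0) * np.asarray(weights)  # normalize, weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))  # ideal solution
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))   # anti-ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness coefficient, higher is better

# Hypothetical example: three software options scored on cost, speed, support.
scores = topsis([[300, 80, 7], [450, 95, 9], [250, 60, 6]],
                weights=[0.4, 0.4, 0.2],
                benefit=[False, True, True])
print(scores.argsort()[::-1])  # indices of the options, best first
```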

  12. Information security policies : a content analysis

    OpenAIRE

    Lopes, Isabel Maria; Sá-Soares, Filipe de

    2012-01-01

    Among information security controls, the literature gives a central role to information security policies. However, there is a reduced number of empirical studies about the features and components of information security policies. This research aims to contribute to filling this gap. It presents a synthesis of the literature on information security policy content and it characterizes 25 City Councils' information security policy documents in terms of features and components…

  13. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

    Full Text Available The article deals with data analysis in information systems. The author discloses the possibility of using Galois compliance to identify the characteristics of the information system structure, and reveals the specificity of applying Galois compliance to the analysis of information system content with the use of invariants of graph theory. Aspects of introducing the mathematical apparatus of Galois compliance for the research of interrelations between elements of an adaptive training information system for individual testing are analyzed.

  14. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    Directory of Open Access Journals (Sweden)

    Rustu Yilmaz

    2016-05-01

    Full Text Available Universities are the leading institutions that are sources of the educated human population, who both produce information and develop new products and services by using information effectively, and who are needed in every area. Therefore, universities are expected to be institutions where information and information management are used efficiently. In the present study, topics such as infrastructure, operation, application, information, policy and human-based information security at universities were examined within the scope of the information security standards which are highly required and intended to be available at each university today, and then a comparative analysis was conducted specific to Turkey. The Microsoft Security Assessment Tool developed by Microsoft was used as the risk analysis tool. The analyses aim to enable universities to compare their information systems with the information systems of other universities within the scope of information security awareness, and to make suggestions in this regard.

  15. Exploring health information technology education: an analysis of the research.

    Science.gov (United States)

    Virgona, Thomas

    2012-01-01

    This article is an analysis of published research on Health Information Technology education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper identifies intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing the concepts and domains of health information systems, technology and management to contemporary issues such as education, outsourcing, web services and security. The research findings have implications for educators, researchers, and journal…

  16. Modeling and Analysis of Information Product Maps

    Science.gov (United States)

    Heien, Christopher Harris

    2012-01-01

    Information Product Maps are visual diagrams used to represent the inputs, processing, and outputs of data within an Information Manufacturing System. A data unit, drawn as an edge, symbolizes a grouping of raw data as it travels through this system. Processes, drawn as vertices, transform each data unit input into various forms prior to delivery…

  17. Comprehensive analysis of information dissemination in disasters

    Science.gov (United States)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided if an effective pre-warning system had been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) in disasters in urban areas, plays a critical role in improving response time and reducing the number of deaths and economic losses. In this paper we have developed a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. This model can also be used for disseminating information to evacuees making real-time evacuation plans. We analyzed the information dissemination models for pre-warning in disasters by considering 14 media: short message service (SMS), phone, television, radio, news portals, WeChat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  18. Informational and technological provision of marginal analysis

    OpenAIRE

    Назаренко, Тетяна Петрівна

    2016-01-01

    Scientists' approaches to the methods of carrying out marginal analysis have been analyzed, and the main stages and procedures of carrying out such analysis have been singled out.

  19. Function analysis for waste information systems

    International Nuclear Information System (INIS)

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study

  20. Discourse Analysis: Part I, Information Management and Cohesion.

    Science.gov (United States)

    Lovejoy, Kim Brian; Lance, Donald M.

    Combining linguistics and composition studies, this paper (part 1 of a two-part article) proposes a model for the analysis of information management and cohesion in written discourse. It defines concepts of discourse analysis--specifically information management, syntax, semantic reference, lexicon, cohesion, and intonation, with examples taken…

  1. Multicriteria Evaluation and Sensitivity Analysis on Information Security

    Science.gov (United States)

    Syamsuddin, Irfan

    2013-05-01

    Information security plays a significant role in the modern information society. The increasing number and impact of cyber attacks on information assets have raised awareness among managers that an attack on information is actually an attack on the organization itself. Unfortunately, a particular model of information security evaluation for management levels is still not well defined. In this study, decision analysis based on the Ternary Analytic Hierarchy Process (T-AHP) is proposed as a novel model to aid managers who are responsible for making strategic evaluations related to information security issues. In addition, sensitivity analysis is applied to extend the analysis by using several "what-if" scenarios in order to measure the consistency of the final evaluation. Finally, we conclude that the final evaluation made by managers has a significant consistency, as shown by the sensitivity analysis results.
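
    For readers unfamiliar with the AHP family of methods this record builds on, here is a minimal sketch of classic AHP priority derivation (not the ternary variant the paper proposes; assumes numpy, and the pairwise comparison matrix is a hypothetical example):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three security criteria
# (e.g., confidentiality vs. integrity vs. availability), Saaty 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # priority vector (criteria weights)

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # random index RI = 0.58 for n = 3
print("weights:", w.round(3), " consistency ratio:", round(cr, 3))
```

    A sensitivity analysis of the kind the abstract describes would then perturb the comparison values ("what-if" scenarios) and check whether the resulting ranking of alternatives stays stable.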

  2. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  3. A Mathematical Analysis of Conflict Prevention Information

    Science.gov (United States)

    Maddalon, Jeffrey M.; Butler, Ricky W.; Munoz, Cesar A.; Dowek, Gilles

    2009-01-01

    In air traffic management, conflict prevention information refers to the guidance maneuvers, which if taken, ensure that an aircraft's path is conflict-free. These guidance maneuvers take the form of changes to track angle or ground speed. Conflict prevention information may be assembled into prevention bands that advise the crew on maneuvers that should not be taken. Unlike conflict resolution systems, which presume that the aircraft already has a conflict, conflict prevention systems show conflicts for any maneuver, giving the pilot confidence that if a maneuver is made, then no near-term conflicts will result. Because near-term conflicts can lead to safety concerns, strong verification of information correctness is required. This paper presents a mathematical framework to analyze the correctness of algorithms that produce conflict prevention information incorporating an arbitrary number of traffic aircraft and with both a near-term and intermediate-term lookahead times. The framework is illustrated with a formally verified algorithm for 2-dimensional track angle prevention bands.
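
    To make the notion of prevention bands concrete, here is a simplified 2-D kinematic sketch, emphatically not the formally verified algorithm the paper analyzes (assumes numpy; the separation minimum, lookahead time, speeds and traffic geometry are illustrative values):

```python
import numpy as np

D, T = 5.0, 300.0   # separation minimum (nmi) and lookahead time (s)

def horizontal_conflict(p, v, D=D, T=T):
    """True if relative position p and relative velocity v come within D inside time T."""
    a = v @ v
    if a == 0:
        return bool(p @ p < D * D)
    t_cpa = np.clip(-(p @ v) / a, 0.0, T)   # time of closest approach, clamped to [0, T]
    closest = p + t_cpa * v
    return bool(closest @ closest < D * D)

def track_bands(own_speed, traffic, n=360):
    """Mark which candidate track angles lead to a near-term conflict."""
    bands = np.zeros(n, dtype=bool)
    for i, theta in enumerate(np.linspace(0, 2 * np.pi, n, endpoint=False)):
        v_own = own_speed * np.array([np.cos(theta), np.sin(theta)])
        for p_traffic, v_traffic in traffic:
            if horizontal_conflict(p_traffic, v_traffic - v_own):
                bands[i] = True   # this track angle should be advised against
                break
    return bands

# One intruder 30 nmi east of the ownship, heading west at 0.12 nmi/s (~432 kt).
bands = track_bands(0.12, [(np.array([30.0, 0.0]), np.array([-0.12, 0.0]))])
print(f"{bands.sum()} of 360 candidate track angles are conflicted")
```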

  4. Neurodynamics analysis of brain information transmission

    Institute of Scientific and Technical Information of China (English)

    Ru-bin WANG; Zhi-kang ZHANG; Chi K. Tse

    2009-01-01

    This paper proposes a model of neural networks consisting of populations of perceptive neurons, inter-neurons, and motor neurons according to the theory of stochastic phase resetting dynamics. According to this model, the dynamical characteristics of neural networks are studied in three coupling cases, namely, series and parallel coupling, series coupling, and unilateral coupling. The results show that the identified structure of neural networks enables the basic characteristics of neural information processing to be described in terms of the actions of both the optional motor and the reflected motor. The excitation of local neural networks is caused by the action of the optional motor. In particular, the excitation of the neural population caused by the action of the optional motor in the motor cortex is larger than that caused by the action of the reflected motor. This phenomenon indicates that there are more neurons participating in the neural information processing and the excited synchronization motion under the action of the optional motor.

  5. A Multidisciplinary Analysis of Cyber Information Sharing

    Directory of Open Access Journals (Sweden)

    Aviram Zrahia

    2014-12-01

    Full Text Available The emergence of the cyber threat phenomenon is forcing organizations to change the way they think about security. One of these changes relates to organizations’ policy on sharing cyber information with outside parties. This means shifting away from the view of the organization as an isolated, compartmentalized entity towards a view of the organization as a sharing one. Sharing generates a complex, multifaceted challenge to technology, law, organizational culture and even politics. Establishing a system of sharing serves many parties, including regulatory bodies, governments, legal authorities, intelligence agencies, the manufacturers of solutions and services, as well as the organizations themselves, but it also arouses opposition among elements within the organization, and organizations defending the right for privacy. The purpose of this essay is to present the various challenges posed by cyber information sharing, expose the reader to its conceptual world, and present some insights and forecasts for its future development.

  6. Hydrogen Technical Analysis -- Dissemination of Information

    Energy Technology Data Exchange (ETDEWEB)

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination are critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction and will help to overcome the perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--brochures that target certain segments of the population and will be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--with industries identified to have a stake in hydrogen energy. The three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational forums--the final audience is students, the future engineers, technicians, and energy consumers. SENTECH will expand on its previous educational work in this area. The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations…

  7. Medical Image Analysis by Cognitive Information Systems - a Review.

    Science.gov (United States)

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, and the semantic processes are presented as they are applied to different types of medical images. Cognitive information systems were defined on the basis of methods for the semantic analysis and interpretation of information - medical images - applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis was proposed to analyze the meaning of data; meaning is included in information, for example in medical images. Medical image analyses are presented and discussed as they are applied to various types of medical images, presenting selected human organs with different pathologies. Those images were analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis were also defined for decision support tasks. This process is very important, for example, in diagnostic and therapy processes, in the selection of semantic aspects/features from the analyzed data sets. Those features allow the creation of a new way of analysis.

  8. Water Information Management & Analysis System (WIMAS) v 4.0

    Data.gov (United States)

    Kansas Data Access and Support Center — The Water Information Management and Analysis System (WIMAS) is an ArcView based GIS application that allows users to query Kansas water right data maintained by...

  9. ANALYSIS BETWEEN ACCOUNTING INFORMATION DISCLOSURE QUALITY AND STAKEHOLDER INTERESTS

    OpenAIRE

    Ioana (HERBEI) MOT; MOLDOVAN Nicoleta-Claudia; Cristina CERNIT

    2015-01-01

    The evolution and globalization of markets, the financial scandals that collapsed the American and European economic systems, and growing pressure from investors over economic performance have underlined the fundamental role that economic-financial communication, the corporate governance model and information transparency have in the quality of information disclosure. The main objective of this paper is the analysis of the relationship between accounting information disclosure quality and stakeholder interests and the e…

  10. The threat nets approach to information system security risk analysis

    OpenAIRE

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security concerns, it was established that the current approaches to information systems risk analysis do not provide logical recipes for quantifying threat impact and determining the cost-effectiveness of risk m...

  11. HISTORICAL AND LEGAL ANALYSIS OF THE FORMATION OF INFORMATION LAW

    OpenAIRE

    Indrisova Z. N.

    2014-01-01

    The article is devoted to a historical and legal analysis of the formation of information law. Based on this study, a classification of the stages of the formation of information law is proposed, which includes the pre-scientific, elementary and secondary stages, the stage of uncertainty, and the modern stage.

  12. Analysis of the brazilian scientific production about information flows

    Directory of Open Access Journals (Sweden)

    Danielly Oliveira Inomata

    2015-07-01

    Full Text Available Objective. This paper presents and discusses the concepts, contexts and applications involving information flows in organizations. Method. Systematic review, followed by a bibliometric analysis and system analysis. The systematic review aimed to search for, evaluate and review evidence about the research topic, and comprised the following steps: (1) definition of keywords, (2) systematic review, (3) exploration and analysis of articles and (4) comparison and consolidation of results. Results. A bibliometric analysis establishes the relevance of the articles, covering the authors, dates of publication, citation indexes, and the journal keywords with the highest occurrence. Conclusions. The survey results confirm the emphasis on information in the knowledge management process and, in more recent years, on networks; i.e., studies are turning to the operationalization and analysis of information flows in networks. The literature produced demonstrates the relationship of information flow with its management, applied to different organizational contexts, and shows new trends in information science such as the study and analysis of information flows in networks.

  13. Key Information Systems Issues: An Analysis of MIS Publications.

    Science.gov (United States)

    Palvia, Prashant C.; And Others

    1996-01-01

    Presents results of a content analysis of journal articles discussing management information systems (MIS) that was conducted to identify, classify, and prioritize the key issues; to perform a trend analysis; and to compare results with previous studies. Twenty-six key issues are ranked according to frequency of occurrence. Contains 52 references.…

  14. Informativeness of the CODIS STR loci for admixture analysis.

    Science.gov (United States)

    Barnholtz-Sloan, Jill S; Pfaff, Carrie L; Chakraborty, Ranajit; Long, Jeffrey C

    2005-11-01

    Population admixture (or ancestry) is used as an approach to gene discovery in complex diseases, particularly when the disease prevalence varies widely across geographic populations. Admixture analysis could be useful for forensics because an indication of a perpetrator's ancestry would narrow the pool of suspects for a particular crime. The purpose of this study was to use Fisher's information to identify informative sets of markers for admixture analysis. Using published founding population allele frequencies, we test three marker sets for efficacy in estimating admixture: the FBI CODIS Core STR loci, the HGDP-CEPH Human Genome Diversity Cell Line Panel and the set of 39 ancestry informative SNPs from the Shriver lab at Pennsylvania State University. We conclude that the FBI CODIS Core STR set is valid for admixture analysis, but not the most precise. We recommend using a combination of the most informative markers from the HGDP-CEPH and Shriver loci sets.
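
    To make the record's use of Fisher's information concrete: for a single biallelic marker with founding-population frequencies \(p_1\) and \(p_2\) and admixture proportion \(m\), one sampled allele carries Fisher information about \(m\) of the following standard form (the CODIS STR loci are multiallelic, so the paper's calculation sums analogous terms over alleles and loci):

\[
p(m) = m\,p_1 + (1-m)\,p_2, \qquad
I(m) = \frac{(p_1 - p_2)^2}{p(m)\,\bigl(1 - p(m)\bigr)},
\]

    so the informative markers are precisely those with large frequency differentials \(|p_1 - p_2|\) between the founding populations.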

  15. A time sequence analysis on the informal information flow mechanism of microblogging

    Institute of Scientific and Technical Information of China (English)

    Yuan HU; Xiaoli LIAO; Andong WU

    2011-01-01

    Microblogging is a new featured Internet product which has seen rapid development in recent years. Researchers from different countries are making various technical analyses of microblogging applications. In this study, using natural language processing (NLP) and data mining, we analyzed the information content transmitted via microblogs, users' social networks and their interactions, and carried out an empirical analysis of the dissemination process of one particular piece of information via Sina Weibo. Based on the results of these analyses, we attempt to develop a better understanding of the rules and mechanism of the informal information flow in microblogging.

  16. Information Security Analysis Using Game Theory and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Abercrombie, Robert K [ORNL

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which also allows us to address previous limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game, and that the players' actions are always synchronous; moreover, most such models are not scalable to the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.

  17. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    Full Text Available We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and, moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend…
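
    The "contest success" allocation referred to here is typically modeled with a ratio-form (Tullock) contest success function; a generic statement of that form, which the paper's specific functional forms may refine:

\[
S_i = \frac{x_i^{r}}{\sum_{j} x_j^{r}}, \qquad r > 0,
\]

    where \(x_i\) is player \(i\)'s effective investment (the firm's defense, or a hacker's attack possibly scaled by unit costs and by the effectiveness of shared information), \(r\) is the decisiveness parameter, and player \(i\) receives share \(S_i\) of the contested asset.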

  18. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China's agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT-based information dissemination models in China and to share the knowledge and experience gained in applying emerging ICTs to disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China's agricultural information dissemination systems and the different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed, and success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT-based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China's experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  19. Information gap analysis of flood model uncertainties and regional frequency analysis

    Science.gov (United States)

    Hine, Daniel; Hall, Jim W.

    2010-01-01

    Flood risk analysis is subject to often severe uncertainties, which can potentially undermine flood management decisions. This paper explores the use of information gap theory to analyze the sensitivity of flood management decisions to uncertainties in flood inundation models and flood frequency analysis. Information gap is a quantified nonprobabilistic theory of robustness. To analyze uncertainties in flood modeling, an energy-bounded information gap model is established and applied first to a simplified uniform channel and then to a more realistic 2-D flood model. Information gap theory is then applied to the estimation of flood discharges using regional frequency analysis. The use of an information gap model is motivated by the notion that hydrologically similar sites are clustered in the space of their L moments. The information gap model is constructed around a parametric statistical flood frequency analysis, resulting in a hybrid model of uncertainty in which natural variability is handled statistically while epistemic uncertainties are represented in the information gap model. The analysis is demonstrated for sites in the Trent catchment, United Kingdom. The analysis is extended to address ungauged catchments, which, because of the attendant uncertainties in flood frequency analysis, are particularly appropriate for information gap analysis. Finally, the information gap model of flood frequency is combined with the treatment of hydraulic model uncertainties in an example of how both sources of uncertainty can be accounted for using information gap theory in a flood risk management decision.
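
    A minimal sketch of the info-gap robustness calculation described here: given a best-estimate flood discharge, find the largest uncertainty horizon alpha whose worst case still meets a performance requirement. The interval uncertainty model, damage proxy, and numbers below are illustrative assumptions, not the paper's energy-bounded model.

```python
import numpy as np

# Info-gap robustness sketch: largest tolerable deviation from the
# best-estimate discharge before a design fails its requirement.

q_best = 100.0          # best-estimate design discharge (m^3/s), assumed
crest = 110.0           # dike conveys up to this discharge without damage

def worst_case_damage(alpha):
    # Info-gap model: true discharge lies in [q_best*(1-alpha), q_best*(1+alpha)];
    # the worst case for flooding is the upper end of the interval.
    q_worst = q_best * (1.0 + alpha)
    return max(0.0, q_worst - crest)      # damage proxy: overtopping excess

def robustness(damage_threshold):
    # Largest alpha whose worst case still satisfies the requirement.
    alphas = np.linspace(0.0, 1.0, 1001)
    ok = [a for a in alphas if worst_case_damage(a) <= damage_threshold]
    return max(ok) if ok else 0.0

print(robustness(damage_threshold=0.0))   # -> 0.1: 10% uncertainty tolerated
```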

  20. Informational Analysis for Compressive Sampling in Radar Imaging

    Directory of Open Access Journals (Sweden)

    Jingxiong Zhang

    2015-03-01

    Full Text Available Compressive sampling, or compressed sensing (CS), works on the assumption that the underlying signal is sparse or compressible, relies on the trans-informational capability of the measurement matrix employed and of the resultant measurements, and operates with optimization-based algorithms for signal reconstruction. It is thus able to compress data while acquiring them, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar, and information theory by analyzing information flows in CS-radar, from sparse scenes to measurements, and by determining the sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretically oriented CS-radar system analysis and performance evaluation.
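
    The "necessary number of measurements" question has a familiar back-of-envelope form in standard CS theory, sketched below as m >= C*k*log(n/k) for a k-sparse scene of n cells. The constant C is an assumed value; the paper's information-theoretic bounds are more refined than this rule of thumb.

```python
import numpy as np

# Rule-of-thumb CS bound: m >= C * k * log(n / k) measurements suffice for a
# k-sparse scene of n cells; C is an assumed constant.

def required_measurements(n, k, C=2.0):
    return int(np.ceil(C * k * np.log(n / k)))

for k in (10, 50, 100):
    print(f"k={k}: m >= {required_measurements(4096, k)}")
```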

  1. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect hundreds of items of customer information on a daily basis. The information is filled in manually by the customers, so it is laborious and time-consuming for human operators to transfer this information into computers; moreover, it is expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications, in terms of speed and accuracy, such as keyword spotting, sorting of postal addresses, script matching, and writer identification. This research deals with different strategies to extract customer information from these scanned forms and with its interpretation and classification. Extracted information is segmented into characters for classification and finally stored as records in databases for further processing. This paper presents a detailed discussion of these semantics-based analysis strategies for forms processing. Finally, new directions are also recommended for future research.

  2. Similarity Measures, Author Cocitation Analysis, and Information Theory

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    The use of Pearson's correlation coefficient in Author Cocitation Analysis was compared with Salton's cosine measure in a number of recent contributions. Unlike the Pearson correlation, the cosine is insensitive to the number of zeros. However, one has the option of applying a logarithmic transformation in correlation analysis. Information calculus is based on the logarithmic transformation and provides non-parametric statistics. Using this methodology one can cluster a document set in a precise way and express the differences in terms of bits of information. The algorithm is explained and applied to the data set which was made the subject of this discussion.
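
    The zero-sensitivity point is easy to verify numerically: appending matched zeros to two cocitation profiles leaves Salton's cosine unchanged but shifts Pearson's r. The vectors below are made up for illustration.

```python
import numpy as np

# Cosine is insensitive to matched zeros; Pearson's r is not.

def cosine(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

a = np.array([3.0, 1.0, 0.0, 2.0])      # toy cocitation profiles
b = np.array([2.0, 0.0, 1.0, 2.0])

# Append matched zeros (authors neither profile cocites).
a_z = np.concatenate([a, np.zeros(6)])
b_z = np.concatenate([b, np.zeros(6)])

print("cosine :", cosine(a, b), cosine(a_z, b_z))                        # identical
print("pearson:", np.corrcoef(a, b)[0, 1], np.corrcoef(a_z, b_z)[0, 1])  # changes
```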

  3. Mapping Rise Time Information with Down-Shift Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tunnell, T. W., Machorro, E. A., Diaz, A. B.

    2011-11-01

    These viewgraphs summarize the application of recent developments in digital down-shift (DDS) analysis of up-converted PDV data to map out how well the PDV diagnostic captures rise-time information (midpoint and rise time) in short-rise-time (<1 ns) shock events. The mapping supports a PDV vs. VISAR challenge. The analysis concepts are new (~September FY 2011), simple, and run quickly, which makes them good tools to map out (with ~1 million Monte Carlo simulations) how well PDV captures rise-time information as a function of baseline velocity, rise time, velocity jump, and signal-to-noise ratio.
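
    For readers unfamiliar with digital down-shift, the toy sketch below shows the basic operation on a synthetic up-converted signal: multiply by a complex exponential at the shift frequency and low-pass, leaving the slow phase whose derivative encodes the beat (velocity) frequency. The frequencies and the crude moving-average filter are assumptions, not the analysis described in the viewgraphs.

```python
import numpy as np

# Toy digital down-shift (DDS) of an up-converted beat signal.

fs = 50e9                      # sample rate (Hz), assumed
f_shift = 1.5e9                # upshift/carrier frequency (Hz), assumed
f_beat = 0.4e9                 # Doppler beat frequency (proportional to velocity)

t = np.arange(0, 40e-9, 1 / fs)
sig = np.cos(2 * np.pi * (f_shift + f_beat) * t)

baseband = sig * np.exp(-2j * np.pi * f_shift * t)   # shift to baseband
kernel = np.ones(32) / 32                            # crude low-pass filter
lp = np.convolve(baseband, kernel, mode="same")      # drop the 2*f_shift image

# Instantaneous frequency from the unwrapped phase: ~0.4 GHz beat recovered.
inst_freq = np.diff(np.unwrap(np.angle(lp))) * fs / (2 * np.pi)
print(inst_freq[100:105])
```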

  4. Information delivery manuals to facilitate it supported energy analysis

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    In response to continuing Building Information Modeling (BIM) progress, building performance simulation tools such as IESVE are being used with increasing frequency to explore construction projects and influence design decisions. To maximize the potential of these tools, a specification of information exchange and digital workflows is required. This paper presents the preliminary findings of an ongoing study aimed at developing an Information Delivery Manual (IDM) for IT-supported energy analysis at the concept design phase. The IDM development is based on: (1) a review of current approaches (2

  5. Collaborative for Historical Information and Analysis: Vision and Work Plan

    OpenAIRE

    Vladimir Zadorozhny; Patrick Manning; Daniel J. Bain; Ruth Mostern

    2013-01-01

    This article conveys the vision of a world-historical dataset, constructed in order to provide data on human social affairs at the global level over the past several centuries. The construction of this dataset will allow the routine application of tools developed for analyzing “Big Data” to global, historical analysis. The work is conducted by the Collaborative for Historical Information and Analysis (CHIA). This association of groups at universities and research institutes in the U.S. and Eu...

  6. Similarity Measures, Author Cocitation Analysis, and Information Theory

    OpenAIRE

    Leydesdorff, Loet

    2009-01-01

    The use of Pearson's correlation coefficient in Author Cocitation Analysis was compared with Salton's cosine measure in a number of recent contributions. Unlike the Pearson correlation, the cosine is insensitive to the number of zeros. However, one has the option of applying a logarithmic transformation in correlation analysis. Information calculus is based on the logarithmic transformation and provides non-parametric statistics. Using this methodology one can cluster a document set in...

  7. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

    In this paper we use Information Geometry tools to statistically model patterns arising in complex systems and to describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  8. Webometric Analysis of Departments of Librarianship and Information Science.

    Science.gov (United States)

    Thomas, Owen; Willett, Peter

    2000-01-01

    Describes a webometric analysis of linkages to library and information science (LIS) department Web sites in United Kingdom universities. Concludes that sitation (site-citation) data are not well suited to evaluation of LIS departments and that departments can boost Web site visibility by hosting a wide range of materials. (Author/LRW)

  9. ANALYSIS OF INFORMATION FACTORS FOR DESIGNING INTELLECTUAL MECHATRONIC SYSTEM

    Directory of Open Access Journals (Sweden)

    A. V. Gulai

    2016-01-01

    Full Text Available The paper proposes to evaluate the achievement of the main results in the operation of intellectual mechatronic systems with digital control by the information effect obtained. In this respect, the common information requirements of intellectual components are considered as a basic information factor that influences the process of mechatronic system design. Accordingly, parameters have been singled out that provide a rather complete description of the processes used for obtaining and using systematic information within the intellectual mechatronic system. The degree of conformity between the control vector parameters synthesized by the system and the identification results of its current states has been selected as an information criterion of control efficiency. A set of expected probability values for the location of each parameter of the control object and the mechatronic system within the required tolerances has been used for the formation of possible states. The paper shows that when a complex information description of the system is used, it is expedient to apply an expert assessment of the selection probability for allowable control vectors that ensure a system transfer to favorable states. This approach has made it possible to pinpoint the main information and technical specifications of the intellectual mechatronic system: structural construction (informational and technical compatibility and information matching of its components); control object (uncertainty of its state and information vector, information capacity of the mechatronic system); control actions (their hierarchy and entropic balance of the control process, managerial resource of the mechatronic system); functioning result (informational effect and control efficiency criterion, probabilistic selection of system states). In accordance with the analysis performed, it is possible to identify the most effective directions for practical use of the proposed informational approach for creation of the

  10. Multiscale Analysis of Information Dynamics for Linear Multivariate Processes

    CERN Document Server

    Faes, Luca; Stramaglia, Sebastiano; Nollo, Giandomenico; Stramaglia, Sebastiano

    2016-01-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale infor...

  11. Environmental Quality Information Analysis Center multi-year plan

    International Nuclear Information System (INIS)

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network

  12. Environmental Quality Information Analysis Center multi-year plan

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, R.G. (RDG, Inc. (United States)); Das, S. (Oak Ridge National Lab., TN (United States)); Walsh, T.E. (Florida Univ., Gainesville, FL (United States))

    1992-09-01

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network.

  13. Comparative Analysis of Splice Site Regions by Information Content

    Institute of Scientific and Technical Information of China (English)

    T. Shashi Rekha; Chanchal K. Mitra

    2006-01-01

    We have applied concepts from information theory to a comparative analysis of donor (gt) and acceptor (ag) splice-site regions in the genes of five different organisms by calculating their mutual information content (relative entropy) over a selected block of nucleotides. A similar pattern, in which the information content decreases as the block size increases, was observed for both regions in all the organisms studied. This result suggests that the information required for splicing might be contained in the consensus of ~6-8 nt at both regions. We conclude from our study that even though the nucleotides show some degree of conservation in the flanking regions of the splice sites, a certain level of variability is still tolerated, which allows the splicing process to occur normally even if the extent of base pairing is not fully satisfied. We also suggest that this variability can be compensated by recognizing different splice sites with different spliceosomal factors.
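
    The information-content calculation used in this comparison can be sketched as position-wise relative entropy of aligned splice-site blocks against a background distribution. The sequences below are toy data and the uniform background is an assumption, not the study's genomes.

```python
import numpy as np

# Position-wise relative entropy (Kullback-Leibler divergence, in bits) of
# aligned splice-site blocks against an assumed uniform background.

BASES = "acgt"
BACKGROUND = np.full(4, 0.25)

donor_sites = ["aggtaagt", "cggtgagt", "tggtaagc", "aggtgagt"]  # toy 8-nt blocks

def relative_entropy(seqs, background):
    length = len(seqs[0])
    total_bits = 0.0
    for pos in range(length):
        counts = np.array([sum(s[pos] == b for s in seqs) for b in BASES],
                          dtype=float)
        p = (counts + 1e-9) / (counts.sum() + 4e-9)   # smoothed frequencies
        total_bits += np.sum(p * np.log2(p / background))
    return total_bits

print(relative_entropy(donor_sites, BACKGROUND))      # bits over the block
```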

  14. ERISTAR: Earth Resources Information Storage, Transformation, Analysis, and Retrieval

    Science.gov (United States)

    1972-01-01

    The National Aeronautics and Space Administration (NASA) and the American Society for Engineering Education (ASEE) have sponsored faculty fellowship programs in systems engineering design for the past several years. During the summer of 1972 four such programs were conducted by NASA, with Auburn University cooperating with Marshall Space Flight Center (MSFC). The subject for the Auburn-MSFC design group was ERISTAR, an acronym for Earth Resources Information Storage, Transformation, Analysis and Retrieval, which represents an earth resources information management network of state information centers administered by the respective states and linked to federally administered regional centers and a national center. The considerations for serving the users and the considerations that must be given to processing data from a variety of sources are described. The combination of these elements into a national network is discussed and an implementation plan is proposed for a prototype state information center. The compatibility of the proposed plan with the Department of Interior plan, RALI, is indicated.

  15. Large-scale temporal analysis of computer and information science

    Science.gov (United States)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. The dynamic network analysis covered three quarters of the 20th century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  16. Sentiment analysis using common-sense and context information.

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal

    2015-01-01

    Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications, and sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from a ConceptNet-based ontology and on context information. The ConceptNet-based ontology is used to determine domain-specific concepts, which in turn yield the important domain-specific features. Further, the polarities of the extracted concepts are determined using a contextual polarity lexicon that we developed by considering the context information of a word. Finally, the semantic orientations of the domain-specific features of a review document are aggregated based on the importance of each feature with respect to the domain, where the importance of a feature is determined by its depth in the ontology. Experimental results show the effectiveness of the proposed methods.
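
    The final aggregation step can be sketched as a depth-weighted average of contextual concept polarities. The concepts, polarity values, and ontology depths below are toy assumptions, not the paper's lexicon or ontology.

```python
# Depth-weighted aggregation of contextual concept polarities (toy values).

contextual_polarity = {"battery life": -0.8, "screen": 0.6, "camera": 0.7}
ontology_depth = {"battery life": 4, "screen": 3, "camera": 3}  # deeper = more specific

def document_orientation(concepts):
    weighted = sum(contextual_polarity[c] * ontology_depth[c] for c in concepts)
    weights = sum(ontology_depth[c] for c in concepts)
    return weighted / weights

review_concepts = ["battery life", "screen", "camera"]
print(document_orientation(review_concepts))   # > 0 suggests overall positive
```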

  17. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
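
    Two of the supervised criteria named here, information gain and symmetrical uncertainty, reduce to simple entropy arithmetic once a descriptor is discretized. The sketch below uses toy data and equal-interval binning in the spirit of the text; it is not IMMAN itself.

```python
import numpy as np
from collections import Counter

# Information gain and symmetrical uncertainty for one discretized feature.

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature_bins, labels):
    h_y = entropy(labels)
    h_y_given_x = 0.0
    for v in set(feature_bins):
        subset = [y for x, y in zip(feature_bins, labels) if x == v]
        h_y_given_x += len(subset) / len(labels) * entropy(subset)
    return h_y - h_y_given_x

def symmetrical_uncertainty(feature_bins, labels):
    ig = information_gain(feature_bins, labels)
    return 2.0 * ig / (entropy(feature_bins) + entropy(labels))

# Equal-interval discretization of a continuous descriptor into 3 bins.
x = np.array([0.1, 0.4, 0.35, 0.8, 0.9, 0.2, 0.7, 0.6])   # toy descriptor
y = [0, 0, 0, 1, 1, 0, 1, 1]                               # toy class labels
bins = np.digitize(x, np.linspace(x.min(), x.max(), 4)[1:-1])
print(information_gain(bins, y), symmetrical_uncertainty(bins, y))
```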

  19. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through this three-part analysis, systematic investigation is made possible, from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and likelihood of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  20. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers with respect to innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised as focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  1. Activation Analysis. Proceedings of an Informal Study Group Meeting

    International Nuclear Information System (INIS)

    As part of its programme to promote the exchange of information relating to nuclear science and technology, the International Atomic Energy Agency convened in Bangkok, Thailand, from 6-8 July 1970, an informal meeting to discuss the topic of Activation Analysis. The meeting was attended by participants drawn from the following countries: Australia, Burma, Ceylon, Republic of China, India, Indonesia, France, Japan, Republic of Korea, New Zealand, Philippines, Singapore, Thailand, United States of America and Vietnam. The proceedings consist of the contributions presented at the meeting with minor editorial changes.

  2. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of masses of information, discovering and disseminating actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, thereby safeguarding communities, organizations, infrastructures, and investments. The collaborative intelligence analysis environment delivered by i2 is specifically designed to be: scalable, supporting business needs as well as operational and end-user environments; modular, an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics; interoperable, integrating with existing environments and easing information sharing across partner agencies; and extendable, providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  3. Implantation of a safety management system information under the ISO 27001: risk analysis information

    Directory of Open Access Journals (Sweden)

    José Gregorio Arévalo Ascanio

    2015-11-01

    Full Text Available In this article the business structure of the city of Ocaña is explored with the aim of expanding information and knowledge about the main variables of the municipality's productive activity, its entrepreneurial spirit, technological development and productive structure. For this, a descriptive study was performed to identify economic activity in its various forms and to promote the implementation of administrative practices consistent with national and international references. The results made it possible to establish business weaknesses, including weaknesses in information management, which once identified can be used to design training programs, skills acquisition and management practices consistent with the challenges of competitiveness and staying in the market. From the results, information was collected regarding the technological component of the companies in the city's productive fabric, for which the application of tools for the analysis of information systems is proposed using ISO 27001:2005, employing the most appropriate techniques so that organizations can protect their most important asset: information.

  4. Latent morpho-semantic analysis : multilingual information retrieval with character n-grams and mutual information.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Chew, Peter A.; Abdelali, Ahmed (New Mexico State University)

    2008-08-01

    We describe an entirely statistics-based, unsupervised, and language-independent approach to multilingual information retrieval, which we call Latent Morpho-Semantic Analysis (LMSA). LMSA overcomes some of the shortcomings of related previous approaches such as Latent Semantic Analysis (LSA). LMSA has an important theoretical advantage over LSA: it combines well-known techniques in a novel way to break the terms of LSA down into units which correspond more closely to morphemes. Thus, it has a particular appeal for use with morphologically complex languages such as Arabic. We show through empirical results that the theoretical advantages of LMSA can translate into significant gains in precision in multilingual information retrieval tests. These gains are not matched either when a standard stemmer is used with LSA, or when terms are indiscriminately broken down into n-grams.
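
    The core LMSA pipeline, indexing by character n-grams (sub-word units approximating morphemes) and reducing with an SVD as in LSA, can be sketched in a few lines. The corpus, n-gram length, and dimensionality below are toy assumptions, not the paper's experimental setup.

```python
import numpy as np

# Character n-gram indexing plus truncated SVD, LSA-style (toy corpus).

docs = ["the cats sat", "the cat sits", "los gatos", "el gato"]

def char_ngrams(text, n=3):
    text = f" {text} "                       # pad with word boundaries
    return [text[i:i + n] for i in range(len(text) - n + 1)]

vocab = sorted({g for d in docs for g in char_ngrams(d)})
index = {g: i for i, g in enumerate(vocab)}

A = np.zeros((len(vocab), len(docs)))        # term-by-document count matrix
for j, d in enumerate(docs):
    for g in char_ngrams(d):
        A[index[g], j] += 1

U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                        # latent "morpho-semantic" dims
doc_vecs = (np.diag(S[:k]) @ Vt[:k]).T       # documents in latent space

def cos(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

print(cos(doc_vecs[0], doc_vecs[1]))         # within the English pair
print(cos(doc_vecs[2], doc_vecs[3]))         # within the Spanish pair
```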

  5. Cognitive Dimensions Analysis of Interfaces for Information Seeking

    CERN Document Server

    Golovchinsky, Gene

    2009-01-01

    Cognitive Dimensions is a framework for analyzing human-computer interaction. It is used for meta-analysis, that is, for talking about characteristics of systems without getting bogged down in details of a particular implementation. In this paper, I discuss some of the dimensions of this theory and how they can be applied to analyze information seeking interfaces. The goal of this analysis is to introduce a useful vocabulary that practitioners and researchers can use to describe systems, and to guide interface design toward more usable and useful systems

  6. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage, which still affects the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  7. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call this network, constructed from documents and the social information provided with each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
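
    The two ingredients, interpolated combination of retrieval scores and PageRank over a directed document graph, can be sketched as follows. The scores, links, and mixing weight are illustrative assumptions rather than the INEX configuration.

```python
import networkx as nx

# Interpolated score fusion plus PageRank re-ranking over a toy DGD.

retrieval_a = {"d1": 0.9, "d2": 0.4, "d3": 0.7}   # e.g., InL2-style scores
retrieval_b = {"d1": 0.5, "d2": 0.8, "d3": 0.6}   # e.g., language-model scores
lam = 0.6                                          # assumed mixing weight

fused = {d: lam * retrieval_a[d] + (1 - lam) * retrieval_b[d]
         for d in retrieval_a}

dgd = nx.DiGraph([("d1", "d2"), ("d2", "d3"), ("d3", "d1"), ("d1", "d3")])
pr = nx.pagerank(dgd, alpha=0.85)

# Final score: fused retrieval score modulated by graph centrality.
final = {d: fused[d] * pr[d] for d in fused}
for d, s in sorted(final.items(), key=lambda kv: -kv[1]):
    print(d, round(s, 4))
```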

  8. Fusion of Multimodal Information in Music Content Analysis

    OpenAIRE

    Essid, Slim; Richard, Gaël

    2012-01-01

    Music is often processed through its acoustic realization. This is restrictive in the sense that music is clearly a highly multimodal concept where various types of heterogeneous information can be associated to a given piece of music (a musical score, musicians' gestures, lyrics, user-generated metadata, etc.). This has recently led researchers to apprehend music through its various facets, giving rise to "multimodal music analysis" studies. This article gives a synthetic overview of methods...

  9. Carbon Dioxide Information Analysis Center: FY 1992 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Stoss, F.W. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center

    1993-03-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1991 to September 30, 1992. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments on and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also included.

  10. Carbon Dioxide Information Analysis Center: FY 1991 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Stoss, F.W.

    1992-06-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1990 to September 30, 1991. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments on and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also included.

  12. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types, the required data manipulation capabilities, and the operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as the development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility they provide for different data types and their potential for extensibility. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages.

  13. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao WANG; Yajuan ZHAO; Jing ZHANG; Ping ZHAO

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate, and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, of countries' or regions' technological capabilities, and of the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium-ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and to associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or a comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a
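
    The three indicators are straightforward to compute from legal-status records, as in the sketch below. The field names and toy records are assumptions for illustration.

```python
# Grant rate, valid-patent rate, and average maintenance period for one
# applicant, computed from toy legal-status records (field names assumed).

patents = [
    {"granted": True,  "valid": True,  "maintained_years": 12},
    {"granted": True,  "valid": False, "maintained_years": 5},
    {"granted": False, "valid": False, "maintained_years": 0},
    {"granted": True,  "valid": True,  "maintained_years": 9},
]

applications = len(patents)
granted = [p for p in patents if p["granted"]]

grant_rate = len(granted) / applications
valid_rate = sum(p["valid"] for p in granted) / len(granted)
avg_maintenance = sum(p["maintained_years"] for p in granted) / len(granted)

# These three numbers position the applicant in the four-quadrant bubble
# chart (x = grant rate, y = valid-patent rate, bubble size = maintenance).
print(grant_rate, valid_rate, avg_maintenance)
```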

  14. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents an instrumental study of a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables are used.
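
    A minimal PCA sketch of the use described here: deliberately collinear variables are decorrelated into principal components whose pairwise correlations are numerically zero. The data are synthetic, not the paper's survey data.

```python
import numpy as np

# PCA via the eigendecomposition of the covariance matrix (synthetic data).

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=200)   # deliberately collinear with x1
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])

Xc = X - X.mean(axis=0)                      # center the variables
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

scores = Xc @ eigvecs[:, order]              # uncorrelated component scores
print("explained variance ratio:", np.round(explained, 3))
print("component correlations:", np.round(np.corrcoef(scores.T), 3))
```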

  15. Combinatorial Fusion Analysis for Meta Search Information Retrieval

    Science.gov (United States)

    Hsu, D. Frank; Taksa, Isak

    Leading commercial search engines are built as single event systems. In response to a particular search query, the search engine returns a single list of ranked search results. To find more relevant results the user must frequently try several other search engines. A meta search engine was developed to enhance the process of multi-engine querying. The meta search engine queries several engines at the same time and fuses individual engine results into a single search results list. The fusion of multiple search results has been shown (mostly experimentally) to be highly effective. However, the question of why and how the fusion should be done still remains largely unanswered. In this chapter, we utilize the combinatorial fusion analysis proposed by Hsu et al. to analyze combination and fusion of multiple sources of information. A rank/score function is used in the design and analysis of our framework. The framework provides a better understanding of the fusion phenomenon in information retrieval. For example, to improve the performance of the combined multiple scoring systems, it is necessary that each of the individual scoring systems has relatively high performance and the individual scoring systems are diverse. Additionally, we illustrate various applications of the framework using two examples from the information retrieval domain.
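
    The rank/score distinction at the heart of combinatorial fusion analysis can be illustrated by combining two scoring systems once by average score and once by average rank; the two fused orderings need not agree. The document scores below are made up for illustration.

```python
from scipy.stats import rankdata

# Score combination vs. rank combination of two toy scoring systems.

docs = ["d1", "d2", "d3", "d4"]
sys_a = {"d1": 0.92, "d2": 0.55, "d3": 0.80, "d4": 0.30}
sys_b = {"d1": 0.40, "d2": 0.90, "d3": 0.85, "d4": 0.20}

score_comb = {d: (sys_a[d] + sys_b[d]) / 2 for d in docs}

ranks_a = dict(zip(docs, rankdata([-sys_a[d] for d in docs])))  # 1 = best
ranks_b = dict(zip(docs, rankdata([-sys_b[d] for d in docs])))
rank_comb = {d: (ranks_a[d] + ranks_b[d]) / 2 for d in docs}

print(sorted(docs, key=lambda d: -score_comb[d]))  # fused by score
print(sorted(docs, key=lambda d: rank_comb[d]))    # fused by rank
```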

  16. Petroleum labour market information supply demand analysis 2009-2020

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-03-15

    Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) in determining the petroleum industry's medium-and long-term employment needs; labour supply/demand considerations for the industry as a whole and an industry-wide cost management; and an analysis of exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to requirements almost certainly would. In the growth scenario, it is likely the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  17. Information Problem Solving: Analysis of a Complex Cognitive Skill

    NARCIS (Netherlands)

    S. Brand-Gruwel; I. Wopereis; Y. Vermetten

    2004-01-01

    In (higher) education students are often faced with information problems: tasks or assignments that require the student to identify information needs, locate corresponding information sources, extract and organize relevant information from each source, and synthesize information from a

  18. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    Directory of Open Access Journals (Sweden)

    Narkhede Sachin G.,

    2014-02-01

    Full Text Available Brain magnetic resonance (MR) image tumor segmentation presents some challenging issues for image segmentation, caused by the weak correlation between magnetic resonance imaging (MRI) intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by the potential performance improvement in general automatic brain tumor segmentation systems, which are important for many medical and scientific applications. Brain MRI segmentation remains a complex problem in the field of medical imaging despite the various methods that have been presented. An MR image of the human brain can be divided into several sub-regions, especially soft tissues such as gray matter, white matter and cerebrospinal fluid. Although edge information is the main clue in image segmentation, it cannot achieve better results in analyzing image content without being combined with other information. Our goal is to detect the position and boundary of tumors automatically. Experiments were conducted on real images, and the results show that the algorithm is flexible and convenient.

  19. Thermal analysis and safety information for metal nanopowders by DSC

    International Nuclear Information System (INIS)

    Highlights: • Metal nanopowders are common and frequently employed in industry. • Experimental To values for nano iron powder were 140–150 °C. • Safety information can benefit the relevant metal powder industries. - Abstract: Metal nanopowders are common and frequently employed in industry. Iron is mostly applied in high-performance magnetic materials and in the treatment of pollutants in groundwater. Zinc is widely used in brass, bronze, die-casting metal, alloys, rubber, paints, etc. Nonetheless, some disasters induced by metal powders are due to the lack of related safety information. In this study, we applied differential scanning calorimetry (DSC) and thermal analysis software to evaluate the relevant thermal safety information, such as the exothermic onset temperature (To), peak temperature (Tp), and heat of reaction (ΔH). The experimental To values for nano iron powder were 140–150 °C, 148–158 °C, and 141–149 °C for 15 nm, 35 nm, and 65 nm powders, respectively. The ΔH was larger than 3900 J/g, 5000 J/g, and 3900 J/g for 15 nm, 35 nm, and 65 nm powders, respectively. This safety information can benefit the relevant metal powder industries in preventing accidents.

  20. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, have been modified by climate change and human activities, and that conventional frequency analysis which ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper is based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. Climate-informed low flow frequency analysis and the divided and combined method are employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which can account for dynamic features in reservoir management and provide more accurate and reliable designs for infrastructure and water supply.
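
    The stationary distribution-fitting step can be sketched with scipy's GEV implementation, as below on synthetic annual minima. A genuinely climate-informed (nonstationary) fit would let the parameters vary with covariates such as climate indices, which this sketch does not attempt and scipy does not provide out of the box.

```python
import numpy as np
from scipy.stats import genextreme

# Fit a GEV to synthetic annual low-flow minima and read off a quantile.

rng = np.random.default_rng(42)
annual_minima = genextreme.rvs(c=-0.1, loc=3000, scale=400, size=100,
                               random_state=rng)       # synthetic data

shape, loc, scale = genextreme.fit(annual_minima)      # maximum likelihood
q10 = genextreme.ppf(0.1, shape, loc=loc, scale=scale) # 10% low-flow quantile
print(f"shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}, q10={q10:.1f}")
```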

  1. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of `independent' component analysis (ICA) some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
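
    A k-nearest-neighbor MI estimator of the Kraskov type is available off the shelf in scikit-learn, which makes the "least dependent components" idea easy to probe: MI between linear mixtures is high, while MI between the original independent sources is near zero. The signals below are toy data, and this sketch is not the MILCA implementation.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

# k-NN (Kraskov-type) MI estimates for mixed vs. independent signals.

rng = np.random.default_rng(1)
s1 = rng.uniform(-1, 1, 2000)            # independent sources
s2 = rng.uniform(-1, 1, 2000)
x1 = s1 + 0.5 * s2                       # linear mixtures
x2 = 0.5 * s1 + s2

# MI between mixtures is high; MI between independent sources is ~0.
print(mutual_info_regression(x1.reshape(-1, 1), x2, n_neighbors=4))
print(mutual_info_regression(s1.reshape(-1, 1), s2, n_neighbors=4))
```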

  2. Environmental Quality Information Analysis Center (EQIAC) operating procedures handbook

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, T.E. (Florida Univ., Gainesville, FL (United States)); Das, S. (Oak Ridge National Lab., TN (United States))

    1992-08-01

    The Operating Procedures Handbook of the Environmental Quality Information Analysis Center (EQIAC) is intended to be kept current as EQIAC develops and evolves. Its purpose is to provide a comprehensive guide to the mission, infrastructure, functions, and operational procedures of EQIAC. The handbook is a training tool for new personnel and a reference manual for existing personnel. The handbook will be distributed throughout EQIAC and maintained in binders containing current dated editions of the individual sections. The handbook will be revised at least annually to reflect the current structure and operational procedures of EQIAC. The EQIAC provides information on environmental issues such as compliance, restoration, and environmental monitoring to the Air Force and DOD contractors.

  3. Probabilistic analysis of the human transcriptome with side information

    CERN Document Server

    Lahti, Leo

    2011-01-01

    Understanding functional organization of genetic information is a major challenge in modern biology. Following the initial publication of the human genome sequence in 2001, advances in high-throughput measurement technologies and efficient sharing of research material through community databases have opened up new views to the study of living organisms and the structure of life. In this thesis, novel computational strategies have been developed to investigate a key functional layer of genetic information, the human transcriptome, which regulates the function of living cells through protein synthesis. The key contributions of the thesis are general exploratory tools for high-throughput data analysis that have provided new insights to cell-biological networks, cancer mechanisms and other aspects of genome function. A central challenge in functional genomics is that high-dimensional genomic observations are associated with high levels of complex and largely unknown sources of variation. By combining statistical ...

  4. Analysis of Information Leakage in Quantum Key Agreement

    Institute of Scientific and Technical Information of China (English)

    LIU Sheng-li; ZHENG Dong; CHENG Ke-fei

    2006-01-01

    Quantum key agreement is one of the approaches to unconditional security. Since the 1980s, different protocols for quantum key agreement have been proposed and analyzed. A new quantum key agreement protocol was presented in 2004, and a detailed analysis of the protocol was given. The possible game played between legitimate users and the enemy was described: sitting in the middle, an adversary can play a "man-in-the-middle" attack to cheat the sender and receiver. The amount of information leaked to the adversary determines the length of the final quantum secret key. It was shown how to determine the amount of information leaked to the enemy and the amount of uncertainty between the legitimate sender and receiver.
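
    The dependence of final key length on leaked information can be made concrete with the standard asymptotic BB84 key fraction 1 - 2*h2(e) (the Shor-Preskill bound); this is a generic illustration, not the specific 2004 protocol analyzed in this record.

```python
# Generic illustration: leaked information vs. final key length for BB84.
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def key_fraction(qber):
    # error correction leaks ~h2(e) bits per raw bit; privacy amplification
    # removes another ~h2(e) to erase the eavesdropper's information
    return max(0.0, 1.0 - 2.0 * h2(qber))

for e in (0.01, 0.05, 0.11):                    # rate vanishes near e = 11%
    print(f"QBER = {e:.2f} -> key fraction = {key_fraction(e):.3f}")
```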

  5. The visual analysis of textual information: Browsing large document sets

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.; Pennock, K.; Fiegel, T.; Wise, J.; Pottier, M.; Schur, A.; Crow, V. [Pacific Northwest Lab., Richland, WA (United States); Lantrip, D. [California Univ., Santa Barbara, CA (United States)

    1995-05-01

    Visualization tools have been invaluable in the process of scientific discovery by providing researchers with insights gained through graphical tools and techniques. At PNL, the Multidimensional Visualization and Advanced Browsing (MVAB) project is extending visualization technology to the problems of intelligence analysis of textual documents by creating spatial representations of textual information. By representing an entire corpus of documents as points in a coordinate space of two or more dimensions, the tools developed by the MVAB team give the analyst the ability to quickly browse the entire document base and determine relationships among documents and publication patterns not readily discernible through traditional lexical means.
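
    As an illustrative stand-in for the document-to-coordinates idea described here (not the MVAB tools themselves), the sketch below places a tiny invented corpus in a plane with TF-IDF weighting and a 2-component SVD, so that related documents land near each other.

```python
# Hypothetical mini-corpus; TF-IDF + truncated SVD stands in for MVAB's
# spatial projection -- an assumption, not the actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "reactor safety analysis report",
    "reactor coolant system inspection",
    "satellite imagery of crop yields",
    "remote sensing and crop classification",
]
X = TfidfVectorizer().fit_transform(docs)        # term-by-document weights
coords = TruncatedSVD(n_components=2).fit_transform(X)
for doc, (cx, cy) in zip(docs, coords):
    print(f"({cx:+.2f}, {cy:+.2f})  {doc}")
```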

  6. Analysis of Internet Information on Lateral Lumbar Interbody Fusion.

    Science.gov (United States)

    Belayneh, Rebekah; Mesfin, Addisu

    2016-07-01

    Lateral lumbar interbody fusion (LLIF) is a surgical technique that is being increasingly used. The authors' objective was to examine information on the Internet pertaining to the LLIF technique. An analysis was conducted of publicly accessible websites pertaining to LLIF. The following search engines were used: Google (www.google.com), Bing (www.bing.com), and Yahoo (www.yahoo.com). DuckDuckGo (www.duckduckgo.com) was an additional search engine used due to its emphasis on generating accurate and consistent results while protecting searchers' privacy and reducing advertisements. The top 35 websites providing information on LLIF from each of the 4 search engines were identified, for a total of 140 websites. Each website was categorized based on authorship (academic, private, medical industry, insurance company, other) and content of information. Using the search term lateral lumbar interbody fusion, 174,000 Google results, 112,000 Yahoo results, and 112,000 Bing results were obtained. DuckDuckGo does not display the number of results found for a search. From the 140 websites collected across the 4 search engines, 78 unique websites were identified. Websites were authored by a private medical group in 46.2% of the cases, an academic medical group in 26.9% of the cases, and the biomedical industry in 5.1% of the cases. Indications were reported by 68% of websites, and contraindications by 24.4%. Benefits of LLIF were reported by 69.2% of websites, and complications by 36%. Overall, the quality of information regarding LLIF on the Internet is poor. Spine surgeons and spine societies can assist in improving the quality of the information on the Internet regarding LLIF. [Orthopedics. 2016; 39(4):e701-e707.]. PMID:27111081

  7. An Examination of Canadian Information Professionals' Involvement in the Provision of Business Information Synthesis and Analysis Services

    Science.gov (United States)

    Patterson, Liane; Martzoukou, Konstantina

    2012-01-01

    The present study investigated the processes information professionals, working in a business environment, follow to meet business clients' information needs and particularly their involvement in information synthesis and analysis practices. A combination of qualitative and quantitative data was collected via a survey of 98 information…

  8. Analysis of system trustworthiness based on information flow noninterference theory

    Institute of Scientific and Technical Information of China (English)

    Xiangying Kong; Yanhui Chen; Yi Zhuang

    2015-01-01

    The trustworthiness analysis and evaluation are the bases of the trust chain transfer. In this paper the formal method of trustworthiness analysis of a system based on the noninterference (NI) theory of the information flow is studied. Firstly, existing methods cannot analyze the impact of the system states on the trustworthiness of software during the process of trust chain transfer. To solve this problem, the impact of the system state on trustworthiness of software is investigated, the run-time mutual interference behavior of software entities is described and an interference model of the access control automaton of a system is established. Secondly, based on the intransitive noninterference (INI) theory, a formal analytic method of trustworthiness for trust chain transfer is proposed, providing a theoretical basis for the analysis of dynamic trustworthiness of software during the trust chain transfer process. Thirdly, a prototype system with dynamic trustworthiness on a platform with dual core architecture is constructed and a verification algorithm of the system trustworthiness is provided. Finally, the monitor hypothesis is extended to the dynamic monitor hypothesis, and a theorem giving a static judgment rule of system trustworthiness is provided, which is useful to prove dynamic trustworthiness of a system at the beginning of system construction. Compared with previous work in this field, this research proposes not only a formal analytic method for the determination of system trustworthiness, but also a modeling method and an analysis algorithm that are feasible for practical implementation.
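
    To make the noninterference notion concrete, here is a toy check in the classical Goguen-Meseguer style: Low's view of every run must equal its view of the run with High actions purged. The two-domain machine is invented and the check is exhaustive over short sequences; the paper's intransitive-NI analysis is more general.

```python
# Toy transitive noninterference check (Goguen-Meseguer purge style).
from itertools import product

DOMAIN = {"hi_write": "High", "lo_write": "Low"}  # action -> security domain

def step(state, action):
    hi, lo = state                               # state = (high, low) counters
    return (hi + 1, lo) if action == "hi_write" else (hi, lo + 1)

def run(actions):
    state = (0, 0)
    for a in actions:
        state = step(state, a)
    return state

def low_view(state):
    return state[1]                              # Low observes only its counter

def purge_high(actions):
    return [a for a in actions if DOMAIN[a] != "High"]

# NI holds if purging High actions never changes what Low observes;
# check all action sequences up to length 4.
ok = all(
    low_view(run(seq)) == low_view(run(purge_high(seq)))
    for n in range(5) for seq in product(DOMAIN, repeat=n)
)
print("noninterference holds for Low:", ok)      # True for this machine
```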

  9. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management as it applies to the IT universe, starting from the classical theory associated with project management techniques. It applies this theoretical analysis to the context of information technology in enterprises, together with the classic literature of traditional project management, focusing on its application to business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology provided for a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature carries over to the management of IT projects. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects; this depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in the model's absorption into the company culture.

  10. Analysis of informational redundancy in the protein-assembling machinery

    Science.gov (United States)

    Berkovich, Simon

    2004-03-01

    Entropy analysis of the DNA structure does not reveal a significant departure from randomness, indicating a lack of informational redundancy. This signifies the absence of a hidden meaning in the genome text and supports the 'barcode' interpretation of DNA given in [1]. Lack of informational redundancy is a characteristic property of an identification label rather than of a message of instructions. Yet the randomness of DNA has to induce non-random structures in the proteins. Protein synthesis is a two-step process: transcription into RNA with gene splicing, followed by formation of a structure of amino acids. Entropy estimations, performed by A. Djebbari, show typical values of redundancy of the biomolecules along these pathways: about 4% for the DNA gene and 15-40% for proteins. In gene expression, the RNA copy carries the same information as the original DNA template. Randomness is essentially eliminated only at the step of protein creation by a degenerate code. According to [1], the significance of the substitution of U for T with subsequent gene splicing is that these transformations result in a different pattern of RNA oscillations, so the vital DNA communications are protected against extraneous noise coming from the protein-making activities. 1. S. Berkovich, "On the 'barcode' functionality of DNA, or the Phenomenon of Life in the Physical Universe", Dorrance Publishing Co., Pittsburgh, 2003
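
    A minimal sketch of the kind of entropy/redundancy estimate the record discusses, on a made-up nucleotide fragment.

```python
# Shannon entropy and redundancy of a (hypothetical) nucleotide string.
from collections import Counter
import math

def entropy_bits(seq):
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

seq = "ATGCGATACGCTTAGGCTAACGATCGATCGTA"          # invented fragment
h = entropy_bits(seq)
h_max = math.log2(4)                             # 2 bits/symbol for {A,C,G,T}
print(f"H = {h:.3f} bits/symbol, redundancy = {1 - h / h_max:.1%}")
```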

  11. Genetic association analysis of complex diseases incorporating intermediate phenotype information.

    Directory of Open Access Journals (Sweden)

    Yafang Li

    Full Text Available Genetic researchers often collect disease related quantitative traits in addition to disease status because they are interested in understanding the pathophysiology of disease processes. In genome-wide association (GWA) studies, these quantitative phenotypes may be relevant to disease development and serve as intermediate phenotypes or they could be behavioral or other risk factors that predict disease risk. Statistical tests combining both disease status and quantitative risk factors should be more powerful than case-control studies, as the former incorporates more information about the disease. In this paper, we proposed a modified inverse-variance weighted meta-analysis method to combine disease status and quantitative intermediate phenotype information. The simulation results showed that when an intermediate phenotype was available, the inverse-variance weighted method had more power than did a case-control study of complex diseases, especially in identifying susceptibility loci having minor effects. We further applied this modified meta-analysis to a study of imputed lung cancer genotypes with smoking data in 1154 cases and 1137 matched controls. The most significant SNPs came from the CHRNA3-CHRNA5-CHRNB4 region on chromosome 15q24-25.1, which has been replicated in many other studies. Our results confirm that this CHRNA region is associated with both lung cancer development and smoking behavior. We also detected three significant SNPs--rs1800469, rs1982072, and rs2241714--in the promoter region of the TGFB1 gene on chromosome 19 (p = 1.46×10^-5, 1.18×10^-5, and 6.57×10^-6, respectively). The SNP rs1800469 is reported to be associated with chronic obstructive pulmonary disease and lung cancer in cigarette smokers. The present study is the first GWA study to replicate this result. Signals in the 3q26 region were also identified in the meta-analysis. We demonstrate the intermediate phenotype can potentially enhance the power of complex
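
    The core inverse-variance weighting is easy to state in code; the sketch below combines two invented per-SNP effect estimates (one from disease status, one from an intermediate phenotype) and is not the paper's modified estimator for correlated traits measured on the same subjects.

```python
# Plain inverse-variance weighted meta-analysis (illustrative numbers).
import math

def inverse_variance_combine(betas, ses):
    w = [1.0 / se**2 for se in ses]              # weight = 1 / variance
    beta = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return beta, se, beta / se                   # combined effect and z-score

beta, se, z = inverse_variance_combine(betas=[0.21, 0.15], ses=[0.06, 0.05])
print(f"combined beta = {beta:.3f}, se = {se:.3f}, z = {z:.2f}")
```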

  12. An information theory analysis of spatial decisions in cognitive development.

    Science.gov (United States)

    Scott, Nicole M; Sera, Maria D; Georgopoulos, Apostolos P

    2015-01-01

    Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages 5 to 10 years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of "cognitive entropy" were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured "chunking" of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework.
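
    An illustrative computation of the second "cognitive entropy" measure (the entropy of the distribution of tasks across clusters) on hypothetical cluster assignments; the drop from age 5 to age 10 mirrors the reported trend toward fewer, larger chunks.

```python
# Entropy of the task-over-cluster distribution, on invented assignments.
import math
from collections import Counter

def cluster_entropy(assignments):
    n = len(assignments)
    return -sum((s / n) * math.log2(s / n) for s in Counter(assignments).values())

age5  = [1, 2, 3, 4, 5, 6, 7, 8] * 2             # 16 tasks, many small clusters
age10 = [1] * 6 + [2] * 5 + [3] * 5              # 16 tasks, three large chunks
for age, a in (("5", age5), ("10", age10)):
    print(f"age {age}: {len(set(a))} clusters, entropy = {cluster_entropy(a):.2f} bits")
```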

  13. Information findability: An informal study to explore options for improving information findability for the systems analysis group

    Energy Technology Data Exchange (ETDEWEB)

    Stoecker, Nora Kathleen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge: how can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  14. From paragraph to graph: latent semantic analysis for information visualization.

    Science.gov (United States)

    Landauer, Thomas K; Laham, Darrell; Derr, Marcia

    2004-04-01

    Most techniques for relating textual information rely on intellectually created links such as author-chosen keywords and titles, authority indexing terms, or bibliographic citations. Similarity of the semantic content of whole documents, rather than just titles, abstracts, or overlap of keywords, offers an attractive alternative. Latent semantic analysis provides an effective dimension reduction method for the purpose that reflects synonymy and the sense of arbitrary word combinations. However, latent semantic analysis correlations with human text-to-text similarity judgments are often empirically highest at approximately 300 dimensions. Thus, two- or three-dimensional visualizations are severely limited in what they can show, and the first and/or second automatically discovered principal component, or any three such for that matter, rarely capture all of the relations that might be of interest. It is our conjecture that linguistic meaning is intrinsically and irreducibly very high dimensional. Thus, some method to explore a high dimensional similarity space is needed. But the 2.7 × 10^7 projections and infinite rotations of, for example, a 300-dimensional pattern are impossible to examine. We suggest, however, that the use of a high dimensional dynamic viewer with an effective projection pursuit routine and user control, coupled with the exquisite abilities of the human visual system to extract information about objects and from moving patterns, can often succeed in discovering multiple revealing views that are missed by current computational algorithms. We show some examples of the use of latent semantic analysis to support such visualizations and offer views on future needs.
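
    A minimal numpy sketch of the LSA machinery the article builds on: SVD of a term-by-document matrix, then document similarity in the reduced space. The toy matrix is invented, and k = 2 stands in for the roughly 300 dimensions real applications keep.

```python
# LSA in miniature: SVD of a term-by-document count matrix.
import numpy as np

X = np.array([[2, 1, 0, 0],                      # rows = terms
              [1, 2, 0, 1],                      # columns = documents
              [0, 0, 2, 1],
              [0, 1, 1, 2]])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T           # documents in k dimensions

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print("sim(doc0, doc1) =", round(cosine(doc_vecs[0], doc_vecs[1]), 3))
print("sim(doc0, doc2) =", round(cosine(doc_vecs[0], doc_vecs[2]), 3))
```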

  15. Economic Efficiency Analysis for Information Technology in Developing Countries

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2009-01-01

    Full Text Available Problem statement: The introduction of Information Technology (IT) to government institutions in developing countries bears a great deal of risk of failure. The lack of qualified personnel, the lack of financial support and the lack of planning and proper justification are just a few of the causes of project failure. The study presented here focused on the justification issue of IT projects through the application of Cost Benefit Analysis (CBA) as part of a comprehensive Economic Efficiency Analysis (EEA) of IT projects, thus providing management with a decision-making tool which highlights existing and future problems and reduces the risk of failure. Approach: Cost Benefit Analysis (CBA) based on Economic Efficiency Analysis (EEA) was performed on selected IT projects from ministries and key institutions in the government of Jordan, using a well-established approach employed by the Federal Government of Germany (the KBSt approach). The approach was then modified and refined to suit the needs of developing countries so that it captured all the relevant elements of cost and benefit both quantitatively and qualitatively, and included a set of guidelines for the data collection strategy. Results: When IT projects were evaluated using CBA, most cases yielded negative Net Present Value (NPV), even though some cases showed some reduction in operating cost starting from the third year of project life. However, when the CBA was applied as part of a comprehensive EEA by introducing qualitative aspects and urgency criteria, proper justification for new projects became feasible. Conclusion: The modified EEA represents a systematic approach well suited for the government of Jordan as a developing country. It is capable of dealing with the justification issue, the evaluation of existing systems and the urgency of replacing legacy systems. This study explored many of the challenges and inherited problems existing in the public sectors of developing
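
    A worked NPV computation of the kind CBA rests on; the cash flows and the 8% discount rate are invented, with savings appearing from year 3 yet a negative NPV, matching the pattern the study reports.

```python
# Net Present Value on an invented IT-project cash-flow profile.
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

flows = [-500_000, -20_000, -20_000, 90_000, 120_000, 120_000, 120_000]
print(f"NPV @ 8% = {npv(0.08, flows):,.0f}")     # negative despite the savings
```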

  16. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way to automate the analysis of monitoring information, providing long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and identifying the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro-Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  17. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults' collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  18. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    OpenAIRE

    Rustu Yilmaz; Yildiray Yalman

    2016-01-01

    Universities are leading institutions: they are the source of the educated people who produce information, who use it effectively to develop new products and services, and who are needed in every field. Universities are therefore expected to be institutions where information and information management are handled efficiently. In the present study, topics such as infrastructure, operation, application, information, policy and human-based information secu...

  19. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis relies on quantitative and qualitative methods, each of which has advantages for information risk analysis. The analytic hierarchy process has also been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...

  1. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  2. Maximal information component analysis: a novel non-linear network analysis method

    Directory of Open Access Journals (Sweden)

    Christoph Daniel Rau

    2013-03-01

    Full Text Available Background: Network construction and analysis algorithms provide scientists with the ability to sift through high-throughput biological outputs, such as transcription microarrays, for small groups of genes (modules) that are relevant for further research. Most of these algorithms ignore the important role of nonlinear interactions in the data, and the ability of genes to operate in multiple functional groups at once, despite clear evidence for both of these phenomena in observed biological systems. Results: We have created a novel co-expression network analysis algorithm that incorporates both of these principles by combining the information-theoretic association measure of the Maximal Information Coefficient with an Interaction Component Model. We evaluate the performance of this approach on two datasets collected from a large panel of mice, one from macrophages and the other from liver, comparing the two methods on module entropy, GO enrichment and scale-free topology fit. Our algorithm outperforms a widely used co-expression analysis method, Weighted Gene Coexpression Network Analysis (WGCNA), in the macrophage data, while returning comparable results in the liver dataset when using these criteria. We demonstrate that the macrophage data has more nonlinear interactions than the liver dataset, which may explain the increased performance of our method, termed Maximal Information Component Analysis (MICA), in that case. Conclusions: In making our network algorithm more accurately reflect known biological principles, we are able to generate modules with improved relevance, particularly in networks with confounding factors such as gene-by-environment interactions.
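
    A short sketch of the Maximal Information Coefficient at the heart of MICA, assuming the third-party minepy package and its MINE estimator (an assumption; pip install minepy); the quadratic relation is invented to show MIC detecting a dependence that Pearson correlation largely misses.

```python
# MIC vs. Pearson correlation on an invented nonlinear relation.
import numpy as np
from minepy import MINE                          # assumed third-party package

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 500)
y = x**2 + rng.normal(0, 0.05, 500)              # nonlinear, non-monotonic

mine = MINE(alpha=0.6, c=15)                     # default MINE parameters
mine.compute_score(x, y)
print(f"MIC = {mine.mic():.2f}, Pearson r = {np.corrcoef(x, y)[0, 1]:.2f}")
```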

  3. Collaborative for Historical Information and Analysis: Vision and Work Plan

    Directory of Open Access Journals (Sweden)

    Vladimir Zadorozhny

    2013-02-01

    Full Text Available This article conveys the vision of a world-historical dataset, constructed in order to provide data on human social affairs at the global level over the past several centuries. The construction of this dataset will allow the routine application of tools developed for analyzing "Big Data" to global, historical analysis. The work is conducted by the Collaborative for Historical Information and Analysis (CHIA). This association of groups at universities and research institutes in the U.S. and Europe includes five groups funded by the National Science Foundation for work to construct infrastructure for collecting and archiving data on a global level. The article identifies the elements of infrastructure-building, shows how they are connected, and sets the project in the context of previous and current efforts to build large-scale historical datasets. The project is developing a crowd-sourcing application for ingesting and documenting data, a broad and flexible archive, and a "data hoover" process to locate and gather historical datasets for inclusion. In addition, the article identifies four types of data and analytical questions to be explored through this data resource, addressing development, governance, social structure, and the interaction of social and natural variables.

  4. Applications of Geographic Information System (GIS) analysis of Lake Uluabat.

    Science.gov (United States)

    Hacısalihoğlu, Saadet; Karaer, Feza; Katip, Aslıhan

    2016-06-01

    Lake Uluabat is one of the most important wetlands in Turkey because of its rich biodiversity, lying on a migratory bird route with almost all its shores covered by submerged plants. The lake has been protected by the Ramsar Convention since 1998. However, the lake is threatened by natural and anthropogenic stressors as a consequence of its location. Geographic Information System (GIS) analysis is a tool that has been widely used in recent years, especially for water quality management. This study aimed to investigate the water quality and determine the most polluted points using GIS analysis of the lake. Temperature, pH, dissolved oxygen, chemical oxygen demand, Kjeldahl nitrogen, total phosphorus, chlorophyll-a, arsenic, boron, iron, and manganese were monitored monthly from June 2008 to May 2009, with samples taken from 8 points in the lake. The effects of pH and the relationships of temperature and Chl-a with other water quality parameters and metals were found to be statistically significant. Data were mapped using ArcGIS 9.1 software and were assessed according to the Turkish Water Pollution Control Regulations (TWPCR). The research also focused on classifying and mapping the water quality in the lake by using the spatial analysis functions of GIS. As a result, it was determined that Lake Uluabat belonged to the 4th class, i.e., highly polluted water, including any water of lower quality. A remarkable portion of the pollution in the water basin was attributed to domestic wastewater discharges, industrial and agricultural activities, and mining. PMID:27154052

  5. ANALYSIS ON AGRICULTURAL INFORMATION DEMAND -- Case of Jilin Province

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui-min; JIANG Hui-ming

    2005-01-01

    With the rapid development of agricultural informatization in the world, the demand for agricultural information has become a focus in the international agriculture and information fields. Based on an investigation, this paper presents four characteristics of the demand for agricultural information in China: regionality, seasonality, great potential demand, and variation in kind and level. The factors influencing the demand for agricultural information were analyzed by the Ordinary Least Squares (OLS) method. The results show that, of all factors influencing agricultural information demand, the most important is economy; the second is the ease of information transmission; and the user's knowledge and education, the credibility of the agricultural information service system, and the production situation follow. Taking Jilin Province as an example, the article also elaborates the status of agricultural information demand, derives a regression model of agricultural information demand, and verifies it against the survey in rural Jilin.
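
    A toy version of the record's regression step: ordinary least squares of an invented information-demand index on income and education, via numpy. All data and variable choices are assumptions.

```python
# OLS regression on invented agricultural-information-demand data.
import numpy as np

rng = np.random.default_rng(7)
n = 200
income = rng.normal(10, 2, n)                    # e.g. household income
education = rng.normal(9, 2, n)                  # e.g. years of schooling
demand = 1.5 + 0.8 * income + 0.3 * education + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), income, education])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
print("intercept, income, education:", np.round(coef, 2))
```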

  6. INFORMATION CULTURE OF EDUCATION MANAGER: LEVEL AND FACTOR ANALYSIS

    OpenAIRE

    Koshevenko, Svetlana; Silchenkova, Svetlana

    2016-01-01

    The article presents the results of monographic research on the information culture of the education manager. The article includes the author's definition of the concept of "information culture of the education manager" and a functional model of this information culture consisting of the following components: motivational, normative-valuable, cognitive-operational, communicative, informational, critical, educational and creative. The article includes information about level and factor ...

  7. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistics and trading companies is improving operational business performance and developing logistics support for sales. Developing logistics support for sales presupposes designing and implementing a set of works for the development of existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculating the required number of loading-unloading places, developing storage structures, developing pre-sales preparation zones, developing specifications of storage types, selecting loading-unloading equipment, detailed planning of the warehouse logistics system, creating architectural-planning decisions, selecting information-processing equipment, etc. The ERP and WMS systems currently in use do not solve the full list of logistics engineering problems. In this regard, the development of specialized software products that take into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, is a timely task. In this paper we propose a system for the statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning, and to improve the efficiency of the operating business and the development of logistics support of sales. The system is based on methods of statistical data processing, methods for assessing and predicting logistics performance, methods for determining and calculating the data required for the registration, storage and processing of metal products, as well as methods for planning the reconstruction and development

  8. An Analysis of the Information Behaviour of Geography Teachers in a Developing African Country–Lesotho

    Directory of Open Access Journals (Sweden)

    Constance BITSO

    2012-08-01

    Full Text Available Information behaviour studies have the potential to inform the design of effective information services that incorporate the information needs, information-seeking behaviour and preferred information sources of target users; hence a doctoral study was conducted on the information behaviour of geography teachers in Lesotho with the aim of guiding the design and implementation of an information service model for these teachers. This paper focuses on the analysis of the information behaviour of geography teachers in Lesotho as a contribution of original knowledge on geography teachers' information behaviour. The analysis established the information behaviour of geography teachers using the information behaviour concept that encompasses information needs, information-seeking and information sources. Data were collected through focus group discussions and analyzed by conceptual content analysis. The analysis reveals that these geography teachers need current and accurate information covering a variety of aspects of teaching and learning, such as content, pedagogy, classroom management and learners' assessment. Owing to the increasing number of orphans in schools as a result of the HIV and AIDS pandemic, most teachers also expressed the need for information on social assistance for orphans and vulnerable children. Recommendations include information literacy training for teachers and access to the Internet in schools, including the use of open access journals on the Internet by the teachers.

  9. Situated student learning and spatial informational analysis for environmental problems

    Science.gov (United States)

    Olsen, Timothy Paul

    Ninth and tenth grade high school Biology student research teams used spatial information analysis tools to site a prairie restoration plot on a 55 acre campus during a four-week environment unit. Students made use of innovative technological practices by applying geographic information systems (GIS) approaches to solving environmental and land use problems. Student learning was facilitated by starting with the students' initial conceptions of computing, local landscape and biological environment, and then by guiding them through a problem-based science project process. The project curriculum was framed by the perspective of legitimate peripheral participation (Lave & Wenger, 1991) where students were provided with learning opportunities designed to allow them to act like GIS practitioners. Sociocultural lenses for learning were employed to create accounts of human mental processes that recognize the essential relationship between these processes and their cultural, historical, and institutional settings (Jacob, 1997; Wertsch, 1991). This research investigated how student groups' meaning-making actions were mediated by GIS tools on the periphery of a scientific community of practice. Research observations focused on supporting interpretations of learners' socially constructed actions and the iterative building of assertions from multiple sources. These included the artifacts students produced, the tools they used, the cultural contexts that constrained their activity, and how people begin to adopt ways of speaking (speech genres) of the referent community to negotiate meanings and roles. Students gathered field observations and interpreted attributes of landscape entities from the GIS data to advocate for an environmental decision. However, even while gaining proficiencies with GIS tools, most students did not begin to appropriate roles from the GIS community of practice. Students continued to negotiate their project actions simply as school exercises motivated by

  10. Time course of information representation of macaque AIP neurons in hand manipulation task revealed by information analysis.

    Science.gov (United States)

    Sakaguchi, Yutaka; Ishida, Fumihiko; Shimizu, Takashi; Murata, Akira

    2010-12-01

    We used mutual information analysis of neuronal activity in the macaque anterior intraparietal area (AIP) to examine information processing during a hand manipulation task. The task was to reach-to-grasp a three-dimensional (3D) object after presentation of a go signal. Mutual information was calculated between the spike counts of individual neurons in 50-ms-wide time bins and six unique shape classifications or 15 one-versus-one classifications of these shapes. The spatiotemporal distribution of mutual information was visualized as a two-dimensional image ("information map") to better observe global profiles of information representation. In addition, a nonnegative matrix factorization technique was applied for extracting its structure. Our major finding was that the time course of mutual information differed significantly according to different classes of task-related neurons. This strongly suggests that different classes of neurons were engaged in different information processing stages in executing the hand manipulation task. On the other hand, our analysis revealed the heterogeneous nature of information representation of AIP neurons. For example, "information latency" (or information onset) varied among individual neurons even in the same neuron class and the same shape classification. Further, some neurons changed "information preference" (i.e., shape classification with the largest amount of information) across different task periods. These suggest that neurons encode different information in the different task periods. Taking the present result together with previous findings, we used a Gantt chart to propose a hypothetical scheme of the dynamic interactions between different types of AIP neurons.
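
    A sketch of the per-bin mutual information computation described above, on synthetic spike counts whose shape dependence switches on halfway through the trial; sklearn's mutual_info_score (which returns nats) stands in for the paper's estimator, and all parameters are invented.

```python
# Per-time-bin MI between spike counts and shape labels (synthetic data).
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(3)
n_trials, n_bins, n_shapes = 300, 20, 6
shapes = rng.integers(0, n_shapes, n_trials)
# firing rate becomes shape-dependent from bin 10 onward
rates = np.where(np.arange(n_bins) < 10, 5.0, 5.0 + 2.0 * shapes[:, None])
counts = rng.poisson(rates * 0.05)               # spikes per 50-ms bin

info_map = [mutual_info_score(shapes, counts[:, b]) for b in range(n_bins)]
print(np.round(info_map, 2))                     # information rises after bin 10
```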

  11. Using Failure Information Analysis to Detect Enterprise Zombies

    Science.gov (United States)

    Zhu, Zhaosheng; Yegneswaran, Vinod; Chen, Yan

    We propose failure information analysis as a novel strategy for uncovering malware activity and other anomalies in enterprise network traffic. A focus of our study is detecting self-propagating malware such as worms and botnets. We begin by conducting an empirical study of transport- and application-layer failure activity using a collection of long-lived malware traces. We dissect the failure activity observed in this traffic in several dimensions, finding that their failure patterns differ significantly from those of real-world applications. Based on these observations, we describe the design of a prototype system called Netfuse to automatically detect and isolate malware-like failure patterns. The system uses an SVM-based classification engine to identify suspicious systems and clustering to aggregate failure activity of related enterprise hosts. Our evaluation using several malware traces demonstrates that the Netfuse system provides an effective means to discover suspicious application failures and infected enterprise hosts. We believe it would be a useful complement to existing defenses.
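
    An illustrative skeleton of the Netfuse approach described above: an SVM flags hosts from failure-pattern features, then the flagged hosts are clustered. The three features (DNS failure, TCP reset and HTTP error rates) and all data are invented, not the system's actual feature set.

```python
# SVM classification + clustering of hosts by invented failure features.
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
benign = rng.normal([0.02, 0.05, 0.10], 0.02, size=(200, 3)).clip(0)
infected = rng.normal([0.40, 0.30, 0.05], 0.05, size=(40, 3)).clip(0)
X = np.vstack([benign, infected])
y = np.array([0] * 200 + [1] * 40)

clf = SVC(kernel="rbf").fit(X, y)                # failure-pattern classifier
suspicious = X[clf.predict(X) == 1]
groups = KMeans(n_clusters=2, n_init=10).fit_predict(suspicious)
print(f"{len(suspicious)} suspicious hosts in {len(set(groups))} clusters")
```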

  12. Enhancing multilingual latent semantic analysis with term alignment information.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.; Bader, Brett William

    2008-08-01

    Latent Semantic Analysis (LSA) is based on the Singular Value Decomposition (SVD) of a term-by-document matrix for identifying relationships among terms and documents from co-occurrence patterns. Among the multiple ways of computing the SVD of a rectangular matrix X, one approach is to compute the eigenvalue decomposition (EVD) of a square 2 × 2 composite matrix consisting of four blocks, with X and X^T in the off-diagonal blocks and zero matrices in the diagonal blocks. We point out that significant value can be added to LSA by filling in some of the values in the diagonal blocks (corresponding to explicit term-to-term or document-to-document associations) and computing a term-by-concept matrix from the EVD. For the case of multilingual LSA, we incorporate information on cross-language term alignments of the same sort used in Statistical Machine Translation (SMT). Since all elements of the proposed EVD-based approach can rely entirely on lexical statistics, hardly any price is paid for the improved empirical results. In particular, the approach, like LSA or SMT, can still be generalized to virtually any language(s); computation of the EVD takes similar resources to that of the SVD since all the blocks are sparse; and the results of the EVD are just as economical as those of the SVD.
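
    The composite-matrix construction is easy to sketch: with zero diagonal blocks the EVD of the composite matrix recovers the SVD of X, and filling the diagonal blocks injects explicit term-term or document-document associations. Toy sizes below; the paper's cross-language alignment weights are not reproduced.

```python
# Composite 2 x 2 block matrix with X and X^T off-diagonal, then EVD.
import numpy as np

rng = np.random.default_rng(11)
X = rng.random((5, 3))                           # 5 terms x 3 documents
TT = np.zeros((5, 5))                            # term-term block (fillable)
DD = np.zeros((3, 3))                            # document-document block

B = np.block([[TT, X],
              [X.T, DD]])                        # symmetric composite matrix
eigvals, eigvecs = np.linalg.eigh(B)             # ascending eigenvalues
term_by_concept = eigvecs[:5, ::-1][:, :2]       # top-2 concepts, term rows
print(term_by_concept.shape)                     # (5, 2)
```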

  13. Catchment delineation and morphometric analysis using geographical information system.

    Science.gov (United States)

    Kumar, Manoj; Kumar, Rohitashw; Singh, P K; Singh, Manjeet; Yadav, K K; Mittal, H K

    2015-01-01

    The geographical information system (GIS) has emerged as an efficient tool for delineating drainage patterns in watershed planning and management. Morphometric parameters of basins can address linear, areal and relief aspects. The study deals with the integrated watershed management of the Baliya micro-watershed, located in the Udaipur district of Rajasthan, India. Morphometric analysis is an important aspect of hydrological investigation and is indispensable in the development and management of drainage basins. The determination of linear, areal and relief parameters indicates fairly good significance. The low value of the bifurcation ratio (4.19) revealed that the drainage pattern has not been distorted by structural disturbance. The high value of the elongation ratio (0.68) compared to the circularity ratio (0.27) indicates an elongated shape of the watershed. The high values of drainage density (5.39 km/km²) and stream frequency (12.32) show that the region has impermeable subsoil material under poor vegetative cover with a low relief factor. The morphometric parameters of relief ratio (0.041) and relative relief (0.99%) show that the watershed can be treated using GIS techniques to determine the morphometric presence of a dendritic drainage pattern, with a view to selecting soil and water conservation measures and water harvesting.
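
    The reported ratios follow from standard morphometric formulas; in the sketch below, area, basin length, perimeter and stream statistics are back-calculated assumptions chosen only to land near the paper's values.

```python
# Standard morphometric formulas on assumed basin measurements.
import math

area = 12.0                  # basin area A, km^2 (assumed)
basin_length = 6.2           # basin length Lb, km (assumed)
total_stream_length = 64.7   # km (assumed)
n_streams = 148              # (assumed)
perimeter = 23.6             # km (assumed)

drainage_density = total_stream_length / area                 # km/km^2
stream_frequency = n_streams / area                           # streams/km^2
elongation_ratio = (2 / basin_length) * math.sqrt(area / math.pi)
circularity_ratio = 4 * math.pi * area / perimeter**2

print(f"Dd = {drainage_density:.2f}, Fs = {stream_frequency:.2f}, "
      f"Re = {elongation_ratio:.2f}, Rc = {circularity_ratio:.2f}")
```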

  14. Integration of IT-Security Aspects into Information Demand Analysis and Patterns

    OpenAIRE

    Sandkuhl, Kurt; Matulevicius, Raimundas; Kirikova, Marite; Ahmed, Naved

    2015-01-01

    Information logistics in general addresses demand-oriented information supply in organizations. IT-security has not received much attention in information logistics research. However, integration of security aspects into information logistics methods could be useful for application contexts with strong security requirements. As a contribution to this aspect, the paper investigates the possibility to extend information demand patterns (IDP) and information demand analysis (IDA) with security e...

  15. ANALYSIS OF THE COHERENCE OF LOGISTIC SYSTEMS FROM URBAN ENVIRONMENTS USING THE INFORMATIONAL INDICES

    OpenAIRE

    Gheorghe BASANU; Victor TELEASA

    2010-01-01

    The paper introduces a series of models for analyzing the flows of materials and products inside a logistic system, elaborated according to the entropic and informational indices introduced in the first part of the paper: informational entropy, the quantity of information, the organization degree, the mutual information, the informational energy and the coefficient of informational correlation. The theoretical elements are applied in case studies in the second part of the paper...
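
    Three of the indices named above have compact standard definitions; the sketch computes them for a hypothetical distribution of flows across four routes, with the organization degree taken here as the redundancy 1 - H/H_max, one common convention.

```python
# Informational indices on an invented flow-share distribution.
import math

p = [0.4, 0.3, 0.2, 0.1]                         # flow shares, sum to 1
H = -sum(pi * math.log2(pi) for pi in p)         # informational entropy (bits)
E = sum(pi**2 for pi in p)                       # informational (Onicescu) energy
R = 1 - H / math.log2(len(p))                    # organization degree

print(f"H = {H:.3f} bits, E = {E:.3f}, R = {R:.3f}")
```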

  16. FACTORS INFLUENCING INFORMATION TECHNOLOGY ADOPTION: A CROSS-SECTIONAL ANALYSIS

    OpenAIRE

    Stroade, Jeri L.; Schurle, Bryan W.

    2003-01-01

    This project will explore information technology adoption issues. The unique characteristics of information technology will be discussed. Advantages and disadvantages to adoption will also be identified. Finally, a statistical model of Internet adoption will be developed to estimate the impacts of certain variables on the underlying process of information technology adoption.

  17. Multi-Dimensional Analysis of Dynamic Human Information Interaction

    Science.gov (United States)

    Park, Minsoo

    2013-01-01

    Introduction: This study aims to understand the interactions of perception, effort, emotion, time and performance during the performance of multiple information tasks using Web information technologies. Method: Twenty volunteers from a university participated in this study. Questionnaires were used to obtain general background information and…

  18. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Full Text Available Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis relies on quantitative and qualitative methods, each of which has advantages for information risk analysis. The analytic hierarchy process (AHP) has also been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks and hybrid models. Hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed; the approach combines AHP with the fuzzy comprehensive method.
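
    A minimal AHP fragment of the kind the abstract refers to: criterion weights derived as the principal eigenvector of an invented pairwise-comparison matrix, plus Saaty's consistency ratio (CR < 0.1 is the usual acceptance threshold).

```python
# AHP priority weights and consistency ratio for a 3 x 3 comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],                   # pairwise importance ratios
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
cr = ci / 0.58                                   # random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), " CR:", round(cr, 3))
```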

  19. Consumer information on fetal heart rate monitoring during labor: a content analysis.

    Science.gov (United States)

    Torres, Jennifer; De Vries, Raymond; Low, Lisa Kane

    2014-01-01

    Electronic fetal monitoring (EFM) is used for the majority of births that occur in the United States. While there are indications for the use of EFM for women with high-risk pregnancies, its use in low-risk pregnancies is less evidence-based. In low-risk women, the use of EFM is associated with an increased risk of cesarean birth compared with intermittent auscultation of the fetal heart rate. The purpose of this investigation was to evaluate the existence of evidence-based information on fetal heart rate monitoring in popular consumer-focused maternity books and Web sites. Content analysis of information in consumer-oriented Web sites and books was completed using NVivo software (QSR International, Melbourne, Australia). Themes identified included a lack of clear terminology when discussing fetal monitoring, use of broad categories such as low risk and high risk, limited presentation of information about intermittent auscultation, and presentation of EFM as the standard of care, particularly upon admission to the labor unit. More than one-third of the sources did not mention auscultation, and conflicting information about monitoring methods was presented. The availability of accurate, publicly accessible information offers consumers the opportunity to translate knowledge into the power to seek evidence-based care practices during their maternity care experience. PMID:24781772

  1. Commercial babassu mesocarp: microbiological evaluation and analysis of label information

    Directory of Open Access Journals (Sweden)

    Laisa Lis Fontinele Sá

    2015-10-01

    Full Text Available The babassu mesocarp is easily found in supermarkets and other commercial establishments in Brazil. Despite its widespread use in both the pharmaceutical and food industries, the literature contains neither scientific studies on the microbial contamination of these products nor on the legal information required on labels. The aim of this study was to evaluate the level of microbiological contamination in babassu mesocarp sold in commercial establishments in Teresina-PI/Brazil, as well as the conformity of label information with the rules of the Brazilian Sanitary Surveillance Agency (ANVISA). Ten samples of babassu mesocarp powder sold in the region were selected for study. Determination of heterotrophic microorganisms was carried out using the plate-count technique on Plate Count Agar (CFU g-1). Sabouraud Dextrose Agar medium was used for the cultivation of fungi. For the analysis of label information, resolutions RDC 259 of September 20, 2002, and RDC 360 of December 23, 2003, as well as law 10,674 of May 16, 2003, were used. The results showed high levels of contamination by heterotrophic bacteria and fungi in all samples. Most of the sample labels complied with the rules. The results therefore suggest more comprehensive monitoring of these microorganisms and the development of more effective methods for the decontamination of these products sold in Brazil. Keywords: Babassu. Label. Contamination. Food. Pharmacy.

  2. MULTIPLE CRITERIA ANALYSIS FOR EVALUATION OF INFORMATION SYSTEM RISK

    OpenAIRE

    Olson, David L.; DESHENG DASH WU

    2011-01-01

    Information technology (IT) involves a wide set of risks. Enterprise information systems are a major developing form of information technology involving their own set of risks, thus creating potential blind spots. This paper describes risk management issues involved in enterprise resource planning (ERP) systems, which have a high impact on organizations due to their high cost and their pervasive effect on organizational operations. Alternative means of acquiring ERP systems, to include outsourci...

  3. Analysis on Uncertain Information and Actions for Preventing Collision

    Institute of Scientific and Technical Information of China (English)

    胡甚平; FANG Quan-gen

    2007-01-01

    Discusses and analyzes the causes and characteristics of the uncertainties in the information and actions for preventing collision at sea, building on basic collision-avoidance knowledge. Describes the ways in which these uncertainties of information and action can be investigated with navigation simulators, and the functions such investigations serve. Puts forward suggestions for training officers, during the MET course, to master the skills of recognizing these uncertainties through simulator practice.

  4. ACCOUNTING INFORMATION - THE BASE OF FINANCIAL ANALYSIS IN INVESTMENT DECISIONS

    OpenAIRE

    Cristian Ionel VĂTĂŞOIU; Mihaela GHEORGHE; Ioan Dumitru MOTONIU; Ileana Sorina (Rakos) BOCA

    2010-01-01

    Accounting information must allow both current and potential investors to identify, measure, classify, and evaluate all operations and activities of an enterprise. Undoubtedly, one of the most important functions of accounting is to provide this information, which enables sound management decisions and proper investment by current and potential investors. This paper aims to show the importance of accounting information in making management decisions and investments, wh...

  5. Content of information ethics in the Holy Quran: an analysis

    Directory of Open Access Journals (Sweden)

    Akram Mehrandasht

    2014-06-01

    Full Text Available Background and Objectives: Information ethics, according to Islam, means observing human rights and morals in dealing with information transmission and provision, which must be based on honesty and truthfulness. As Islam highly values society and social issues, it holds that human behaviors and interactions are strongly interconnected in a society. Committing to ethical issues regarding information is a responsibility of all members of a society according to Quran. Islam prohibits Believers from unmasking people's private information. Results: Thus, from the viewpoint of Islam, the purpose of information ethics is to observe and protect human rights in the society and to show the positive effects of the right information on all aspects of human life. Methods: For the purpose of the current study, all the words and verses related to information ethics were identified in Quran. The data were then evaluated and analyzed from the perspective of scholars and experts of Quran. Conclusions: To implement information ethics, it is necessary to be aware of the position Islam has given this concept. Following Quran guidelines and the manners of the family members of the Prophet Muhammad (pbuh) results in a society in which information ethics is observed optimally.

  6. INFORMATION ECONOMICS, INSTRUMENT OF ANALYSIS IN NEW MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Maria Zenovia GRIGORE

    2009-10-01

    Full Text Available In the New Microeconomics, the Walrasian postulate of perfect information is replaced by two theorems concerning the production of information: 1. the acquisition and dissemination of information raise production costs; 2. specialisation in information activity is efficient; there are specialists in the production or use of information. Information economics, or the economics of information, studies decisions in transactions where one party has more or better information than the other. Incomplete and asymmetric information can generate two types of risk: adverse selection, which can be reduced with signaling games and screening games, and moral hazard, studied within agency theory via the principal-agent model. The principal-agent model treats the difficulties that arise when a principal hires an agent to pursue the interests of the former. There are mechanisms that align the interests of the agent with those of the principal, such as commissions, promotions, profit sharing, efficiency wages, deferred compensation, fear of firing and so on.
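
    To make the moral-hazard piece concrete, here is a minimal sketch of the canonical principal-agent program; the notation (output x, wage schedule w, effort e, utility u, effort cost c, reservation utility \bar{u}) follows textbook convention and is not taken from this record.

```latex
% Canonical principal-agent (moral hazard) program: the principal picks a
% wage schedule w(.) and desired effort e to maximize expected profit,
% subject to the agent's participation (IR) and incentive (IC) constraints.
\begin{align*}
\max_{w(\cdot),\, e} \quad & \mathbb{E}\bigl[x - w(x) \mid e\bigr] \\
\text{s.t.} \quad & \mathbb{E}\bigl[u(w(x)) \mid e\bigr] - c(e) \ \ge\ \bar{u} && \text{(IR)} \\
& e \in \arg\max_{e'} \ \mathbb{E}\bigl[u(w(x)) \mid e'\bigr] - c(e') && \text{(IC)}
\end{align*}
```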

  7. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by the vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattice to improve the efficiency of the vehicle information sharing network.
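
    As a rough illustration of the proposed extension, the sketch below adds long-range links between disconnected vehicle clusters using networkx; the toy random clusters and the pairing strategy are assumptions for illustration, not the paper's Boston data or its optimal lattice design.

```python
# Sketch: extend a V2V information-sharing network by adding targeted
# long-range links between the largest vehicle clusters.
import random
import networkx as nx

def extend_network(G, n_links=5, seed=0):
    """Add long-range edges between consecutive largest connected components."""
    rng = random.Random(seed)
    comps = sorted(nx.connected_components(G), key=len, reverse=True)
    new_edges = []
    for a, b in zip(comps, comps[1:]):
        for _ in range(n_links):
            u, v = rng.choice(sorted(a)), rng.choice(sorted(b))
            G.add_edge(u, v, long_range=True)
            new_edges.append((u, v))
    return new_edges

# Toy example: two separate vehicle clusters joined by targeted links.
G = nx.disjoint_union(nx.erdos_renyi_graph(50, 0.1, seed=1),
                      nx.erdos_renyi_graph(30, 0.1, seed=2))
added = extend_network(G, n_links=3)
print(f"added {len(added)} long-range links; connected: {nx.is_connected(G)}")
```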

  8. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multivariate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear...... combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. Where CCA is ideal for Gaussian data, CIA facilitates...
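
    A small sketch of the core idea, replacing correlation with mutual information when scoring projection pairs; the binned MI estimator and the toy data are simplifications (the authors use a kernel density estimator), so treat this as illustrative only.

```python
# Sketch: score a pair of projection vectors by the mutual information
# between the projected variables, the quantity CIA maximizes.
import numpy as np
from sklearn.metrics import mutual_info_score

def projection_mi(X, Y, a, b, bins=32):
    """MI between linear combinations X a and Y b, via binned estimates."""
    u, v = X @ a, Y @ b
    cu = np.digitize(u, np.histogram_bin_edges(u, bins))
    cv = np.digitize(v, np.histogram_bin_edges(v, bins))
    return mutual_info_score(cu, cv)

# Toy bi-temporal data sharing a nonlinear (hence low-correlation) signal.
rng = np.random.default_rng(0)
t = rng.normal(size=(1000, 1))
X = np.hstack([t, rng.normal(size=(1000, 2))])
Y = np.hstack([np.sin(3 * t), rng.normal(size=(1000, 2))])
a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0])
print("MI of projections:", round(projection_mi(X, Y, a, b), 3))
```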

  9. Analysis Report Project: Audience, E-writing, and Information Design.

    Science.gov (United States)

    Lawrence, Sally F.

    2003-01-01

    Presents students with the opportunity to evaluate e-writing techniques and information design components of two financial and investment Websites. Discusses strategies for teaching e-writing. Uses a three-part information design model developed by Carliner (2000). Discusses how the physical, cognitive, and affective aspects all work…

  10. Consent, Informal Organization and Job Rewards: A Mixed Methods Analysis

    Science.gov (United States)

    Laubach, Marty

    2005-01-01

    This study uses a mixed methods approach to workplace dynamics. Ethnographic observations show that the consent deal underlies an informal stratification that divides the workplace into an "informal periphery," a "conventional core" and an "administrative clan." The "consent deal" is defined as an exchange of autonomy, voice and schedule…

  11. Analysis of the Interdisciplinary Nature of Library and Information Science

    Science.gov (United States)

    Prebor, Gila

    2010-01-01

    Library and information science (LIS) is highly interdisciplinary by nature and is affected by the incessant evolution of technologies. A recent study surveying research trends in the years 2002-6 at various information science departments worldwide identified a clear trend in Masters theses and doctoral dissertations of social…

  12. The threat nets approach to information system security risk analysis

    NARCIS (Netherlands)

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security concer

  13. Trading in markets with noisy information: an evolutionary analysis

    Science.gov (United States)

    Bloembergen, Daan; Hennes, Daniel; McBurney, Peter; Tuyls, Karl

    2015-07-01

    We analyse the value of information in a stock market where information can be noisy and costly, using techniques from empirical game theory. Previous work has shown that the value of information follows a J-curve, where averagely informed traders perform below market average, and only insiders prevail. Here we show that both noise and cost can change this picture, in several cases leading to opposite results where insiders perform below market average, and averagely informed traders prevail. Moreover, we investigate the effect of random explorative actions on the market dynamics, showing how these lead to a mix of traders being sustained in equilibrium. These results provide insight into the complexity of real marketplaces, and show under which conditions a broad mix of different trading strategies might be sustainable.

  14. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct-viewing survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, from which the overall state transition matrix of the information system may be derived. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the resulting occurrence probabilities of each feasible situation help the players quantitatively judge the probability of the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of a system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative analysis of survivability, with good application prospects in practice.
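
    Once a conflict model yields a state transition matrix, the long-run probability of each feasible situation follows from standard Markov analysis; the sketch below shows that step with a made-up three-state matrix, not the paper's actual model.

```python
# Sketch: stationary distribution of a conflict-derived Markov chain,
# i.e. the long-run probability of each feasible situation.
import numpy as np

P = np.array([[0.6, 0.3, 0.1],   # hypothetical transition probabilities
              [0.2, 0.5, 0.3],
              [0.1, 0.2, 0.7]])

# The stationary distribution pi solves pi = pi P: take the left
# eigenvector of P associated with eigenvalue 1 and normalize it.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()
print("long-run situation probabilities:", np.round(pi, 3))
```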

  15. Information Services at the Nuclear Safety Analysis Center.

    Science.gov (United States)

    Simard, Ronald

    This paper describes the operations of the Nuclear Safety Analysis Center. Established soon after an accident at the Three Mile Island nuclear power plant near Harrisburg, Pennsylvania, its efforts were initially directed towards a detailed analysis of the accident. Continuing functions include: (1) the analysis of generic nuclear safety issues,…

  16. Information Flow in the Launch Vehicle Design/Analysis Process

    Science.gov (United States)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  17. Independent component analysis of edge information for face recognition

    CERN Document Server

    Karande, Kailash Jagannath

    2013-01-01

    The book presents research work on face recognition using edge information as features with ICA algorithms. The independent components are extracted from edge information. These independent components are used with classifiers to match the facial images for recognition purposes. In their study, the authors have explored Canny and LOG edge detectors as standard edge detection methods. The Oriented Laplacian of Gaussian (OLOG) method is explored to extract edge information with different orientations of the Laplacian pyramid. A multiscale wavelet model for edge detection is also propos

  18. SYSTEM TECHNICAL ANALYSIS OF INFORMATION AND DOCUMENTATION IN STRUCTURAL DESIGN

    OpenAIRE

    Ignatiev Oleg Vladimirovich; Pavlov Aleksandr Sergeevich; Lavdansky Pavel Aleksandrovich

    2012-01-01

    The authors review the general types of documentation used in construction. Much attention is given to the different kinds of information contained in electronic files as well as data processing and transmission.

  19. Analysis on Recommended System for Web Information Retrieval Using HMM

    OpenAIRE

    Himangni Rathore; Hemant Verma

    2014-01-01

    The web is a rich domain of data and knowledge, spread over the world in an unstructured manner. A growing number of users continuously access information over the internet. Web mining is an application of data mining in which web-related data is extracted and manipulated for extracting knowledge. Data mining applied to the domain of web information is referred to as web mining, which is further divided into three major domains: web usage mining, web content mining and web stru...

  20. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    OpenAIRE

    G., Narkhede Sachin; Khairnar, Vaishali; Kadu, Sujata

    2014-01-01

    Image segmentation faces challenging issues in brain magnetic resonance image tumor segmentation, caused by the weak correlation between magnetic resonance imaging intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by potential performance improvement in the general automatic brain tu...

  1. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    OpenAIRE

    Narkhede Sachin G.; Prof. Vaishali Khairnar

    2014-01-01

    Image segmentation faces challenging issues in brain magnetic resonance (MR) image tumor segmentation, caused by the weak correlation between magnetic resonance imaging (MRI) intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by potential performance improvement in ...

  2. Information flow analysis for mobile code in dynamic security environments

    OpenAIRE

    Grabowski, Robert

    2012-01-01

    With the growing amount of data handled by Internet-enabled mobile devices, the task of preventing software from leaking confidential information is becoming increasingly important. At the same time, mobile applications are typically executed on different devices whose users have varying requirements for the privacy of their data. Users should be able to define their personal information security settings, and they should get a reliable assurance that the installed softwa...

  3. Information technology portfolio in supply chain management using factor analysis

    OpenAIRE

    Ahmad Jaafarnejad; Davood Rafierad; Masoumeh Gardeshi

    2013-01-01

    The adoption of information technology (IT) along with supply chain management (SCM) has become increasingly a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve the organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal comp...
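
    The dimensionality-reduction step mentioned at the end (principal components / factor analysis over survey criteria) can be sketched in a few lines; the survey data and criteria below are invented placeholders, not the paper's dataset.

```python
# Sketch: condense hypothetical IT-in-SCM survey criteria into two
# latent factors, in the spirit of the paper's factor-analysis step.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# 40 hypothetical survey responses rating 6 criteria on a 1-5 scale.
X = rng.integers(1, 6, size=(40, 6)).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(X)        # respondent scores on the 2 factors
print("factor loadings (criteria x factors):")
print(np.round(fa.components_.T, 2))
```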

  4. Integrating Information and Communication Technology for Health Information System Strengthening: A Policy Analysis.

    Science.gov (United States)

    Marzuki, Nuraidah; Ismail, Saimy; Al-Sadat, Nabilla; Ehsan, Fauziah Z; Chan, Chee-Khoon; Ng, Chiu-Wan

    2015-11-01

    Despite the high costs involved and the lack of definitive evidence of sustained effectiveness, many low- and middle-income countries had begun to strengthen their health information system using information and communication technology in the past few decades. Following this international trend, the Malaysian Ministry of Health had been incorporating Telehealth (National Telehealth initiatives) into national health policies since the 1990s. Employing qualitative approaches, including key informant interviews and document review, this study examines the agenda-setting processes of the Telehealth policy using Kingdon's framework. The findings suggested that Telehealth policies emerged through actions of policy entrepreneurs within the Ministry of Health, who took advantage of several simultaneously occurring opportunities--official recognition of problems within the existing health information system, availability of information and communication technology to strengthen health information system and political interests surrounding the national Multimedia Super Corridor initiative being developed at the time. The last was achieved by the inclusion of Telehealth as a component of the Multimedia Super Corridor.

  5. Analysis of the Effect of Information System Quality to Intention to Reuse of Employee Management Information System (Simpeg Based on Information Systems Success Model

    Directory of Open Access Journals (Sweden)

    Suryanto Tri Lathif Mardi

    2016-01-01

    Full Text Available This study examines the effect of Information Quality, System Quality and Service Quality on the user intention to reuse the Employee Management Information System (SIMPEG) in universities in the city of Surabaya, based on the theoretical foundation of the DeLone and McLean Information Systems Success (ISS) Model. The questionnaire was distributed to 120 employees of different universities by means of stratified random sampling. The results showed that: (1) there is a significant positive effect of System Quality on Information Quality; (2) there is a significant positive effect of Information Quality on the Intention to Reuse, information related to the fulfillment of the user's needs; (3) there is a significant positive effect of System Quality on the Intention to Reuse, the system related to the fulfillment of the needs of users; (4) there is no effect of Service Quality on the Intention to Reuse. In the end, the results of this study provide an analysis and advice to university officials that can be used as a consideration for Information Technology/Information System investment and development in accordance with the Information System Success and Intention to Reuse model.

  6. Structural Analysis: Shape Information via Points-To Computation

    CERN Document Server

    Marron, Mark

    2012-01-01

    This paper introduces a new hybrid memory analysis, Structural Analysis, which combines an expressive shape analysis style abstract domain with efficient and simple points-to style transfer functions. Using data from empirical studies on the runtime heap structures and the programmatic idioms used in modern object-oriented languages we construct a heap analysis with the following characteristics: (1) it can express a rich set of structural, shape, and sharing properties which are not provided by a classic points-to analysis and that are useful for optimization and error detection applications (2) it uses efficient, weakly-updating, set-based transfer functions which enable the analysis to be more robust and scalable than a shape analysis and (3) it can be used as the basis for a scalable interprocedural analysis that produces precise results in practice. The analysis has been implemented for .Net bytecode and using this implementation we evaluate both the runtime cost and the precision of the results on a num...

  7. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Science.gov (United States)

    2010-09-24

    ... AGENCY 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System) AGENCY... Decision Information System (CADDIS). This Web site was developed to help scientists find, develop... information useful for causal evaluations in aquatic systems. CADDIS is based on EPA's Stressor...

  8. 75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)

    Science.gov (United States)

    2010-06-22

    ... AGENCY Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY... investigators find, access, organize, and share information useful for causal evaluations in aquatic systems... ``anonymous access'' system, which means that EPA will not know your identity or contact information...

  9. 77 FR 63833 - Equifax Information Services LLC; Analysis of Proposed Consent Order To Aid Public Comment

    Science.gov (United States)

    2012-10-17

    ... Equifax Information Services LLC; Analysis of Proposed Consent Order To Aid Public Comment AGENCY: Federal... INFORMATION section below. Write ``Equifax Info Services, File No. 102 3252'' on your comment and file your... from Equifax Information Services LLC (``Equifax''). The proposed consent order has been placed on...

  10. Demand Analysis of Logistics Information Matching Platform: A Survey from Highway Freight Market in Zhejiang Province

    Science.gov (United States)

    Chen, Daqiang; Shen, Xiahong; Tong, Bing; Zhu, Xiaoxiao; Feng, Tao

    With the increasing competition in the logistics industry and the demand for lower logistics costs, the construction of a logistics information matching platform for highway transportation plays an important role, and the accuracy of the platform design is the key to successful operation. Based on survey results from logistics service providers, customers and regulatory authorities on access to information, and on an in-depth analysis of the information demands on a logistics information matching platform for highway transportation in Zhejiang province, a survey analysis of the framework for such a platform is provided.

  11. Usability Analysis of Geographic Information System Software: A Case Study

    Directory of Open Access Journals (Sweden)

    Mehedi Masud

    2009-07-01

    Full Text Available A Geographical Information System (GIS is a computer system capable of creating, capturing and storing, analyzing, managing, and displaying geographically referenced information. A GIS tool offers interactive user interfaces to submit queries, analyze and edit data. The usability criterion of a GIS tool is an important factor for analyzing geographical information. This paper presents a methodology for evaluating the usability of a GIS tool and proposes some guidelines to find out the severity ratings of problems in a GIS tool. The paper also demonstrates how to scrutinize the usability to discover potential problems using a prototype user interface. Based on the study, experience, and observation, this paper also proposes a number of general usability evaluation guidelines for GIS tools.

  12. A quantum information theoretic analysis of three flavor neutrino oscillations

    CERN Document Server

    Banerjee, Subhashish; Srikanth, R; Hiesmayr, Beatrix C

    2015-01-01

    Correlations exhibited by neutrino oscillations are studied via quantum information theoretic quantities. We show that the strongest type of entanglement, genuine multipartite entanglement, is persistent in the flavour changing states. We prove the existence of Bell-type nonlocal features, in both its absolute and genuine avatars. Finally, we show that a measure of nonclassicality, dissension, which is a generalization of quantum discord to the tripartite case, is nonzero for almost the entire range of time in the evolution of an initial electron-neutrino. Via these quantum information theoretic quantities capturing different aspects of quantum correlations, we elucidate the differences between the flavour types, shedding light on the quantum-information theoretic aspects of the weak force.

  13. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model proposed in this article and the algorithm described are based on the idea of an information theoretic basis of decision in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search algorithms used in leading computer chess programs, the fractional ply scheme. However, despite its success in leading computer chess applications, little has been published about this method until now. The article creates a fundamental basis for this method in the axioms of information theory, then derives the principles used in programming the search and describes mathematically the form of the coefficients. One of the most important parameters of the fractional ply search is derived from fundamental principles. Until now this coefficient has been usually handcrafted...

  14. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
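
    A compact sketch of the information distribution step, spreading each crisp observation over a grid of reference points with a normal kernel so a small sample yields a smoother frequency estimate; the bandwidth rule and the fire counts below are assumptions, not the paper's formulas or Shanghai data.

```python
# Sketch: normal information diffusion of a small sample of annual
# fire counts onto discrete reference points.
import numpy as np

def diffuse(observations, points, h):
    """Distribute each observation over reference points (rows sum to 1)."""
    w = np.exp(-((points[None, :] - observations[:, None]) ** 2) / (2 * h ** 2))
    return w / w.sum(axis=1, keepdims=True)

obs = np.array([3.0, 5.0, 4.0, 8.0, 2.0])   # hypothetical yearly fire counts
pts = np.arange(0, 11, 1.0)                 # reference points: 0..10 fires
h = 1.4 * obs.std() * len(obs) ** -0.2      # rough bandwidth heuristic
risk = diffuse(obs, pts, h).sum(axis=0) / len(obs)
print(dict(zip(pts, np.round(risk, 3))))    # estimated frequency distribution
```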

  15. Shannon Information and Power Law Analysis of the Chromosome Code

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2012-01-01

    Full Text Available This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different number of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. The knowledge of this data allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of Shannon entropy. The results are explored in the perspective of power law relationships allowing a quantitative evaluation of the DNA of the species.
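
    The per-word entropy computation is easy to reproduce on any symbol string; the snippet below computes the Shannon entropy of k-length words for a made-up DNA fragment, standing in for the chromosome sequences analysed in the paper.

```python
# Sketch: Shannon entropy (bits) of k-length words in a DNA string.
from collections import Counter
from math import log2

def shannon_entropy(seq, k=1):
    """Entropy of the empirical distribution of k-length words in seq."""
    words = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * log2(c / n) for c in counts.values())

dna = "ATGCGATACGCTTAGGCTAATCGGATCCGATTACA"  # toy sequence
for k in (1, 2, 3):
    print(f"H_{k} = {shannon_entropy(dna, k):.3f} bits")
```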

  16. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    OpenAIRE

    Ayyagari Sri Nagesh; G.P.Saradhi Varma; Govardhan, A.

    2012-01-01

    In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images are critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. Medical Image structural objects content and object identific...

  17. Efficiency and credit ratings: a permutation-information-theory analysis

    Science.gov (United States)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.
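
    The entropy axis of the complexity-entropy plane is the normalized permutation (Bandt-Pompe) entropy of a return series; the sketch below implements that quantity, with a toy random walk standing in for the bond price data.

```python
# Sketch: normalized permutation entropy of a time series, the entropy
# axis of the complexity-entropy causality plane.
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, d=4, tau=1):
    """Entropy of ordinal patterns of length d, normalized to [0, 1]."""
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + d * tau:tau]
        patterns[tuple(int(j) for j in np.argsort(window))] += 1
    n = sum(patterns.values())
    probs = [c / n for c in patterns.values() if c > 0]
    return -sum(p * log(p) for p in probs) / log(factorial(d))

rng = np.random.default_rng(0)
series = rng.normal(size=2000).cumsum()   # hypothetical price path
print("normalized permutation entropy:", round(permutation_entropy(series), 3))
```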

  18. Fitting the Jigsaw of Citation: Information Visualization in Domain Analysis.

    Science.gov (United States)

    Chen, Chaomei; Paul, Ray J.; O'Keefe, Bob

    2001-01-01

    Discusses the role of information visualization in modeling and representing intellectual structures associated with scientific disciplines and visualizes the domain of computer graphics based on bibliographic data from author cocitation patterns. Highlights include author cocitation maps, citation time lines, animation of a high-dimensional…

  19. Information sharing for consumption tax purposes : An empirical analysis

    NARCIS (Netherlands)

    Ligthart, Jenny E.

    2007-01-01

    The paper studies the determinants of information sharing between Swedish tax authorities and 14 EU tax authorities for value-added tax (VAT) purposes. It is shown that trade-related variables (such as the partner country's net trade position and population size), reciprocity, and legal arrangements

  20. C. elegans locomotion analysis using algorithmic information theory.

    Science.gov (United States)

    Skandari, Roghieh; Le Bihan, Nicolas; Manton, Jonathan H

    2015-01-01

    This article investigates the use of algorithmic information theory to analyse C. elegans datasets. The ability of complexity measures to detect similarity in animals' behaviours is demonstrated and their strengths are compared to methods such as histograms. Introduced quantities are illustrated on a couple of real two-dimensional C. elegans datasets to investigate the thermotaxis and chemotaxis behaviours.
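
    Algorithmic (Kolmogorov) complexity is uncomputable, so such studies typically approximate it with compressors; a common stand-in is the normalized compression distance, sketched below on toy symbol strings rather than real worm trajectories.

```python
# Sketch: normalized compression distance (NCD) between two discretized
# locomotion sequences, using zlib as the compressor.
import zlib

def ncd(a: bytes, b: bytes) -> float:
    """NCD(a, b) = (C(ab) - min(C(a), C(b))) / max(C(a), C(b))."""
    ca, cb = len(zlib.compress(a)), len(zlib.compress(b))
    cab = len(zlib.compress(a + b))
    return (cab - min(ca, cb)) / max(ca, cb)

w1 = b"RRRLRRRLRRRLRRRL" * 16   # regular turn/run pattern
w2 = b"RRLRLLRRLRRLLRLR" * 16   # mixed pattern
print("NCD(w1, w1):", round(ncd(w1, w1), 3))
print("NCD(w1, w2):", round(ncd(w1, w2), 3))
```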

  1. Considering Complex Objectives and Scarce Resources in Information Systems' Analysis.

    Science.gov (United States)

    Crowther, Warren

    The low efficacy of many of the library and large-scale information systems that have been implemented in the developing countries has been disappointing, and their appropriateness is often questioned in the governmental and educational institutions of more industrialized countries beset by budget-crunching and a very dynamic transformation of…

  2. Efficiency and credit ratings: a permutation-information-theory analysis

    International Nuclear Information System (INIS)

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)

  3. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of its most prosperous technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  4. Chromatic Information and Feature Detection in Fast Visual Analysis

    Science.gov (United States)

    Del Viva, Maria M.; Punzi, Giovanni; Shevell, Steven K.

    2016-01-01

    The visual system is able to recognize a scene based on a sketch made of very simple features. This ability is likely crucial for survival, when fast image recognition is necessary, and it is believed that a primal sketch is extracted very early in the visual processing. Such highly simplified representations can be sufficient for accurate object discrimination, but an open question is the role played by color in this process. Rich color information is available in natural scenes, yet artist's sketches are usually monochromatic; and, black-and-white movies provide compelling representations of real world scenes. Also, the contrast sensitivity of color is low at fine spatial scales. We approach the question from the perspective of optimal information processing by a system endowed with limited computational resources. We show that when such limitations are taken into account, the intrinsic statistical properties of natural scenes imply that the most effective strategy is to ignore fine-scale color features and devote most of the bandwidth to gray-scale information. We find confirmation of these information-based predictions from psychophysics measurements of fast-viewing discrimination of natural scenes. We conclude that the lack of colored features in our visual representation, and our overall low sensitivity to high-frequency color components, are a consequence of an adaptation process, optimizing the size and power consumption of our brain for the visual world we live in. PMID:27478891

  5. Information Technology: A Community of Practice. A Workplace Analysis

    Science.gov (United States)

    Guerrero, Tony

    2014-01-01

    Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt et al., 2008). The…

  6. ANKH: Information Threat Analysis with Actor-NetworK Hypergraphs

    NARCIS (Netherlands)

    Pieters, Wolter

    2010-01-01

    Traditional information security modelling approaches often focus on containment of assets within boundaries. Due to what is called de-perimeterisation, such boundaries, for example in the form of clearly separated company networks, disappear. This paper argues that in a de-perimeterised situation a

  7. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  8. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  9. A SOCIOECONOMIC ANALYSIS OF MARKETING INFORMATION USAGE AMONG OHIO FRUIT PRODUCERS

    OpenAIRE

    Jones, Eugene; Batte, Marvin T.; Schnitkey, Gary D.

    1990-01-01

    Farm producers attempt to mitigate risk and uncertainty by utilizing accurate and reliable information. This research attempts to identify sources of information used by Ohio fruit producers and then determine which of these sources are best meeting their information needs. Results are based on a logit analysis of Ohio fruit producers and several factors are shown to influence producers' evaluation of the "adequacy" of their marketing information. Among these factors are age, business size, e...

  10. Value of information analysis for corrective action unit No. 98: Frenchman Flat

    International Nuclear Information System (INIS)

    A value of information analysis has been completed as part of the corrective action process for Frenchman Flat, the first Nevada Test Site underground test area to be scheduled for the corrective action process. A value of information analysis is a cost-benefit analysis applied to the acquisition of new information which is needed to reduce the uncertainty in the prediction of a contaminant boundary surrounding underground nuclear tests in Frenchman Flat. The boundary location will be established to protect human health and the environment from the consequences of using contaminated groundwater on the Nevada Test Site. Uncertainties in the boundary predictions are assumed to be the result of data gaps. The value of information analysis in this document compares the cost of acquiring new information with the benefit of acquiring that information during the corrective action investigation at Frenchman Flat. Methodologies incorporated into the value of information analysis include previous geological modeling, groundwater flow modeling, contaminant transport modeling, statistics, sensitivity analysis, uncertainty analysis, and decision analysis
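
    The cost-benefit core of a value of information analysis can be shown in a few lines: collect new data only if the expected reduction in decision loss exceeds the cost of collecting it. All numbers below are invented for illustration and have nothing to do with the actual Frenchman Flat figures.

```python
# Sketch: expected value of perfect information (EVPI) for a boundary
# decision under uncertainty, the quantity new data is weighed against.
import numpy as np

rng = np.random.default_rng(1)
# Prior uncertainty about a contaminant boundary (km from the test).
boundary = rng.normal(loc=5.0, scale=2.0, size=100_000)

def decision_loss(chosen, true):
    """Asymmetric loss: under-protection costs more than over-protection."""
    return np.where(chosen < true, 10.0 * (true - chosen), chosen - true)

# Best fixed choice under current uncertainty vs. acting on the truth.
grid = np.linspace(0.0, 15.0, 301)
prior_loss = min(decision_loss(g, boundary).mean() for g in grid)
evpi = prior_loss - 0.0   # with perfect information the loss is zero
print(f"EVPI = {evpi:.2f} cost units; new data is worth at most this much")
```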

  11. Analysis on Recommended System for Web Information Retrieval Using HMM

    Directory of Open Access Journals (Sweden)

    Himangni Rathore

    2014-11-01

    Full Text Available The web is a rich domain of data and knowledge, spread over the world in an unstructured manner. A growing number of users continuously access information over the internet. Web mining is an application of data mining in which web-related data is extracted and manipulated for extracting knowledge. Data mining applied to the domain of web information is referred to as web mining, which is further divided into three major domains: web usage mining, web content mining and web structure mining. The proposed work is intended to work with web usage mining. The concept of web mining is to improve user feedback and user navigation pattern discovery for a CRM system. Finally, an HMM-based algorithm is used for finding patterns in the data, a method that promises much more accurate recommendations.
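
    As a sketch of the HMM step, the forward algorithm below scores a user's page-category sequence under a small hand-written model; the matrices are invented, whereas a real recommender would learn them from clickstream logs.

```python
# Sketch: HMM forward algorithm scoring an observed browsing session.
import numpy as np

A = np.array([[0.7, 0.3],       # hidden "intent" transition matrix
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],  # P(page category | intent)
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])       # initial intent distribution

def forward(obs):
    """Likelihood of an observation sequence under the HMM (A, B, pi)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

clicks = [0, 1, 1, 2, 2]        # category ids observed in one session
print(f"P(session | model) = {forward(clicks):.6f}")
```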

  12. HIV Drug-Resistant Patient Information Management, Analysis, and Interpretation

    OpenAIRE

    Singh, Yashik; Mars, Maurice

    2012-01-01

    Introduction The science of information systems, management, and interpretation plays an important part in the continuity of care of patients. This is becoming more evident in the treatment of human immunodeficiency virus (HIV) and acquired immune deficiency syndrome (AIDS), the leading cause of death in sub-Saharan Africa. The high replication rates, selective pressure, and initial infection by resistant strains of HIV imply that drug resistance will inevitably become an important health car...

  13. FORTES: Forensic Information Flow Analysis of Business Processes

    OpenAIRE

    Accorsi, Rafael; Müller, Günter

    2010-01-01

    Nearly 70% of all business processes in use today rely on automated workflow systems for their execution. Despite the growing expenses in the design of advanced tools for secure and compliant deployment of workflows, an exponential growth of dependability incidents persists. Concepts beyond access control focusing on information flow control offer new paradigms to design security mechanisms for reliable and secure IT-based workflows. This talk presents FORTES, an approach for the forensic...

  14. Analysis of consumer information brochures on osteoporosis prevention and treatment

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-01-01

    Full Text Available Purpose: Evidence-based consumer information is a prerequisite for informed decision making. So far, there are no reports on the quality of consumer information brochures on osteoporosis. In the present study we analysed the brochures on osteoporosis available in Germany. Method: All printed brochures from patient and consumer advocacy groups, physician and governmental organisations, health insurances, and pharmaceutical companies were initially collected in 2001 and updated in December 2004. Brochures were analysed by two independent researchers using 37 internationally proposed criteria addressing evidence-based content, risk communication, transparency of the development process, and layout and design. Results: A total of 165 brochures were identified; 59 were included as they specifically targeted osteoporosis prevention and treatment. Most brochures were provided by pharmaceutical companies (n=25), followed by health insurances (n=11) and patient and consumer advocacy groups (n=11). The quality of the brochures did not differ between providers. Only 1 brochure presented a lifetime risk estimate; 4 mentioned the natural course of osteoporosis. A balanced report on benefit versus lack of benefit was presented in 2 brochures, and on benefit versus adverse effects in 8 brochures. Four brochures mentioned relative risk reduction; 1 reported absolute risk reduction through hormone replacement therapy (HRT). Of the 28 brochures accessed in 2004, 10 still recommended HRT without discussing adverse effects. Transparency of the development process was limited: 25 brochures reported a publication date, 26 cited an author and only 1 cited references. In contrast, readability and design were generally good. Conclusion: The quality of consumer brochures on osteoporosis in Germany is utterly inadequate. They fail to give evidence-based data on diagnosis and treatment options. Therefore, the material is not useful for enhancing informed consumer choice.

  15. Food labelled Information: An Empirical Analysis of Consumer Preferences

    OpenAIRE

    Alessandro Banterle; Alessia Cavaliere; Elena Claire Ricci

    2012-01-01

    Labelling can support consumers in making choices connected to their preferences in terms of qualitative features. Nevertheless, the space available on packaging is limited and some indications are not used by consumers. This paper aims at analysing which kinds of currently labelled information are of interest and actually used by consumers, and which additional kinds could improve consumer choices. Moreover, we investigate the attitude of consumers with respect to innovative strategies for t...

  16. On Static and Dynamic Control-Flow Information in Program Analysis and Transformation

    DEFF Research Database (Denmark)

    Damian, Daniel

    This thesis addresses several aspects of static and dynamic control-flow information in programming languages, by investigating its interaction with program transformation and program analysis. Control-flow information indicates for each point in a program the possible program points to be executed...... of the program may be executed next. A control-flow analysis approximates the dynamic control-flow information with conservative static control-flow information. We explore the impact of a continuation-passing-style (CPS) transformation on the result of a constraint-based control-flow analysis over Moggi...... next. Control-flow information in a program may be static, as when the syntax of the program directly determines which parts of the program may be executed next. Control-flow information may be dynamic, as when run-time values and inputs of the program are required to determine which parts...

  17. Cognitive tasks in information analysis: Use of event dwell time to characterize component activities

    Energy Technology Data Exchange (ETDEWEB)

    Sanquist, Thomas F.; Greitzer, Frank L.; Slavich, Antoinette L.; Littlefield, Rik J.; Littlefield, Janis S.; Cowley, Paula J.

    2004-09-28

    Technology-based enhancement of information analysis requires a detailed understanding of the cognitive tasks involved in the process. The information search and report production tasks of the information analysis process were investigated through evaluation of time-stamped workstation data gathered with custom software. Model tasks simulated the search and production activities, and a sample of actual analyst data were also evaluated. Task event durations were calculated on the basis of millisecond-level time stamps, and distributions were plotted for analysis. The data indicate that task event time shows a cyclic pattern of variation, with shorter event durations (< 2 sec) reflecting information search and filtering, and longer event durations (> 10 sec) reflecting information evaluation. Application of cognitive principles to the interpretation of task event time data provides a basis for developing “cognitive signatures” of complex activities, and can facilitate the development of technology aids for information intensive tasks.
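
    The reported cut-offs translate directly into a simple dwell-time split; the sketch below applies them to synthetic timestamp gaps, since the actual analyst workstation data is not public.

```python
# Sketch: classify workstation events by dwell time using the paper's
# thresholds (< 2 s ~ search/filtering, > 10 s ~ evaluation).
import numpy as np

rng = np.random.default_rng(2)
# Synthetic millisecond gaps: bursts of quick events plus long pauses.
gaps_ms = np.concatenate([rng.exponential(800, 400),     # quick scanning
                          rng.exponential(15_000, 60)])  # long evaluation
dwell_s = gaps_ms / 1000.0

search_like = (dwell_s < 2).mean()
evaluate_like = (dwell_s > 10).mean()
print(f"{search_like:.0%} of events < 2 s, {evaluate_like:.0%} of events > 10 s")
```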

  18. Research on the Elementary Theory of Information Analysis

    Institute of Scientific and Technical Information of China (English)

    高柳宾; 孙云川

    2000-01-01

    The engendering and developing process of the elementary theory of Information Analysis is studied. From the developmental point of view, the elementary theory of Information Analysis can be divided into three stages: a Bibliometrics-based stage, an Informetrics-based stage, and a Microeconomics-of-Information-based stage, the last being an exploring, studying and creating stage of the elementary theory of Information Analysis.

  19. The Soft Systems Methodology Based Analysis Model in the Development of Selfmanaging Information Systems

    Directory of Open Access Journals (Sweden)

    Sa'Adah Hassan

    2012-01-01

    Full Text Available Problem statement: In order to be able to manage its own information within its operating environment with minimal human intervention, a self-managing information system ought to identify and make use of information from the resources available in its environment. The development of requirements for self-managing information systems should start with an appropriate analysis model that can explicitly show the collaborating entities in the environment. The traditional forms of analysis in systems development approaches have not focused on computing systems environments and ways to identify environment resources. Approach: This study discusses the analysis activities in the development of self-managing information systems. Results: We propose an SSM-based analysis model, which is able to examine the requirements for self-managing information systems. We describe the analysis of one particular system, the inventory management system, and illustrate how this system fulfils certain desired self-managing properties. Conclusion: The SSM-based analysis model is able to address the actuation capabilities of the systems and considers internal and external resources in the environment. The analysis model not only takes into account the information from the environment but is also able to provide support in determining the requirements for self-managing properties.

  20. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ... requirements in 10 CFR Chapter I, including: (1) An analysis and evaluation of the design and performance of...; technical information in final safety analysis report. The application must contain a final safety analysis... NRC: (a) The principal design criteria for the reactor to be manufactured. Appendix A of 10 CFR...

  1. Analysis and optimization of information and material flow in a chosen company

    OpenAIRE

    SVOJŠE, Zdeněk

    2012-01-01

    This thesis focuses on the analysis of the information and material flows of a production company that is a supplier to the automotive industry. The analysis indicates that neither flow works separately from the other. The results of the analysis are three improvements: through this optimization we can reduce costs and make the company more competitive.

  2. Cost-Benefit Analysis of Electronic Information: A Case Study.

    Science.gov (United States)

    White, Gary W.; Crawford, Gregory A.

    1998-01-01

    Describes a study at Pennsylvania State University Harrisburg in which cost-benefit analysis (CBA) was used to examine the cost effectiveness of an electronic database. Concludes that librarians can use the results of CBA studies to justify budgets and acquisitions and to provide insight into the true costs of providing library services. (PEN)
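
    The benefit-cost comparison at the heart of such a study reduces to simple arithmetic; the figures below are invented placeholders, not the Penn State Harrisburg numbers.

```python
# Sketch: benefit-cost ratio for an electronic database subscription.
annual_cost = 12_000.0             # hypothetical subscription + support
searches_per_year = 4_800
minutes_saved_per_search = 9.0     # vs. manual print-index searching
staff_rate_per_hour = 30.0

benefit = searches_per_year * (minutes_saved_per_search / 60) * staff_rate_per_hour
print(f"annual benefit ${benefit:,.0f}, benefit-cost ratio {benefit / annual_cost:.2f}")
```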

  3. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  4. Sentiment Analysis of Feedback Information in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2015-11-01

    Full Text Available Sentiment analysis is the study of a person's opinions and emotions towards events or entities, which makes it possible to rate that event or entity for decision making by prospective buyers/users. In this research paper I have tried to demonstrate the use of automatic opinion mining/sentiment analysis to rate a hotel and its services based on guest feedback data. We have used a semantic resource for the feature vector and a Naïve Bayes classifier for review classification, after reducing the feature sets for better accuracy and efficiency. An improvement in classification accuracy has also been observed after the use of bi-gram and tri-gram language models.
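
    A minimal sketch of the classification step described here, using a Naive Bayes classifier over uni-, bi- and tri-gram counts; the four training reviews are made-up stand-ins for real guest feedback, and the semantic-resource feature selection is omitted.

```python
# Sketch: n-gram Naive Bayes sentiment classifier for hotel reviews.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["room was clean and staff were friendly",
           "terrible service and a dirty bathroom",
           "loved the breakfast and the quiet room",
           "rude staff, never coming back"]
labels = ["pos", "neg", "pos", "neg"]

clf = make_pipeline(
    CountVectorizer(ngram_range=(1, 3)),  # uni-, bi- and tri-grams
    MultinomialNB(),
)
clf.fit(reviews, labels)
print(clf.predict(["staff were rude and the room was dirty"]))
```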

  5. Value flow mapping: Using networks to inform stakeholder analysis

    Science.gov (United States)

    Cameron, Bruce G.; Crawley, Edward F.; Loureiro, Geilson; Rebentisch, Eric S.

    2008-02-01

    Stakeholder theory has garnered significant interest from the corporate community, but has proved difficult to apply to large government programs. A detailed value flow exercise was conducted to identify the value delivery mechanisms among stakeholders for the current Vision for Space Exploration. We propose a method for capturing stakeholder needs that explicitly recognizes the outcomes required of the value creating organization. The captured stakeholder needs are then translated into input-output models for each stakeholder, which are then aggregated into a network model. Analysis of this network suggests that benefits are infrequently linked to the root provider of value. Furthermore, it is noted that requirements should not only be written to influence the organization's outputs, but also to influence the propagation of benefit further along the value chain. A number of future applications of this model to systems architecture and requirement analysis are discussed.

  6. Sentiment Analysis of Feedback Information in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2014-06-01

    Full Text Available Sentiment analysis is the study of a person's opinions and emotions towards events or entities, which makes it possible to rate that event or entity for decision making by prospective buyers/users. In this research paper I have tried to demonstrate the use of automatic opinion mining/sentiment analysis to rate a hotel and its services based on guest feedback data. We have used a semantic resource for the feature vector and a Naïve Bayes classifier for review classification, after reducing the feature sets for better accuracy and efficiency. An improvement in classification accuracy has also been observed after the use of bi-gram and tri-gram language models.

  7. Real-Time Environmental Information Network and Analysis System (REINAS)

    OpenAIRE

    Nuss, Wendell A.

    1998-01-01

    The long term goals of the NPS portion of this project, which is joint with UCSC, are to develop a mesoscale coastal analysis system for use in diagnosing and predicting coastal circulations in a topographically complex coastal region and to provide guidance to UCSC for the development of data collection, data management, and visualization tools for mesoscale meteorological problems. Funding Document Number: N0001498WR30144

  8. [Italian physician's needs for medical information. Retrospective analysis of the medical information service provided by Novartis Pharma to clinicians].

    Science.gov (United States)

    Speroni, Elisabetta; Poggi, Susanna; Vinaccia, Vincenza

    2013-10-01

    The physician's need for medical information updates has been studied extensively in recent years, but the point of view of the pharmaceutical industry on this need has rarely been considered. This paper reports the results of a retrospective analysis of the medical information service provided to Italian physicians by an important pharmaceutical company, Novartis Pharma, from 2004 to 2012. The results confirm clinicians' appreciation of a service that gives them access to tailored scientific documentation, and the number of requests made to the network of medical representatives has been rising steadily, peaking whenever new drugs become available to physicians. The analysis confirms what other international studies have ascertained, that most queries are about how to use the drugs and what their properties are. The results highlight some differences between medical specialties: for example, proportionally, neurologists seem to be the most curious. This, as well as other interesting snippets, is worth further exploration. Despite its limits in terms of representativeness, what comes out of the study is the existence of a real unmet need for information by healthcare institutions, and that the support offered by the pharmaceutical industry could be invaluable; its role could go well beyond that of a mere supplier to National Healthcare Systems, to that of being recognised as an active partner in the process of ensuring balanced and evidence-based information. At the same time, a closer appraisal of clinicians' needs could help the pharma industries to improve their communication and educational strategies in presenting their latest clinical research and their own products.

  9. DEACTIVATION AND DECOMMISSIONING PLANNING AND ANALYSIS WITH GEOGRAPHIC INFORMATION SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    Bollinger, J; William Austin, W; Larry Koffman, L

    2007-09-17

    From the mid-1950's through the 1980's, the U.S. Department of Energy's Savannah River Site produced nuclear materials for the weapons stockpile, for medical and industrial applications, and for space exploration. Although SRS has a continuing defense-related mission, the overall site mission is now oriented toward environmental restoration and management of legacy chemical and nuclear waste. With the change in mission, SRS no longer has a need for much of the infrastructure developed to support the weapons program. This excess infrastructure, which includes over 1000 facilities, will be decommissioned and demolished over the forthcoming years. Dispositioning facilities for decommissioning and deactivation requires significant resources to determine hazards, structure type, and a rough-order-of-magnitude estimate for the decommissioning and demolition cost. Geographic information systems (GIS) technology was used to help manage the process of dispositioning infrastructure and for reporting the future status of impacted facilities.

  10. Quantum information analysis of electronic states at different molecular structures

    CERN Document Server

    Barcza, G; Marti, K H; Reiher, M

    2010-01-01

    We have studied transition metal clusters from a quantum information theory perspective using the density-matrix renormalization group (DMRG) method. We demonstrate the competition between entanglement and interaction localization. We also discuss the application of the configuration interaction based dynamically extended active space procedure which significantly reduces the effective system size and accelerates the speed of convergence for complicated molecular electronic structures to a great extent. Our results indicate the importance of taking entanglement among molecular orbitals into account in order to devise an optimal orbital ordering and carry out efficient calculations on transition metal clusters. We propose a recipe to perform DMRG calculations in a black-box fashion and we point out the connections of our work to other tensor network state approaches.

  11. 2004/2008 labour market information comparative analysis report

    International Nuclear Information System (INIS)

    The electricity sector has entered into a phase of both challenges and opportunities. Challenges include workforce retirement, labour shortages, and increased competition from other employers to attract and retain the skilled people required to deliver on the increasing demand for electricity in Canada. The electricity sector in Canada is also moving into a new phase, whereby much of the existing infrastructure is either due for significant upgrades, or complete replacement. The increasing demand for electricity means that increased investment and capital expenditure will need to be put toward building new infrastructure altogether. The opportunities for the electricity industry will lie in its ability to effectively and efficiently react to these challenges. The purpose of this report was to provide employers and stakeholders in the sector with relevant and current trend data to help them make appropriate policy and human resource decisions. The report presented a comparative analysis of a 2004 Canadian Electricity Association employer survey with a 2008 Electricity Sector Council employer survey. The comparative analysis highlighted trends and changes that emerged between the 2004 and 2008 studies. Specific topics that were addressed included overall employment trends; employment diversity in the sector; age of non-support staff; recruitment; and retirements and pension eligibility. Recommendations were also offered. It was concluded that the electricity sector could benefit greatly from implementing on-going recruitment campaigns. refs., tabs., figs

  12. Value of Information Analysis Project Gnome Site, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Greg Pohll; Jenny Chapman

    2010-01-01

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.
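
    As a toy illustration of the decision-analytic trade-off described above, the sketch below computes the expected value of perfect information (EVPI) for an invented intrusion-restriction decision; the threshold, costs and lognormal prior are placeholders, not values from the Gnome analysis.

        import numpy as np

        rng = np.random.default_rng(42)

        # Prior uncertainty about a decision-relevant quantity,
        # e.g. contaminant extent in metres (hypothetical numbers).
        prior = rng.lognormal(mean=5.0, sigma=0.8, size=100_000)

        threshold = 200.0      # restrict intrusion if extent exceeds this
        cost_restrict = 1.0e6  # cost of imposing restrictions
        cost_failure = 5.0e6   # cost if extent exceeds threshold unrestricted

        # Expected cost of each action under current (prior) information.
        cost_if_restrict = cost_restrict
        cost_if_not = cost_failure * np.mean(prior > threshold)
        cost_now = min(cost_if_restrict, cost_if_not)

        # With perfect information the action is chosen per realisation:
        # restrict only when the extent really exceeds the threshold.
        cost_perfect = np.mean(np.where(prior > threshold, cost_restrict, 0.0))

        evpi = cost_now - cost_perfect
        print(f"EVPI = ${evpi:,.0f}: an upper bound on what field data are worth")

    Any proposed field campaign costing more than the EVPI cannot pay for itself, which is exactly the weighing of uncertainty reduction against data-collection cost that the Work Plan calls for.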

  13. ERROR ANALYSIS ON INFORMATION AND TECHNOLOGY STUDENTS’ SENTENCE WRITING ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Rentauli Mariah Silalahi

    2015-03-01

    Full Text Available Students' error analysis is very important for helping EFL teachers develop their teaching materials, assessments and methods. However, it takes much time and effort for teachers to carry out such an error analysis of their students' language. This study seeks to identify the common errors made by one class of 28 freshman students studying English in their first semester at an IT university. The data were collected from their writing assignments over eight consecutive weeks. The errors found were classified into 24 types, and the ten most common errors committed by the students concerned articles, prepositions, spelling, word choice, subject-verb agreement, auxiliary verbs, plural forms, verb forms, capital letters, and meaningless sentences. The findings about the frequency of the students' errors were then contrasted with their midterm test results, and, in order to find out the reasons behind the error recurrence, the students were given questions to answer in a questionnaire. Most of the students admitted that carelessness was the major reason for their errors, followed by a lack of understanding. This study suggests that EFL teachers devote time to continuously checking their students' language and giving corrections, so that the students can learn from their errors and stop committing the same ones.

  14. Hierarchical models and the analysis of bird survey information

    Science.gov (United States)

    Sauer, J.R.; Link, W.A.

    2003-01-01

    Management of birds often requires analysis of collections of estimates. We describe a hierarchical modeling approach to the analysis of these data, in which parameters associated with the individual species estimates are treated as random variables, and probability statements are made about the species parameters conditioned on the data. A Markov-Chain Monte Carlo (MCMC) procedure is used to fit the hierarchical model. This approach is computer intensive, and is based upon simulation. MCMC allows for estimation both of parameters and of derived statistics. To illustrate the application of this method, we use the case in which we are interested in attributes of a collection of estimates of population change. Using data for 28 species of grassland-breeding birds from the North American Breeding Bird Survey, we estimate the number of species with increasing populations, provide precision-adjusted rankings of species trends, and describe a measure of population stability as the probability that the trend for a species is within a certain interval. Hierarchical models can be applied to a variety of bird survey applications, and we are investigating their use in estimation of population change from survey data.
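
    A minimal sketch of the hierarchical MCMC idea described above, using a Gibbs sampler for a normal-normal model of species trend estimates; the trend values, standard errors and priors below are simulated placeholders, not Breeding Bird Survey data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulated species trend estimates (%/yr) and their standard errors.
        beta = rng.normal(-0.5, 1.5, size=28)
        se = rng.uniform(0.3, 1.2, size=28)
        n = beta.size

        n_iter, burn = 5000, 1000
        theta, mu, tau2 = beta.copy(), beta.mean(), beta.var()
        a0, b0 = 0.001, 0.001  # vague inverse-gamma prior on tau^2
        n_increasing = []

        for it in range(n_iter):
            # 1) species-level trends given hyperparameters (conjugate update)
            prec = 1.0 / se**2 + 1.0 / tau2
            theta = rng.normal((beta / se**2 + mu / tau2) / prec, np.sqrt(1.0 / prec))
            # 2) hyper-mean given trends (flat prior)
            mu = rng.normal(theta.mean(), np.sqrt(tau2 / n))
            # 3) hyper-variance given trends (inverse-gamma update)
            tau2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * np.sum((theta - mu) ** 2)))
            if it >= burn:
                # derived statistic: how many species are increasing?
                n_increasing.append(np.sum(theta > 0))

        print("posterior mean number of increasing species:", np.mean(n_increasing))

    The point of the hierarchical treatment is visible in the last line: because the species trends are random variables, a derived statistic such as the number of increasing species comes with a full posterior distribution rather than a single count.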

  15. Value of Information Analysis Project Gnome Site, New Mexico

    International Nuclear Information System (INIS)

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.

  16. Drought Analysis of Aksu Irrigation Area in Antalya by Aydeniz Method and Geographic Information Systems

    OpenAIRE

    ARSLAN, Onur; ÖNDER, Hasan Hüseyin; ÖZDEMİR, Gültekin

    2014-01-01

    In this study, a drought analysis has been carried out for the Aksu-Antalya Irrigation Area by using the Aydeniz Method and Geographic Information Systems. Meteorological data of Antalya, Isparta, Korkuteli and Manavgat I

  17. Process analysis of the Centre of Information and Library Services of the University of Economics, Prague

    OpenAIRE

    DAŇKOVÁ, Lucie

    2009-01-01

    The theme of this diploma thesis is the analysis of the processes inside the library of the University of Economics, Prague. The first part is dedicated to the definition of the goals of the thesis and to the theory of company processes. The text explains the terms information literacy and information education and links them with process analysis. The thesis introduces the library and the processes identified for optimization. In the second part, the identified processes are analyzed ...

  18. A Novel Complex Event Processing Engine for Intelligent Data Analysis in Integrated Information Systems

    OpenAIRE

    Dong Wang; Mingquan Zhou; Sajid Ali; Pengbo Zhou; Yusong Liu; Xuesong Wang

    2016-01-01

    Novel and effective engines for data analysis in integrated information systems are urgently required by diverse applications, in which massive business data can be analyzed to enable the capturing of various situations in real time. The performance of existing engines has limited capacity for data processing in distributed computing. Although Complex Event Processing (CEP) has enhanced the capacity of data analysis in information systems, it remains a challenging task since events are ra...

  19. The Effect of Corporate Break-ups on Information Asymmetry: A Market Microstructure Analysis

    OpenAIRE

    Bardong, Florian; Bartram, Söhnke M.; Yadav, Pradeep K.

    2006-01-01

    This paper investigates the information environment during and after a corporate break-up utilizing direct measures of information asymmetry developed in the market microstructure literature. The analysis is based on all corporate break-ups in the United States in the period 1995-2005. The results document that information asymmetry declines significantly as a result of a break-up. However, this reduction takes place not at the time of its announcement or its completion, but after it has been...

  20. Climate networks constructed by using information-theoretic measures and ordinal time-series analysis

    OpenAIRE

    Deza, Juan Ignacio

    2015-01-01

    This Thesis is devoted to the construction of global climate networks (CNs) built from time series of surface air temperature anomalies (SAT) using nonlinear analysis. Several information theory measures have been used, including mutual information (MI) and conditional mutual information (CMI). The ultimate goal of the study is to improve the present understanding of climatic variability by means of networks, focusing on the different spatial and time scales of climate phenomena. An intro...

  1. Radionuclide Data Analysis and Evaluation: More Information From Fewer Isotopes

    Science.gov (United States)

    Prinke, A.; McIntyre, J.; Cooper, M.; Haas, D.; Lowrey, J.; Miley, H.; Schrom, B.; Suckow, T.

    2013-12-01

    The analysis of the International Monitoring System radionuclide data sets provides daily concentrations for both particulate and radioxenon isotopes. These isotopes can come from many potential sources such as nuclear reactors, nuclear physics experiments, and medical isotope production. These interesting but irrelevant sources share several of the radio-isotopic signatures of above-ground or underground nuclear explosions and must be ruled out as part of the determination that an event originated as a nuclear explosion. Several methods that aid in this determination are under development, and this poster will briefly cover each: radio-isotopic ratios and parent-daughter relationships, co-detection of radioxenon and isotopes found on particulates, and past detection history.
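
    A small worked example of the radio-isotopic ratio idea: because two radioxenon isotopes decay at different, well-known rates, their activity ratio evolves predictably and so constrains the age of an event. The half-lives below are physical constants; the initial activities are invented placeholders.

        import numpy as np

        # Half-lives (physical constants): 133Xe ~5.243 d, 135Xe ~9.14 h.
        lam133 = np.log(2) / (5.243 * 86400)
        lam135 = np.log(2) / (9.14 * 3600)

        A133_0, A135_0 = 1.0, 10.0  # hypothetical initial activities

        def ratio(t_seconds):
            """135Xe/133Xe activity ratio after decaying for t seconds."""
            return (A135_0 * np.exp(-lam135 * t_seconds)) / (A133_0 * np.exp(-lam133 * t_seconds))

        for days in (0.5, 1, 2, 4):
            print(f"t = {days:4} d  ->  135Xe/133Xe = {ratio(days * 86400):.4f}")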

  2. Sheltering Children from the Whole Truth: A Critical Analysis of an Informational Picture Book.

    Science.gov (United States)

    Lamme, Linda; Fu, Danling

    2001-01-01

    Uses Orbis Pictus Award Committee criteria (accuracy, organization, design, and style) to examine an informational book, "Rice Is Life," by Rita Golden Gelman. Subjects the book to a deeper critical analysis. Suggests that it is important to help students become critical thinkers about everything they read, including informational books. (SG)

  3. 76 FR 37371 - Agency Information Collection: Comment Request for National Gap Analysis Program Evaluation

    Science.gov (United States)

    2011-06-27

    ... documents, (2) measure user satisfaction, and (3) understand user needs. Additionally, this survey can... U.S. Geological Survey Agency Information Collection: Comment Request for National Gap Analysis... that we have submitted to the Office of Management and Budget (OMB) an information collection...

  4. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    Science.gov (United States)

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  5. A probabilistic framework for image information fusion with an application to mammographic analysis.

    NARCIS (Netherlands)

    Velikova, M.; Lucas, P.J.; Samulski, M.; Karssemeijer, N.

    2012-01-01

    The recent increased interest in information fusion methods for solving complex problems, such as in image analysis, is motivated by the wish to better exploit the multitude of information available from different sources to enhance decision-making. In this paper, we propose a novel method, that ad

  6. The Naval Enlisted Professional Development Information System (NEPDIS): Front End Analysis (FEA) Process. Technical Report 159.

    Science.gov (United States)

    Aagard, James A.; Ansbro, Thomas M.

    The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…

  7. A New Method for Supporting Information Management in Engineering-consulting Companies: Organizational Network Analysis

    OpenAIRE

    Mehdi Jafari Rizi; Ali Shaemi Barzaki; Mohammad Hossein Yarmohammadian

    2014-01-01

    Organizational performance depends on specialized information that is transferred throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. In this case study, we describe a method that has the potential to provide systematic support for information management in engineering-consulting companies. We applied organizational network analysis, a method for studying commun...

  8. The Information Flood in Learning Disabilities: A Bibliometric Analysis of the Journal Literature.

    Science.gov (United States)

    Summers, Edward G.

    1986-01-01

    A bibliometric analysis of 2,270 journal articles on learning disabilities published in 248 journals 1968-1983 reveals the most frequently cited core journals publishing the largest percentage of articles, the relative scatter of information across journal sources, the increase in information and the temporal distribution of articles, and the most…

  9. Self-Informant Agreement in Well-Being Ratings: A Meta-Analysis

    Science.gov (United States)

    Schneider, Leann; Schimmack, Ulrich

    2009-01-01

    A meta-analysis of published studies that reported correlations between self-ratings and informant ratings of well-being (life-satisfaction, happiness, positive affect, negative affect) was performed. The average self-informant correlation based on 44 independent samples and 81 correlations for a total of 8,897 participants was r = 0.42 [99%…
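
    The pooling machinery behind such a meta-analytic average is typically Fisher's z transform with inverse-variance weights; the sketch below illustrates it on invented correlations and sample sizes, not the study's data.

        import numpy as np

        r = np.array([0.38, 0.45, 0.51, 0.33])  # invented self-informant correlations
        n = np.array([120, 85, 210, 60])        # invented sample sizes

        z = np.arctanh(r)   # Fisher z transform of each correlation
        w = n - 3           # inverse-variance weights, since var(z) ~ 1/(n-3)
        z_bar = np.sum(w * z) / np.sum(w)
        se = 1.0 / np.sqrt(np.sum(w))

        # back-transform the pooled z and its 99% confidence bounds to r
        lo, hi = np.tanh(z_bar - 2.576 * se), np.tanh(z_bar + 2.576 * se)
        print(f"pooled r = {np.tanh(z_bar):.3f}, 99% CI [{lo:.3f}, {hi:.3f}]")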

  10. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
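
    A minimal sketch of the entropy and mutual-information measures named above, estimated from discretised outcomes of multiple model realisations at two locations; the simulated binary outcomes stand in for real model runs.

        import numpy as np

        rng = np.random.default_rng(1)

        # Discretised outcomes at two locations over 1000 realisations;
        # B copies A 80% of the time, so the two locations share information.
        a = rng.integers(0, 2, size=1000)
        b = np.where(rng.random(1000) < 0.8, a, 1 - a)

        def entropy(counts):
            p = counts / counts.sum()
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        joint = np.histogram2d(a, b, bins=2)[0]
        H_A, H_B, H_AB = entropy(joint.sum(axis=1)), entropy(joint.sum(axis=0)), entropy(joint)

        mutual_info = H_A + H_B - H_AB  # I(A;B): information shared by the two areas
        H_B_given_A = H_AB - H_A        # uncertainty left at B once A is known
        print(f"H(A)={H_A:.3f}  H(B)={H_B:.3f}  I(A;B)={mutual_info:.3f}  H(B|A)={H_B_given_A:.3f}")

    The conditional entropy H(B|A) is exactly the quantity the paper uses to say how much observing one location would reduce uncertainty at another.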

  11. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  12. Transit light curves with finite integration time: Fisher information analysis

    Energy Technology Data Exchange (ETDEWEB)

    Price, Ellen M. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Rogers, Leslie A. [California Institute of Technology, MC249-17, 1200 East California Boulevard, Pasadena, CA 91125 (United States)

    2014-10-10

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
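
    The Fisher-matrix machinery can be sketched generically: build the Jacobian of a time-averaged light-curve model by finite differences, form F = J^T J / sigma^2, and invert to get parameter covariances. The toy Gaussian-dip "transit" model and all numbers below are placeholders, not the authors' model or code.

        import numpy as np

        def model(t, p):
            t0, depth, dur = p  # mid-transit time, depth, duration (days)
            return 1.0 - depth * np.exp(-0.5 * ((t - t0) / (dur / 2)) ** 2)

        def binned_model(t, p, cadence=30.0 / 60.0 / 24.0, n_sub=30):
            """Average the model over the integration time (30 min cadence)."""
            offsets = ((np.arange(n_sub) + 0.5) / n_sub - 0.5) * cadence
            return np.mean([model(t + o, p) for o in offsets], axis=0)

        t = np.linspace(-0.2, 0.2, 400)   # observation times (days)
        p0 = np.array([0.0, 1e-3, 0.1])   # true parameters
        sigma = 1e-4                      # per-point photometric noise

        # Finite-difference Jacobian, Fisher matrix F = J^T J / sigma^2,
        # and parameter covariance as the inverse of F.
        steps = 1e-5 * np.array([1.0, 1e-3, 0.1])
        J = np.column_stack([
            (binned_model(t, p0 + dp) - binned_model(t, p0 - dp)) / (2.0 * h)
            for h, dp in zip(steps, np.diag(steps))
        ])
        cov = np.linalg.inv(J.T @ J / sigma**2)
        print("1-sigma uncertainties on (t0, depth, dur):", np.sqrt(np.diag(cov)))

    Rerunning the sketch with a shorter cadence shrinks the predicted uncertainties on the timing parameters, which is the binning effect the paper quantifies analytically.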

  13. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part II, Risk-informed approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Ahn, Sang Kyu; Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    Technical insights and findings from a critical review of deterministic approaches typically applied to ensure the design safety of nuclear power plants were presented in the companion paper of Part I included in this issue. In this paper we discuss the risk-informed approaches that have been proposed to make a safety case for advanced reactors, including Generation-IV reactors such as the Modular High-Temperature Gas-cooled Reactor (MHTGR), the Pebble Bed Modular Reactor (PBMR), and the Sodium-cooled Fast Reactor (SFR). Also considered herein are a risk-informed safety analysis approach suggested by Westinghouse as a means to improve the conventional accident analysis, together with the Technology Neutral Framework recently developed by the US Nuclear Regulatory Commission as a high-level regulatory infrastructure for the safety evaluation of any type of reactor design. The insights from a comparative review of various deterministic and risk-informed approaches could be useful in developing a new licensing architecture for enhanced safety of evolutionary or advanced plants.

  14. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    While looking for information in scientific databases, we are overwhelmed by the amount of information we encounter. In this big data collection, obtaining information with added value could be strategic for nuclear verification. In our study, we have worked on "best practices" in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is made with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store the data generated by bibliographic research, and analyzing the data with selected tools. With analysis of bibliographic data alone, we are able to obtain a panoramic view of the countries that publish in the paradigm, co-publication networks, the organizations that contribute to scientific publications, the countries with which a country collaborates, and the areas of interest of a country. So we are able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are important; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness of the use of open source S&T information and of the management of that information over time. Examples are shown. (author)

  15. Information extraction from topographic map using colour and shape analysis

    Indian Academy of Sciences (India)

    Nikam Gitanjali Ganpatrao; Jayanta Kumar Ghosh

    2014-10-01

    The work presented in this paper is related to symbol and toponym understanding, with application to scanned Indian topographic maps. The proposed algorithm deals with colour layer separation of the enhanced topographic map using k-means colour segmentation, followed by outline detection and chaining, respectively. Outline detection is performed through linear filtering using the Canny edge detector. Each outline is then encoded as a Freeman chain code, and the x-y offsets are used to obtain a complex representation of the outline. Final matching of shapes is done by computing Fourier descriptors from the chain codes; comparison of descriptors having the same colour index is embedded in a normalized scalar product of descriptors. As this matching process is not rotation invariant (starting point selection), an interrelation function has been proposed to make the method shift invariant. The recognition rates of symbols, letters and numbers are 84.68, 91.73 and 92.19%, respectively. The core contribution is dedicated to a shape analysis method based on contouring and Fourier descriptors. To improve the recognition rate, obtaining an optimal segmentation solution for complex topographic maps will be the future scope of this work.
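
    A compact sketch of the descriptor-matching step: a closed outline in complex form is reduced to Fourier-descriptor magnitudes (invariant to translation, rotation and starting point) and two shapes are compared with a normalised scalar product. The synthetic ellipse outlines below stand in for extracted symbol contours.

        import numpy as np

        def ellipse_outline(a, b, n=256, start=0.0, angle=0.0):
            """Closed outline as complex points; start shifts the starting
            point along the contour and angle rotates the whole shape."""
            t = np.linspace(0, 2 * np.pi, n, endpoint=False) + start
            return (a * np.cos(t) + 1j * b * np.sin(t)) * np.exp(1j * angle)

        def fourier_descriptors(z, n_coeff=10):
            c = np.fft.fft(z) / len(z)
            # keep +/- frequencies, drop the DC term (translation invariance);
            # magnitudes are unaffected by rotation and starting point
            mags = np.abs(np.concatenate([c[1:n_coeff + 1], c[-n_coeff:]]))
            return mags / mags[0]  # scale invariance

        def match(z1, z2):
            d1, d2 = fourier_descriptors(z1), fourier_descriptors(z2)
            return float(np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2)))

        same = match(ellipse_outline(2, 1), ellipse_outline(2, 1, start=0.9, angle=0.5))
        different = match(ellipse_outline(2, 1), ellipse_outline(1, 1))
        print(f"same shape: {same:.3f}   different shape: {different:.3f}")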

  16. Graph analysis of dream reports is especially informative about psychosis.

    Science.gov (United States)

    Mota, Natália B; Furtado, Raimundo; Maia, Pedro P C; Copelli, Mauro; Ribeiro, Sidarta

    2014-01-01

    Early psychiatry investigated dreams to understand psychopathologies. Contemporary psychiatry, which neglects dreams, has been criticized for lack of objectivity. In search of quantitative insight into the structure of psychotic speech, we investigated speech graph attributes (SGA) in patients with schizophrenia, bipolar disorder type I, and non-psychotic controls as they reported waking and dream contents. Schizophrenic subjects spoke with reduced connectivity, in tight correlation with negative and cognitive symptoms measured by standard psychometric scales. Bipolar and control subjects were indistinguishable by waking reports, but in dream reports bipolar subjects showed significantly less connectivity. Dream-related SGA outperformed psychometric scores or waking-related data for group sorting. Altogether, the results indicate that online and offline processing, the two most fundamental modes of brain operation, produce nearly opposite effects on recollections: While dreaming exposes differences in the mnemonic records across individuals, waking dampens distinctions. The results also demonstrate the feasibility of the differential diagnosis of psychosis based on the analysis of dream graphs, pointing to a fast, low-cost and language-invariant tool for psychiatric diagnosis and the objective search for biomarkers. The Freudian notion that "dreams are the royal road to the unconscious" is clinically useful, after all.
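
    The speech-graph construction can be illustrated in a few lines: words become nodes, transitions between consecutive words become directed edges, and connectivity attributes summarise the structure of the report. The two "reports" below are invented examples, not patient data.

        import networkx as nx

        def speech_graph(report: str) -> nx.DiGraph:
            """Nodes are words; edges link consecutive words in the report."""
            words = report.lower().split()
            g = nx.DiGraph()
            g.add_edges_from(zip(words, words[1:]))
            return g

        def connectivity(g: nx.DiGraph) -> dict:
            return {
                "nodes": g.number_of_nodes(),
                "edges": g.number_of_edges(),
                "largest_weak_component": len(max(nx.weakly_connected_components(g), key=len)),
                "largest_strong_component": len(max(nx.strongly_connected_components(g), key=len)),
            }

        recurrent = "i was home and then i was at the lake and then i was home again"
        linear = "first we drove out to the lake then we swam then we dried off and left"
        for name, text in (("recurrent", recurrent), ("linear", linear)):
            print(name, connectivity(speech_graph(text)))

    A report that keeps returning to the same words produces large strongly connected components, while a report that moves linearly through its content does not; attributes of this kind are what the study compares across groups.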

  17. Graph analysis of dream reports is especially informative about psychosis

    Science.gov (United States)

    Mota, Natália B.; Furtado, Raimundo; Maia, Pedro P. C.; Copelli, Mauro; Ribeiro, Sidarta

    2014-01-01

    Early psychiatry investigated dreams to understand psychopathologies. Contemporary psychiatry, which neglects dreams, has been criticized for lack of objectivity. In search of quantitative insight into the structure of psychotic speech, we investigated speech graph attributes (SGA) in patients with schizophrenia, bipolar disorder type I, and non-psychotic controls as they reported waking and dream contents. Schizophrenic subjects spoke with reduced connectivity, in tight correlation with negative and cognitive symptoms measured by standard psychometric scales. Bipolar and control subjects were indistinguishable by waking reports, but in dream reports bipolar subjects showed significantly less connectivity. Dream-related SGA outperformed psychometric scores or waking-related data for group sorting. Altogether, the results indicate that online and offline processing, the two most fundamental modes of brain operation, produce nearly opposite effects on recollections: While dreaming exposes differences in the mnemonic records across individuals, waking dampens distinctions. The results also demonstrate the feasibility of the differential diagnosis of psychosis based on the analysis of dream graphs, pointing to a fast, low-cost and language-invariant tool for psychiatric diagnosis and the objective search for biomarkers. The Freudian notion that "dreams are the royal road to the unconscious" is clinically useful, after all.

  18. Information Flow Through Stages of Complex Engineering Design Projects: A Dynamic Network Analysis Approach

    DEFF Research Database (Denmark)

    Parraguez, Pedro; Eppinger, Steven D.; Maier, Anja

    2015-01-01

    The pattern of information flow through the network of interdependent design activities is thought to be an important determinant of engineering design process results. A previously unexplored aspect of such patterns relates to the temporal dynamics of information transfer between activities...... information flows between activities in complex engineering design projects; 2) we show how the network of information flows in a large-scale engineering project evolved over time and how network analysis yields several managerial insights; and 3) we provide a useful new representation of the engineering...... design process and thus support theory-building toward the evolution of information flows through systems engineering stages. Implications include guidance on how to analyze and predict information flows as well as better planning of information flows in engineering design projects according...

  19. A Real-Time and Dynamic Biological Information Retrieval and Analysis System (BIRAS)

    Institute of Scientific and Technical Information of China (English)

    QiZhou; HongZhang; MeiyingGeng; ChenggangZhang

    2003-01-01

    The aim of this study is to design a biological information retrieval and analysis system (BIRAS) based on the Internet. Using a specific network protocol, the BIRAS system can send and receive information from the Entrez search and retrieval system maintained by the National Center for Biotechnology Information (NCBI) in the USA. Literature, nucleotide sequences, protein sequences, and other resources matching a user-defined term can then be retrieved and sent to the user automatically, by pop-up message or by E-mail, using the BIRAS system. All the information retrieving and analyzing processes are done in real time. As a robust system for intelligently and dynamically retrieving and analyzing user-defined information, it is believed that BIRAS will be extensively used to retrieve specific information from the large number of biological databases available today. The program is available on request from the corresponding author.

  20. A Real-Time and Dynamic Biological Information Retrieval and Analysis System (BIRAS)

    Institute of Scientific and Technical Information of China (English)

    Qi Zhou; Hong Zhang; Meiying Geng; Chenggang Zhang

    2003-01-01

    The aim of this study is to design a biological information retrieval and analysis system (BIRAS) based on the Internet. Using a specific network protocol, the BIRAS system can send and receive information from the Entrez search and retrieval system maintained by the National Center for Biotechnology Information (NCBI) in the USA. Literature, nucleotide sequences, protein sequences, and other resources matching a user-defined term can then be retrieved and sent to the user automatically, by pop-up message or by E-mail, using the BIRAS system. All the information retrieving and analyzing processes are done in real time. As a robust system for intelligently and dynamically retrieving and analyzing user-defined information, it is believed that BIRAS will be extensively used to retrieve specific information from the large number of biological databases available today. The program is available on request from the corresponding author.

  1. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; de Jong, Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of
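
    For readers unfamiliar with the measures such audits rely on, the sketch below computes network density and two centrality measures for a fabricated set of employee communication ties.

        import networkx as nx

        # Fabricated employee communication ties.
        ties = [("ana", "ben"), ("ana", "cem"), ("ben", "cem"),
                ("cem", "dee"), ("dee", "eva"), ("eva", "fay")]
        g = nx.Graph(ties)

        print("density:", round(nx.density(g), 3))            # share of possible ties present
        print("degree:", nx.degree_centrality(g))             # who holds many ties
        print("betweenness:", nx.betweenness_centrality(g))   # who brokers between groups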

  2. REAL-TIME ENERGY INFORMATION AND CONSUMER BEHAVIOR: A META-ANALYSIS AND FORECAST

    Science.gov (United States)

    The meta-analysis of literature and program results will shed light on potential causes of study-to-study variation in information feedback programs and trials. Outputs from the meta-analysis, such as price elasticity, will be used in NEMS to estimate the impact of a nation...

  3. BGI-RIS: an integrated information resource and comparative analysis workbench for rice genomics

    DEFF Research Database (Denmark)

    Zhao, Wenming; Wang, Jing; He, Ximiao;

    2004-01-01

    Rice is a major food staple for the world's population and serves as a model species in cereal genome research. The Beijing Genomics Institute (BGI) has long been devoting itself to sequencing, information analysis and biological research of the rice and other crop genomes. In order to facilitate the application of the rice genomic information and to provide a foundation for functional and evolutionary studies of other important cereal crops, we implemented our Rice Information System (BGI-RIS), the most up-to-date integrated information resource as well as a workbench for comparative genomic analysis. In addition to comprehensive data from Oryza sativa L. ssp. indica sequenced by BGI, BGI-RIS also hosts carefully curated genome information from Oryza sativa L. ssp. japonica and EST sequences available from other cereal crops. In this resource, sequence contigs of indica (93-11) have been further assembled...

  4. ID201202961, DOE S-124,539, Information Security Analysis Using Game Theory and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K. [ORNL]; Schlicher, Bob G. [ORNL]

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified against the results from game theory analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to overcome the limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game begins, and that the players' actions are always synchronous; moreover, most models do not scale with the size and complexity of the systems under consideration. Our use of ABMs yields results from selected experiments that demonstrate the proposed approach and provide a quantitative measure for realistic information systems and their related security scenarios.
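
    One way to see the game-theoretic verification step is on a deliberately tiny example: a two-action attacker/defender zero-sum game solved by fictitious play, the kind of equilibrium an agent-based simulation could be checked against. The payoff matrix below is invented.

        import numpy as np

        # Defender payoffs (zero-sum): rows = defender action (patch A, patch B),
        # columns = attacker action (attack A, attack B). Values are invented.
        D = np.array([[ 1.0, -2.0],
                      [-3.0,  2.0]])

        counts_def, counts_att = np.ones(2), np.ones(2)
        for _ in range(20000):
            # each side best-responds to the opponent's empirical mixture
            att_mix = counts_att / counts_att.sum()
            def_mix = counts_def / counts_def.sum()
            counts_def[np.argmax(D @ att_mix)] += 1   # defender maximises own payoff
            counts_att[np.argmin(def_mix @ D)] += 1   # attacker minimises defender payoff

        print("defender mixture:", np.round(counts_def / counts_def.sum(), 3))
        print("attacker mixture:", np.round(counts_att / counts_att.sum(), 3))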

  5. Analysis Of Factors Affecting The Success Of The Application Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Deni Iskandar

    2015-02-01

    Full Text Available The purpose of this study was to find solutions to problems related to the quality of accounting information systems and the quality of accounting information when connected with management commitment, user competency and organizational culture. The research was conducted through deductive analysis supported by observation of the phenomenon, and then sought evidence through empirical facts, especially about the effect of management commitment, user competence and organizational culture on the quality of accounting information systems and their impact on the quality of accounting information. This research was conducted at State-Owned Enterprises (SOEs).

  6. Citation analysis in journal rankings: medical informatics in the library and information science literature.

    Science.gov (United States)

    Vishwanatham, R

    1998-10-01

    Medical informatics is an interdisciplinary field. Medical informatics articles will be found in the literature of various disciplines including library and information science publications. The purpose of this study was to provide an objectively ranked list of journals that publish medical informatics articles relevant to library and information science. Library Literature, Library and Information Science Abstracts, and Social Science Citation Index were used to identify articles published on the topic of medical informatics and to identify a ranked list of journals. This study also used citation analysis to identify the most frequently cited journals relevant to library and information science.

  7. The US Support Program Assistance to the IAEA Safeguards Information Technology, Collection, and Analysis 2008

    Energy Technology Data Exchange (ETDEWEB)

    Tackentien, J.

    2008-06-12

    One of the United States Support Program's (USSP) priorities for 2008 is to support the International Atomic Energy Agency's (IAEA) development of an integrated and efficient safeguards information infrastructure, including reliable and maintainable information systems, and effective tools and resources to collect and analyze safeguards-relevant information. The USSP has provided funding in support of this priority for the ISIS Re-engineering Project (IRP), and for human resources support to the design and definition of the enhanced information analysis architecture project (nVision). Assistance for several other information technology efforts is provided. This paper will report on the various ongoing support measures undertaken by the USSP to support the IAEA's information technology enhancements and will provide some insights into activities that the USSP may support in the future.

  8. A comparative study of information diffusion in weblogs and microblogs based on social network analysis

    Institute of Scientific and Technical Information of China (English)

    Yang ZHANG; Wanyang LING

    2012-01-01

    Purpose: This paper intends to explore a quantitative method for investigating the characteristics of information diffusion through social media like weblogs and microblogs. By using social network analysis methods, we attempt to analyze the different characteristics of information diffusion in weblogs and microblogs as well as the possible reasons for these differences. Design/methodology/approach: Using social network analysis methods, this paper carries out an empirical study by taking the Chinese weblogs and microblogs in the field of Library and Information Science (LIS) as the research sample and employing measures such as network density, core/periphery structure and centrality. Findings: Firstly, both bloggers and microbloggers maintain weak ties, and both of their social networks display a small-world effect. Secondly, compared with weblog users, microblog users are more interconnected, more equal and more capable of developing relationships with people outside their own social networks. Thirdly, the microblogging social network is more conducive to information diffusion than the blogging network, because of their differences in functions and the information flow mechanism. Finally, the communication mode that emerged with microblogging, with its characteristics of micro-content, multi-channel information dissemination, dense and decentralized social networks and content aggregation, will be one of the trends in the development of information exchange platforms in the future. Research limitations: The sample size needs to be increased so that samples are more representative. Errors may exist during the data collection. Moreover, the individual-level characteristics of the samples as well as the types of information exchanged need to be further studied. Practical implications: This preliminary study explores the characteristics of information diffusion in the network environment and verifies the feasibility of conducting a quantitative analysis of information diffusion through social

  9. A New Classification Analysis of Customer Requirement Information Based on Quantitative Standardization for Product Configuration

    Directory of Open Access Journals (Sweden)

    Zheng Xiao

    2016-01-01

    Full Text Available Traditional methods used for the classification of customer requirement information are typically based on specific indicators, hierarchical structures, and data formats and involve a qualitative analysis in terms of stationary patterns. Because these methods neither consider the scalability of classification results nor their subsequent application to product configuration, their classification becomes an isolated operation. However, the transformation of customer requirement information into quantifiable values would lead to a dynamic classification according to specific conditions and would enable an association with product configuration in an enterprise. This paper introduces a classification analysis based on quantitative standardization, which focuses on (i) expressing customer requirement information mathematically and (ii) classifying customer requirement information for product configuration purposes. Our classification analysis treats customer requirement information as follows: first, it is transformed into standardized values mathematically, after which it is classified by calculating its dissimilarity with the general customer requirement information related to the product family. Finally, a case study is used to demonstrate and validate the feasibility and effectiveness of the classification analysis.
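
    A hedged sketch of the two steps named above: requirement vectors are standardised so that no unit dominates, then each is assigned to the product-family profile with the smallest dissimilarity. The feature names, profile values and Euclidean measure below are illustrative choices, not the paper's exact formulation.

        import numpy as np

        # Incoming requirements: columns might be capacity, speed, budget.
        requirements = np.array([[120.0, 3.2, 800.0],
                                 [ 60.0, 1.1, 300.0],
                                 [115.0, 2.9, 750.0]])

        # Reference profiles for two product-family configurations.
        profiles = {"high-end": np.array([125.0, 3.0, 820.0]),
                    "entry":    np.array([ 55.0, 1.0, 280.0])}

        # Standardise so no single unit dominates the dissimilarity.
        mu, sd = requirements.mean(axis=0), requirements.std(axis=0)
        z = lambda x: (x - mu) / sd

        for req in requirements:
            dissim = {name: float(np.linalg.norm(z(req) - z(ref)))
                      for name, ref in profiles.items()}
            print(req, "->", min(dissim, key=dissim.get))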

  10. A generalized rough set-based information filling technique for failure analysis of thruster experimental data

    Institute of Scientific and Technical Information of China (English)

    Han Shan; Zhu Qiang; Li Jianxun; Chen Lin

    2013-01-01

    Interval-valued data and incomplete data are two key problems in the failure analysis of thruster experimental data, and both are addressed by the methods proposed in this paper. Firstly, information data acquired from the simulation and evaluation system, formed as an interval-valued information system (IIS), is classified by the interval similarity relation. Then, as an improvement on the classical rough set, a new kind of generalized information entropy called "H0-information entropy" is suggested for the measurement of the uncertainty and the classification ability of an IIS. An innovative information-filling technique uses the properties of H0-information entropy to replace missing data with smaller estimation intervals. Finally, an improved method of failure analysis synthesizing the above achievements is presented to classify the thruster experimental data, complete the information, and extract the failure rules. The feasibility and advantages of this method are testified by an actual application of failure analysis, whose performance is evaluated by the quantification of E-condition entropy.
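
    As a rough illustration of classifying interval-valued records by a similarity relation (a simplified stand-in for the paper's interval similarity relation, not its H0-entropy machinery), the sketch below scores interval overlap against union length and groups records above a threshold; the intervals and threshold are invented.

        # Similarity of two intervals = overlap length / union length.
        def interval_similarity(a, b):
            overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
            union = max(a[1], b[1]) - min(a[0], b[0])
            return overlap / union if union > 0 else 1.0

        readings = [(9.8, 10.4), (10.1, 10.6), (14.9, 15.3), (15.0, 15.6)]
        threshold = 0.2  # group records whose similarity reaches this level
        for i, a in enumerate(readings):
            for b in readings[i + 1:]:
                sim = interval_similarity(a, b)
                verdict = "same class" if sim >= threshold else "different"
                print(f"{a} vs {b}: sim = {sim:.2f} -> {verdict}")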

  11. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    Directory of Open Access Journals (Sweden)

    Voicu-Dan Dragomir

    2007-01-01

    Full Text Available This paper disseminates the results of the development process for a financial analysis information system. The system has been conceptually designed using the Unified Modeling Language (UML) and has been implemented in an object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature focuses on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this technique to the DuPont system of analysis concerning the Return on Equity ratio, designing several structural UML diagrams depicting the breakdown and analysis of each financial ratio involved. The resulting computer application offers a flexible approach to the analytical tools: the user is required to introduce the raw data and the system provides both table-style and charted information on the output of the computation. User-friendliness is also a key feature of this particular financial analysis application.
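
    The DuPont breakdown and the chain-substitution variation it builds on are easy to make concrete: in the sketch below, ROE is decomposed as net margin x asset turnover x equity multiplier, and the year-on-year change is attributed factor by factor. The two years' figures are invented.

        def roe(margin, turnover, leverage):
            """Return on equity = net margin * asset turnover * equity multiplier."""
            return margin * turnover * leverage

        prior = dict(margin=0.08, turnover=1.10, leverage=1.60)    # prior year
        current = dict(margin=0.09, turnover=1.05, leverage=1.75)  # current year

        # Chain substitution: swap in the current-year value of one factor
        # at a time and attribute the resulting change in ROE to that factor.
        state, base = dict(prior), roe(**prior)
        for factor in ("margin", "turnover", "leverage"):
            state[factor] = current[factor]
            new = roe(**state)
            print(f"effect of {factor}: {new - base:+.4f}")
            base = new

        print(f"total change: {roe(**current) - roe(**prior):+.4f}")

    Note that the factor effects sum exactly to the total ROE change, which is the property that makes chain substitution attractive for ratio analysis, although the attribution does depend on the substitution order.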

  12. Information

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    There are unstructured abstracts (no more than 256 words) and structured abstracts (no more than 480 words). The specific requirements for structured abstracts are as follows: An informative, structured abstract of no more than 480 words should accompany each manuscript. Abstracts for original contributions should be structured into the following sections. AIM (no more than 20 words): Only the purpose should be included. Please write the aim in the form "To investigate/study/..."; MATERIALS AND METHODS (no more than 140 words); RESULTS (no more than 294 words): You should present P values where appropriate and must provide relevant data to illustrate how they were obtained, e.g. 6.92 ± 3.86 vs 3.61 ± 1.67, P < 0.001; CONCLUSION (no more than 26 words).

  13. The E-net model for the Risk Analysis and Assessment System for the Information Security of Communication and Information Systems ("Defining" Subsystem)

    OpenAIRE

    Stoianov, Nikolai; Aleksandrova, Veselina

    2010-01-01

    This paper presents a proposal that draws on the authors' experience in the development and implementation of information security systems in the Automated Information Systems of the Bulgarian Armed Forces. The architecture of a risk analysis and assessment system for the communication and information system's information security (CIS IS) is presented. An E-net model of the "Defining" Subsystem is proposed as a tool that allows the subsystems to be examined. Such an approach can be applie...

  14. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part I, deterministic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Kyu [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    The objective of this paper and a companion paper in this issue (part II, risk-informed approaches) is to derive technical insights from a critical review of deterministic and risk-informed safety analysis approaches that have been applied to develop licensing requirements for water-cooled reactors, or proposed for safety verification of the advanced reactor design. To this end, a review was made of a number of safety analysis approaches including those specified in regulatory guides and industry standards, as well as novel methodologies proposed for licensing of advanced reactors. This paper and the companion paper present the review insights on the deterministic and risk-informed safety analysis approaches, respectively. These insights could be used in making a safety case or developing a new licensing review infrastructure for advanced reactors including Generation IV reactors.

  15. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the temporal... to a fully realised example of information seeking activity. The core contribution of the proposed method is in supporting the analysis of activity with respect to both macro and micro level temporal interactions between variables.

  16. Static versus Dynamic Data Information Fusion Analysis using DDDAS for Cyber Security Trust

    OpenAIRE

    Erik Blasch; Youssif Al-Nashif; Salim Hariri

    2015-01-01

    Information fusion includes signal-, feature-, and decision-level analysis over various types of data, including imagery, text, and cyber security detection. With the maturity of data processing, the explosion of big data, and the need for user acceptance, the Dynamic Data-Driven Application System (DDDAS) philosophy fosters insights into the usability of information systems solutions. In this paper, we explore a notion of an adaptive adjustment of secure communication...

  17. Task Analysis in Action: The Role of Information Systems in Communicable Disease Reporting

    OpenAIRE

    Pina, Jamie; Turner, Anne; Kwan-Gett, Tao; Duchin, Jeff

    2009-01-01

    In order to improve the design of information systems for notifiable conditions reporting, it is essential to understand the role of such systems in public health practice. Using qualitative techniques, we performed a task analysis of the activities associated with notifiable conditions reporting at a large urban health department. We identified seventeen primary tasks associated with the use of the department’s information system. The results of this investigation suggest that communicable d...

  18. Description of a method to support public health information management: organizational network analysis

    OpenAIRE

    Merrill, Jacqueline; Bakken, Suzanne; Rockoff, Maxine; Gebbie, Kristine; Carley, Kathleen

    2006-01-01

    In this case study we describe a method that has potential to provide systematic support for public health information management. Public health agencies depend on specialized information that travels throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. We applied organizational network analysis, a method for studying communication networks, to assess the method’s utility to supp...

  19. Analysis Of Factors Affecting The Success Of The Application Of Accounting Information System

    OpenAIRE

    Deni Iskandar

    2015-01-01

    Abstract The purpose of this study was to find solutions to problems related to the quality of accounting information systems and the quality of accounting information when connected with management commitment, user competency and organizational culture. The research was conducted through deductive analysis supported by observation of the phenomenon, and then sought evidence through empirical facts, especially about the effect of management commitment, user competence and organizational culture on the quality of accounting...

  20. User satisfaction-based quality evaluation model and survey analysis of network information service

    Institute of Scientific and Technical Information of China (English)

    LEI Xue; JIAO Yuying

    2009-01-01

    On the basis of user satisfaction, the authors made research hypotheses by learning from relevant e-service quality evaluation models. A questionnaire survey was then conducted on some content-based websites in terms of their convenience, information quality, personalization and site aesthetics, which may affect the overall satisfaction of users. Statistical analysis was also made to build a user satisfaction-based quality evaluation system for network information services.

  1. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)]

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  2. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  3. Open Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA's strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA's Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new center of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is pro-actively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and will likely continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multidisciplinary analytic integration to strengthen confidence in safeguards conclusions especially regarding the absence of undeclared nuclear materials and activities. (authors)

  4. Software and Information Life Cycle (SILC) for the Integrated Information Services Organization. Analysis and implementation phase adaptations of the Sandia software guidelines: Issue A, April 18, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, D.; Cassidy, A.; Cuyler, D. [and others

    1995-07-01

    This document describes the processes to be used for creating corporate information systems within the scope of the Integrated Information Services (IIS) Center. Issue A describes the Analysis and Implementation phases within the context of the entire life cycle. Appendix A includes a full set of examples of the analysis set deliverables. Subsequent issues will describe the other life cycle processes as we move toward enterprise-level management of information assets, including information meta-models and an integrated corporate information model. The analysis phase as described here, when combined with a specifications repository, will provide the basis for future reusable components and improve traceability of information system specifications to enterprise business rules.

  5. Quantifying information transfer by protein domains: Analysis of the Fyn SH2 domain structure

    DEFF Research Database (Denmark)

    Lenaerts, Tom; Ferkinghoff-Borg, Jesper; Stricher, Francois;

    2008-01-01

    instance of communication over a noisy channel. In particular, we analyze the conformational correlations between protein residues and apply the concept of mutual information to quantify information exchange. Mapping out changes of mutual information on the protein structure then allows visualizing how...... distal communication is achieved. We illustrate the approach by analyzing information transfer by the SH2 domain of Fyn tyrosine kinase, obtained from Monte Carlo dynamics simulations. Our analysis reveals that the Fyn SH2 domain forms a noisy communication channel that couples residues located...... by crossing the core of the SH2 domain. Conclusion: As a result, our method provides a means to directly map the exchange of biological information on the structure of protein domains, making it clear how binding triggers conformational changes in the protein structure. As such it provides a structural road...
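
    The mutual-information mapping described in this record can be illustrated with a minimal sketch. This is not the authors' simulation pipeline; it only shows, for two synthetic "residue coordinates", how mutual information is estimated from binned samples (all data and names here are invented for illustration):

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Plug-in estimate of mutual information (bits) between two
    discretized conformational coordinates, e.g. residue dihedrals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Toy "trajectory": coordinate b partially follows a, c is independent.
rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = 0.7 * a + 0.3 * rng.normal(size=5000)
c = rng.normal(size=5000)
print(mutual_information(a, b))  # clearly positive: coupled residues
print(mutual_information(a, c))  # near zero: no information exchange
```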

  6. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    Science.gov (United States)

    Xin, Wang

    Based on an analysis of the shortcomings of existing systems and a detailed investigation, an urban planning and management information system is designed as a three-tier structure using a C/S (client/server) architecture over a LAN. The system's functions are designed in accordance with the architectural requirements and the functional relationships between modules. The relevant interfaces are analyzed and designed, and data storage solutions are proposed. The design provides a viable construction program for planning information systems in small and medium-sized cities.

  7. Methods of sports genetics: dermatoglyphic analysis of human palmarprints (information 2

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-01-01

    Full Text Available Information on the dermatoglyphic analysis of human palms is summarized. Quantitative dermatoglyphic indexes of the palms are presented for young men and women of the Podol region of Ukraine. Quantitative indexes of palm dermatoglyphics are also shown for young men and women of Ukrainian and Russian nationality in Kharkov. The most informative dermatoglyphic indexes of the palms that can be used in sports genetics are identified. Recommendations are given on the technology of dermatoglyphic analysis of human palms in sports genetics.

  8. RACLOUDS - Model for Clouds Risk Analysis in the Information Assets Context

    Directory of Open Access Journals (Sweden)

    SILVA, P. F.

    2016-06-01

    Full Text Available Cloud computing offers benefits in terms of availability and cost, but transfers the responsibility for information security management to the cloud service provider. The consumer thus loses control over the security of their information and services. This factor has discouraged the migration to cloud computing in many businesses. This paper proposes a model whereby the cloud consumer can perform risk analysis on providers both before and after contracting the service. The proposed model establishes the responsibilities of three actors: Consumer, Provider and Security Labs. The inclusion of the Security Labs actor lends more credibility to the risk analysis, making the results more consistent for the consumer.

  9. APPLICATION OF INFORMATION THEORY AND A.S.C. ANALYSIS FOR EXPERIMENTAL RESEARCH IN NUMBER THEORY

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2014-03-01

    Full Text Available Is it possible to automate the study of the properties of numbers and their relationships so that the results of this study can be formulated in the form of statements indicating the specific quantity of information stored in them? To answer this question, it is proposed to apply to the properties of numbers in number theory the same method that has been widely tested and proven in studies of real objects and their relations in various fields, namely the automated system-cognitive analysis (A.S.C. analysis), based on information theory.

  10. SETUP OF RESOLUTIVE CRITERION FOR SEDIMENT-RELATED DISASTER WARNING INFORMATION USING LOGISTIC REGRESSION ANALYSIS

    Science.gov (United States)

    Sugihara, Shigemitsu; Shinozaki, Tsuguhiro; Ohishi, Hiroyuki; Araki, Yoshinori; Furukawa, Kohei

    It is difficult to deregulate sediment-related disaster warning information because it is difficult to quantify the risk of disaster after heavy rain. If the risk can be quantified according to the rain situation, it can serve as an indication for deregulation. In this study, using logistic regression analysis, we quantified the risk according to the rain situation as the probability of disaster occurrence, and we analyzed the setup of a resolutive criterion for sediment-related disaster warning information. As a result, we improved the convenience of the method for evaluating the probability of disaster occurrence, which is useful for providing information in imminent situations.
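
    As a rough illustration of the approach in this record, the sketch below fits a logistic regression that maps rainfall indices to a probability of disaster occurrence. The feature names and numbers are hypothetical placeholders, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per rainfall event.
# Features: 60-min rainfall intensity (mm/h) and soil-water index (mm);
# label: 1 if a sediment-related disaster occurred after the event.
X = np.array([[10, 80], [25, 120], [60, 180], [15, 90],
              [70, 200], [40, 150], [5, 60], [55, 170]])
y = np.array([0, 0, 1, 0, 1, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Probability of disaster occurrence for a new rain situation,
# usable as a quantitative criterion for issuing or lifting warnings.
print(model.predict_proba([[50, 160]])[0, 1])
```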

  11. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

    Full Text Available The World Wide Web has become an enormous repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern WEB 2.0 web pages, where the HTML tree is often dynamically modified by various JavaScript code, new data are added by asynchronous requests to the web server, and elements are positioned with the help of cascading style sheets, is a difficult task. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian

  12. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit-scoring indexes. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.

  13. Analysis of Nuclear Relevant Information on International Procurement and Industrial Activities for Safeguards Purposes

    International Nuclear Information System (INIS)

    Through the use of information on trade and industry, analysts in the Department of Safeguards create an understanding of relevant technological capabilities available to States with safeguards agreements in force and the nuclear related equipment and materials they can make use of, either through indigenous manufacture or import. This information gives a valuable independent input into the consistency analysis of States' declarations and may identify inconsistencies or provide indicators of possible undeclared activities. Information on procurement attempts of potential safeguards relevance is made available to the Department through the voluntary support of several Member States. These provide complete and original primary details on enquiries that reach expert suppliers of nuclear relevant goods in the respective Member States, enquiries that may not adequately declare the intended end use of the goods. Information on export/import activities (EXIM) is collected from a variety of publicly available statistical trade databases. These provide details on trade flows of commodities between States. The information is categorized according to the World Customs Organization's universal product nomenclature: the Harmonized System (HS). Querying relevant HS codes allows analysis of EXIM information for indicators of safeguards relevance, providing insight into potential safeguards relevant capabilities, resources or activities. Surveys of nuclear relevant manufacturing capabilities of States are performed by collecting information from publicly available business directories. Such information is then further refined by identifying the actual activities of the individual manufacturers and suppliers of interest. This survey provides valuable knowledge on the technical capabilities of States. This paper will discuss the most important types of information used, clarify why they are relevant, and describe the methodologies now routinely used in the Department of Safeguards.
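
    The HS-code querying step can be pictured with a small sketch. The trade records below are invented, and the "relevant" code set is only an illustration (HS heading 8401 does cover nuclear reactors and related machinery, but a real screening list would be far larger):

```python
import pandas as pd

# Hypothetical EXIM extract: trade flows keyed by Harmonized System codes.
trade = pd.DataFrame({
    "hs_code": ["8401.10", "8401.20", "8504.40", "9027.80"],
    "exporter": ["A", "B", "A", "C"],
    "importer": ["X", "X", "Y", "X"],
    "value_usd": [1.2e6, 3.4e5, 8.0e5, 2.1e5],
})

# Illustrative safeguards-relevant HS codes (nuclear reactors; isotopic
# separation machinery). Placeholder list, not an official screening set.
relevant = {"8401.10", "8401.20"}
flags = trade[trade["hs_code"].isin(relevant)]
print(flags.groupby("importer")["value_usd"].sum())  # flows worth review
```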

  14. Methods of Sports Genetics: dermatoglyphic analysis of human fingerprints (information 1

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-02-01

    Full Text Available The article provides data on the dermatoglyphic analysis of human fingerprints. The most informative dermatoglyphic traits of fingerprints are defined. They can be used as genetic markers to predict sporting aptitude. Recommendations on using the technology of dermatoglyphic analysis of human fingerprints in sports genetics are given. There are certain national and racial differences in the phenotypical expression of digit-pattern dermatoglyphics.

  15. Near-Real-Time Analysis of Publicly Communicated Disaster Response Information

    Science.gov (United States)

    Girard, Trevor

    2015-04-01

    During a disaster situation the public will need to take critical actions regarding what to do, where to go, how to get there, and so on. The more informed the public is, the better the actions they are able to take, resulting in reduced disaster impacts. The criteria for what information to provide the public need to change depending on the specific needs of the disaster affected population. The method of dissemination also needs to match the communication channels that the public typically uses in disaster situations. This research project investigates the dynamic information needs of disaster affected populations and how information leads to actions. The purpose of the research project is to identify key indicators for measuring how well informed the public is during disasters. The indicators are limited to those which can be observed as communication is happening (i.e., in near-real-time). This allows the indicators to be analyzed as disaster situations unfold, deficiencies to be identified, and recommendations to be made to potentially improve communication while the response is still underway. The end goal of the research is to improve the ability of communicators to inform disaster affected communities. A classification scheme has been developed to categorize the information provided to the public during disasters. Under each category is a set of typical questions that the information should answer. These questions are the result of a best-observed-practice review of the information available during 11 disasters. For example, under the category 'Life Saving Response', the questions which should be answered are who is doing what (Evacuation, SAR), where and when, and the amount of the affected communities' needs being covered by these actions. Review of what questions remain unanswered acts as the first indicator, referred to as an 'Information Gap Analysis'. Comparative analysis of the information within categories, between categories, and between similar

  16. Performing meta-analysis with incomplete statistical information in clinical trials

    Directory of Open Access Journals (Sweden)

    Hunter Anthony

    2008-08-01

    Full Text Available Abstract Background Results from clinical trials are usually summarized in the form of sampling distributions. When full information (mean, SEM) about these distributions is given, performing meta-analysis is straightforward. However, when some of the sampling distributions only have mean values, a challenging issue is to decide how to use such distributions in meta-analysis. Currently, the most common approaches are either ignoring such trials or, for each trial with a missing SEM, finding a similar trial and taking its SEM value as the missing SEM. Both approaches have drawbacks. As an alternative, this paper develops and tests two new methods, the first being the prognostic method and the second being the interval method, to estimate any missing SEMs from a set of sampling distributions with full information. A merging method is also proposed to handle clinical trials with partial information to simulate meta-analysis. Methods Both of our methods use the assumption that the samples for which the sampling distributions will be merged are randomly selected from the same population. In the prognostic method, we predict the missing SEMs from the given SEMs. In the interval method, we define intervals that we believe will contain the missing SEMs and then we use these intervals in the merging process. Results Two sets of clinical trials are used to verify our methods. One family of trials is on comparing different drugs for reduction of low density lipoprotein cholesterol (LDL) for Type-2 diabetes, and the other is about the effectiveness of drugs for lowering intraocular pressure (IOP). Both methods are shown to be useful for approximating the conventional meta-analysis including trials with incomplete information. For example, the meta-analysis result of Latanoprost versus Timolol on IOP reduction for six months provided in [1] was 5.05 ± 1.15 (Mean ± SEM) with full information. If the last trial in this study is assumed to be with partial information
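
    The interval idea can be sketched as follows: merge trials by inverse-variance weighting, and for a trial whose SEM is missing, bound the pooled result by assuming the missing SEM lies within the range observed in the complete trials. This is a simplification of the paper's prognostic and interval methods, with made-up numbers:

```python
import numpy as np

def pool(means, sems):
    """Fixed-effect inverse-variance meta-analysis of (mean, SEM) pairs."""
    w = 1.0 / np.asarray(sems, dtype=float) ** 2
    mean = np.sum(w * np.asarray(means)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

# Trials with full information (mean, SEM) plus one trial with mean only.
full_means, full_sems = [5.2, 4.8, 5.6], [0.9, 1.1, 1.4]
partial_mean = 5.0

# Interval-style handling: assume the missing SEM lies in the observed
# range and report the pooled result at both ends of that interval.
for sem_guess in (min(full_sems), max(full_sems)):
    m, s = pool(full_means + [partial_mean], full_sems + [sem_guess])
    print(f"assumed SEM={sem_guess}: pooled {m:.2f} +/- {s:.2f}")
```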

  17. Design and Implementation of Marine Information System, and Analysis of Learners' Intention toward

    Science.gov (United States)

    Pan, Yu-Jen; Kao, Jui-Chung; Yu, Te-Cheng

    2016-01-01

    The goal of this study is to conduct further research and discussion on applying the internet to marine education, utilizing existing technologies such as cloud services, social networks, and data collection and analysis to construct a marine environment education information system. The content to be explored includes marine education information…

  18. The Technical Report: An Analysis of Information Design and Packaging for an Inelastic Market.

    Science.gov (United States)

    Pinelli, Thomas E.; And Others

    As part of an evaluation of its scientific and technical information program, the National Aeronautics and Space Administration (NASA) conducted a review and analysis of structural, language, and presentation components of its technical report form. The investigation involved comparing and contrasting NASA's publications standards for technical…

  19. Analysis of patent activity in the field of quantum information processing

    CERN Document Server

    Winiarczyk, Ryszard; Miszczak, Jarosław Adam; Pawela, Łukasz; Puchała, Zbigniew

    2013-01-01

    This paper provides an analysis of patent activity in the field of quantum information processing. Data from the PatentScope database from the years 1993-2011 was used. In order to predict future trends in the number of filed patents, time series models were used.

  20. Developing Information Skills Test for Malaysian Youth Students Using Rasch Analysis

    Science.gov (United States)

    Karim, Aidah Abdul; Shah, Parilah M.; Din, Rosseni; Ahmad, Mazalah; Lubis, Maimun Aqhsa

    2014-01-01

    This study explored the psychometric properties of a locally developed information skills test for youth students in Malaysia using Rasch analysis. The test was a combination of 24 structured and multiple choice items with a 4-point grading scale. The test was administered to 72 technical college students and 139 secondary school students. The…

  1. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  2. The intellectual core of enterprise information systems: a co-citation analysis

    Science.gov (United States)

    Shiau, Wen-Lung

    2016-10-01

    Enterprise information systems (EISs) have evolved in the past 20 years, attracting the attention of international practitioners and scholars. Although literature reviews and analyses have been conducted to examine the multiple dimensions of EISs, no co-citation analysis has been conducted to examine the knowledge structures involved in EIS studies; thus, the current study fills this research gap. This study investigated the intellectual structures of EISs. All data source documents (1083 articles and 24,090 citations) were obtained from the Institute for Scientific Information Web of Knowledge database. A co-citation analysis was used to analyse EIS data. By using factor analysis, we identified eight critical factors: (a) factors affecting the implementation and success of information systems (ISs); (b) the successful implementation of enterprise resource planning (ERP); (c) IS evaluation and success, (d) system science studies; (e) factors influencing ERP success; (f) case research and theoretical models; (g) user acceptance of information technology; and (h) IS frameworks. Multidimensional scaling and cluster analysis were used to visually map the resultant EIS knowledge. It is difficult to implement an EIS in an enterprise and each organisation exhibits specific considerations. The current findings indicate that managers must focus on ameliorating inferior project performance levels, enabling a transition from 'vicious' to 'virtuous' projects. Successful EIS implementation yields substantial organisational advantages.
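
    A compact sketch of the co-citation-plus-factor-analysis pipeline used in studies like this one: build a co-citation count matrix, then extract latent factors. The data here are synthetic; a real study would assemble the matrix from citation records such as a Web of Knowledge export:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Toy co-citation matrix: entry (i, j) counts how often papers i and j
# are cited together; real studies build this from citation records.
rng = np.random.default_rng(1)
hidden = rng.poisson(2, size=(30, 3))     # 30 papers, 3 latent themes
cocitation = hidden @ hidden.T            # symmetric co-citation counts

fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(cocitation.astype(float))
print(scores.shape)  # each paper's position on the extracted factors
```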

  3. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  4. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  5. Three dimensional visualization breakthrough in analysis and communication of technical information for nuclear waste management

    International Nuclear Information System (INIS)

    Computer graphics systems that provide interactive display and manipulation of three-dimensional data are powerful tools for the analysis and communication of technical information required for characterization and design of a geologic repository for nuclear waste. Greater understanding of site performance and repository design information is possible when performance-assessment modeling results can be visually analyzed in relation to site geologic and hydrologic information and engineering data for surface and subsurface facilities. In turn, this enhanced visualization capability provides better communication between technical staff and program management with respect to analysis of available information and prioritization of program planning. A commercially-available computer system was used to demonstrate some of the current technology for three-dimensional visualization within the architecture of systems for nuclear waste management. This computer system was used to interactively visualize and analyze the information for two examples: (1) site-characterization and engineering data for a potential geologic repository at Yucca Mountain, Nevada; and (2) three-dimensional simulations of a hypothetical release and transport of contaminants from a source of radionuclides to the vadose zone. Users may assess the three-dimensional distribution of data and modeling results by interactive zooming, rotating, slicing, and peeling operations. For those parts of the database where information is sparse or not available, the software incorporates models for the interpolation and extrapolation of data over the three-dimensional space of interest. 12 refs., 4 figs

  6. The Correspondence Analysis Platform for Uncovering Deep Structure in Data and Information

    CERN Document Server

    Murtagh, Fionn

    2008-01-01

    We study two aspects of information semantics: (i) the collection of all relationships, (ii) tracking and spotting anomaly and change. The first is implemented by endowing all relevant information spaces with a Euclidean metric in a common projected space. The second is modelled by an induced ultrametric. A very general way to achieve a Euclidean embedding of different information spaces based on cross-tabulation counts (and from other input data formats) is provided by Correspondence Analysis. From there, the induced ultrametric that we are particularly interested in takes a sequential - e.g. temporal - ordering of the data into account. We employ such a perspective to look at narrative, "the flow of thought and the flow of language" (Chafe). In application to policy decision making, we show how we can focus analysis in a small number of dimensions.
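
    The Euclidean embedding via Correspondence Analysis mentioned here can be written in a few lines of linear algebra: CA is an SVD of the matrix of standardized residuals of a contingency table. A minimal sketch on an invented document-by-term table:

```python
import numpy as np

def correspondence_analysis(N, k=2):
    """Classical CA of a contingency table N: the SVD of the standardized
    residuals yields a Euclidean embedding of the row profiles."""
    P = N / N.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * s) / np.sqrt(r)[:, None]   # principal row coordinates
    return rows[:, :k]

# Toy cross-tabulation: documents (rows) by terms (columns).
N = np.array([[10, 2, 0], [8, 3, 1], [1, 9, 7], [0, 8, 9]], dtype=float)
print(correspondence_analysis(N))  # nearby rows share term profiles
```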

  7. MANAGEMENT INFORMATION SYSTEMS ISSUES: CO-CITATION ANALYSIS OF JOURNAL ARTICLES

    Directory of Open Access Journals (Sweden)

    Wen-Lung Shiau

    2015-06-01

    Full Text Available This study aimed to analyze and identify key issues being studied in leading Management Information Systems (MIS journals collected in an ISI database. With the help of co-citation analysis and factor analysis, thirteen core issues were identified, including: (1 Technology Acceptance; (2 Information Technology (IT, Organization Performance, and Competitive Advantage; (3 IT and Organizational Structure; (4 Case Study and Methodology Issues; (5 Trust Issues in IT; (6 Knowledge Management; (7 Measurement Issues in MIS study; (8 Diffusion of Innovation; (9 Success Factors of IT; (10 Research Modeling and Approach; (11 Theory, Research and Practice; (12 MIS as an academic discipline; and (13 Enterprise Information Systems. These results can help MIS researchers and practitioners gain a better awareness of core and significant issues being studied in the field.

  8. The dynamic of information-driven coordination phenomena: a transfer entropy analysis

    CERN Document Server

    Borge-Holthoefer, Javier; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2015-01-01

    Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time-scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social sub-units. In the absence of ...
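
    A minimal sketch of transfer entropy on symbolized series (a naive plug-in estimator with history length 1, not the authors' full methodology): for a series y that copies x with one step of delay, T(X→Y) is large while T(Y→X) is near zero, which is the directed-influence signature the paper exploits:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Naive plug-in transfer entropy T(X -> Y), history length 1, in bits."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_t+1, y_t, x_t)
    pairs_y = Counter(zip(y[1:], y[:-1]))           # (y_t+1, y_t)
    pairs_xy = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    singles_y = Counter(y[:-1])                     # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        te += (c / n) * np.log2((c / pairs_xy[(y0, x0)]) /
                                (pairs_y[(y1, y0)] / singles_y[y0]))
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)                 # y copies x with a one-step delay
print(transfer_entropy(x, y))     # ~1 bit: strong X -> Y information flow
print(transfer_entropy(y, x))     # ~0 bits in the reverse direction
```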

  9. Management and analysis of water-use data using a geographic information system

    Science.gov (United States)

    Juracek, K.E.; Kenny, J.F.

    1993-01-01

    As part of its mission, the U.S. Geological Survey conducts water-resources research. Site-specific and aggregate water-use data are used in the Survey's National Water-Use Information Program and in various hydrologic investigations. Both types of activities have specific requirements in terms of water-use data access, analysis, and display. In Kansas, the Survey obtains water-use information from several sources. Typically, this information is in a format that is not readily usable by the Survey. Geographic information system (GIS) technology is being used to restructure the available water-use data into a format that allows users to readily access and summarize site-specific water-use data by source (i.e., surface or ground water), type of use, and user-defined area.

  10. Radiological accidents: analysis of the information disseminated by media and public acceptance of nuclear technology

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, Jose Ubiratan; Tauhata, Luiz [Instituto de Radioprotecao e Dosimetria (IRD), Rio de Janeiro, RJ (Brazil); Garcia, Marcia Maria [Fundacao Inst. Oswaldo Cruz (FIOCRUZ), Rio de Janeiro, RJ (Brazil). Dept. de Virologia

    1995-12-31

    A methodology to quantitatively treat information disseminated by the media concerning a nuclear or radiological accident is presented. It allows us to classify information according to its amount, importance and manner of presentation into one indicator, named the Information Equivalent. This establishes a procedure for the analysis of released information and includes: number of headlines, illustrations, printed lines, editorials, authorities quoted and so on. Interpretation becomes easier when the evolution and statistical trend of this indicator is observed. The application to evaluate the dissemination of the accident which took place in 1987 in Goiania, Brazil, was satisfactory and allowed us to propose a model. This will aid planning and the decision-making process, and it will improve relationships between technical staff and the media during an emergency. (author). 5 refs., 4 figs., 3 tabs.
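
    In the spirit of the Information Equivalent, one can picture the indicator as a weighted count of media features tracked over time. The weights below are hypothetical placeholders, not values calibrated by the authors:

```python
# Illustrative weighted index in the spirit of the "Information Equivalent";
# all weights are invented for the sketch.
weights = {"headlines": 5.0, "illustrations": 3.0, "printed_lines": 0.1,
           "editorials": 8.0, "authorities_quoted": 4.0}

def information_equivalent(counts):
    """Collapse daily media-coverage counts into one scalar indicator."""
    return sum(weights[k] * counts.get(k, 0) for k in weights)

day1 = {"headlines": 12, "illustrations": 7, "printed_lines": 640,
        "editorials": 2, "authorities_quoted": 9}
print(information_equivalent(day1))  # track this value day by day
```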

  11. An Analysis of Risk and Function Information in Early Stage Design

    Science.gov (United States)

    Barrientos, Francesca; Tumer, Irem; Grantham, Katie; VanWie, Michael; Stone, Robert

    2005-01-01

    The concept of function offers a high potential for thinking and reasoning about designs as well as providing a common thread for relating together other design information. This paper focuses specifically on the relation between function and risk by examining how this information is addressed by a design team conducting early stage design for space missions. Risk information is decomposed into a set of key attributes which are then used to scrutinize the risk information using three approaches from the pragmatics sub-field of linguistics: i) Gricean analysis, ii) Relevance Theory, and iii) Functional Analysis. Results of this linguistics-based approach descriptively account for the context of designer communication with respect to function and risk, and offer prescriptive guidelines for improving designer communication.

  12. Getting and giving information: analysis of a family-interview strategy.

    Science.gov (United States)

    Viaro, M; Leonardi, P

    1983-03-01

    This paper reports on a videotape study of particular aspects of the two-part interview developed by Selvini-Palazzoli et al. (8, 9). The first segment is a "search for information," the second part the application of an intervention based on the information gathered in the first part. The study focused on the strategies of information retrieval on the premise that they are significant for the quality of information gathered and for the criteria implicitly conveyed by the therapist that in turn have their own substantial impact on the system. We have employed theories of communication, particularly conversational analysis, that are a departure from the epistemological premises of systems theory and communication pragmatics proposed by Selvini-Palazzoli et al. as the theoretical underpinning of their interview technique.

  13. An economic analysis of five selected LANDSAT assisted information systems in Oregon

    Science.gov (United States)

    Solomon, S.; Maher, K. M.

    1979-01-01

    A comparative cost analysis was performed on five LANDSAT-based information systems. In all cases, the LANDSAT system was found to have cost advantages over its alternative. The information sets generated by LANDSAT and the alternative method are not identical but are comparable in terms of satisfying the needs of the sponsor. The information obtained from the LANDSAT system in some cases is said to lack precision and detail. On the other hand, it was found to be superior in terms of providing information on areas that are inaccessible and unobtainable through conventional means. There is therefore a trade-off between precision and detail on the one hand and considerations of cost on the other. The projects examined were concerned with locating irrigation circles in Morrow County; monitoring tansy ragwort infestation; inventorying old growth Douglas fir near Spotted Owl habitats; inventorying vegetation and resources in all state-owned lands; and determining land use for Columbia River water policies.

  14. Information-theoretic analysis of the dynamics of an executable biological model.

    Directory of Open Access Journals (Sweden)

    Avital Sadot

    Full Text Available To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an algorithmic information theory based approach for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD analysis to the state representations of a model that simulates the immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
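
    The NCD computation itself is short enough to sketch with an off-the-shelf compressor. The "state representations" below are invented stand-ins for the model states analyzed in the paper:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance approximated with zlib."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy "state representations" of a simulated system at three time points.
s1 = b"TCELL:naive;IL2:low;APC:idle;" * 40
s2 = b"TCELL:naive;IL2:low;APC:primed;" * 40    # small perturbation
s3 = b"TCELL:effector;IL7:high;APC:primed;" * 40
print(ncd(s1, s2), ncd(s1, s3))  # similar states yield smaller distances
```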

  15. The development of an information criterion for Change-Point Analysis

    CERN Document Server

    Wiggins, Paul A

    2015-01-01

    Change-point analysis is a flexible and computationally tractable tool for the analysis of time series data from systems that transition between discrete states and whose observables are corrupted by noise. The change-point algorithm is used to identify the time indices (change points) at which the system transitions between these discrete states. We present a unified information-based approach to testing for the existence of change points. This new approach reconciles two previously disparate approaches to change-point analysis (frequentist and information-based) for testing transitions between states. The resulting method is statistically principled, parameter- and prior-free, and applicable to a wide range of change-point problems.
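
    The information-criterion flavour of change-point testing can be sketched with BIC: accept a single mean-shift change point only if the two-segment model's penalized likelihood beats the no-change model. This generic sketch is not the specific criterion derived in the paper:

```python
import numpy as np

def best_change_point(x):
    """Score a single mean-shift change point with BIC; a sketch of the
    generic information-criterion idea, not the paper's criterion."""
    n = len(x)
    best = (n * np.log(np.var(x)) + 1 * np.log(n), None)  # no change point
    for k in range(2, n - 2):
        var = (k * np.var(x[:k]) + (n - k) * np.var(x[k:])) / n
        bic = n * np.log(var) + 3 * np.log(n)  # two means + one location
        if bic < best[0]:
            best = (bic, k)
    return best

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(1.5, 1, 300)])
print(best_change_point(x)[1])  # close to the true change point at 300
```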

  16. Analysis of biological time-lapse microscopic experiment from the point of view of the information theory.

    Science.gov (United States)

    Štys, Dalibor; Urban, Jan; Vaněk, Jan; Císař, Petr

    2011-06-01

    We report an objective analysis of the information in microscopic images of a cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object; the information transfer channel is the whole microscope including the camera chip; the destination is the model of the biological system. The information contribution is analyzed as the information carried by a single point relative to the overall information in the image. Subsequently we obtain an information reflection of the biological object. This is transformed into the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in a multidimensional state space. This space is reflected as a colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them. PMID:25478628
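
    As a toy version of the first step of such an analysis, the sketch below computes the Shannon entropy of an 8-bit image channel's intensity distribution, a crude proxy for the information carried by the channel (the synthetic "micrograph" is invented):

```python
import numpy as np

def channel_entropy(channel: np.ndarray) -> float:
    """Shannon entropy (bits) of an 8-bit image channel's histogram."""
    hist = np.bincount(channel.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty intensity levels before taking the log
    return float(-(p * np.log2(p)).sum())

# Synthetic micrograph stand-in: a cell-like blob on a noisy background.
rng = np.random.default_rng(0)
img = rng.integers(10, 30, (128, 128)).astype(np.uint8)
img[40:80, 40:80] += 120  # the "cell" raises local intensity
print(channel_entropy(img))
```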

  17. ANALYSIS OF A WEB INFORMATION SYSTEM APPLIED MANAGEMENT SCHOOL OF COMPUTING

    Directory of Open Access Journals (Sweden)

    ROGER CRISTHIAN GOMES

    2010-01-01

    Full Text Available One of the tasks of an entrepreneur is to choose a computerized information system for the management of the business, regardless of its size and field of activity. Deciding whether the information system should be modeled for local use, also known as standalone, or developed for the web is increasingly common, as the Internet greatly facilitates the work of the manager. However, one cannot simply follow technological and market trends to resolve an issue that will shape how the business is operated, administered and managed. To choose between one type of system and the other, it is necessary to examine the advantages and disadvantages of each model in relation to the business in question. This study aimed to list the main features intrinsic to web and stand-alone applications. The study of these two types of applications was based on the analysis of an information system applied to a company providing computer-training services. For the analysis of the information system, a survey of the main requirements was carried out and a prototype was modeled. It was proposed to develop the system in a web environment, using the JAVA platform with the MySQL database manager, because these tools are complete, well documented and free, and have features that help ensure the functionality and quality of a web information system.

  18. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

    Full Text Available In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. The content of medical image structural objects and object identification play a significant role in image content analysis and information retrieval. There are basically three fundamental concepts in content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features: colour, texture and shape. Colour and texture are both important visual features used in Content-Based Image Retrieval to improve results. In this paper, we present an effective image retrieval system using texture, shape and colour features, called CBIAIR (Content-Based Image Analysis and Information Retrieval). Firstly, we developed a new texture-pattern feature for pixel-based features in the CBIAIR system. Subsequently, we used a semantic colour feature for colour-based features, while shape-based feature selection is done using an existing technique. For retrieval, these features are extracted from the query image and matched with the feature library using a feature-weighted distance. After that, all feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose matching distance is below a predefined threshold are retrieved from the image database using a K-NN classifier.
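
    A stripped-down sketch of the colour-feature-plus-k-NN retrieval loop described above; plain colour histograms and L1 matching stand in for the paper's specific texture, colour and shape features:

```python
import numpy as np

def color_histogram(img, bins=8):
    """Per-channel colour histogram, concatenated and L1-normalized."""
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    v = np.concatenate(feats).astype(float)
    return v / v.sum()

rng = np.random.default_rng(3)
library = rng.integers(0, 256, (100, 32, 32, 3))  # toy image database
feats = np.array([color_histogram(im) for im in library])

query = library[42]
d = np.abs(feats - color_histogram(query)).sum(axis=1)  # L1 distance
print(np.argsort(d)[:5])  # 5 nearest images; the query itself ranks first
```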

  19. Information-theoretic analysis of x-ray scatter and phase architectures for anomaly detection

    Science.gov (United States)

    Coccarelli, David; Gong, Qian; Stoian, Razvan-Ionut; Greenberg, Joel A.; Gehm, Michael E.; Lin, Yuzhang; Huang, Liang-Chih; Ashok, Amit

    2016-05-01

    Conventional performance analysis of detection systems confounds the effects of the system architecture (sources, detectors, system geometry, etc.) with the effects of the detection algorithm. Previously, we introduced an information-theoretic approach to this problem by formulating a performance metric, based on Cauchy-Schwarz mutual information, that is analogous to the channel capacity concept from communications engineering. In this work, we discuss the application of this metric to study novel screening systems based on x-ray scatter or phase. Our results show how effective use of this metric can impact design decisions for x-ray scatter and phase systems.

  20. Using Lewin's force field analysis in implementing a nursing information system.

    Science.gov (United States)

    Bozak, Marilynn G

    2003-01-01

    Change is a regular occurrence in the healthcare environment. The computerization of nursing systems is one aspect of the changes taking place in the information revolution. As a result, nurses have widely varying attitudes toward computers and change in the workplace. To transition the nursing team effectively from one system to another, the nurse informaticist must be aware of the factors that encourage and those that impede the change. Strategies must be developed to assist nurses in moving forward with the transition. This article presents a theoretical discussion of how Lewin's Force Field Analysis Model could be applied in the practice setting to implement a nursing information system successfully. PMID:12802948

  1. Quantitative analysis of access strategies to remote information in network services

    DEFF Research Database (Denmark)

    Olsen, Rasmus Løvenstein; Schwefel, Hans-Peter; Hansen, Martin Bøgsted

    2006-01-01

    Remote access to dynamically changing information elements is a required functionality for various network services, including routing and instances of context-sensitive networking. Three fundamentally different strategies for such access are investigated in this paper: (1) a reactive approach in......, network delay characterization) and specific requirements on mismatch probability, traffic overhead, and access delay. Finally, the analysis is applied to the use-case of context-sensitive service discovery.

  2. Technology and Research Requirements for Combating Human Trafficking: Enhancing Communication, Analysis, Reporting, and Information Sharing

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Sean J.; West, Curtis L.; Olson, Jarrod

    2011-03-17

    DHS’ Science & Technology Directorate directed PNNL to conduct an exploratory study on the domain of human trafficking in the Pacific Northwest in order to examine and identify technology and research requirements for enhancing communication, analysis, reporting, and information sharing – activities that directly support efforts to track, identify, deter, and prosecute human trafficking – including identification of potential national threats from smuggling and trafficking networks. This effort was conducted under the Knowledge Management Technologies Portfolio as part of the Integrated Federal, State, and Local/Regional Information Sharing (RISC) and Collaboration Program.

  3. The relative importance of head, flux, and prior information in hydraulic tomography analysis

    Science.gov (United States)

    Michael Tso, Chak-Hau; Zha, Yuanyuan; Jim Yeh, Tian-Chyi; Wen, Jet-Chau

    2016-01-01

    Using cross-correlation analysis, we demonstrate that flux measurements at observation locations during hydraulic tomography (HT) surveys carry nonredundant information about heterogeneity that are complementary to head measurements at the same locations. We then hypothesize that a joint interpretation of head and flux data, even when the same observation network as head has been used, can enhance the resolution of HT estimates. Subsequently, we use numerical experiments to test this hypothesis and investigate the impact of flux conditioning and prior information (such as correlation lengths and initial mean models (i.e., uniform mean or distributed means)) on the HT estimates of a nonstationary, layered medium. We find that the addition of flux conditioning to HT analysis improves the estimates in all of the prior models tested. While prior information on geologic structures could be useful, its influence on the estimates reduces as more nonredundant data (i.e., flux) are used in the HT analysis. Lastly, recommendations for conducting HT surveys and analysis are presented.

  4. Fusion Energy: Contextual Analysis of the Information Panels Developed by the Scientific Community versus Citizen Discourse

    International Nuclear Information System (INIS)

    The report presents an exploratory study on the impact of scientific dissemination, particularly a comparative analysis of two discourses on fusion energy as an alternative energy future. The report introduces a comparative analysis of the institutional discourse, as portrayed by the scientific jargon used in Fusion Expo, a European travelling exhibition on nuclear fusion, and the social discourse, as illustrated by a citizen deliberation on this very same exhibition. Through textual analysis, the scientific discourse as deployed in the informative panels at the Fusion Expo is compared with the citizen discourse as developed in the discussions within the citizen groups. The ConText software was applied for this analysis. The purpose is to analyze how visitors assimilate, capture and understand highly technical information. Results suggest that, despite points of convergence, the two discourses present certain differences, reflecting different levels of communication. The scientific discourse shows a great profusion of the formalisms and technicalities of scientific jargon. The citizen discourse shows an abundance of words associated with daily life and more practical aspects (economy, efficiency), concerning institutional and evaluative references. In sum, the study shows that although there are a few common communicative spaces, there are still very few turning points. These data indicate that although exhibitions can be a good tool to disseminate advances in fusion energy in informal learning contexts, public feedback is a powerful tool for improving the quality of social dialogue. (Author)

  5. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA’s move towards safeguards implementation based on all safeguards relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA’s Division of Information Management (SGIM) and the EC’s Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA’s workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, which is a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC’s Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA’s workflow for the production of SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which were focused i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (Nuclear Security Media Monitor, NSMM) and ii) on evaluating the NSMM/NewsDesk for meeting the IAEA’s needs. Finally, it presents the current use of NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  6. Wavelet q-Fisher Information for Scaling Signal Analysis

    Directory of Open Access Journals (Sweden)

    Joel Trejo-Sanchez

    2012-08-01

    Full Text Available Abstract: This article first introduces the concept of wavelet q-Fisher information and then derives a closed-form expression of this quantifier for scaling signals of parameter α. It is shown that this information measure appropriately describes the complexities of scaling signals and provides further analysis flexibility with the parameter q. In the limit of q → 1, wavelet q-Fisher information reduces to the standard wavelet Fisher information, and for q > 2 it reverses its behavior. Experimental results on synthesized fGn signals validate the level-shift detection capabilities of wavelet q-Fisher information. A comparative study also shows that wavelet q-Fisher information locates structural changes in correlated and anti-correlated fGn signals in a way comparable with standard breakpoint location techniques but at a fraction of the time. Finally, the application of this quantifier to H.263 encoded video signals is presented.
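
    For orientation, the sketch below computes the standard (q → 1) wavelet Fisher information from the relative wavelet energies across decomposition levels, using the common discrete form I = Σ (p_{j+1} - p_j)² / p_j; the exact q-generalization is defined in the article itself, and the discrete form used here is an assumption of this sketch:

```python
import numpy as np
import pywt  # PyWavelets, for the discrete wavelet decomposition

def wavelet_fisher_information(x, wavelet="db2"):
    """Standard wavelet Fisher information from relative wavelet energies."""
    coeffs = pywt.wavedec(x, wavelet)
    energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail levels
    p = energies / energies.sum()  # relative energy per scale
    return float(np.sum((p[1:] - p[:-1]) ** 2 / p[:-1]))

rng = np.random.default_rng(0)
white = rng.normal(size=4096)
shifted = np.concatenate([white[:2048], white[2048:] + 3.0])  # level shift
print(wavelet_fisher_information(white), wavelet_fisher_information(shifted))
```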

  7. Celebrity Health Announcements and Online Health Information Seeking: An Analysis of Angelina Jolie's Preventative Health Decision.

    Science.gov (United States)

    Dean, Marleah

    2016-01-01

    On May 14, 2013, Angelina Jolie disclosed she carries BRCA1, which means she has an 87% risk of developing breast cancer during her lifetime. Jolie decided to undergo a preventative bilateral mastectomy (PBM), reducing her risk to 5%. The purpose of this study was to analyze the type of information individuals are exposed to when using the Internet to search health information regarding Jolie's decision. Qualitative content analysis revealed four main themes--information about genetics, information about a PBM, information about health care, and information about Jolie's gender identity. Broadly, the identified websites mention Jolie's high risk for developing cancer due to the genetic mutation BRCA1, describe a PBM occasionally noting reasons why she had this surgery and providing alternatives to the surgery, discuss issues related to health care services, costs, and insurances about Jolie's health decision, and portray Jolie as a sexual icon, a partner to Brad Pitt, a mother of six children, and an inspirational humanitarian. The websites also depict Jolie's health decision in positive, negative, and/or both ways. Discussion centers on how this actress' health decision impacts the public. PMID:26574936

  8. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated services of geospatial information, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product services to services centered on the NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are becoming more and more obvious, due to the rapidly emerging requirements of e-government construction, the remarkable development of IT technology and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services and APIs for further development to government, business and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at its early stage, it represents the future service mode of geospatial information in China, and it will surely have a great impact not only on the construction of digital China, but also on the way that everyone uses geospatial information services.

  9. Maximal information component analysis: a novel non-linear network analysis method

    OpenAIRE

    Christoph Daniel Rau; Nicholas eWisniewski; Orozco, Luz D; Brian eBennett; James Nathaniel Weiss; Aldons Jake Lusis

    2013-01-01

    Background: Network construction and analysis algorithms provide scientists with the ability to sift through high-throughput biological outputs, such as transcription microarrays, for small groups of genes (modules) that are relevant for further research. Most of these algorithms ignore the important role of nonlinear interactions in the data, and the ability for genes to operate in multiple functional groups at once, despite clear evidence for both of these phenomena in observed biological...

  10. Maximal information component analysis: a novel non-linear network analysis method

    OpenAIRE

    Rau, Christoph D.; Wisniewski, Nicholas; Orozco, Luz D; Bennett, Brian; Weiss, James; Lusis, Aldons J.

    2013-01-01

    Background: Network construction and analysis algorithms provide scientists with the ability to sift through high-throughput biological outputs, such as transcription microarrays, for small groups of genes (modules) that are relevant for further research. Most of these algorithms ignore the important role of non-linear interactions in the data, and the ability for genes to operate in multiple functional groups at once, despite clear evidence for both of these phenomena in observed biological ...
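
    A small sketch of the module-detection ingredient, the maximal information coefficient (MIC), assuming the third-party minepy package (its MINE API is used here on the assumption that it is available); MIC flags the non-linear gene-gene association that Pearson correlation misses:

```python
import numpy as np
from minepy import MINE  # third-party MINE/MIC implementation (assumed)

# Synthetic expression profiles with a non-linear relationship.
rng = np.random.default_rng(4)
g1 = rng.uniform(-1, 1, 500)
g2 = g1 ** 2 + 0.1 * rng.normal(size=500)  # quadratic, not linear

mine = MINE(alpha=0.6, c=15)
mine.compute_score(g1, g2)
print(mine.mic())                  # high despite near-zero Pearson r
print(np.corrcoef(g1, g2)[0, 1])   # linear correlation fails to detect it
```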

  11. Development of efficient system for collection-analysis-application of information using system for technology and information in field of RI-biomics

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-08-15

    RI-Biomics is a new radiation fusion technology in which the characteristics of radioisotopes are applied to biomics. In order to share and jointly analyze data between institutions through total management of information in the field of RI-Biomics, the RI-Biomics information portal ‘RIBio-Info’ was constructed by KARA (Korean Association for Radiation Application) in February 2015. Systematic operation of this ‘RIBio-Info’ system requires a process for the collection, analysis and application of information. In this paper, we therefore summarize the development of document forms for each step of the collection-analysis-application process, the systematization of information collection methods, and the establishment of methods for the characteristic analysis of reports such as issue papers, policy reports, global market reports and watch reports. These are expected to improve practical applicability in this field by vitalizing users’ technology development through a circular structure of collection-analysis-application of information.

  12. Empowering Students to Make Sense of an Information-Saturated World: The Evolution of "Information Searching and Analysis"

    Science.gov (United States)

    Wittebols, James H.

    2016-01-01

    How well students conduct research online is an increasing concern for educators at all levels, especially higher education. This paper describes the evolution of a course that examines confirmation bias, information searching, and the political economy of information as keys to becoming more information and media literate. After a key assignment…

  13. NASA Informal Education: Final Report. A Descriptive Analysis of NASA's Informal Education Portfolio: Preliminary Case Studies

    Science.gov (United States)

    Rulf Fountain, Alyssa; Levy, Abigail Jurist

    2010-01-01

    This report was requested by the National Aeronautics and Space Administration's (NASA), Office of Education in July 2009 to evaluate the Informal Education Program. The goals of the evaluation were twofold: (1) to gain insight into its investment in informal education; and (2) to clarify existing distinctions between its informal education…

  14. Information Communication Technology and Politics: A Synthesized Analysis of the Impacts of Information Technology on Voter Participation in Kenya

    Science.gov (United States)

    Tsuma, Clive Katiba

    2011-01-01

    The availability of political information throughout society made possible by the evolution of contemporary information communication technology has precipitated conflicting debate regarding the effects of technology use on real life political participation. Proponents of technology argue that the use of new information technology stimulates…

  15. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    Science.gov (United States)

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment, and the environment. These practical concerns turn into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification, and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. The unknown vector of the CRLB then consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, map matching, and NLOS identification and mitigation methods; the theoretical results therefore approach the real case more closely. In addition, our method is more adaptable than other CRLBs when more unknown but important factors are considered. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information, and the relative height differences between the target and anchors is analyzed. It is demonstrated that our method exploits all of the available information for
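
    The abstract's core device is a Fisher information matrix that grows additively as more information sources are fused, with the CRLB as its inverse. Below is a minimal sketch of that additivity for 2-D range-based localization; the anchor layout, noise level, and the Gaussian "auxiliary prior" are illustrative assumptions, not the paper's model.

```python
import numpy as np

def range_fim(target, anchors, sigma):
    """Fisher information matrix for a 2-D position estimated from range
    measurements corrupted by i.i.d. Gaussian noise of std `sigma`."""
    J = np.zeros((2, 2))
    for a in anchors:
        d = target - a
        u = d / np.linalg.norm(d)          # unit vector anchor -> target
        J += np.outer(u, u) / sigma**2
    return J

target = np.array([2.0, 3.0])
anchors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
J_data = range_fim(target, anchors, sigma=0.5)

# Fusing auxiliary knowledge (modelled here as a Gaussian prior with a 2 m
# std per axis) simply adds its Fisher information to the data term.
J_prior = np.eye(2) / 2.0**2
for J, label in [(J_data, "ranges only"), (J_data + J_prior, "ranges + prior")]:
    crlb = np.trace(np.linalg.inv(J))      # lower bound on position MSE
    print(f"{label}: RMSE bound = {np.sqrt(crlb):.3f} m")
```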

  16. The 2006 Analysis of Information Remaining on Disks Offered for Sale on the Second Hand Market

    Directory of Open Access Journals (Sweden)

    Andy Jones

    2006-09-01

    All organisations, whether in the public or private sector, use computers for the storage and processing of information relating to their business or services, their employees and their customers. A large proportion of families and individuals in their homes now also use personal computers and, both intentionally and inadvertently, often store personal information on those computers. It is clear that most organisations and individuals continue to be unaware of the information that may be stored on the hard disks that the computers contain, and have not considered what may happen to the information after the disposal of the equipment. In 2005, joint research was carried out by the University of Glamorgan in Wales and Edith Cowan University in Australia to determine whether second hand computer disks that were purchased from a number of sources still contained any information or whether the information had been effectively erased. The research revealed that, for the majority of the disks that were examined, the information had not been effectively removed and, as a result, both organisations and individuals were potentially exposed to a range of crimes. It is worthy of note that, in the disposal of this equipment, the organisations involved had failed to meet their statutory, regulatory and legal obligations. This paper describes a second research project that was carried out in 2006 which repeated the research carried out the previous year and also extended the scope of the research to include additional countries. The methodology used was the same as that in the previous year and the disks that were used for the research were again supplied blind by a third party. The research involved the forensic imaging of the disks, which was followed by an analysis of the disks to determine what information remained and whether it could be easily recovered using publicly available tools and techniques.
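
    As a feel for how easily residual data can be pulled from an unwiped drive with publicly available techniques, here is a naive string-carving sketch over a raw disk image. The image file name and keyword list are hypothetical, and real forensic work would use dedicated tooling.

```python
import re

PRINTABLE = re.compile(rb"[\x20-\x7e]{8,}")   # runs of >= 8 printable ASCII bytes
KEYWORDS = (b"password", b"@", b"invoice")    # crude indicators of residual data

def scan_image(path, chunk_size=1 << 20):
    """Scan a raw disk image for printable strings, flagging suspicious ones.
    Chunks overlap slightly so strings crossing a boundary are not missed
    (strings inside the overlap window may be reported twice)."""
    hits = []
    with open(path, "rb") as f:
        offset, tail = 0, b""
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            data = tail + chunk
            for m in PRINTABLE.finditer(data):
                s = m.group()
                if any(k in s.lower() for k in KEYWORDS):
                    hits.append((offset + m.start() - len(tail), s[:80]))
            tail = data[-64:]                 # small overlap window
            offset += len(chunk)
    return hits

# Usage (hypothetical image file produced by forensic imaging):
# for off, text in scan_image("disk07.img"):
#     print(hex(off), text)
```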

  19. Information Gap Analysis: near real-time evaluation of disaster response

    Science.gov (United States)

    Girard, Trevor

    2014-05-01

    Disasters, such as major storm events or earthquakes, trigger an immediate response by the disaster management system of the nation in question. The quality of this response is a large factor in its ability to limit the impacts on the local population. Improving the quality of disaster response therefore reduces disaster impacts. Studying past disasters is a valuable exercise to understand what went wrong, identify measures which could have mitigated these issues, and make recommendations to improve future disaster planning and response. While such ex post evaluations can lead to improvements in the disaster management system, there are limitations. The main limitation that has influenced this research is that ex post evaluations do not have the ability to inform the disaster response being assessed for the obvious reason that they are carried out long after the response phase is over. The result is that lessons learned can only be applied to future disasters. In the field of humanitarian relief, this limitation has led to the development of real time evaluations. The key aspect of real time humanitarian evaluations is that they are completed while the operation is still underway. This results in findings being delivered at a time when they can still make a difference to the humanitarian response. Applying such an approach to the immediate disaster response phase requires an even shorter time-frame, as well as a shift in focus from international actors to the nation in question's government. As such, a pilot study was started and methodology developed, to analyze disaster response in near real-time. The analysis uses the information provided by the disaster management system within the first 0 - 5 days of the response. The data is collected from publicly available sources such as ReliefWeb and sorted under various categories which represent each aspect of disaster response. This process was carried out for 12 disasters. The quantity and timeliness of information
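
    To make the sorting step concrete, the toy sketch below tallies early-response reports into response-aspect categories so that empty cells expose information gaps. The categories and keywords are invented for illustration; the pilot study's actual scheme is richer.

```python
from collections import defaultdict

# Hypothetical response-aspect categories and indicator keywords.
CATEGORIES = {
    "search_and_rescue": ("rescue", "search teams"),
    "relief_distribution": ("food", "water", "shelter"),
    "damage_assessment": ("damage", "destroyed", "casualties"),
}

def categorise(reports):
    """reports: iterable of (day, text) pairs from public sources (days 0-5).
    Returns counts[category][day]; missing days reveal information gaps."""
    counts = defaultdict(lambda: defaultdict(int))
    for day, text in reports:
        low = text.lower()
        for cat, keys in CATEGORIES.items():
            if any(k in low for k in keys):
                counts[cat][day] += 1
    return counts

reports = [
    (0, "Search teams deployed to the affected district."),
    (1, "Food and water distribution started in two shelters."),
    (1, "Initial damage assessment: 300 houses destroyed."),
]
for cat, per_day in categorise(reports).items():
    missing = [d for d in range(6) if d not in per_day]
    print(cat, dict(per_day), "gap on days:", missing)
```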

  20. Inclusion of Respiratory Frequency Information in Heart Rate Variability Analysis for Stress Assessment.

    Science.gov (United States)

    Hernando, Alberto; Lazaro, Jesus; Gil, Eduardo; Arza, Adriana; Garzon, Jorge Mario; Lopez-Anton, Raul; de la Camara, Concepcion; Laguna, Pablo; Aguilo, Jordi; Bailon, Raquel

    2016-07-01

    Respiratory rate and heart rate variability (HRV) are studied as stress markers in a database of young healthy volunteers subjected to acute emotional stress, induced by a modification of the Trier Social Stress Test. First, instantaneous frequency domain HRV parameters are computed using time-frequency analysis in the classical bands. Then, the respiratory rate is estimated and this information is included in HRV analysis in two ways: 1) redefining the high-frequency (HF) band to be centered at the respiratory frequency; 2) excluding from the analysis those instants where the respiratory frequency falls within the low-frequency (LF) band. Classical frequency domain HRV indices scarcely show statistical differences during stress. However, when respiratory frequency information is included in HRV analysis, the normalized LF power as well as the LF/HF ratio significantly increase during stress (p-value < 0.05 according to the Wilcoxon test), revealing higher sympathetic dominance. The LF power increases during stress, only being significantly different in a stress anticipation stage, while the HF power decreases during stress, only being significantly different during the stress task demanding attention. Our results support that joint analysis of respiration and HRV obtains a more reliable characterization of the autonomic nervous response to stress. In addition, the respiratory rate is observed to be higher and less stable during stress than during relaxation (p-value < 0.05 according to the Wilcoxon test), and it is the most discriminative index for stress stratification (AUC = 88.2%). PMID:27093713
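
    A minimal sketch of the first idea, re-centering the HF band on the estimated respiratory frequency before computing the LF/HF ratio, follows. The sampling rate, the 0.125 Hz band half-width, and the synthetic IBI trace are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(ibi, fs=4.0, f_resp=0.30, hf_halfwidth=0.125):
    """LF/HF ratio with the HF band re-centred on the respiratory frequency.

    ibi: inter-beat-interval series, evenly resampled at fs Hz (e.g. by
    spline interpolation of the RR tachogram).
    """
    hf_lo, hf_hi = f_resp - hf_halfwidth, f_resp + hf_halfwidth
    if hf_lo < 0.15:
        # mirrors the paper's second idea: discard instants where the
        # respiratory frequency would overlap the LF band
        raise ValueError("respiratory rate overlaps the LF band; exclude epoch")
    f, pxx = welch(ibi - np.mean(ibi), fs=fs, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= hf_lo) & (f <= hf_hi)
    return np.trapz(pxx[lf_band], f[lf_band]) / np.trapz(pxx[hf_band], f[hf_band])

# Synthetic 5-minute IBI trace with an LF (0.1 Hz) rhythm and a
# respiration-locked HF (0.3 Hz) component, sampled at 4 Hz.
t = np.arange(0, 300, 0.25)
ibi = 0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t) + 0.02 * np.sin(2 * np.pi * 0.3 * t)
print(f"LF/HF = {lf_hf_ratio(ibi):.2f}")
```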

  1. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications have been limited to partial periods and parts of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate structural geometry, the resistance model, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structural analysis, and construction management.

  2. Eye-tracking Information Processing in Choice-based Conjoint Analysis

    DEFF Research Database (Denmark)

    Meissner, Martin; Decker, Reinhold

    2010-01-01

    Choice models are a common tool in market research for quantifying the influence of product attributes on consumer decisions. Process tracing techniques, on the other hand, try to answer the question of how people process information and make decisions in choice tasks. This paper suggests...... a combination of both approaches for in-depth investigations of consumer decision processes in preference measurement by means of choice-based conjoint (CBC) analysis. We discuss different process tracing techniques and propose an attribute-specific strategy measure for the analysis of CBC results. In our...
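
    The abstract is truncated, but a common process-tracing quantity in this setting is a transition-based search measure. The sketch below computes a Payne-style index from a fixation sequence; it is a generic illustration, not necessarily the attribute-specific measure the authors propose.

```python
def search_index(fixations):
    """Payne-style search measure from a fixation sequence on a CBC task.

    fixations: list of (alternative, attribute) cells in viewing order.
    Returns a value in [-1, 1]: +1 = purely alternative-wise processing,
    -1 = purely attribute-wise processing.
    """
    alt_wise = attr_wise = 0
    for (a1, t1), (a2, t2) in zip(fixations, fixations[1:]):
        if a1 == a2 and t1 != t2:
            alt_wise += 1          # same product, next attribute
        elif t1 == t2 and a1 != a2:
            attr_wise += 1         # same attribute, next product
    total = alt_wise + attr_wise
    return (alt_wise - attr_wise) / total if total else 0.0

# Example: a respondent scanning three alternatives on 'price' first.
fix = [(0, "price"), (1, "price"), (2, "price"), (2, "brand"), (1, "brand")]
print(search_index(fix))   # negative -> mostly attribute-wise comparisons
```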

  3. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM) such as regression splines or decision tree induction can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA), and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore
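
    As one concrete example of the management science techniques the book covers, the sketch below solves the input-oriented CCR multiplier model of DEA as a linear program with scipy; the department data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (multiplier form).

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    max  u.y_o   s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[o], np.zeros(m)])           # minimise -u.y_o
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Three IT departments: inputs = (staff, budget), output = (tickets resolved,)
X = np.array([[5.0, 100], [8, 120], [6, 80]])
Y = np.array([[500.0], [600], [450]])
for o in range(3):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```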

  4. Information Security Management: ANP Based Approach for Risk Analysis and Decision Making

    Directory of Open Access Journals (Sweden)

    H. Brožová

    2016-03-01

    In information systems security, the objectives of the risk analysis process are to help identify new threats and vulnerabilities, to estimate their business impact, and to provide a dynamic set of tools to control the security level of the information system. The identification of risk factors as well as the estimation of their business impact require tools for the assessment of risk with multi-value scales according to different stakeholders' points of view. Therefore, the purpose of this paper is to model the risk analysis decision-making problem using a semantic network to develop the decision network, together with the Analytical Network Process (ANP), which allows solving complex problems taking into consideration quantitative and qualitative data. As a decision support technique, ANP also measures the dependency among risk factors related to the elicitation of individual judgement. An empirical study involving the Forestry Company is used to illustrate the relevance of ANP.
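
    Two computational building blocks of ANP are the priority vector of a pairwise comparison matrix and the limit of a column-stochastic supermatrix. A minimal sketch of both follows; the judgement matrix and supermatrix values are hypothetical, not taken from the Forestry Company study.

```python
import numpy as np

def priority_vector(pairwise):
    """Local priorities: principal eigenvector of a reciprocal
    pairwise-comparison matrix (Saaty's 1-9 scale)."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def limit_matrix(W, tol=1e-9, max_iter=64):
    """Raise a column-stochastic supermatrix to successive powers (by
    squaring) until it converges; its identical columns are the global
    ANP priorities."""
    for _ in range(max_iter):
        W2 = W @ W
        if np.max(np.abs(W2 - W)) < tol:
            return W2
        W = W2
    raise RuntimeError("supermatrix did not converge")

# Hypothetical judgements over three risk factors.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print("local priorities:", priority_vector(A).round(3))

# Hypothetical weighted supermatrix capturing dependence among the factors.
W = np.array([[0.0, 0.5, 0.4],
              [0.6, 0.0, 0.6],
              [0.4, 0.5, 0.0]])
print("global priorities:", limit_matrix(W)[:, 0].round(3))
```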

  5. The Impact of Informal Economy in the Pension System, Empirical Analysis. The Albanian Case

    Directory of Open Access Journals (Sweden)

    Bernard Dosti

    2015-02-01

    Using a simple model, we analyze the impact that informality has on the amount of consumption of workers during their life cycle. This paper deals with the interconnections of underreported earnings, savings, and old-age pension. The workers sampled for this analysis have been divided into three groups: (1) low-income employees, (2) higher-income employees who declare all incomes, and (3) employees who underreport their incomes. The analysis is based on two pension models: a model that calculates the pension in conformity with incomes, and a basic model whose objective is poverty reduction for the "third age". The major result is as follows: given that the basic pension system favors employees who underreport their incomes, and that the impact of informality is greater in the basic system than in the proportional pension system, the application of the basic pension system in the Albanian case might be problematic.

  6. Local mine production safety supervision game analysis based on incomplete information

    Institute of Scientific and Technical Information of China (English)

    LI Xing-dong; LI Ying; REN Da-wei; LIU Zhao-xia

    2007-01-01

    Using the fundamental theory and analysis methods of repeated games with incomplete information, this paper introduces incomplete information into repeated games and establishes a two-stage dynamic game model of the local authority and the coal mine owner. The analytic result indicates that, as long as the country establishes a corresponding rewards-and-punishments incentive mechanism for the responsible local authority departments, safety accidents in coal mines are reported on time. The conclusion about whether the local government displays cooperative or non-cooperative behavior changes with the introduction of incomplete information. Only when the local authority fulfills its responsibility can unsafe accidents be controlled effectively. Once this kind of cooperation by the local government appears, the country's costs of safety supervision, and its difficulty, will decrease greatly.

  7. Descriptive analysis of the inequalities of health information resources between Alberta's rural and urban health regions.

    Science.gov (United States)

    Stieda, Vivian; Colvin, Barb

    2009-01-01

    In an effort to understand the extent of the inequalities in health information resources across Alberta, SEARCH Custom, HKN (Health Knowledge Network) and IRREN (Inter-Regional Research and Evaluation Network) conducted a survey in December 2007 to determine what library resources currently existed in Alberta's seven rural health regions and the two urban health regions. Although anecdotal evidence indicated that these gaps existed, the analysis was undertaken to provide empirical evidence of the exact nature of these gaps. The results, coupled with the published literature on the impact, effectiveness and value of information on clinical practice and administrative decisions in healthcare management, will be used to build momentum among relevant stakeholders to support a vision of equitably funded health information for all healthcare practitioners across the province of Alberta.

  8. Analysis of information for cerebrovascular disorders obtained by 3D MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Kohki [Tokyo Univ. (Japan). Inst. of Medical Science]; Yoshioka, Naoki; Watanabe, Fumio; Shiono, Takahiro; Sugishita, Morihiro; Umino, Kazunori

    1995-12-01

    Recently, it has become easy to analyze information obtained by 3D MR imaging due to the remarkable progress of fast MR imaging techniques and analysis tools. Six patients suffering from aphasia (4 cerebral infarctions and 2 bleedings) underwent 3D MR imaging (3D FLASH; TR/TE/flip angle: 20-50 msec/6-10 msec/20-30 degrees), and their volume information was analyzed by multiple projection reconstruction (MPR), surface-rendering 3D reconstruction, and volume-rendering 3D reconstruction using Volume Design PRO (Medical Design Co., Ltd.). Four of them were diagnosed clinically with Broca's aphasia, and their lesions could be detected around the cortices of the left inferior frontal gyrus. The other 2 patients were diagnosed with Wernicke's aphasia, and their lesions could be detected around the cortices of the left supramarginal gyrus. This technique for 3D volume analysis should provide quite exact locational information about cerebral cortical lesions. (author).

  9. An analysis method of the press information related with the nuclear activity in Argentina

    International Nuclear Information System (INIS)

    The articles published by the newspapers during the year 1987 were analyzed and classified according to their contents. An attribute (positive, negative, or neutral) was assigned to each article in agreement with its connotation regarding the nuclear activity in Argentina. An ISIS-based database system was developed using these data. The purpose of this analysis was to evaluate the influence of the press on public opinion. The relations between the different variables show the importance and the approach (environmental, technico-scientific, or political) given by the press to the different subjects. The results show a general lack of knowledge about nuclear activities and a concern among readers associated with environmental risks, which calls for the development of an information program for the community. The fundamentals of this program should be to improve the organization so that information reaches external demands, to promote educational programs, and to continuously provide information to the press. (S.M.)

  10. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has recently been introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes for the robustness of Wireless Information-Theoretic Security and calling for further theoretical analysis.
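
    For quasi-static Rayleigh fading, the WITS literature gives a closed form for the probability of nonzero secrecy capacity in terms of the two average SNRs. Assuming the paper uses that standard expression, a minimal sketch is:

```python
def prob_nonzero_secrecy(avg_snr_main_db, avg_snr_wiretap_db):
    """P(Cs > 0) for quasi-static Rayleigh-fading main and wiretap channels:
    P = gamma_M / (gamma_M + gamma_W), with average SNRs in linear scale.
    (Standard WITS result; assumed here to match the paper's formula.)"""
    gm = 10 ** (avg_snr_main_db / 10)
    gw = 10 ** (avg_snr_wiretap_db / 10)
    return gm / (gm + gw)

# Legitimate link 10 dB better than the eavesdropper's (e.g., an NLOS
# obstacle degrading only the wiretap channel):
print(f"P(Cs>0) = {prob_nonzero_secrecy(20, 10):.3f}")   # ~0.909
```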

  11. A Time Series Analysis of Cancer-Related Information Seeking: Hints From the Health Information National Trends Survey (HINTS) 2003-2014.

    Science.gov (United States)

    Huerta, Timothy R; Walker, Daniel M; Johnson, Tyler; Ford, Eric W

    2016-09-01

    Recent technological changes, such as the growth of the Internet, have made cancer information widely available. However, it remains unknown whether changes in access have resulted in concomitant changes in information seeking behavior. Previous work explored the cancer information seeking behaviors of the general population using the 2003 Health Information National Trends Survey (HINTS). This article aims to reproduce, replicate, and extend that existing analysis using the original dataset and five additional iterations of HINTS (2007, 2011, 2012, 2013, 2014). This approach builds on the earlier work by quantifying the magnitude of change in information seeking behaviors. Bivariate comparison of the 2003 and 2014 data revealed very similar results; however, the multivariate model including all years of data indicated differences between the original and extended models: individuals age 65 and older were no longer less likely to seek cancer information than the 18-35 reference population, and Hispanics were also no longer less likely to be cancer information seekers. The results of our analysis indicate an overall shift in cancer information seeking behaviors and also illuminate the impact of increased Internet usage over the past decade, suggesting specific demographic groups that may benefit from cancer information seeking encouragement. PMID:27565190

  12. Open source tools for the information theoretic analysis of neural data

    OpenAIRE

    Alberto Mazzoni; Petersen, Rasmus S.

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus...

  13. An Illumination Invariant Face Detection Based on Human Shape Analysis and Skin Color Information

    Directory of Open Access Journals (Sweden)

    Dibakar Chakraborty

    2012-06-01

    This paper provides a novel approach to face area localization through analyzing the shape characteristics of the human body. The face region is extracted by determining the sharp increase in body pixels in the shoulder area relative to the neck region. To confirm the face area, skin color information is also analyzed. The experimental analysis shows that the proposed algorithm detects the face area effectively, and its performance is found to be quite satisfactory.
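
    A rough sketch of the two ingredients follows: a skin-color mask and a row-wise body-pixel profile whose sharpest jump marks the shoulders. The CbCr thresholds are commonly used values and the synthetic silhouette is illustrative; the paper's exact models are not reproduced here.

```python
import numpy as np

def skin_mask(rgb):
    """Skin mask via widely used CbCr thresholds (the paper's exact skin
    model is not specified here)."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

def head_rows(body_mask):
    """Rows above the sharpest increase of the per-row body-pixel count,
    i.e. above the neck-to-shoulder transition."""
    profile = body_mask.sum(axis=1).astype(float)
    shoulder = int(np.argmax(np.diff(profile)))
    return 0, shoulder

# Synthetic silhouette: narrow head/neck on top of wide shoulders.
body = np.zeros((100, 60), dtype=bool)
body[10:40, 25:35] = True     # head and neck
body[40:100, 10:50] = True    # shoulders and torso
print("candidate face rows: 0 ..", head_rows(body)[1])
# skin_mask(frame) restricted to these rows would then confirm the face area.
```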

  14. The dynamic of information-driven coordination phenomena: a transfer entropy analysis

    OpenAIRE

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2015-01-01

    Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social syste...

  15. The dynamics of information-driven coordination phenomena: A transfer entropy analysis

    OpenAIRE

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-01-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This met...
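
    Both of the preceding records rest on symbolic transfer entropy. A self-contained sketch follows: the series are symbolized as ordinal patterns and the transfer entropy is estimated from plug-in counts. The embedding dimension and the toy coupled series are arbitrary choices.

```python
import numpy as np
from collections import Counter

def ordinal_symbols(x, m=3):
    """Ordinal-pattern symbolization: each window of length m is replaced
    by the permutation that sorts it (Bandt-Pompe symbols)."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def symbolic_te(source, target, m=3):
    """Symbolic transfer entropy TE(source -> target), in bits:
    TE = sum p(x+, x, y) * log2[ p(x+ | x, y) / p(x+ | x) ]."""
    xs, ys = ordinal_symbols(target, m), ordinal_symbols(source, m)
    n = len(xs) - 1
    pxxy = Counter((xs[t + 1], xs[t], ys[t]) for t in range(n))
    pxy = Counter((xs[t], ys[t]) for t in range(n))
    pxx = Counter((xs[t + 1], xs[t]) for t in range(n))
    px = Counter(xs[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in pxxy.items():
        te += (c / n) * np.log2((c / pxy[(x0, y0)]) / (pxx[(x1, x0)] / px[x0]))
    return te

# Toy example: y drives x with a one-step lag, so TE(y->x) >> TE(x->y).
rng = np.random.default_rng(1)
y = rng.normal(size=2000)
x = np.concatenate([[0.0], 0.9 * y[:-1]]) + 0.3 * rng.normal(size=2000)
print(f"TE(y->x) = {symbolic_te(y, x):.3f}   TE(x->y) = {symbolic_te(x, y):.3f}")
```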

  16. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  17. PASSIVE LOCATION AND ACCURACY ANALYSIS USING TDOA INFORMATION OF MULTI-STATIONS

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new exact, explicit, non-iterative, and computationally efficient solution of the nonlinear equation set for estimating emitter position based on the time differences of arrival (TDOA) measured by multiple stations is proposed. An accuracy analysis of the location method is also presented. Finally, performance evaluation results for emitter location using TDOA information are illustrated by graphs of the Geometrical Dilution of Precision (GDOP) for various conditions in the specific surveillance region.
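
    The standard way to obtain such an exact, non-iterative solution is to linearize the hyperbolic TDOA equations by introducing the reference range as an extra unknown. Below is a sketch of that spherical-interpolation-style least-squares solution; it illustrates the family of methods, not necessarily the paper's exact algorithm.

```python
import numpy as np

def tdoa_locate(stations, tdoas, c=3e8):
    """Closed-form (non-iterative) TDOA emitter location in 2-D.

    stations: (N, 2) receiver positions; station 0 is the reference.
    tdoas: (N-1,) arrival-time differences t_i - t_0 for i = 1..N-1.
    Linearizes r_i = r_0 + c*tdoa_i into A @ [x, y, r_0] = b and solves
    by least squares.
    """
    s0, si = stations[0], stations[1:]
    d = c * np.asarray(tdoas)                 # range differences r_i - r_0
    K = np.sum(stations**2, axis=1)
    A = np.column_stack([2 * (si - s0), 2 * d])
    b = K[1:] - K[0] - d**2
    x, y, r0 = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.array([x, y])

# Synthetic check: emitter at (400, 300) m, four stations, exact TDOAs.
stations = np.array([[0, 0], [1000, 0], [0, 1000], [1000, 1000]], dtype=float)
emitter = np.array([400.0, 300.0])
r = np.linalg.norm(stations - emitter, axis=1)
tdoas = (r[1:] - r[0]) / 3e8
print(tdoa_locate(stations, tdoas))   # ~ [400, 300]
```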

  18. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    OpenAIRE

    David Nicholas; David Clark; Hamid R. Jamali; Anthony Watkinson

    2014-01-01

    The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers as producers and consumers of scholarly information resources in respect to how they determine authority and trustworthiness. The research questions for the study arose out of CIBER’s studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publish...

  19. Webometric analysis of departments of librarianship and information science: a follow-up study

    OpenAIRE

    Arakaki, M.; Willett, P.

    2009-01-01

    This paper reports an analysis of the websites of UK departments of library and information science. Inlink counts of these websites revealed no statistically significant correlation with the quality of the research carried out by these departments, as quantified using departmental grades in the 2001 Research Assessment Exercise and citations in Google Scholar to publications submitted for that Exercise. Reasons for this lack of correlation include: difficulties in disambiguating department...

  20. SOCIAL NETWORK ANALYSIS FOR THE MEASUREMENT OF FORMAL AND INFORMAL STRUCTURES

    Directory of Open Access Journals (Sweden)

    Siomara Maria Pierangeli Pascotto

    2013-03-01

    Discussion around the application of social network analysis methodology has helped in understanding the social dynamics of organizations, including governments, which are subject to management structures with a high degree of functional hierarchy. This study was developed in a federal school institution in Sao Paulo State, Brazil, seeking to identify the links developed between the institution's administrative support staff members and their social networks. The research objective is to compare the observed link data with social network theories, which hold that the more ties (contacts) exist in an informal group, the more the network promotes an open environment for exchanging knowledge and information. The research sought to identify how the social networks develop, how common objectives and subjects turn people into closely connected groups, how focal points and main contacts emerge from this community, and what their influence is on the rest of the group. Social network measurement methodologies were applied, combined with qualitative instrument analysis. In conclusion, it was found that the informal structure influences the formal structure by providing an environment for exchanging knowledge and information, and that some actors are in fact responsible for the dynamics of the networks, occupying driving and strategic positions that earn them recognition as such by the other agents in the network.
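
    The focal points and brokers the study describes are typically identified with centrality measures. Here is a minimal networkx sketch on an invented staff sociogram (the study's real data are not public): degree centrality finds well-connected actors, while betweenness centrality finds brokers who sit on the shortest paths between otherwise distant parts of the network.

```python
import networkx as nx

# Hypothetical "who do you exchange work information with?" ties among
# administrative support staff.
ties = [("Ana", "Bruno"), ("Ana", "Carla"), ("Ana", "Davi"),
        ("Bruno", "Carla"), ("Davi", "Edu"), ("Edu", "Fabio"),
        ("Ana", "Edu")]
G = nx.Graph(ties)

degree = nx.degree_centrality(G)
broker = nx.betweenness_centrality(G)
for actor in sorted(G, key=broker.get, reverse=True):
    print(f"{actor:6s} degree={degree[actor]:.2f} betweenness={broker[actor]:.2f}")
```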

  1. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  2. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  3. Trial sequential analysis reveals insufficient information size and potentially false positive results in many meta-analyses

    DEFF Research Database (Denmark)

    Brok, Jesper; Thorlund, Kristian; Gluud, Christian

    2008-01-01

    To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous...

  4. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial...

  5. 40 CFR 1400.9 - Access to off-site consequence analysis information by State and local government officials.

    Science.gov (United States)

    2010-07-01

    ... analysis information by State and local government officials. 1400.9 Section 1400.9 Protection of... information by State and local government officials. (a) The Administrator shall make available to any State or local government official for official use the OCA information for stationary sources located...

  6. An Analysis of Information Technology on Data Processing by using Cobit Framework

    Directory of Open Access Journals (Sweden)

    Surni Erniwati

    2015-09-01

    Information technology and processes are interconnected, directing and controlling the company in achieving corporate goals through added value and a balancing of the risks and benefits of information technology. This study aims to analyze the maturity level of the data process and to produce information technology recommendations regarding the management of IT so that it better supports academic services. The maturity level calculation was done by analyzing questionnaires on the state of information technology. The results of this study show that the governance of information technology in data processing at ASM Mataram is currently quite good. The current maturity value for the data process is 2.69, which means that the organization already has a repeatable pattern for managing the activities related to data management processes. Based on the analysis of the gap between the current and expected conditions, solutions or corrective actions can be taken to gradually improve IT governance in the data management process at ASM Mataram.

  7. Legal analysis of the instructions for use information in the K-Files packages

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2010-06-01

    Introduction: Anvisa classifies endodontic files as medical products and, therefore, all commercial trademarks sold in Brazil must have an adequate registration. To achieve this registration, several pieces of information on the product should be made available to consumers (dentists) in order to allow its proper use and to avoid possible accidents. Objective: To examine whether the information set forth in endodontic K-Files packages, labels, and instructions for use is in accordance with current legislation, especially that established by Anvisa and the Consumer's Defense Code (CDC). Material and methods: 29 retail dental centers were visited, 11 samples of different commercial trademarks of K-Files first series (15-40) were obtained, and the information available on them was submitted to an analysis based on legal orders. Results: In all trademarks, there was no information available on how to use the product or on the means of storing the files before/after use. Only the SybronEndo trademark warned about the risks of using the files and reported criteria for the number of uses and disposal. Only the Mani trademark adequately informed on how to sterilize. Conclusion: It was verified that certain rules established by Anvisa and the CDC are being disregarded concerning the display of certain necessary and required information that should be included on labels, instructions for use, or K-Files commercial packages. Considering the large amount of information that must be available for the proper use of endodontic files, it is important that this information be displayed, preferably by means of instructions for use in the commercial package to be acquired by dentists.

  8. What Are Your Patients Reading Online About Soft-tissue Fillers? An Analysis of Internet Information

    Science.gov (United States)

    Al Youha, Sarah A.; Bull, Courtney E.; Butler, Michael B.; Williams, Jason G.

    2016-01-01

    Background: Soft-tissue fillers are increasingly being used for noninvasive facial rejuvenation. They generally offer minimal downtime and reliable results. However, significant complications are reported and patients need to be aware of these as part of informed consent. The Internet serves as a vital resource to inform patients of the risks and benefits of this procedure. Methods: Three independent reviewers performed a structured analysis of 65 Websites providing information on soft-tissue fillers. Validated instruments were used to analyze each site across multiple domains, including readability, accessibility, reliability, usability, quality, and accuracy. Associations between the endpoints and Website characteristics were assessed using linear regression and proportional odds modeling. Results: The majority of Websites were physician private practice sites (36.9%) and authored by board-certified plastic surgeons or dermatologists (35.4%) or nonphysicians (27.7%). Sites had a mean Flesch-Kincaid grade level of 11.9 ± 2.6, which is well above the recommended average of 6 to 7 grade level. Physician private practice sites had the lowest scores across all domains with a notable lack of information on complications. Conversely, Websites of professional societies focused in plastic surgery and dermatology, as well as academic centers scored highest overall. Conclusions: As the use of soft-tissue fillers is rising, patients should be guided toward appropriate sources of information such as Websites sponsored by professional societies. Medical professionals should be aware that patients may be accessing poor information online and strive to improve the overall quality of information available on soft-tissue fillers.
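
    The readability finding rests on the Flesch-Kincaid grade level, which is easy to reproduce. A sketch with the standard formula and a crude syllable counter follows; production readability tools count syllables more carefully.

```python
import re

def count_syllables(word):
    """Crude vowel-group syllable counter (adequate for an illustration)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1                      # drop a silent final 'e'
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

sample = ("Hyaluronic acid fillers restore facial volume. "
          "Vascular occlusion is a rare but serious complication.")
print(f"grade level = {fk_grade(sample):.1f}")
```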

  9. An analysis of water data systems to inform the Open Water Data Initiative

    Science.gov (United States)

    Blodgett, David L.; Read, Emily Kara; Lucido, Jessica M.; Slawecki, Tad; Young, Dwane

    2016-01-01

    Improving access to data and fostering open exchange of water information is foundational to solving water resources issues. In this vein, the Department of the Interior's Assistant Secretary for Water and Science put forward the charge to undertake an Open Water Data Initiative (OWDI) that would prioritize and accelerate work toward better water data infrastructure. The goal of the OWDI is to build out the Open Water Web (OWW). We therefore considered the OWW in terms of four conceptual functions: water data cataloging, water data as a service, enriching water data, and community for water data. To describe the current state of the OWW and identify areas needing improvement, we conducted an analysis of existing systems using a standard model for describing distributed systems and their business requirements. Our analysis considered three OWDI-focused use cases—flooding, drought, and contaminant transport—and then examined the landscape of other existing applications that support the Open Water Web. The analysis, which includes a discussion of observed successful practices of cataloging, serving, enriching, and building community around water resources data, demonstrates that we have made significant progress toward the needed infrastructure, although challenges remain. The further development of the OWW can be greatly informed by the interpretation and findings of our analysis.

  10. Improving access to health information for older migrants by using grounded theory and social network analysis to understand their information behaviour and digital technology use.

    Science.gov (United States)

    Goodall, K T; Newman, L A; Ward, P R

    2014-11-01

    Migrant well-being can be strongly influenced by the migration experience and subsequent degree of mainstream language acquisition. There is little research on how older Culturally And Linguistically Diverse (CALD) migrants who have 'aged in place' find health information, and the role which digital technology plays in this. Although the research for this paper was not focused on cancer, we draw out implications for providing cancer-related information to this group. We interviewed 54 participants (14 men and 40 women) aged 63-94 years, who were born in Italy or Greece, and who migrated to Australia mostly as young adults after World War II. Constructivist grounded theory and social network analysis were used for data analysis. Participants identified doctors, adult children, local television, spouse, local newspaper and radio as the most important information sources. They did not generally use computers, the Internet or mobile phones to access information. Literacy in their birth language, and the degree of proficiency in understanding and using English, influenced the range of information sources accessed and the means used. The ways in which older CALD migrants seek and access information has important implications for how professionals and policymakers deliver relevant information to them about cancer prevention, screening, support and treatment, particularly as information and resources are moved online as part of e-health.

  11. Extending hierarchical task analysis to identify cognitive demands and information design requirements.

    Science.gov (United States)

    Phipps, Denham L; Meakin, George H; Beatty, Paul C W

    2011-07-01

    While hierarchical task analysis (HTA) is well established as a general task analysis method, there appears a need to make more explicit both the cognitive elements of a task and design requirements that arise from an analysis. One way of achieving this is to make use of extensions to the standard HTA. The aim of the current study is to evaluate the use of two such extensions--the sub-goal template (SGT) and the skills-rules-knowledge (SRK) framework--to analyse the cognitive activity that takes place during the planning and delivery of anaesthesia. In quantitative terms, the two methods were found to have relatively poor inter-rater reliability; however, qualitative evidence suggests that the two methods were nevertheless of value in generating insights about anaesthetists' information handling and cognitive performance. Implications for the use of an extended HTA to analyse work systems are discussed.
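
    The inter-rater reliability mentioned here is typically quantified with Cohen's kappa for categorical codings. A minimal sketch, with invented SRK codings from two analysts:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codings:
    kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in ca) / n**2               # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Two analysts assigning SRK levels to the same task steps:
a = ["skill", "rule", "rule", "knowledge", "skill", "rule"]
b = ["skill", "rule", "knowledge", "knowledge", "rule", "rule"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```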

  12. FACTORS OF INFLUENCE ON THE ENTREPRENEURIAL INTEREST: AN ANALYSIS WITH STUDENTS OF INFORMATION TECHNOLOGY RELATED COURSES

    Directory of Open Access Journals (Sweden)

    Diego Guilherme Bonfim

    2009-10-01

    The purpose of the research was to analyze the entrepreneurial interest of students in information technology related courses. A literature review was performed, from which four hypotheses were announced, affirming that student interest in entrepreneurial activity is influenced by (1) the perceived vocation of the area, (2) the ownership of a company, (3) the perceived social support from friends and family, and (4) entrepreneurial skills mastery. A field study was developed, with data collected from 171 students of higher education institutions in Fortaleza. The data were analyzed using statistical techniques of descriptive analysis, analysis of variance, and multiple regression analysis. It was found that: (1) students, in general, have a moderate predisposition to engage in entrepreneurial activities; and (2) entrepreneurial interest is influenced by the perceived entrepreneurial vocation of the area, social support, and perceived strategic entrepreneurial skills mastery.

  13. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way and facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. This technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS and complementing business process diagramming techniques with the specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows systematically deriving a specification of the IS memory (e.g. a UML class diagram), and (ii) it allows reasoning about the user interface design using abstract patterns. Thi...

  14. An information-theoretic analysis of return maximization in reinforcement learning.

    Science.gov (United States)

    Iwata, Kazunori

    2011-12-01

    We present a general analysis of return maximization in reinforcement learning. This analysis does not require assumptions of Markovianity, stationarity, and ergodicity for the stochastic sequential decision processes of reinforcement learning. Instead, our analysis assumes the asymptotic equipartition property fundamental to information theory, providing a substantially different view from that in the literature. As our main results, we show that return maximization is achieved by the overlap of typical and best sequence sets, and we present a class of stochastic sequential decision processes with the necessary condition for return maximization. We also describe several examples of best sequences in terms of return maximization in the class of stochastic sequential decision processes, which satisfy the necessary condition.

  15. Patient information on breast reconstruction in the era of the world wide web. A snapshot analysis of information available on youtube.com.

    Science.gov (United States)

    Tan, M L H; Kok, K; Ganesh, V; Thomas, S S

    2014-02-01

    Breast cancer patient's expectation and choice of reconstruction is increasing and patients often satisfy their information needs outside clinic time by searching the world wide web. The aim of our study was to analyse the quality of content and extent of information regarding breast reconstruction available on YouTube videos and whether this is an appropriate additional source of information for patients. A snapshot qualitative and quantitative analysis of the first 100 videos was performed after the term 'breast reconstruction' was input into the search window of the video sharing website www.youtube.com on the 1st of September 2011. Qualitative categorical analysis included patient, oncological and reconstruction factors. It was concluded that although videos uploaded onto YouTube do not provide comprehensive information, it is a useful resource that can be utilised in patient education provided comprehensive and validated videos are made available.

  16. The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study

    Directory of Open Access Journals (Sweden)

    Sooki

    2016-02-01

    Context: Family, especially the mother, has the most important role in the education, transformation of information, and health behaviors of girls in order for them to have a healthy transition through the critical stage of puberty, but there are different views in this regard. Objectives: Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother's role in informing girls about puberty. Data Sources: This meta-analysis study was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases, with determined key words and their MeSH equivalents. Study Selection: Quantitative cross-sectional articles were extracted by two independent researchers, and finally 46 articles were selected based on inclusion criteria. The STROBE checklist was used for evaluation of the studies. Data Extraction: The percentage of mothers as the current and preferred source of information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random effects model), and the studies' heterogeneity was analyzed using the I2 index. Variance between studies was analyzed using tau squared (Tau2) and Review Manager 5 software. Results: The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, other countries 66%). Conclusions: According to the findings of this study, it is essential that health professionals and officials of the ministry of

  17. Pattern Classification of Decomposed Wavelet Information using ART2 Networks for echoes Analysis

    Directory of Open Access Journals (Sweden)

    Solís M.

    2008-04-01

    Full Text Available The Ultrasonic Pulse-Echo technique has been successfully used in a non-destructive testing of materials. Toperform Ultrasonic Non-destructive Evaluation (NDE, an ultrasonic pulsed wave is transmitted into thematerials using a transmitting/receiving transducer or arrays of transducers,that produces an image ofultrasonic reflectivity. The information inherent in ultrasonic signals or image are the echoes coming fromflaws, grains, and boundaries of the tested material. The main goal of this evaluation is to determine theexistence of defect, its size and its position; for that matter, an innovative methodology is proposed based onpattern recognition and wavelet analysis for flaws detection and localization.The pattern recognition technique used in this work is the neural network named ART2 (Adaptive ResonanceTheory trained by the information given by the time-scale information of the signals via the wavelet transform.A thorough analysis between the neural network training and the type wavelets used for the training has beendeveloped, showing that the Symlet 6 wavelet is the optimum for our problem.

  18. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    This paper describes a SWOT analysis and shows its use in analysing strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is furthermore to show the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting the existing data in the database and the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering them himself/herself.

  19. Application of evidence theory in information fusion of multiple sources in bayesian analysis

    Institute of Scientific and Technical Information of China (English)

    周忠宝; 蒋平; 武小悦

    2004-01-01

    How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information often comes from different sources, and the prior distribution form could be easily known in some certain way while the parameters are hard to determine. In this paper, based on the evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when the form is known. By taking the prior distributions which result from the information of multiple sources and converting them into corresponding mass functions which can be combined by the Dempster-Shafer (D-S) method, we get the combined mass function and the representative points of the prior distribution. These points are used to fit the given distribution form to determine the parameters of the prior distribution. The fused prior distribution is thereby obtained and Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and get the representative points of the fused prior distribution is the central question we address in this paper. The simulation example shows that the proposed method is effective.
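
    The combination step at the heart of this approach is Dempster's rule. A minimal sketch, with an invented frame of discernment and invented masses standing in for the prior information of two sources:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Invented example: two sources expressing belief over candidate prior parameters.
A, B = frozenset({'theta1'}), frozenset({'theta2'})
AB = A | B
m1 = {A: 0.6, AB: 0.4}
m2 = {B: 0.3, AB: 0.7}
print(dempster_combine(m1, m2))
```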

  20. #FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.

    Science.gov (United States)

    Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher

    2014-12-01

    We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented on, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge is rooted in data analysts' capability of discerning the anomalous information behaviors, such as the spreading of rumors or misinformation, from the rest that are more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model. PMID:26356891

  1. Combining Global and Local Information for Knowledge-Assisted Image Analysis and Classification

    Directory of Open Access Journals (Sweden)

    Mezaris V

    2007-01-01

    A learning approach to knowledge-assisted image analysis and classification is proposed that combines global and local information with explicitly defined knowledge in the form of an ontology. The ontology specifies the domain of interest, its subdomains, the concepts related to each subdomain as well as contextual information. Support vector machines (SVMs) are employed in order to provide image classification to the ontology subdomains based on global image descriptions. In parallel, a segmentation algorithm is applied to segment the image into regions and SVMs are again employed, this time for performing an initial mapping between region low-level visual features and the concepts in the ontology. Then, a decision function, that receives as input the computed region-concept associations together with contextual information in the form of concept frequency of appearance, realizes image classification based on local information. A fusion mechanism subsequently combines the intermediate classification results, provided by the local- and global-level information processing, to decide on the final image classification. Once the image subdomain is selected, final region-concept association is performed using again SVMs and a genetic algorithm (GA) for optimizing the mapping between the image regions and the selected subdomain concepts taking into account contextual information in the form of spatial relations. Application of the proposed approach to images of the selected domain results in their classification (i.e., their assignment to one of the defined subdomains) and the generation of a fine granularity semantic representation of them (i.e., a segmentation map with semantic concepts attached to each segment). Experiments with images from the personal collection domain, as well as comparative evaluation with other approaches of the literature, demonstrate the performance of the proposed approach.
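
    As a toy illustration of the late-fusion idea (one SVM on global descriptors, one on region-level descriptors, intermediate results combined), here is a scikit-learn sketch; the random features, the 0.6/0.4 weighting, and the omission of the ontology and GA refinement steps are all simplifying assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Stand-in data: 200 images, 3 subdomains; global descriptors (e.g. color layout)
# and local descriptors aggregated over segmented regions (both invented here).
y = rng.integers(0, 3, 200)
X_global = rng.standard_normal((200, 32)) + y[:, None]
X_local = rng.standard_normal((200, 16)) + 0.5 * y[:, None]

svm_global = SVC(probability=True).fit(X_global[:150], y[:150])
svm_local = SVC(probability=True).fit(X_local[:150], y[:150])

# Late fusion: combine the two intermediate classifications (here a weighted
# average of class probabilities stands in for the paper's fusion mechanism).
p = 0.6 * svm_global.predict_proba(X_global[150:]) + \
    0.4 * svm_local.predict_proba(X_local[150:])
pred = p.argmax(axis=1)
print("fused accuracy:", (pred == y[150:]).mean())
```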

  2. Information problem solving by experts and novices: Analysis of a complex cognitive skill.

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Vermetten, Yvonne

    2007-01-01

    In (higher) education students are often faced with information problems: tasks or assignments that require them to identify information needs, locate corresponding information sources, extract and organize relevant information from each source, and synthesize information from a variety of sources.

  3. Bringing trauma-informed practice to domestic violence programs: A qualitative analysis of current approaches.

    Science.gov (United States)

    Wilson, Joshua M; Fauci, Jenny E; Goodman, Lisa A

    2015-11-01

    Three out of 10 women and 1 out of 10 men in the United States experience violence at the hands of an intimate partner-often with devastating costs. In response, hundreds of residential and community-based organizations have sprung up to support survivors. Over the last decade, many of these organizations have joined other human service systems in adopting trauma-informed care (TIC), an approach to working with survivors that responds directly to the effects of trauma. Although there have been various efforts to describe TIC in domestic violence (DV) programs, there is a need to further synthesize this discourse on trauma-informed approaches to better understand specific applications and practices for DV programs. This study aimed to address this gap. The authors of this study systematically identified key documents that describe trauma-informed approaches in DV services and then conducted a qualitative content analysis to identify core themes. Results yielded 6 principles (Establishing emotional safety, Restoring choice and control, Facilitating connection, Supporting coping, Responding to identity and context, and Building strengths), each of which comprised a set of concrete practices. Despite the common themes articulated across descriptions of DV-specific trauma-informed practices (TIP), we also found critical differences, with some publications focusing narrowly on individual healing and others emphasizing the broader community and social contexts of violence and oppression. Implications for future research and evaluation are discussed. PMID:26594925

  4. AN ECONOMIC ANALYSIS OF THE DETERMINANTS OF ENTREPRENEURSHIP: THE CASE OF MASVINGO INFORMAL BUSINESSES

    Directory of Open Access Journals (Sweden)

    Clainos Chidoko

    2013-03-01

    In the past decade, Zimbabwe has been hit by its worst economic performance since its independence in 1980. Capacity utilization shrank to ten percent and the unemployment rate was above eighty percent by 2008 as the private and public sectors witnessed massive retrenchments. As a result, many people are finding themselves engaging in informal businesses to make ends meet. However, not all people have joined the informal sector, as witnessed by the number of people who left the country in droves for neighbouring countries. It is against this background that this research conducted an economic analysis of the determinants of entrepreneurship in Masvingo urban, with an emphasis on informal businesses. The research targeted a sample of 100 informal businesses (30 from the Rujeko light industrial area, 40 from the Mucheke light industrial area and 30 from the Masvingo Central Business District). The businesses included, among others, flea market operators, furniture manufacturers, suppliers and producers of agricultural products, and food vendors. The research found that level of education, gender, age, marital status, number of dependants, type of subjects studied at secondary school and vocational training are the main determinants that influence the type of business an entrepreneur ventures into. The study recommends formal training for the participants, so that the businesses continue in existence, since they fill the gap left vacant by most formal enterprises.

  5. Analysis of Characteristics Extension Workers to Utilization of Information and Communication Technology

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-08-01

    Science and technology are developing rapidly with the demands of changing times. The development of information and communication technology, especially since the advent of internet technology, has led to major changes in society. Information technology products are relatively cheap and affordable and facilitate access to information beyond national borders and cultural boundaries. This condition has penetrated all levels of human life, including farmers in the villages. The extension worker's role as a facilitator in developing the potential of farmers therefore becomes important. Consequently, extension workers are required to adjust to the changes and demands of the growing community. The objective of the research is the analysis of the characteristics of extension workers in relation to the utilization of information and communication technology in Limapuluh Kota regency, West Sumatera. This study is a descriptive-correlational survey-based study with a sample consisting of government-employed as well as freelance extension workers in 8 Agency of Agriculture, Fisheries and Forestry Extension (BP3K) offices in Limapuluh Kota regency, West Sumatera province. Based on the results obtained, the difference test (t-test) shows that there are significant differences between the characteristics of the civil servants and the THL-TBPP, especially in the aspects of age and length of employment.

  6. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
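
    The classical partition-based measure mentioned here assigns to a graph the Shannon entropy of a vertex partition. A simplified sketch using the degree partition (the original topological information content uses orbits of the automorphism group, which is heavier to compute):

```python
import math
import networkx as nx

def partition_entropy(graph, key=lambda g, v: g.degree(v)):
    """Shannon entropy of a vertex partition; with the degree key this is a
    simplified stand-in for the topological information content of a graph."""
    classes = {}
    for v in graph:
        k = key(graph, v)
        classes[k] = classes.get(k, 0) + 1
    n = graph.number_of_nodes()
    return -sum((c / n) * math.log2(c / n) for c in classes.values())

# Invented example: a small molecule-like chain versus a ring.
chain = nx.path_graph(6)   # two degree classes -> positive entropy
ring = nx.cycle_graph(6)   # vertex-transitive -> zero entropy
print("chain:", round(partition_entropy(chain), 3))
print("ring:", round(partition_entropy(ring), 3))
```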

  7. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical service online via Telecare Medical Information Systems (TMIS) thanks to the fast development of computer technology, so the security of communication through the network between the users and the server is very significant. Authentication plays an important part in protecting information from being attacked by malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we have shown that Jiang et al.'s scheme has the weakness of ID uselessness and is vulnerable to off-line password guessing attack and user impersonation attack if an attacker compromises the legal user's smart card. Also, it cannot resist DoS attack in two cases: after a successful impersonation attack, and on wrong password input in the password change phase. We then propose an improved mutual authentication scheme for a telecare medical information system. Remote monitoring, checking patients' past medical history records and medical consultation can be applied in the system, where information is transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS. PMID:23818249

  8. Research on analysis method for temperature control information of high arch dam construction

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Temperature control, which is directly responsible for the project quality and progress, plays an important role in high arch dam construction. How to discover the rules from a large amount of collected temperature control information, in order to guide the adjustment of temperature control measures to prevent cracks on site, is the key scientific problem. In this paper, a mathematical logical model was built first by means of a coupling analysis of temperature control system decomposition and coordination for a high arch dam. Then, an analysis method for temperature control information was presented based on data mining technology. Furthermore, the data warehouse of temperature control was designed, and an artificial neural network forecasting model for the highest temperature of concrete was also developed. Finally, these methods were applied to a practical project. The result showed that the efficiency and precision of temperature control were improved, and the rationality and scientificity of management and decision-making were strengthened. All of these researches provided an advanced analysis method for temperature control in the high arch dam construction process.
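
    As a hedged illustration of the forecasting component, the sketch below fits a small neural network regressor to predict a highest-temperature value from monitoring features; the feature names and the synthetic data are invented stand-ins, not the paper's data warehouse variables.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Invented monitoring features: placing temperature, ambient temperature,
# cooling-water flow, lift thickness -> highest concrete temperature reached.
X = rng.uniform([10, 5, 0.5, 1.5], [25, 35, 2.0, 3.0], size=(500, 4))
y = 15 + 0.8 * X[:, 0] + 0.3 * X[:, 1] - 4.0 * X[:, 2] + 2.0 * X[:, 3] \
    + rng.normal(0, 0.5, 500)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X[:400], y[:400])
print("R^2 on held-out lifts:", round(model.score(X[400:], y[400:]), 3))
```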

  9. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  10. Parametric Sensitivity Analysis for Stochastic Molecular Systems using Information Theoretic Metrics

    CERN Document Server

    Tsourtis, Anastasios; Katsoulakis, Markos A; Harmandaris, Vagelis

    2014-01-01

    In this paper we extend the parametric sensitivity analysis (SA) methodology proposed in Ref. [Y. Pantazis and M. A. Katsoulakis, J. Chem. Phys. 138, 054115 (2013)] to continuous time and continuous space Markov processes represented by stochastic differential equations and, particularly, stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation, both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an al...

  11. A media information analysis for implementing effective countermeasure against harmful rumor

    Science.gov (United States)

    Nagao, Mitsuyoshi; Suto, Kazuhiro; Ohuchi, Azuma

    2010-04-01

    When a large scale earthquake occurs, the term "harmful rumor" comes to be heard frequently. Harmful rumor means economic damage caused when people regard actually safe foods or areas as dangerous and then refrain from consumption or sightseeing. In the case of harmful rumor caused by an earthquake, the tourism industry especially receives massive economic damage. Harmful rumor that causes substantial economic damage has become a serious social issue which must be solved. In this paper, we propose a countermeasure method against harmful rumor on the basis of media trends, in order to implement speedy recovery from harmful rumor. Here, we investigate the amount and content of information which is transmitted to the general public by the media when an earthquake occurs. In addition, the media information in three earthquakes is treated as instances. Finally, we discuss an effective countermeasure method for dispelling harmful rumor based on these analysis results.

  12. The value of information as applied to the Landsat Follow-on benefit-cost analysis

    Science.gov (United States)

    Wood, D. B.

    1978-01-01

    An econometric model was run to compare the current forecasting system with a hypothetical (Landsat Follow-on) space-based system. The baseline current system was a hybrid of USDA SRS domestic forecasts and the best known foreign data. The space-based system improved upon the present Landsat by the higher spatial resolution capability of the thematic mapper. This satellite system is a major improvement for foreign forecasts but no better than SRS for domestic forecasts. The benefit analysis was concentrated on the use of Landsat Follow-on to forecast world wheat production. Results showed that it was possible to quantify the value of satellite information and that there are significant benefits in more timely and accurate crop condition information.

  13. The Roles of Conference Papers in IS: An Analysis of the Scandinavian Conference on Information Systems

    DEFF Research Database (Denmark)

    Lanamäki, Arto; Persson, John Stouby

    2016-01-01

    Information Systems (IS) research has both a journal-oriented publication culture and a rich plethora of conferences. It is unclear why IS researchers even bother with conference publishing given the high focus on journals. Against this backdrop, the purpose of this paper is to increase our...... understanding of conference papers in IS and the role they play for the authoring researchers. We present the first analysis of the papers published during the first six years (2010-2015) in the Scandinavian Conference on Information Systems (SCIS). We conducted interviews with ten SCIS authors. Following...... a framework adopted from Åkerlind [1], we identified how SCIS papers have the roles of fulfilling requirements, establishing oneself, developing personally, enabling change, and other roles. This article contributes to the reflection literature on the IS field by applying a practice lens to understand...

  14. ERISTAR: Earth Resources Information Storage, Transformation, Analysis, and Retrieval administrative report

    Science.gov (United States)

    Vachon, R. I.; Obrien, J. F., Jr.; Lueg, R. E.; Cox, J. E.

    1972-01-01

    The 1972 Systems Engineering program at Marshall Space Flight Center, in which 15 participants representing 15 U.S. universities, 1 NASA/MSFC employee, and another specially assigned faculty member took part in an 11-week program, is discussed. The Fellows became acquainted with the philosophy of systems engineering and, as a training exercise, used this approach to produce a conceptual design for an Earth Resources Information Storage, Transformation, Analysis, and Retrieval System. The program was conducted in three phases; approximately 3 weeks were devoted to seminars, tours, and other presentations to expose the participants to technical and other aspects of the information management problem. The second phase, 5 weeks in length, consisted of evaluating alternative solutions to problems, effecting initial trade-offs and performing preliminary design studies and analyses. The last 3 weeks were occupied with final trade-off sessions, final design analyses and preparation of a final report and oral presentation.

  15. Database Semantic Interoperability based on Information Flow Theory and Formal Concept Analysis

    Directory of Open Access Journals (Sweden)

    Guanghui Yang

    2012-07-01

    As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration has been a long-standing challenge for the database community and has now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA) and Information Flow (IF) theories. For our purposes, we first discover knowledge from different databases by using FCA, and then align what is discovered by using IF and FCA. The development of FCA has led to software systems such as TOSCANA and TUPLEWARE, which can be used as tools for discovering knowledge in databases. A prototype based on IF and FCA has been developed. Our method is tested and verified by using this prototype and TUPLEWARE.
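
    The FCA step can be made concrete with a brute-force enumeration of formal concepts from a binary context; the three-row context below is an invented stand-in for knowledge discovered from a database.

```python
from itertools import chain, combinations

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) of a binary context given
    as a dict object -> set of attributes. Brute force; fine for toy contexts."""
    objects = list(context)
    attributes = sorted(set().union(*context.values()))

    def extent_of(attrs):   # objects having all attributes in attrs (the ' operator)
        return frozenset(o for o in objects if attrs <= context[o])

    def intent_of(objs):    # attributes shared by all objects in objs
        if not objs:
            return frozenset(attributes)
        return frozenset(set.intersection(*(context[o] for o in objs)))

    concepts = set()
    # (B', B'') is a formal concept for every attribute subset B, and every
    # concept arises this way, so closing all subsets enumerates them all.
    for attrs in chain.from_iterable(combinations(attributes, r)
                                     for r in range(len(attributes) + 1)):
        extent = extent_of(set(attrs))
        concepts.add((extent, intent_of(extent)))
    return concepts

# Invented context: rows of a hypothetical table described by attribute sets.
context = {'r1': {'numeric', 'key'}, 'r2': {'numeric'}, 'r3': {'text', 'key'}}
for extent, intent in sorted(formal_concepts(context), key=lambda c: len(c[0])):
    print(set(extent) or '{}', '->', set(intent) or '{}')
```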

  16. Rock Slopes Failure Susceptibility Analysis: From Remote Sensing Measurements to Geographic Information System Raster Modules

    Directory of Open Access Journals (Sweden)

    Filipello Andrea

    2010-01-01

    Problem statement: Two important steps can be recognised in rockfall analysis: potential failure detection and run-out simulation. In analyzing the stability of rock slopes, the most important kinematisms are planar or wedge sliding and toppling. The aim of this study was to couple a deterministic approach for landslide initiation (potential rockfall source areas) with a run-out analysis, by developing new GRASS GIS raster modules. A case study in the Ossola Valley, at the border between Italy and Switzerland, is discussed. Approach: New GIS raster modules for rockfall analysis were developed. The slope stability modules were based on rock mass classification indexes and on a limit equilibrium model, while the prediction of rockfall travel distance was based on the shadow angle approach. Results: The study highlighted the importance of GIS tools for the analysis of landslide susceptibility. The spatial forecasts provided by the new GIS modules were validated and supplemented by traditional analysis. Conclusion: This study proved that there is a good correspondence between the prediction of high proneness to instability calculated by the modules and the location of past events. The new modules provide an opportunity to assess, in an objective and repeatable way, the susceptibility to failure, as well as quantitative information about the invasion area of falling rock.
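
    The shadow angle approach used here for travel-distance prediction reduces, in its simplest form, to one formula: horizontal reach = fall height / tan(shadow angle). A sketch, where the 27.5 degree default is one commonly cited empirical minimum and should be treated as an assumption to calibrate against local events:

```python
import math

def shadow_angle_reach(fall_height_m, shadow_angle_deg=27.5):
    """Maximum horizontal travel distance of a falling block beyond the talus,
    from the empirical shadow-angle method: reach = height / tan(angle).
    The 27.5 degree default is an assumed empirical floor, not a universal value."""
    return fall_height_m / math.tan(math.radians(shadow_angle_deg))

# Invented cliff: a 120 m source height.
print(f"reach ~ {shadow_angle_reach(120):.0f} m")
```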

  17. Informativeness of minisatellite and microsatellite markers for genetic analysis in papaya.

    Science.gov (United States)

    Oliveira, G A F; Dantas, J L L; Oliveira, E J

    2015-10-01

    The objective of this study was to evaluate the informativeness of minisatellite and microsatellite markers in papaya (Carica papaya L.). Forty minisatellites and 91 microsatellites were used for genotyping 24 papaya accessions. Estimates of genetic diversity, genetic linkage and analyses of population structure were compared. A lower average number of alleles per locus was observed in minisatellites (3.10) compared with microsatellites (3.57), although the minisatellites showed more rare alleles (18.54%) compared with microsatellites (13.85%). Greater expected (He = 0.52) and observed (Ho = 0.16) heterozygosity was observed in the microsatellites compared with minisatellites (He = 0.42 and Ho = 0.11), possibly due to the high number of hermaphroditic accessions, resulting in high rates of self-fertilization. The polymorphic information content and Shannon-Wiener diversity were also higher for microsatellites (0.47 and 1.10, respectively) compared with minisatellites (0.38 and 0.85, respectively). The probability of paternity exclusion was high for both markers (>0.999), and the combined probability of identity was 1.65 x 10^-13 and 4.33 x 10^-38 for mini- and microsatellites, respectively, which indicates that both types of markers are suitable for genetic analysis. The Bayesian analysis indicated the formation of two groups (K = 2) for both markers, although the minisatellites indicated a substructure (K = 4). A greater number of accessions with a low probability of assignment to specific groups was observed for microsatellites. Collectively, the results indicated higher informativeness of microsatellites. However, the lower informative power of minisatellites may be offset by the use of a larger number of loci. Furthermore, minisatellites are subject to less error in genotyping because there is greater power to detect genotyping errors when larger motifs are used.
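
    The two headline statistics here are simple functions of allele frequencies: expected heterozygosity He = 1 - sum(p_i^2) and the polymorphic information content of Botstein et al. (1980). A sketch with an invented three-allele locus:

```python
def expected_heterozygosity(freqs):
    """He = 1 - sum(p_i^2) for allele frequencies at one locus."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Polymorphic information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * (freqs[i] ** 2) * (freqs[j] ** 2)
                for i in range(len(freqs)) for j in range(i + 1, len(freqs)))
    return 1.0 - s2 - cross

# Invented locus with three alleles.
f = [0.5, 0.3, 0.2]
print(f"He={expected_heterozygosity(f):.3f}, PIC={pic(f):.3f}")
```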

  18. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  19. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  20. Automatic generation of stop word lists for information retrieval and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
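
    A toy rendering of the described test (compare a term's keyword-adjacency frequency with its within-keyword frequency, exclude terms below the threshold, truncate the rest) might look as follows; the documents, keywords and thresholds are invented, and the real method operates on full corpora rather than two sentences:

```python
import re
from collections import Counter

def generate_stoplist(documents, keywords, min_ratio=1.0, max_words=10):
    """Toy adjacency-based stoplist generation: a term that occurs next to
    keywords more often than inside keywords behaves like a stop word."""
    kw_words = {w for k in keywords for w in k.lower().split()}
    kw_freq, adj_freq = Counter(), Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        for i, tok in enumerate(tokens):
            if tok in kw_words:
                kw_freq[tok] += 1            # term occurs within a keyword
            elif (i > 0 and tokens[i - 1] in kw_words) or \
                 (i + 1 < len(tokens) and tokens[i + 1] in kw_words):
                adj_freq[tok] += 1           # term borders a keyword occurrence
    stop = [t for t, a in adj_freq.items()
            if a / max(kw_freq[t], 1) >= min_ratio]
    stop.sort(key=lambda t: -adj_freq[t])    # truncate by a simple criterion
    return stop[:max_words]

docs = ["the routing model predicts the route of the truck",
        "a routing model for rail routes"]
print(generate_stoplist(docs, keywords=["routing model", "route"]))
```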

  1. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens' 'structuration theory' is developed, which offers an account of structure and agency; social practices [...] in the company (RTOC), from engineers and train drivers to the board of directors. Participant observation was also undertaken, with the authors attending twenty-one meetings, workshops and presentations. The resulting theoretical model describes IS embedded in social practices, which evolve to display both...

  2. Analysis on health information extracted from an urban professional population in Beijing

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tie-mei; ZHANG Yan; LIU Bin; JIA Hong-bo; LIU Yun-jie; ZHU Ling; LUO Sen-lin; HAN Yi-wen; ZHANG Yan; YANG Shu-wen; LIU An-nan; MA Lan-jun; ZHAO Yan-yan

    2011-01-01

    Background: The assembled data from a population could provide information on health trends within the population. The aim of this research was to extract basic health information from an urban professional population in Beijing. Methods: Data analysis was carried out in a population who underwent a routine medical check-up and were aged >20 years, including 30 058 individuals. General information, data from physical examinations and blood samples were collected by the same method. Health status was separated into three groups by the criteria generated in this study, i.e., people with common chronic diseases, people in a sub-clinical situation, and healthy people. The proportion of common diseases suffered and the health risk distribution of different age groups were also analyzed. Results: The proportions of people with common chronic diseases, in the sub-clinical group and in the healthy group were 28.6%, 67.8% and 3.6%, respectively. There were significant differences in the health situation in different age groups. Hypertension was at the top of the list of self-reported diseases. The proportion of chronic diseases increased significantly in people after 35 years of age. Meanwhile, the proportion of sub-clinical conditions was decreasing at the same rate. The complex risk factors for health in this population were metabolic disturbances (61.3%), risk for tumor (2.7%), abnormal results of morphological examination (8.2%) and abnormal results of lab tests of serum (27.8%). Conclusions: Health information could be extracted from a complex data set from the health check-ups of the general population. The information should be applied to support prevention and control of chronic diseases, as well as to direct interventions for patients with risk factors for disease.

  3. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

    The dependency of today's construction professionals on singular commercial applications for design possibilities creates the risk of being dictated to by the language-tools they use. This unwitting conversion to the constraints of a particular computer application's style reduces one's association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, 'cloud computing', required further analysis. The Delphi questionnaires reported here analysed the information-exchange opportunities of Building Information Modelling (BIM) as a possible solution for the integration of applications on a cloud platform. The survey structure is modelled to (i) identify the most appropriate applications for advancing interoperability at the early design stage, (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint, (iii) examine the need for standards to address information exchange between the design team, and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  4. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    Directory of Open Access Journals (Sweden)

    David Nicholas

    2014-06-01

    The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers, as producers and consumers of scholarly information resources, in respect of how they determine authority and trustworthiness. The research questions for the study arose out of CIBER's studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publisher's usage logs, which was undertaken at the start of the project in order to build an evidence base which would help calibrate the main methodological tools used by the project: interviews and a questionnaire. The specific purpose of the log study was to identify and assess the digital usage behaviours that potentially raise trustworthiness and authority questions. Results from the self-report part of the study were additionally used to explain the logs. The main findings were that: (1) logs provide a good indicator of use and information seeking behaviour, albeit in respect of just a part of the information seeking journey; (2) the 'lite' form of information seeking behaviour observed in the logs is a sign of users trying to make up their minds, in the face of a tsunami of information, as to what is relevant and to be trusted; (3) Google and Google Scholar are the discovery platforms of choice for academic researchers, which partly points to the fact that they are influenced in what they use and read by ease of access; (4) usage is not a suitable proxy for quality. The paper also provides contextual data from CIBER's previous studies.

  5. Organizational culture, creative behavior, and information and communication technology (ICT) usage: a facet analysis.

    Science.gov (United States)

    Carmeli, Abraham; Sternberg, Akiva; Elizur, D

    2008-04-01

    Despite the prominence of organizational culture (OC), this concept is controversial and its structure has yet to be systematically analyzed. This study develops a three-pronged formal definitional framework on the basis of facet theory (FT) and explores the behavior modality, referent, and object facets. This facet analysis (FA) of OC accounts successfully for variation in both creative behavior at work and the usage of information and communication technologies (ICTs). An analysis of data collected from 230 employees in the financial industry indicates that a radex structure was obtained for work and ICT. The behavior modality facet ordered the space from center to periphery, and the referent facet related to the direction of angles away from the origin.

  6. Science information to support Missouri River Scaphirhynchus albus (pallid sturgeon) effects analysis

    Science.gov (United States)

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-26

    The Missouri River Pallid Sturgeon Effects Analysis (EA) was commissioned by the U.S. Army Corps of Engineers to develop a foundation of understanding of how pallid sturgeon (Scaphirhynchus albus) population dynamics are linked to management actions in the Missouri River. The EA consists of several steps: (1) development of comprehensive, conceptual ecological models illustrating pallid sturgeon population dynamics and links to management actions and other drivers; (2) compilation and assessment of available scientific literature, databases, and models; (3) development of predictive, quantitative models to explore the system dynamics and population responses to management actions; and (4) analysis and assessment of effects of system operations and actions on species’ habitats and populations. This report addresses the second objective, compilation and assessment of relevant information.

  7. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.
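
    Since a WPS exposes its operations over plain HTTP, the kind of chaining described here can be exercised with ordinary requests. In the sketch below the endpoint URL, process identifier and inputs are placeholders, not the real IDEE service values; WPS 1.0.0 key-value encoding is assumed.

```python
import requests

# Placeholder endpoint -- not the real IDEE service URL.
WPS_URL = "https://example.org/wps"

# Discover the processes a WPS server offers (WPS 1.0.0 key-value encoding).
caps = requests.get(WPS_URL, params={
    "service": "WPS", "version": "1.0.0", "request": "GetCapabilities"})
print(caps.status_code, caps.headers.get("Content-Type"))

# Execute one geospatial operation; DataInputs uses the standard
# "name=value;name=value" KVP encoding for literal inputs.
resp = requests.get(WPS_URL, params={
    "service": "WPS", "version": "1.0.0", "request": "Execute",
    "identifier": "elevationStatistics",           # hypothetical process name
    "datainputs": "region=ES24;statistic=mean"})   # hypothetical inputs
print(resp.status_code)  # body would be an XML ExecuteResponse document
```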

  8. Implementation Of The ISO/IEC 27005 In Risk Security Analysis Of Management Information System

    Directory of Open Access Journals (Sweden)

    Sri Ariyani

    2016-08-01

    This study reports an analysis of the Information Security Management System (SMKI) at UPT SAMSAT Denpasar. The purpose of the analysis is to determine the maturity level of the SMKI at UPT SAMSAT Denpasar. The framework used in the analysis process is ISO/IEC 27005. The unit analysed comprises the main tasks and functions of the Section of Motor Vehicle Tax (PKB) and Motor Vehicle Ownership Transfer Charge (BBNKB) and the service processes performed by the staff of that section, including tax assessment, progressive-tax data capture, the data involved, the supporting structure and infrastructure and, of course, the stakeholders who take part in the process. The analysis was performed by implementing the ISO/IEC 27005 framework with reference to clauses 7 and 8. Clause 7 of ISO/IEC 27005 was applied to the organization structure, the list of constraints affecting the organization, and the list of legislative and regulatory references applicable to the organization, whereas clause 8 of ISO/IEC 27005 covers asset identification, asset valuation and impact assessment. The analysis shows that the assets with the highest risk level include the primary assets (the processes of assessment-code selection, tax assessment, determination of progressive-tax ownership status, determination of progressive-tax ownership order, and repeated progressive-tax data capture) and the supporting assets (the assessment staff and the progressive data-capture staff); largely the same primary and supporting assets also show the highest threat level.

  9. Understanding and Managing Our Earth through Integrated Use and Analysis of Geo-Information

    Directory of Open Access Journals (Sweden)

    Wolfgang Kainz

    2011-09-01

    All things in our world are related to some location in space and time, and according to Tobler's first law of geography "everything is related to everything else, but near things are more related than distant things" [1]. For as long as humans have existed, they have contemplated space and time and have tried to depict and manage the geographic space they live in. We know graphic representations of the land from various regions of the world dating back several thousands of years. The processing and analysis of spatial data has a long history in the disciplines that deal with spatial data, such as geography, surveying engineering, cartography, photogrammetry, and remote sensing. Until recently, all these activities were analog in nature; only since the invention of the computer in the second half of the 20th century, and the use of computers for the acquisition, storage, analysis, and display of spatial data starting in the 1960s, do we speak of geo-information and geo-information systems. [...]

  10. Analysis Level Of Utilization Information And Communication Technology With The Competency Level Of Extension Workers

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-01-01

    Extension places people as the subject of development and as human capital to be developed into independent and empowered (dignified) actors, able to adapt to their environment and thus to improve the quality of life for themselves, their families and their communities. Clear professional competency standards for extension workers, and effective control of the extension profession, are therefore necessary, supported by mastery of Information and Communication Technology (ICT). This research aimed to analyze the relationship between the competency level of extension workers and their level of ICT use. The study was designed as a descriptive-correlational survey, observed through a quantitative analysis approach supported by descriptive and inferential statistical analysis. The study was conducted in Bogor Regency, West Java Province. Based on this research, it can be concluded that the level of ICT utilization in resource-related aspects is significantly related to extension workers' competence in understanding regional potential, entrepreneurial ability and networking ability, while variation in extension materials and in the variety of related information is significantly related to all levels of extension workers' competence.

  11. LBL Socio-Economic Environmental-Demographic Information System (SEEDIS). Chart: graphic analysis and display system

    Energy Technology Data Exchange (ETDEWEB)

    Sventek, V.A.

    1978-03-01

    The Chart Graphic Analysis and Display System was developed as part of Lawrence Berkeley Laboratory's Socio-Economic-Environmental-Demographic Information System (SEEDIS) to provide a tool with which users could draw graphs, print tables of numbers, do analysis, and perform basic statistical operations on the same set of data from a terminal in their own office. The Chart system's operation is completely independent of the type of data being entered and has been used for applications ranging from employment to energy data. Users frequently save the data they put into Chart and add to it on a regular basis, and thereby create personal databases which can be blended with information from the formal databases maintained at LBL. Using any computer system requires that the user learn a set of instructions, which at the onset often seems overwhelming. It is the purpose of this workbook to make this initial learning process less traumatic. The typical use of Chart is to enter a set of numbers, and then tell Chart what should be done with them. Chart commands take the form of gruff pidgin English. There are approximately 50 commands available. This workbook illustrates the basic commands.

  12. Allocating health care: cost-utility analysis, informed democratic decision making, or the veil of ignorance?

    Science.gov (United States)

    Goold, S D

    1996-01-01

    Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot credibly assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.

  13. Managing Returnable Containers Logistics - A Case Study Part I - Physical and Information Flow Analysis

    Directory of Open Access Journals (Sweden)

    Reza A. Maleki

    2011-05-01

    This case study paper is the result of a project conducted on behalf of a company, hereafter referred to as Midwest Assembly and Manufacturing or MAAN. The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies that are needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its own facility as well as at the supplier sites. The objective of this project was to develop a proposal to enable MAAN to more effectively track and manage its returnable containers. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach to the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.

  14. Training needs for toxicity testing in the 21st century: a survey-informed analysis.

    Science.gov (United States)

    Lapenna, Silvia; Gabbert, Silke; Worth, Andrew

    2012-12-01

    Current training needs on the use of alternative methods in predictive toxicology, including new approaches based on mode-of-action (MoA) and adverse outcome pathway (AOP) concepts, are expected to evolve rapidly. In order to gain insight into stakeholder preferences for training, the European Commission's Joint Research Centre (JRC) conducted a single-question survey with twelve experts in regulatory agencies, industry, national research organisations, NGOs and consultancies. Stakeholder responses were evaluated by means of theory-based qualitative data analysis. Overall, a set of training topics were identified that relate both to general background information and to guidance for applying alternative testing methods. In particular, for the use of in silico methods, stakeholders emphasised the need for training on data integration and evaluation, in order to increase confidence in applying these methods for regulatory purposes. Although the survey does not claim to offer an exhaustive overview of the training requirements, its findings support the conclusion that the development of well-targeted and tailor-made training opportunities that inform about the usefulness of alternative methods, in particular those that offer practical experience in the application of in silico methods, deserves more attention. This should be complemented by transparent information and guidance on the interpretation of the results generated by these methods and software tools. PMID:23398336

  15. Mapping informative clusters in a hierarchical [corrected] framework of FMRI multivariate analysis.

    Directory of Open Access Journals (Sweden)

    Rui Xu

    Pattern recognition methods have become increasingly popular in fMRI data analysis, which are powerful in discriminating between multi-voxel patterns of brain activities associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we proposed a hierarchical framework of multivariate approach that maps informative clusters rather than voxels to achieve reliable functional brain mapping without compromising the discriminative power. In particular, we first searched for local homogeneous clusters that consisted of voxels with similar response profiles. Then, a multi-voxel classifier was built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers were served as a multi-cluster pattern to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrated that this hierarchical approach showed better performance in the robustness of functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapped for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate approach is suitable for both pattern classification and brain mapping in fMRI studies.
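
    The skeleton of the hierarchical framework (cluster voxels by response profile, fit one multi-voxel classifier per cluster, rank clusters) can be sketched with scikit-learn; the simulated data and the reduction of the paper's multivariate interaction ranking to a per-cluster cross-validation score are simplifying assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_trials, n_voxels = 120, 200
y = rng.integers(0, 2, n_trials)                   # two mental states
X = rng.standard_normal((n_trials, n_voxels))
X[:, :40] += y[:, None] * 0.8                      # 40 informative voxels

# Step 1: local homogeneous clusters = voxels with similar response profiles.
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X.T)

# Step 2: one multi-voxel classifier per cluster; score = CV accuracy.
scores = {c: cross_val_score(SVC(), X[:, labels == c], y, cv=5).mean()
          for c in range(10)}

# Step 3 (simplified): rank clusters; the informative ones surface on top.
for c, s in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
    print(f"cluster {c}: {np.sum(labels == c)} voxels, CV accuracy {s:.2f}")
```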

  16. Beyond usage: understanding the use of electronic journals on the basis of information activity analysis.

    Directory of Open Access Journals (Sweden)

    Annaïg Mahé

    2004-01-01

    In this article, which reports the second part of a two-part study of the use of electronic journals by researchers in two French research institutions, we attempt to explain the integration of electronic journal use into scientists' information habits, going beyond usage analysis. First, we describe how the development of electronic journal use follows a three-phase innovation process: research-development, first uses, and technical acculturation. Then, we attempt to find more significant explanatory factors, and emphasis is placed on the wider context of information activity. Three main information activity types are outlined: marginal, parallel, and integrated. Each of these types corresponds to a particular attitude towards scientific information and to different levels of electronic journal use.

  17. Confirmatory Factor Analysis of IT-based Competency Questionnaire in Information Science & Knowledge Studies, Based on Job Market Analysis

    Directory of Open Access Journals (Sweden)

    Rahim Shahbazi

    2016-03-01

    The main purpose of the present research is to evaluate the validity of an IT-based competency questionnaire in Information Science & Knowledge Studies. The survey method was used in the present research. The data collection tool was a researcher-made questionnaire. The statistical sample, 315 people, was chosen purposively from among Iranian faculty members, Ph.D. students, and information center employees. The findings showed that, after eliminating 17 items from the whole questionnaire and performing confirmatory factor analysis of the remaining items with Varimax rotation, 8 factors were revealed. The resulting components, and the items which had a high loading on these components, were considerably consistent with the classifications in the questionnaire and partly consistent with the findings of other researchers. 76 competency indicators (knowledge, skills, and attitudes) were validated and grouped under 8 main categories: 1. “Computer Basics”; 2. “Database Operating, Collection Development of Digital Resources, & Digital Library Management”; 3. “Basics of Computer Networking”; 4. “Basics of Programming & Database Designing”; 5. “Web Designing & Web Content Analysis”; 6. “Library Software & Computerized Organizing”; 7. “Archive of Digital Resources”; and 8. “Attitudes”.
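
    For readers who want to reproduce this style of analysis, a factor analysis with Varimax rotation can be run in a few lines of scikit-learn (the rotation option exists in version 0.24 and later); the simulated 315 x 76 response matrix below is an invented stand-in for the questionnaire data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)

# Simulated questionnaire: 315 respondents x 76 items driven by 8 latent factors.
loadings_true = rng.normal(0, 1, (76, 8))
scores_true = rng.normal(0, 1, (315, 8))
responses = scores_true @ loadings_true.T + rng.normal(0, 0.5, (315, 76))

# Factor analysis with Varimax rotation.
fa = FactorAnalysis(n_components=8, rotation="varimax", random_state=0)
fa.fit(responses)

# Items loading most strongly on the first rotated factor.
top_items = np.argsort(-np.abs(fa.components_[0]))[:5]
print("top items for factor 1:", top_items)
```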

  18. A comparative analysis of the explanatory power of accounting and patent information for the market values of German firms

    OpenAIRE

    Reitzig, Markus; Ramb, Fred

    2004-01-01

    We present a theoretical and empirical analysis of the fitness of national German (German Commercial Code – Handelsgesetzbuch (HGB)) and international (IAS and US-GAAP) accounting information, as well as European patent data to explain the market values of German manufacturing firms. For the chosen volatile period from 1997 to 2002, cautious national accounting information does not correlate with the firms’ residual market values (RMV). International accounting information make...

  19. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Science.gov (United States)

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance community as a tool for optimizing the biosurveillance enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
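
    The core of Multi-Attribute Utility Theory is a weighted aggregation of single-attribute utilities. A minimal sketch, with hypothetical criteria, weights, and scores (not the study's values), ranks candidate data streams by additive utility:

    ```python
    # Hedged sketch of a weighted-additive MAUT ranking of candidate data streams.
    # Criteria, weights, and scores are illustrative, not the study's values.
    criteria_weights = {"timeliness": 0.30, "coverage": 0.25,
                        "specificity": 0.25, "cost": 0.20}

    # Single-attribute utilities on [0, 1]; for cost, higher utility = cheaper.
    streams = {
        "ED_visits":     {"timeliness": 0.9, "coverage": 0.6, "specificity": 0.7, "cost": 0.5},
        "lab_reports":   {"timeliness": 0.4, "coverage": 0.7, "specificity": 0.9, "cost": 0.6},
        "news_scraping": {"timeliness": 0.8, "coverage": 0.9, "specificity": 0.3, "cost": 0.9},
    }

    def utility(stream_scores, weights):
        # Additive multi-attribute utility: U = sum_i w_i * u_i.
        return sum(weights[c] * stream_scores[c] for c in weights)

    for name, s in sorted(streams.items(), key=lambda kv: -utility(kv[1], criteria_weights)):
        print(f"{name}: U = {utility(s, criteria_weights):.2f}")
    ```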

  20. PathNet: a tool for pathway analysis using topological information

    Directory of Open Access Journals (Sweden)

    Dutta Bhaskar

    2012-09-01

    Full Text Available Background: Identification of canonical pathways through enrichment of differentially expressed genes in a given pathway is a widely used method for interpreting gene lists generated from high-throughput experimental studies. However, most algorithms treat pathways as sets of genes, disregarding any inter- and intra-pathway connectivity information, and do not provide insights beyond identifying lists of pathways. Results: We developed an algorithm (PathNet) that utilizes the connectivity information in canonical pathway descriptions to help identify study-relevant pathways and characterize non-obvious dependencies and connections among pathways using gene expression data. PathNet considers both the differential expression of genes and their pathway neighbors to strengthen the evidence that a pathway is implicated in the biological conditions characterizing the experiment. As an adjunct to this analysis, PathNet uses the connectivity of the differentially expressed genes among all pathways to score pathway contextual associations and statistically identify biological relations among pathways. In this study, we used PathNet to identify biologically relevant results in two Alzheimer’s disease microarray datasets, and compared its performance with existing methods. Importantly, PathNet identified de-regulation of the ubiquitin-mediated proteolysis pathway as an important component in Alzheimer’s disease progression, despite the absence of this pathway in the standard enrichment analyses. Conclusions: PathNet is a novel method for identifying enrichment and association between canonical pathways in the context of gene expression data. It takes into account topological information present in pathways to reveal biological information. PathNet is available as an R workspace image from http://www.bhsai.org/downloads/pathnet/.
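
    PathNet itself is distributed as an R workspace; the following Python sketch only illustrates the general idea of combining direct (differential-expression) and indirect (pathway-neighbor) evidence before a standard hypergeometric enrichment test. It is a toy simplification, not PathNet's actual algorithm.

    ```python
    # Toy neighbor-aware enrichment in the spirit of PathNet (not its algorithm).
    import numpy as np
    from scipy.stats import hypergeom

    rng = np.random.default_rng(1)
    n_genes = 500
    direct = rng.uniform(size=n_genes)                    # per-gene DE p-values
    adj = rng.uniform(size=(n_genes, n_genes)) < 0.01     # toy pathway connectivity
    adj = np.triu(adj, 1); adj = adj | adj.T

    # Indirect evidence: mean -log p of each gene's pathway neighbors.
    neglogp = -np.log(direct)
    deg = adj.sum(1).clip(min=1)
    indirect = (adj.astype(float) @ neglogp) / deg
    combined = neglogp + indirect                          # combined evidence score

    significant = combined > np.quantile(combined, 0.9)    # top 10% of genes
    pathway = rng.choice(n_genes, size=40, replace=False)  # toy pathway membership

    # Hypergeometric enrichment of the pathway in the significant gene set.
    k = significant[pathway].sum()                         # significant genes in pathway
    p_enrich = hypergeom.sf(k - 1, n_genes, significant.sum(), len(pathway))
    print(f"{k} of {len(pathway)} pathway genes significant, p = {p_enrich:.3g}")
    ```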

  1. Information Analysis on ERP Investment

    Institute of Scientific and Technical Information of China (English)

    徐扬; 程媛媛

    2015-01-01

    Investment bulletins are a valuable information resource open to the public, and how to extract the tacit information they contain is a key issue in information analysis. Enterprise Resource Planning (ERP) is widely adopted by enterprises because of its standardization, process-oriented management method, and high integration. Research on the value of ERP investments has attracted attention from academic scholars and enterprise managers, and supports enterprise decision making. This paper analyzes information gathered from ERP investment bulletins and attempts to measure ERP value through the stock market's reaction, so as to make explicit the tacit information that may influence ERP value. Building on prior studies, this paper gathers the ERP investment bulletins of American publicly traded enterprises from 1997-2013, extracts, codes, and computes nine variables and, drawing on theories of the organizational integration of information systems and option value, analyzes how bulletin information influences enterprise market value, thereby providing decision-making support.

  2. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    IT Corporation Las Vegas

    1999-11-19

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a 52-million dollar group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary.

  3. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a 52-million dollar group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary

  4. A Citation Analysis of Australian Information Systems Researchers: Towards a New ERA?

    Directory of Open Access Journals (Sweden)

    Roger Clarke

    2008-05-01

    Full Text Available Citation analysis is a potentially valuable means of assessing the contributions of researchers, in Information Systems (IS) as in other disciplines. In particular, a combination of raw counts and deeper analysis of citation data can deliver insights into the impact of a researcher's publications on other researchers. Despite this potential, the limited literature in the IS discipline has paid very little attention to the use of citation analysis for this purpose. Meanwhile, the federal department responsible for education funding has convinced successive federal governments to develop research quality measures that can be used as a basis for differential funding. The Howard Government's proposed Research Quality Framework (RQF) has been abandoned, but a number of aspects of it survive within the Rudd Government's Excellence in Research for Australia (ERA) initiative. The ERA also appears likely to involve a highly formalised process whereby 'research groupings' within individual universities will be evaluated, with (as yet unclear) impacts on the distribution of research funding. Funding agencies have an interest in score-keeping, whether or not their enthusiasm is shared by Australian researchers. It is therefore highly advisable that Australian disciplines, and especially less well-established and powerful disciplines like Information Systems, achieve a clear understanding of their performance as indicated by the available measurement techniques applied to the available data. This paper reports on citation analysis using data from both the longstanding Thomson/ISI collection and the more recently developed Google Scholar service. Few Australian IS researchers have achieved scores of any great significance in the Thomson/ISI collection, whereas the greater depth available in Google Scholar provides a more realistic picture. Quality assessment of the Thomson/ISI collection shows it to be seriously inappropriate for relatively new disciplines

  5. Information seeking for making evidence-informed decisions: a social network analysis on the staff of a public health department in Canada

    Science.gov (United States)

    2012-01-01

    Background: Social network analysis is an approach to study the interactions and exchange of resources among people. It can help in understanding the underlying structural and behavioral complexities that influence the process of capacity building towards evidence-informed decision making. A social network analysis was conducted to understand if and how the staff of a public health department in Ontario turn to peers for help in incorporating research evidence into practice. Methods: The staff were invited to respond to an online questionnaire inquiring about information-seeking behavior, identification of colleague expertise, and friendship status. Three networks were developed based on the 170 participants. Overall shape, key indices, the most central people and brokers, and their characteristics were identified. Results: The network analysis showed a low-density and localized information-seeking network. Inter-personal connections were mainly clustered by organizational divisions, and people tended to limit information-seeking connections to a handful of peers in their division. However, the recognition-of-expertise and friendship networks showed more cross-divisional connections. Members of the office of the Medical Officer of Health were located at the heart of the department, bridging across divisions. A small group of professional consultants and middle managers were the most central staff in the network, also connecting their divisions to the center of the information-seeking network. In each division, there were some locally central staff, mainly practitioners, who connected their neighboring peers; but they were not necessarily connected to other experts or managers. Conclusions: The methods of social network analysis were useful in providing a systems approach to understand how knowledge might flow in an organization. The findings of this study can be used to identify early adopters of knowledge translation interventions, forming Communities of Practice, and
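
    The indices mentioned in the abstract (density, central actors, brokers) are standard graph measures. A minimal sketch on a toy graph, using networkx, with hypothetical node names:

    ```python
    # Toy information-seeking network: density, central staff, and brokers.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("a1", "a2"), ("a2", "a3"), ("a1", "a3"),    # division A
                      ("b1", "b2"), ("b2", "b3"),                  # division B
                      ("a2", "consultant"), ("b2", "consultant")]) # cross-division tie

    print("density:", round(nx.density(G), 2))
    deg = nx.degree_centrality(G)
    btw = nx.betweenness_centrality(G)   # brokers bridge otherwise distant groups
    print("most central (degree):", max(deg, key=deg.get))
    print("top broker (betweenness):", max(btw, key=btw.get))
    ```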

  6. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain.

    Science.gov (United States)

    Frolov, Alexander A; Húsek, Dušan; Polyakov, Pavel Yu

    2016-03-01

    A usual task in large data set analysis is searching for an appropriate data representation in a space of fewer dimensions. One of the most efficient methods to solve this task is factor analysis. In this paper, we compare seven methods for Boolean factor analysis (BFA) in solving the so-called bars problem (BP), which is a BFA benchmark. The performance of the methods is evaluated by means of information gain. Study of the results obtained in solving BPs of different levels of complexity has allowed us to reveal the strengths and weaknesses of these methods. It is shown that the Likelihood maximization Attractor Neural Network with Increasing Activity (LANNIA) is the most efficient BFA method for solving the BP in many cases. The efficacy of the LANNIA method is also shown when applied to real data from the Kyoto Encyclopedia of Genes and Genomes database, which contains full genome sequencing for 1368 organisms, and to the text data set R52 (from Reuters 21578), typically used for label categorization.
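
    The bars problem mentioned above is a standard benchmark: each image is the Boolean OR of a random subset of horizontal and vertical bars, and a BFA method should recover the bars as latent factors. A generator sketch, with illustrative parameters:

    ```python
    # Bars-problem benchmark generator: images are Boolean superpositions of bars.
    import numpy as np

    def bars_data(n_samples, size=8, p_bar=0.15, seed=0):
        rng = np.random.default_rng(seed)
        X = np.zeros((n_samples, size, size), dtype=bool)
        for i in range(n_samples):
            rows = rng.uniform(size=size) < p_bar   # which horizontal bars are on
            cols = rng.uniform(size=size) < p_bar   # which vertical bars are on
            X[i, rows, :] = True                    # Boolean OR of the bars
            X[i, :, cols] = True
        return X.reshape(n_samples, -1)

    X = bars_data(1000)
    # A BFA method succeeds if it recovers the 16 bar factors from X; information
    # gain then measures how well the learned factors describe the data.
    print(X.shape, X.mean())
    ```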

  7. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    Science.gov (United States)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  8. Web-based Weather and Climate Information Service of Forensic Disaster Analysis

    Science.gov (United States)

    Mühr, Bernhard; Kunz, Michael; Köbele, Daniel

    2014-05-01

    , Europe, and the other continents. In 2007, 'Wettergefahren-Frühwarnung' became part of CEDIM and contributed to the activity of near-real time Forensic Disaster Analysis ahead, during and after a major event. Information is provided as text, own weather charts or data.

  9. SMALL Savannah: an information system for the integrated analysis of land use change in the Far North of Cameroon

    NARCIS (Netherlands)

    Fotsing, Eric

    2009-01-01

    SMALL Savannah is an Environmental Information System designed for the integrated analysis and sustainable land management in the savannas region of the Far North of Cameroon. This system combines an observation and spatial analysis module for the representation of phenomena from various geographic

  10. Library and Information Science Research Areas: A Content Analysis of Articles from the Top 10 Journals 2007-8

    Science.gov (United States)

    Aharony, Noa

    2012-01-01

    The current study seeks to describe and analyze journal research publications in the top 10 Library and Information Science journals from 2007-8. The paper presents a statistical descriptive analysis of authorship patterns (geographical distribution and affiliation) and keywords. Furthermore, it displays a thorough content analysis of keywords and…

  11. Information content of the low-energy electric dipole strength: correlation analysis

    CERN Document Server

    Reinhard, P -G

    2012-01-01

    We study the information content carried by the electric dipole strength with respect to isovector and isoscalar indicators characterizing bulk nuclear matter and finite nuclei. To separate isoscalar and isovector modes, and low-energy strength and giant resonances, we analyze the E1 strength as a function of excitation energy $E$ and momentum transfer $q$. We use the self-consistent nuclear density functional theory with Skyrme energy density functionals, augmented by the random-phase approximation, to compute the E1 strength, and covariance analysis to assess correlations between observables. Calculations are performed for the spherical, doubly-magic nuclei $^{208}$Pb and $^{132}$Sn. We demonstrate that E1 transition densities in the low-energy region below the giant dipole resonance exhibit appreciable state dependence and multi-nodal structures, which are fingerprints of weak collectivity. The correlation between the accumulated low-energy strength and symmetry energy is weak, and dramatically depends on the energy cutoff ...

  12. Determination of the signal-to-noise ratio on the basis of information-entropic analysis

    CERN Document Server

    Zhanabaev, Z Zh; Kozhagulov, E T; Karibayev, B A

    2016-01-01

    In this paper we suggest a new algorithm for determination of the signal-to-noise ratio (SNR). SNR is a quantitative measure widely used in science and engineering. Generally, methods for determining SNR are based on an experimentally defined noise power level, or on some conditional noise criterion specified for signal processing. In the present work we describe a method for determining the SNR of chaotic and stochastic signals when the power levels of signal and noise are unknown. For this purpose we use information defined as the difference between unconditional and conditional entropy. Our theoretical results are confirmed by analysis of signals that can be described by nonlinear maps and represented as a superposition of harmonic and stochastic signals.
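
    The paper's algorithm is not reproduced here, but the underlying idea, treating information as an entropy difference and relating it to SNR, can be illustrated on a toy Gaussian channel, where I = 0.5 log2(1 + SNR) can be inverted to recover the SNR from a histogram-based information estimate:

    ```python
    # Toy illustration (not the paper's algorithm): estimate mutual information
    # I(X;Y) = H(X) + H(Y) - H(X,Y) with histograms, then invert the Gaussian
    # channel relation I = 0.5*log2(1 + SNR) to recover SNR without a noise meter.
    import numpy as np

    rng = np.random.default_rng(0)
    n, true_snr = 200_000, 4.0
    x = rng.normal(size=n)                                    # signal
    y = x + rng.normal(scale=np.sqrt(1 / true_snr), size=n)   # signal + noise

    def entropy(counts):
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    bins = 64
    h_x = entropy(np.histogram(x, bins)[0])
    h_y = entropy(np.histogram(y, bins)[0])
    h_xy = entropy(np.histogram2d(x, y, bins)[0].ravel())
    mi = h_x + h_y - h_xy                                     # I(X;Y) in bits
    snr_est = 2 ** (2 * mi) - 1
    print(f"I = {mi:.2f} bits, estimated SNR = {snr_est:.2f} (true {true_snr})")
    ```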

  13. Geographic information analysis: An ecological approach for the management of wildlife on the forest landscape

    Science.gov (United States)

    Ripple, William J.

    1995-01-01

    This document is a summary of the project funded by NAGW-1460 as part of the Earth Observation Commercialization/Applications Program (EOCAP) directed by NASA's Earth Science and Applications Division. The goal was to work with several agencies to focus on forest structure and landscape characterizations for wildlife habitat applications. New analysis techniques were used in remote sensing and landscape ecology with geographic information systems (GIS). The development of GIS and the emergence of the discipline of landscape ecology provided us with an opportunity to study forest and wildlife habitat resources from a new perspective. New techniques were developed to measure forest structure across scales from the canopy to the regional level. This paper describes the project team, technical advances, and technology adoption process that was used. Reprints of related refereed journal articles are in the Appendix.

  14. A damage analysis for brittle materials using stochastic micro-structural information

    Science.gov (United States)

    Lin, Shih-Po; Chen, Jiun-Shyan; Liang, Shixue

    2016-03-01

    In this work, a micro-crack informed stochastic damage analysis is performed to consider the failure of materials with stochastic microstructures. The derivation of the damage evolution law is based on the Helmholtz free energy equivalence between the cracked microstructure and the homogenized continuum. The damage model is constructed under the stochastic representative volume element (SRVE) framework. The characteristics of the SRVE used in the construction of the stochastic damage model have been investigated based on the principle of minimum potential energy. The mesh dependency issue has been addressed by introducing a scaling law into the damage evolution equation. The proposed methods are then validated through comparison between numerical simulations and experimental observations of a high-strength concrete. It is observed that the standard deviation of porosity in the microstructures has a stronger effect on the damage states and the peak stresses than on the macro-scale Young's and shear moduli.

  15. Geographical information system (GIS) suitability analysis of radioactive waste repository site in Pahang, Malaysia

    International Nuclear Information System (INIS)

    The aim of this project is to identify a suitable site for a radioactive waste repository in Pahang using remote sensing and geographical information system (GIS) technologies. Ten parameters were considered in the analysis, divided into selection criteria and exclusion criteria. The selection criteria consist of land use, rainfall, lineament, slope, groundwater potential and elevation, while the exclusion criteria consist of urban areas, protected land and islands. All parameters were integrated, weighted and ranked for site selection evaluation in a GIS environment. Initially, twelve sites were identified as suitable for a radioactive waste repository throughout the study area. These sites were further analysed by ground checking of the physical setting, including geology, drainage and population density, in order to finalise the three most suitable sites for the radioactive waste repository. (author)
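
    The integrate-weight-rank step described above corresponds to a standard weighted-overlay suitability analysis. A minimal raster sketch; the layer names and weights are illustrative, not the study's values:

    ```python
    # Weighted overlay: score criterion rasters, weight, sum, then mask exclusions.
    import numpy as np

    rng = np.random.default_rng(0)
    shape = (100, 100)                       # toy raster grid

    # Selection criteria, each already ranked on a 1..5 suitability scale.
    layers = {name: rng.integers(1, 6, shape)
              for name in ["land_use", "rainfall", "lineament",
                           "slope", "groundwater", "elevation"]}
    weights = {"land_use": 0.25, "rainfall": 0.20, "lineament": 0.15,
               "slope": 0.15, "groundwater": 0.15, "elevation": 0.10}

    suitability = sum(weights[k] * layers[k].astype(float) for k in layers)

    # Exclusion criteria (urban, protected land, islands) zero out forbidden cells.
    excluded = rng.uniform(size=shape) < 0.2
    suitability[excluded] = 0.0

    top = np.argwhere(suitability >= np.quantile(suitability, 0.99))
    print("candidate cells:", len(top))
    ```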

  16. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given the parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by the Ando and Tsay criterion, where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above-mentioned criteria are global summary measures of model performance, but a more detailed analysis may be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed a Bayesian analysis of Italian data sets using four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). We then illustrate their performance as evaluated by the Bayes factor, predictive information criteria and retrospective predictive analysis.

  17. Information Technology Project Portfolio and Strategy Alignment Assessment Based on Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Marisa Analía Sánchez

    2012-11-01

    Full Text Available Recent research has shown that companies face considerable difficulties in assessing the strategic value contribution of Information Technology (IT) investments. One of the major obstacles to achieving strategy alignment is that organizations find it extremely difficult to link and quantify the benefits of IT investments with strategic goals. The aim of this paper is to define an approach to assess portfolio-strategy alignment. To this end, a formal specification of the Kaplan and Norton Strategy Map is developed utilizing the Unified Modeling Language (UML). The approach uses the Strategy Map as a framework for defining the portfolio value contribution, and Data Envelopment Analysis (DEA) is used as the methodology for measuring the efficiency of project portfolios. DOI: 10.5585/gep.v3i2.66
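
    As an illustration of the DEA step, the following sketch solves an input-oriented CCR envelopment model per decision-making unit (project portfolio) with scipy; the input/output data are hypothetical, not the paper's:

    ```python
    # Input-oriented CCR DEA: min theta s.t. lambda'X <= theta*x_o, lambda'Y >= y_o.
    import numpy as np
    from scipy.optimize import linprog

    # Rows = DMUs (portfolios); inputs (e.g. budget, staff) and outputs
    # (e.g. strategy-map value contributions). Toy data.
    X = np.array([[20., 5.], [30., 8.], [25., 5.], [40., 9.]])   # inputs
    Y = np.array([[10., 3.], [12., 4.], [14., 3.], [13., 5.]])   # outputs
    n = X.shape[0]

    def ccr_efficiency(o):
        # Variables: [theta, lambda_1..lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j * x_ij - theta * x_io <= 0 for each input i
        A_in = np.c_[-X[o], X.T]
        b_in = np.zeros(X.shape[1])
        # -sum_j lambda_j * y_rj <= -y_ro for each output r
        A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]
        b_out = -Y[o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    for o in range(n):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
    ```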

  18. Water flows, energy demand, and market analysis of the informal water sector in Kisumu, Kenya.

    Science.gov (United States)

    Sima, Laura C; Kelner-Levine, Evan; Eckelman, Matthew J; McCarty, Kathleen M; Elimelech, Menachem

    2013-03-01

    In rapidly growing urban areas of developing countries, infrastructure has not been able to cope with population growth. Informal water businesses fulfill unmet water supply needs, yet little is understood about this sector. This paper presents data gathered from quantitative interviews with informal water business operators (n=260) in Kisumu, Kenya, collected during the dry season. Sales volume, location, resource use, and cost were analyzed by using material flow accounting and spatial analysis tools. Estimates show that over 76% of the city's water is consumed by less than 10% of the population who have water piped into their dwellings. The remainder of the population relies on a combination of water sources, including water purchased directly from kiosks (1.5 million m³ per day) and delivered by hand-drawn water-carts (0.75 million m³ per day). Energy audits were performed to compare energy use among various water sources in the city. Water delivery by truck has the highest per-cubic-meter energy demand (35 MJ/m³), while the city's tap water has the highest energy use overall (21,000 MJ/day). We group kiosks by neighborhood and compare sales volume and cost with neighborhood-level population data. Contrary to popular belief, we do not find evidence of price gouging; the lowest prices are charged in the highest-demand low-income area. We also see that the informal sector is sensitive to demand, as the number of private boreholes that serve as community water collection points is much larger where demand is greatest.

  19. Information Crisis

    CERN Document Server

    Losavio, Michael

    2012-01-01

    Information Crisis discusses the scope and types of information available online and teaches readers how to critically assess it and analyze potentially dangerous information, especially when teachers, editors, or other information gatekeepers are not available to assess the information for them. Chapters and topics include: the Internet as an information tool; critical analysis; legal issues, traps, and tricks; protecting personal safety and identity; and types of online information.

  20. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method to two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  1. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting-edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai, and, as such, is of interest to both researchers and financial analysts.
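
    A sketch of the information-theoretic network construction: pairwise mutual information between return series is estimated with histograms and thresholded into an adjacency structure. The paper uses partial mutual information, which conditions out indirect dependencies; plain MI is used here for brevity, and the data are synthetic.

    ```python
    # MI-based market network on synthetic returns (plain MI, not partial MI).
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    n_days, n_stocks = 2000, 6
    common = rng.normal(size=n_days)                        # shared market factor
    returns = 0.5 * common[:, None] + rng.normal(size=(n_days, n_stocks))

    def mi(a, b, bins=16):
        pxy = np.histogram2d(a, b, bins)[0]
        pxy /= pxy.sum()
        px, py = pxy.sum(1), pxy.sum(0)
        nz = pxy > 0
        return (pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum()

    # Dependency matrix; threshold (or take an MST) to obtain the filtered network.
    M = np.zeros((n_stocks, n_stocks))
    for i, j in itertools.combinations(range(n_stocks), 2):
        M[i, j] = M[j, i] = mi(returns[:, i], returns[:, j])
    edges = np.argwhere(M > np.quantile(M[M > 0], 0.75))
    print("edges kept:", [(i, j) for i, j in edges if i < j])
    ```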

  2. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    International Nuclear Information System (INIS)

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method to two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities

  3. Writing wrongs? An analysis of published discourses about the use of patient information leaflets.

    Science.gov (United States)

    Dixon-Woods, M

    2001-05-01

    Much has been written about how to communicate with patients, but there has been little critical scrutiny of this literature. This paper presents an analysis of publications about the use of patient information leaflets. It suggests that two discourses can be distinguished in this literature. The first of these is the larger of the two. It reflects traditional biomedical concerns and it invokes a mechanistic model of communication in which patients are characterised as passive and open to manipulation in the interests of a biomedical agenda. The persistence of the biomedical model in this discourse is contrasted with the second discourse, which is smaller and more recent in origin. This second discourse draws on a political agenda of patient empowerment, and reflects this in its choice of outcomes of interest, its concern with the use of leaflets as a means of democratisation, and its orientation towards patients. It is suggested that the two discourses, though distinct, are not entirely discrete, and may begin to draw closer as they begin to draw on a wider set of resources, including sociological research and theory, to develop a rigorous theoretically grounded approach to patient information leaflets. PMID:11286365

  4. Building Information Modelling and Standardised Construction Contracts: a Content Analysis of the GC21 Contract

    Directory of Open Access Journals (Sweden)

    Aaron Manderson

    2015-08-01

    Full Text Available Building Information Modelling (BIM) is seen as a panacea to many of the ills confronting the Architectural, Engineering and Construction (AEC) sector. In spite of its well-documented benefits, the widespread integration of BIM into the project lifecycle is yet to occur. One commonly identified barrier to BIM adoption is the perceived legal risks associated with its integration, coupled with the need for implementation in a collaborative environment. Many existing standardised contracts used in the Australian AEC industry were drafted before the emergence of BIM. As BIM continues to become ingrained in the delivery process, the shortcomings of these existing contracts have become apparent. This paper reports on a study that reviewed and consolidated the contractual and legal concerns associated with BIM implementation. The findings of the review were used to conduct a qualitative content analysis of the GC21 2nd edition, an Australian standardised construction contract, to identify possible changes to facilitate the implementation of BIM in a collaborative environment. The findings identified a number of changes, including the need to adopt a collaborative contract structure with equitable risk and reward mechanisms, recognition of the model as a contract document, and the need for standardisation of communication/information exchange.

  5. Environmental factor analysis of cholera in China using remote sensing and geographical information systems.

    Science.gov (United States)

    Xu, M; Cao, C X; Wang, D C; Kan, B; Xu, Y F; Ni, X L; Zhu, Z C

    2016-04-01

    Cholera is one of a number of infectious diseases that appears to be influenced by climate, geography and other natural environments. This study analysed the environmental factors of the spatial distribution of cholera in China. It shows that temperature, precipitation, elevation, and distance to the coastline have significant impact on the distribution of cholera. It also reveals the oceanic environmental factors associated with cholera in Zhejiang, which is a coastal province of China, using both remote sensing (RS) and geographical information systems (GIS). The analysis has validated the correlation between indirect satellite measurements of sea surface temperature (SST), sea surface height (SSH) and ocean chlorophyll concentration (OCC) and the local number of cholera cases based on 8-year monthly data from 2001 to 2008. The results show the number of cholera cases has been strongly affected by the variables of SST, SSH and OCC. Utilizing this information, a cholera prediction model has been established based on the oceanic and climatic environmental factors. The model indicates that RS and GIS have great potential for designing an early warning system for cholera.
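
    The association analysis described, correlating monthly satellite-derived variables with case counts, can be sketched as follows; synthetic series stand in for the 2001-2008 monthly data, and SST is used as the example variable:

    ```python
    # Correlate a monthly remote-sensing variable (SST) with cholera case counts.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    months = 96                                   # 8 years of monthly data
    t = np.arange(months)
    sst = 20 + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, months)
    cases = np.maximum(0, 2 * sst - 30 + rng.normal(0, 5, months)).round()

    r, p = pearsonr(sst, cases)
    print(f"SST vs cases: r = {r:.2f}, p = {p:.1e}")
    # A lagged association (SST leading cases by one month) is often checked too:
    r_lag, p_lag = pearsonr(sst[:-1], cases[1:])
    print(f"SST (t-1) vs cases (t): r = {r_lag:.2f}, p = {p_lag:.1e}")
    ```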

  7. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas.

  8. A Bibliometric Analysis on the Literature of Information Organization in Taiwan

    Directory of Open Access Journals (Sweden)

    Chiao-Min Lin

    2009-12-01

    Full Text Available The purpose of this study is to explore the characteristics of the literature on information organization in Taiwan. A total of 610 articles from 1882 to 2008 and 113 theses and dissertations from 1971 to 2008 were identified and analyzed. The growth of the literature, research subjects, author productivity, and the distribution of journals and organizations are addressed. The results reveal that journal article output in Taiwan grew until 2003 and declined thereafter, while theses and dissertations grew steadily. The major research subjects of journal articles are classification theory and descriptive cataloging, whereas theses and dissertations focus on knowledge organization. The zone analysis from Bradford's law is not applicable to journal productivity. Most articles are single-authored, with a trend toward co-authorship. The Journal of Educational Media and Library Sciences is the major journal publishing information organization articles. National Taiwan University is the most productive school for theses and dissertations, though different schools have their own characteristic research subjects. [Article content in Chinese]

  9. Weibull Information Fusion Analysis of Semiconductor Quality: a Key Technology for Manufacturing Execution System Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    Semiconductor material and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system (MES) reliability growth and thereby improve product quality. Referring to the assumptions of the classical Duane model and the TGP tracking growth forecast model, a Weibull distribution model was established from the failure data. Combining the median rank method with the average rank method, Weibull information fusion reliability growth curves were fitted through linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of MTBF point estimation; analysis of the failure data shows that the test and evaluation modeling processes of the example are basically identical. The median rank is used in statistics to determine the distribution function of a random variable, and it is a good way to address problems such as the limited sample sizes of complex systems. The method therefore has great engineering application value.
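
    The median-rank/least-squares fitting step the abstract describes is standard: Bernard's approximation assigns a median rank to each ordered failure time, and the linearized Weibull CDF ln(-ln(1-F)) = β ln t - β ln η is fitted by least squares. A sketch with toy failure data:

    ```python
    # Weibull fit via median ranks (Bernard's approximation) and linear regression.
    from math import gamma
    import numpy as np

    failures = np.sort(np.array([45., 80., 120., 190., 260., 340., 510.]))  # hours (toy)
    n = len(failures)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)                 # Bernard's median-rank approximation

    x = np.log(failures)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)     # slope = shape parameter beta
    eta = np.exp(-intercept / beta)           # scale parameter

    print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")
    # beta > 1 indicates wear-out failures; MTBF = eta * Gamma(1 + 1/beta).
    print(f"MTBF = {eta * gamma(1 + 1 / beta):.0f} h")
    ```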

  10. Application of information theory for the analysis of cogeneration-system performance

    International Nuclear Information System (INIS)

    Successful cogeneration system performance depends critically upon the correct estimation of load variation and the accuracy of demand prediction. We need not only aggregated annual heat and electricity demands, but also hourly and monthly patterns, in order to evaluate a cogeneration system's performance by computer simulation. These data are usually obtained from actual measurements of energy demand in existing buildings. However, it is extremely expensive to collect actual energy demand data and store it over a long period for many buildings, and we face the question of whether it is really necessary to survey hourly demands. This paper provides a sensitivity analysis of the influence of demand-prediction error upon the efficiency of a cogeneration system, so as to evaluate the relative importance of various demand components. These components are annual energy demand, annual heat-to-electricity ratio, daily load factor, and so forth. Our approach employs the concept of information theory to construct a mathematical model. This analysis provides an indication of the relative importance of demand indices, and identifies what may become a good measure for assessing the efficiency of a cogeneration system for planning purposes. (Author)

  11. Stakeholder Analysis as a Medium to Aid Change in Information System Reengineering Projects

    Directory of Open Access Journals (Sweden)

    Jean Davison

    2004-04-01

    Full Text Available The importance of involving stakeholders within a change process is well recognised, and successfully managed change is equally important. Information systems development and redesign is a form of change activity involving people and social issues, and therefore resistance to change may occur. A stakeholder identification and analysis (SIA) technique has been developed as an enhancement to PISO® (Process Improvement for Strategic Objectives), a method that engages the users of a system in the problem solving and reengineering of their own work-based problem areas. The SIA technique aids the identification and analysis of system stakeholders, and helps view the projected outcome of system changes and their effect on relevant stakeholders, with attention being given to change resistance to ensure smooth negotiation and achieve consensus. A case study is presented here describing the successful implementation of a direct appointment booking system for patients within the National Health Service in the UK, utilising the SIA technique, which resulted in a feeling of empowerment and ownership of the change among those involved.

  12. A Novel Segmentation, Mutual Information Network Framework for EEG Analysis of Motor Tasks

    Directory of Open Access Journals (Sweden)

    Lee Pamela

    2009-05-01

    Full Text Available Background: Monitoring the functional connectivity between brain regions is becoming increasingly important in elucidating brain functionality in normal and disease states. Current methods of detecting networks in the recorded electroencephalogram (EEG), such as correlation and coherence, are limited by the fact that they assume stationarity of the relationship between channels, and rely on linear dependencies. In contrast to diseases of the brain cortex (e.g. Alzheimer's disease), with motor disorders such as Parkinson's disease (PD) the EEG abnormalities are most apparent during performance of dynamic motor tasks, but this makes the stationarity assumption untenable. Methods: We therefore propose a novel EEG segmentation method based on the temporal dynamics of the cross-spectrogram of the computed Independent Components (ICs). We then utilize mutual information (MI) as the metric for determining also nonlinear statistical dependencies between EEG channels. Graph theoretical analysis is then applied to the derived MI networks. The method was applied to EEG data recorded from six normal subjects and seven PD subjects off medication. One-way analysis of variance (ANOVA) tests demonstrated statistically significant differences in the connectivity patterns between groups. Results: The results suggested that PD subjects are unable to independently recruit different areas of the brain while performing simultaneous tasks compared to individual tasks, but instead attempt to recruit disparate clusters of synchronous activity to maintain behavioral performance. Conclusion: The proposed segmentation/MI network method appears to be a promising approach for analyzing the EEG recorded during dynamic behaviors.

  13. Information Loss Determination on Digital Image Compression and Reconstruction Using Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Zhengmao Ye

    2011-12-01

    Full Text Available To effectively utilize storage capacity, digital image compression has been applied to numerous science and engineering problems. There are two fundamental classes of image compression techniques: lossless and lossy. The former employs probabilistic models for lossless storage on the basis of the statistical redundancy occurring in digital images; however, it is limited in the achievable compression ratio and bits per pixel. Hence, the latter has also been widely implemented to further improve storage capacity, covering various fundamental digital image processing approaches. It has been well documented that most lossy compression schemes provide perfect visual perception at exceptional compression ratios, among which the discrete wavelet transform, the discrete Fourier transform, and some statistical optimization compression schemes (e.g., principal component analysis and independent component analysis) are the dominant approaches. It is necessary to evaluate these compression and reconstruction schemes objectively, in addition to their visual appeal. Using a well-defined set of quantitative metrics from information theory, a comparative study of several typical digital image compression and reconstruction schemes is conducted in this research.
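
    A minimal example of the kind of quantitative evaluation the study performs: entropy before and after lossy compression, plus PSNR. The crude quantization below merely stands in for a real codec, and the image is synthetic:

    ```python
    # Information-theoretic evaluation of a lossy reconstruction: entropy and PSNR.
    import numpy as np

    rng = np.random.default_rng(0)
    original = rng.integers(0, 256, (64, 64)).astype(np.uint8)
    reconstructed = (original // 16 * 16).astype(np.uint8)   # quantization as "codec"

    def entropy(img):
        counts = np.bincount(img.ravel(), minlength=256)
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    psnr = 10 * np.log10(255 ** 2 / mse)
    print(f"entropy: {entropy(original):.2f} -> {entropy(reconstructed):.2f} bits/pixel")
    print(f"PSNR: {psnr:.1f} dB")
    ```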

  14. Informational database methodology for urban risk analysis. Case study: the historic centre of Bucharest

    Science.gov (United States)

    Armas, I.; Dumitrascu, S.

    2009-04-01

    , but is also a very populated area, factors that favour a high susceptibility level. In addition, the majority of the buildings fall into the first and second categories of seismic risk, having been built between 1875 and 1940, their age establishing an increased vulnerability to natural hazards. The methodology was developed through the contribution of three partner universities from Bucharest: the University of Bucharest, the Academy for Economic Studies and the Technical University of Constructions. The suggested method was based on the analysis and processing of digital and statistical spatial information derived from 1:500 topographical plans, satellite pictures, archives and historical maps used for identifying the age of the buildings. An important stage was also represented by the field investigations, which yielded the data used in the assessment of the buildings: year of construction, location and vicinity, height, number of floors, state and function of the building, equipment and construction type. The information collected from the field, together with the data resulting from the digitization of the orthophotoplans, was inserted into ArcGIS in order to compile the database. Furthermore, the team from the Cybernetics Faculty developed a special software package in Visual Studio and SQL Server in order to insert the sheets into the GIS so that they could be statistically processed. The final product of the study is a program whose main functions include editing, analysis based on selected factors (individual or group), and viewing of building information in the form of maps or 3D visualizations. The strengths of the resulting information system are its extended range of applicability, short processing time, accessibility, and capacity to support a large amount of information, making it an adequate instrument for the needs of a susceptible population.

  15. A parametric Bayesian combination of local and regional information in flood frequency analysis

    Science.gov (United States)

    Seidou, O.; Ouarda, T. B. M. J.; Barbet, M.; Bruneau, P.; Bobée, B.

    2006-11-01

    Because of their impact on hydraulic structure design as well as on floodplain management, flood quantiles must be estimated with the highest precision given available information. If the site of interest has been monitored for a sufficiently long period (more than 30-40 years), at-site frequency analysis can be used to estimate flood quantiles with fair precision. Otherwise, regional estimation may be used to mitigate the lack of data, but local information is then ignored. A commonly used approach to combine at-site and regional information is linear empirical Bayes estimation: under the assumption that both local and regional flood quantile estimators have a normal distribution, the empirical Bayesian estimator of the true quantile is the weighted average of both estimations. The weighting factor for each estimator is inversely proportional to its variance. We propose in this paper an alternative Bayesian method for combining local and regional information which provides the full probability density of quantiles and parameters. The application of the method is made with the generalized extreme value (GEV) distribution, but it can be extended to other types of extreme value distributions. In this method the prior distributions are obtained using a regional log-linear regression model, and then local observations are used within a Markov chain Monte Carlo algorithm to infer the posterior distributions of parameters and quantiles. Unlike the empirical Bayesian approach, the proposed method works even with a single local observation. It also relaxes the hypothesis of normality of the local quantiles' probability distribution. The performance of the proposed methodology is compared to that of local, regional, and empirical Bayes estimators on three generated regional data sets with different statistical characteristics. The results show that (1) when the regional log-linear model is unbiased, the proposed method gives better estimations of the GEV quantiles and
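
    A minimal sketch of the proposed combination: regional information enters as informative priors on the GEV parameters, a Metropolis sampler updates them with a short local record, and the posterior yields the full density of a design quantile. The priors, proposal scales, and data below are illustrative; in the paper, the prior means and variances would come from the regional log-linear regression.

    ```python
    # Bayesian GEV fit with regional priors and a Metropolis sampler (toy example).
    import numpy as np
    from scipy.stats import genextreme, norm

    rng = np.random.default_rng(0)
    local = genextreme.rvs(c=-0.1, loc=100, scale=30, size=12, random_state=1)  # short record

    def log_post(theta):
        mu, log_sigma, xi = theta
        lp = (norm.logpdf(mu, 95, 20) + norm.logpdf(log_sigma, np.log(25), 0.5)
              + norm.logpdf(xi, 0.0, 0.2))          # regional priors (assumed values)
        ll = genextreme.logpdf(local, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()
        return lp + ll

    theta = np.array([95.0, np.log(25.0), 0.0])
    lp = log_post(theta)
    samples = []
    for _ in range(20_000):
        prop = theta + rng.normal(scale=[3.0, 0.1, 0.05])
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta.copy())

    post = np.array(samples[5000:])                  # discard burn-in
    # Full posterior of the 100-year quantile, not just a point estimate.
    q100 = genextreme.ppf(0.99, c=-post[:, 2], loc=post[:, 0], scale=np.exp(post[:, 1]))
    print(f"100-yr flood: median {np.median(q100):.0f}, 90% CI "
          f"({np.percentile(q100, 5):.0f}, {np.percentile(q100, 95):.0f})")
    ```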

  16. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    Science.gov (United States)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to enhance water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder-driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines, as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks. Through the case study, we will demonstrate how a stakeholder-driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision-making process, specifically

  17. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    Science.gov (United States)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and
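    MongoDB was one of the four systems compared; as a concrete illustration of the footprint storage and sub-second geospatial discovery being benchmarked, the sketch below stores a swath footprint as a GeoJSON polygon and runs an intersection query against a 2dsphere index. Collection names, fields and coordinates are hypothetical.

```python
# Sketch: storing Level 2 swath footprints as GeoJSON polygons in MongoDB
# and running a geospatial intersection query. Collection and field names
# are hypothetical; MongoDB was one of the four systems the TCIS team compared.
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")
footprints = client["tcis"]["swath_footprints"]

# A 2dsphere index enables spherical-geometry queries on the footprint field.
footprints.create_index([("footprint", GEOSPHERE)])

footprints.insert_one({
    "granule": "example_granule_001",
    "footprint": {  # simplified rectangular swath footprint (lon, lat)
        "type": "Polygon",
        "coordinates": [[[-60.0, 10.0], [-55.0, 10.0],
                         [-55.0, 15.0], [-60.0, 15.0], [-60.0, 10.0]]],
    },
})

# Find all granules whose footprint intersects a storm-centered search box.
search_box = {
    "type": "Polygon",
    "coordinates": [[[-58.0, 11.0], [-56.0, 11.0],
                     [-56.0, 13.0], [-58.0, 13.0], [-58.0, 11.0]]],
}
for doc in footprints.find({"footprint": {"$geoIntersects":
                                          {"$geometry": search_box}}}):
    print(doc["granule"])
```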

  18. On the use of information theory for the analysis of the relationship between neural and imaging signals.

    Science.gov (United States)

    Panzeri, Stefano; Magri, Cesare; Logothetis, Nikos K

    2008-09-01

    Functional magnetic resonance imaging (fMRI) is a widely used method for studying the neural basis of cognition and of sensory function. A potential problem in the interpretation of fMRI data is that fMRI measures neural activity only indirectly, as a local change of deoxyhemoglobin concentration due to the metabolic demands of neural function. To build correct sensory and cognitive maps in the human brain, it is thus crucial to understand whether fMRI and neural activity convey the same type of information about external correlates. While a substantial experimental effort has been devoted to the simultaneous recordings of hemodynamic and neural signals, so far, the development of analysis methods that elucidate how neural and hemodynamic signals represent sensory information has received less attention. In this article, we critically review why the analytical framework of information theory, the mathematical theory of communication, is ideally suited to this purpose. We review the principles of information theory and explain how they could be applied to the analysis of fMRI and neural signals. We show that a critical advantage of information theory over more traditional analysis paradigms commonly used in the fMRI literature is that it can elucidate, within a single framework, whether an empirically observed correlation between neural and fMRI signals reflects either a similar stimulus tuning or a common source of variability unrelated to the external stimuli. In addition, information theory determines the extent to which these shared sources of stimulus signal and of variability lead fMRI and neural signals to convey similar information about external correlates. We then illustrate the formalism by applying it to the analysis of the information carried by different bands of the local field potential. We conclude by discussing the current methodological challenges that need to be addressed to make the information-theoretic approach more robustly applicable to the
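    As an illustration of the quantities discussed above, the sketch below computes a plug-in (histogram) estimate of the mutual information between a discrete stimulus and a binned analog response. It is not the authors' method, and the naive plug-in estimator it uses is biased upward for limited samples, which is exactly the limited-sampling problem such analyses must address.

```python
# Sketch: plug-in (histogram) estimate of mutual information I(S;R) between
# a discrete stimulus label S and a binned response R. This illustrates the
# information-theoretic quantities discussed above; it is not the authors'
# code, and the naive plug-in estimator is upward-biased for small samples.
import numpy as np

def mutual_information(stimuli, responses, n_bins=8):
    responses = np.digitize(responses, np.histogram_bin_edges(responses, n_bins))
    joint = np.zeros((stimuli.max() + 1, responses.max() + 1))
    for s, r in zip(stimuli, responses):
        joint[s, r] += 1
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)   # P(S)
    pr = joint.sum(axis=0, keepdims=True)   # P(R)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=2000)              # 4 stimulus classes
resp = stim + 0.8 * rng.standard_normal(2000)     # tuned, noisy response
print(f"I(S;R) ~ {mutual_information(stim, resp):.2f} bits")
```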

  19. Analysis of Information Publicity System%信息公开制度探析

    Institute of Scientific and Technical Information of China (English)

    朱庆华; 颜祥林

    2001-01-01

    The information publicity system provides a legal guarantee for exploiting government information resources. This paper discusses the reasons for establishing such a system and, based on the Japanese Freedom of Information Act, discusses the main content of an information publicity system.

  20. The Influence of Place-Based Communities on Information Behavior: A Comparative Grounded Theory Analysis

    Science.gov (United States)

    Gibson, Amelia N.

    2013-01-01

    This study examines the effect of experiential place and local community on information access and behavior for two communities of parents of children with Down syndrome. It uncovers substantive issues associated with health information seeking, government and education-related information access, and information overload and avoidance within the…

  1. Politics and technology in health information systems development: a discourse analysis of conflicts addressed in a systems design group.

    Science.gov (United States)

    Irestig, Magnus; Timpka, Toomas

    2008-02-01

    Different types of disagreements must be managed during the development of health information systems. This study examines the antagonisms discussed during the design of an information system for 175,000 users in a public health context. Discourse analysis methods were used for data collection and analysis. Three hundred and twenty-six conflict events were identified from four design meetings and divided into 16 categories. There were no differences regarding the types of conflicts that the different participants brought into the design discussions. Instead, conflict occurrence was primarily affected by the agendas that set the stage for examinations and debates. The results indicate that the selection of design method and the structure used for the meetings are important factors for the manner in which conflicts are brought into consideration during health information system design. Further studies comparing participatory and non-participatory information system design practices in health service settings are warranted.

  2. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in the field. The majority of benthic cover mapping makes use of satellite images, and classification is usually carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method to underwater photos, but made use of different classification methods, such as neural networks and rapid classification via downsampling. In this study, an attempt was made to use accurate bathymetric data obtained using a multi-beam echo sounder (MBES) as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 and 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).

  3. Extraction of Benthic Cover Information from Video Tows and Photographs Using Object-Based Image Analysis

    Science.gov (United States)

    Estomata, M. T. L.; Blanco, A. C.; Nadaoka, K.; Tomoling, E. C. M.

    2012-07-01

    Mapping benthic cover in deep waters comprises a very small proportion of studies in the field. The majority of benthic cover mapping makes use of satellite images, and classification is usually carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method to underwater photos, but made use of different classification methods, such as neural networks and rapid classification via downsampling. In this study, an attempt was made to use accurate bathymetric data obtained using a multi-beam echo sounder (MBES) as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 and 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
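    The sketch below mirrors the kind of rule set described above, applying the quoted area thresholds (at most 700 pixels for fish, 700 to 10,000 pixels for rubble) and a standard-deviation texture cue to segmented image objects. The object attributes, texture threshold and fallback classes are illustrative assumptions, not the study's actual rules.

```python
# Sketch: rule-based labeling of segmented image objects, mirroring the
# OBIA thresholds quoted above (area <= 700 px for fish, 700-10,000 px for
# rubble, standard deviation as a texture cue). The attribute values and
# the texture threshold are illustrative assumptions, not the study's rules.
from dataclasses import dataclass

@dataclass
class ImageObject:
    area_px: int      # object size from segmentation
    std_dev: float    # spectral standard deviation (texture proxy)

def classify(obj: ImageObject, texture_threshold: float = 12.0) -> str:
    if obj.area_px <= 700 and obj.std_dev > texture_threshold:
        return "fish"
    if 700 < obj.area_px <= 10_000 and obj.std_dev > texture_threshold:
        return "rubble"
    if obj.std_dev <= texture_threshold:
        return "sand"
    return "coral"

objects = [ImageObject(450, 15.3), ImageObject(5200, 18.9), ImageObject(30_000, 4.1)]
print([classify(o) for o in objects])  # ['fish', 'rubble', 'sand']
```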

  4. Biological data analysis as an information theory problem: multivariable dependence measures and the shadows algorithm.

    Science.gov (United States)

    Sakhanenko, Nikita A; Galas, David J

    2015-11-01

    Information theory is valuable in multiple-variable analysis for being model-free and nonparametric, and for its modest sensitivity to undersampling. We previously introduced a general approach to finding multiple dependencies that provides accurate measures of levels of dependency for subsets of variables in a data set, which is significantly nonzero only if the subset of variables is collectively dependent. This is useful, however, only if we can avoid a combinatorial explosion of calculations for increasing numbers of variables. The proposed dependence measure for a subset of variables, τ, differential interaction information, Δ(τ), has the property that for subsets of τ some of the factors of Δ(τ) are significantly nonzero when the full dependence includes more variables. We use this property to suppress the combinatorial explosion by following the "shadows" of multivariable dependency on smaller subsets. Rather than calculating the marginal entropies of all subsets at each degree level, we need to consider only calculations for subsets of variables with appropriate "shadows." The number of calculations for n variables at a degree level of d therefore grows at a much smaller rate than the binomial coefficient (n, d), but depends on the parameters of the "shadows" calculation. This approach, avoiding a combinatorial explosion, enables the use of our multivariable measures on very large data sets. We demonstrate this method on simulated data sets, and characterize the effects of noise and sample numbers. In addition, we analyze a data set of a few thousand mutant yeast strains interacting with a few thousand chemical compounds.
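    For intuition, the sketch below computes the classical (McGill) interaction information for a variable triplet from marginal and joint entropies. It is a building block related to, but not identical to, the differential interaction information Δ(τ) introduced above; the XOR example shows a collectively dependent triplet whose pairwise dependencies all vanish.

```python
# Sketch: McGill interaction information for a triplet of discrete variables,
# computed from marginal and joint entropies. This is a classical building
# block related to -- but not identical to -- the differential interaction
# information Delta(tau) introduced by the authors.
import numpy as np

def entropy(*cols):
    # Joint Shannon entropy (bits) of one or more discrete columns.
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def interaction_information(x, y, z):
    return (entropy(x) + entropy(y) + entropy(z)
            - entropy(x, y) - entropy(x, z) - entropy(y, z)
            + entropy(x, y, z))

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 10_000)
y = rng.integers(0, 2, 10_000)
z = x ^ y  # XOR: pairwise independent, yet collectively dependent
print(f"II(X;Y;Z) = {interaction_information(x, y, z):.3f} bits")  # ~ -1.0
```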

  5. Morphological analysis of the primary center receiving spatial information transferred by the waggle dance of honeybees.

    Science.gov (United States)

    Ai, Hiroyuki; Hagio, Hiromi

    2013-08-01

    The waggle dance of honeybees roughly encodes the distance and direction to the food source as the duration of the waggle phase and the body angle during the waggle phase. It is believed that hive-mates detect airborne vibrations produced during the waggle phase to acquire distance information and simultaneously detect the body axis during the waggle phase to acquire direction information. It has been further proposed that the orientation of the body axis on the vertical comb is detected by neck hairs (NHs) on the prosternal organ. The afferents of the NHs project into the prothoracic and mesothoracic ganglia and the dorsal subesophageal ganglion (dSEG). This study demonstrates somatotopic organization within the dSEG of the central projections of the mechanosensory neurons of the NHs. The terminals of the NH afferents in dSEG are in close apposition to those of Johnston's organ (JO) afferents. The sensory axons of both terminate in a region posterior to the crossing of the ventral intermediate tract (VIT) and the maxillary dorsal commissures I and III (MxDCI, III) in the subesophageal ganglion. These features of the terminal areas of the NH and JO afferents are common to the worker, drone, and queen castes of honeybees. Analysis of the spatial relationship between the NH neurons and the morphologically and physiologically characterized vibration-sensitive interneurons DL-Int-1 and DL-Int-2 demonstrated that several branches of DL-Int-1 are in close proximity to the central projection of the mechanosensory neurons of the NHs in the dSEG. PMID:23297020

  6. Biological data analysis as an information theory problem: multivariable dependence measures and the shadows algorithm.

    Science.gov (United States)

    Sakhanenko, Nikita A; Galas, David J

    2015-11-01

    Information theory is valuable in multiple-variable analysis for being model-free and nonparametric, and for its modest sensitivity to undersampling. We previously introduced a general approach to finding multiple dependencies that provides accurate measures of levels of dependency for subsets of variables in a data set, which is significantly nonzero only if the subset of variables is collectively dependent. This is useful, however, only if we can avoid a combinatorial explosion of calculations for increasing numbers of variables. The proposed dependence measure for a subset of variables, τ, differential interaction information, Δ(τ), has the property that for subsets of τ some of the factors of Δ(τ) are significantly nonzero when the full dependence includes more variables. We use this property to suppress the combinatorial explosion by following the "shadows" of multivariable dependency on smaller subsets. Rather than calculating the marginal entropies of all subsets at each degree level, we need to consider only calculations for subsets of variables with appropriate "shadows." The number of calculations for n variables at a degree level of d therefore grows at a much smaller rate than the binomial coefficient (n, d), but depends on the parameters of the "shadows" calculation. This approach, avoiding a combinatorial explosion, enables the use of our multivariable measures on very large data sets. We demonstrate this method on simulated data sets, and characterize the effects of noise and sample numbers. In addition, we analyze a data set of a few thousand mutant yeast strains interacting with a few thousand chemical compounds. PMID:26335709

  7. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    Full Text Available In this paper we chose to present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents, and different indicators calculated based on them, allow us to formulate assessments of the performance and profitability of various functions and levels of the company's activity. This paper aims to support the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss statement or the cash flow account that better reflects the performance of a business? Based on the specialty literature studied, we attempted a conceptual, analytical and practical approach to the term performance, reviewing some terminological acceptations of the term as well as the main indicators of performance analysis based on the profit and loss account and the cash flow statement: aggregated indicators, also known as intermediary balances of administration, economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and rate of generating operating cash out of gross operating result. At the same time we took a comparative approach to the profit and loss account and the cash flow statement, outlining the main advantages and disadvantages of these documents. In order to demonstrate the above theoretical assessments, we chose to analyze these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  8. Opinions of ICT Teachers about Information Technology Course Implementations: A Social Media Analysis

    Directory of Open Access Journals (Sweden)

    Alaattin Parlakkılıç

    2014-01-01

    Full Text Available The use of Information and Communication Technologies (ICT) is increasing in education, and ICT teachers have important roles and responsibilities in the ICT world. In this study, the problems of ICT teachers and the solutions they suggested were evaluated by analyzing their messages and shared information on the Internet and social media. Document analysis was used as the qualitative data collection method. The research group consisted of ICT teachers who worked in Turkish secondary schools from July 2012 to July 2013 and used social media. Teachers' opinions and suggested solutions in social media (forums, blogs, Facebook and Twitter) were obtained and categorized into six areas: the compulsory course, curriculum, personal rights and job definitions, the Fatih Project, ICT infrastructure, and innovative ideas. The data were evaluated categorically in frequency and percentage. The evaluation showed that the suggested solutions provide a great asset for innovation and change in education. In this context, problems with employees' personal rights (f=61, 31.9%) were the most important, and the suggested solutions call for legal arrangements. In second place, the compulsory course (f=49, 29.9%) was stated. Inadequacy of the curriculum and the need for updates (f=28, 14.6%) was the third most discussed topic. Progressive applications and renovations (f=23, 12.1%) were in fourth place. In fifth place, it was expressed that the probability of success of the Fatih Project (f=21, 11.0%) was low in the current situation and that ICT teachers must be included in the project. Lastly, it was seen that infrastructure and support (f=18, 9.5%) were required for development.

  9. Measuring child maltreatment using multi-informant survey data: a higher-order confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    Giovanni A. Salum

    2016-03-01

    Full Text Available Objective To investigate the validity and reliability of a multi-informant approach to measuring child maltreatment (CM) comprising seven questions assessing CM administered to children and their parents in a large community sample. Methods Our sample comprised 2,512 children aged 6 to 12 years and their parents. Child maltreatment (CM) was assessed with three questions answered by the children and four answered by their parents, covering physical abuse, physical neglect, emotional abuse and sexual abuse. Confirmatory factor analysis was used to compare the fit indices of different models. Convergent and divergent validity were tested using parent-report and teacher-report scores on the Strengths and Difficulties Questionnaire. Discriminant validity was investigated using the Development and Well-Being Assessment to divide subjects into five diagnostic groups: typically developing controls (n = 1,880), fear disorders (n = 108), distress disorders (n = 76), attention deficit hyperactivity disorder (n = 143) and oppositional defiant disorder/conduct disorder (n = 56). Results A higher-order model with one higher-order factor (child maltreatment) encompassing two lower-order factors (child report and parent report) exhibited the best fit to the data and this model's reliability results were acceptable. As expected, child maltreatment was positively associated with measures of psychopathology and negatively associated with prosocial measures. All diagnostic category groups had higher levels of overall child maltreatment than typically developing children. Conclusions We found evidence for the validity and reliability of this brief measure of child maltreatment using data from a large survey combining information from parents and their children.

  10. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information

    Science.gov (United States)

    Benz, Ursula C.; Hofmann, Peter; Willhauck, Gregor; Lingenfelder, Iris; Heynen, Markus

    Remote sensing from airborne and spaceborne platforms provides valuable data for mapping, environmental monitoring, disaster management and civil and military intelligence. However, to exploit the full value of these data, the appropriate information has to be extracted and presented in standard formats so that it can be imported into geo-information systems and thus allow efficient decision processes. The object-oriented approach can contribute to powerful automatic and semi-automatic analysis for most remote sensing applications. Synergistic use with pixel-based or statistical signal-processing methods exploits the rich information content. Here, we explain principal strategies of object-oriented analysis, discuss how the combination with fuzzy methods allows implementing expert knowledge, and describe a representative example of the proposed workflow from remote sensing imagery to GIS. The strategies are demonstrated using the first object-oriented image analysis software on the market, eCognition, which provides an appropriate link between remote sensing imagery and GIS.
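    A minimal sketch of the fuzzy, object-oriented classification idea follows, assuming object features (for example NDVI and mean near-infrared brightness) have already been computed by a segmentation step. The feature names, trapezoidal membership parameters and classes are illustrative assumptions, not eCognition's actual rule base.

```python
# Sketch: fuzzy, object-oriented classification of image objects. Trapezoidal
# membership functions encode expert knowledge about object features; class
# membership is the minimum (fuzzy AND) over the per-feature memberships.
# Feature names and thresholds are illustrative, not eCognition's rule base.
def trapezoid(x, a, b, c, d):
    """Membership rising on [a,b], full on [b,c], falling on [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

CLASS_RULES = {
    # class: {feature: trapezoid parameters (a, b, c, d)}
    "water":  {"ndvi": (-1.0, -0.9, 0.0, 0.1), "mean_nir": (0, 1, 40, 80)},
    "forest": {"ndvi": (0.4, 0.6, 0.99, 1.0),  "mean_nir": (60, 100, 254, 255)},
}

def classify(obj_features):
    scores = {
        cls: min(trapezoid(obj_features[f], *params)
                 for f, params in rules.items())
        for cls, rules in CLASS_RULES.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]   # label plus fuzzy confidence

print(classify({"ndvi": 0.72, "mean_nir": 130}))  # ('forest', 1.0)
```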

  11. Trial sequential analysis reveals insufficient information size and potentially false positive results in many meta-analyses

    DEFF Research Database (Denmark)

    Brok, J.; Thorlund, K.; Gluud, C.;

    2008-01-01

    OBJECTIVES: To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous to interim monitoring boundaries in a single trial. STUDY DESIGN AND SETTING: We applied TSA on meta-analyses performed in Cochrane Neonatal reviews. We calculated information sizes and monitoring boundaries with three different anticipated intervention effects of 30% relative risk reduction (TSA(30%)), 15% (TSA(15%)), or a risk reduction suggested by low-bias risk trials of the meta-analysis corrected for heterogeneity (TSA(LBHIS)). RESULTS: A total of 174 meta-analyses were eligible; 79 out of 174 (45%) meta-analyses were statistically significant (P

  12. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    Science.gov (United States)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach Adaptive Delta Management, executed and developed by the Dutch Delta program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA provides stepwise guidance for incorporating climate robustness: problem definition, stress test, alternatives formulation and recommendation, evaluation and selection. In the presentation, the origin, goal, steps and practical tools available at each step of CRIDA will be explained. In two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al., "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.), the application of CRIDA to cases is explained

  13. Geographic information system for fusion and analysis of high-resolution remote sensing and ground data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1993-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements, in the context of a Geographical Information System (GIS), integrated with specialized image processing software. We will use this integrated system to analyze the data from two Case Studies, one at a boreal forest site, the other a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990s, unprecedented amounts of high-resolution images of the Earth's surface from space will become available to the applications scientist from the LANDSAT/TM series, European and Japanese ERS-1 satellites, RADARSAT and SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study his site of interest, may be faced with massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing his results and validating his model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the Case Studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case 1) calibrated DC-8 SAR (Synthetic Aperture Radar) data and extensive ground truth measurements are already at our disposal. The data set shows documented evidence of temporal change. The Belize Forest Experiment (Case 2) will produce calibrated DC-8 SAR

  14. Prototype integration of protein electrophoresis laboratory results in an information warehouse to improve workflow and data analysis.

    Science.gov (United States)

    Liu, Jianhua; Silvey, Scott A; Bissell, Michael G; Saltz, Joel H; Kamal, Jyoti

    2006-01-01

    This poster demonstrates our efforts to enhance workflow and clinical analysis of protein electrophoresis (PEP) data through integration with the Information Warehouse (IW) at The Ohio State University Medical Center (OSUMC). A new desktop application has been developed with the aim of enabling more efficient and accurate gel analysis by clinical pathologists. This tool gives the pathologists the ability to perform their analysis conveniently from anywhere on the OSUMC network along with the aid of numerical analysis algorithms, image enhancement techniques, and access to historical PEP results for the given patient.

  15. Informed consent and placebo effects: a content analysis of information leaflets to identify what clinical trial participants are told about placebos.

    Directory of Open Access Journals (Sweden)

    Felicity L Bishop

    Full Text Available BACKGROUND: Placebo groups are used in randomised clinical trials (RCTs) to control for placebo effects, which can be large. Participants in trials can misunderstand written information, particularly regarding technical aspects of trial design such as randomisation; the adequacy of written information about placebos has not been explored. We aimed to identify what participants in major RCTs in the UK are told about placebos and their effects. METHODS AND FINDINGS: We conducted a content analysis of 45 Participant Information Leaflets (PILs) using quantitative and qualitative methodologies. PILs were obtained from trials on a major registry of current UK clinical trials (the UKCRN database). Eligible leaflets were received from 44 non-commercial trials but only 1 commercial trial. The main limitation is the low response rate (13.5%), but characteristics of included trials were broadly representative of all non-commercial trials on the database. 84% of PILs were for trials with 50:50 randomisation ratios, yet in almost every comparison the target treatments were prioritized over the placebos. Placebos were referred to significantly less frequently than target treatments (7 vs. 27 mentions, p<0.001) and were significantly less likely than target treatments to be described as triggering either beneficial effects (1 vs. 45, p<0.001) or adverse effects (4 vs. 39, p<0.001). 8 PILs (18%) explicitly stated that the placebo treatment was either undesirable or ineffective. CONCLUSIONS: PILs from recent high quality clinical trials emphasise the benefits and adverse effects of the target treatment, while largely ignoring the possible effects of the placebo. Thus they provide incomplete and at times inaccurate information about placebos. Trial participants should be more fully informed about the health changes that they might experience from a placebo. To do otherwise jeopardises informed consent and is inconsistent with not only the science of placebos but also the

  16. Influence analysis of information erupted on social networks based on SIR model

    Science.gov (United States)

    Zhou, Xue; Hu, Yong; Wu, Yue; Xiong, Xi

    2015-07-01

    In this paper, according to the similarity between the chain-reaction principle and the characteristics of information propagation on social networks, we propose the new term "information bomb". Based on complex networks and the SIR model, dynamical evolution equations are set up. Methods for evaluating the four indexes of bomb power are then given, including influence breadth, influence strength, peak time and relaxation time. Finally, the power of information is ascertained through these indexes. The process of information propagation is simulated to illustrate the spreading characteristics. Parameters that affect the power of the information bomb are then analyzed, and some methods for controlling the propagation of information are given.
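    The record does not reproduce its dynamical evolution equations, but they build on the standard SIR compartment model; the sketch below integrates that baseline model and reads off two of the four power indexes mentioned above (influence breadth as the final recovered fraction, and peak time). The rates β and γ are assumed values.

```python
# Sketch: a standard SIR compartmental model of information spreading, of the
# kind the record builds on. beta (spreading rate) and gamma (recovery /
# loss-of-interest rate) are assumed values; the paper's exact dynamical
# equations are not reproduced here.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    s, i, r = y
    return [-beta * s * i,              # susceptible users hear the news
            beta * s * i - gamma * i,   # spreaders arise and lose interest
            gamma * i]                  # recovered users stop spreading

beta, gamma = 0.5, 0.1                  # assumed spreading / recovery rates
t = np.linspace(0, 60, 601)
s, i, r = odeint(sir, [0.999, 0.001, 0.0], t, args=(beta, gamma)).T

peak = t[i.argmax()]
print(f"influence breadth ~ {r[-1]:.2f}, peak time ~ {peak:.1f}")
```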

  17. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.

    2013-11-01

    Full Text Available The goal of the article is the identification of directions for improving the accounting system at an enterprise to support procedures of strategic analysis with trustworthy information. Historical (for the study of conditions of appearance and development of the strategic analysis) and logical (for identification of directions of improvement of accounting) methods were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information that characterises such resources of an enterprise as scientific research and development, personnel, and quality of products (services). Among the indicators that provide such information, the article selects indicators of innovation activity costs and personnel training costs, the accounting of which is not sufficiently regulated. In order to meet the information requirements of analysts, it proposes improving accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by places of appearance of expenses and responsibility centres; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and innovation introduction costs. The article offers a form for calculating savings from cost reductions obtained due to governmental privileges provided to enterprises that introduce innovations and invest in personnel training.

  18. Research study on analysis/use technologies of genome information; Genome joho kaidoku riyo gijutsu no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    For wide use of genome information in the industrial field, the required R and D was surveyed from the standpoints of biology and information science. To clarify the present state and issues of the international research on genome analysis, the genome map as well as sequence and function information are first surveyed. The current analysis/use technologies of genome information are analyzed, and the following are summarized: prediction and identification of gene regions in genome sequences, techniques for searching and selecting useful genes, and techniques for predicting the expression of gene functions and the gene-product structure and functions. It is recommended that R and D and data collection/interpretation necessary to clarify inter-gene interactions and information networks should be promoted by integrating Japanese advanced know-how and technologies. As examples of the impact of the research results on industry and society, the present state and future expected effect are summarized for medicines, diagnosis/analysis instruments, chemicals, foods, agriculture, fishery, animal husbandry, electronics, environment and information. 278 refs., 42 figs., 5 tabs.

  19. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrodes locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion The new toolbox presented here implements fast

  20. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume synchronized Probability of Informed trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal hours before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, which culminated in $1 trillion of market value disappearing, only to recover those losses twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set a handful of free parameters. The values of these parameters significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of parameter combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search) by Audet, Le Digabel, and Tribes (2009) and Le Digabel (2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we have conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). Results have shown the dominance of two parameters in the computation of the FPR. Using the outputs from the NOMAD optimization and the sensitivity analysis, we recommend a range of values for each of the free parameters that perform well on a large set of futures trading records.
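    For orientation, the sketch below computes a bare-bones VPIN from a trade stream: trades are grouped into equal-volume buckets and the metric is the rolling mean absolute buy/sell imbalance. The bucket volume and window length are free parameters of the kind the study tunes; the tick-rule trade classification used here is a simplification of the bulk-volume classification in the original metric.

```python
# Sketch: a minimal VPIN computation. Trades are grouped into equal-volume
# buckets; VPIN is the mean absolute buy/sell imbalance over a rolling window
# of buckets. Bucket volume V and window n are free parameters of the kind
# tuned in the study; buy/sell classification here uses a simple tick rule
# rather than the bulk-volume classification of the original metric.
import numpy as np

def vpin(prices, volumes, bucket_volume, window):
    buy = sell = filled = 0.0
    imbalances, last_price = [], prices[0]
    for p, v in zip(prices, volumes):
        side_buy = p >= last_price          # tick rule (simplifying assumption)
        last_price = p
        while v > 0:                        # a trade may span bucket boundaries
            take = min(v, bucket_volume - filled)
            if side_buy: buy += take
            else:        sell += take
            filled += take; v -= take
            if filled >= bucket_volume:     # bucket complete
                imbalances.append(abs(buy - sell) / bucket_volume)
                buy = sell = filled = 0.0
    x = np.asarray(imbalances)
    # rolling mean of imbalance over the last `window` buckets
    return np.convolve(x, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(42)
prices = 100 + np.cumsum(rng.standard_normal(5000)) * 0.01
vols = rng.integers(1, 100, 5000).astype(float)
print(vpin(prices, vols, bucket_volume=2000, window=50)[-5:])
```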

  1. Health-care district management information system plan: Review of operations analysis activities during calendar year 1975 and plan for continued research and analysis activities

    Science.gov (United States)

    Nielson, G. J.; Stevenson, W. G.

    1976-01-01

    Operations research activities developed to identify the information required to manage both the efficiency and effectiveness of the Veterans Administration (VA) health services as these services relate to individual patient care are reported. The clinical concerns and management functions that determine this information requirement are discussed conceptually. Investigations of existing VA data for useful management information are recorded, and a diagnostic index is provided. The age-specific characteristics of diseases and lengths of stay are explored, and recommendations for future analysis activities are articulated. The effect of the introduction of new technology to health care is also discussed.

  2. Effects of information on young consumers' willingness to pay for genetically modified food: experimental auction analysis.

    Science.gov (United States)

    Kajale, Dilip B; Becker, T C

    2014-01-01

    This study examines the effects of information on consumers' willingness to pay (WTP) for genetically modified food (GMF). We used the Vickrey second-price experimental auction method to elicit consumer WTP for GM potato chips and a GM soya-chocolate bar. The sample used in this study was university students from Delhi, India. Four information formats (positive, negative, no information, and combined information about GM technology) were used for the examination. The results show that when students received the combined information they were willing to pay around a 17%-20% premium for GMF, and when they received the negative information they demanded around a 22% discount for GMF. The positive- and no-information formats alone had no considerable effect on consumers' WTP for GMF. Overall, our findings suggest that when marketing GMF in India, the best strategy is to provide combined information about GM technology.

  3. Effects of information on young consumers' willingness to pay for genetically modified food: experimental auction analysis.

    Science.gov (United States)

    Kajale, Dilip B; Becker, T C

    2014-01-01

    This study examines the effects of information on consumers' willingness to pay (WTP) for genetically modified food (GMF). We used the Vickrey second-price experimental auction method to elicit consumer WTP for GM potato chips and a GM soya-chocolate bar. The sample used in this study was university students from Delhi, India. Four information formats (positive, negative, no information, and combined information about GM technology) were used for the examination. The results show that when students received the combined information they were willing to pay around a 17%-20% premium for GMF, and when they received the negative information they demanded around a 22% discount for GMF. The positive- and no-information formats alone had no considerable effect on consumers' WTP for GMF. Overall, our findings suggest that when marketing GMF in India, the best strategy is to provide combined information about GM technology. PMID:24735210

  4. A scheme for racquet sports video analysis with the combination of audio-visual information

    Science.gov (United States)

    Xing, Liyuan; Ye, Qixiang; Zhang, Weigang; Huang, Qingming; Yu, Hua

    2005-07-01

    As a very important category of sports video, racquet sports video, e.g. table tennis, tennis and badminton, has received little attention in past years. Considering the characteristics of this kind of sports video, we propose a new scheme for structure indexing and highlight generation based on the combination of audio and visual information. Firstly, a supervised classification method is employed to detect important audio symbols including impact (ball hit), audience cheers, commentator speech, etc. Meanwhile, an unsupervised algorithm is proposed to group video shots into various clusters. Then, by taking advantage of the temporal relationship between audio and visual signals, we can label the scene clusters with semantic labels including rally scenes and break scenes. Thirdly, a refinement procedure is developed to reduce false rally scenes by further audio analysis. Finally, an excitement model is proposed to rank the detected rally scenes, from which many exciting video clips such as game (match) points can be correctly retrieved. Experiments on two types of representative racquet sports video, table tennis video and tennis video, demonstrate encouraging results.

  5. QTL linkage analysis of connected populations using ancestral marker and pedigree information.

    Science.gov (United States)

    Bink, Marco C A M; Totir, L Radu; ter Braak, Cajo J F; Winkler, Christopher R; Boer, Martin P; Smith, Oscar S

    2012-04-01

    The common assumption in quantitative trait locus (QTL) linkage mapping studies that parents of multiple connected populations are unrelated is unrealistic for many plant breeding programs. We remove this assumption and propose a Bayesian approach that clusters the alleles of the parents of the current mapping populations from locus-specific identity by descent (IBD) matrices that capture ancestral marker and pedigree information. Moreover, we demonstrate how the parental IBD data can be incorporated into a QTL linkage analysis framework by using two approaches: a Threshold IBD model (TIBD) and a Latent Ancestral Allele Model (LAAM). The TIBD and LAAM models are empirically tested via numerical simulation based on the structure of a commercial maize breeding program. The simulations included a pilot dataset with closely linked QTL on a single linkage group and 100 replicated datasets with five linkage groups harboring four unlinked QTL. The simulation results show that including parental IBD data (similarly for TIBD and LAAM) significantly improves the power and particularly accuracy of QTL mapping, e.g., position, effect size and individuals' genotype probability without significantly increasing computational demand.

  6. Social Media-Based Collaborative Information Access: Analysis of Online Crisis-Related Twitter Conversations

    OpenAIRE

    Tamine, Lynda; Soulier, Laure; Ben Jabeur, Lamjed; Amblard, Frederic; Hanachi, Chihab; Hubert, Gilles; Roth, Camille

    2016-01-01

    The notion of implicit (or explicit) collaborative information access refers to systems and practices allowing a group of users to unintentionally (respectively intentionally) seek, share and retrieve information to achieve similar (respectively shared) information-related goals. Despite an increasing adoption in social environments, collaboration behavior in information seeking and retrieval is mainly limited to small-sized groups, generally restricted to working spaces. Much remains to be l...

  7. Workplace Information Literacy in the Scientific Field : an Empirical Analysis Using the Semantic Differential Approach

    OpenAIRE

    Mühlbacher, Susanne; Hammwöhner, Rainer; Wolff, Christian

    2008-01-01

    The study focuses on eliciting a semantic concept of Information Literacy by capturing information workers' perception of the information process. It is presumed that this perception influences the formation and advancement of Information Literacy at the workplace. The approach is based on the creation of a semantic differential scale. The target group is scientists from the field of natural sciences. The survey shows that five partly correlated principal aspects play a major role: personal ...

  8. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    Directory of Open Access Journals (Sweden)

    G. Naumann

    2013-10-01

    Full Text Available Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided in three
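    A composite indicator of this kind is typically built by normalizing each component and aggregating with weights; the sketch below shows that construction for the four DVI components named above. The component values and the equal weights are illustrative assumptions, not the study's data.

```python
# Sketch: building a composite Drought Vulnerability Indicator by min-max
# normalizing each component and taking a weighted average over the four
# components named above. Component values and weights are illustrative
# assumptions, not those used in the study.
import numpy as np

components = {  # rows: countries; columns follow `names`
    "names": ["renewable_natural_capital", "economic_capacity",
              "human_civic_resources", "infrastructure_technology"],
    "values": np.array([[0.2, 0.5, 0.3, 0.4],
                        [0.7, 0.6, 0.8, 0.5],
                        [0.4, 0.3, 0.5, 0.6]]),
}
weights = np.array([0.25, 0.25, 0.25, 0.25])   # assumed equal weighting

x = components["values"]
normed = (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))  # min-max per component
dvi = normed @ weights
print(dvi)  # one composite vulnerability score per country
```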

  9. The Key Roles in the Informal Organization: A Network Analysis Perspective

    Science.gov (United States)

    de Toni, Alberto F.; Nonino, Fabio

    2010-01-01

    Purpose: The purpose of this paper is to identify the key roles embedded in the informal organizational structure (informal networks) and to outline their contribution in the companies' performance. A major objective of the research is to find and characterize a new key informal role that synthesises problem solving, expertise, and accessibility…

  10. Qualitative and quantitative information flow analysis for multi-thread programs

    NARCIS (Netherlands)

    Ngo, Tri Minh

    2014-01-01

    In today's information-based society, guaranteeing information security plays an important role in all aspects of life: communication between citizens and governments, military, companies, financial information systems, web-based services etc. With the increasing popularity of computer systems with

  11. Analysis of medication information exchange at discharge from a Dutch hospital

    NARCIS (Netherlands)

    van Berlo-van de laar, Inge R. F.; Driessen, Erwin; Merkx, Maria M.; Jansman, Frank G. A.

    2012-01-01

    Background At hospitalisation and discharge the risk of errors in medication information transfer is high. Objective To study the routes by which medication information is transferred during discharge from Deventer Hospital, and to improve medication information transfer. Setting Eight hospital ward

  12. A SWOT analysis on the implementation of Building Information Models within the geospatial environment

    NARCIS (Netherlands)

    Isikdag, U.; Zlatanova, S.

    2009-01-01

    Building Information Models as product models and Building Information Modelling as a process which supports information management throughout the lifecycle of a building are becoming more widely used in the Architecture/Engineering/Construction (AEC) industry. In order to facilitate various urban m

  13. Analysis of Influencing Factors of Technical Barriers on Information Product Exports Based on the Fuzzy AHP

    Institute of Scientific and Technical Information of China (English)

    YuyingWu; NaLi

    2004-01-01

    Considering the actual impact of technical barriers to trade (TBT) on information products, and based on the concept of the triangular fuzzy number, this paper forms the fuzzy matrix of factors affecting information product exports, then uses the fuzzy AHP to analyze and rank the factors. We put forward suggestions on how information product enterprises can guard against and overcome technical barriers to trade.
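    To illustrate the machinery, the sketch below derives crisp factor weights from a triangular-fuzzy pairwise comparison matrix using the fuzzy geometric-mean method with centroid defuzzification, one common fuzzy-AHP variant. The comparison matrix is invented; the paper's actual factors, judgments and aggregation variant are not reproduced.

```python
# Sketch: deriving factor weights from a triangular-fuzzy pairwise comparison
# matrix via the fuzzy geometric-mean method, then defuzzifying by centroid.
# The 3x3 comparison matrix below is invented for illustration; the paper's
# actual factors and judgments are not reproduced.
import numpy as np

# Each entry is a triangular fuzzy number (l, m, u); A[j][i] = 1 / A[i][j].
A = np.array([
    [(1, 1, 1),       (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1),   (1, 1, 1)],
])                                          # shape (3, 3, 3): row, col, (l, m, u)

geo = A.prod(axis=1) ** (1 / A.shape[0])    # fuzzy geometric mean per row
total = geo.sum(axis=0)                     # fuzzy column totals (l, m, u)
# Fuzzy weight w_i = geo_i / total: divide (l, m, u) by reversed (u, m, l).
fuzzy_w = geo / total[::-1]
crisp_w = fuzzy_w.mean(axis=1)              # centroid defuzzification
crisp_w /= crisp_w.sum()                    # normalize weights to sum to 1
print(crisp_w)                              # one weight per factor
```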

  14. Persistence Factors of Women in Information Technology--A Multiple Case Study Analysis

    Science.gov (United States)

    Hua, David M.

    2010-01-01

    Women have historically been underrepresented in the field of information technology. The literature related to the underrepresentation of women in information technology has focused on developing strategies for attracting more females into the industry. Despite these efforts, the number of women in information technology has been declining. The…

  15. SWOT Analysis of King Abdullah II School for Information Technology at University of Jordan According to Quality Assurance Procedures

    Directory of Open Access Journals (Sweden)

    Lubna Naser Eddeen

    2013-02-01

    Full Text Available Many books and research papers have defined and referred to the term SWOT analysis. SWOT analysis can be defined as a "strategic planning method used to evaluate the Strengths, Weaknesses, Opportunities, and Threats involved in a project or in a business venture". It is used to assess the internal and external environmental factors which affect the business. This paper analyzes the main SWOT factors at King Abdullah II School for Information Technology.

  16. SMALL Savannah: an information system for the integrated analysis of land use change in the Far North of Cameroon

    OpenAIRE

    Fotsing, Eric

    2009-01-01

    SMALL Savannah is an Environmental Information System designed for the integrated analysis and sustainable land management in the savannas region of the Far North of Cameroon. This system combines an observation and spatial analysis module for the representation of phenomena from various geographic data sources, with a module for the explanation and prediction of land use pattern and changes, and a dynamic modelling and simulation module for the exploration of possible land use change traject...

  17. Can policy analysis theories predict and inform policy change? Reflections on the battle for legal abortion in Indonesia

    OpenAIRE

    Surjadjaja, Claudia; Mayhew, Susannah H.

    2010-01-01

    The relevance and importance of research for understanding policy processes and influencing policies has been much debated, but studies on the effectiveness of policy theories for predicting and informing opportunities for policy change (i.e. prospective policy analysis) are rare. The case study presented in this paper is drawn from a policy analysis of a contemporary process of policy debate on legalization of abortion in Indonesia, which was in flux at the time of the research and provided ...

  18. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    OpenAIRE

    Feofanova Iryna V.; Feofanov Lev K.

    2013-01-01

    The goal of the article is identification of directions of improvement of the accounting system at an enterprise for ensuring procedures of strategic analysis of trustworthy information. Historical (for the study of conditions of appearance and development of the strategic analysis) and logical (for identification of directions of improvement of accounting) methods were used during the study. The article establishes that the modern conditions require a system of indicators that is based both ...

  19. Display analysis with the optimal control model of the human operator. [pilot-vehicle display interface and information processing

    Science.gov (United States)

    Baron, S.; Levison, W. H.

    1977-01-01

    Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.

  20. SWOT Analysis of King Abdullah II School for Information Technology at University of Jordan According to Quality Assurance Procedures

    OpenAIRE

    Lubna Naser Eddeen; Ansar Khoury; Osama Harfoushi; Itaf Abushanab

    2013-01-01

    Many books and research papers have defined and referred to the term SWOT analysis. SWOT analysis can be defined as a "strategic planning method used to evaluate the Strengths, Weaknesses, Opportunities, and Threats involved in a project or in a business venture". It is used to assess the internal and external environmental factors which affect the business. This paper analyzes the main SWOT factors at King Abdullah II School for Information Technology.

  1. [The High Precision Analysis Research of Multichannel BOTDR Scattering Spectral Information Based on the TTDF and CNS Algorithm].

    Science.gov (United States)

    Zhang, Yan-jun; Liu, Wen-zhe; Fu, Xing-hu; Bi, Wei-hong

    2015-07-01

    Traditional BOTDR optical fiber sensing systems use a single-channel sensing fiber to measure the information features. Uncontrolled factors such as cross-sensitivity can lower the fitting precision of the scattering spectrum and worsen the deviation of the information analysis. Therefore, a BOTDR system that detects multichannel sensor information at the same time is proposed. It also provides a scattering spectrum analysis method for multichannel Brillouin optical time-domain reflection (BOTDR) sensing systems in order to extract high-precision spectral features. This method combines three-times data fusion (TTDF) with the cuckoo Newton search (CNS) algorithm. First, following the Dixon and Grubbs criteria, the method uses the data-fusion capability of the TTDF algorithm to eliminate the influence of abnormal values and reduce error signals. Second, it uses the cuckoo Newton search algorithm to improve the spectrum fitting and enhance the accuracy of the Brillouin scattering spectrum analysis: the global optimal solution is obtained by the cuckoo search, and using this solution as the initial value for local optimization with the Newton algorithm ensures the spectrum fitting precision. Information extraction at different linewidths is analyzed for the temperature scattering spectrum under a linear weight ratio of 1:9. The variance of the multichannel data fusion is about 0.0030, the center frequency of the scattering spectrum is 11.213 GHz and the temperature error is less than 0.15 K. Theoretical analysis and simulation results show that the algorithm can be used in multichannel distributed optical fiber sensing systems based on Brillouin optical time-domain reflection. It can effectively improve the accuracy of multichannel sensing signals and the precision of the Brillouin scattering spectrum analysis. PMID:26717729
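    The two-stage idea, a global stochastic search seeding a Newton-type local refinement, can be sketched on a synthetic Brillouin gain spectrum as below. A simple random-candidate search stands in for the cuckoo search, and SciPy's least-squares routine stands in for the Newton step; the data and parameters are assumptions.

```python
# Sketch: the two-stage fitting idea described above -- a global stochastic
# search (here a simple random-candidate stand-in for cuckoo search) seeds a
# local Newton-type refinement of a Lorentzian Brillouin gain spectrum fit.
# The synthetic data, Lorentzian model, and parameters are all assumptions.
import numpy as np
from scipy.optimize import least_squares

def lorentzian(f, p):
    a, f0, w = p          # amplitude, center frequency (GHz), linewidth (GHz)
    return a / (1 + ((f - f0) / (w / 2)) ** 2)

rng = np.random.default_rng(7)
f = np.linspace(10.8, 11.6, 400)
truth = (1.0, 11.213, 0.06)
y = lorentzian(f, truth) + 0.05 * rng.standard_normal(f.size)

# Stage 1: crude global search over random candidates (cuckoo-search stand-in).
candidates = np.column_stack([rng.uniform(0.5, 1.5, 200),
                              rng.uniform(10.9, 11.5, 200),
                              rng.uniform(0.02, 0.2, 200)])
best = min(candidates, key=lambda p: np.sum((y - lorentzian(f, p)) ** 2))

# Stage 2: local refinement from the global seed (Newton-type least squares).
fit = least_squares(lambda p: y - lorentzian(f, p), best)
print(f"center frequency ~ {fit.x[1]:.3f} GHz")   # ~11.213 GHz
```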

  2. Further Analysis Of A Framework To Analyze Network Performance Based On Information Quality

    Directory of Open Access Journals (Sweden)

    A Kazmierczak

    2011-12-01

    Full Text Available Abstract In [1], Geng and Li presented a framework to analyze network performance based on information quality. In that paper, the authors based their framework on the flow of information from a base station (BS) to clients. The theory they established can, and needs to, be extended to accommodate the flow of information from the clients to the BS. In this work, we use that framework to study the case of client-to-BS data transmission. Our work closely parallels that of Geng and Li; we use the same notation and liberally reference their work. Keywords: information theory, information quality, network protocols, network performance

  3. IQARIS : a tool for the intelligent querying, analysis, and retrieval from information systems

    International Nuclear Information System (INIS)

    Information glut is one of the primary characteristics of the electronic age. Managing such large volumes of information (e.g., keeping track of the types, where they are, their relationships, who controls them, etc.) can be done efficiently with an intelligent, user-oriented information management system. The purpose of this paper is to describe a concept for managing information resources based on an intelligent information technology system developed by the Argonne National Laboratory for managing digital libraries. The Argonne system, Intelligent Query (IQ), enables users to query digital libraries and view the holdings that match the query from different perspectives

  4. The Channels and Demands Analysis for Chinese Farmers’ Agricultural Information Acquisition

    Directory of Open Access Journals (Sweden)

    Zhensheng Tao

    2012-06-01

    Full Text Available This paper studies the characteristics of information sources and farmers' demand for agricultural information in the process of agricultural informationization in China. We point out that it is not yet common for farmers in rural areas to adopt modern information and communication technology tools, because farmers still rely mainly on traditional channels of information dissemination. To change the status quo, the government should respect farmers as the dominant actors in the process of agricultural informationization and encourage households with large-scale agricultural operations to use modern IT products to spread agricultural information in rural areas.

  5. ANALYSIS OF INTELLIGENT DATA MINING FOR INFORMATION EXTRACTION USING JAVA AGENT DEVELOPMENT ENVIRONMENT PLATFORM

    Directory of Open Access Journals (Sweden)

    M. Vinoth Kumar

    2013-01-01

    Full Text Available In this study, the problem of unstructured information extraction is analyzed and the need for a new information extraction algorithm is justified. We propose intelligent information extraction using the Java Agent Development Environment (JADE) as an optimal solution. The proposed algorithm first assigns intelligent agents to gather data, which they then classify, match, and organize to construct sequential information for a user query. This improves the efficiency and performance of retrieving proper results as a sequence that satisfies the user's needs. It gives the user the needed documents based on a mechanism of similarity between the query and relevant documents. The results obtained from the intelligent information extraction are optimal.
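
    The record describes the agent pipeline only at a high level, and JADE itself is a Java framework; the Python sketch below is merely an analogue of the gather → classify → match → organize sequence it outlines, with every name (Document, gather, the sample texts) an illustrative assumption.

```python
# Python analogue of the described agent pipeline; not JADE code.
from dataclasses import dataclass

@dataclass
class Document:
    text: str
    topic: str = ""

def gather(sources):                       # agent 1: collect raw documents
    return [Document(text=s) for s in sources]

def classify(docs):                        # agent 2: tag each document with a topic
    for d in docs:
        d.topic = "finance" if "market" in d.text.lower() else "general"
    return docs

def match(docs, query):                    # agent 3: rank by naive query overlap
    terms = set(query.lower().split())
    score = lambda d: len(terms & set(d.text.lower().split()))
    return sorted((d for d in docs if score(d) > 0), key=score, reverse=True)

def organize(docs):                        # agent 4: build the sequential answer
    return [f"[{d.topic}] {d.text}" for d in docs]

pipeline = organize(match(classify(gather([
    "Stock market rallies on earnings",
    "Weather stays mild this weekend",
])), query="market earnings"))
print("\n".join(pipeline))
```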

  6. Analysis of the question-answer service of the Emma Children's Hospital information centre.

    Science.gov (United States)

    Kruisinga, Frea H; Heinen, Richard C; Heymans, Hugo S A

    2010-07-01

    The information centre of the Emma Children's Hospital AMC (EKZ AMC) is a specialised information centre where paediatric patients and persons involved with the patient can ask questions about all aspects of disease and its social implications. The aim of the study was to evaluate the question-answer service of this information centre in order to determine the role of a specialised information centre in an academic children's hospital, identify the appropriate resources for the service, and identify potential positive effects. For this purpose, a case management system was developed in MS ACCESS. The characteristics of the requester and the question, the time it took to answer questions, the information sources used, and the extent to which we were able to answer the questions were registered. The costs of the service were determined. We analysed all questions asked in 2007. Fourteen hundred and thirty-four questions were asked. Most questions came from parents (23.3%), healthcare workers other than nurses (16.5%), and nurses (15.3%). The most frequently asked questions concerned disease (20.2%) and treatment (13.0%). Information on paper was the main information source used. Most questions could be answered within 15 min. Twelve to 28% of total working hours were used for the question-answer service. Total costs, including staff salaries, are rather large. In conclusion, taking over the task of providing additional medical information, and providing readily available, good-quality information that healthcare professionals can use to inform their patients, will reduce the time investment of these more expensive staff members. A specialised information service can anticipate the information needs of parents and persons involved with the paediatric patient. With relatively simple resources it improves information provision and has the potential to improve patient and parent satisfaction, coping, and medical results. A specialised

  7. Functional MRI Representational Similarity Analysis Reveals a Dissociation between Discriminative and Relative Location Information in the Human Visual System

    Directory of Open Access Journals (Sweden)

    Zvi N Roth

    2016-03-01

    Full Text Available Neural responses in visual cortex are governed by a topographic mapping from retinal locations to cortical responses. Moreover, at the voxel population level early visual cortex (EVC) activity enables accurate decoding of stimuli locations. However, in many cases information enabling one to discriminate between locations (i.e. discriminative information) may be less relevant than information regarding the relative location of two objects (i.e. relative information). For example, when planning to grab a cup, determining whether the cup is located at the same retinal location as the hand is hardly relevant, whereas the location of the cup relative to the hand is crucial for performing the action. We have previously used multivariate pattern analysis techniques to measure discriminative location information, and found the highest levels in early visual cortex, in line with other studies. Here we show, using representational similarity analysis, that availability of discriminative information in fMRI activation patterns does not entail availability of relative information. Specifically, we find that relative location information can be reliably extracted from activity patterns in posterior intraparietal sulcus (pIPS), but not from EVC, where we find the spatial representation to be warped. We further show that this variability in relative information levels between regions can be explained by a computational model based on an array of receptive fields. Moreover, when the model's receptive fields are extended to include inhibitory surround regions, the model can account for the spatial warping in EVC. These results demonstrate how size and shape properties of receptive fields in human visual cortex contribute to the transformation of discriminative spatial representations into relative spatial representations along the visual stream.

  8. Functional MRI Representational Similarity Analysis Reveals a Dissociation between Discriminative and Relative Location Information in the Human Visual System.

    Science.gov (United States)

    Roth, Zvi N

    2016-01-01

    Neural responses in visual cortex are governed by a topographic mapping from retinal locations to cortical responses. Moreover, at the voxel population level early visual cortex (EVC) activity enables accurate decoding of stimuli locations. However, in many cases information enabling one to discriminate between locations (i.e., discriminative information) may be less relevant than information regarding the relative location of two objects (i.e., relative information). For example, when planning to grab a cup, determining whether the cup is located at the same retinal location as the hand is hardly relevant, whereas the location of the cup relative to the hand is crucial for performing the action. We have previously used multivariate pattern analysis techniques to measure discriminative location information, and found the highest levels in EVC, in line with other studies. Here we show, using representational similarity analysis, that availability of discriminative information in fMRI activation patterns does not entail availability of relative information. Specifically, we find that relative location information can be reliably extracted from activity patterns in posterior intraparietal sulcus (pIPS), but not from EVC, where we find the spatial representation to be warped. We further show that this variability in relative information levels between regions can be explained by a computational model based on an array of receptive fields. Moreover, when the model's receptive fields are extended to include inhibitory surround regions, the model can account for the spatial warping in EVC. These results demonstrate how size and shape properties of receptive fields in human visual cortex contribute to the transformation of discriminative spatial representations into relative spatial representations along the visual stream.
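
    Records 7 and 8 describe the same receptive-field account; the sketch below illustrates the modelling logic under assumed parameters. Voxel-like responses are generated from an array of Gaussian receptive fields, with or without an inhibitory surround (difference of Gaussians), and the resulting representational dissimilarity matrix (RDM) is correlated with true physical distances to ask how faithfully relative location is preserved.

```python
# Hedged sketch of a receptive-field population model; parameters are assumed.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def rf_responses(positions, centers, sigma, surround=0.0):
    """Each row: population response to one stimulus position."""
    d2 = (positions[:, None] - centers[None, :]) ** 2
    resp = np.exp(-d2 / (2 * sigma ** 2))
    if surround:  # difference-of-Gaussians: excitatory centre, inhibitory surround
        resp -= surround * np.exp(-d2 / (2 * (3 * sigma) ** 2))
    return resp

positions = np.linspace(-10, 10, 21)          # stimulus locations (deg)
centers = np.linspace(-12, 12, 200)           # receptive-field centres

for label, surround in [("no surround", 0.0), ("inhibitory surround", 0.8)]:
    resp = rf_responses(positions, centers, sigma=2.0, surround=surround)
    rdm = squareform(pdist(resp, metric="correlation"))
    phys = squareform(pdist(positions[:, None]))     # true relative distances
    iu = np.triu_indices_from(rdm, k=1)
    rho, _ = spearmanr(rdm[iu], phys[iu])
    print(f"{label}: RDM vs physical distance rho = {rho:.2f}")
```

    In this toy model the surround typically lowers the correspondence between representational and physical distance, the kind of warping the study reports in EVC.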

  9. Implementation of Knowledge Management & Knowledge Mining in Information Analysis

    Institute of Scientific and Technical Information of China (English)

    李宏

    2003-01-01

    The traditional environment without the Internet restricts the development of information analysis work, so information analysis institutions should design their own knowledge management systems and adopt new analysis procedures. This article discusses how to utilize knowledge management and knowledge mining in information analysis and how to construct a model of an information analysis platform.

  10. Approaches for Security Requirements Analysis of Information Systems

    Institute of Scientific and Technical Information of China (English)

    曹阳; 张维明

    2003-01-01

    Security requirements analysis is a precondition for providing effective and appropriate safeguards for information systems. Based on existing theories and approaches, this paper discusses the categories and analysis procedure of security requirements in information systems. Following the basic steps of security requirements analysis, a security hazard analysis model and a security risk analysis model are presented. Finally, methods of security requirements specification and corresponding improvements are introduced.
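
    The record does not detail the hazard and risk models themselves; as a generic illustration of the scoring step common to such methods, the sketch below ranks hypothetical threats by a qualitative likelihood × impact product.

```python
# Generic qualitative risk scoring; threats and scores are hypothetical.
threats = [  # (threat, likelihood 1-5, impact 1-5)
    ("unauthorized access", 4, 5),
    ("data loss",           2, 5),
    ("service outage",      3, 3),
]
for name, likelihood, impact in sorted(threats, key=lambda t: t[1] * t[2], reverse=True):
    print(f"{name}: risk score = {likelihood * impact}")
```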

  11. Cost-utility and value-of-information analysis of early versus delayed laparoscopic cholecystectomy for acute cholecystitis

    DEFF Research Database (Denmark)

    Wilson, E; Gurusamy, K; Gluud, C;

    2010-01-01

    -of-information analysis estimated the likely return from further investment in research in this area. RESULTS: ELC is less costly (approximately -£820 per patient) and results in better quality of life (+0.05 QALYs per patient) than DLC. Given a willingness-to-pay threshold of £20 000 per QALY gained...
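
    The abstract's figures are enough for a worked example of the cost-utility arithmetic: at a willingness-to-pay of £20 000 per QALY, the incremental net monetary benefit of early over delayed surgery is NMB = 20 000 × 0.05 − (−820) = £1 820 per patient.

```python
# Worked example using the figures reported in the abstract above.
wtp = 20_000          # willingness-to-pay, GBP per QALY gained
d_cost = -820.0       # ELC minus DLC, GBP per patient (ELC is cheaper)
d_qaly = 0.05         # QALYs gained per patient with ELC

nmb = wtp * d_qaly - d_cost   # incremental net monetary benefit
print(f"Incremental net monetary benefit: {nmb:+.0f} GBP per patient")  # +1820
```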

  12. 76 FR 65317 - 60-Day Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM)

    Science.gov (United States)

    2011-10-20

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF STATE 60-Day Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM) ACTION: Notice of request for public comments. SUMMARY: The Department of State is seeking Office of Management and...

  13. Independent component analysis using prior information for signal detection in a functional imaging system of the retina

    NARCIS (Netherlands)

    Barriga, E. Simon; Pattichis, Marios; Ts’o, Dan; Abramoff, Michael; Kardon, Randy; Kwon, Young; Soliz, Peter

    2011-01-01

    Independent component analysis (ICA) is a statistical technique that estimates a set of sources mixed by an unknown mixing matrix using only a set of observations. For this purpose, the only assumption is that the sources are statistically independent. In many applications, some information about th
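
    The record is truncated before the prior-information mechanism is explained, so the sketch below shows only the standard ICA baseline the method builds on: FastICA (scikit-learn) unmixing a synthetic two-source signal. Sources, mixing matrix, and noise level are illustrative assumptions.

```python
# Plain FastICA baseline on a synthetic two-source mixture (scikit-learn).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
sources = np.c_[np.sin(2 * np.pi * 5 * t),            # e.g. a functional signal
                np.sign(np.sin(2 * np.pi * 11 * t))]  # e.g. an artifact
mixing = np.array([[1.0, 0.6], [0.4, 1.0]])           # unknown in practice
observed = sources @ mixing.T + 0.05 * rng.normal(size=(t.size, 2))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)   # estimated independent sources
print("estimated mixing matrix:\n", ica.mixing_)
```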

  14. Lessons from a comparative (cross-country) study using conjoint analysis: Why not use all the information?

    DEFF Research Database (Denmark)

    Blunch, Niels Johan

    Re-examination of data from two comparative (cross-country) studies using conjoint analysis shows that significant improvement can be achieved by using two often neglected kinds of a priori information: Knowledge of the expected order of preferences for the various levels of one or more attributes...

  15. What Health-Related Information Flows through You Every Day? A Content Analysis of Microblog Messages on Air Pollution

    Science.gov (United States)

    Yang, Qinghua; Yang, Fan; Zhou, Chun

    2015-01-01

    Purpose: The purpose of this paper is to investigate how the information about haze, a term used in China to describe the air pollution problem, is portrayed on Chinese social media by different types of organizations using the theoretical framework of the health belief model (HBM). Design/methodology/approach: A content analysis was conducted…

  16. Exploring the Structure of Library and Information Science Web Space Based on Multivariate Analysis of Social Tags

    Science.gov (United States)

    Joo, Soohyung; Kipp, Margaret E. I.

    2015-01-01

    Introduction: This study examines the structure of Web space in the field of library and information science using multivariate analysis of social tags from the Website, Delicious.com. A few studies have examined mathematical modelling of tags, mainly examining tagging in terms of tripartite graphs, pattern tracing and descriptive statistics. This…
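
    The record does not specify which multivariate technique was applied to the Delicious.com tags; one common pattern, sketched below with hypothetical data, is to build a document-tag matrix and project the tags into a low-dimensional space with PCA.

```python
# Hypothetical document-tag matrix projected with PCA; all data illustrative.
import numpy as np
from sklearn.decomposition import PCA

tags = ["libraries", "metadata", "cataloging", "web2.0", "folksonomy"]
# Rows: bookmarked pages; columns: whether each tag was applied
doc_tag = np.array([
    [1, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
])
coords = PCA(n_components=2).fit_transform(doc_tag.T.astype(float))
for tag, (x, y) in zip(tags, coords):
    print(f"{tag}: ({x:+.2f}, {y:+.2f})")
```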

  17. Carbon Dioxide Information Analysis Center and World Data Center-A for Atmospheric Trace Gases. Annual progress report, FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center; Cushman, R.M.; Boden, T.A.; Jones, S.B.; Nelson, T.R.; Stoss, F.W. [Oak Ridge National Lab., TN (United States)

    1995-03-01

    This report summarizes the activities and accomplishments of the Carbon Dioxide Information Analysis Center and World Data Center-A for Atmospheric Trace Gases during fiscal year 1994. Topics discussed in this report include: organization and staff, user services, systems, communications, collaborative efforts with China, networking, ocean data, and activities of the World Data Center-A.

  18. Information for policy makers 2. Analysis of the EU's energy roadmap 2050 scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Foerster, Hannah; Healy, Sean; Loreck, Charlotte; Matthes, Felix [Oeko-Institut e.V. - Institut fuer Angewandte Oekologie, Freiburg im Breisgau (Germany); Fischedick, Manfred; Lechtenboehmer, Stefan; Samadi, Sascha; Venjakob, Johannes [Wuppertal Institut (Germany)

    2012-05-15

    With growing concerns about climate change, energy import dependency and increasing fuel costs, a political consensus has formed in Europe in recent years about the need to transform the way we supply and consume energy. However, there is less political consensus on the specific steps that need to be taken to achieve a future sustainable energy system. Questions about which technologies should be used to what extent, and how fast changes in the energy system should be instituted, are being discussed at the European Union as well as the Member State level. Energy scenarios are seen as a helpful tool to guide and inform these discussions. Several scenario studies on the European energy system have been released in recent years by stakeholders such as environmental NGOs and industry associations. A number of these studies have recently been analysed by the Oeko-Institut and the Wuppertal Institute within an ongoing project commissioned by the Smart Energy for Europe Platform (SEFEP). The project aims to advance the debate on the decarbonisation of the energy system in the EU and its Member States during the course of 2012 and to contribute to the scientific literature on this topic. Analysis within the project focuses on the development of the electricity system, as this system is today the main source of CO2 emissions and is widely regarded as the key to any future decarbonisation pathway. The paper at hand summarises the analyses carried out on scenarios developed within the recently released Energy Roadmap 2050 of the European Union. The Roadmap explores different energy-system pathways that are compatible with the EU's long-term climate targets. It is a highly influential publication and will play a significant role in determining what follows the EU's 2020 energy agenda. The Roadmap's analysis is currently being discussed by EU and Member State policymakers as well as by stakeholders throughout Europe

  19. A fuzzy logic decision support system for open source information analysis in a non-proliferation framework

    International Nuclear Information System (INIS)

    Full text: In recent decades, events on the international scene have forced a radical enhancement of the IAEA's mission: from verifying the non-diversion of safeguarded material to giving assurance about the absence of undeclared material, in compliance with States' safeguards undertakings. For that purpose, a new legal and technical tool was opened for signature: the Additional Protocol, which requires States to provide the IAEA with a wider range of information about materials, equipment, know-how, and sites directly or indirectly related to the nuclear fuel cycle. The declarations provided by the States are then compared with inspection results and other information sources, including open sources, in order to assess both their correctness and completeness. Hence, the information analysis process, and especially the analysis of open source information, has become a crucial aspect of the State Evaluation Process. A decision support system (DSS) is proposed as a tool that supports the analyst in acquiring relevant information from open sources and that synthesises the information acquired. Based on the most comprehensive set of proliferation indicators available, the IAEA's Physical Model (PM), the DSS consists of two components: an acquisition and analysis module, where information is drawn from open sources, and an aggregation and output module. In the acquisition and analysis module, a semantic search tool (topic tree) based on the Physical Model supports and narrows the search for information among open sources, typically the internet. Subsequently, according to what is stated in the documents found, experts assign values to the related Physical Model indicators. To adapt to the specific nature of open source information, which is imprecise, uncertain, and often lacking, indicators are assigned linguistic values, which are stored in the indicators' database. The aim of the synthesis and output module
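
    To illustrate the aggregation idea, the sketch below maps linguistic indicator values to triangular fuzzy numbers, combines them with a weighted fuzzy mean, and defuzzifies by centroid. The linguistic scale, weights, and indicator readings are assumptions for illustration; the actual Physical Model indicator scales are not given in this record.

```python
# Hedged sketch of fuzzy aggregation of linguistic indicator values.
import numpy as np

# Triangular fuzzy numbers (low, mode, high) for an assumed linguistic scale
SCALE = {
    "very weak":   (0.00, 0.00, 0.25),
    "weak":        (0.00, 0.25, 0.50),
    "moderate":    (0.25, 0.50, 0.75),
    "strong":      (0.50, 0.75, 1.00),
    "very strong": (0.75, 1.00, 1.00),
}

def weighted_fuzzy_mean(values, weights):
    tfns = np.array([SCALE[v] for v in values], dtype=float)
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return tuple(w @ tfns)          # component-wise weighted mean of (l, m, h)

def defuzzify(tfn):
    """Centroid of a triangular membership function."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

indicators = ["moderate", "strong", "weak"]   # hypothetical indicator readings
weights = [0.5, 0.3, 0.2]                     # hypothetical indicator weights
agg = weighted_fuzzy_mean(indicators, weights)
print(f"aggregated TFN = {agg}, crisp score = {defuzzify(agg):.2f}")
```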

  20. Guidelines for Using Building Information Modeling for Energy Analysis of Buildings

    Directory of Open Access Journals (Sweden)

    Thomas Reeves

    2015-12-01

    Full Text Available Building energy modeling (BEM), a subset of building information modeling (BIM), integrates energy analysis into the design, construction, and operation and maintenance of buildings. As various BEM tools are available, there is a need to evaluate their utility in different phases of the building lifecycle. The goal of this research was to develop guidelines for the evaluation and selection of BEM tools to be used in particular building lifecycle phases. The objectives of this research were to: (1) evaluate existing BEM tools; (2) illustrate the application of three BEM tools; (3) re-evaluate the three BEM tools; and (4) develop guidelines for the evaluation, selection, and application of BEM tools in the design, construction, and operation/maintenance phases of buildings. Twelve BEM tools were initially evaluated using four criteria: interoperability, usability, available inputs, and available outputs. Each of the top three BEM tools selected in this initial evaluation was used in a case study to simulate and evaluate energy usage, daylighting performance, and natural ventilation for two academic buildings (LEED-certified and non-LEED-certified). The results of the case study were used to re-evaluate the three BEM tools using the initial criteria with the addition of two new criteria (speed and accuracy), and to develop guidelines for evaluating and selecting BEM tools to analyze building energy performance. The major contribution of this research is the development of these guidelines, which can help potential BEM users identify the most appropriate BEM tool for particular building lifecycle phases.
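
    A minimal sketch of the selection step such guidelines support: candidate tools are scored against the six criteria named above and ranked by a phase-specific weighted sum. Tool names, scores, and weights are hypothetical.

```python
# Weighted-criteria ranking of BEM tools; all scores and weights hypothetical.
import numpy as np

criteria = ["interoperability", "usability", "inputs", "outputs", "speed", "accuracy"]
tools = {                      # scores on a 1-5 scale
    "Tool A": [4, 3, 4, 5, 3, 4],
    "Tool B": [3, 5, 3, 3, 5, 3],
    "Tool C": [5, 2, 5, 4, 2, 5],
}
# Example weighting for the design phase: favour usability and speed
weights = np.array([0.15, 0.25, 0.15, 0.15, 0.20, 0.10])

ranked = sorted(tools, key=lambda t: float(np.dot(tools[t], weights)), reverse=True)
for name in ranked:
    print(f"{name}: {np.dot(tools[name], weights):.2f}")
```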