WorldWideScience

Sample records for analysis geographic information

  1. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
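
    The core quantity in canonical information analysis is the mutual information between two projected variables. A minimal sketch of estimating MI for a pair of 1-D samples, using a plain histogram estimator in place of the fast kernel density estimator the abstract describes (function names, bin count and the toy data are illustrative only):

```python
import math
from collections import Counter

def discretize(vals, bins):
    """Assign each value to one of `bins` equal-width histogram bins."""
    lo, hi = min(vals), max(vals)
    width = (hi - lo) or 1.0
    return [min(int((v - lo) / width * bins), bins - 1) for v in vals]

def mutual_information(xs, ys, bins=4):
    """Histogram estimate of mutual information (in nats) between two samples.
    The paper uses a fast kernel density estimator for entropy estimation;
    a plain histogram keeps this sketch self-contained."""
    n = len(xs)
    bx, by = discretize(xs, bins), discretize(ys, bins)
    pxy, px, py = Counter(zip(bx, by)), Counter(bx), Counter(by)
    return sum((c / n) * math.log(c * n / (px[i] * py[j]))
               for (i, j), c in pxy.items())

xs = [0.1 * i for i in range(40)]
# identical signals: MI is maximal for this binning (log 4 ≈ 1.386 nats)
print(round(mutual_information(xs, xs), 3))
# an independent-looking pair: MI is 0.0
print(round(mutual_information(xs, [(-1) ** i for i in range(40)]), 3))
```

    In CIA this estimate would be maximized over the coefficients of the linear combinations rather than computed once.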

  2. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...

  3. Inform: Efficient Information-Theoretic Analysis of Collective Behaviors

    Directory of Open Access Journals (Sweden)

    Douglas G. Moore

    2018-06-01

    The study of collective behavior has traditionally relied on a variety of different methodological tools ranging from more theoretical methods such as population or game-theoretic models to empirical ones like Monte Carlo or multi-agent simulations. An approach that is increasingly being explored is the use of information theory as a methodological framework to study the flow of information and the statistical properties of collectives of interacting agents. While a few general purpose toolkits exist, most of the existing software for information theoretic analysis of collective systems is limited in scope. We introduce Inform, an open-source framework for efficient information theoretic analysis that exploits the computational power of a C library while simplifying its use through a variety of wrappers for common higher-level scripting languages. We focus on two such wrappers here: PyInform (Python) and rinform (R). Inform and its wrappers are cross-platform and general-purpose. They include classical information-theoretic measures, measures of information dynamics and information-based methods to study the statistical behavior of collective systems, and expose a lower-level API that allows users to construct measures of their own. We describe the architecture of the Inform framework, study its computational efficiency and use it to analyze three different case studies of collective behavior: biochemical information storage in regenerating planaria, nest-site selection in the ant Temnothorax rugatulus, and collective decision making in multi-agent simulations.
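
    Measures of information dynamics such as transfer entropy are among those Inform exposes through its wrappers. As a plain-Python illustration of what such a measure computes (this is not the library's own optimized C implementation, and the test series is invented for the sketch):

```python
import math
from collections import Counter

def transfer_entropy(src, dst, k=1):
    """Transfer entropy (in nats) from src to dst with history length k,
    estimated from joint frequency counts. A plain-Python illustration of
    the kind of measure Inform provides; the library's C implementation
    is what one would use in practice."""
    n = len(dst)
    joint = Counter()      # (next, history, source) triples
    next_hist = Counter()  # (next, history) pairs
    hist_src = Counter()   # (history, source) pairs
    hist = Counter()       # history alone
    for t in range(k, n):
        h = tuple(dst[t - k:t])
        joint[(dst[t], h, src[t - 1])] += 1
        next_hist[(dst[t], h)] += 1
        hist_src[(h, src[t - 1])] += 1
        hist[h] += 1
    m = n - k
    return sum((c / m) * math.log(c * hist[h] / (next_hist[(x, h)] * hist_src[(h, y)]))
               for (x, h, y), c in joint.items())

# dst copies src with a one-step lag, so src fully determines dst's next state
src = [0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0]
dst = [0] + src[:-1]
print(round(transfer_entropy(src, dst), 3))  # 0.653
```

    With a fully predictive source, the transfer entropy equals the conditional entropy of the target's next state given its own past, which the counts above recover.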

  4. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  5. INFORMATION SYSTEM OF THE FINANCIAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    MIRELA MONEA

    2013-12-01

    Financial analysis provides the information necessary for decision making and helps both external and internal users of that information. The results of financial analysis work depend on the quality, accuracy, relevance and effectiveness of the information collected and processed. Essential sources of information for financial analysis are the financial statements, which are considered the raw material of financial analysis. One of the financial statements - the balance sheet - provides information about assets, liabilities, equity, liquidity, solvency, risk and financial flexibility. The profit and loss account is a synthesis accounting document, part of the financial statements, reporting enterprise financial performance during a specified accounting period; it summarizes all revenues earned and expenses incurred in the period and reports the result.

  6. Information needs analysis principles and practice in information organizations

    CERN Document Server

    Dorner, Daniel G; Calvert, Philip J

    2010-01-01

    If you want to provide an information service that truly fulfils your users' needs, this book is essential reading. The book supports practitioners in developing an information needs analysis strategy and offers the necessary professional skills and techniques to do so.

  7. Communication Analysis of Information Complexes.

    Science.gov (United States)

    Malik, M. F.

    Communication analysis is a tool for perceptual assessment of existing or projected information complexes, i.e., an established reality perceived by one or many humans. An information complex could be of a physical nature, such as a building, landscape, city street; or of a pure informational nature, such as a film, television program,…

  8. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit involving the integration of the accounting field into the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a high-importance private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative variables with the matrix intersection. By means of a case study, in which data were collected via interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work in favor of the attainment of the objectives of the organization.

  9. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies....

  10. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely, while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
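
    The dynamic part of such a hybrid analysis can be pictured as shadow state attached to every value, with non-pointer arrays collapsed to a single summary taint set. A minimal Python sketch of that idea (the class names and policy are invented for illustration; the paper's analysis targets C and is far more precise):

```python
class T:
    """A runtime value paired with a set of taint labels."""
    def __init__(self, value, taint=frozenset()):
        self.value = value
        self.taint = frozenset(taint)
    def __add__(self, other):
        # information in either operand flows into the result
        return T(self.value + other.value, self.taint | other.taint)

class TaintArray:
    """An array of non-pointer values summarized by a single taint set,
    mirroring the efficient summary treatment the abstract describes."""
    def __init__(self, n):
        self.data = [0] * n
        self.summary = frozenset()
    def store(self, index, val):
        # both the stored value's taint and the index's taint enter the summary
        self.data[index.value] = val.value
        self.summary |= val.taint | index.taint
    def load(self, index):
        return T(self.data[index.value], self.summary | index.taint)

secret = T(42, {"confidential"})
public = T(7)
arr = TaintArray(4)
arr.store(T(0), secret + public)               # a tainted value flows into the array
print("confidential" in arr.load(T(0)).taint)  # True: the flow is detected
print("confidential" in arr.load(T(1)).taint)  # True: the summary over-approximates
```

    The second load shows the cost of summarization: every element of a summarized array inherits the taint, a sound over-approximation that trades precision for efficiency.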

  11. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science, and this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  12. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  13. Analysis of the brazilian scientific production about information flows

    Directory of Open Access Journals (Sweden)

    Danielly Oliveira Inomata

    2015-07-01

    Objective. This paper presents and discusses the concepts, contexts and applications involving information flows in organizations. Method. Systematic review, followed by a bibliometric analysis and a systemic analysis. The systematic review aimed to search for, evaluate and review evidence about the research topic, and comprised the following steps: (1) definition of keywords, (2) systematic review, (3) exploration and analysis of articles and (4) comparison and consolidation of results. Results. The bibliometric analysis assessed the relevance of the articles, identifying the authors, dates of publication, citation indices and the keywords with the highest occurrence. Conclusions. The survey results confirm the emphasis on information in the knowledge management process; over the years, the emphasis has shifted to networks, i.e., studies are turning to the operationalization and analysis of information flows in networks. The literature demonstrates the relationship of information flow with its management, applied to different organizational contexts, and points to new trends in information science such as the study and analysis of information flows in networks.

  14. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed. An ES-1022 computer and a problem-oriented software ''The description information search system'' were used for the purpose. Main aspects and sources of forming the system information fund, characteristics of the information retrieval language of the system are reported and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search [ru

  15. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised even for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  16. Analysis and design of nuclear energy information systems

    International Nuclear Information System (INIS)

    Yohanes Dwi Anggoro; Sriyana; Arief Tris Yuliyanto; Wiku Lulus Widodo

    2015-01-01

    Management of the research reports and activities of the Center for Nuclear Energy System Assessment (PKSEN), whether in the form of documents or the results of other activities, is an important part of the series of activities supporting PKSEN's mission. Good document management facilitates the provision of improved inputs and the maximum use of results. Over the past few years, however, there have still been some problems in the management of research reports and activities performed by PKSEN. The purpose of this study is to analyze and design the flow and layout of the Nuclear Energy Information System (SIEN) to facilitate its implementation. In addition to serving as a management system for PKSEN research and activities, it can also be used as an information medium for the community. The Nuclear Energy Information System package is expected to be a ''one gate'' system for PKSEN information. The research methodology used is: (i) analysis of organizational systems; (ii) analysis and design of information systems; (iii) analysis and design of software systems; (iv) analysis and design of database systems. The results of this study are: the resources and activities throughout the PKSEN organization were identified; the application of SIEN was analyzed using a SWOT analysis; several types of required devices were identified; the hierarchy of SIEN was compiled; and a centralized database system was chosen, with MySQL selected as the DBMS. The result is a basic design of the Nuclear Energy Information System, which will be used as a research and activity management system for PKSEN and can also serve as a medium of information for the community. (author)

  17. Benefit analysis of proposed information systems

    OpenAIRE

    Besore, Mark H.

    1991-01-01

    Approved for public release; distribution is unlimited This thesis reviewed two different approaches to benefit analysis, benefit comparison and user satisfaction, that could be applied to the evaluation of proposed information systems which are under consideration for acquisition by the federal government. Currently the General Services Administration only recommends that present value analysis methods be used in the analysis of alternatives even though the GSA specifies...

  18. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show....

  19. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    Directory of Open Access Journals (Sweden)

    Rustu Yilmaz

    2016-05-01

    Universities are leading institutions: they are sources of an educated human population, they both produce information and develop new products and services by using information effectively, and they are needed in every area. Therefore, universities are expected to be institutions where information and information management are used efficiently. In the present study, topics such as infrastructure, operation, application, information, policy and human-based information security at universities were examined within the scope of the information security standards which are highly required and intended to be available at each university today, and a comparative analysis was then conducted specific to Turkey. The Microsoft Security Assessment Tool, developed by Microsoft, was used as the risk analysis tool. The analyses aim to enable universities to compare their information systems with those of other universities within the scope of information security awareness, and to make suggestions in this regard.

  20. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than those of betweenness when a large amount of noisy data is added to an interactome. Finally, we...
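
    The circuit analogy reduces to solving a Laplacian linear system: edge conductances are interaction confidences, unit current is injected at a source and extracted at a grounded sink, and a protein's flow score is the current passing through it. A minimal sketch on a toy four-protein network (the network and confidence values are invented; the published method aggregates over many source/sink pairs):

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * d for a, d in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

edges = {  # (protein, protein): interaction confidence, used as conductance
    (0, 1): 1.0, (0, 2): 0.5, (1, 3): 1.0, (2, 3): 0.5, (1, 2): 0.25,
}
source, sink, n = 0, 3, 4

# reduced Laplacian over non-sink nodes (the sink is grounded at 0 V)
nodes = [i for i in range(n) if i != sink]
idx = {u: i for i, u in enumerate(nodes)}
L = [[0.0] * len(nodes) for _ in nodes]
for (a, b), g in edges.items():
    for u, w in ((a, b), (b, a)):
        if u != sink:
            L[idx[u]][idx[u]] += g
            if w != sink:
                L[idx[u]][idx[w]] -= g
rhs = [0.0] * len(nodes)
rhs[idx[source]] = 1.0  # inject one unit of current at the source

volt = solve(L, rhs)
v = {u: volt[idx[u]] for u in nodes}
v[sink] = 0.0

def score(node):
    """Information flow score: total current entering the node."""
    return sum(max(0.0, (v[u] - v[w]) * g)
               for (a, b), g in edges.items()
               for u, w in ((a, b), (b, a)) if w == node)

print(round(score(1), 3), round(score(2), 3))  # 0.667 0.333
```

    Protein 1 carries twice the current of protein 2 because its interactions have twice the confidence, which is exactly how confidence scores, and not just topology, shape the centrality measure.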

  1. Cancer Patients' Informational Needs: Qualitative Content Analysis.

    Science.gov (United States)

    Heidari, Haydeh; Mardani-Hamooleh, Marjan

    2016-12-01

    Understanding the informational needs of cancer patients is a requirement to plan any educative care program for them. The aim of this study was to identify Iranian cancer patients' perceptions of informational needs. The study took a qualitative approach. Semi-structured interviews were held with 25 cancer patients in two teaching hospitals in Iran. Transcripts of the interviews underwent conventional content analysis, and categories were extracted. The results came under two main categories: disease-related informational needs and information needs related to daily life. Disease-related informational needs had two subcategories: obtaining information about the nature of disease and obtaining information about disease prognosis. Information needs related to daily life also had two subcategories: obtaining information about healthy lifestyle and obtaining information about regular activities of daily life. The findings provide deep understanding of cancer patients' informational needs in Iran.

  2. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm's vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker's attack and information sharing are strategic complements while one hacker's attack and the other hacker's information sharing are strategic substitutes. As the firm's unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm's defense and profit decrease, and the hackers' information sharing and profit increase. The firm's profit increases in the hackers' unit cost of attack, while the hackers' information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm's investment in cyber security defense and profit are constant, the hackers' investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm's investment in cyber security defense and profit remain constant, the hackers' investments in attacks increase, and the hackers' information sharing levels and profits decrease. Increasing the firm's asset causes all the variables to increase linearly, except information sharing which is constant. We extend...
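
    The asset allocation by contest success can be illustrated with a standard ratio-form contest success function; the functional form and numbers below are a generic sketch from the contest literature, not the paper's exact model:

```python
def shares(defense, attacks, shared, eff):
    """Ratio-form contest: each player's share of the firm's asset is its
    effective effort over total effort. Hacker i's attack is boosted by
    information the *other* hacker shares (eff = effectiveness of sharing).
    This is a generic sketch, not the paper's exact functional form."""
    boosted = [attacks[0] + eff * shared[1], attacks[1] + eff * shared[0]]
    total = defense + sum(boosted)
    return defense / total, [b / total for b in boosted]

# baseline: no information sharing between the hackers
firm0, h0 = shares(defense=4.0, attacks=[1.0, 1.0], shared=[0.0, 0.0], eff=0.5)
# hacker 2 shares information, which boosts hacker 1's effective attack
firm1, h1 = shares(defense=4.0, attacks=[1.0, 1.0], shared=[0.0, 1.0], eff=0.5)
print(h1[0] > h0[0], h1[1] < h0[1])  # True True
```

    Even this toy form reproduces the abstract's tension: the receiving hacker's share rises while the sharing hacker's share falls, so each prefers to receive information rather than deliver it.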

  3. Exploring health information technology education: an analysis of the research.

    Science.gov (United States)

    Virgona, Thomas

    2012-01-01

    This article analyzes the published research on Health Information Technology education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper identifies intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing concepts and domains of health information systems, technology and management to contemporary issues such as education, outsourcing, web services and security. The research findings have implications for educators, researchers and journals.

  4. A Concept Analysis of Fully Informed: Breastfeeding Promotion

    Science.gov (United States)

    2005-12-21

    In its updated breastfeeding policy statement, the American Academy of Pediatrics (AAP, 2005) identified the compelling advantages of breastfeeding and urged healthcare professionals to implement principles to promote breastfeeding. The AAP cited obstacles to the initiation and continuation of breastfeeding...

  5. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. Where CCA is ideal for Gaussian data, CIA facilitates...

  6. The informational subject in the contemporary context. An analysis from the epistemology of informational community identity

    Directory of Open Access Journals (Sweden)

    Miguel Angel Rendón-Rojas

    2012-04-01

    The Epistemology of Informational Community Identity (ECI-I) is proposed as a toolbox for the analysis of informational reality, with categories such as contextual paradigm, informational subject and informational entity built expressly for this theoretical-methodological analysis. The concepts of information user and informational subject are distinguished: the latter, seeking answers from a concrete social enclave within a particular community and its interrelationships with others, undergoes a process of self-construction from which specific information needs arise; the user seeks concrete answers after formally questioning the many facts occurring in a consumerist, unequal and alienating world. The emphasis is thus put on the need for an interdisciplinary approach between social theory and library science in the study of the documentary information world of particular informational subjects, who are often marginalized and excluded.

  7. Value of information analysis for corrective action unit No. 98: Frenchman Flat

    International Nuclear Information System (INIS)

    1997-06-01

    A value of information analysis has been completed as part of the corrective action process for Frenchman Flat, the first Nevada Test Site underground test area to be scheduled for the corrective action process. A value of information analysis is a cost-benefit analysis applied to the acquisition of new information which is needed to reduce the uncertainty in the prediction of a contaminant boundary surrounding underground nuclear tests in Frenchman Flat. The boundary location will be established to protect human health and the environment from the consequences of using contaminated groundwater on the Nevada Test Site. Uncertainties in the boundary predictions are assumed to be the result of data gaps. The value of information analysis in this document compares the cost of acquiring new information with the benefit of acquiring that information during the corrective action investigation at Frenchman Flat. Methodologies incorporated into the value of information analysis include previous geological modeling, groundwater flow modeling, contaminant transport modeling, statistics, sensitivity analysis, uncertainty analysis, and decision analysis
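
    The cost-benefit logic of a value of information analysis can be sketched as a classic preposterior calculation: compare the expected cost of the best decision made under the prior with the expected cost when the uncertainty is resolved before deciding. The probabilities and costs below are invented for illustration and are not from the report:

```python
# Expected value of (perfect) information for a stylized boundary decision.
# Numbers are illustrative only, not taken from the Frenchman Flat analysis.
p_large = 0.3                        # prior probability the contaminant boundary is large
cost = {"wide": 9.0, "narrow": 4.0}  # cost of establishing each boundary (arbitrary units)
penalty = 20.0                       # extra cost if a narrow boundary proves too small

def expected_cost(action, p):
    """Expected cost of an action given probability p of a large boundary."""
    extra = penalty * p if action == "narrow" else 0.0
    return cost[action] + extra

# Without new information: pick the action with the lowest expected cost under the prior.
prior_best = min(cost, key=lambda a: expected_cost(a, p_large))
cost_without = expected_cost(prior_best, p_large)

# With perfect information: the right action is chosen in each state of nature.
cost_with_info = p_large * cost["wide"] + (1 - p_large) * cost["narrow"]

voi = cost_without - cost_with_info
print(prior_best, round(voi, 2))  # wide 3.5
```

    In this sketch, new data is worth acquiring only if it costs less than 3.5 units; the report's actual analysis replaces these scalars with geological, flow and transport modeling plus sensitivity and decision analysis.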

  8. Information Retrieval Using Hadoop Big Data Analysis

    Science.gov (United States)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the process of probing huge amounts of information in an attempt to uncover unseen patterns. Through big data analytics applications, organizations in both the public and private sectors have made a strategic determination to turn big data into competitive advantage. The primary occupation of extracting value from big data gives rise to a process for pulling information from multiple different sources; this process is known as extract, transform and load (ETL). The paper's approach extracts information from log files and research papers, reducing the effort required for pattern finding and document summarization across several sources. The work helps to better understand basic Hadoop concepts and improve the user experience for research. In this paper, we propose a Hadoop-based approach for analyzing log files to find concise, useful information while saving time. Our proposed approach will be applied to research papers in a specific domain to produce summarized content for further improvement.
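
    A Hadoop log-analysis job follows the MapReduce pattern: mappers emit (key, value) pairs per log line, the shuffle groups values by key, and reducers aggregate each group. The pattern can be prototyped in plain Python before scaling it out on a cluster (the log format and level-count job here are invented for the sketch):

```python
from collections import defaultdict

# Toy log lines standing in for the log files a Hadoop job would scan.
logs = [
    "2016-01-02 ERROR disk full",
    "2016-01-02 INFO job started",
    "2016-01-03 ERROR disk full",
    "2016-01-03 ERROR timeout",
]

def mapper(line):
    """Emit one (log-level, 1) pair per log line."""
    _date, level, *_rest = line.split()
    yield (level, 1)

def reducer(key, values):
    """Aggregate the counts collected for one key."""
    return key, sum(values)

# shuffle phase: group mapper output by key
groups = defaultdict(list)
for line in logs:
    for k, v in mapper(line):
        groups[k].append(v)

summary = dict(reducer(k, vs) for k, vs in groups.items())
print(summary)  # {'ERROR': 3, 'INFO': 1}
```

    On Hadoop, the same mapper/reducer logic would run distributed over HDFS blocks; the local prototype is useful for validating the summarization logic first.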

  9. Informational analysis for compressive sampling in radar imaging.

    Science.gov (United States)

    Zhang, Jingxiong; Yang, Ke

    2015-03-24

    Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, and operates with optimization-based algorithms for signal reconstruction. It is thus able to complete data compression while acquiring data, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic oriented CS-radar system analysis and performance evaluation.
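
    The dependence of the required measurement count on scene sparsity can be illustrated with the classic CS rule of thumb m ≳ C · k · log(n/k) for a k-sparse length-n scene; the constant C and the scene sizes below are assumptions for illustration, not values from the paper, whose information-theoretic bounds also account for distortion and SNR:

```python
import math

def cs_measurements(n, k, C=2.0):
    """Rule-of-thumb number of random measurements to recover a k-sparse
    length-n signal: m >= C * k * log(n / k). C = 2 is an assumed constant."""
    return math.ceil(C * k * math.log(n / k))

n = 4096                # scene size (number of resolution cells)
for k in (8, 32, 128):  # scene sparsity: number of significant scatterers
    m = cs_measurements(n, k)
    print(k, m, f"{m / n:.1%}")  # measurements needed, and as a fraction of Nyquist
```

    Sparser scenes need dramatically fewer measurements than the n samples Nyquist-style acquisition would require, which is the efficiency the abstract attributes to sub-Nyquist sampling.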

  10. Regional Analysis of Remote Sensing Based Evapotranspiration Information

    Science.gov (United States)

    Geli, H. M. E.; Hain, C.; Anderson, M. C.; Senay, G. B.

    2017-12-01

    Recent research on modeling actual evapotranspiration (ET) using remote sensing data and methods has proven the ability of these methods to address a wide range of hydrological and water resources issues, including river basin water balance for improved water resources management, drought monitoring, drought impacts and socioeconomic responses, agricultural water management, optimization of land use for water conservation, and water allocation agreements, among others. However, there is still a critical need to identify the appropriate type of ET information for each of these issues. The current trend of increasing water demand due to population growth, coupled with variable and limited water supply due to drought, especially in arid and semiarid regions, has highlighted the need for such information. Addressing these issues properly requires ET information at different spatial and temporal resolutions. For example, agricultural water management applications require ET information at field (30-m) and daily time scales, while relatively coarser spatial and temporal scales can be adequate for regional applications such as river basin hydrologic analysis. The objective of this analysis is to evaluate the potential of using integrated ET information to address some of these issues collectively. The analysis highlights efforts to address issues applicable to New Mexico, including assessment of the statewide water budget as well as drought impacts and socioeconomic responses, all of which require ET information but at different spatial and temporal scales. It provides an evaluation of four remote sensing based ET models: ALEXI, DisALEXI, SSEBop, and SEBAL3.0. The models will be compared with ground-based observations from eddy covariance towers and with water balance calculations. Remote sensing data from the Landsat, MODIS, and VIIRS sensors will be used to provide ET estimates.

  11. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security, abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis covers threats, their impacts, and the recovery of failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of system reliability based on the strength of malicious acts and the performance of the recovery processes. A unique model, called the attack-obstacle model, is also proposed for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to the different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse for most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on the issue; it can be useful for identifying models, their assumptions, and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for classifying processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing the reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
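
The steady-state quantities such stochastic models produce can be sketched with a two-state (up/down) Markov chain, a deliberate simplification of the article's attack-obstacle framework; the function names and failure/recovery figures below are hypothetical:

```python
def steady_state_availability(mttf_hours, mttr_hours):
    """Long-run fraction of time a system is up, assuming exponentially
    distributed failure (attack) and recovery times -- a two-state Markov chain."""
    return mttf_hours / (mttf_hours + mttr_hours)

def series_availability(components):
    """A service behind several components is up only when all of them are."""
    a = 1.0
    for mttf, mttr in components:
        a *= steady_state_availability(mttf, mttr)
    return a

# hypothetical figures: attacks take the server down every 500 h on average and
# recovery takes 2 h; the firewall fails every 2000 h with a 1 h recovery
print(round(series_availability([(500, 2), (2000, 1)]), 5))  # 0.99552
```

The same ratio structure (failure intensity vs. recovery performance) underlies the queue-based recovery models the tutorial surveys.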

  12. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    Full Text Available We generalize the point information gain (PIG) and derived quantities, i.e., the point information gain entropy (PIE) and the point information gain entropy density (PIED), to the case of the Rényi entropy, and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of the PIE/PIED spectra for real data with the example of several images, and discuss further possible uses in other fields of data processing.
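
A minimal sketch of the generalized quantity, under the common definition that the PIG of a point is the difference between the Rényi entropy of the distribution with and without that single occurrence (the paper's exact convention may differ):

```python
import numpy as np

def renyi_entropy(counts, alpha):
    """Rényi entropy (bits) of the discrete distribution given by `counts`."""
    p = counts / counts.sum()
    p = p[p > 0]
    if np.isclose(alpha, 1.0):                      # Shannon limit
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

def point_information_gain(counts, bin_index, alpha):
    """PIG of one occurrence in `bin_index`: entropy with the point present
    minus entropy with that single occurrence removed."""
    without = counts.copy()
    without[bin_index] -= 1
    return renyi_entropy(counts, alpha) - renyi_entropy(without, alpha)

hist = np.array([50.0, 30.0, 15.0, 5.0])    # e.g. a 4-level image histogram
print(point_information_gain(hist, 0, alpha=2.0))  # frequent value: small |PIG|
print(point_information_gain(hist, 3, alpha=2.0))  # rare value: larger |PIG|
```

Rare points perturb the entropy more than common ones, which is what makes per-point PIG maps useful for highlighting atypical pixels in images.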

  13. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open-source information; and information and results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness and in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring unsafeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, enhances deterrence under a safeguards system that is fully information driven, and increases confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  14. Sensitivity Analysis for Urban Drainage Modeling Using Mutual Information

    Directory of Open Access Journals (Sweden)

    Chuanqi Li

    2014-11-01

    Full Text Available The intention of this paper is to evaluate the sensitivity of the Storm Water Management Model (SWMM) output to its input parameters. A global parameter sensitivity analysis is conducted in order to determine which parameters most affect the model simulation results. Two different methods of sensitivity analysis are applied in this study. The first is the partial rank correlation coefficient (PRCC), which measures nonlinear but monotonic relationships between model inputs and outputs. The second is based on mutual information, which provides a general measure of the strength of non-monotonic association between two variables. Both methods are based on Latin hypercube sampling (LHS) of the parameter space, so the same datasets can be used to obtain both measures of sensitivity. The utility of the PRCC and mutual information analysis methods is illustrated by analyzing a complex SWMM model. The sensitivity analysis revealed that only a few key input variables contribute significantly to the model outputs; PRCCs and mutual information are calculated and used to determine and rank the importance of these key parameters. This study shows that the partial rank correlation coefficient and mutual information analysis can be considered effective methods for assessing the sensitivity of the SWMM model to the uncertainty in its input parameters.
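
The LHS-plus-PRCC workflow can be sketched on a toy function standing in for SWMM. This implements a basic Latin hypercube design and PRCC (rank-transform, regress the other parameters out, correlate the residuals); the three-parameter 'model' is illustrative only:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng):
    """Stratified uniform samples in [0, 1]^d (a basic Latin hypercube design)."""
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

def prcc(X, y):
    """Partial rank correlation coefficient of each parameter with the output:
    rank-transform, regress the other parameters out of both sides,
    then correlate the residuals."""
    rx = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    coeffs = []
    for j in range(X.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(rx, j, axis=1)])
        res_j = rx[:, j] - others @ np.linalg.lstsq(others, rx[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        coeffs.append(float(np.corrcoef(res_j, res_y)[0, 1]))
    return coeffs

rng = np.random.default_rng(1)
X = latin_hypercube(200, 3, rng)                 # three hypothetical model parameters
y = 5 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.standard_normal(200)  # toy 'model'
sens = prcc(X, y)
print([round(c, 2) for c in sens])  # parameter 0 dominates; parameter 2 is inert
```

A mutual-information estimate on the same LHS samples would additionally pick up non-monotonic dependencies that PRCC, being rank-based, understates.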

  15. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years, the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of masses of information and to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end-user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  16. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicators fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
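
Two of the simplest technical-analysis measures of the kind such category-based systems feed to their networks; the indicator choice and prices are illustrative, not the paper's actual feature set:

```python
def sma(prices, window):
    """Simple moving average -- a basic trend-category indicator."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def momentum(prices, lag):
    """Price change over `lag` periods -- a basic momentum-category indicator."""
    return [prices[i] - prices[i - lag] for i in range(lag, len(prices))]

closes = [10.0, 10.5, 10.2, 10.8, 11.0, 10.9, 11.3]   # hypothetical closing prices
print(sma(closes, 3))       # smooths the series
print(momentum(closes, 1))  # period-over-period change
```

In the fusion architecture described above, each indicator category would form the input vector for one member of the NN ensemble.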

  17. Thinking about information work of nuclear science and technology in the age of big data: speaking of the information analysis and research

    International Nuclear Information System (INIS)

    Chen Tieyong

    2014-01-01

    Human society is entering a new era in which structured and unstructured data are measured in petabytes (1 PB = 1024 TB). In the network era, with the development of mobile communications and electronic commerce and the emergence and growth of social networks, an era of large-scale data production, sharing and application is opening. How to explore the value of data, master big data and extract useful information is an important task for our science and technology information workers. This paper analyzes the development of nuclear science and technology information work in terms of big data acquisition, analysis and application. Information analysis and research will increasingly be based on all available data rather than on random sampling; letting the data 'speak' becomes possible, and many results of information analysis and research can be expressed quantitatively. We should attach great importance to data collection and to careful analysis of big data. The process of nuclear science and technology information analysis and research involves both professional division of labor and cooperation. In addition, we should strengthen the construction of nuclear science and technology information resources and improve the information supply; strengthen the analysis and research of nuclear science and technology information and improve information services; strengthen the information management of nuclear science and technology, paying attention to security and intellectual property issues in information sharing; and strengthen personnel training to continuously improve the efficiency and performance of nuclear science and technology information work.
    In the age of big data, nuclear science and technology information workers should take information analysis and research as the core, grasp information collection with one hand and information service with the other, forge ahead with innovation, and continuously improve their capability in nuclear science and technology information work.

  18. Cluster Analysis of International Information and Social Development.

    Science.gov (United States)

    Lau, Jesus

    1990-01-01

    Analyzes information activities in relation to socioeconomic characteristics in low, middle, and highly developed economies for the years 1960 and 1977 through the use of cluster analysis. Results of data from 31 countries suggest that information development is achieved mainly by countries that have also achieved social development. (26…

  19. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability are determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing attack realization models for the exploitation of particular vulnerabilities. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated. The proposed approach allows a quantitative estimate of information system security to be obtained on the basis of the proposed schemes for the realization of typical attacks against the distinguished classes of vulnerabilities. The methodological approach can be used for choosing among variants for implementing protection mechanisms in information systems, as well as for estimating information security in operating information systems.

  20. Risk Analysis of Accounting Information System Infrastructure

    OpenAIRE

    MIHALACHE, Arsenie-Samoil

    2011-01-01

    National economy and security are fully dependent on information technology and infrastructure. At the core of the information infrastructure society relies on, we have the Internet, a system designed initially as a scientists’ forum for unclassified research. The use of communication networks and systems may lead to hazardous situations that generate undesirable effects such as communication systems breakdown, loss of data or taking the wrong decisions. The paper studies the risk analysis of...

  1. Ground subsidence information as a valuable layer in GIS analysis

    Directory of Open Access Journals (Sweden)

    Murdzek Radosław

    2018-01-01

    Full Text Available Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to simultaneously integrate spatial data resources, analyse them, process them and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. In these applications many kinds of spatial information are utilized, but rarely information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB mining area into a GIS analysis. Monitoring of this phenomenon can be carried out using the radar differential interferometry (DInSAR) method.

  2. Ground subsidence information as a valuable layer in GIS analysis

    Science.gov (United States)

    Murdzek, Radosław; Malik, Hubert; Leśniak, Andrzej

    2018-04-01

    Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to simultaneously integrate spatial data resources, analyse them, process them and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. In these applications many kinds of spatial information are utilized, but rarely information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB mining area into a GIS analysis. Monitoring of this phenomenon can be carried out using the radar differential interferometry (DInSAR) method.
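
The DInSAR measurement behind such a subsidence layer reduces, per pixel, to converting a differential phase into a line-of-sight displacement. A minimal sketch; the wavelength value and the sign convention are assumptions, as conventions vary between processors:

```python
import math

def los_displacement_m(delta_phase_rad, wavelength_m=0.056):
    """Line-of-sight displacement implied by a differential interferometric phase.
    One full fringe (2*pi) corresponds to half a wavelength of motion along the
    line of sight; 0.056 m is roughly C-band (e.g. Sentinel-1). The sign
    convention here is an assumption."""
    return delta_phase_rad * wavelength_m / (4 * math.pi)

# one fringe in the differential interferogram -> 2.8 cm of line-of-sight motion
print(round(los_displacement_m(2 * math.pi) * 100, 1), "cm")
```

The resulting displacement raster is what would be loaded into the GIS as the subsidence-hazard layer described above.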

  3. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  4. Cost-volume-profit and net present value analysis of health information systems.

    Science.gov (United States)

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
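
Both methods reduce to short formulas. A minimal sketch with hypothetical figures for a health information system investment:

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time 0 (usually the outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def breakeven_volume(fixed_cost, price, variable_cost):
    """Cost-volume-profit break-even: volume at which contribution covers fixed cost."""
    return fixed_cost / (price - variable_cost)

# hypothetical system: $100k up front, $30k net benefit per year for 5 years, 8% rate
print(round(npv(0.08, [-100_000] + [30_000] * 5), 2))   # positive NPV -> invest
# hypothetical CVP view: $100k fixed cost, $50 benefit and $20 cost per transaction
print(round(breakeven_volume(100_000, 50, 20)))          # transactions to break even
```

Used together, as the article recommends, NPV answers "is the investment worthwhile at our discount rate?" while CVP answers "how much use does the system need to see?".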

  5. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available Complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, use of the absolute spectrum alone in the informational spectrum analysis of protein sequences is insufficient. CISAPS is therefore developed to provide results in three forms: the absolute, real, and imaginary spectra. Biologically related features, as shown in the analysis of influenza A subtypes presented as a case study here, can appear individually in either the real or the imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted by the CISAPS web server, and these associations are likely related to the property that the specific amino acid index represents. In addition, various technical issues that may affect the analysis, such as zero-padding and windowing, are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, each representing a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply complex informational spectrum analysis in their work.
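
The core computation is a DFT of the sequence mapped through an amino-acid index. A minimal sketch of the three spectrum forms, using made-up index values rather than any of the 611 real indices:

```python
import numpy as np

def informational_spectrum(sequence, index):
    """Map a protein sequence to a numeric signal through an amino-acid index,
    remove the mean, and take the DFT; return the absolute, real and imaginary
    spectra (the three forms CISAPS reports) for the positive frequencies."""
    signal = np.array([index[aa] for aa in sequence], dtype=float)
    spectrum = np.fft.fft(signal - signal.mean())
    half = len(sequence) // 2
    return np.abs(spectrum[1:half]), spectrum[1:half].real, spectrum[1:half].imag

# illustrative index values only -- NOT one of the 611 real amino-acid indices
toy_index = {"A": 0.37, "C": 0.83, "G": 0.00, "L": 0.95, "K": 0.37, "S": 0.51}
absolute, real, imag = informational_spectrum("ACGLKSACGLKS", toy_index)
peak = int(np.argmax(absolute)) + 1   # dominant frequency component (1-based)
print(peak)  # even: the period-6 sequence puts all its energy on even harmonics
```

Because the toy sequence repeats with period 6 within its length of 12, its spectral energy sits only on even harmonics; comparing the real and imaginary parts at such peaks is the refinement CISAPS adds over the absolute spectrum.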

  6. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    OpenAIRE

    Yun Zhang; Lei Wang; Yanqing Duan

    2016-01-01

    Over the last three decades, China’s agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agricultural information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different informatio...

  7. Development of efficient system for collection-analysis-application of information using system for technology and information in field of RI-biomics

    International Nuclear Information System (INIS)

    Jang, Sol Ah; Kim, Joo Yeon; Park, Tai Jin

    2015-01-01

    RI-Biomics is a new radiation fusion technology in which the characteristics of radioisotopes are applied to biomics. In order to share and comprehensively analyze data between institutions through the total management of information in the field of RI-Biomics, the RI-Biomics information portal ‘RIBio-Info’ was constructed by KARA (Korean Association for Radiation Application) in February 2015. Systematic operation of the ‘RIBio-Info’ system requires a system for the collection, analysis and application of information. In this paper, we therefore summarize the development of document forms for each process of information collection, analysis and application, the systematization of information collection methods, and the establishment of methods for the characteristic analysis of reports such as issue papers, policy reports, global market reports and watch reports. These are expected to improve practical applicability in this field by invigorating users' technology development through a closed cycle of information collection, analysis and application.

  8. Development of efficient system for collection-analysis-application of information using system for technology and information in field of RI-biomics

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-08-15

    RI-Biomics is a new radiation fusion technology in which the characteristics of radioisotopes are applied to biomics. In order to share and comprehensively analyze data between institutions through the total management of information in the field of RI-Biomics, the RI-Biomics information portal ‘RIBio-Info’ was constructed by KARA (Korean Association for Radiation Application) in February 2015. Systematic operation of the ‘RIBio-Info’ system requires a system for the collection, analysis and application of information. In this paper, we therefore summarize the development of document forms for each process of information collection, analysis and application, the systematization of information collection methods, and the establishment of methods for the characteristic analysis of reports such as issue papers, policy reports, global market reports and watch reports. These are expected to improve practical applicability in this field by invigorating users' technology development through a closed cycle of information collection, analysis and application.

  9. Information-Pooling Bias in Collaborative Security Incident Correlation Analysis.

    Science.gov (United States)

    Rajivan, Prashanth; Cooke, Nancy J

    2018-03-01

    Incident correlation is a vital step in the cybersecurity threat detection process. This article presents research on the effect of group-level information-pooling bias on collaborative incident correlation analysis in a synthetic task environment. Past research has shown that uneven information distribution biases people to share information that is known to most team members and prevents them from sharing any unique information available to them. The effect of such biases on security team collaborations is largely unknown. Thirty 3-person teams performed two threat detection missions involving information sharing and correlating security incidents. Incidents were predistributed to each person in the team based on the hidden-profile paradigm. Participant teams, randomly assigned to three experimental groups, used different collaboration aids during Mission 2. Communication analysis revealed that participant teams were 3 times more likely to discuss security incidents commonly known to the majority. Unaided team collaboration was inefficient in finding associations between security incidents uniquely available to each member of the team. Visualizations that augment perceptual processing and recognition memory were found to mitigate the bias. The data suggest that (a) security analyst teams, when conducting collaborative correlation analysis, can be inefficient in pooling unique information from their peers; (b) employing off-the-shelf collaboration tools in cybersecurity defense environments is inadequate; and (c) collaborative security visualization tools developed with the cognitive limitations of security analysts in mind are necessary. Potential applications of this research include the development of team training procedures and collaboration tools for security analysts.

  10. Carbon Dioxide Information Analysis Center: FY 1992 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Stoss, F.W. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center

    1993-03-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1991 to September 30, 1992. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's responses to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  11. Carbon Dioxide Information Analysis Center: FY 1991 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Stoss, F.W.

    1992-06-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1990 to September 30, 1991. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's responses to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  13. Information Superiority via Formal Concept Analysis

    Science.gov (United States)

    Koester, Bjoern; Schmidt, Stefan E.

    This chapter will show how to get more mileage out of information. To achieve that, we first start with an introduction to the fundamentals of Formal Concept Analysis (FCA). FCA is a highly versatile field of applied lattice theory, which allows hidden relationships to be uncovered in relational data. Moreover, FCA provides a distinguished supporting framework to subsequently find and fill information gaps in a systematic and rigorous way. In addition, we would like to build bridges via a universal approach to other communities which can be related to FCA in order for other research areas to benefit from a theory that has been elaborated for more than twenty years. Last but not least, the essential benefits of FCA will be presented algorithmically as well as theoretically by investigating a real data set from the MIPT Terrorism Knowledge Base and also by demonstrating an application in the field of Web Information Retrieval and Web Intelligence.
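The derivation operators at the heart of FCA can be illustrated on a toy formal context. The sketch below is a minimal, naive Python illustration (the animal context and all names are invented for this example, not taken from the chapter): it enumerates every formal concept, i.e., every pair (extent, intent) that is closed under the two derivation operators.

```python
from itertools import combinations

# Toy formal context: objects -> attributes (hypothetical data for illustration).
context = {
    "lion":    {"predator", "mammal"},
    "eagle":   {"predator", "flies"},
    "sparrow": {"flies"},
}
objects = sorted(context)
attributes = sorted(set().union(*context.values()))

def intent(objs):
    """Attributes shared by every object in objs (all attributes if objs is empty)."""
    sets = [context[o] for o in objs]
    return set(attributes) if not sets else set.intersection(*sets)

def extent(attrs):
    """Objects that possess every attribute in attrs."""
    return {o for o in objects if attrs <= context[o]}

# Enumerate formal concepts: pairs (A, B) with intent(A) == B and extent(B) == A.
# Naive subset enumeration; real FCA algorithms (e.g., NextClosure) are far faster.
concepts = set()
for r in range(len(objects) + 1):
    for combo in combinations(objects, r):
        b = intent(set(combo))
        a = extent(b)
        concepts.add((frozenset(a), frozenset(b)))

for a, b in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(a), sorted(b))
```

The resulting six concepts, ordered by extent inclusion, form the concept lattice that FCA uses to expose hidden relationships in relational data.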

  14. Information Analysis Centers in the Department of Defense. Revision

    Science.gov (United States)

    1987-07-01

    Combat Data Information Center (CDIC) and the Aircraft Survivability Model Repository ( ASMR ) into the Survivability/Vulnerability Information Analysis...Information Center (CDIC) and the Aircraft Survivability Model Repository ( ASMR ). The CDIC was a central repository for combat and test data related to...and ASMR were operated under the technical monitorship of the Flight Dynamics Laboratory at Wright-Patterson AFB, Ohio and were located in Flight

  15. Analysis of Emergency Information Management Research Hotspots Based on Bibliometric and Co-occurrence Analysis

    Directory of Open Access Journals (Sweden)

    Zou Qingyun

    2017-04-01

    Full Text Available [Purpose/significance] Emergency information management is an interdisciplinary field of emergency management and information management. Summarizing the major research output helps to strengthen the effective utilization of information resources in emergency management research, and provides references for the follow-up development and practical exploration of emergency information management research. [Method/process] By retrieving relevant literature from CNKI, this paper used bibliometric and co-word clustering analysis methods to analyze the domestic emergency management research output. [Result/conclusion] Domestic emergency information management research mainly focuses on five hot topics: disaster emergency information management, crisis information disclosure, emergency information management systems, emergency response, and smart emergency management. China should strengthen the emergency management information base for future theoretical research, and build an emergency information management theoretical framework.

  16. Army Information Operations Officer Needs Analysis Report

    Science.gov (United States)

    2016-03-01

    helping with formatting the final report iv ARMY INFORMATION OPERATIONS OFFICER NEEDS ANALYSIS REPORT EXECUTIVE SUMMARY Research...time.” One IO officer suggested the IPO try to get access to the database that has all the old APA reports archived as a way to look at assessment

  17. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission, and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. An edge detection algorithm is considered here to achieve high performance only if the information rate from the scene to the edge image approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In our paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  18. Implementation of an information security management system under ISO 27001: information risk analysis

    Directory of Open Access Journals (Sweden)

    José Gregorio Arévalo Ascanio

    2015-11-01

    Full Text Available In this article the business structure of the city of Ocaña is explored with the aim of expanding the information and knowledge of the main variables of the productive activity of the municipality, its entrepreneurial spirit, technological development and productive structure. For this, descriptive research was performed to identify economic activity in its various forms and to promote the implementation of administrative practices consistent with national and international references. The results made it possible to identify business weaknesses, including in information management, which once identified are used to design training spaces, skill acquisition, and management practices consistent with the challenges of competitiveness and staying in the market. From the results, information was collected regarding the technological component of the companies in the city's productive fabric, for which the application of tools for the analysis of information systems is proposed using ISO 27001:2005, employing the most appropriate technologies to study organizations that protect their most important asset: information.

  19. RISK ANALYSIS IN INFORMATION TECHNOLOGY AND COMMUNICATION OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Edmir Parada Vasques Prado

    2011-12-01

    Full Text Available This research aims at evaluating the risk analysis process in Information Technology and Communication (ICT) outsourcing conducted by organizations of the private sector. The research is characterized as a descriptive, quantitative and transversal study, in which the survey method was used. Data were collected through a questionnaire; the sample is not random, and a convenience sampling process was used. The research made contributions to understanding the risk analysis process in ICT services outsourcing, and identified statistically significant relationships between risk analysis and the organization's size and industry, and between risk analysis and the diversity of outsourced services.

  20. Meeting the reactor operator's information needs using functional analysis

    International Nuclear Information System (INIS)

    Nelson, W.R.; Clark, M.T.

    1980-01-01

    Since the accident at Three Mile Island, many ideas have been proposed for assisting the reactor operator during emergency situations. However, some of the suggested remedies do not alleviate an important shortcoming of the TMI control room: the operators were not presented with the information they needed in a manner which would allow prompt diagnosis of the problem. To address this problem, functional analysis is being applied at the LOFT facility to ensure that the operator's information needs are being met in his procedures and graphic displays. This paper summarizes the current applications of functional analysis at LOFT

  1. Environmental Quality Information Analysis Center multi-year plan

    International Nuclear Information System (INIS)

    Rivera, R.G.; Das, S.; Walsh, T.E.

    1992-09-01

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network

  2. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
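Several of the ranking criteria mentioned above, such as information gain, reduce to simple entropy arithmetic over discretized data. The following minimal Python sketch (with a made-up two-feature dataset; not IMMAN's actual code) ranks discretized features by information gain:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y) in bits of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y | X) for a discretized feature X."""
    n = len(labels)
    conditional = 0.0
    for value in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == value]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# Hypothetical discretized dataset: two features, one class label.
y  = ["active", "active", "inactive", "inactive"]
x1 = ["low", "low", "high", "high"]   # perfectly predicts y
x2 = ["low", "high", "low", "high"]   # independent of y

features = {"x1": x1, "x2": x2}
ranking = sorted(features, key=lambda name: -information_gain(features[name], y))
print(ranking)  # features ordered by decreasing information gain
```

Here x1 attains the maximum gain of H(y) = 1 bit and x2 attains 0 bits, so the ranking places x1 first; gain ratio and symmetrical uncertainty are normalized variants of the same quantity.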

  3. Information Technology Systems Vulnerabilities Detecting based on Network’s Traffic Analysis

    Directory of Open Access Journals (Sweden)

    Dmitry Anatolevich Melnikov

    2013-12-01

    Full Text Available This paper proposes a traffic analysis procedure that is a very effective, and sometimes the only, countermeasure for counteracting network attacks and information leakage channels (hidden control channels). Traffic analysis envisages certain measures to control the security of the Russian Federation's information technology infrastructure and, most importantly, to establish the causes of occurred and predictable computer incidents.

  4. Performance of the Carbon Dioxide Information Analysis Center (CDIAC)

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Univ. of Tennessee, Knoxville, TN (United States). Environment, Energy, and Resources Center; Jones, S.B. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    The Carbon Dioxide Information Analysis Center (CDIAC) provides information and data resources in support of the US Department of Energy's Global Change Research Program. CDIAC also serves as a resource of global change information for a broader international community of researchers, policymakers, managers, educators, and students. The number of requests for CDIAC's data products, information services, and publications has grown over the years and represents multidisciplinary interests in the physical, life, and social sciences and from diverse work settings in government, business, and academia. CDIAC's staff addresses thousands of requests yearly for data and information resources. In response to these requests, CDIAC has distributed tens of thousands of data products, technical reports, newsletters, and other information resources worldwide since 1982. This paper describes CDIAC, examines CDIAC's user community, and describes CDIAC's response to requests for information. The CDIAC Information System, which serves as a comprehensive PC-based inventory and information management tracking system, is also described.

  5. Automatic circuit analysis based on mask information

    International Nuclear Information System (INIS)

    Preas, B.T.; Lindsay, B.W.; Gwyn, C.W.

    1976-01-01

    The Circuit Mask Translator (CMAT) code has been developed to convert integrated circuit mask information into a circuit schematic. Logical operations, pattern recognition, and special functions are used to identify and interconnect diodes, transistors, capacitors, and resistors. The circuit topology provided by the translator is compatible with the input required for a circuit analysis program

  6. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A. Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  7. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise of a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and of the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
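The central quantity such toolboxes estimate is the mutual information between stimulus and response. A minimal plug-in (direct) estimator in plain Python, applied to invented discretized trial data rather than real recordings, might look like this:

```python
from collections import Counter
from math import log2

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S; R) in bits from paired discrete samples."""
    n = len(stimuli)
    p_s = Counter(stimuli)                 # marginal counts of stimuli
    p_r = Counter(responses)               # marginal counts of responses
    p_sr = Counter(zip(stimuli, responses))  # joint counts
    return sum(
        (c / n) * log2((c / n) / ((p_s[s] / n) * (p_r[r] / n)))
        for (s, r), c in p_sr.items()
    )

# Hypothetical trials: stimulus identity and binned spike count per trial.
stimuli   = ["A", "A", "A", "B", "B", "B"]
responses = [ 0,   0,   1,   2,   2,   1 ]  # counts loosely follow the stimulus

print(f"I(S;R) = {mutual_information(stimuli, responses):.3f} bits")
```

This naive estimator is biased upward for small sample sizes; the bias corrections and shuffling controls that the reviewed toolboxes provide are precisely what makes them preferable to a direct computation like this one.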

  8. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

    Full Text Available The Information Service is a fundamental component in a grid environment. It has to meet a lot of requirements such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, fault tolerance, etc., in order to achieve better performance, scalability, security and extensibility. Currently there are two different major approaches. One is based on a directory infrastructure and another one on a novel approach that exploits a relational DBMS. In this paper we present a performance comparison analysis between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), providing also information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  9. A cost-benefit analysis for materials management information systems.

    Science.gov (United States)

    Slapak-Iacobelli, L; Wilde, A H

    1993-02-01

    The cost-benefit analysis provided the system planners with valuable information that served many purposes. It answered the following questions: Why was the CCF undertaking this project? What were the alternatives? How much was it going to cost? And what was the expected outcome? The process of developing the cost-benefit document kept the project team focused. It also motivated them to involve additional individuals from materials management and accounts payable in its development. A byproduct of this involvement was buy-in and commitment to the project by everyone in these areas. Consequently, the project became a team effort championed by many and not just one. We were also able to introduce two new information system processes: 1) a management review process with goals and anticipated results, and 2) a quality assurance process that ensured the CCF had a better product in the end. The cost-benefit analysis provided a planning tool that assisted in successful implementation of an integrated materials management information system.

  10. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China's agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China's agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China's experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  11. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  12. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time there are many network information problems that demand attention. This paper analyzes the information security of computer networks based on "big data" analysis, and puts forward some solutions.

  13. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Galperin, A.

    1994-01-01

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types and required data manipulation capabilities, as well as operational requirements, resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially providing the necessary flexibility for different data types and extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages

  14. Geriatric information analysis of the molecular properties of mexidole

    Directory of Open Access Journals (Sweden)

    O. A. Gromova

    2017-01-01

    Full Text Available Objective: by using pharmacoinformation profiling, to comprehensively assess all possible effects of the molecules of mexidol, choline alfoscerate, piracetam, glycine, and semax in accordance with the anatomical therapeutic chemical (ATC) classification system. Material and methods. Chemoreactomic, pharmacoinformation, and geriatric information analyses of the properties of the molecules are based on chemoreactomic methodology. The chemoreactomic analysis uses information from the PubChem, HMDB, and String databases; the pharmacoinformation analysis applies information from the international ATC classification and a combined sample of data from the Therapeutic Target Database (TTD), SuperTarget, Manually Annotated Targets and Drugs Online Resource (MATADOR), and Potential Drug Target Database (PDTD); the geriatric information analysis employs data on the geroprotective effect of individual substances from the PubChem database and the literature data on geroprotection from the PubMed database, which have been collected through the artificial intelligence system. Results and discussion. Mexidol is characterized by the maximum set of positive effects (the drug is used to treat CNS and cardiovascular diseases and metabolic disorders and has anti-inflammatory and anti-infective properties, etc.). Mexidol and glycine are predicted to cause the lowest frequency of adverse reactions, such as itching, constipation, paresthesia, and vomiting. Geriatric information assessments of changes in the life span of model organisms have shown that mexidol contributes to the higher life expectancy of C. elegans (by 22.7±10%), Drosophila (by 14.4±15%), and mice (by 14.6±3%); the control drugs increase it by no more than 6.1%. Conclusion. The results of the study indicate that mexidol has a high potential to be used as a geroprotector.

  15. Social inclusion and its approach at Information Science: scientific production analysis in the area of information science periodicals between 2001 and 2010

    Directory of Open Access Journals (Sweden)

    Alex Serrano Almeida

    2013-08-01

    Full Text Available This study has the purpose of checking how social inclusion has been approached in the Information Science area, based on the scientific production published in the area's national periodicals. Furthermore, it seeks to verify which forms of inclusion are recurrently approached in the Information Science area; to show the tendencies of use of the social inclusion concept in Information Science scientific articles; to find how the social inclusion concept is connected to the information professional; and to analyze whether there is any association with other themes. Searches were carried out in six periodicals covering the period between 2001 and 2010. Bardin's content analysis was used as the method of analysis. The analysis corpus consisted of 30 articles which approached the social inclusion theme. The results showed that social inclusion in Information Science publications is, in general, oriented toward digital inclusion and the uses of Information Science publications. Besides, connections were identified with information professionals, who must serve as mediators between information and the environment where information and users are inserted.

  16. Constructing a model of effective information dissemination in a crisis

    OpenAIRE

    Fiona Duggan; Linda Banwell

    2004-01-01

    A model of effective information dissemination in a crisis was developed from a Ph.D. study of information dissemination during a suspected TB outbreak. The research aimed to characterise and evaluate the dissemination of information to the community during the incident. A qualitative systematic review of the research literature identified twenty relevant studies. Meta-ethnographic analysis of these studies highlighted the key factors in effective dissemination. Consideration of these factors...

  17. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  18. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  19. Informational-computer system for the neutron spectra analysis

    International Nuclear Information System (INIS)

    Berzonis, M.A.; Bondars, H.Ya.; Lapenas, A.A.

    1979-01-01

    In this article basic principles of the build-up of the informational-computer system for the neutron spectra analysis on a basis of measured reaction rates are given. The basic data files of the system, needed software and hardware for the system operation are described

  20. 75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)

    Science.gov (United States)

    2010-06-22

    ... Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY: Environmental Protection Agency... site, ``2010 release of the Causal Analysis/Diagnosis Decision Information System (CADDIS).'' The..., organize, and share information useful for causal evaluations in aquatic systems. CADDIS is based on EPA's...

  1. Formal Concept Analysis for Information Retrieval

    OpenAIRE

    Qadi, Abderrahim El; Aboutajedine, Driss; Ennouary, Yassine

    2010-01-01

In this paper we describe a mechanism to improve Information Retrieval (IR) on the web. The method is based on Formal Concept Analysis (FCA), which makes semantic relations explicit during queries and allows the answers provided by a search engine to be reorganized in the shape of a lattice of concepts. We propose for IR an incremental algorithm based on the Galois lattice. This algorithm allows a formal clustering of the data sources, and the results which it returns are classified by ...

  2. 40 CFR 1400.3 - Public access to paper copies of off-site consequence analysis information.

    Science.gov (United States)

    2010-07-01

    ...-site consequence analysis information. 1400.3 Section 1400.3 Protection of Environment ENVIRONMENTAL... PROGRAMS UNDER THE CLEAN AIR ACT SECTION 112(r)(7); DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION Public Access § 1400.3 Public access to...

  3. Investigation research of core-basic information associated with the coupling analysis. Outline report

    International Nuclear Information System (INIS)

    Kataoka, Shinichi; Matsunaga, Kenichi; Ishihara, Yoshinao; Kawahara, Kenichi; Neyama, Atsushi; Nakagawa, Koichi; Iwata, Hiroshi; Mori, Koji

    2001-03-01

The newest literature information from foreign countries was surveyed, and this research presents the basic concept of a coupling analysis code to realize coupled analysis in the near field of the geological disposal system. The outline of this research is as follows. (1) The combination of M (Mechanical) and C (Chemistry) is treated as a weak coupling, because the coupled analysis for the United States Yucca Mountain project is limited to one site and one set of engineered barrier specifications. (2) One of the purposes of this research was to collect information about the coupling analysis code NUFT-C adopted for Yucca Mountain. We therefore carried out an information exchange with the United States Lawrence Livermore National Laboratory and were able to collect information such as the development purpose of the analysis code, its key functions, and test case analyses. (3) Analysis codes reflecting the newest developments in coupled analysis, including geochemistry processes and two-phase flow, were investigated on the basis of public information, with the purpose of building a concept for the coupling analysis code and extracting development issues. As a result of the investigation, the phenomena to be treated, the current status of coupled analysis techniques, the future development strategy, and the associated precautions could be understood. (4) This research clarified the mission of the coupling analysis code and its requirements (function, quality), and several development options were presented. (5) A procedure for development satisfying the above requirements was studied under the constraints that a site has not been selected and the development period is short. The tool (Diffpack), which can flexibly cope with speeding up calculation time and with visualization, was found to be effective; the test case using this tool and its key functions are summarized accordingly. (author)

  4. Systematic review and meta-analysis: tools for the information age.

    Science.gov (United States)

    Weatherall, Mark

    2017-11-01

The amount of available biomedical information is vast and growing. Clinicians and researchers face natural limitations in approaching this treasure trove of information: difficulties locating the information and, once it is located, cognitive biases that may lead to its inappropriate use. Systematic reviews and meta-analyses represent important tools in the information age to improve knowledge and action. Systematic reviews represent a census approach to identifying literature, adopted to avoid non-response bias. They are a necessary prelude to producing combined quantitative summaries of associations or treatment effects. Meta-analysis comprises the arithmetical techniques for producing combined summaries from individual study reports. Careful, thoughtful and rigorous use of these tools is likely to enhance knowledge and action. Use of standard guidelines, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, or embedding these activities within collaborative groups such as the Cochrane Collaboration, is likely to lead to more useful systematic review and meta-analysis reporting. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
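The inverse-variance pooling at the arithmetical core of meta-analysis can be sketched in a few lines. The study effects and variances below are hypothetical illustrations, not data from any cited review:

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect (inverse-variance) pooling of study effect estimates.

    effects   -- per-study effect estimates (e.g. mean differences)
    variances -- per-study sampling variances of those estimates
    Returns the pooled effect and its standard error.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical study results, for illustration only.
pooled, se = fixed_effect_meta(effects=[0.30, 0.10, 0.25],
                               variances=[0.04, 0.01, 0.02])
low, high = pooled - 1.96 * se, pooled + 1.96 * se  # 95% confidence interval
print(round(pooled, 3), round(se, 3))
```

A random-effects model would add a between-study variance term to each weight; the fixed-effect version above is the simplest form of the arithmetic the abstract alludes to.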

  5. Using task analysis to improve the requirements elicitation in health information system.

    Science.gov (United States)

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2007-01-01

    This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.

  6. Self-Informant Agreement in Well-Being Ratings: A Meta-Analysis

    Science.gov (United States)

    Schneider, Leann; Schimmack, Ulrich

    2009-01-01

    A meta-analysis of published studies that reported correlations between self-ratings and informant ratings of well-being (life-satisfaction, happiness, positive affect, negative affect) was performed. The average self-informant correlation based on 44 independent samples and 81 correlations for a total of 8,897 participants was r = 0.42 [99%…
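Self-informant correlations of this kind are conventionally pooled via Fisher's r-to-z transform, weighting each sample by n − 3. A minimal sketch; the correlations and sample sizes below are hypothetical, not those of the cited meta-analysis:

```python
import math

def pool_correlations(rs, ns):
    """Average correlations across samples via Fisher's r-to-z transform.

    Each sample's z-value is weighted by n - 3, the inverse of var(z).
    """
    zs = [math.atanh(r) for r in rs]              # r -> z
    weights = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)                       # back-transform z -> r

# Hypothetical samples: observed correlations with their sample sizes.
r_pooled = pool_correlations(rs=[0.40, 0.45, 0.38], ns=[120, 200, 80])
print(round(r_pooled, 3))
```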

  7. Information Systems Security Job Advertisement Analysis: Skills Review and Implications for Information Systems Curriculum

    Science.gov (United States)

    Brooks, Nita G.; Greer, Timothy H.; Morris, Steven A.

    2018-01-01

    The authors' focus was the assessment of skill requirements for information systems security positions to understand expectations for security jobs and to highlight issues relevant to curriculum management. The analysis of 798 job advertisements involved the exploration of domain-related and soft skills as well as degree and certification…

  8. Intelligent Data Analysis in the EMERCOM Information System

    Science.gov (United States)

    Elena, Sharafutdinova; Tatiana, Avdeenko; Bakaev, Maxim

    2017-01-01

The paper describes an information system development project for the Russian Ministry of Emergency Situations (MES, whose international operations body is known as EMERCOM), which was attended by representatives of both the IT industry and academia. Besides the general description of the system, we put forward OLAP and Data Mining-based approaches towards the intelligent analysis of the data accumulated in the database. In particular, some operational OLAP reports and an example of a multi-dimensional information space based on an OLAP Data Warehouse are presented. Finally, we outline a Data Mining application to support decision-making regarding the planning of security inspections and the consideration of their results.

  9. The intellectual core of enterprise information systems: a co-citation analysis

    Science.gov (United States)

    Shiau, Wen-Lung

    2016-10-01

    Enterprise information systems (EISs) have evolved in the past 20 years, attracting the attention of international practitioners and scholars. Although literature reviews and analyses have been conducted to examine the multiple dimensions of EISs, no co-citation analysis has been conducted to examine the knowledge structures involved in EIS studies; thus, the current study fills this research gap. This study investigated the intellectual structures of EISs. All data source documents (1083 articles and 24,090 citations) were obtained from the Institute for Scientific Information Web of Knowledge database. A co-citation analysis was used to analyse EIS data. By using factor analysis, we identified eight critical factors: (a) factors affecting the implementation and success of information systems (ISs); (b) the successful implementation of enterprise resource planning (ERP); (c) IS evaluation and success, (d) system science studies; (e) factors influencing ERP success; (f) case research and theoretical models; (g) user acceptance of information technology; and (h) IS frameworks. Multidimensional scaling and cluster analysis were used to visually map the resultant EIS knowledge. It is difficult to implement an EIS in an enterprise and each organisation exhibits specific considerations. The current findings indicate that managers must focus on ameliorating inferior project performance levels, enabling a transition from 'vicious' to 'virtuous' projects. Successful EIS implementation yields substantial organisational advantages.
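The raw co-citation counts underlying such an analysis come directly from reference lists: two works are co-cited whenever they appear together in the same citing article. A minimal sketch, with hypothetical citing articles and cited-work labels:

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """Count how often each pair of cited works appears together
    in the same citing article's reference list."""
    counts = Counter()
    for refs in reference_lists:
        # sorted() gives each pair a canonical (a, b) ordering.
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

# Hypothetical citing articles, each given as a list of cited works.
articles = [
    ["Davis1989", "Delone1992", "Venkatesh2003"],
    ["Davis1989", "Venkatesh2003"],
    ["Delone1992", "Markus2000"],
]
counts = cocitation_counts(articles)
print(counts[("Davis1989", "Venkatesh2003")])  # co-cited in 2 of the 3 articles
```

The resulting pair counts form the co-citation matrix to which factor analysis, multidimensional scaling and clustering are then applied.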

  10. Water Information Management & Analysis System (WIMAS) v 4.0

    Data.gov (United States)

    Kansas Data Access and Support Center — The Water Information Management and Analysis System (WIMAS) is an ArcView based GIS application that allows users to query Kansas water right data maintained by the...

  11. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

While looking for information in scientific databases, we are overwhelmed by the amount of information that we encounter. In this big data collection, extracting information with added value could be strategic for nuclear verification. In our study, we have worked on "best practices" in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is performed with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store data generated by bibliographic research, and analyzing data with selected tools. With analysis of bibliographic data alone, we are able to obtain: · a panoramic view of countries that publish in the paradigm, · co-publication networks, · organizations that contribute to scientific publications, · countries with which a country collaborates, · areas of interest of a country, . . . So we are able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are important; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness in the use of open source S&T information and in the management of that information over time. Examples are shown. (author)

  12. Information-theoretic analysis of the dynamics of an executable biological model.

    Directory of Open Access Journals (Sweden)

    Avital Sadot

    Full Text Available To facilitate analysis and understanding of biological systems, large-scale data are often integrated into models using a variety of mathematical and computational approaches. Such models describe the dynamics of the biological system and can be used to study the changes in the state of the system over time. For many model classes, such as discrete or continuous dynamical systems, there exist appropriate frameworks and tools for analyzing system dynamics. However, the heterogeneous information that encodes and bridges molecular and cellular dynamics, inherent to fine-grained molecular simulation models, presents significant challenges to the study of system dynamics. In this paper, we present an algorithmic information theory based approach for the analysis and interpretation of the dynamics of such executable models of biological systems. We apply a normalized compression distance (NCD analysis to the state representations of a model that simulates the immune decision making and immune cell behavior. We show that this analysis successfully captures the essential information in the dynamics of the system, which results from a variety of events including proliferation, differentiation, or perturbations such as gene knock-outs. We demonstrate that this approach can be used for the analysis of executable models, regardless of the modeling framework, and for making experimentally quantifiable predictions.
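The normalized compression distance at the heart of that analysis is straightforward to approximate with any standard compressor. A minimal sketch using zlib; the compressor and state encoding used in the actual study may differ, and the state strings below are invented for illustration:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(s) is the compressed length of s."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two near-identical hypothetical state strings vs. an unrelated byte pattern:
# the similar pair should have the smaller distance.
a = b"cell:T,state:active," * 50
b = b"cell:T,state:active," * 49 + b"cell:B,state:naive,"
c = bytes(range(256)) * 4
print(ncd(a, b) < ncd(a, c))
```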

  13. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections, because they normally require a large amount of computation resources and a long time to get accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.

  14. Analysis of methods. [information systems evolution environment

    Science.gov (United States)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity

  15. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis... 10 Energy 2 2010-01-01 2010-01-01 false Contents of applications; technical information in final safety analysis report. 52.157 Section 52.157 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES...

  16. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

The report is part of a research assignment carried out by students in the 5 ECTS course “Project Byggeri – [entitled as: Building Information Modeling (BIM) – Modeling & Analysis]”, during the 3rd semester of the master degree in Civil and Architectural Engineering, Department of Engineering, Aarhus...... University. This includes seven papers describing BIM for Sustainability, concentrating specifically on individual topics regarding Indoor Environment Performance Analysis....

  17. Analysis of Informal Credit Operations among Farmers In Atisbo ...

    African Journals Online (AJOL)

    Analysis of Informal Credit Operations among Farmers In Atisbo Local ... Some of the respondents (16.7%) had no formal education though they were managing ... Intensive use of extension services to facilitate adult literacy and learning ...

  18. The Information Professional's Profile: An Analysis of Brazilian Job Vacancies on the Internet

    Science.gov (United States)

    da Cunha, Miriam Vieira

    2009-01-01

    Introduction: Report of a study to discover and describe job vacancies for information professionals available online at specific sites and discussion lists between January 2005 and February 2008. Method: The study uses Bardin's content analysis technique and the following analysis criteria: information source, institutional type, professional…

  19. A critical analysis of the literature on the Internet and consumer health information.

    Science.gov (United States)

    Powell, J A; Lowe, P; Griffiths, F E; Thorogood, M

    2005-01-01

    A critical review of the published literature investigating the Internet and consumer health information was undertaken in order to inform further research and policy. A qualitative, narrative method was used, consisting of a three-stage process of identification and collation, thematic coding, and critical analysis. This analysis identified five main themes in the research in this area: (1) the quality of online health information for consumers; (2) consumer use of the Internet for health information; (3) the effect of e-health on the practitioner-patient relationship; (4) virtual communities and online social support and (5) the electronic delivery of information-based interventions. Analysis of these themes revealed more about the concerns of health professionals than about the effect of the Internet on users. Much of the existing work has concentrated on quantifying characteristics of the Internet: for example, measuring the quality of online information, or describing the numbers of users in different health-care settings. There is a lack of qualitative research that explores how citizens are actually using the Internet for health care.

  20. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    Directory of Open Access Journals (Sweden)

    Voicu-Dan Dragomir

    2007-01-01

Full Text Available This paper disseminates the results of the development process for a financial analysis information system. The system has been subject to conceptual design using the Unified Modeling Language (UML and has been implemented in an object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature is focused on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this technique to the DuPont System of analysis concerning the Return on Equity ratio, by designing several structural UML diagrams depicting the breakdown and analysis of each financial ratio involved. The resulting computer application offers a flexible approach to the analytical tools: the user is required to introduce the raw data, and the system provides both table-style and charted information on the output of the computation. User-friendliness is also a key feature of this particular financial analysis application.
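The DuPont breakdown of Return on Equity and the chain-substitution attribution of its year-on-year variation can be expressed compactly. The figures below are illustrative only, not taken from the paper:

```python
def dupont(net_income, sales, assets, equity):
    """DuPont factors: ROE = profit margin x asset turnover x equity multiplier."""
    return net_income / sales, sales / assets, assets / equity

def chain_substitution(prior, current):
    """Attribute the ROE variation to each factor by substituting
    current-year values one factor at a time (chain-substitution method)."""
    m0, t0, l0 = prior
    m1, t1, l1 = current
    d_margin = (m1 - m0) * t0 * l0
    d_turnover = m1 * (t1 - t0) * l0
    d_leverage = m1 * t1 * (l1 - l0)
    return d_margin, d_turnover, d_leverage

# Illustrative prior-year and current-year financials (monetary units arbitrary).
prior = dupont(net_income=80, sales=1000, assets=800, equity=400)
current = dupont(net_income=110, sales=1200, assets=900, equity=450)
roe0 = prior[0] * prior[1] * prior[2]
roe1 = current[0] * current[1] * current[2]
effects = chain_substitution(prior, current)
print(round(sum(effects), 6) == round(roe1 - roe0, 6))  # the attribution is exhaustive
```

The three terms telescope, so the factor effects always sum exactly to the total ROE change, which is the property that makes the chain-substitution method a complete decomposition.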

  1. Scientific Information Analysis of Chemistry Dissertations Using Thesaurus of Chemistry

    Directory of Open Access Journals (Sweden)

    Taghi Rajabi

    2017-09-01

Full Text Available: Concept maps of chemistry can be obtained from a thesaurus of chemistry. Analysis of information in the field of chemistry is done at the graduate level, based on comparing and analyzing chemistry dissertations using these maps. Therefore, the use of a thesaurus for analyzing scientific information is recommended. The major advantage of this method is that it is possible to obtain a detailed map of all academic research across all branches of science. The results of such analysis in chemical science can play a key role in developing strategic research policies, educational programming, linking universities to industries, and postgraduate educational programming. This paper will first introduce the concept maps of chemistry. Then, emerging patterns from the concept maps of chemistry will be used to analyze the trend in academic dissertations in chemistry, using the data collected and stored in our database at the Iranian Research Institute for Information Science and Technology (IranDoc over the past 10 years (1998-2009.

  2. Information management for global environmental change, including the Carbon Dioxide Information Analysis Center

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center

    1994-06-01

The issue of global change is international in scope. A body of international organizations oversees the worldwide coordination of research and policy initiatives. In the US, the National Science and Technology Council (NSTC) was established in November of 1993 to provide coordination of science, space, and technology policies throughout the federal government. NSTC is organized into nine proposed committees. The Committee on Environment and Natural Resources (CENR) oversees the US Global Change Research Program (USGCRP). As part of the USGCRP, the US Department of Energy's Global Change Research Program aims to improve the understanding of Earth systems and to strengthen the scientific basis for the evaluation of policy and government action in response to potential global environmental changes. This paper examines the information and data management roles of several international and national programs, including Oak Ridge National Laboratory's (ORNL's) global change information programs. An emphasis will be placed on the Carbon Dioxide Information Analysis Center (CDIAC), which also serves as the World Data Center-A for Atmospheric Trace Gases.

  3. Experiments in Discourse Analysis Impact on Information Classification and Retrieval Algorithms.

    Science.gov (United States)

    Morato, Jorge; Llorens, J.; Genova, G.; Moreiro, J. A.

    2003-01-01

Discusses the inclusion of contextual information in indexing and retrieval systems to improve results, and the ability to carry out text analysis by means of linguistic knowledge. Presents research that investigated whether discourse variables have an impact on information retrieval and classification algorithms. (Author/LRW)

  4. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  5. HJD-I record and analysis meter for nuclear information

    International Nuclear Information System (INIS)

    Di Shaoliang; Huang Yong; Xiao Yanbin

    1992-01-01

A low-cost, small-volume, multi-function, new-model intelligent nuclear electronic instrument, the HJD-I Record and Analysis Meter for Nuclear Information, is described. Its hardware and software are detailed, and a 137Cs spectrum measured with this meter is presented

  6. Information Operations - Analysis Support and Capability Requirements (Operations d'information - Soutien a l'analyse et exigences de capacites) (CD-ROM)

    National Research Council Canada - National Science Library

    2006-01-01

    ...: The focus of the study "Information Operations - Analysis Support and Capability Requirements" undertaken by the RTO Task Group SAS-057 was to provide recommendations to improve analysis support...

  7. [Analysis of utilization of information in the journal Medicina Clinica].

    Science.gov (United States)

    Aleixandre, R; Giménez Sánchez, J V; Terrada, M L; López Piñero, J M

    1994-09-10

Knowledge of scientific communication is largely based on the analysis of the bibliographic references contained in publications. The aim of the present study is to investigate the patterns and laws determining information consumption in the articles of the journal Medicina Clinica. An analysis was performed on the 13,286 references drawn from 618 papers published by the journal in 1990. A database for the management of the information was generated with dBASE IV; data were distributed in several tables according to criteria of age, documentary type, country, journal and Bradford zone. The analysed references belong to 1,241 different journals, 110 of them from Spain. Accounting for two thirds of the total, publications from the United States and the United Kingdom received more citations than those from Spain. Publications from other European countries, such as France, Germany and Italy, are scarcely present. The Bradford core is constituted by the journals Medicina Clinica and The Lancet. The analysis of the bibliographic references in the articles of this journal can produce knowledge about information consumption by practitioners; its usefulness as a complement to the Indice de Citas e Indicadores Bibliométricos de Revistas Españolas de Medicina Interna y sus especialidades 1990 should be considered.

  8. Information needs of engineers. The methodology developed by the WFEO Committee on Engineering Information and the use of value analysis for improving information services

    International Nuclear Information System (INIS)

    Darjoto, S.W.; Martono, A.; Michel, J.

    1990-05-01

The World Federation of Engineering Organizations (WFEO), through the work of its Committee on Engineering Information (CEI), aims at improving the efficiency of engineers, and particularly at developing new attitudes and practices concerning the mastering of specialized information. One important part of the WFEO/CEI programme of activities during recent years, and for the years to come, was and is devoted to a better understanding of the information needs of engineers. It also now seems essential to WFEO/CEI to better evaluate information services in order to adapt them correctly to the identified needs of engineers. The following communication will emphasize these two main and related perspectives: identifying the information needs of engineers; developing Value Analysis approaches for engineering information services. (author). 3 refs

  9. Information needs of engineers. The methodology developed by the WFEO Committee on Engineering Information and the use of value analysis for improving information services

    Energy Technology Data Exchange (ETDEWEB)

    Darjoto, S W [Indonesian Inst. of Sciences, Bandung (Indonesia); Martono, A [Indonesian Inst. of Engineers, Jakarta (Indonesia); Michel, J [Ecole Nationale des Ponts et Chaussees, Paris (France)

    1990-05-01

The World Federation of Engineering Organizations (WFEO), through the work of its Committee on Engineering Information (CEI), aims at improving the efficiency of engineers, and particularly at developing new attitudes and practices concerning the mastering of specialized information. One important part of the WFEO/CEI programme of activities during recent years, and for the years to come, was and is devoted to a better understanding of the information needs of engineers. It also now seems essential to WFEO/CEI to better evaluate information services in order to adapt them correctly to the identified needs of engineers. The following communication will emphasize these two main and related perspectives: identifying the information needs of engineers; developing Value Analysis approaches for engineering information services. (author). 3 refs.

  10. Use of historical information in extreme storm surges frequency analysis

    Science.gov (United States)

    Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent

    2013-04-01

The prevention of storm surge flood risks is critical for the protection and design of coastal facilities to very low probabilities of failure. Effective protection requires the use of a statistical analysis approach having a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation used to estimate storm surges corresponding to high return periods is seriously contaminated by sampling and model uncertainty if data are available only for a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data occurring before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in the statistical modeling of historical information is that a perception threshold exists and that, during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-water marks left by extreme surges on coastal areas. It can also be retrieved from archives, old books, the earliest newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula to compute empirical probabilities based on systematic and historical data is used in this communication.
The objective of the present work is to examine the potential gain in estimation accuracy with the
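To illustrate the plotting-position idea described in this record (this is not the paper's own formula, which is not reproduced here), a simplified Weibull-type scheme can rank systematic annual maxima together with historical exceedances of the perception threshold over the combined effective record length. The function name and the assumption that every historical event above the threshold was recorded are ours:

```python
def plotting_positions(systematic, historical, hist_years):
    """Weibull-type plotting positions for a combined sample.

    systematic : annual maximum surges from the gauged period
    historical : surges above the perception threshold, recorded
                 over `hist_years` years before gauging began
    Effective record length n = gauged years + historical years.
    Returns (surge, empirical exceedance probability) pairs,
    largest surge first.
    """
    n = len(systematic) + hist_years
    combined = sorted(systematic + historical, reverse=True)
    return [(x, rank / (n + 1)) for rank, x in enumerate(combined, start=1)]
```

With 3 gauged years and one historical exceedance over a 50-year historical period, the largest event gets probability 1/54 rather than 1/5, which is the gain in extrapolation that historical information provides.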

  11. Knowledge-Based Information Management for Watershed Analysis in the Pacific Northwest U.S.

    Science.gov (United States)

    Keith Reynolds; Richard Olson; Michael Saunders; Donald Latham; Michael Foster; Bruce Miller; Lawrence Bednar; Daniel Schmoldt; Patrick Cunningham; John Steffenson

    1996-01-01

    We are developing a knowledge-based information management system to provide decision support for watershed analysis in the Pacific Northwest region of the U.S. The system includes: (1) a GIS interface that allows users to graphically navigate to specific provinces and watersheds and display a variety of themes and other area-specific information, (2) an analysis...

  12. Activation Analysis. Proceedings of an Informal Study Group Meeting

    International Nuclear Information System (INIS)

    1971-01-01

    As part of its programme to promote the exchange of information relating to nuclear science and technology, the International Atomic Energy Agency convened in Bangkok, Thailand, from 6-8 July 1970, an informal meeting to discuss the topic of Activation Analysis. The meeting was attended by participants drawn from the following countries: Australia, Burma, Ceylon, Republic of China, India, Indonesia, France, Japan, Republic of Korea, New Zealand, Philippines, Singapore, Thailand, United States of America and Vietnam. The proceedings consist of the contributions presented at the meeting with minor editorial changes.

  13. Activation Analysis. Proceedings of an Informal Study Group Meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1971-07-01

    As part of its programme to promote the exchange of information relating to nuclear science and technology, the International Atomic Energy Agency convened in Bangkok, Thailand, from 6-8 July 1970, an informal meeting to discuss the topic of Activation Analysis. The meeting was attended by participants drawn from the following countries: Australia, Burma, Ceylon, Republic of China, India, Indonesia, France, Japan, Republic of Korea, New Zealand, Philippines, Singapore, Thailand, United States of America and Vietnam. The proceedings consist of the contributions presented at the meeting with minor editorial changes.

  14. Information-seeking at a caregiving website: a qualitative analysis.

    Science.gov (United States)

    Kernisan, Leslie P; Sudore, Rebecca L; Knight, Sara J

    2010-07-28

    The Internet is widely used for health information, yet little is known about the online activity of family caregivers of elders, a rapidly growing group. In order to better understand the online information-seeking activity of "e-caregivers" and other visitors at a caregiving website, we undertook a qualitative analysis of survey data from a website marketed as a comprehensive resource for adults caring for aging parents. The objectives were to better understand what types of information are sought by those visiting a website focused on elder-care issues and to identify overarching themes that might inform future development of Internet resources related to caregiving and aging. From March 2008 to March 2009, a 5-question pop-up survey was offered 9662 times and completed 2161 times. Of these completions, 1838 included a free-text answer to the question "What were you looking for?", and 1467 offered relevant and detailed responses. The survey also asked about satisfaction with the site, gender of the respondent, and relationship to the individual being cared for. Content analysis was used to develop a coding dictionary, to code responses into information-seeking categories, and to identify overarching themes. Of the respondents (76% of whom were female), 50% indicated they were caring for parents, 17% for themselves only, and 31% for others. Over half (57%) reported finding what they were looking for, and 46% stated they were extremely likely to recommend the website. Frequently mentioned information-seeking categories included "health information," "practical caregiving," and "support." Respondents also requested information related to housing, legal, insurance, and financial issues. Many responses referred to multiple comorbid conditions and complex caregiving situations.
Overarching themes included (1) a desire for assistance with a wide range of practical skills and information and (2) help interpreting symptoms and behavior, such as knowing what life impacts to

  15. 40 CFR 1400.9 - Access to off-site consequence analysis information by State and local government officials.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Access to off-site consequence... CONSEQUENCE ANALYSIS INFORMATION DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION Access to Off-Site Consequence Analysis Information by Government Officials. § 1400.9 Access to off-site consequence analysis...

  16. Meeting the reactor operator's information needs using functional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.; Clark, M.T.

    1980-01-01

    Since the accident at Three Mile Island, many ideas have been proposed for assisting the reactor operator during emergency situations. However, some of the suggested remedies do not alleviate an important shortcoming of the TMI control room: the operators were not presented with the information they needed in a manner that would allow prompt diagnosis of the problem. To address this problem, functional analysis is being applied at the LOFT facility to ensure that the operator's information needs are met by the procedures and graphic displays. This paper summarizes the current applications of functional analysis at LOFT.

  17. Information technology portfolio in supply chain management using factor analysis

    Directory of Open Access Journals (Sweden)

    Ahmad Jaafarnejad

    2013-11-01

    The adoption of information technology (IT) along with supply chain management (SCM) has increasingly become a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal component analysis (PCA), a factor analysis (FA) technique, a number of related criteria are divided into smaller groups. Finally, SC managers can develop an IT portfolio in SCM using the mean values of a few extracted components on the relevance-emergency matrix. A numerical example is provided to explain the details of the proposed method.

  18. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Science.gov (United States)

    2010-09-24

    ... Decision Information System) AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of public... 2010 version of the Causal Analysis/Diagnosis Decision Information System (CADDIS). This Web site was developed to help scientists find, develop, organize, and use environmental information to improve causal...

  19. Petroleum labour market information supply demand analysis 2009-2020

    International Nuclear Information System (INIS)

    2010-03-01

    Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) used to determine the petroleum industry's medium- and long-term employment needs; labour supply/demand considerations for the industry as a whole and industry-wide cost management; and an analysis of the exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to retirements almost certainly would. In the growth scenario, it is likely the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  20. Petroleum labour market information supply demand analysis 2009-2020

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-03-15

    Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) used to determine the petroleum industry's medium- and long-term employment needs; labour supply/demand considerations for the industry as a whole and industry-wide cost management; and an analysis of the exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to retirements almost certainly would. In the growth scenario, it is likely the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  1. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, have been modified by climate change and human activities, and that conventional frequency analysis that ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. Climate-informed low flow frequency analysis and the divided-and-combined method were employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, accounting for this dynamic behavior in reservoir management and providing more accurate and reliable designs for infrastructure and water supply.
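The Mann-Kendall trend test mentioned in this record can be sketched in a few lines. This minimal version uses the standard normal approximation and omits the tie correction and the scaling-hypothesis adjustment the paper applies:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test for a series x.

    Returns (S, Z, p): the MK statistic S, the normal-approximation
    score Z, and a two-sided p-value. No tie correction.
    """
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal CDF.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, z, p
```

A strictly increasing series yields the maximum S = n(n-1)/2 and a small p-value; a constant series yields S = 0 and p = 1.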

  2. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network composed of social links. We call Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.

  3. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network composed of social links. We call Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
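PageRank, the graph analysis algorithm the authors apply to the document network, can be illustrated with a plain power-iteration sketch. The helper name and the toy graph are ours, not from the paper:

```python
def pagerank(links, d=0.85, iters=100):
    """PageRank by power iteration over a directed graph.

    links: dict mapping each node to its list of outgoing neighbours.
    d: damping factor. Dangling nodes spread their rank uniformly.
    Returns a dict of node -> rank, summing to 1.
    """
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}     # teleportation mass
        for u in nodes:
            out = links[u]
            if out:
                share = d * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:                                 # dangling node
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank
```

On a small graph where 'a' is linked by both 'b' and 'c', 'a' ends up with the highest rank, exactly the "importance flows along links" intuition the paper exploits on social links between documents.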

  4. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  5. 40 CFR 1400.8 - Access to off-site consequence analysis information by Federal government officials.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Access to off-site consequence... MANAGEMENT PROGRAMS UNDER THE CLEAN AIR ACT SECTION 112(r)(7); DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION DISTRIBUTION OF OFF-SITE CONSEQUENCE ANALYSIS INFORMATION Access to Off-Site Consequence Analysis...

  6. 10 CFR 52.79 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ...; technical information in final safety analysis report. (a) The application must contain a final safety... 10 Energy 2 2010-01-01 2010-01-01 false Contents of applications; technical information in final safety analysis report. 52.79 Section 52.79 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES...

  7. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of the progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth in the volume and accessibility of information sources, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  8. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive information-theoretic analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theoretic system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare different edge detection operators in a common environment.
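The information rate at the heart of such an assessment reduces, in the discrete case, to the mutual information between scene values and edge-map values. A minimal empirical estimator (our simplification; the paper's channel model is far richer) looks like:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits from paired samples, via the empirical
    joint distribution of (x, y) pairs."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts for X
    py = Counter(ys)             # marginal counts for Y
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

If the edge map reproduces a binary scene exactly, the estimator returns the full 1 bit per pixel; if the two are independent, it returns 0, which is the sense in which a detector "approaches the maximum possible" information rate.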

  9. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    This paper aims at analysing which kinds of currently labelled information are of interest to, and actually used by, consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers towards innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised in focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  10. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed for rapid prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks, and integrates multiple image processing capabilities. The report covers the software requirements and the steps for installing the software for IOAIDE and its dependencies: loading ARL software, loading ARL applications, loading the DSPro software, and updating Java.

  11. RACLOUDS - Model for Clouds Risk Analysis in the Information Assets Context

    Directory of Open Access Journals (Sweden)

    SILVA, P. F.

    2016-06-01

    Cloud computing offers benefits in terms of availability and cost, but transfers the responsibility for information security management to the cloud service provider. The consumer thus loses control over the security of their information and services, a factor that has prevented migration to cloud computing in many businesses. This paper proposes a model in which the cloud consumer can perform risk analysis on providers both before and after contracting the service. The proposed model establishes the responsibilities of three actors: Consumer, Provider and Security Labs. The inclusion of the Security Labs actor lends more credibility to the risk analysis, making the results more consistent for the consumer.

  12. MIToS.jl: mutual information tools for protein sequence analysis in the Julia language

    DEFF Research Database (Denmark)

    Zea, Diego J.; Anfossi, Diego; Nielsen, Morten

    2017-01-01

    Motivation: MIToS is an environment for mutual information analysis and a framework for protein multiple sequence alignment (MSA) and protein structure (PDB) management in the Julia language. It integrates sequence and structural information through SIFTS, making Pfam MSA analysis straightforward. MIToS streamlines the implementation of any measure calculated from residue contingency tables, and its optimization and testing in terms of protein contact prediction. As an example, we implemented and tested a BLOSUM62-based pseudo-count strategy in mutual information analysis. Availability and Implementation: The software is implemented entirely in Julia and supported on Linux, OS X and Windows. It is freely available on GitHub under the MIT license: http://mitos.leloir.org.ar. Contacts: diegozea@gmail.com or cmb@leloir.org.ar. Supplementary information: Supplementary data are available at Bioinformatics online.
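As a rough, language-swapped sketch of what a mutual information measure over residue contingency tables involves (Python rather than Julia, and a flat additive pseudocount standing in for MIToS' BLOSUM62-based strategy, so the numbers differ from MIToS'):

```python
import math
from collections import Counter

def column_mi(col_i, col_j, pseudocount=0.05):
    """Mutual information (in nats) between two alignment columns,
    from a residue-pair contingency table with a flat additive
    pseudocount. col_i and col_j are equal-length residue strings."""
    alphabet = sorted(set(col_i) | set(col_j))
    k = len(alphabet)
    pair = Counter(zip(col_i, col_j))
    total = len(col_i) + pseudocount * k * k
    # Smoothed joint distribution over all residue pairs.
    pxy = {(a, b): (pair[(a, b)] + pseudocount) / total
           for a in alphabet for b in alphabet}
    px = {a: sum(pxy[(a, b)] for b in alphabet) for a in alphabet}
    py = {b: sum(pxy[(a, b)] for a in alphabet) for b in alphabet}
    return sum(p * math.log(p / (px[a] * py[b]))
               for (a, b), p in pxy.items() if p > 0)
```

Two perfectly covarying columns (e.g. 'AAGG' against 'TTCC') give MI = ln 2 without smoothing, while invariant column pairs give MI near zero, which is the contact-prediction signal the package exploits.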

  13. AN ANALYSIS OF SALES INFORMATION SYSTEM AND COMPETITIVE ADVANTAGE (Study Case of UD. Citra Helmet

    Directory of Open Access Journals (Sweden)

    Hendra Alianto

    2012-10-01

    Business development in this era of globalization leads companies to use information systems in running business relationships, changing the traditional way of working with non-integrated information systems into one based on integrated information systems. The use of an integrated information system improves effectiveness and efficiency, for example through the availability of real-time, accurate and informative data for decision-making, both for operational activities and for strategic interests and the company's business development. In particular, the application of a sales information system improves the company's performance and affects the company's competitiveness, which can ultimately maximize profit. In practice, however, implementing an integrated information system is not easy, because it is influenced by the customs, culture and mindset of the user company. It is therefore necessary to carry out system analysis and to build an integrated information system that addresses the needs of users, management, customers and stakeholders. Implementing an integrated information system increases productivity and achieves effectiveness and efficiency in the company's operations; the analysis of the sales information system shows its effect on the company's competitiveness. Keywords: Sales Information System Analysis

  14. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
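The three measures named in this record can be estimated directly from categorised model realisations. A minimal sketch, with hypothetical two-location data in place of the paper's subsurface model, might look like:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy H in bits of a discrete sample."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def conditional_entropy(xs, ys):
    """H(X|Y) = H(X,Y) - H(Y), in bits, from paired samples."""
    return entropy(list(zip(xs, ys))) - entropy(ys)

# Geological unit observed at two locations across 4 model realisations
# (hypothetical data): the two locations are perfectly correlated here.
loc_a = ['sand', 'sand', 'clay', 'clay']
loc_b = ['sand', 'sand', 'clay', 'clay']

# Mutual information I(A;B) = H(A) - H(A|B): knowing loc_b removes
# all uncertainty at loc_a, so I(A;B) equals the full 1 bit of H(A).
mi = entropy(loc_a) - conditional_entropy(loc_a, loc_b)
```

When the correlation is weaker, `mi` drops below `entropy(loc_a)`, quantifying exactly "where, and by how much" extra information at one location reduces uncertainty at another.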

  15. Research of Classical and Intelligent Information System Solutions for Criminal Intelligence Analysis

    OpenAIRE

    Šimović, Vladimir

    2001-01-01

    The objective of this study is to present research on classical and intelligent information system solutions used in criminal intelligence analysis in Croatian security system theory. The study analyses objective and classical methods of information science, including artificial intelligence and other scientific methods. The intelligence and classical software solutions researched, proposed, and presented in this study were used in developing the integrated information system for the Croatian...

  16. Analysis of data as information: quality assurance approach.

    Science.gov (United States)

    Ivankovic, D; Kern, J; Bartolic, A; Vuletic, S

    1993-01-01

    Describes a prototype module for data analysis in the healthcare delivery system. It consists of three main parts: data/variable selection; algorithms for the analysis of quantitative and qualitative changes in the system; and interpretation and explanation of the results. Such a module, designed for primary health care, has been installed on a PC in the health manager's office. Data enter the information system through standard DBMS procedures, followed by the calculation of a number of different indicators and of time series, as ordered sequences of indicators, according to the demands of the manager. The last procedure is "the change analysis", with estimation of unexpected differences between and within units, e.g. health-care teams, as well as unexpected variabilities and trends. As an example, presents and discusses the diagnostic pattern of neurotic cases, referral patterns and the preventive behaviour of GPs' teams.

  17. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of the progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth in the volume and accessibility of information sources, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  18. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    Science.gov (United States)

    Xin, Wang

    Based on an analysis of the shortcomings of existing systems, and following detailed investigation and research, an urban planning and management information system is designed as a three-tier architecture running over a LAN in client/server (C/S) mode. The system's functions are designed in accordance with the architectural requirements, together with the functional relationships between the modules; the relevant interfaces are analyzed and designed, and a data storage solution is proposed. The design provides a viable blueprint for building planning information systems for small and medium-sized cities.

  19. Fault Diagnosis Method Based on Information Entropy and Relative Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoming Xu

    2017-01-01

    In traditional principal component analysis (PCA), because the influence of dimensions (units of measurement) across different variables in the system is neglected, the selected principal components (PCs) often fail to be representative. While relative transformation PCA can solve this problem, it is not easy to calculate the weight for each characteristic variable. To address this, this paper proposes a fault diagnosis method based on information entropy and relative principal component analysis. First, the algorithm calculates the information entropy of each characteristic variable in the original dataset based on the information gain algorithm. Second, it standardizes the dimension of every variable in the dataset. Then, according to the information entropy, it allocates a weight to each standardized characteristic variable. Finally, it uses the established relative principal component model for fault diagnosis. Simulation experiments on the Tennessee Eastman process and Wine datasets demonstrate the feasibility and effectiveness of the new method.
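
    The entropy-weighting idea described in the abstract can be sketched compactly. The snippet below is a minimal illustration, not the authors' exact algorithm: it estimates each variable's entropy from a histogram, standardizes the variables, weights them by normalized entropy, and extracts principal components via SVD (the histogram binning and the random test data are our assumptions).

```python
import numpy as np

def entropy_weights(X, bins=10):
    # Estimate each variable's information entropy from a histogram,
    # then normalize the entropies into weights summing to 1.
    H = []
    for j in range(X.shape[1]):
        counts, _ = np.histogram(X[:, j], bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        H.append(-(p * np.log(p)).sum())
    H = np.asarray(H)
    return H / H.sum()

def entropy_weighted_pca(X, n_components=2, bins=10):
    # Standardize every variable (removing the influence of dimensions),
    # scale by the entropy weights, then compute PCs via SVD.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    W = Z * entropy_weights(X, bins)
    _, _, Vt = np.linalg.svd(W, full_matrices=False)
    return W @ Vt[:n_components].T  # scores of the leading PCs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) * [1.0, 10.0, 0.1, 5.0, 2.0]
scores = entropy_weighted_pca(X)
print(scores.shape)  # (200, 2)
```

    Standardizing before weighting is what removes the dimension effect the abstract refers to; the entropy weights then restore a data-driven notion of each variable's importance.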

  20. Investigation Into Informational Compatibility Of Building Information Modelling And Building Performance Analysis Software Solutions

    OpenAIRE

    Hyun, S.; Marjanovic-Halburd, L.; Raslan, R.

    2015-01-01

    There are significant opportunities for Building Information Modelling (BIM) to address issues related to sustainable and energy efficient building design. While the potential benefits associated with the integration of BIM and BPA (Building Performance Analysis) have been recognised, its specifications and formats remain in their early infancy and often fail to live up to the promise of seamless interoperability at various stages of design process. This paper conducts a case study to investi...

  1. Knowledge-based image analysis: some aspects on the analysis of images using other types of information

    Energy Technology Data Exchange (ETDEWEB)

    Eklundh, J O

    1982-01-01

    The computer vision approach to image analysis is discussed from two aspects. First, this approach is contrasted with the pattern recognition approach. Second, it is discussed how external knowledge, information and models from other fields of science and engineering can be used for image and scene analysis. In particular, the connections between computer vision and computer graphics are pointed out.

  2. Implementation of a drainage information, analysis and management system

    Directory of Open Access Journals (Sweden)

    J.N. Meegoda

    2017-04-01

    An integrated drainage information, analysis and management system (DIAMS) was developed and implemented for the New Jersey Department of Transportation (NJDOT). The purpose of DIAMS is to provide managers a useful tool to evaluate drainage infrastructure, to facilitate the determination of the present costs of preserving that infrastructure, and to make decisions regarding the optimal use of infrastructure budgets. The impetus for DIAMS was the culvert information management system (CIMS), developed to manage the data for culvert pipes. DIAMS maintains and summarizes accumulated inspection data for all types of drainage infrastructure assets, including pipes, inlet/outlet structures, outfalls and manufactured treatment devices. Its capabilities include identifying drainage infrastructure, maintaining inspection history, mapping locations, predicting service life based on current condition states, and assessing present asset value. It also includes unit cost values for 72 standard items to estimate the current cost of new assets, with the ability to adjust for future inflation. In addition, DIAMS contains several repair, rehabilitation and replacement options to remedy the drainage infrastructure. DIAMS can analyze asset information and support decisions to inspect, rehabilitate, replace or do nothing at the project and network levels by comparing costs with risks and failures. Costs may be optimized to meet annual maintenance budget allocations by prioritizing drainage infrastructure needing inspection, cleaning and repair. DIAMS functional modules include vendor data uploading, asset identification, system administration and financial analysis. Among its significant performance features is its proactive nature, which affords decision makers the means to conduct a comprehensive financial analysis, determine the optimal proactive schedule for proper maintenance actions, and prioritize them.
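
    As an illustration of the kind of budget-constrained prioritization described above, the sketch below ranks hypothetical assets by expected risk avoided per dollar of repair cost. The asset names, costs and probabilities are invented, and this greedy rule is our simplification, not NJDOT's actual DIAMS logic.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    repair_cost: float
    failure_prob: float         # estimated from the current condition state
    failure_consequence: float  # cost incurred if the asset fails

def prioritize(assets, budget):
    # Rank by expected risk avoided per dollar spent, then fund greedily.
    ranked = sorted(assets,
                    key=lambda a: a.failure_prob * a.failure_consequence / a.repair_cost,
                    reverse=True)
    plan, spent = [], 0.0
    for a in ranked:
        if spent + a.repair_cost <= budget:
            plan.append(a.name)
            spent += a.repair_cost
    return plan, spent

assets = [Asset("culvert-12", 40_000, 0.30, 500_000),
          Asset("outfall-3", 15_000, 0.10, 100_000),
          Asset("inlet-7", 5_000, 0.05, 20_000)]
plan, spent = prioritize(assets, budget=50_000)
print(plan, spent)  # the culvert dominates; the inlet fills the remaining budget
```

    A production system would replace the greedy step with a proper knapsack or life-cycle-cost optimization, but the cost-versus-risk comparison is the same in spirit.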

  3. Information Flow Analysis for Human-System Interaction in the SG Level Control

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Shin, Yeong Cheol

    2008-01-01

    Interaction between automatic control and operators is one of the main issues in the application of automation technology. Inappropriate information from automatic control systems causes unexpected problems in human-automation collaboration. Poor information becomes critical especially when the operator takes over control from an automation system: if the operator is out of the loop when the automatic control system fails, the situation transferred from the automatic mode cannot be handled properly because of inadequate situation awareness. Some cases of unplanned reactor trips during the transition between the manual mode and the automatic mode have been reported in nuclear power plants (NPPs). Among unplanned reactor trips since 2002, two cases were partially caused by automation-related failures of steam generator (SG) level control. This paper conducts an information flow analysis to identify information and control requirements for human-system interaction in SG level control. First, the paper identifies the level of automation in SG level control systems and the function allocation between system control and human operators. Then, information flow analysis for monitoring and transition of automation is performed by adapting the job process chart. The resulting information and control requirements will be useful as an input for the human-system interface (HSI) design of SG level control.

  4. Application of mathematical methods of analysis in selection of competing information technologies

    Science.gov (United States)

    Semenov, V. L.; Kadyshev, E. N.; Zakharova, A. N.; Patianova, A. O.; Dulina, G. S.

    2018-05-01

    The article discusses the use of qualimetry methods, employing the apparatus of mathematical analysis, in the formation of an integral index that allows one to select the best option among competing information technologies. The authors propose the use of affine space in the evaluation and selection of competing information technologies.

  5. A comparative study of information diffusion in weblogs and microblogs based on social network analysis

    Institute of Scientific and Technical Information of China (English)

    Yang ZHANG; Wanyang LING

    2012-01-01

    Purpose: This paper intends to explore a quantitative method for investigating the characteristics of information diffusion through social media like weblogs and microblogs. By using social network analysis methods, we attempt to analyze the different characteristics of information diffusion in weblogs and microblogs, as well as the possible reasons for these differences. Design/methodology/approach: Using social network analysis methods, this paper carries out an empirical study taking the Chinese weblogs and microblogs in the field of Library and Information Science (LIS) as the research sample and employing measures such as network density, core/periphery structure and centrality. Findings: Firstly, both bloggers and microbloggers maintain weak ties, and both of their social networks display a small-world effect. Secondly, compared with weblog users, microblog users are more interconnected, more equal and more capable of developing relationships with people outside their own social networks. Thirdly, the microblogging social network is more conducive to information diffusion than the blogging network, because of their differences in functions and the information flow mechanism. Finally, the communication mode that emerged with microblogging, with its characteristics of micro-content, multi-channel information dissemination, a dense and decentralized social network, and content aggregation, will be one of the trends in the development of information exchange platforms in the future. Research limitations: The sample size needs to be increased so that samples are more representative. Errors may exist during the data collection. Moreover, the individual-level characteristics of the samples as well as the types of information exchanged need to be further studied. Practical implications: This preliminary study explores the characteristics of information diffusion in the network environment and verifies the feasibility of conducting a quantitative analysis of information diffusion through social
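
    Two of the measures employed in the study, network density and degree centrality, are simple enough to compute by hand. The directed toy graph below is our own illustration, not the paper's data:

```python
# Directed "who follows whom" ties among four hypothetical users.
edges = {("A", "B"), ("B", "A"), ("A", "C"),
         ("C", "D"), ("D", "A"), ("B", "D")}
nodes = {v for e in edges for v in e}
n = len(nodes)

# Density: share of the n*(n-1) possible directed ties that are present.
density = len(edges) / (n * (n - 1))

# Degree centrality (Freeman normalization): ties touching a node / (n - 1).
degree = {v: sum(v in e for e in edges) for v in nodes}
centrality = {v: d / (n - 1) for v, d in degree.items()}

print(density)                              # 0.5 for this toy graph
print(max(centrality, key=centrality.get))  # "A" is the most central user
```

    The study's core/periphery and small-world diagnostics build on exactly these primitives, applied to the full blogger and microblogger networks.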

  7. PACC information management code for common cause failures analysis

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Garcia Gay, J.; Mira McWilliams, J.

    1987-01-01

    The purpose of this paper is to present the PACC code which, through adequate data management, eases the task of computerized common-cause failure analysis. PACC processes and generates the information needed to carry out the corresponding qualitative analysis, by means of the Boolean technique of transformation of variables, and the quantitative analysis, either using one of several parametric methods or a direct database. For the qualitative analysis, the code creates several functional forms of the transformation equations according to the user's choice. These equations are subsequently processed by Boolean manipulation codes, such as SETS. The quantitative calculations can be carried out in two different ways: either starting from a common-cause database, or through parametric methods such as the Binomial Failure Rate method, the Basic Parameters method or the Multiple Greek Letter method, among others. (orig.)

  8. Information architecture: study and analysis of the Public Medical (PubMed) database

    Directory of Open Access Journals (Sweden)

    Odete Máyra Mesquita Sales

    2016-07-01

    Objective. Based on the principles proposed by Rosenfeld and Morville (2006), the present study examined the PubMed database interface, since a well-structured information architecture contributes to good usability in any digital environment. Method. The research was developed through literature review and an empirical study analyzing the information architecture of PubMed against the organization, navigation, labeling and search systems recommended by Rosenfeld and Morville (2006). For a better understanding and description of these principles, the technique of content analysis was used. Results. The results showed that the database interface meets the criteria established by the elements of information architecture, such as an organization based on a hypertext structure, a horizontal menu with local content divided into categories, identification of active links, global navigation, breadcrumbs, textual and iconographic labeling, and a prominent search engine. Conclusions. This research showed that the PubMed database interface is well structured, friendly and objective, offering numerous possibilities for search and information retrieval. However, the website needs to adopt accessibility standards so that it can more efficiently achieve its purpose of facilitating access to the information organized and stored in the PubMed database.

  9. Reducing the information load in map animations as a tool for exploratory analysis

    OpenAIRE

    Multimäki, Salla

    2016-01-01

    This dissertation investigates the information load that animated maps cause to their viewers, and presents two novel visualisation methods to support the exploratory visual analysis of the animations. Information load consists of the information content of the map and its presentation. The number of objects and their attributes are the unavoidable content, but the visualisation of the objects, the background map, and display settings of an animation have an effect on the information load and...

  10. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    Objective: To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. Design: An analytical review of the availability and quality of health information data sources in countries, drawing on experience, observations, the literature and contributions from countries. Setting: Forty-six Member States of the WHO African Region. Participants: None. Main outcome measures: The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. Results: In almost all countries of the Region, there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. Conclusions: A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  11. REAL-TIME ENERGY INFORMATION AND CONSUMER BEHAVIOR: A META-ANALYSIS AND FORECAST

    Science.gov (United States)

    The meta-analysis of literature and program results will shed light on potential causes of study-to-study variation in information feedback programs and trials. Outputs from the meta-analysis, such as price elasticity, will be used in NEMS to estimate the impact of a nation...

  12. Deprival value: information utility analysis

    Directory of Open Access Journals (Sweden)

    Marco Antonio Pereira

    This article contributes to the perception that the users' learning process plays a key role in applying an accounting concept, and that this requires a presentation that fits the concept's informative potential, free of previous accounting fixations. Deprival value is a useful measure for managerial and corporate purposes and may be applied within the current Conceptual Framework of the International Accounting Standards Board (IASB). This study analyzes its utility, taking cognitive aspects into account. Also known as value to the business, deprival value is a measurement system that followed a path where it was misunderstood, confused with other concepts, faced resistance to implementation and fell into disuse; everything that a standardized measurement method tries to avoid. In contrast, deprival value has found support in academia and in specific applications, such as those related to public service regulation. The accounting area has been impacted by the increasing sophistication of measurement methods, which require the ability to analyze accounting facts on an economic basis, at the risk of losing their information content. Such development becomes possible only when the potential of a measurement system is known and feasible to achieve. This study consists of a theoretical essay based on a literature review that discusses the concept's origin, presentation and application. Considering the concept's cognitive difficulties, deprival value and its corresponding heteronym, value to the business, were analyzed in order to explain some of these changes. The concept's utility was also explored through a cross-analysis with impairment, and the scheme developed was applied to actual economic situations faced by a company listed on a stock exchange.
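
    The textbook deprival value rule (value to the business is the lower of replacement cost and recoverable amount, where recoverable amount is the higher of value in use and net realizable value) can be written in one line. The figures below are invented for illustration:

```python
def deprival_value(replacement_cost, value_in_use, net_realizable_value):
    # Lower of replacement cost and the recoverable amount
    # (the higher of value in use and net realizable value).
    return min(replacement_cost, max(value_in_use, net_realizable_value))

# An asset costing 100 to replace, worth 120 in continued use, resellable for 90:
print(deprival_value(100, 120, 90))  # 100: if deprived, the firm would replace it
# The same asset with a value in use of only 80:
print(deprival_value(100, 80, 90))   # 90: selling beats using, so replacement is not worthwhile
```

    The min/max structure is what makes the measure decision-relevant: it reproduces what a rational manager would actually lose if deprived of the asset.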

  13. Analysis of the Effect of Information System Quality on Intention to Reuse of the Employee Management Information System (SIMPEG) Based on the Information Systems Success Model

    Directory of Open Access Journals (Sweden)

    Suryanto Tri Lathif Mardi

    2016-01-01

    This study examines the effect of Information Quality, System Quality and Service Quality on users' intention to reuse the Employee Management Information System (SIMPEG) at universities in the city of Surabaya, based on the theoretical foundation of the DeLone and McLean Information Systems Success (ISS) model. Questionnaires were distributed to 120 employees of different universities by means of stratified random sampling. The results showed that: (1) there is a significant positive effect of System Quality on Information Quality; (2) there is a significant positive effect of Information Quality on the Intention to Reuse, related to the fulfillment of the user's needs; (3) there is a significant positive effect of System Quality on the Intention to Reuse, related to the fulfillment of the needs of users; (4) there is no effect of Service Quality on the Intention to Reuse. Finally, the results of this study provide analysis and advice to university officials that can be used as a consideration for information technology/information system investment and development in accordance with the Information Systems Success and Intention to Reuse model.

  14. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  15. A risk-informed perspective on deterministic safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Wan, P.T.

    2009-01-01

    In this work, the deterministic safety analysis (DSA) approach to nuclear safety is examined from a risk-informed perspective. One objective of safety analysis of a nuclear power plant is to demonstrate via analysis that the risks to the public from events or accidents that are within the design basis of the power plant are within acceptable levels with a high degree of assurance. This nuclear safety analysis objective can be translated into two requirements on the risk estimates of design basis events or accidents: the nominal risk estimate to the public must be shown to be within acceptable levels, and the uncertainty in the risk estimates must be shown to be small on an absolute or relative basis. The DSA approach combined with the defense-in-depth (DID) principle is a simplified safety analysis approach that attempts to achieve the above safety analysis objective in the face of potentially large uncertainties in the risk estimates of a nuclear power plant by treating the various uncertainty contributors using a stylized conservative binary (yes-no) approach, and applying multiple overlapping physical barriers and defense levels to protect against the release of radioactivity from the reactor. It is shown that by focusing on the consequence aspect of risk, the previous two nuclear safety analysis requirements on risk can be satisfied with the DSA-DID approach to nuclear safety. It is also shown that the use of multiple overlapping physical barriers and defense levels in the traditional DSA-DID approach to nuclear safety is risk-informed in the sense that it provides a consistently high level of confidence in the validity of the safety analysis results for various design basis events or accidents with a wide range of frequency of occurrence. It is hoped that by providing a linkage between the consequence analysis approach in DSA and a risk-informed perspective, greater understanding of the limitations and capabilities of the DSA approach is obtained. (author)

  16. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  17. Analysis of high-throughput plant image data with the information system IAP

    Directory of Open Access Journals (Sweden)

    Klukas Christian

    2012-06-01

    This work presents a sophisticated information system, the Integrated Analysis Platform (IAP), which supports large-scale image analysis for different species and imaging systems. In its current form, IAP supports the investigation of maize, barley and Arabidopsis plants based on images obtained in different spectra.

  18. Information seeking for making evidence-informed decisions: a social network analysis on the staff of a public health department in Canada

    Directory of Open Access Journals (Sweden)

    Yousefi-Nooraie Reza

    2012-05-01

    Background: Social network analysis is an approach to study the interactions and exchange of resources among people. It can help in understanding the underlying structural and behavioral complexities that influence the process of capacity building towards evidence-informed decision making. A social network analysis was conducted to understand if and how the staff of a public health department in Ontario turn to peers to get help incorporating research evidence into practice. Methods: The staff were invited to respond to an online questionnaire inquiring about information seeking behavior, identification of colleague expertise, and friendship status. Three networks were developed based on the 170 participants. Overall shape, key indices, the most central people and brokers, and their characteristics were identified. Results: The network analysis showed a low-density and localized information-seeking network. Interpersonal connections were mainly clustered by organizational divisions, and people tended to limit information-seeking connections to a handful of peers in their division. However, the recognition-of-expertise and friendship networks showed more cross-divisional connections. Members of the office of the Medical Officer of Health were located at the heart of the department, bridging across divisions. A small group of professional consultants and middle managers were the most central staff in the network, also connecting their divisions to the center of the information-seeking network. In each division, there were some locally central staff, mainly practitioners, who connected their neighboring peers; but they were not necessarily connected to other experts or managers. Conclusions: The methods of social network analysis were useful in providing a systems approach to understand how knowledge might flow in an organization. The findings of this study can be used to identify early adopters of knowledge translation interventions, forming

  20. Demand Analysis of Logistics Information Matching Platform: A Survey from Highway Freight Market in Zhejiang Province

    Science.gov (United States)

    Chen, Daqiang; Shen, Xiahong; Tong, Bing; Zhu, Xiaoxiao; Feng, Tao

    With increasing competition in the logistics industry and mounting pressure to lower logistics costs, the construction of a logistics information matching platform for highway transportation plays an important role, and the accuracy of the platform design is key to its successful operation. Based on survey results from logistics service providers, customers and regulatory authorities on access to information, and on an in-depth analysis of the information demands on a logistics information matching platform for highway transportation in Zhejiang province, an analysis of the framework for such a platform is provided.

  1. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  2. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus the multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we considered the Analytic Hierarchy Process (AHP), which is well suited to hierarchical data structures (e.g., those formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited to the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods were not considered valuable candidates.
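TOPSIS, the first method the paper examines, ranks alternatives by their closeness to an ideal solution. A minimal sketch with NumPy follows; the decision matrix, weights, and criteria names are invented for illustration.

```python
# Minimal TOPSIS sketch: normalise the decision matrix, weight it,
# then score each alternative by its relative closeness to the ideal point.
import numpy as np

def topsis(matrix, weights, benefit):
    # matrix: alternatives x criteria; benefit[j] is True if larger is better
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector normalisation
    v = m * weights                               # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)     # distance to ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                # closeness: higher is better

# three hypothetical software candidates scored on (speed, cost, accuracy)
scores = np.array([[7.0, 120.0, 0.90],
                   [9.0, 200.0, 0.85],
                   [6.0,  80.0, 0.95]])
closeness = topsis(scores, np.array([0.4, 0.2, 0.4]),
                   np.array([True, False, True]))  # cost: smaller is better
print(np.argsort(closeness)[::-1])                 # ranking, best first
```

The closeness coefficient lies in (0, 1), so candidates can be ranked directly without pairwise comparisons, which is one reason TOPSIS is attractive next to AHP.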

  3. Multihop Capability Analysis in Wireless Information and Power Transfer Multirelay Cooperative Networks

    Directory of Open Access Journals (Sweden)

    Qilin Wu

    2018-01-01

    Full Text Available We study simultaneous wireless information and power transfer (SWIPT) in multihop wireless cooperative networks, where the multihop capability, defined as the largest number of transmission hops, is investigated. By utilizing the broadcast nature of multihop wireless networks, we first propose a cooperative forwarding power (CFP) scheme. In the CFP scheme, the multiple relays and the receiver have distinctly different tasks. Specifically, multiple relays close to the transmitter first harvest power from the transmitter and then cooperatively forward the power (not the information) towards the receiver. The receiver first receives the information (not the power) from the transmitter, then harvests power from the relays, and is taken as the transmitter of the next hop. Furthermore, for performance comparison, we suggest two schemes: cooperative forwarding information and power (CFIP) and direct receiving information and power (DFIP). Also, we construct an analysis model to investigate the multihop capabilities of the CFP, CFIP, and DFIP schemes under a given targeted throughput requirement. Finally, simulation results validate the analysis model and show that the multihop capability of CFP is better than those of CFIP and DFIP, and that for improving multihop capability, it is most effective to increase the average number of relay nodes in the cooperative set.

  4. Evidence-based health information from the users' perspective--a qualitative analysis.

    Science.gov (United States)

    Hirschberg, Irene; Seidel, Gabriele; Strech, Daniel; Bastian, Hilda; Dierks, Marie-Luise

    2013-10-10

    Evidence-based information is a precondition for informed decision-making and participation in health. Several recommendations and definitions are available on the generation and assessment of so-called evidence-based health information for patients and consumers (EBHI). They stress the importance of objectively informing people about the benefits, harms and any uncertainties of health-related procedures. There are also studies on the comprehensibility, relevance and user-friendliness of these informational materials. To date, however, there has been little research on the perceptions and cognitive reactions of users or lay people towards EBHI. The aim of our study is to define the spectrum of consumers' reaction patterns to written EBHI in order to gain a deeper understanding of their comprehension and assumptions, as well as their informational needs and expectations. This study is based on an external user evaluation, commissioned by the German Institute for Quality and Efficiency in Health Care (IQWiG), of EBHI produced by that institute. The EBHI were examined in guided group discussions carried out with lay people. The test readers' first impressions and their appraisal of the informational content, presentation, structure, comprehensibility and effect were gathered. A qualitative text analysis of 25 discussion transcripts involving 94 test readers was then performed. Based on the qualitative text analysis, a framework of reaction patterns was developed, comprising eight main categories: (i) interest, (ii) satisfaction, (iii) reassurance and trust, (iv) activation, (v) disinterest, (vi) dissatisfaction and disappointment, (vii) anxiety and worry, (viii) doubt. Many lay people are unfamiliar with the core characteristics of this special type of information. Two particularly critical issues are the description of insufficient evidence and the attendant absence of clear-cut recommendations. Further research is needed to examine strategies to explain the specific

  5. Signal correlations in biomass combustion. An information theoretic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruusunen, M.

    2013-09-01

    Increasing environmental and economic awareness are driving the development of combustion technologies towards efficient biomass use and clean burning. To accomplish these goals, quantitative information about combustion variables is needed. However, for small-scale combustion units the existing monitoring methods are often expensive or complex. This study aimed to quantify correlations between flue gas temperatures and combustion variables, namely typical emission components, heat output, and efficiency. For this, data acquired from four small-scale combustion units and a large circulating fluidised bed boiler were studied. The original signals and a defined set of their mathematical transformations were applied to the data analysis. In order to evaluate the strength of the correlations, a multivariate distance measure based on information theory was derived. The analysis further assessed time-varying signal correlations and relative time delays. Ranking of the analysis results was based on the distance measure. The uniformity of the correlations across the different data sets was studied by comparing the 10-quantiles of the measured signals. The method was validated with two benchmark data sets. The flue gas temperatures and the measured combustion variables carried similar information. The strongest correlations were mainly linear with the transformed signal combinations and explicable by combustion theory. Remarkably, the results showed uniformity of the correlations across the data sets with several signal transformations. This was also indicated by simulations using a linear model with a constant structure to monitor carbon dioxide in flue gas. Acceptable performance was observed according to the three validation criteria used to quantify modelling error in each data set.
In general, the findings demonstrate that the presented signal transformations enable real-time approximation of the studied
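An entropy-based distance between two signals of the kind described above can be sketched with a histogram estimate of mutual information; the specific form d(X, Y) = 1 - I(X; Y) / H(X, Y), the signal names, and the synthetic data are our assumptions for illustration, not the thesis' exact definition.

```python
# Histogram-based information-theoretic distance between two signals:
# 0 for strongly dependent signals, near 1 for independent ones.
import numpy as np

def info_distance(x, y, bins=16):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()                 # joint probability estimate
    px, py = p.sum(axis=1), p.sum(axis=0)   # marginals
    nz = p > 0
    h_xy = -np.sum(p[nz] * np.log(p[nz]))   # joint entropy H(X,Y)
    mi = np.sum(p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz]))
    return 1.0 - mi / h_xy                  # 1 - I(X;Y)/H(X,Y), in [0, 1]

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
flue_gas_temp = np.sin(t) + 0.1 * rng.normal(size=t.size)
co2 = np.sin(t) ** 2 + 0.1 * rng.normal(size=t.size)  # nonlinearly related
noise = rng.normal(size=t.size)

print(info_distance(flue_gas_temp, co2))    # lower: signals share information
print(info_distance(flue_gas_temp, noise))  # near 1: independent
```

Because mutual information captures nonlinear association, the distance stays small for the squared-sine relationship where plain linear correlation would be weak.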

  6. ANALYSIS OF FOREIGN EXPERIENCE OF SYSTEMIC DEVELOPMENT OF FUTURE SOCIAL PEDAGOGISTS’ INFORMATIONAL CULTURE

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Ratsul

    2014-10-01

    Full Text Available The article deals with the analysis of foreign experience of the systemic development of future social pedagogists’ informational culture. A number of cultural universals are identified, each of which is treated as the core of culture. A list of the components of future social pedagogists’ information culture is given. Personality traits that enable future social pedagogists to participate effectively in all kinds of work with information are characterized. Two structural levels (contents and functions) in future social pedagogists’ information culture are singled out. The main functions of future social pedagogists’ information culture are defined. The structural organization of future social pedagogists’ information culture is analyzed.

  7. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents an instrumental research study of a powerful multivariate data analysis method which researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the capability of Principal Component Analysis (PCA) to reduce a number of variables that could be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
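The PCA step described above can be sketched with scikit-learn: collinear survey variables are replaced by a few uncorrelated components. The synthetic "marketing survey" variables below are invented for illustration.

```python
# PCA on collinear survey variables: two correlated items collapse into
# one dominant component, and the resulting components are uncorrelated.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
satisfaction = rng.normal(5, 1, size=200)
loyalty = 0.8 * satisfaction + rng.normal(0, 0.3, size=200)  # collinear pair
price_sense = rng.normal(3, 1, size=200)                     # independent item
X = np.column_stack([satisfaction, loyalty, price_sense])

pca = PCA(n_components=2)
# standardise first, as is usual when survey items have different scales
components = pca.fit_transform((X - X.mean(axis=0)) / X.std(axis=0))

print(pca.explained_variance_ratio_)     # first component dominates
corr = np.corrcoef(components.T)[0, 1]
print(round(corr, 6))                    # components are uncorrelated
```

Feeding the uncorrelated component scores into a subsequent regression or segmentation avoids the multicollinearity problem the paper warns about.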

  8. Concept similarity and related categories in information retrieval using formal concept analysis

    Science.gov (United States)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown to be useful, but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours, representing more general and more specific concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis, with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
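The formal concept "derived from the initial query" comes from the two derivation operators of formal concept analysis. A minimal pure-Python sketch follows; the tiny context (pages and the terms they contain) is invented for illustration and is not SearchSleuth's actual data model.

```python
# The two derivation (prime) operators of formal concept analysis,
# used to compute the formal concept for a set of query terms.

# context: object -> set of attributes (here: result page -> terms it contains)
context = {
    "page1": {"lattice", "concept", "search"},
    "page2": {"lattice", "concept"},
    "page3": {"concept", "search", "ranking"},
    "page4": {"ranking", "search"},
}

def extent(attrs):
    """Objects having all attributes in `attrs` (the operator A')."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objects):
    """Attributes shared by all objects in `objects` (the operator B')."""
    if not objects:
        return set.union(*context.values())  # empty extent: all attributes
    return set.intersection(*(context[o] for o in objects))

query = {"concept"}
objs = extent(query)    # all pages matching the query terms
closed = intent(objs)   # closure: terms common to all those pages
print(sorted(objs), sorted(closed))  # the formal concept (extent, intent)
```

Applying `intent` after `extent` is the closure operator mentioned in FCA: the pair (extent, intent) it produces is the formal concept from which SearchSleuth's upper, lower, and sibling neighbours are then computed.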

  9. The Readability of Electronic Cigarette Health Information and Advice: A Quantitative Analysis of Web-Based Information.

    Science.gov (United States)

    Park, Albert; Zhu, Shu-Hong; Conway, Mike

    2017-01-06

    The popularity and use of electronic cigarettes (e-cigarettes) have increased across all demographic groups in recent years. However, little is currently known about the readability of health information and advice aimed at the general public regarding the use of e-cigarettes. The objective of our study was to examine the readability of publicly available health information and advice on e-cigarettes. We compared information and advice available from US government agencies, nongovernment organizations, English-speaking government agencies outside the United States, and for-profit entities. A systematic search for health information and advice on e-cigarettes was conducted using search engines. We manually verified the search results and converted them to plain text for analysis. We then assessed the readability of the collected documents using 4 readability metrics, followed by pairwise comparisons of groups with adjustment for multiple comparisons. A total of 54 documents were collected for this study. All 4 readability metrics indicate that all information and advice on e-cigarette use is written at a level higher than that recommended for the general public by National Institutes of Health (NIH) communication guidelines. However, health information and advice written by for-profit entities, many of which were promoting e-cigarettes, were significantly easier to read. A substantial proportion of potential and current e-cigarette users are likely to have difficulty in fully comprehending Web-based health information regarding e-cigarettes, potentially hindering effective health-seeking behaviors. To comply with NIH communication guidelines, government entities and nongovernment organizations would benefit from improving the readability of e-cigarette information and advice. ©Albert Park, Shu-Hong Zhu, Mike Conway. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.01.2017.

  10. Regression analysis of informative current status data with the additive hazards model.

    Science.gov (United States)

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models when the censoring is noninformative, and there also exists a large literature on parametric analysis of informative current status data in the context of tumorigenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented in which a copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved, and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
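The two modelling ingredients named above can be written out explicitly; the notation below is a standard formulation, our assumption rather than necessarily the paper's exact one. The additive hazards model lets covariates shift a baseline hazard additively, and the informative censoring is handled by tying the failure and censoring times together through a copula:

```latex
% additive hazards model: covariates Z(t) shift the baseline hazard \lambda_0
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z(t)

% informative censoring: the joint survival function of failure time T and
% censoring (observation) time C is linked by a copula C_\alpha
\Pr(T > t,\ C > c) = \mathcal{C}_\alpha\bigl(S_T(t),\, S_C(c)\bigr)
```

When the copula is the independence copula, the product form recovers the usual noninformative-censoring setting, which is why the copula parameter captures the "informativeness" of the censoring.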

  11. Applying a sociolinguistic model to the analysis of informed consent documents.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document, because they may sign it without adequate understanding and information, and consequently feel deprived of the freedom to make their own choice about different treatments or surgery. This article discusses findings from a documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws in how information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  12. Patient information on breast reconstruction in the era of the world wide web. A snapshot analysis of information available on youtube.com.

    Science.gov (United States)

    Tan, M L H; Kok, K; Ganesh, V; Thomas, S S

    2014-02-01

    Breast cancer patients' expectations and choices regarding reconstruction are increasing, and patients often satisfy their information needs outside clinic time by searching the world wide web. The aim of our study was to analyse the quality of content and extent of information regarding breast reconstruction available in YouTube videos, and whether this is an appropriate additional source of information for patients. A snapshot qualitative and quantitative analysis of the first 100 videos was performed after the term 'breast reconstruction' was entered into the search window of the video-sharing website www.youtube.com on the 1st of September 2011. Qualitative categorical analysis included patient, oncological and reconstruction factors. It was concluded that although videos uploaded onto YouTube do not provide comprehensive information, the site is a useful resource that can be utilised in patient education provided comprehensive and validated videos are made available. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes a SWOT analysis and its use in analysing the strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is, furthermore, to show the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting the existing data in the database and the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering it himself/herself.

  14. Analysis Of Factors Affecting The Success Of The Application Of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Deni Iskandar

    2015-02-01

    Full Text Available Abstract The purpose of this study was to find solutions to problems related to the quality of accounting information systems and the quality of accounting information, in connection with management commitment, user competency and organizational culture. This research was conducted through deductive analysis supported by the observed phenomenon, and then sought evidence through empirical facts, especially about the effect of management commitment, user competence and organizational culture on the quality of accounting information systems and their impact on the quality of accounting information. This research was conducted at State-Owned Enterprises (SOEs).

  15. Marketing Information: A Competitive Analysis

    OpenAIRE

    Miklos Sarvary; Philip M. Parker

    1997-01-01

    Selling information that is later used in decision making constitutes an increasingly important business in modern economies (Jensen [Jensen, Fred O. 1991. Information services. Congram, Friedman, eds., Chapter 22. AMACOM, New York, 423–443.]). Information is sold under a large variety of forms: industry reports, consulting services, database access, and/or professional opinions given by medical, engineering, accounting/financial, and legal professionals, among others. This paper is the fir...

  16. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    OpenAIRE

    Chahinez Benkoussas; Patrice Bellot

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval ...
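PageRank, the graph-analysis algorithm the abstract mentions, can be sketched with a short power iteration; the toy "book link" graph below is invented for illustration and is not the paper's data.

```python
# Power-iteration sketch of PageRank: r = d*M*r + (1-d)/n,
# where M is the column-stochastic link matrix and d the damping factor.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}  # book -> books it points to
n, d = 4, 0.85

M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1.0 / len(outs)        # column-stochastic transitions

rank = np.full(n, 1.0 / n)                   # start from a uniform score
for _ in range(100):
    rank = d * M @ rank + (1 - d) / n        # iterate to the fixed point
print(rank.argmax())                         # most authoritative node
```

In a retrieval setting, such graph scores are typically interpolated with the text-based retrieval-model scores, which is the kind of combination the paper tests.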

  17. Formal Concept Analysis and Information Retrieval – A Survey

    OpenAIRE

    Codocedo , Victor; Napoli , Amedeo

    2015-01-01

    One of the first models to be proposed as a document index for retrieval purposes was a lattice structure, decades before the introduction of Formal Concept Analysis. Nevertheless, the main notions that we consider so familiar within the community ("extension", "intension", "closure operators", "order") were already an important part of it. In the '90s, as FCA was starting to settle as an epistemic community, lattice-based Information Retrieval (IR) systems ...

  18. Crime Mapping and Geographical Information Systems in Crime Analysis

    OpenAIRE

    Dağlar, Murat; Argun, Uğur

    2016-01-01

    As essential tools in crime analysis, crime mapping and Geographical Information Systems (GIS) are being progressively more accepted by police agencies. Developments in technology and the accessibility of geographic data sources make it feasible for police departments to use GIS and crime mapping. GIS and crime mapping can be utilized as devices to discover factors contributing to crime, and hence let law enforcement agencies proactively take action against crime problems before they b...

  19. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  20. The information needs and behaviour of clinical researchers: a user-needs analysis.

    Science.gov (United States)

    Korjonen-Close, Helena

    2005-06-01

    As part of the strategy to set up a new information service, including a physical Resource Centre, an analysis of the information needs of professionals involved with clinical research and development in the UK and Europe was required. It also aimed to identify differences in requirements between the various professional roles and to establish what information resources are currently used. An online user-needs survey of the members of The Institute was conducted, along with group discussions with specialist subcommittees of members. Two hundred and ninety members responded to the 20-question online survey, a response rate of 7.9%. Members expressed a lack of information in their particular professional area, and lack the skills to retrieve and appraise information. The results of the survey are discussed in more detail, giving indications of what the information service should collect, what types of materials should be provided to members and what services should be on offer. These were developed from the results of the needs analysis and submitted to management for approval. Issues of concern, such as financial and staff constraints, are also discussed. There is an opportunity to build a unique collection of clinical research material, which will promote The Institute not only to members, but also to the wider health sector. Members stated that most physical medical libraries do not provide what they need, but the main finding from the survey and discussions is that it is pointless to set up 'yet another medical library'.

  1. Quality analysis of patient information about knee arthroscopy on the World Wide Web.

    Science.gov (United States)

    Sambandam, Senthil Nathan; Ramasamy, Vijayaraj; Priyanka, Priyanka; Ilango, Balakrishnan

    2007-05-01

    This study was designed to ascertain the quality of patient information available on the World Wide Web on the topic of knee arthroscopy. For the purpose of quality analysis, we used a pool of 232 search results obtained from 7 different search engines. We used a modified assessment questionnaire to assess the quality of these Web sites. This questionnaire was developed based on similar studies evaluating Web site quality and includes items on illustrations, accessibility, availability, accountability, and content of the Web site. We also compared the results obtained with different search engines and tried to establish the best possible search strategy to attain the most relevant, authentic, and adequate information with minimum time consumption. For this purpose, we first compared 100 search results from the single most commonly used search engine (AltaVista) with the pooled sample containing 20 search results from each of the 7 different search engines. The search engines used were metasearch (Copernic and Mamma), general search (Google, AltaVista, and Yahoo), and health topic-related search engines (MedHunt and Healthfinder). The phrase "knee arthroscopy" was used as the search terminology. Excluding repetitions, 117 Web sites were available for quality analysis. These sites were analyzed for accessibility, relevance, authenticity, adequacy, and accountability using a specially designed questionnaire. Our analysis showed that most of the sites providing patient information on knee arthroscopy contained outdated information, were inadequate, and were not accountable. Only 16 sites were found to provide reasonably good patient information and hence can be recommended to patients. Understandably, most of these sites were from nonprofit organizations and educational institutions. Furthermore, our study revealed that using multiple search engines increases patients' chances of obtaining relevant information compared with using a single search engine.

  2. The Impact of the Introduction of Web Information Systems (WIS) on Information Policies: An Analysis of the Canadian Federal Government Policies Related to WIS.

    Science.gov (United States)

    Dufour, Christine; Bergeron, Pierette

    2002-01-01

    Presents results of an analysis of the Canadian federal government information policies that govern its Web information systems (WIS), conducted to better understand how the government has adapted its information policies to the WIS. Discusses results indicating that new policies have been crafted to take into account the WIS context.…

  3. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    Science.gov (United States)

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms had so far been tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant
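The limited-sampling bias mentioned above can be illustrated with the simplest correction in the literature, the Miller-Madow adjustment to the plug-in entropy estimate; the toolbox described in the paper implements more sophisticated corrections, so this sketch only conveys the idea, and the synthetic "LFP" data is our invention.

```python
# Plug-in (histogram) entropy underestimates true entropy for short
# recordings; the Miller-Madow term (K-1)/(2N ln 2) bits compensates,
# where K is the number of occupied bins and N the sample count.
import numpy as np

def plugin_entropy(samples, bins):
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p)), np.count_nonzero(counts)

def miller_madow(samples, bins):
    h, k = plugin_entropy(samples, bins)
    return h + (k - 1) / (2 * np.log(2) * len(samples))  # bias term, in bits

rng = np.random.default_rng(1)
lfp = rng.normal(size=200)  # short recording: plug-in underestimates H
h_plugin, _ = plugin_entropy(lfp, bins=32)
h_corrected = miller_madow(lfp, bins=32)
print(h_plugin < h_corrected)  # the correction always raises the estimate
```

The bias shrinks as 1/N, which is why such corrections matter most for exactly the limited-data regime the paper targets.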

  4. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Full Text Available Abstract Background Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms had so far been tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. 
Conclusion The new toolbox presented here implements fast
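
    As a hedged illustration of the bias problem such toolboxes address, the following sketch computes a plug-in mutual information estimate from discretized responses and applies the classical Miller-Madow correction. The toolbox's own estimators and corrections are more sophisticated; the function name and interface here are ours.

    ```python
    import numpy as np

    def plugin_mi(x, y, bias_correct=True):
        """Plug-in mutual information (bits) between two discrete sequences.
        Optionally applies the Miller-Madow correction: plug-in entropy
        underestimates true entropy by roughly (K - 1) / (2 N ln 2) bits
        for K occupied bins and N samples."""
        x, y = np.asarray(x), np.asarray(y)
        n = len(x)
        xs, xi = np.unique(x, return_inverse=True)
        ys, yi = np.unique(y, return_inverse=True)
        joint = np.zeros((len(xs), len(ys)))
        np.add.at(joint, (xi, yi), 1)            # joint histogram
        pxy = joint / n
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        mi = np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))
        if bias_correct:
            kx, ky, kxy = len(xs), len(ys), int(nz.sum())
            # MI = H(X) + H(Y) - H(X,Y); combine the per-entropy corrections
            mi += ((kx - 1) + (ky - 1) - (kxy - 1)) / (2.0 * n * np.log(2))
        return mi
    ```

    With limited samples the uncorrected estimate is systematically biased upward, which is exactly the limited-sampling bias discussed in the abstract.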

  5. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Building on the advancements made during previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  6. Scientific and technological information: analysis of periodic publications of information science

    OpenAIRE

    Mayara Cintya do Nascimento Vasconcelos; Gabriela Belmont de Farias

    2017-01-01

    The research analyzes the articles published in national scientific journals of the area of Information Science, classified with Qualis A1, having as parameter the term "scientific and technological information". It presents concepts about scientific and technological information and the processes that involve its uses, as well as scientific communication, information flows and sources of information. The methodology used is a descriptive study with a quantitative-qualitative approach, using ...

  7. Health Information Needs and Health Seeking Behavior During the 2014-2016 Ebola Outbreak: A Twitter Content Analysis.

    Science.gov (United States)

    Odlum, Michelle; Yoon, Sunmoo

    2018-03-23

    For effective public communication during major disease outbreaks like the 2014-2016 Ebola epidemic, the health information needs of the population must be adequately assessed. Through content analysis of social media data, like tweets, public health information needs can be effectively assessed and, in turn, appropriate health information can be provided to address such needs. The aim of the current study was to assess health information needs about Ebola, at distinct epidemic time points, through longitudinal tracking. Natural language processing was applied to explore public response to Ebola over time from July 2014 to March 2015. A total of 155,647 tweets (68,736 unique, 86,911 retweets) mentioning Ebola were analyzed and visualized with infographics. Public fear, frustration, and health information seeking regarding Ebola-related global priorities were observed across time. Our longitudinal content analysis revealed that, due to ongoing health information deficiencies resulting in fear and frustration, social media was at times an impediment rather than a vehicle to support health information needs. Content analysis of tweets effectively assessed Ebola information needs. Our study also demonstrates the use of Twitter as a method for capturing real-time data to assess ongoing information needs, fear, and frustration over time.

  8. Analysis of College Students' Personal Health Information Activities: Online Survey.

    Science.gov (United States)

    Kim, Sujin; Sinn, Donghee; Syn, Sue Yeon

    2018-04-20

    With abundant personal health information at hand, individuals face a critical challenge in evaluating the informational value of health care records so as to keep useful information and discard what is determined useless. Young, healthy college students who were previously dependents of adult parents or caregivers are less likely to be concerned with disease management. Personal health information management (PHIM) is a special case of personal information management (PIM) that is associated with multiple interactions among varying stakeholders and systems. However, there has been limited evidence for understanding the informational or behavioral underpinnings of college students' PHIM activities, which can influence their health in general throughout their lifetime. This study aimed to investigate demographic and academic profiles of college students with relevance to PHIM activities. Next, we sought to construct major PHIM-related activity components and perceptions among college students. Finally, we sought to discover major factors predicting core PHIM activities among the college students we sampled. A Web survey was administered to collect responses about PHIM behaviors and perceptions among college students from the University of Kentucky from January through March 2017. A total of 1408 college students were included in the analysis. PHIM perceptions, demographics, and academic variations were used as independent variables to predict diverse PHIM activities using a principal component analysis (PCA) and hierarchical regression analyses (SPSS v.24, IBM Corp, Armonk, NY, USA). The majority of participants were female (956/1408, 67.90%), and the age distribution of this population included an adequate representation of college students of all ages. The most preferred health information resources were family (612/1408, 43.47%), health care professionals (366/1408, 26.00%), friends (27/1408, 1.91%), and the internet (157/1408, 11.15%). Organizational or

  9. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’ is developed which offers an account of structure and agency; social practices develo...

  10. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.
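
    In its free (Gaussian) theory, IFT's optimal signal estimate reduces to the generalized Wiener filter m = D j, with information propagator D = (S⁻¹ + R†N⁻¹R)⁻¹ and information source j = R†N⁻¹d. A minimal numerical sketch, assuming a squared-exponential prior covariance and direct noisy pixel-wise observations (these modeling choices are ours for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    npix = 64
    x = np.arange(npix)

    # assumed prior: smooth signal with squared-exponential covariance S
    S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 8.0) ** 2)
    S += 1e-8 * np.eye(npix)          # jitter for numerical stability
    R = np.eye(npix)                  # response: direct pixel-wise measurement
    N = 0.1 * np.eye(npix)            # white noise covariance

    s = rng.multivariate_normal(np.zeros(npix), S)          # true signal
    d = R @ s + rng.multivariate_normal(np.zeros(npix), N)  # noisy data

    # Wiener filter: posterior mean of the free-theory IFT reconstruction
    Ninv = np.linalg.inv(N)
    D = np.linalg.inv(np.linalg.inv(S) + R.T @ Ninv @ R)  # information propagator
    m = D @ (R.T @ Ninv @ d)                              # reconstruction m = D j
    ```

    Because the filter exploits the spatial correlations encoded in S, the reconstruction m is smoother than the raw data and closer to the underlying signal in mean squared error.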

  11. Astrophysical data analysis with information field theory

    Science.gov (United States)

    Enßlin, Torsten

    2014-12-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  12. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular, as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  13. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    Science.gov (United States)

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement the online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis information in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online on top of the database software, using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data section online, with interactive connections to the database; and generates interface sheets that can be exported directly to R, SAS, and SPSS. The information system for air pollution and health impact monitoring thus implements the online statistical analysis function and can provide real-time analysis results to its users.

  14. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)

  15. Information Science and Information Systems: Conjunct Subjects Disjunct Disciplines.

    Science.gov (United States)

    Ellis, David; Allen, David; Wilson, Tom

    1999-01-01

    Examines the relationship between information science and information-systems (IS) research through analysis of the subject literature of each field and by citation and co-citation analysis of highly cited researchers in each field. Subfields of user studies and information-retrieval research were selected to represent information-science…

  16. The Information Content of Discrete Functions and Their Application in Genetic Data Analysis.

    Science.gov (United States)

    Sakhanenko, Nikita A; Kunert-Graf, James; Galas, David J

    2017-12-01

    The complex of central problems in data analysis consists of three components: (1) detecting the dependence of variables using quantitative measures, (2) defining the significance of these dependence measures, and (3) inferring the functional relationships among dependent variables. We have argued previously that an information theory approach allows separation of the detection problem from the inference of functional form problem. We approach here the third component of inferring functional forms based on information encoded in the functions. We present here a direct method for classifying the functional forms of discrete functions of three variables represented in data sets. Discrete variables are frequently encountered in data analysis, both as the result of inherently categorical variables and from the binning of continuous numerical variables into discrete alphabets of values. The fundamental question of how much information is contained in a given function is answered for these discrete functions, and their surprisingly complex relationships are illustrated. The all-important effect of noise on the inference of function classes is found to be highly heterogeneous and reveals some unexpected patterns. We apply this classification approach to an important area of biological data analysis, the inference of genetic interactions. Genetic analysis provides a rich source of real and complex biological data analysis problems, and our general methods provide an analytical basis and tools for characterizing genetic problems and for analyzing genetic data. We illustrate the functional description and the classes of a number of common genetic interaction modes and also show how different modes vary widely in their sensitivity to noise.

  17. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptographic key. Estimations of local entropy and mutual information identified the segments of the iris most suitable for this purpose. Parameters of the corresponding wavelet transforms were then optimized to obtain the highest possible entropy and the lowest mutual information in the transform domain, setting the framework for the synthesis of systems that extract truly random sequences from iris biometrics without compromising authentication properties. [Projekat Ministarstva nauke Republike Srbije, br. TR32054 i br. III44006

  18. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    Science.gov (United States)

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  19. The Structure of the Enterprise’s Information Potential in the Context of Carrying out Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Saukh Iryna V.

    2017-04-01

    Full Text Available The article is aimed at studying the structure of the enterprise's information potential, evaluating its subsystems, and determining the depth of strategic analysis based on the level of strategic uncertainty. It has been proven that information potential as a system includes a subsystem of incoming strategic information, a subsystem for evaluating the information received, and a subsystem for processing and transmitting strategic information. The level of information saturation of the external environment, together with the extent to which the information potential is used, has been assessed against the allocated criteria. A descriptive model of information potential has been developed whose application makes it possible to assess its level of development in financial terms, based on the external and internal information received. A methodical approach to determining the depth of strategic analysis depending on the level of strategic uncertainty in the external environment has also been suggested, using the matrix method (based on assessment of the ratio between the impact of environmental factors and the urgency of strategic decision-making).

  20. Developing resources for sentiment analysis of informal Arabic text in social media

    OpenAIRE

    Itani, Maher; Roast, Chris; Al-Khayatt, Samir

    2017-01-01

    Natural Language Processing (NLP) applications such as text categorization, machine translation, sentiment analysis, etc., need annotated corpora and lexicons to check quality and performance. This paper describes the development of resources for sentiment analysis specifically for Arabic text in social media. A distinctive feature of the corpora and lexicons developed are that they are determined from informal Arabic that does not conform to grammatical or spelling standards. We refer to Ara...

  1. Information needs: a sociocognitive analysis in academic management in the context of regulation

    Directory of Open Access Journals (Sweden)

    Nadi Helena Presser

    2012-12-01

    Full Text Available It presents a sociocognitive analysis of the information needs arising from the different tasks that graduate program coordinators undertake in their activity. The context of regulation, the research subject, constitutes the social environment in which information is produced and used. The study of the area document for Applied Social Sciences I composed the empirical basis of the research. It was found that the information needs arising from the set of tasks at the center of regulation are formed in the academic communities. Even though they produce complex results, many tasks can be decomposed into understandable elements and their information needs identified.

  2. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This makes it difficult to identify sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable, network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
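
    The minimization goal (a) can be sketched with plain breadth-first searches: keep exactly the nodes that lie on some directed source-to-sink path. This is a generic illustration, omitting the paper's flooding-based scoring of information flow, and the function name is ours.

    ```python
    from collections import deque

    def minimal_subnetwork(edges, sources, sinks):
        """Return the nodes and edges lying on some directed path
        from a source node to a sink node."""
        fwd, rev = {}, {}
        for u, v in edges:
            fwd.setdefault(u, []).append(v)
            rev.setdefault(v, []).append(u)

        def reach(starts, adj):
            # standard BFS over an adjacency map
            seen, q = set(starts), deque(starts)
            while q:
                u = q.popleft()
                for v in adj.get(u, []):
                    if v not in seen:
                        seen.add(v)
                        q.append(v)
            return seen

        keep = reach(sources, fwd) & reach(sinks, rev)
        return keep, [(u, v) for u, v in edges if u in keep and v in keep]
    ```

    Two linear-time traversals suffice, which is consistent with the kind of scalability the abstract emphasizes for large regulatory networks.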

  3. Auditing information structures in organizations: A review of data collection techniques for network analysis

    NARCIS (Netherlands)

    Koning, K.H.; de Jong, Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  4. An interview-informed synthesized contingency analysis to inform the treatment of challenging behavior in a young child with autism.

    Science.gov (United States)

    Herman, Ciara; Healy, Olive; Lydon, Sinéad

    2018-04-01

    Experimental Functional analysis (EFA) is considered the "gold standard" of behavioural assessment and its use is predictive of treatment success. However, EFA has a number of limitations including its lengthy nature, the high level of expertise required, and the reinforcement of challenging behaviour. This study aimed to further validate a novel interview-informed synthesised contingency analysis (IISCA). An open-ended interview and brief direct observation informed an IISCA for a young boy with autism who engaged in challenging behaviour. Resulting data supported the hypothesis that the target behaviour was multiply controlled by escape from demands and access to tangible items. An intervention comprised of most-to-least prompting, escape extinction, differential reinforcement and a high-probability instruction sequence was evaluated using a reversal design. This intervention reduced challenging behaviour to low levels and resulted in increased compliance. Findings support the status of the IISCA as a valid, practical, and effective process for designing function-based interventions.

  5. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  6. Analysis of Web Server Log Files: Website of Information Management Department of Hacettepe University

    Directory of Open Access Journals (Sweden)

    Mandana Mir Moftakhari

    2015-09-01

    Full Text Available Over the last decade, the importance of analysing information management system logs has grown, because it has been proved that the results of analysing log data can help in developing information system design and the interface and architecture of websites. Log file analysis is one of the best ways to understand the information-searching processes of online searchers and users' needs, interests, knowledge, and prejudices. The utilization of data collected in the transaction logs of web search engines helps designers, researchers, and website managers to uncover the complex interactions of users' goals and behaviours and to increase the efficiency and effectiveness of websites. Before starting any analysis, it should be verified that the log file of the website contains enough information; otherwise the analyser would not be able to create a complete report. In this study we evaluate the website of the Information Management Department of Hacettepe University by analysing its server log files. Results show that the log files provided by the website server do not contain an adequate amount of information. The reports we created contain some information about users' behaviour and needs, but they are not sufficient for making ideal decisions about the content and hyperlink structure of the website. The analysis also shows that creating an extended log file is essential for the website. Finally, we believe the results can be helpful in improving, redesigning, and creating a better website.
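
    A minimal sketch of the kind of server-log summarization such a study performs, assuming NCSA Common Log Format lines; the regex and function name are ours, not the authors' tooling:

    ```python
    import re
    from collections import Counter

    # host ident user [timestamp] "METHOD path protocol" status size
    LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) \S+" (\d{3}) (\S+)')

    def summarize(log_lines):
        """Count hits per requested path and per HTTP status code."""
        paths, statuses = Counter(), Counter()
        for line in log_lines:
            m = LOG_RE.match(line)
            if m:
                _host, _ts, _method, path, status, _size = m.groups()
                paths[path] += 1
                statuses[status] += 1
        return paths, statuses
    ```

    Reports like "most requested pages" and "error rates" fall directly out of the two counters; richer behaviour analysis (sessions, referrers, user agents) needs the extended log format the abstract recommends.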

  7. PAVLOV: An Information Retrieval Program for the Analysis of Learning Data.

    Science.gov (United States)

    Kjeldergaard, Paul M.

    1967-01-01

    PAVLOV (Paired Associate Verbal Learning Organizational Vehicle) is a Fortran-coded program designed to facilitate the analysis of learning data. The program analyzes four classes of information: parameters, list order, data format, and data. Utilizing this input, the program performs an in-depth measurement of several dependent variables for each…

  8. Computed ABC Analysis for Rational Selection of Most Informative Variables in Multivariate Data.

    Science.gov (United States)

    Ultsch, Alfred; Lötsch, Jörn

    2015-01-01

    Multivariate data sets often differ in several factors or derived statistical parameters, which have to be selected for a valid interpretation. Basing this selection on traditional statistical limits leads occasionally to the perception of losing information from a data set. This paper proposes a novel method for calculating precise limits for the selection of parameter sets. The algorithm is based on an ABC analysis and calculates these limits on the basis of the mathematical properties of the distribution of the analyzed items. The limits implement the aim of any ABC analysis, i.e., comparing the increase in yield to the required additional effort. In particular, the limit for set A, the "important few", is optimized in a way that both, the effort and the yield for the other sets (B and C), are minimized and the additional gain is optimized. As a typical example from biomedical research, the feasibility of the ABC analysis as an objective replacement for classical subjective limits to select highly relevant variance components of pain thresholds is presented. The proposed method improved the biological interpretation of the results and increased the fraction of valid information that was obtained from the experimental data. The method is applicable to many further biomedical problems including the creation of diagnostic complex biomarkers or short screening tests from comprehensive test batteries. Thus, the ABC analysis can be proposed as a mathematically valid replacement for traditional limits to maximize the information obtained from multivariate research data.
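
    A simplified sketch of a computed ABC split: set A ends at the point of the cumulative-yield curve nearest the ideal of maximal yield for minimal effort, and set B covers the remaining items that still contribute at least the uniform share. The published algorithm derives its limits more carefully from the distribution of the items; this illustration and its function name are ours.

    ```python
    import numpy as np

    def abc_sets(values):
        """Split positive contributions into ABC sets (descending order)."""
        v = np.sort(np.asarray(values, dtype=float))[::-1]
        frac_items = np.arange(1, len(v) + 1) / len(v)
        frac_yield = np.cumsum(v) / v.sum()
        # set A ("important few"): up to the curve point nearest
        # the ideal of (0% effort, 100% yield)
        a_end = int(np.argmin(frac_items ** 2 + (1.0 - frac_yield) ** 2)) + 1
        # set B: further items whose share is at least the uniform share
        b_end = max(a_end, int(np.searchsorted(-v / v.mean(), -1.0, side="right")))
        return v[:a_end], v[a_end:b_end], v[b_end:]
    ```

    The split is driven entirely by the data's own distribution, which is the point of the method: it replaces subjective cut-offs with computed ones.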

  9. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrate a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of
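
    The comparison step mentioned in the abstract can be illustrated with a generic Jensen-Shannon distance between two discrete methylation-level distributions (base-2 logs, so the distance lies in [0, 1]); this is a textbook sketch, not the authors' implementation.

    ```python
    import numpy as np

    def js_distance(p, q):
        """Jensen-Shannon distance: square root of the JS divergence
        (base-2 logs) between two discrete distributions."""
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        p, q = p / p.sum(), q / q.sum()
        m = 0.5 * (p + q)                 # mixture distribution

        def kl(a, b):
            nz = a > 0
            return np.sum(a[nz] * np.log2(a[nz] / b[nz]))

        return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))
    ```

    Unlike the KL divergence, the JS distance is symmetric, bounded, and defined even when the two samples place mass on disjoint methylation levels, which makes it convenient for test-versus-reference comparisons.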

  10. Trial sequential analysis reveals insufficient information size and potentially false positive results in many meta-analyses

    DEFF Research Database (Denmark)

    Brok, J.; Thorlund, K.; Gluud, C.

    2008-01-01

    OBJECTIVES: To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous to interim monitoring boundaries in a single trial. STUDY DESIGN AND SETTING: We applied TSA on meta-analyses performed in Cochrane Neonatal reviews. We calculated information sizes and monitoring boundaries with three different anticipated intervention effects of 30% relative risk reduction (TSA… …in 80% (insufficient information size). TSA(15%) and TSA(LBHIS) found that 95% and 91% had absence of evidence. The remaining nonsignificant meta-analyses had evidence of lack of effect. CONCLUSION: TSA reveals insufficient information size and potentially false positive results in many meta-analyses.
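
    The required information size for a binary outcome can be sketched with the standard two-group sample-size formula, here parameterized by the anticipated relative risk reduction (e.g. 30% or 15%, as in the abstract). Heterogeneity adjustment, which TSA also supports, is omitted, and the function name is ours.

    ```python
    from math import ceil
    from statistics import NormalDist

    def required_information_size(p_control, rrr, alpha=0.05, power=0.90):
        """Unadjusted required information size (total participants) to
        detect a relative risk reduction `rrr` for a binary outcome."""
        p_exp = p_control * (1 - rrr)          # anticipated event proportion
        p_bar = (p_control + p_exp) / 2        # pooled event proportion
        delta = p_control - p_exp              # absolute risk difference
        z_a = NormalDist().inv_cdf(1 - alpha / 2)
        z_b = NormalDist().inv_cdf(power)
        return ceil(4 * (z_a + z_b) ** 2 * p_bar * (1 - p_bar) / delta ** 2)
    ```

    Halving the anticipated effect roughly quadruples the required information size, which is why the TSA(15%) analyses in the abstract find far more meta-analyses underpowered than the TSA at 30% relative risk reduction.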

  11. [Italian physician's needs for medical information. Retrospective analysis of the medical information service provided by Novartis Pharma to clinicians].

    Science.gov (United States)

    Speroni, Elisabetta; Poggi, Susanna; Vinaccia, Vincenza

    2013-10-01

    The physician's need for medical information updates has been studied extensively in recent years, but the point of view of the pharmaceutical industry on this need has rarely been considered. This paper reports the results of a retrospective analysis of the medical information service provided to Italian physicians by an important pharmaceutical company, Novartis Pharma, from 2004 to 2012. The results confirm clinicians' appreciation of a service that gives them access to tailored scientific documentation, and the number of requests made to the network of medical representatives has been rising steadily, peaking whenever new drugs become available to physicians. The analysis confirms what other international studies have ascertained: most queries are about how to use the drugs and what their properties are. The results highlight some differences between medical specialties: for example, proportionally, neurologists seem to be the most curious. This, as well as other interesting snippets, is worth further exploration. Despite its limits in terms of representativeness, what comes out of the study is the existence of a real unmet need for information among healthcare institutions, and the support offered by the pharmaceutical industry could be invaluable; its role could go well beyond that of a mere supplier to National Healthcare Systems, to that of being recognised as an active partner in the process of ensuring balanced and evidence-based information. At the same time, closer appraisal of clinicians' needs could help the pharmaceutical industry to improve its communication and educational strategies in presenting its latest clinical research and its own products.

  12. Relating Maxwell’s demon and quantitative analysis of information leakage for practical imperative programs

    International Nuclear Information System (INIS)

    Anjaria, Kushal; Mishra, Arun

    2017-01-01

    Shannon observed the relation between information entropy and the Maxwell's demon experiment to arrive at the information entropy formula. Since then, Shannon's entropy formula has been widely used to measure information leakage in imperative programs. In the present work, our aim is to go in the reverse direction and find possible Maxwell's demon experimental setups for contemporary practical imperative programs in which variations of Shannon's entropy formula have been applied to measure information leakage. To establish the relation between the second principle of thermodynamics and quantitative analysis of information leakage, the present work models contemporary variations of imperative programs in terms of the Maxwell's demon experimental setup. Five contemporary variations of imperative programs related to information quantification are identified: (i) information leakage in an imperative program, (ii) an imperative multithreaded program, (iii) point-to-point leakage in an imperative program, (iv) an imperative program with infinite observation, and (v) an imperative program in an SOA-based environment. For these variations, the minimal work required by an attacker to gain the secret is also calculated using the historical Maxwell's demon experiment. To model the experimental setup of Maxwell's demon, a non-interference security policy is used. To avoid complexity, imperative programs with one-bit secret information have been considered. The findings of the present work from the history of physics can be utilized in many areas related to information flow in physical computing, nano-computing, quantum computing, biological computing, energy dissipation in computing, and computing power analysis. (paper)
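    The core idea of the abstract, leakage as the drop in Shannon entropy about a one-bit secret, plus the thermodynamic lower bound on the attacker's work, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the program behavior, distributions, and variable names are hypothetical.

```python
import math

def shannon_entropy(dist):
    """H(X) = -sum_x p(x) * log2 p(x) for a distribution given as {value: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# One-bit secret, uniformly distributed: initial uncertainty is 1 bit.
prior = {0: 0.5, 1: 0.5}
h_prior = shannon_entropy(prior)

# A program that copies the secret to a public output leaks everything:
# after the observation, the secret is fully determined.
posterior = {0: 1.0}
leakage_bits = h_prior - shannon_entropy(posterior)   # = 1.0 bit

# Maxwell's-demon / Landauer connection: acquiring (or erasing) one bit
# costs at least k_B * T * ln 2 joules of work at temperature T.
k_B, T = 1.380649e-23, 300.0        # Boltzmann constant (J/K), room temperature (K)
min_attacker_work = leakage_bits * k_B * T * math.log(2)
```

    For a non-interfering program the posterior equals the prior, so `leakage_bits` is 0 and the work bound vanishes.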

  13. APPLICATION OF OLAP SYSTEM IN INFORMATION SUB-SYSTEM OF QMS INCONSISTENCY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alempije Veljovic

    2008-03-01

    Full Text Available Records of inconsistencies arise as a result of non-compliance with certain requirements during the execution of processes for the functioning of the quality management system (QMS). In this study, the established connection between the QMS and the designed information sub-system for inconsistency management is presented. The information model of inconsistency management makes it possible to analyse inconsistencies from the perspective of interactive analytical data processing (OLAP systems) on the basis of multi-dimensional tables (OLAP cubes) created in the MS SQL Server Analysis Services programme.

  14. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  15. Research study on analysis/use technologies of genome information; Genome joho kaidoku riyo gijutsu no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    For wide use of genome information in the industrial field, the required R and D was surveyed from the standpoints of biology and information science. To clarify the present state and issues of the international research on genome analysis, the genome map as well as sequence and function information are first surveyed. The current analysis/use technologies of genome information are analyzed, and the following are summarized: prediction and identification of gene regions in genome sequences, techniques for searching and selecting useful genes, and techniques for predicting the expression of gene functions and the gene-product structure and functions. It is recommended that R and D and data collection/interpretation necessary to clarify inter-gene interactions and information networks should be promoted by integrating Japanese advanced know-how and technologies. As examples of the impact of the research results on industry and society, the present state and future expected effect are summarized for medicines, diagnosis/analysis instruments, chemicals, foods, agriculture, fishery, animal husbandry, electronics, environment and information. 278 refs., 42 figs., 5 tabs.

  16. The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study

    Science.gov (United States)

    Sooki, Zahra; Shariati, Mohammad; Chaman, Reza; Khosravi, Ahmad; Effatpanah, Mohammad; Keramat, Afsaneh

    2016-01-01

    Context Family, especially the mother, has the most important role in the education, transmission of information, and health behaviors of girls in order for them to have a healthy transition from the critical stage of puberty, but there are different views in this regard. Objectives Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother's role in informing girls about puberty. Data Sources This meta-analysis study was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases with determined key words and their MeSH equivalents. Study Selection Quantitative cross-sectional articles were extracted by two independent researchers and finally 46 articles were selected based on inclusion criteria. The STROBE checklist was used for evaluation of the studies. Data Extraction The percentage of mothers as the current and preferred source of gaining information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random effects model) and the studies' heterogeneity was assessed using the I2 index. Variance between studies was analyzed using tau squared (Tau2) and Review Manager 5 software. Results The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases, the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, and other countries 66%). Conclusions According to the findings of this study, it is essential that health professionals and officials of the ministry of health train
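    The random-effects pooling and heterogeneity statistics named in the abstract (Tau2, I2) follow standard meta-analysis formulas. A minimal DerSimonian-Laird sketch, using hypothetical study effects rather than the article's data, is:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling: estimate between-study variance tau^2 from
    Cochran's Q, then inverse-variance weight each study by 1/(v_i + tau^2)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 heterogeneity (%)
    return pooled, se, tau2, i2

# Hypothetical effect sizes (e.g., logit proportions) and variances from three studies.
effects = [0.1, 0.8, 0.4]
variances = [0.01, 0.02, 0.015]
pooled, se, tau2, i2 = dersimonian_laird(effects, variances)
```

    With these numbers the studies are heterogeneous (I2 near 88%), so tau2 is positive and the pooled estimate sits between the fixed-effect average and the unweighted mean.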

  17. #FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.

    Science.gov (United States)

    Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher

    2014-12-01

    We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge is rooted in data analysts' capability of discerning the anomalous information behaviors, such as the spreading of rumors or misinformation, from the rest that are more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model.

  18. The Structure of the Enterprise’s Information Potential in the Context of Carrying out Strategic Analysis

    OpenAIRE

    Saukh Iryna V.

    2017-01-01

    The article is aimed at studying the structure of the enterprise’s information potential, evaluating its subsystems and the depth of strategic analysis based on the level of strategic uncertainty. It has been proven that information potential as a system includes the subsystem of incoming strategic information; the subsystem of evaluation of the information received; the subsystem for processing and transmission of strategic information. An assessment of the level of information saturation of...

  19. Analysis and Assistant Planning System of Regional Agricultural Economic Information

    Science.gov (United States)

    Han, Jie; Zhang, Junfeng

    For common problems existing in regional development and planning, we design a decision support system for assisting regional agricultural development and planning as a decision-making tool for local government and decision makers. The analysis methods of forecasting, comparative advantage, linear programming, and statistical analysis are adopted. According to comparative advantage theory, the regional advantage can be determined by calculating and comparing the yield advantage index (YAI), scale advantage index (SAI), and complicated advantage index (CAI). Combining with GIS, agricultural data are presented in graphical forms such as area, bar, and pie charts to uncover principles and trends for decision-making that cannot be found in data tables. This system provides assistant decisions for agricultural structure adjustment and agro-forestry development and planning, and can be integrated with information technologies such as RS, AI, and so on.
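    The abstract does not spell out the index formulas; one common formulation of the three comparative-advantage indices, with entirely hypothetical regional data, might look like this (the formulas and numbers are assumptions, not taken from the paper):

```python
import math

# Hypothetical data: sown area (ha) of one crop vs. all crops, regionally and
# nationally, plus yields (t/ha) for the crop.
region_area, region_total_area = 1200.0, 5000.0     # crop area / all-crop area in region
nation_area, nation_total_area = 40000.0, 250000.0  # same quantities nationwide
region_yield, nation_yield = 4.8, 4.0               # t/ha for the crop

# SAI: the crop's share of sown area in the region relative to its national share.
SAI = (region_area / region_total_area) / (nation_area / nation_total_area)
# YAI: regional yield relative to the national average yield for the crop.
YAI = region_yield / nation_yield
# CAI: geometric mean of the scale and yield advantages.
CAI = math.sqrt(SAI * YAI)

# An index above 1 suggests a regional comparative advantage for this crop.
advantage = CAI > 1.0
```

    In this toy case the region both plants proportionally more of the crop (SAI = 1.5) and out-yields the national average (YAI = 1.2), so CAI is about 1.34 and the system would flag a comparative advantage.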

  20. 30 CFR 250.227 - What environmental impact analysis (EIA) information must accompany the EP?

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What environmental impact analysis (EIA... and Information Contents of Exploration Plans (ep) § 250.227 What environmental impact analysis (EIA... requirements. Your EIA must: (1) Assess the potential environmental impacts of your proposed exploration...

  1. [Pitfalls in informed consent: a statistical analysis of malpractice law suits].

    Science.gov (United States)

    Echigo, Junko

    2014-05-01

    In medical malpractice law suits, the notion of informed consent is often relevant in assessing whether negligence can be attributed to the medical practitioner who has caused injury to a patient. Furthermore, it is not rare that courts award damages for a lack of appropriate informed consent alone. In this study, two results emerged from a statistical analysis of medical malpractice law suits. One, unexpectedly, was that the severity of a patient's illness made no significant difference to whether damages were awarded. The other was that cases involving treatment not covered by national medical insurance were involved significantly more often than insured-treatment cases. In cases where damages were awarded, the courts required more disclosure and written documentation of information by medical practitioners, especially about complications and adverse effects that the patient might suffer.

  2. Parametric sensitivity analysis for biochemical reaction networks based on pathwise information theory.

    Science.gov (United States)

    Pantazis, Yannis; Katsoulakis, Markos A; Vlachos, Dionisios G

    2013-10-22

    Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches such as variants of the finite difference method can have an overwhelming computational cost in models with a high-dimensional parameter space. We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on Information Theory methods and relies on the quantification of information loss due to parameter perturbations between time-series distributions. For this reason, we need to work on path-space, i.e., the set consisting of all stochastic trajectories, hence the proposed approach is referred to as "pathwise". The pathwise sensitivity analysis method is realized by employing the rigorously-derived Relative Entropy Rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, the knowledge of the structure of the FIM can allow one to efficiently address
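    As an illustration of the gradient-free idea, a pathwise Fisher information rate can be evaluated directly from a propensity function and a state distribution, with no finite-difference simulation. The following toy sketch assumes a single mass-action reaction and a Poisson stationary distribution; it is not the authors' implementation.

```python
import math

def fim_rate(a, dloga_dk, states, probs):
    """Stationary pathwise FIM per unit time for a one-parameter model:
    sum_x p(x) * a(x) * (d log a / dk)(x)^2."""
    return sum(p * a(x) * dloga_dk(x) ** 2 for x, p in zip(states, probs))

# Degradation reaction X -> 0 with mass-action propensity a(x) = k*x, so
# d log a / dk = 1/k and the FIM rate reduces analytically to E[x]/k.
k = 2.0
a = lambda x: k * x
dloga = lambda x: 1.0 / k

# Assumed (hypothetical) stationary distribution: Poisson with mean 5,
# truncated at 60 states for the sum.
mean = 5.0
states = list(range(60))
probs = [math.exp(-mean) * mean ** x / math.factorial(x) for x in states]

fim = fim_rate(a, dloga, states, probs)   # should be close to mean / k = 2.5
```

    Because the computation only needs the propensities and their log-derivatives, no perturbed trajectories are simulated, which is the advantage the abstract emphasizes.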

  3. Analysis of informational redundancy in the protein-assembling machinery

    Science.gov (United States)

    Berkovich, Simon

    2004-03-01

    Entropy analysis of the DNA structure does not reveal a significant departure from randomness, indicating a lack of informational redundancy. This signifies the absence of a hidden meaning in the genome text and supports the 'barcode' interpretation of DNA given in [1]. Lack of informational redundancy is a characteristic property of an identification label rather than of a message of instructions. Yet the randomness of DNA has to induce the non-random structures of proteins. Protein synthesis is a two-step process: transcription into RNA with gene splicing, and formation of a structure of amino acids. Entropy estimations, performed by A. Djebbari, show typical values of redundancy of the biomolecules along these pathways: DNA gene, about 4%; proteins, 15-40%. In gene expression, the RNA copy carries the same information as the original DNA template. Randomness is essentially eliminated only at the step of protein creation by a degenerate code. According to [1], the significance of the substitution of U for T with subsequent gene splicing is that these transformations result in a different pattern of RNA oscillations, so the vital DNA communications are protected against extraneous noise coming from the protein-making activities. 1. S. Berkovich, "On the 'barcode' functionality of DNA, or the Phenomenon of Life in the Physical Universe", Dorrance Publishing Co., Pittsburgh, 2003
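    Redundancy in this sense is usually quantified as R = 1 - H/H_max over symbol frequencies. A first-order sketch, which ignores higher-order correlations and uses synthetic sequences rather than real genomic data, is:

```python
import math
from collections import Counter

def redundancy(seq, alphabet_size=None):
    """Informational redundancy R = 1 - H/H_max, where H is the empirical
    Shannon entropy of symbol frequencies and H_max = log2(alphabet size)."""
    counts = Counter(seq)
    n = len(seq)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    a = alphabet_size or len(counts)
    return 1.0 - h / math.log2(a)

# A uniform 4-letter sequence has zero first-order redundancy...
random_like = "ACGT" * 100
r_random = redundancy(random_like, 4)

# ...while a heavily biased sequence carries substantial redundancy (structure).
biased = "A" * 370 + "C" * 10 + "G" * 10 + "T" * 10
r_biased = redundancy(biased, 4)
```

    First-order symbol entropy is only the simplest such estimate; the entropy analyses cited in the abstract would also account for correlations between neighboring symbols.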

  4. Efficiency of crude oil markets: Evidences from informational entropy analysis

    International Nuclear Information System (INIS)

    Ortiz-Cruz, Alejandro; Rodriguez, Eduardo; Ibarra-Valdez, Carlos; Alvarez-Ramirez, Jose

    2012-01-01

    The role of crude oil as the main energy source for the global economic activity has motivated the discussion about the dynamics and causes of crude oil price changes. An accurate understanding of the issue should provide important guidelines for the design of optimal policies and government budget planning. Using daily data for WTI over the period January 1986–March 2011, we analyze the evolution of the informational complexity and efficiency for the crude oil market through multiscale entropy analysis. The results indicated that the crude oil market is informationally efficient over the scrutinized period except for two periods that correspond to the early 1990s and late 2000s US recessions. Overall, the results showed that deregulation has improved the operation of the market in the sense of making returns less predictable. On the other hand, there is some evidence that the probability of having a severe US economic recession increases as the informational efficiency decreases, which indicates that returns from crude oil markets are less uncertain during economic downturns. - Highlights: ► Entropy concepts are used to characterize crude oil prices. ► An index of market efficiency is introduced. ► Except for periods of economic recession, the crude oil market is informationally efficient.
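    Multiscale entropy analysis of this kind coarse-grains the return series and computes sample entropy at each scale; lower values indicate more regular, hence more predictable (less informationally efficient), dynamics. A compact sketch using standard SampEn with m = 2 and tolerance r = 0.2 times the series standard deviation, on synthetic series rather than WTI data, is:

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that templates
    matching for m points (Chebyshev distance <= r) also match for m+1."""
    n = len(x)
    if r is None:
        mu = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / n)
    def count(mm):
        c = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def coarse_grain(x, tau):
    """Non-overlapping window averages at scale tau."""
    return [sum(x[i:i + tau]) / tau for i in range(0, len(x) - tau + 1, tau)]

def multiscale_entropy(x, scales=(1, 2, 3)):
    return [sample_entropy(coarse_grain(x, t)) for t in scales]
```

    Applied to daily returns, a drop in entropy across scales during a recession window would correspond to the reduced informational efficiency the study reports.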

  5. Thermal analysis and safety information for metal nanopowders by DSC

    Energy Technology Data Exchange (ETDEWEB)

    Tseng, J.M.; Huang, S.T. [Institute of Safety and Disaster Prevention Technology, Central Taiwan University of Science and Technology, 666, Buzih Road, Beitun District, Taichung 40601, Taiwan, ROC (China); Duh, Y.S.; Hsieh, T.Y.; Sun, Y.Y. [Department of Safety Health and Environmental Engineering, National United University, Miaoli, Taiwan, ROC (China); Lin, J.Z. [Institute of Safety and Disaster Prevention Technology, Central Taiwan University of Science and Technology, 666, Buzih Road, Beitun District, Taichung 40601, Taiwan, ROC (China); Wu, H.C. [Institute of Occupational Safety and Health, Council of Labor Affairs, Taipei, Taiwan, ROC (China); Kao, C.S., E-mail: jcsk@nuu.edu.tw [Department of Safety Health and Environmental Engineering, National United University, Miaoli, Taiwan, ROC (China)

    2013-08-20

    Highlights: • Metal nanopowders are common and frequently employed in industry. • Nano iron powder experimental results of T_o were 140–150 °C. • Safety information can benefit relevant metal powder industries. - Abstract: Metal nanopowders are common and frequently employed in industry. Iron is mostly applied in high-performance magnetic materials and pollutant treatment for groundwater. Zinc is widely used in brass, bronze, die-casting metal, alloys, rubber, and paints. Nonetheless, some disasters induced by metal powders are due to the lack of related safety information. In this study, we applied differential scanning calorimetry (DSC) and used thermal analysis software to evaluate the related thermal safety information, such as exothermic onset temperature (T_o), peak temperature (T_p), and heat of reaction (ΔH). The nano iron powder experimental results for T_o were 140–150 °C, 148–158 °C, and 141–149 °C for 15 nm, 35 nm, and 65 nm, respectively. The ΔH was larger than 3900 J/g, 5000 J/g, and 3900 J/g for 15 nm, 35 nm, and 65 nm, respectively. Safety information can benefit the relevant metal powder industries in preventing accidents.

  6. The information activity of rail passenger information staff: a foundation for information system requirements

    Directory of Open Access Journals (Sweden)

    Martin Rose

    2006-01-01

    Full Text Available Introduction. This paper examines the goal-directed information activity of passenger information staff, working in the dynamic environment of rail network control. The explicit aim is to define a meaningful set of information system requirements. The report shows how dynamic situations may lead us to question a number of established theories of information science. Method. Passenger information officers (PIOs) were observed on duty within the rail command and control headquarters. Observation sessions totalling eight hours involved the manual recording of sequential information flows and the associated activity of PIOs. A semi-structured management interview was also conducted to provide further insight into the organizational context. Analysis. A viewpoint-oriented analysis technique was used to analyse sequential data captured during observation sessions. Event sequences that represent and explain the viewpoints were identified and elaborated into detailed scenario descriptions. Results. The analysis both supports and contests a number of established theories from information science. Additionally, a range of 'mandatory' and 'desirable' system requirements are derived from the scenario and viewpoint analyses. Conclusion. Dynamic situations have a significant impact on information behaviour which is not always predicted by current theories of information science.

  7. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    Science.gov (United States)

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA and describe.

  8. Analysis of Internet Information on Lateral Lumbar Interbody Fusion.

    Science.gov (United States)

    Belayneh, Rebekah; Mesfin, Addisu

    2016-07-01

    Lateral lumbar interbody fusion (LLIF) is a surgical technique that is being increasingly used. The authors' objective was to examine information on the Internet pertaining to the LLIF technique. An analysis was conducted of publicly accessible websites pertaining to LLIF. The following search engines were used: Google (www.google.com), Bing (www.bing.com), and Yahoo (www.yahoo.com). DuckDuckGo (www.duckduckgo.com) was an additional search engine used due to its emphasis on generating accurate and consistent results while protecting searchers' privacy and reducing advertisements. The top 35 websites providing information on LLIF from the 4 search engines were identified. A total of 140 websites were evaluated. Each website was categorized based on authorship (academic, private, medical industry, insurance company, other) and content of information. Using the search term lateral lumbar interbody fusion, 174,000 Google results, 112,000 Yahoo results, and 112,000 Bing results were obtained. DuckDuckGo does not display the number of results found for a search. From the top 140 websites collected from each search engine, 78 unique websites were identified. Websites were authored by a private medical group in 46.2% of the cases, an academic medical group in 26.9% of the cases, and the biomedical industry in 5.1% of the cases. Sixty-eight percent of websites reported indications, and 24.4% reported contraindications. Benefits of LLIF were reported by 69.2% of websites. Thirty-six percent of websites reported complications of LLIF. Overall, the quality of information regarding LLIF on the Internet is poor. Spine surgeons and spine societies can assist in improving the quality of the information on the Internet regarding LLIF. [Orthopedics. 2016; 39(4):e701-e707.]. Copyright 2016, SLACK Incorporated.

  9. Analysis and design of hospital management information system based on UML

    Science.gov (United States)

    Ma, Lin; Zhao, Huifang; You, Shi Jun; Ge, Wenyong

    2018-05-01

    With the rapid development of computer technology, computer information management systems have been utilized in many industries. A Hospital Information System (HIS) helps provide data for directors, lightens the workload of medical workers, and improves their efficiency. Based on the HIS demand analysis and system design, this paper focuses on utilizing unified modeling language (UML) models to establish the use case diagram, class diagram, sequence diagram, and collaboration diagram, satisfying the demands of daily patient visits, inpatient care, drug management, and other relevant operations. Finally, the paper summarizes the problems of the system and presents an outlook for the HIS.

  10. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    CERN Document Server

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  11. The emerging potential for network analysis to inform precision cancer medicine.

    Science.gov (United States)

    Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah

    2018-06-14

    Precision cancer medicine promises to tailor clinical decisions to patients using genomic information. Indeed, successes of drugs targeting genetic alterations in tumors, such as imatinib that targets BCR-ABL in chronic myelogenous leukemia, have demonstrated the power of this approach. However, biological systems are complex, and patients may differ not only by the specific genetic alterations in their tumor, but by more subtle interactions among such alterations. Systems biology and more specifically, network analysis, provides a framework for advancing precision medicine beyond clinical actionability of individual mutations. Here we discuss applications of network analysis to study tumor biology, early methods for N-of-1 tumor genome analysis, and the path for such tools to the clinic. Copyright © 2018. Published by Elsevier Ltd.

  12. A network meta-analysis on the effects of information technology application on preoperative knowledge of patients.

    Science.gov (United States)

    Lai, Yi-Horng

    2015-01-01

    The application of information technology in health education plans in Taiwan has existed for a long time. The purpose of this study is to explore the relationship between the application of information technology in health education and patients' preoperative knowledge by synthesizing existing studies that compare the effectiveness of information technology and traditional instruction in health education plans. In spite of claims regarding the potential benefits of using information technology in health education plans, the results of previous studies were conflicting. This study examines the effectiveness of information technology by using network meta-analysis, a statistical analysis of separate but similar studies that tests the pooled data for statistical significance. The information technology applications in health education discussed in this study include interactive technology therapy (person-computer), group interactive technology therapy (person-person), multimedia technology therapy, and video therapy. The results show that group interactive technology therapy is the most effective, followed by interactive technology therapy. All four information technology therapies are superior to the traditional health education plan (leaflet therapy).

  13. Electronic tools for health information exchange: an evidence-based analysis.

    Science.gov (United States)

    2013-01-01

    As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) (and away from paper-based health records) for health information exchange, there remains uncertainty around the impact of eTools as a form of communication. To examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). A systematic literature search was conducted, and meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a

  14. Components of spatial information management in wildlife ecology: Software for statistical and modeling analysis [Chapter 14

    Science.gov (United States)

    Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman

    2010-01-01

    Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...

  15. Information findability: An informal study to explore options for improving information findability for the systems analysis group

    Energy Technology Data Exchange (ETDEWEB)

    Stoecker, Nora Kathleen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge: how can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  16. Application of information technology in the process analysis of structural subdivisions of light industry enterprises

    Directory of Open Access Journals (Sweden)

    Тарана Тахир кызы Мусаева

    2016-02-01

    Full Text Available The problems of applying information technologies in the process analysis of structural subdivisions of light industry enterprises are considered. For functional cost analysis in the company, a computerized workplace structure was prepared and its principle of operation is described. The functionality of the computerized workplace was tested at textile enterprises. The results showed that the use of information technology has good prospects for the implementation of a quality management system at textile enterprises.

  17. How Analysis Informs Regulation: Success and Failure of ...

    Science.gov (United States)

    How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.

  18. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)


    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  19. LANDSAT-4 MSS and Thematic Mapper data quality and information content analysis

    Science.gov (United States)

    Anuta, P.; Bartolucci, L.; Dean, E.; Lozano, F.; Malaret, E.; Mcgillem, C. D.; Valdes, J.; Valenzuela, C.

    1984-01-01

    LANDSAT-4 thematic mapper (TM) and multispectral scanner (MSS) data were analyzed to obtain information on data quality and information content. Geometric evaluations were performed to test band-to-band registration accuracy. Thematic mapper overall system resolution was evaluated using scene objects which demonstrated sharp high contrast edge responses. Radiometric evaluation included detector relative calibration, effects of resampling, and coherent noise effects. Information content evaluation was carried out using clustering, principal components, transformed divergence separability measure, and supervised classifiers on test data. A detailed spectral class analysis (multispectral classification) was carried out to compare the information content of the MSS and TM for a large number of scene classes. A temperature-mapping experiment was carried out for a cooling pond to test the quality of thermal-band calibration. Overall TM data quality is very good. The MSS data are noisier than previous LANDSAT results.

  20. Clinical psychology service users' experiences of confidentiality and informed consent: a qualitative analysis.

    Science.gov (United States)

    Martindale, S J; Chambers, E; Thompson, A R

    2009-12-01

    To explore and describe the experience of clinical psychology service users in relation to the processes associated with confidentiality and the generation of informed consent in individual therapy. A qualitative interview-based study employing interpretative phenomenological analysis was conducted with service users. User researchers were active collaborators in the study. A focus group of four users was convened to explore issues related to confidentiality and consent, which then informed the development of the semi-structured interview schedule. Twelve users of community mental health clinical psychology services were interviewed by user researchers. A user researcher and a clinical psychologist undertook joint analysis of the data. A second clinical psychologist facilitated reflexivity and wider consideration of validity issues. Four main themes were identified from the data: being referred; the participant's feelings, mental health difficulties, and their impact; relationships with workers and carers; and autonomy. The meaningfulness of processes of discussing confidentiality, and generating informed consent, can be improved by psychologists placing a greater emphasis on choice, control, autonomy, individual preferences, and actively involving the user in dialogue on repeated occasions.

  1. Improving access to health information for older migrants by using grounded theory and social network analysis to understand their information behaviour and digital technology use.

    Science.gov (United States)

    Goodall, K T; Newman, L A; Ward, P R

    2014-11-01

    Migrant well-being can be strongly influenced by the migration experience and subsequent degree of mainstream language acquisition. There is little research on how older Culturally And Linguistically Diverse (CALD) migrants who have 'aged in place' find health information, and the role which digital technology plays in this. Although the research for this paper was not focused on cancer, we draw out implications for providing cancer-related information to this group. We interviewed 54 participants (14 men and 40 women) aged 63-94 years, who were born in Italy or Greece, and who migrated to Australia mostly as young adults after World War II. Constructivist grounded theory and social network analysis were used for data analysis. Participants identified doctors, adult children, local television, spouse, local newspaper and radio as the most important information sources. They did not generally use computers, the Internet or mobile phones to access information. Literacy in their birth language, and the degree of proficiency in understanding and using English, influenced the range of information sources accessed and the means used. The ways in which older CALD migrants seek and access information has important implications for how professionals and policymakers deliver relevant information to them about cancer prevention, screening, support and treatment, particularly as information and resources are moved online as part of e-health. © 2014 John Wiley & Sons Ltd.

  2. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate structural geometry, resistance model, and loading conditions automatically by a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structure analysis, and construction management.

  3. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

    Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. Student's t test and the χ² test were applied. Questionnaire completion rate was 98% (n = 126). The medium with the greatest incidence on students was television (TV), followed by press and magazines/books. In the factor analysis to determine the impact of the information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, the associations were between information about organ donation transmitted by friends and family and having spoken about ODT with them; by TV, radio, and hoardings and not having spoken in the family; and by TV/radio and the father's and mother's opinion about ODT. The medium with the greatest incidence on students is TV, and the medium with the greatest impact on broadcasting information was conversations with friends, family, and health professionals. This could be useful for society, because they should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    Full Text Available Implementation of a neuron-like information processing structure at the hardware level is a pressing research problem. In this article, we analyze the modified hybrid spiking neuron (MHSN) model in the distributed delay framework (DDF) from a hardware-level implementation point of view. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, steady-state solutions, and eigenvalues of the MHSN model. During phase plane analysis, we notice that the MHSN model generates limit cycle oscillations, an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, delay affects the spiking activity and alters the duration of the cycle.
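
The phase-plane quantities mentioned above (steady states, eigenvalues, spirals vs. nodes) can be illustrated with a small sketch. The MHSN equations themselves are not given in this record, so the snippet below classifies the fixed point of a generic planar system from the trace and determinant of a hypothetical 2×2 Jacobian; the coefficients are invented for illustration.

```python
import cmath

def eigenvalues_2d(a11, a12, a21, a22):
    """Eigenvalues of a 2x2 Jacobian via its trace and determinant:
    lambda = (tr +/- sqrt(tr^2 - 4*det)) / 2."""
    tr = a11 + a22
    det = a11 * a22 - a12 * a21
    disc = cmath.sqrt(tr * tr - 4 * det)  # complex sqrt handles tr^2 < 4*det
    return (tr + disc) / 2, (tr - disc) / 2

def classify(l1, l2):
    """Coarse classification of the fixed point of a planar system."""
    if abs(l1.imag) > 1e-12:            # complex pair -> spiral or centre
        if l1.real < 0:
            return "stable spiral"
        if l1.real > 0:
            return "unstable spiral (limit-cycle candidate)"
        return "centre"
    if l1.real < 0 and l2.real < 0:
        return "stable node"
    if l1.real > 0 and l2.real > 0:
        return "unstable node"
    return "saddle"

# Hypothetical Jacobian with negative trace and positive determinant
l1, l2 = eigenvalues_2d(-1.0, -2.0, 1.0, -1.0)
print(classify(l1, l2))  # stable spiral
```

An "unstable spiral" verdict at a fixed point is the usual local signature that trajectories wind outward toward a surrounding limit cycle, the phenomenon the abstract reports for the MHSN model.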

  5. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    International Nuclear Information System (INIS)

    Michelhaugh, R.D.

    2000-01-01

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. In recent years, advances in the development of geographic information systems (GISs) have resulted in increased demands from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful for calculating routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by or have misinterpreted the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route.
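
At the core of routing models such as HIGHWAY, INTERLINE, and TRAGIS is a shortest-path computation over a transportation network. The actual ORNL network data and algorithms are not described in this record, so the following is only a minimal sketch of that idea: Dijkstra's algorithm over an invented toy road graph.

```python
import heapq

def shortest_route(graph, src, dst):
    """Dijkstra's algorithm on a weighted adjacency-dict road network.
    Returns (total_distance, [node, ...]) or (inf, []) if unreachable."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break                      # first pop of dst is optimal
        if d > dist.get(u, float("inf")):
            continue                   # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return float("inf"), []
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return dist[dst], path[::-1]

# Toy network: edge weights are segment lengths (e.g., km)
roads = {
    "A": {"B": 4, "C": 2},
    "B": {"D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}
print(shortest_route(roads, "A", "D"))  # (8.0, ['A', 'C', 'B', 'D'])
```

A real routing model layers much more on top of this (node selection, population-density overlays, route constraints), but the graph search itself is the piece the text-based HIGHWAY/INTERLINE listings reported and TRAGIS displays graphically.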

  6. Numerical Investigations into the Value of Information in Lifecycle Analysis of Structural Systems

    DEFF Research Database (Denmark)

    Konakli, Katerina; Sudret, Bruno; Faber, Michael Havbro

    2015-01-01

    of decisions related to maintenance of structural systems. In this context, experiments may refer to inspections or structural health monitoring. The value-of-information concept comprises a powerful tool for determining whether the experimental cost is justified by the expected gained benefit during...... investigations demonstrate how the decision problem is influenced by the assumed probabilistic models, including the type of probability distribution and the degree of uncertainty reflected in the coefficient of variation, the degradation law, the quantity and quality of information, and the probabilistic...... dependencies between the components of a system. Furthermore, challenges and potentials in value-of-information analysis for structural systems are discussed....

  7. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, the surveying and mapping of China is inevitably shifting from 4D product service to NCGISPC (National Common Geospatial Information Service Platform of China)-centered service. Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, due to the emerging requirements of e-government construction, the remarkable development of IT technology, and the growing online geospatial service demands of various lines of business. NCGISPC, which aims to provide multiple authoritative online one-stop geospatial information services and APIs for further development to government, business, and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode of geospatial information in China, and it will surely have a great impact not only on the construction of digital China but also on the way that everyone uses geospatial information services.

  8. Information Presentation in Decision and Risk Analysis: Answered, Partly Answered, and Unanswered Questions.

    Science.gov (United States)

    Keller, L Robin; Wang, Yitong

    2017-06-01

    For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently proven that decisionmakers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.

  9. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management as applicable to the IT universe, starting from the classical theory associated with project management techniques. It applies this theoretical analysis to the context of information technology in enterprises, as well as the classic literature of traditional project management, focusing on its application to business information technology. From the literature review developed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop information technology projects. The methodology provided for the development of a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature fits the environment of IT project management. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects, which depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, and ultimately results in its merger into the company culture.

  10. 78 FR 68463 - Notice of Emergency Approval of an Information Collection: Regional Analysis of Impediments...

    Science.gov (United States)

    2013-11-14

    ... Approval of an Information Collection: Regional Analysis of Impediments Guidance for [email protected] or telephone 202-402-2102. This is not a toll-free number. Persons with hearing or speech... Collection: Regional Analysis of Impediments Guidance for Sustainable Communities Grantees. OMB Approval...

  11. Applying information network analysis to fire-prone landscapes: implications for community resilience

    Directory of Open Access Journals (Sweden)

    Derric B. Jacobs

    2017-03-01

    Full Text Available Resilient communities promote trust, have well-developed networks, and can adapt to change. For rural communities in fire-prone landscapes, current resilience strategies may prove insufficient in light of increasing wildfire risks due to climate change. It is argued that, given the complexity of climate change, adaptations are best addressed at local levels where specific social, cultural, political, and economic conditions are matched with local risks and opportunities. Despite the importance of social networks as key attributes of community resilience, research using social network analysis on coupled human and natural systems is scarce. Furthermore, the extent to which local communities in fire-prone areas understand climate change risks, accept the likelihood of potential changes, and have the capacity to develop collaborative mitigation strategies is underexamined, yet these factors are imperative to community resiliency. We apply a social network framework to examine information networks that affect perceptions of wildfire and climate change in Central Oregon. Data were collected using a mailed questionnaire. Analysis focused on the residents' information networks that are used to gain awareness of governmental activities and measures of community social capital. A two-mode network analysis was used to uncover information exchanges. Results suggest that the general public develops perceptions about climate change based on complex social and cultural systems rather than as patrons of scientific inquiry and understanding. It appears that perceptions about climate change itself may not be the limiting factor in these communities' adaptive capacity, but rather how they perceive local risks. We provide a novel methodological approach in understanding rural community adaptation and resilience in fire-prone landscapes and offer a framework for future studies.
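
The two-mode network analysis described above links one class of nodes (residents) to another (information sources). As a hedged sketch of that technique, the snippet below computes two-mode degree for each source and a one-mode projection weighted by shared residents; the residents, source names, and ties are invented for illustration, not the study's data.

```python
from itertools import combinations
from collections import Counter

# Hypothetical two-mode data: resident -> information sources consulted
ties = {
    "r1": {"local newspaper", "neighbours", "fire district"},
    "r2": {"neighbours", "fire district"},
    "r3": {"local newspaper", "TV news"},
    "r4": {"fire district", "TV news", "neighbours"},
}

# Two-mode degree: how many residents name each source
source_degree = Counter(s for srcs in ties.values() for s in srcs)

# One-mode projection: source pairs weighted by shared residents
projection = Counter()
for srcs in ties.values():
    for a, b in combinations(sorted(srcs), 2):
        projection[(a, b)] += 1

print(sorted(source_degree.items()))
print(projection.most_common(1))  # [(('fire district', 'neighbours'), 3)]
```

High-weight pairs in the projection indicate sources that reach the same residents, which is the kind of structure the study uses to reason about how risk perceptions circulate through a community.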

  12. Teaching motor skills by means of biomechanical analysis of the motion: the physiological basis and applied information technologies

    Directory of Open Access Journals (Sweden)

    Razuvanova A.V.

    2016-01-01

    Full Text Available The article demonstrates the possibility of teaching athletes motor skills on the basis of biomechanical analysis of movements with the application of information technologies. Motion tracking, based on digital single-frame photography, is proposed as a method for biomechanical analysis. The relevance of this method is supported by the results of a study of the repulsion phase in the performance of the standing jump by athletes of different qualifications. A conclusion is drawn about the importance of an optimal jump model based on biomechanical analysis, and the formation of athletes' skills using information technologies and the principle of urgent information is discussed.

  13. Content analysis in information flows

    Energy Technology Data Exchange (ETDEWEB)

    Grusho, Alexander A. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation); Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow (Russian Federation); Grusho, Nick A.; Timonina, Elena E. [Institute of Informatics Problems of Federal Research Center “Computer Science and Control” of the Russian Academy of Sciences, Vavilova str., 44/2, Moscow (Russian Federation)

    2016-06-08

    The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We proved that under certain conditions it is possible to correctly solve a part of the problem with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps. The first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.

  14. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.

    1998-05-01

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  15. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network-collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...
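
One simple instance of extracting planning-relevant knowledge from WiFi traces is estimating peak occupancy per location from device-association records. The record layout and location names below are assumptions made for this sketch, not the paper's actual data model or methods.

```python
from collections import defaultdict

# Hypothetical WiFi association records: (hour, device_id, access_point)
traces = [
    (8, "d1", "ward-A"), (8, "d2", "ward-A"), (8, "d1", "ward-A"),
    (9, "d1", "lobby"),  (9, "d3", "ward-A"), (10, "d2", "lobby"),
    (10, "d3", "lobby"), (10, "d4", "lobby"),
]

# Distinct devices seen per (access point, hour) -- a proxy for occupancy
seen = defaultdict(set)
for hour, device, ap in traces:
    seen[(ap, hour)].add(device)

# Peak hourly occupancy per access point, to inform capacity planning
peak = defaultdict(int)
for (ap, hour), devices in seen.items():
    peak[ap] = max(peak[ap], len(devices))

print(dict(peak))  # {'ward-A': 2, 'lobby': 3}
```

Deduplicating by device within each time bin matters: the same phone reassociates many times per hour, so raw record counts would badly overstate occupancy.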

  16. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
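
The method-of-moments approach compared above can be sketched concretely for a Beta prior on a failure probability, followed by a conjugate Beta-Binomial update. The past failure-rate observations here are hypothetical, and this is only one of the five formulation methods the paper compares.

```python
def beta_prior_mom(rates):
    """Method-of-moments Beta(alpha, beta) prior fitted to past
    failure-rate observations (requires at least two values)."""
    n = len(rates)
    mean = sum(rates) / n
    var = sum((x - mean) ** 2 for x in rates) / (n - 1)  # sample variance
    # Match Beta mean and variance: mean = a/(a+b), var = ab/((a+b)^2 (a+b+1))
    common = mean * (1 - mean) / var - 1
    return mean * common, (1 - mean) * common

def posterior(alpha, beta, k, n):
    """Conjugate Beta-Binomial update after k failures in n trials."""
    return alpha + k, beta + n - k

# Hypothetical past failure-rate data from similar components
rates = [0.02, 0.05, 0.03, 0.04, 0.06]
a, b = beta_prior_mom(rates)
a_post, b_post = posterior(a, b, k=1, n=50)
print(a_post / (a_post + b_post))  # posterior mean failure probability
```

The paper's central point shows up directly here: a tighter prior (larger alpha + beta) lets the 50 new trials move the posterior mean less, so how much information the prior is assumed to carry drives the accuracy of the posterior inference.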

  17. Internet Information for Patients on Cancer Diets - an Analysis of German Websites.

    Science.gov (United States)

    Herth, Natalie; Kuenzel, Ulrike; Liebl, Patrick; Keinki, Christian; Zell, Joerg; Huebner, Jutta

    2016-01-01

    In recent years, the Internet has become an important source of information for cancer patients. Various cancer diets that are publicized on the Web promise significant benefits. The aim of our study was to evaluate the quality of online patient information about cancer diets. A patient's search for 'cancer diets' on German websites was simulated using the search engine Google. The websites were evaluated utilizing a standardized instrument with formal and content aspects. An analysis of 60 websites revealed that websites from nonprofit associations as well as self-help groups offer the best content and formal ranking. Websites whose owners aim to make a profit, practices that offer cancer diet therapies, and newspapers received the poorest quality score. The majority of content provided on the Web gets published by profit-oriented content groups. The divergence between profit-driven websites offering low-quality content and the few trustworthy websites on cancer diets is enormous. The information given online about cancer diets may turn out to be a hazardous pitfall. In order to present evidence-based information about cancer diets, online information should be replenished to create a more accurate picture and give higher visibility to the right information. © 2016 S. Karger GmbH, Freiburg.

  18. Opportunities and challenges in conducting secondary analysis of HIV programmes using data from routine health information systems and personal health information.

    Science.gov (United States)

    Gloyd, Stephen; Wagenaar, Bradley H; Woelk, Godfrey B; Kalibala, Samuel

    2016-01-01

    HIV programme data from routine health information systems (RHIS) and personal health information (PHI) provide ample opportunities for secondary data analysis. However, these data pose unique opportunities and challenges for use in health system monitoring, along with process and impact evaluations. Analyses focused on retrospective case reviews of four of the HIV-related studies published in this JIAS supplement. We identify specific opportunities and challenges with respect to the secondary analysis of RHIS and PHI data. Challenges working with both HIV-related RHIS and PHI included missing, inconsistent and implausible data; rapidly changing indicators; systematic differences in the utilization of services; and patient linkages over time and different data sources. Specific challenges among RHIS data included numerous registries and indicators, inconsistent data entry, gaps in data transmission, duplicate registry of information, numerator-denominator incompatibility and infrequent use of data for decision-making. Challenges specific to PHI included the time burden for busy providers, the culture of lax charting, overflowing archives for paper charts and infrequent chart review. Many of the challenges that undermine effective use of RHIS and PHI data for analyses are related to the processes and context of collecting the data, excessive data requirements, lack of knowledge of the purpose of data and the limited use of data among those generating the data. Recommendations include simplifying data sources, analysis and reporting; conducting systematic data quality audits; enhancing the use of data for decision-making; promoting routine chart review linked with simple patient tracking systems; and encouraging open access to RHIS and PHI data for increased use.

  19. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  20. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  1. Informed Consent and Psychotherapy: An Interpretative Phenomenological Analysis of Therapists’ Views.

    OpenAIRE

    Goddard, Angela; Murray, Craig; Simpson, Jane

    2008-01-01

    Objectives: To examine the issue of informed consent and how this is translated into clinical psychotherapy practice. Design: A qualitative approach was taken in which interviews were used to produce data. Methods: Nine clinical psychologists with specialist psychodynamic training took part in the research. Participants were interviewed using a semi-structured interview schedule. The interviews were transcribed and the data were analysed using Interpretative Phenomenological Analysis. Results...

  2. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the tempora...

  3. Management of organizations in Serbia from the aspect of the maturity analysis of information security

    Directory of Open Access Journals (Sweden)

    Trivan Dragan

    2016-01-01

    Full Text Available The aim of this work is to research information security in organizations, with a focus on cybersecurity. In accordance with the theoretical analysis, the subject of the empirical part of the work is the analysis of information security in Serbia, in order to better understand the information security programs and management structures in organizations in Serbia. The survey covers a variety of industries and discusses how organizations assess, develop, create and support their programs to ensure information security. The survey included 53 companies. The results obtained enabled us to draw five core conclusions about the state of information security and cybersecurity in Serbian companies: most companies had not been exposed to cybersecurity incidents; in most companies, policies, procedures and spheres of responsibility for information security exist; there are not enough controls to ensure compliance with relevant safety standards by third parties; top management and end-users are insufficiently familiar with cybersecurity risks, although they apply basic measures of protection; and dedicated security protection systems are very rare. The scientific goal of this work is, on the basis of the results obtained, to draw conclusions that can contribute to the study of corporate information security, with special emphasis on cybersecurity. The practical aim of the research is the application of the results for a more efficient implementation of security against cyber attacks in Serbian organizations.

  4. BGI-RIS: an integrated information resource and comparative analysis workbench for rice genomics

    DEFF Research Database (Denmark)

    Zhao, Wenming; Wang, Jing; He, Ximiao

    2004-01-01

    Rice is a major food staple for the world's population and serves as a model species in cereal genome research. The Beijing Genomics Institute (BGI) has long been devoting itself to sequencing, information analysis and biological research of the rice and other crop genomes. In order to facilitate....... Designed as a basic platform, BGI-RIS presents the sequenced genomes and related information in systematic and graphical ways for the convenience of in-depth comparative studies (http://rise.genomics.org.cn/). Udgivelsesdato: 2004-Jan-1...

  5. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA; Cowley,; E, Wendy [Richland, WA; Crow, Vernon L [Richland, WA; Cramer, Nicholas O [Richland, WA

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
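
    The scoring scheme described above (word scores from co-occurrence degree and frequency, keyword scores as sums of member word scores) can be sketched in a few lines. The stop-word list and the degree/frequency ratio below are illustrative choices, not the exact parameters of the patented method:

```python
import re
from collections import defaultdict

# Illustrative stop-word list; a real run would use a fuller list.
STOP_WORDS = {"a", "an", "and", "the", "of", "for", "in", "on", "to", "is", "are"}

def candidate_phrases(text):
    """Split on phrase delimiters, then break each run of words at stop words."""
    phrases = []
    for chunk in re.split(r"[.!?,;:]", text.lower()):
        current = []
        for w in re.findall(r"[a-z']+", chunk):
            if w in STOP_WORDS:
                if current:
                    phrases.append(current)
                current = []
            else:
                current.append(w)
        if current:
            phrases.append(current)
    return phrases

def rake(text):
    phrases = candidate_phrases(text)
    freq = defaultdict(int)    # co-occurrence frequency of each word
    degree = defaultdict(int)  # co-occurrence degree (sum of phrase lengths)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)
    # Word score = degree / frequency; keyword score = sum of member word scores.
    word_score = {w: degree[w] / freq[w] for w in freq}
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

keywords = rake("Keyword extraction is applied for information retrieval "
                "and analysis of keyword scores in large document sets.")
```

    The highest-scoring candidates are then extracted as the document's keywords, longer multi-word phrases naturally scoring higher under this scheme.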

  6. Beyond usage: understanding the use of electronic journals on the basis of information activity analysis. Electronic journals, Use studies, Information activity, Scientific communication

    Directory of Open Access Journals (Sweden)

    Annaïg Mahé

    2004-01-01

    Full Text Available In this article, which reports the second part of a two-part study of the use of electronic journals by researchers in two French research institutions, we attempt to explain the integration of the use of electronic journals in the scientists' information habits, going beyond usage analysis. First, we describe how the development of electronic journals use follows a three-phase innovation process - research-development, first uses, and technical acculturation. Then, we attempt to find more significant explanatory factors, and emphasis is placed on the wider context of information activity. Three main information activity types are outlined - marginal, parallel, and integrated. Each of these types corresponds to a particular attitude towards scientific information and to different levels of electronic journal use.

  7. Analysis of Russian Federation Foreign Policy in the Field of International Information Security

    Directory of Open Access Journals (Sweden)

    Elena S. Zinovieva

    2014-01-01

    Full Text Available Information and communication technologies (ICT) play an essential role in improving the quality of life and the economic and socio-political development of individual countries and of humanity in general. However, ICT development is fraught with new challenges and threats to international and national security. Interstate rivalry in the information sphere generates conflicts, an extreme form of which is information war. Since 1998, Russia has promoted international cooperation on information security at the global and regional levels, as well as within the framework of bilateral relations. The article analyzes the characteristics of the global information society, which has a decisive influence on international security in the information age, as well as international cooperation in this field. An analysis of Russian foreign policy initiatives in the field of international information security is also presented. Today more than 130 countries develop cyber capabilities, both defensive and offensive, that pose serious threats to international stability. It is difficult to trace the source of information attacks, and their consequences can be devastating and provoke retaliation, including the use of conventional weapons. In this situation the Russian approach, which advocates developing rules of conduct for States and demilitarizing information space in order to ensure its safety, appears urgent and relevant to the current international situation.

  8. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets

    Directory of Open Access Journals (Sweden)

    Cohen Aaron

    2009-02-01

    Full Text Available Abstract Background Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted during the process of genomic researchers analyzing their own experimental microarray datasets. Results The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences in the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. Conclusion The evaluation results suggest that GICSS can be useful for researchers in the genomics area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for the scientists in their natural analysis workflow and also elicit suggestions for future enhancements. Availability GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html

  9. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. This paper adopts snowball sampling interview to collect data and uses the logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how the informal lending differs from the commercial loan. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit scoring Index. Finally, our concluding remarks draw some inferences from empirical analysis and speculate as to what this may imply for the role of formal and informal financial sectors.
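
    The logistic regression described above relates borrower characteristics to a binary default outcome. As a hedged illustration only (the feature names and data below are invented, not the study's Yiwu interview data), such a model can be fitted with plain batch gradient descent:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights (w[0] is the intercept) by batch gradient descent."""
    n = len(X)
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi                     # gradient of the log-loss
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Hypothetical borrower features: [loan_size_scaled, prior_default_flag].
X = [[0.2, 0], [0.4, 0], [0.5, 1], [0.9, 1], [0.8, 1], [0.3, 0]]
y = [0, 0, 1, 1, 1, 0]    # 1 = loan defaulted
w = fit_logistic(X, y)

def predict(xi):
    """Predicted default probability for one borrower."""
    return sigmoid(w[0] + sum(a * b for a, b in zip(w[1:], xi)))
```

    The fitted coefficients then indicate which factors raise or lower the predicted default probability, which is the kind of inference the study draws.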

  10. Information Analysis Methodology for Border Security Deployment Prioritization and Post Deployment Evaluation

    International Nuclear Information System (INIS)

    Booker, Paul M.; Maple, Scott A.

    2010-01-01

    Due to international commerce, cross-border conflicts, and corruption, a holistic, information-driven approach to border security is required to best understand how resources should be applied to effect sustainable improvements in border security. The ability to transport goods and people by land, sea, and air across international borders with relative ease for legitimate commercial purposes creates a challenging environment for detecting illicit smuggling activities that destabilize national-level border security. Smuggling operated for profit, or smuggling driven by cross-border conflicts in which militant or terrorist organizations facilitate the transport of materials and/or extremists to advance a cause, adds complexity to interdiction efforts. Border security efforts are further hampered when corruption thwarts interdiction efforts or reduces the effectiveness of technology deployed to enhance border security. These issues necessitate a holistic approach to border security that leverages all available data. Large amounts of information found in hundreds of thousands of documents can be compiled to assess national or regional borders and identify variables that influence border security. Location data associated with border topics of interest may be extracted and plotted to better characterize the current border security environment for a given country or region. This baseline assessment enables further analysis, but also documents the initial state of border security, which can be used to evaluate progress after border security improvements are made. Border security threats are then prioritized via a systems analysis approach, and mitigation factors to address risks can be developed and evaluated against inhibiting factors such as corruption. This holistic approach to border security helps address the dynamic smuggling interdiction environment, in which illicit activities divert to new locations that offer less resistance.

  11. A systematic approach for analysis and design of secure health information systems.

    Science.gov (United States)

    Blobel, B; Roger-France, F

    2001-06-01

    A toolset using object-oriented techniques including the nowadays popular unified modelling language (UML) approach has been developed to facilitate the different users' views for security analysis and design of health care information systems. Paradigm and concepts used are based on the component architecture of information systems and on a general layered security model. The toolset was developed in 1996/1997 within the ISHTAR project funded by the European Commission as well as through international standardisation activities. Analysing and systematising real health care scenarios, only six and nine use case types could be found in the health and the security-related view, respectively. By combining these use case types, the analysis and design of any thinkable system architecture can be simplified significantly. Based on generic schemes, the environment needed for both communication and application security can be established by appropriate sets of security services and mechanisms. Because of the importance and the basic character of electronic health care record (EHCR) systems, the understanding of the approach is facilitated by (incomplete) examples for this application.

  12. Content Analysis of Papers Submitted to Communications in Information Literacy, 2007-2013

    Directory of Open Access Journals (Sweden)

    Christopher V. Hollister

    2014-07-01

    Full Text Available The author conducted a content analysis of papers submitted to the journal, Communications in Information Literacy, from the years 2007-2013. The purpose was to investigate and report on the overall quality characteristics of a statistically significant sample of papers submitted to a single-topic, open access, library and information science (LIS journal. Characteristics of manuscript submissions, authorship, reviewer evaluations, and editorial decisions were illuminated to provide context; particular emphasis was given to the analysis of major criticisms found in reviewer evaluations of rejected papers. Overall results were compared to previously published research. The findings suggest a trend in favor of collaborative authorship, and a possible trend toward a more practice-based literature. The findings also suggest a possible deterioration in some of the skills that are required of LIS authors relative to the preparation of scholarly papers. The author discusses potential implications for authors and the disciplinary literature, recommends directions for future research, and where possible, provides recommendations for the benefit of the greater community of LIS scholars.

  13. The information value of early career productivity in mathematics: a ROC analysis of prediction errors in bibliometricly informed decision making.

    Science.gov (United States)

    Lindahl, Jonas; Danell, Rickard

    The aim of this study was to provide a framework to evaluate bibliometric indicators as decision support tools from a decision making perspective and to examine the information value of early career publication rate as a predictor of future productivity. We used ROC analysis to evaluate a bibliometric indicator as a tool for binary decision making. The dataset consisted of 451 early career researchers in the mathematical sub-field of number theory. We investigated the effect of three different definitions of top performance groups-top 10, top 25, and top 50 %; the consequences of using different thresholds in the prediction models; and the added prediction value of information on early career research collaboration and publications in prestige journals. We conclude that early career performance productivity has an information value in all tested decision scenarios, but future performance is more predictable if the definition of a high performance group is more exclusive. Estimated optimal decision thresholds using the Youden index indicated that the top 10 % decision scenario should use 7 articles, the top 25 % scenario should use 7 articles, and the top 50 % should use 5 articles to minimize prediction errors. A comparative analysis between the decision thresholds provided by the Youden index which take consequences into consideration and a method commonly used in evaluative bibliometrics which do not take consequences into consideration when determining decision thresholds, indicated that differences are trivial for the top 25 and the 50 % groups. However, a statistically significant difference between the methods was found for the top 10 % group. Information on early career collaboration and publication strategies did not add any prediction value to the bibliometric indicator publication rate in any of the models. The key contributions of this research is the focus on consequences in terms of prediction errors and the notion of transforming uncertainty
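
    The Youden index used above to pick decision thresholds is simple to compute directly: J = sensitivity + specificity - 1, maximized over candidate thresholds. A minimal sketch (the publication counts below are invented, not the study's 451-researcher dataset):

```python
def youden_threshold(scores, labels):
    """Return (best_threshold, best_J) for the rule 'predict positive if score >= t'."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s >= t and l == 1)
        tn = sum(1 for s, l in zip(scores, labels) if s < t and l == 0)
        j = tp / pos + tn / neg - 1.0   # sensitivity + specificity - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical early-career article counts and top-group membership labels.
counts = [2, 3, 4, 5, 6, 7, 8, 9, 10, 12]
top    = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
t, j = youden_threshold(counts, top)
```

    In this toy example the optimal cutoff happens to be 7 articles, echoing the thresholds reported for the top 10 and 25 % scenarios, though that agreement is coincidental here.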

  14. Function analysis for waste information systems

    International Nuclear Information System (INIS)

    Sexton, J.L.; Neal, C.T.; Heath, T.C.; Starling, C.D.

    1996-04-01

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study

  15. Organizational Learning and Strategy: Information Processing Approach of Organizational Learning to Perform Strategic Choice Analysis

    Directory of Open Access Journals (Sweden)

    Agustian Budi Prasetya

    2017-03-01

    Full Text Available A study of organizational learning requires discussing strategy in order to understand a company's organizational knowledge and how the company applies that knowledge in response to a changing environment. The analysis for this research was based on a thorough desk review of the existing literature. This research analyzed the viewpoints of different researchers in organizational learning and elaborates the information-processing approach to Organizational Learning (OL). Based on the desk research, the paper uses the information-processing approach to explain organizational learning and strategic choice by describing the importance of information and assumptions; the activities of knowledge acquisition; the interpretation and distribution of knowledge; and the typology of exploitation and exploration learning. It proposes that companies align their internal managerial processes with the external environment while learning, based on the strategic choice space, as a theoretical clustering map of the learning, fit, alignment, and alliances of the organization. This research finds that the strategic space may help balance exploitation and exploration learning when analyzing varied firm characteristics, strategic orientations, and industrial environments.

  16. Comprehensive analysis of information dissemination in disasters

    Science.gov (United States)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided had an effective pre-warning system been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) during disasters in urban areas, plays a critical role in increasing the time available to respond and reducing the number of deaths and economic losses. In this paper we develop a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. The model can also be used to disseminate information to evacuees making real-time evacuation plans. We analyzed individual information dissemination models for pre-warning in disasters across 14 media: short message service (SMS), phone, television, radio, news portals, Wechat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via the visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  17. Profile of e-patients: analysis of their cancer information-seeking from a national survey.

    Science.gov (United States)

    Kim, Kyunghye; Kwon, Nahyun

    2010-10-01

    Researchers have yet to fully understand how competent e-patients are in selecting and using health information sources, or, more importantly, who e-patients are. This study attempted to uncover how cancer e-patients differ from other cancer information seekers in terms of their sociodemographic background, social networks, information competence, and selection of cancer information sources. We analyzed data from the National Cancer Institute's 2005 Health Information National Trends Survey, and a series of chi-square tests showed that factors that distinguished cancer e-patients from other cancer information seekers were age, gender, education, employment status, health insurance, and membership in online support groups. They were not different in the other factors measured by the survey. Our logistic regression analysis revealed that the e-patients were older and talked about their health issues with friends or family more frequently compared with online health information seekers without cancer. While preferring information from their doctors over the Internet, e-patients used the Internet as their primary source. In contrast to previous literature, we found little evidence that e-patients were savvy health information consumers who could make informed decisions on their own health. The findings of this study addressed a need for a better design and delivery of health information literacy programs for cancer e-patients.

  18. An analysis of contingent factors for the detection of strategic relevance in business information technologies

    Directory of Open Access Journals (Sweden)

    Antonio Paños Álvarez

    2005-01-01

    Full Text Available Information technologies are resources able to create competitive advantages for companies. In this analysis, the resource-based perspective has taken on special relevance, because it is argued that these advantages should be identified, attained and maintained. This work addresses several contingent factors in the process of assessing these potential advantages. It proposes a portfolio to help select which information technologies are valuable, for which companies and in which activity areas, and studies how the sector, the technological innovation profile, the size and the financial capacity of a company affect this process.

  19. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and a survey, we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  20. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of an object's motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined to be either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
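
    The final normal/abnormal decision rests on a Shannon-entropy measure over the learned trajectory clusters. A heavily simplified sketch of that idea (the cluster frequencies are toy values, and the surprisal cutoff k·H is an illustrative rule, not the paper's exact criterion):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def flag_abnormal(cluster_counts, k=2.0):
    """Flag clusters whose surprisal -log2(p) is well above the entropy H (k is a tuning knob)."""
    total = sum(cluster_counts.values())
    probs = {c: n / total for c, n in cluster_counts.items()}
    h = shannon_entropy(probs.values())
    return {c: -math.log2(p) > k * h for c, p in probs.items()}, h

# Toy trajectory clusters: u-turn and zigzag motions are rarely observed.
counts = {"straight": 70, "left_turn": 20, "u_turn": 8, "zigzag": 2}
abnormal, h = flag_abnormal(counts)
```

    Trajectories falling into the rarely observed clusters are then reported as anomalies.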

  1. Flipping the analytical coin : closing the information flow loop in high speed (real time) analysis

    NARCIS (Netherlands)

    K.E. Shahroudi

    1997-01-01

    Analysis modules tend to be set up as a one-way flow of information, i.e. a clear distinction between cause and effect, or input and output. However, as the speed of analysis approaches real time (or faster than movie rate), it becomes increasingly difficult for an external user to

  2. Quantifying information transfer by protein domains: Analysis of the Fyn SH2 domain structure

    DEFF Research Database (Denmark)

    Lenaerts, Tom; Ferkinghoff-Borg, Jesper; Stricher, Francois

    2008-01-01

    instance of communication over a noisy channel. In particular, we analyze the conformational correlations between protein residues and apply the concept of mutual information to quantify information exchange. Mapping out changes of mutual information on the protein structure then allows visualizing how...... distal communication is achieved. We illustrate the approach by analyzing information transfer by the SH2 domain of Fyn tyrosine kinase, obtained from Monte Carlo dynamics simulations. Our analysis reveals that the Fyn SH2 domain forms a noisy communication channel that couples residues located......Background: Efficient communication between distant sites within a protein is essential for cooperative biological response. Although often associated with large allosteric movements, more subtle changes in protein dynamics can also induce long-range correlations. However, an appropriate formalism...
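
    The central quantity in the analysis above, the mutual information between the conformational states of two residues, can be estimated from paired samples with a plug-in histogram estimator. The two-state "residues" below are toy data for illustration, not the Monte Carlo trajectories from the paper:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits from paired samples, via plug-in histogram estimates."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * n * n / (px[x] * py[y]) equals p(x,y) / (p(x) p(y)).
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Two residues that always switch conformational state together: 1 bit shared.
coupled_a = ["open", "open", "closed", "closed"] * 25
coupled_b = ["up", "up", "down", "down"] * 25
mi_coupled = mutual_information(coupled_a, coupled_b)

# A residue whose state is independent of the first: no information shared.
indep_b = ["up", "down"] * 50
mi_indep = mutual_information(coupled_a, indep_b)
```

    Mapping such pairwise estimates onto the structure is what lets the authors visualize which residues form a communication channel.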

  3. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    Science.gov (United States)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  4. 76 FR 65317 - 60-Day Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM)

    Science.gov (United States)

    2011-10-20

    ..., Risk Management and Analysis (RAM) ACTION: Notice of request for public comments. SUMMARY: The... of 1995. Title of Information Collection: Risk Analysis and Management. OMB Control Number: None.... Methodology: The State Department, is implementing a Risk Analysis and Management Program to vet potential...

  5. Information entropies in antikaon-nucleon scattering and optimal state analysis

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.; Petrascu, C.

    1998-01-01

    It is known that Jaynes interpreted the entropy as the expected self-information of a class of mutually exclusive and exhaustive events, while the probability is considered to be the rational degree of belief we assign to events based on available experimental evidence. The axiomatic derivation of Jaynes principle of maximum entropy as well as of the Kullback principle of minimum cross-entropy have been reported. Moreover, the optimal states in the Hilbert space of the scattering amplitude, which are analogous to the coherent states from the Hilbert space of the wave functions, were introduced and developed. The possibility that each optimal state possesses a specific minimum entropic uncertainty relation similar to that of the coherent states was recently conjectured. In fact, the (angle and angular momenta) information entropies, as well as the entropic angle-angular momentum uncertainty relations, in the hadron-hadron scattering, are introduced. The experimental information entropies for the pion-nucleon scattering are calculated by using the available phase shift analyses. These results are compared with the information entropies of the optimal states. Then, the optimal state dominance in the pion-nucleon scattering is systematically observed over the whole range P_LAB = 0.02-10 GeV/c. Also, it is shown that the angle-angular momentum entropic uncertainty relations are satisfied with high accuracy by all the experimental information entropies. In this paper the (angle and angular momentum) information entropies of hadron-hadron scattering are experimentally investigated by using the antikaon-nucleon phase shift analysis. Then, it is shown that the experimental entropies are in agreement with the informational entropies of optimal states. The results obtained in this paper can be explained not only by the presence of an optimal background which accompanied the production of the elementary resonances but also by the presence of the optimal resonances. On the other hand

  6. The Accuracy of the Information Presented in Credit Bureau Reports: Research and Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Vladimir Simović

    2015-12-01

    Full Text Available This paper presents research results regarding information accuracy in Serbian credit bureau reports and tries to identify the reasons which affect the accuracy of information presented in credit bureau reports in global terms. The research was conducted by interviewing respondents, and comparative analysis was used to formulate a proposal of factors which determine information accuracy in credit bureau reports. The results show that materially significant errors in the information presented in Serbian credit bureau reports amount to 0.5% of the sample. This implies that creditors in Serbia base their credit decisions on reliable information. The results of this study were compared with the results of studies conducted in the USA and Germany in order to formulate a proposal of factors which influence information accuracy in credit bureau reports. To improve information accuracy in credit bureau reports in global terms, special attention should be paid to the formulation of international standards of credit reporting and identification systems for natural persons and legal entities.

  7. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Directory of Open Access Journals (Sweden)

    Nicholas Generous

    Full Text Available The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
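
    The additive scoring at the heart of Multi-Attribute Utility Theory can be sketched in a few lines. The criteria names, weights and 0-1 ratings below are invented for illustration; a real application would elicit them from biosurveillance stakeholders.

```python
# Hypothetical multi-attribute utility scoring of two candidate data streams.
# Criteria, weights and ratings are invented, not taken from the framework above.

def maut_score(ratings, weights):
    """Weighted additive utility over criteria (weights must sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * ratings[c] for c in weights)

weights = {"timeliness": 0.40, "coverage": 0.35, "cost_efficiency": 0.25}
stream_a = {"timeliness": 0.9, "coverage": 0.6, "cost_efficiency": 0.5}
stream_b = {"timeliness": 0.5, "coverage": 0.8, "cost_efficiency": 0.9}

# rank the candidate streams by utility, best first
ranked = sorted([("A", maut_score(stream_a, weights)),
                 ("B", maut_score(stream_b, weights))],
                key=lambda kv: kv[1], reverse=True)
print(ranked)  # stream B edges out A: 0.705 vs 0.695
```

    The ranking is only as good as the elicited weights, which is why sensitivity analysis on the weight vector normally accompanies this kind of scoring.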

  8. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Science.gov (United States)

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method for applying formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.

  9. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  10. Maximising Organisational Information Sharing and Effective Intelligence Analysis in Critical Data Sets. A case study on the information science needs of the Norwegian criminal intelligence and law enforcement community

    OpenAIRE

    Wilhelmsen, Sonja

    2009-01-01

    Organisational information sharing has become more and more important as the amount of information grows. In order to accomplish the most effective and efficient sharing of information, analysis of the information needs and the organisational needs is vital. This dissertation focuses on the information needs sourced through the critical data sets of law enforcement organisations; specifically the Norwegian criminal intelligence and law enforcement community represented by the Na...

  11. How information about overdetection changes breast cancer screening decisions: a mediation analysis within a randomised controlled trial.

    Science.gov (United States)

    Hersch, Jolyn; McGeechan, Kevin; Barratt, Alexandra; Jansen, Jesse; Irwig, Les; Jacklyn, Gemma; Houssami, Nehmat; Dhillon, Haryana; McCaffery, Kirsten

    2017-10-06

    In a randomised controlled trial, we found that informing women about overdetection changed their breast screening decisions. We now present a mediation analysis exploring the psychological pathways through which study participants who received the intervention processed information about overdetection and how this influenced their decision-making. We examined a series of potential mediators in the causal chain between exposure to overdetection information and women's subsequently reported breast screening intentions. Design: serial multiple mediation analysis within a randomised controlled trial. Setting: New South Wales, Australia. Participants: 811 women aged 48-50 years with no personal history of breast cancer. Intervention: two versions of a decision aid giving women information about breast cancer deaths averted and false positives from mammography screening, either with (intervention) or without (control) information on overdetection. Outcome: intentions to undergo breast cancer screening in the next 2-3 years. Mediators: knowledge about overdetection, worry about breast cancer, attitudes towards breast screening and anticipated regret. The effect of information about overdetection on women's breast screening intentions was mediated through multiple cognitive and affective processes. In particular, the information led to substantial improvements in women's understanding of overdetection, and it influenced their attitudes towards having screening, both directly and indirectly via its effect on knowledge. Mediation analysis showed that the mechanisms involving knowledge and attitudes were particularly important in determining women's intentions about screening participation. Even in this emotive context, new information influenced women's decision-making by changing their understanding of the possible consequences of screening and their attitudes towards undergoing it. These findings emphasise the need to provide good-quality information on screening outcomes and to communicate this information effectively, so that women can

  12. Needs of informal caregivers across the caregiving course in amyotrophic lateral sclerosis: a qualitative analysis.

    LENUS (Irish Health Repository)

    Galvin, Miriam

    2018-01-27

    Amyotrophic lateral sclerosis (ALS), also known as motor neuron disease (MND), is a debilitating terminal condition. Informal caregivers are key figures in ALS care provision. The physical, psychological and emotional impact of providing care in the home requires appropriate assistance and support. The objective of this analysis is to explore the needs of informal ALS caregivers across the caregiving course.

  13. INFORMED DESIGN DECISION-MAKING: FROM DIGITAL ANALYSIS TO URBAN DESIGN

    Directory of Open Access Journals (Sweden)

    Camilla Pezzica

    2017-11-01

    Full Text Available This study describes a new approach to exploring the design of public open spaces, based on a multidimensional analysis useful for informing decision-making and fostering the development of evidence-based architectural solutions. It presents an overview of the most relevant design variables and their constraints, providing, in this way, valuable information for the elaboration of a more sustainable urban design, considerate of local socio-cultural values. This research aims at providing holistic guidance for the development of better design proposals in contemporary urban environments. More specifically, it seeks to synchronously characterize urban spaces at a multi-scale and multidimensional level, both quantitatively and qualitatively, by collecting contributions from Space Syntax Theory, Public Life Studies, Building Science and Environmental/Comfort Analysis in public open spaces. Many advanced digital tools are used for data management purposes and to generate and test different design proposals iteratively. The proposed methodology is based on a range of tests and analyses performed in the process of developing a new experimental project for Largo da Graça, an urban square located in Lisbon's historic centre, which allowed the testing of different design solutions. This experiment generated a digital workflow for the design of the urban square, in which are registered all the steps undertaken to solve the many design problems identified by considering the efficiency targets (centrality, connectivity, enclosure, thermal comfort, security, social equity and interaction). The process comprises a sequence of comparative design reviews and records the choices made when dealing with latent information underlying changing conditions in the use of public space and the programmatic malleability of the Portuguese plaza. The description of the adopted design strategy and the examples extracted from the workflow are used to illustrate the practical

  14. Information Crisis

    CERN Document Server

    Losavio, Michael

    2012-01-01

    Information Crisis discusses the scope and types of information available online and teaches readers how to critically assess it and analyze potentially dangerous information, especially when teachers, editors, or other information gatekeepers are not available to assess the information for them. Chapters and topics include: the Internet as an information tool; critical analysis; legal issues, traps, and tricks; protecting personal safety and identity; and types of online information.

  15. Big Data Analysis of Contractor Performance Information for Services Acquisition in DoD: A Proof of Concept

    Science.gov (United States)

    2016-04-30

    Thirteenth Annual Acquisition Research Symposium, Thursday Sessions, Volume II. Big Data Analysis of Contractor Performance Information for Services Acquisition in DoD: A Proof of Concept. Director, Acquisition Career Management, ASN(RD&A). Acquisition Research Program: Creating Synergy for Informed Change.

  16. Development and application of traffic flow information collecting and analysis system based on multi-type video

    Science.gov (United States)

    Lu, Mujie; Shang, Wenjie; Ji, Xinkai; Hua, Mingzhuang; Cheng, Kuo

    2015-12-01

    Nowadays, the intelligent transportation system (ITS) has become the new direction of transportation development. Traffic data, as a fundamental part of an intelligent transportation system, has an increasingly crucial status. In recent years, video observation technology has been widely used in the field of traffic information collection. Traffic flow information contained in video data has many advantages: it is comprehensive and can be stored for a long time. However, there are still many problems in the process of collecting this information, such as low precision and high cost. Aiming at these problems, this paper proposes a traffic target detection method with broad applicability. Based on three different ways of obtaining video data (aerial photography, fixed cameras and handheld cameras), we develop intelligent analysis software that can extract the macroscopic and microscopic traffic flow information in the video; this information can be used for traffic analysis and transportation planning. For road intersections, the system uses the frame difference method to extract traffic information; for freeway sections, the system uses the optical flow method to track vehicles. The system was applied in Nanjing, Jiangsu province, and the application shows that it extracts different types of traffic flow information with high accuracy; it can meet the needs of traffic engineering observations and has a good application prospect.
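
    The frame difference method mentioned for intersections can be illustrated with a minimal sketch on synthetic grayscale frames. The threshold and frame contents are invented; real systems add background modelling and blob tracking on top of this step.

```python
import numpy as np

# Minimal frame-difference sketch: pixels whose intensity changes by more than
# a threshold between consecutive frames are flagged as moving. Synthetic
# 8-bit frames stand in for real video here.

def frame_difference_mask(prev_frame, curr_frame, threshold=25):
    # widen to int16 so the subtraction cannot wrap around at 0/255
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold       # boolean mask of "moving" pixels

prev_frame = np.zeros((4, 4), dtype=np.uint8)   # empty road
curr_frame = prev_frame.copy()
curr_frame[1:3, 1:3] = 200                      # a "vehicle" enters the scene

mask = frame_difference_mask(prev_frame, curr_frame)
print(int(mask.sum()))  # 4 pixels changed
```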

  17. Three dimensional visualization breakthrough in analysis and communication of technical information for nuclear waste management

    International Nuclear Information System (INIS)

    Alexander, D.H.; Cerny, B.A.; Hill, E.R.; Krupka, K.M.; Smoot, J.L.; Smith, D.R.; Waldo, K.

    1990-11-01

    Computer graphics systems that provide interactive display and manipulation of three-dimensional data are powerful tools for the analysis and communication of technical information required for characterization and design of a geologic repository for nuclear waste. Greater understanding of site performance and repository design information is possible when performance-assessment modeling results can be visually analyzed in relation to site geologic and hydrologic information and engineering data for surface and subsurface facilities. In turn, this enhanced visualization capability provides better communication between technical staff and program management with respect to analysis of available information and prioritization of program planning. A commercially-available computer system was used to demonstrate some of the current technology for three-dimensional visualization within the architecture of systems for nuclear waste management. This computer system was used to interactively visualize and analyze the information for two examples: (1) site-characterization and engineering data for a potential geologic repository at Yucca Mountain, Nevada; and (2) three-dimensional simulations of a hypothetical release and transport of contaminants from a source of radionuclides to the vadose zone. Users may assess the three-dimensional distribution of data and modeling results by interactive zooming, rotating, slicing, and peeling operations. For those parts of the database where information is sparse or not available, the software incorporates models for the interpolation and extrapolation of data over the three-dimensional space of interest. 12 refs., 4 figs

  18. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    Directory of Open Access Journals (Sweden)

    Hester V Eeren

    Full Text Available To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that compared two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal-activity-free year. At a societal willingness-to-pay of €71,700 per criminal-activity-free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million on reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted in line with health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform reimbursement decisions regarding interventions for juvenile delinquents.
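
    The quantity being estimated above, the expected value of perfect information (EVPI), can be sketched by Monte Carlo as the gap between deciding after and before uncertainty is resolved. All effect and cost distributions below are invented; only the €71,700 willingness-to-pay echoes the abstract.

```python
import random

# Monte Carlo sketch of EVPI for a two-option decision. Distributions and
# costs are illustrative, not the study's actual model parameters.
random.seed(0)

def net_benefit(effect, cost, wtp=71700):
    """Monetary net benefit at a given willingness-to-pay per unit effect."""
    return wtp * effect - cost

draws = []
for _ in range(10_000):
    nb_a = net_benefit(random.gauss(1.0, 0.3), 40_000)   # intervention A
    nb_b = net_benefit(random.gauss(1.2, 0.5), 60_000)   # intervention B
    draws.append((nb_a, nb_b))

# decide now: pick the option with the best average net benefit ...
ev_current = max(sum(d[i] for d in draws) / len(draws) for i in range(2))
# ... versus decide with perfect knowledge of each parameter draw
ev_perfect = sum(max(d) for d in draws) / len(draws)

evpi = ev_perfect - ev_current   # value of eliminating parameter uncertainty
print(round(evpi))
```

    Because the mean of a maximum is never smaller than the maximum of the means, EVPI is always non-negative, which is what makes it usable as an upper bound on the worth of further research.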

  19. Needs of informal caregivers across the caregiving course in amyotrophic lateral sclerosis: a qualitative analysis

    Science.gov (United States)

    Carney, Sile; Corr, Bernie; Mays, Iain; Pender, Niall; Hardiman, Orla

    2018-01-01

    Objectives Amyotrophic lateral sclerosis (ALS), also known as motor neuron disease (MND), is a debilitating terminal condition. Informal caregivers are key figures in ALS care provision. The physical, psychological and emotional impact of providing care in the home requires appropriate assistance and support. The objective of this analysis is to explore the needs of informal ALS caregivers across the caregiving course. Design In an open-ended question as part of a semistructured interview, caregivers were asked what would help them in their role. Interviews took place on three occasions at 4-month to 6-month intervals. Demographic, burden and quality of life data were collected, in addition to the open-ended responses. We carried out descriptive statistical analysis and thematic analysis of qualitative data. Setting and participants Home interviews at baseline (n=81) and on two further occasions (n=56, n=41) with informal caregivers of people with ALS attending the National ALS/MND Clinic at Beaumont Hospital, Dublin, Ireland. Results The majority of caregivers were family members. Hours of care provided and caregiver burden increased across the interview series. Thematic analysis identified what would help them in their role, and needs related to external support and services, psychological-emotional factors, patient-related behaviours, a cure and ‘nothing’. Themes were interconnected and their prevalence varied across the interview time points. Conclusion This study has shown the consistency and adaptation in what caregivers identified as helpful in their role, across 12–18 months of a caregiving journey. Support needs are clearly defined, and change with time and the course of caregiving. Caregivers need support from family, friends and healthcare professionals in managing their tasks and the emotional demands of caregiving. Identifying the specific needs of informal caregivers should enable health professionals to provide tailored supportive interventions.

  20. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures, which have already been used intensively in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far, but many of them lack a meaningful interpretation; for example, it is often unclear which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between selected information measures for graphs by performing a large-scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure, called the topological information content of a graph, and others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, high uniqueness is an important and desirable property when designing novel topological descriptors with the potential to be applied to large chemical databases.
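
    A partition-based graph entropy of the kind discussed can be sketched by grouping vertices by degree, a deliberately crude stand-in for the vertex-orbit partition behind the classical topological information content.

```python
import math
from collections import Counter

# Shannon entropy of a vertex partition: vertices are grouped by degree and
# the entropy of the class-size distribution is returned. Degree classes are
# an illustrative simplification of the orbit partition.

def degree_partition_entropy(edges, n_vertices):
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    # sizes of the vertex classes sharing the same degree
    sizes = Counter(degree.get(v, 0) for v in range(n_vertices)).values()
    return -sum((s / n_vertices) * math.log2(s / n_vertices) for s in sizes)

# 4-vertex path: degrees 1,2,2,1 -> two classes of size 2 -> entropy 1.0 bit
print(degree_partition_entropy([(0, 1), (1, 2), (2, 3)], 4))  # 1.0
```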

  1. Niche Overlap and Discrediting Acts: An Empirical Analysis of Informing in Hollywood

    Directory of Open Access Journals (Sweden)

    Giacomo Negro

    2015-06-01

    Full Text Available This article examines informing on others as a discrediting act between individual agents in a labor market. We conduct an empirical analysis of artists called to testify during the 1950s Congressional hearings into Communism in Hollywood, and multi-level regression models reveal that the odds of an artist informing on another increase when their career histories are more similar. The similarity reflects levels of niche overlap in the labor market. The finding that similarity contributes to discredit in the context of resource competition is compatible with a social comparison process, whereby uncertainty about performance leads more similar people to attend to and exclude one another to a greater extent.

  2. Duopoly Market Analysis within One-Shot Decision Framework with Asymmetric Possibilistic Information

    Directory of Open Access Journals (Sweden)

    Peijun Guo

    2010-12-01

    Full Text Available In this paper, a newly emerging duopoly market with a short life cycle is analyzed. The partially known information of market is characterized by the possibility distribution of the parameter in the demand function. Since the life cycle of the new product is short, how many products should be produced by two rival firms is a typical one-shot decision problem. Within the one-shot decision framework, the possibilistic Cournot equilibrium is obtained for the optimal production level of each firm in a duopoly market with asymmetrical possibilistic information. The analysis results show that the proposed approaches are reasonable for one-shot decision problems, which are extensively encountered in business and economics.
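
    For context, the crisp Cournot benchmark that the possibilistic equilibrium generalizes has a closed form under linear inverse demand P = a - b(q1 + q2) with common marginal cost c: each firm produces q* = (a - c) / (3b). The parameter values below are invented.

```python
# Classical symmetric Cournot-Nash equilibrium for two identical firms under
# linear demand. The possibilistic version in the paper replaces the known
# demand parameter with a possibility distribution; this is only the crisp base case.

def cournot_equilibrium(a, b, c):
    q = (a - c) / (3 * b)          # symmetric equilibrium quantity per firm
    price = a - b * 2 * q          # market price at total output 2q
    profit = (price - c) * q       # per-firm profit
    return q, price, profit

q, price, profit = cournot_equilibrium(a=120, b=1.0, c=30)
print(q, price, profit)  # 30.0 60.0 900.0
```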

  3. A Comparative Analysis of the Value of Information in a Continuous Time Market Model with Partial Information: The Cases of Log-Utility and CRRA

    Directory of Open Access Journals (Sweden)

    Zhaojun Yang

    2011-01-01

    Full Text Available We study the question of what value an agent in a generalized Black-Scholes model with partial information attributes to the complementary information. To do this, we study the utility maximization problem from terminal wealth for the two cases of partial information and full information. We assume that the drift term of the risky asset is a dynamic process of general linear type and that the two levels of observation correspond to whether this drift term is observable or not. Applying methods from stochastic filtering theory, we derive an analytically tractable formula for the value of information in the case of logarithmic utility. For the case of constant relative risk aversion (CRRA), we derive a semianalytical formula, which uses as an input the numerical solution of a system of ODEs. For both cases we present a comparative analysis.

  4. Analysis of information for cerebrovascular disorders obtained by 3D MR imaging

    International Nuclear Information System (INIS)

    Yoshikawa, Kohki; Yoshioka, Naoki; Watanabe, Fumio; Shiono, Takahiro; Sugishita, Morihiro; Umino, Kazunori.

    1995-01-01

    Recently, it has become easy to analyze information obtained by 3D MR imaging, owing to remarkable progress in fast MR imaging techniques and analysis tools. Six patients suffering from aphasia (four cerebral infarctions and two bleedings) underwent 3D MR imaging (3D FLASH-TR/TE/flip angle; 20-50 msec/6-10 msec/20-30 degrees), and their volume information was analyzed by multiple projection reconstruction (MPR), surface-rendering 3D reconstruction, and volume-rendering 3D reconstruction using Volume Design PRO (Medical Design Co., Ltd.). Four of the patients were diagnosed clinically with Broca's aphasia, and their lesions could be detected around the cortices of the left inferior frontal gyrus. The other two patients were diagnosed with Wernicke's aphasia, and their lesions could be detected around the cortices of the left supramarginal gyrus. This technique for 3D volume analysis would provide quite exact locational information about cerebral cortical lesions. (author)

  5. Analysis of information for cerebrovascular disorders obtained by 3D MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Yoshikawa, Kohki [Tokyo Univ. (Japan). Inst. of Medical Science; Yoshioka, Naoki; Watanabe, Fumio; Shiono, Takahiro; Sugishita, Morihiro; Umino, Kazunori

    1995-12-01

    Recently, it has become easy to analyze information obtained by 3D MR imaging, owing to remarkable progress in fast MR imaging techniques and analysis tools. Six patients suffering from aphasia (four cerebral infarctions and two bleedings) underwent 3D MR imaging (3D FLASH-TR/TE/flip angle; 20-50 msec/6-10 msec/20-30 degrees), and their volume information was analyzed by multiple projection reconstruction (MPR), surface-rendering 3D reconstruction, and volume-rendering 3D reconstruction using Volume Design PRO (Medical Design Co., Ltd.). Four of the patients were diagnosed clinically with Broca's aphasia, and their lesions could be detected around the cortices of the left inferior frontal gyrus. The other two patients were diagnosed with Wernicke's aphasia, and their lesions could be detected around the cortices of the left supramarginal gyrus. This technique for 3D volume analysis would provide quite exact locational information about cerebral cortical lesions. (author).

  6. Information-Based Analysis of Data Assimilation (Invited)

    Science.gov (United States)

    Nearing, G. S.; Gupta, H. V.; Crow, W. T.; Gong, W.

    2013-12-01

    Data assimilation is defined as the Bayesian conditioning of uncertain model simulations on observations for the purpose of reducing uncertainty about model states. Practical data assimilation methods make the application of Bayes' law tractable either by employing assumptions about the prior, posterior and likelihood distributions (e.g., the Kalman family of filters) or by using resampling methods (e.g., bootstrap filter). We propose to quantify the efficiency of these approximations in an OSSE setting using information theory and, in an OSSE or real-world validation setting, to measure the amount, and more importantly the quality, of information extracted from observations during data assimilation. To analyze DA assumptions, uncertainty is quantified as the Shannon-type entropy of a discretized probability distribution. The maximum amount of information that can be extracted from observations about model states is the mutual information between states and observations, which is equal to the reduction in entropy in our estimate of the state due to Bayesian filtering. The difference between this potential and the actual reduction in entropy due to Kalman (or other type of) filtering measures the inefficiency of the filter assumptions. Residual uncertainty in DA posterior state estimates can be attributed to three sources: (i) non-injectivity of the observation operator, (ii) noise in the observations, and (iii) filter approximations. The contribution of each of these sources is measurable in an OSSE setting. The amount of information extracted from observations by data assimilation (or system identification, including parameter estimation) can also be measured by Shannon's theory. Since practical filters are approximations of Bayes' law, it is important to know whether the information that is extracted from observations by a filter is reliable. We define information as either good or bad, and propose to measure these two types of information using partial
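
    The entropy and mutual-information quantities described above can be illustrated on a toy discrete joint distribution (the numbers are invented). The mutual information I(state; obs) is the upper bound on the entropy reduction any filter could achieve from that observation.

```python
import math

# Entropy of the prior state distribution and mutual information between
# state and observation, computed from a small invented joint distribution.

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# joint p(state, obs) over 2 states x 2 observations
joint = [[0.4, 0.1],
         [0.1, 0.4]]
p_state = [sum(row) for row in joint]          # marginal over states
p_obs = [sum(col) for col in zip(*joint)]      # marginal over observations

mi = sum(joint[s][o] * math.log2(joint[s][o] / (p_state[s] * p_obs[o]))
         for s in range(2) for o in range(2) if joint[s][o] > 0)

print(round(entropy(p_state), 3))  # prior uncertainty: 1.0 bit
print(round(mi, 3))                # about 0.278 bits recoverable at best
```

    Comparing this bound with the entropy reduction actually delivered by a Kalman-type filter is what the abstract proposes as a measure of filter inefficiency.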

  7. Using pattern structures to support information retrieval with Formal Concept Analysis

    OpenAIRE

    Codocedo , Victor; Lykourentzou , Ioanna; Astudillo , Hernan; Napoli , Amedeo

    2013-01-01

    International audience; In this paper we introduce a novel approach to information retrieval (IR) based on Formal Concept Analysis (FCA). The use of concept lattices to support the task of document retrieval in IR has proven effective since they allow querying in the space of terms modelled by concept intents and navigation in the space of documents modelled by concept extents. However, current approaches use binary representations to illustrate the relations between documents and terms (''do...

  8. Current status and future plan of INSS's analysis of overseas trouble information

    International Nuclear Information System (INIS)

    Akazawa, Takashi; Okumoto, Masaru

    2017-01-01

    In order to enhance the safety of nuclear power stations, it is important to analyze both domestic and overseas trouble information to understand the causes, so that similar troubles at our own nuclear power stations can be prevented by carrying out countermeasures beforehand. At the Institute of Nuclear Safety System (INSS), we mainly collect and analyze overseas trouble information. So far, we have accumulated about 130,000 such records, analyzing about 4,000 event reports every year and presenting suggestions to PWR utilities whenever we recognize the need for countermeasures in Japan; for example, we submitted five suggestions to PWR utilities last year. Moreover, we are learning about similar activities carried out by the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) in France to help us further improve our analysis skills. As a plan for further improvement, we have started efforts to share knowledge and cooperate across PWR and BWR operators to deepen the analysis. Our operating experience (OE) analysis activity is to be consolidated in fiscal year 2019 under the structure of JANSI, together with BWR OE analysis, to follow up effectively with additional activities such as suggestions coordinated with JANSI's peer reviews. (author)

  9. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employed post-earthquake collapse data from the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back substitution estimation method, showing high accuracy and no errors, and matched the prediction results of the uncertainty measure method. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
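    The two steps described above can be sketched as follows. This is a minimal illustration with hypothetical index values, class labels, and only three of the paper's nine indexes; a pooled within-class covariance is used, as in classical distance discriminant analysis.

```python
import numpy as np

# Hypothetical training data: rows = surveyed slopes, columns = three stand-in
# indexes (e.g. gradient in degrees, height in m, vegetation cover rate); the
# paper's model uses nine indexes, omitted here for brevity.
X = np.array([[35.0, 20.0, 0.80],
              [40.0, 25.0, 0.70],
              [33.0, 30.0, 0.75],
              [42.0, 18.0, 0.85],
              [55.0, 60.0, 0.30],
              [60.0, 70.0, 0.20],
              [52.0, 75.0, 0.25],
              [63.0, 58.0, 0.35]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 0 = low risk, 1 = high risk

def entropy_weights(X):
    """Entropy measure method: indexes whose values are spread more unevenly
    across samples (lower entropy) are more discriminating and get more weight."""
    P = X / X.sum(axis=0)                                     # column-normalize
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))
    return (1.0 - E) / (1.0 - E).sum()

def mahalanobis_classify(x, X, y, w):
    """Distance discriminant analysis: assign x to the class whose mean is
    closest in Mahalanobis distance, in the entropy-weighted index space."""
    Xw, xw = X * w, x * w
    classes = np.unique(y)
    # pooled within-class covariance
    centered = np.vstack([Xw[y == c] - Xw[y == c].mean(axis=0) for c in classes])
    cov = centered.T @ centered / (len(X) - len(classes))
    inv = np.linalg.inv(cov)
    d = [(xw - Xw[y == c].mean(0)) @ inv @ (xw - Xw[y == c].mean(0))
         for c in classes]
    return int(classes[np.argmin(d)])

w = entropy_weights(X)
print(w, mahalanobis_classify(np.array([58.0, 65.0, 0.25]), X, y, w))
```

    The weights sum to one, and a new slope is classified by plugging its index vector into the discriminant; the paper's back substitution check corresponds to re-classifying the training samples themselves.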

  10. Application of Vibration and Oil Analysis for Reliability Information on Helicopter Main Rotor Gearbox

    Science.gov (United States)

    Murrad, Muhamad; Leong, M. Salman

    Based on the experiences of the Malaysian Armed Forces (MAF), failure of the main rotor gearbox (MRGB) was one of the major contributing factors to helicopter breakdowns. Even though vibration and oil analysis are effective techniques for monitoring the health of helicopter components, these two techniques were rarely combined to form an effective assessment tool in the MAF. Results of the oil analysis were often used only for the oil changing schedule, while assessments of MRGB condition were mainly based on overall vibration readings. A study group was formed and given a mandate to improve the maintenance strategy of the S61-A4 helicopter fleet in the MAF. The improvement consisted of a structured approach to the reassessment/redefinition of suitable maintenance actions for the MRGB. Basic and enhanced condition monitoring (CM) tools were investigated to address the predominant failures of the MRGB. Quantitative accelerated life testing (QALT) was considered in this work with the intent of obtaining the required reliability information in a shorter time than testing under normal stress conditions. When performed correctly, these tests can provide valuable information about MRGB performance under normal operating conditions, enabling maintenance personnel to make decisions more quickly, accurately and economically. The time-to-failure and probability-of-failure information for the MRGB was generated by applying QALT analysis principles. This study is anticipated to make a dramatic change in the MAF's approach to CM, bringing significant savings and various benefits to the MAF.

  11. A Review of Methods for Analysis of the Expected Value of Information.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
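    The regression-based EVPPI idea the review favors can be sketched on a toy decision problem. The net-benefit model below is hypothetical, and a cubic polynomial stands in for the nonparametric smoothers (GAMs, Gaussian processes) used in practice: regress each option's simulated net benefit on the parameter of interest, then take the expected maximum of the fitted conditional means.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000

# Hypothetical PSA samples: theta is the parameter of interest, psi the
# remaining uncertainty; columns of nb are the net benefits of two options.
theta = rng.normal(0.0, 1.0, N)
psi = rng.normal(0.0, 2.0, N)
nb = np.column_stack([np.zeros(N),               # standard care (reference)
                      1.5 * theta + psi - 0.5])  # new treatment

# EVPI: value of resolving all uncertainty.
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()

# EVPPI by regression: estimate E[NB_t | theta] with a smoother, then take
# the expected maximum of the fitted conditional means.
fitted = np.column_stack([
    np.polyval(np.polyfit(theta, nb[:, t], deg=3), theta) for t in range(2)
])
evppi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
print(evpi, evppi)
```

    By construction EVPPI never exceeds EVPI, since learning theta alone resolves only part of the decision uncertainty; the regression step is what makes this computable from a single set of PSA samples, without nested simulation.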

  12. Automated analysis of information processing, kinetic independence and modular architecture in biochemical networks using MIDIA.

    Science.gov (United States)

    Bowsher, Clive G

    2011-02-15

    Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.

  13. PathNet: a tool for pathway analysis using topological information

    Directory of Open Access Journals (Sweden)

    Dutta Bhaskar

    2012-09-01

    Full Text Available Background: Identification of canonical pathways through enrichment of differentially expressed genes in a given pathway is a widely used method for interpreting gene lists generated from high-throughput experimental studies. However, most algorithms treat pathways as sets of genes, disregarding any inter- and intra-pathway connectivity information, and do not provide insights beyond identifying lists of pathways. Results: We developed an algorithm (PathNet) that utilizes the connectivity information in canonical pathway descriptions to help identify study-relevant pathways and characterize non-obvious dependencies and connections among pathways using gene expression data. PathNet considers both the differential expression of genes and their pathway neighbors to strengthen the evidence that a pathway is implicated in the biological conditions characterizing the experiment. As an adjunct to this analysis, PathNet uses the connectivity of the differentially expressed genes among all pathways to score pathway contextual associations and statistically identify biological relations among pathways. In this study, we used PathNet to identify biologically relevant results in two Alzheimer’s disease microarray datasets, and compared its performance with existing methods. Importantly, PathNet identified de-regulation of the ubiquitin-mediated proteolysis pathway as an important component in Alzheimer’s disease progression, despite the absence of this pathway in the standard enrichment analyses. Conclusions: PathNet is a novel method for identifying enrichment and association between canonical pathways in the context of gene expression data. It takes into account topological information present in pathways to reveal biological information. PathNet is available as an R workspace image from http://www.bhsai.org/downloads/pathnet/.

  14. Importance of Viral Sequence Length and Number of Variable and Informative Sites in Analysis of HIV Clustering.

    Science.gov (United States)

    Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor; Essex, M

    2015-05-01

    To improve the methodology of HIV cluster analysis, we addressed how analysis of HIV clustering is associated with parameters that can affect the outcome of viral clustering. The extent of HIV clustering and tree certainty was compared between 401 HIV-1C near full-length genome sequences and subgenomic regions retrieved from the LANL HIV Database. Sliding window analysis was based on 99 windows of 1,000 bp and 45 windows of 2,000 bp. Potential associations between the extent of HIV clustering and sequence length and the number of variable and informative sites were evaluated. The near full-length genome HIV sequences showed the highest extent of HIV clustering and the highest tree certainty. At the bootstrap threshold of 0.80 in maximum likelihood (ML) analysis, 58.9% of near full-length HIV-1C sequences but only 15.5% of partial pol sequences (ViroSeq) were found in clusters. Among HIV-1 structural genes, pol showed the highest extent of clustering (38.9% at a bootstrap threshold of 0.80), although it was significantly lower than in the near full-length genome sequences. The extent of HIV clustering was significantly higher for sliding windows of 2,000 bp than 1,000 bp. We found a strong association between the sequence length and proportion of HIV sequences in clusters, and a moderate association between the number of variable and informative sites and the proportion of HIV sequences in clusters. In HIV cluster analysis, the extent of detectable HIV clustering is directly associated with the length of viral sequences used, as well as the number of variable and informative sites. Near full-length genome sequences could provide the most informative HIV cluster analysis. Selected subgenomic regions with a high extent of HIV clustering and high tree certainty could also be considered as a second choice.

  15. 30 CFR 250.261 - What environmental impact analysis (EIA) information must accompany the DPP or DOCD?

    Science.gov (United States)

    2010-07-01

    30 CFR 250.261 (2010-07-01 edition): What environmental impact analysis (EIA) information must accompany the DPP or DOCD? Section 250.261, Mineral Resources; MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR; OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF; Plans and Information; Contents of...

  16. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    Science.gov (United States)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper considers Facebook-like social networks and models the process of information diffusion under information overload. The term view scope is introduced to model the user's information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify this result, we conduct simulations and provide the simulation results, which agree perfectly with the theoretical analysis.

  17. Properties of the numerical algorithms for problems of quantum information technologies: Benefits of deep analysis

    Science.gov (United States)

    Chernyavskiy, Andrey; Khamitov, Kamil; Teplov, Alexey; Voevodin, Vadim; Voevodin, Vladimir

    2016-10-01

    In recent years, quantum information technologies (QIT) have developed greatly, although the implementation of QIT faces serious difficulties, some of which are challenging computational tasks. This work is devoted to a deep and broad analysis of the parallel algorithmic properties of such tasks. As an example we take one- and two-qubit transformations of a many-qubit quantum state, which are the most critical kernels of many important QIT applications. The analysis of the algorithms uses the methodology of the AlgoWiki project (algowiki-project.org) and consists of two parts: theoretical and experimental. The theoretical part covers features such as sequential and parallel complexity, macro structure, and the visual information graph. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia) and includes the analysis of locality and memory access, scalability, and a set of more specific dynamic characteristics of the implementation. This approach allowed us to identify bottlenecks and generate ideas for efficiency improvement.

  18. Bridging information requirements and information needs assessment: do scenarios and vignettes provide a link?

    Directory of Open Access Journals (Sweden)

    Christine Urquhart

    2001-01-01

    Full Text Available The aim of the paper is to compare the philosophies of the vignette and critical incident techniques in information behaviour research with the methodologies used in object oriented analysis, such as use case scenarios and CRC (class, responsibility, collaboration) cards. The principles of object oriented analysis are outlined, noting the emphasis on obtaining the "storyline" or "scripts" for information requirements analysis through use cases and CRC cards. The critical incident technique and vignettes are used to obtain valid interpretations of users' information behaviour, using a storyline approach for data collection (and analysis) which is similar to that of object oriented analysis. Some examples illustrate how techniques developed in object oriented analysis could be used for data display in information behaviour studies. Concludes that the methods developed by software engineering could be usefully adapted for information behaviour research.

  19. Intelligent acoustic data fusion technique for information security analysis

    Science.gov (United States)

    Jiang, Ying; Tang, Yize; Lu, Wenda; Wang, Zhongfeng; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Tone is an essential component of word formation in all tonal languages, and it plays an important role in the transmission of information in speech communication. Therefore, the study of tone characteristics can be applied to the security analysis of acoustic signals, by means of language identification, etc. In speech processing, the fundamental frequency (F0) is often viewed by speech synthesis researchers as representing tone. However, regular F0 values may lead to low naturalness in synthesized speech. Moreover, F0 and tone are not linguistically equivalent; F0 is just one representation of a tone. Therefore, the Electroglottography (EGG) signal was collected for a deeper study of tone characteristics. In this paper, focusing on the Northern Kam language, which has nine tonal contours and five level tone types, we first collected EGG and speech signals from six natural male speakers of the Northern Kam language, and then obtained the clustering distributions of the tone curves. After summarizing the main characteristics of the tones of Northern Kam, we analyzed the relationship between EGG and speech signal parameters, laying the foundation for further security analysis of acoustic signals.
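    Since the abstract treats F0 as a (partial) representation of tone, a minimal autocorrelation-based F0 estimator illustrates how such a value is typically extracted from a voiced speech or EGG frame. The signal below is synthetic; this is a generic textbook method, not the paper's procedure.

```python
import numpy as np

def estimate_f0(signal, sr, fmin=50.0, fmax=500.0):
    """Estimate the fundamental frequency (F0) of a voiced frame by finding
    the autocorrelation peak within the plausible pitch-period range."""
    sig = signal - signal.mean()
    ac = np.correlate(sig, sig, mode="full")[len(sig) - 1:]  # lags 0..N-1
    lo, hi = int(sr / fmax), int(sr / fmin)                  # lag search range
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

sr = 16000
t = np.arange(int(0.1 * sr)) / sr
tone = np.sin(2 * np.pi * 220.0 * t)   # synthetic 220 Hz "level tone" frame
print(round(estimate_f0(tone, sr), 1))
```

    Running such an estimator over successive frames yields the F0 contour whose clustering the paper studies; the lag quantization (integer samples) is one reason raw F0 tracks look stepwise and are only an approximation of the underlying tone.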

  20. Risk-informed importance analysis of in-service testing components for Ulchin units 3 and 4

    International Nuclear Information System (INIS)

    Kang, D. I.; Kim, K. Y.; Ha, J. J.

    2001-01-01

    In this paper, we perform a risk-informed importance analysis of in-service testing (IST) components for Ulchin Units 3 and 4. The importance analysis using PSA is performed through Level 1 internal and external, shutdown/low power operation, and Level 2 internal PSA. A sensitivity analysis is also performed. For the components not modeled in the PSA logic, we develop and apply a new integrated importance analysis method. The importance analysis results for IST valves show that 167 (26.55%) of 629 IST valves are HSSCs and 462 (73.45%) are LSSCs. The importance analysis results for IST pumps show that 28 (70%) of 40 IST pumps are HSSCs and 12 (30%) are LSSCs

  1. Integrating Information and Communication Technology for Health Information System Strengthening: A Policy Analysis.

    Science.gov (United States)

    Marzuki, Nuraidah; Ismail, Saimy; Al-Sadat, Nabilla; Ehsan, Fauziah Z; Chan, Chee-Khoon; Ng, Chiu-Wan

    2015-11-01

    Despite the high costs involved and the lack of definitive evidence of sustained effectiveness, many low- and middle-income countries had begun to strengthen their health information system using information and communication technology in the past few decades. Following this international trend, the Malaysian Ministry of Health had been incorporating Telehealth (National Telehealth initiatives) into national health policies since the 1990s. Employing qualitative approaches, including key informant interviews and document review, this study examines the agenda-setting processes of the Telehealth policy using Kingdon's framework. The findings suggested that Telehealth policies emerged through actions of policy entrepreneurs within the Ministry of Health, who took advantage of several simultaneously occurring opportunities--official recognition of problems within the existing health information system, availability of information and communication technology to strengthen health information system and political interests surrounding the national Multimedia Super Corridor initiative being developed at the time. The last was achieved by the inclusion of Telehealth as a component of the Multimedia Super Corridor. © 2015 APJPH.

  2. A qualitative analysis of information sharing for children with medical complexity within and across health care organizations.

    Science.gov (United States)

    Quigley, Laura; Lacombe-Duncan, Ashley; Adams, Sherri; Hepburn, Charlotte Moore; Cohen, Eyal

    2014-06-30

    Children with medical complexity (CMC) are characterized by substantial family-identified service needs, chronic and severe conditions, functional limitations, and high health care use. Information exchange is critically important in high quality care of complex patients at high risk for poor care coordination. Written care plans for CMC are an excellent test case for how well information sharing is currently occurring. The purpose of this study was to identify the barriers to and facilitators of information sharing for CMC across providers, care settings, and families. A qualitative study design with data analysis informed by a grounded theory approach was utilized. Two independent coders conducted secondary analysis of interviews with parents of CMC and health care professionals involved in the care of CMC, collected from two studies of healthcare service delivery for this population. Additional interviews were conducted with privacy officers of associated organizations to supplement these data. Emerging themes related to barriers and facilitators to information sharing were identified by the two coders and the research team, and a theory of facilitators and barriers to information exchange evolved. Barriers to information sharing were related to one of three major themes: 1) the lack of an integrated, accessible, secure platform on which summative health care information is stored, 2) fragmentation of the current health system, and 3) the lack of consistent policies, standards, and organizational priorities across organizations for information sharing. Facilitators of information sharing were related to improving accessibility to a common document, expanding the use of technology, and improving upon a structured communication plan. Findings informed a model of how various barriers to information sharing interact to prevent optimal information sharing both within and across organizations and how the use of technology to improve communication and access to

  3. Sensitivity Analysis of Multiple Informant Models When Data Are Not Missing at Random

    Science.gov (United States)

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae M.; Scaramella, Laura V.; Leve, Leslie D.; Reiss, David

    2013-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups can be retained for analysis even if only 1 member of a group contributes…

  4. Sorting through search results: a content analysis of HPV vaccine information online.

    Science.gov (United States)

    Madden, Kelly; Nan, Xiaoli; Briones, Rowena; Waks, Leah

    2012-05-28

    Surveys have shown that many people now turn to the Internet for health information when making health-related decisions. This study systematically analyzed the HPV vaccine information returned by online search engines. HPV is the most common sexually transmitted disease and is the leading cause of cervical cancers. We conducted a content analysis of 89 top search results from Google, Yahoo, Bing, and Ask.com. The websites were analyzed with respect to source, tone, information related to specific content analyzed through the lens of the Health Belief Model, and in terms of two content themes (i.e., conspiracy theories and civil liberties). The relations among these aspects of the websites were also explored. Most websites were published by nonprofit or academic sources (34.8%) and governmental agencies (27.4%) and were neutral in tone (57.3%), neither promoting nor opposing the HPV vaccine. Overall, the websites presented suboptimal or inaccurate information related to the five behavioral predictors stipulated in the Health Belief Model. Questions related to civil liberties were present on some websites. Health professionals designing online communication with the intent of increasing HPV vaccine uptake should take care to include information about the risks of HPV, including susceptibility and severity. Additionally, websites should include information about the benefits of the vaccine (i.e., effective against HPV), low side effects as a barrier that can be overcome, and ways in which to receive the vaccine to raise individual self-efficacy. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    To demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents, computer codes are used. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are known only with limited accuracy. Therefore the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best estimate calculation, and it is necessary to derive the uncertainty associated with their estimations. For this reason, regulatory authorities demand that technical safety organizations, such as the French Institut de Radioprotection et de Surete Nucleaire (IRSN), provide results taking into account all the uncertainty sources, to assess that safety quantities remain below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort on four methodological key issues has to be made. The first is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing relevant information modelling therefore requires working with a mathematical formalism flexible enough to faithfully treat both types of uncertainty. The second issue deals with information propagation through a computer code. It requires running the codes several times and is usually achieved through coupling to statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the

  6. The Philosophy of Information as an Underlying and Unifying Theory of Information Science

    Science.gov (United States)

    Tomic, Taeda

    2010-01-01

    Introduction: Philosophical analyses of the theoretical principles underlying these sub-domains reveal the philosophy of information as an underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy, and analysis of their mutual connection. Analysis: Similarities between…

  7. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparisons via alignment are widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information-theoretic distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge, there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison involves a kind of normalization that reduces the different concepts to a unique formalism, making it possible to construct a phylogenetic tree for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically validated. In general, the results provided more evidence of the reliability of the alignment-free distance models. Also, we observed that one of the metrics appeared to be more robust than the other three. We believe that this result can be the object of further research and observation. Many of the experimental results, graphics and tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO
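    As a concrete illustration of the alignment-free, information-theoretic family (not necessarily one of the paper's four mutual-information metrics), the compression-based Normalized Compression Distance can be sketched in a few lines; it is robust to substring shuffling precisely because it never aligns positions.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a practical, alignment-free
    approximation of the (uncomputable) Kolmogorov information distance,
    using a real compressor in place of Kolmogorov complexity."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))          # shared structure shrinks cxy
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy genomic strings: s2 is a near copy of s1, s3 is unrelated.
s1 = b"ACGTACGTGGTTACGTACGTCCAA" * 20
s2 = b"ACGTACGTGGTTACGTACGACCAA" * 20   # one substitution per repeat unit
s3 = b"TTTTGGGGCCCCAAAATTGGCCAA" * 20
print(ncd(s1, s2), ncd(s1, s3))
```

    Related strings yield a smaller NCD than unrelated ones, so a matrix of pairwise NCD values can feed a standard tree-building algorithm, which is the general recipe behind the phylogenetic comparisons described above.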

  8. Information technology made simple

    CERN Document Server

    Carter, Roger

    1991-01-01

    Information Technology: Made Simple covers the full range of information technology topics, including more traditional subjects such as programming languages, data processing, and systems analysis. The book discusses information revolution, including topics about microchips, information processing operations, analog and digital systems, information processing system, and systems analysis. The text also describes computers, computer hardware, microprocessors, and microcomputers. The peripheral devices connected to the central processing unit; the main types of system software; application soft

  9. The mediated information in speech Edir Macedo: analysis of publishers of Universal Leaf

    Directory of Open Access Journals (Sweden)

    Ciro Athayde Barros Monteiro

    2017-04-01

    Full Text Available Introduction: The information mediated in the discourse of Edir Macedo retains a prominent position amid the transformations of contemporary society. This study proposes to analyze the strategies used in his discourse to mediate information through the editorials of the newspaper Folha Universal (FU), the journal of the Igreja Universal do Reino de Deus (IURD). Objective: To identify the discursive strategies used by Edir Macedo in order to understand how this information is mediated and how he can expand his daily influence, making himself one of the major mediators in Brazil. Methodology: Four editorials written for the newspaper between 2009 and 2011 were selected and examined using Discourse Analysis methodology. Results: The editorials analyzed show that the bishop primarily uses persuasive discourse to gain public support, almost always appealing to the emotive function and the imperative mood. Conclusions: We highlight the need for Information Science to understand this discourse, since this information is responsible for influencing a large number of people, allowing the IURD to expand its space in the press and in society every day.

  10. On Holo-Hilbert spectral analysis: a full informational spectral representation for nonlinear and non-stationary data

    OpenAIRE

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Hung; Peng, Chung Kang; Meijer, Johanna H.; Wang, Yung-Hung; Long, Steven R.; Wu, Zhauhua

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert–Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through c...

  11. Bim Orientation: Grades of Generation and Information for Different Type of Analysis and Management Process

    Science.gov (United States)

    Banfi, F.

    2017-08-01

    The Architecture, Engineering and Construction (AEC) industry is undergoing a major re-engineering of management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach requires new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM can be described as a new tool capable of collecting/analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research, in addition to reducing production times and physical and financial resources (economic impact), is to demonstrate how technology development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L'Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results, combining different data and modelling techniques into a single hub (cloud service) through the use of new Grades of Generation (GoG) and Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI to the type of analysis at hand, which requires a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.

  12. Information system analysis of an e-learning system used for dental restorations simulation.

    Science.gov (United States)

    Bogdan, Crenguţa M; Popovici, Dorin M

    2012-09-01

    The goal of using virtual and augmented reality technologies in the simulation of therapeutic interventions in fixed prosthodontics (the VirDenT project) is to increase the quality of the educational process in dental faculties by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning, virtual reality-based software system that will be used for developing the tooth-grinding skills needed in all-ceramic restorations. The complexity of the domain problem that the software system deals with made an analysis of the information system supported by VirDenT necessary. The analysis comprises the following activities: identification and classification of the system stakeholders, description of the business processes, formulation of the business rules, and modelling of business objects. During this stage, we constructed the context diagram, the business use case diagram, the activity diagrams, and the class diagram of the domain model. These models are useful for the further development of the software system that implements the VirDenT information system. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. Information Base of Financial Analysis of Educational Institutions of Higher Education

    Directory of Open Access Journals (Sweden)

    Alexander A. Galushkin

    2015-12-01

    Full Text Available In this article, the author analyzes issues related to the formation of the information base for analysis of the financial condition of educational institutions of higher education. The author notes that the principles of financial (accounting) reporting of non-governmental and governmental (budget-funded) institutions of higher professional education differ significantly. In conclusion, the author notes that when analyzing the financial condition of a group of higher professional education institutions, they can be classified into subgroups depending on the type (subtype) of funding and revenue they benefit from.

  14. Incorporating twitter-based human activity information in spatial analysis of crashes in urban areas.

    Science.gov (United States)

    Bao, Jie; Liu, Pan; Yu, Hao; Xu, Chengcheng

    2017-09-01

    The primary objective of this study was to investigate how to incorporate human activity information in spatial analysis of crashes in urban areas using Twitter check-in data. This study used the data collected from the City of Los Angeles in the United States to illustrate the procedure. The following five types of data were collected: crash data, human activity data, traditional traffic exposure variables, road network attributes and social-demographic data. A web crawler by Python was developed to collect the venue type information from the Twitter check-in data automatically. The human activities were classified into seven categories by the obtained venue types. The collected data were aggregated into 896 Traffic Analysis Zones (TAZ). Geographically weighted regression (GWR) models were developed to establish a relationship between the crash counts reported in a TAZ and various contributing factors. Comparative analyses were conducted to compare the performance of GWR models which considered traditional traffic exposure variables only, Twitter-based human activity variables only, and both traditional traffic exposure and Twitter-based human activity variables. The model specification results suggested that human activity variables significantly affected the crash counts in a TAZ. The results of comparative analyses suggested that the models which considered both traditional traffic exposure and human activity variables had the best goodness-of-fit in terms of the highest R² and lowest AICc values. The finding seems to confirm the benefits of incorporating human activity information in spatial analysis of crashes using Twitter check-in data. Copyright © 2017 Elsevier Ltd. All rights reserved.
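    The model-comparison step described above can be illustrated with a toy sketch. Ordinary least squares stands in for the study's GWR models (GWR additionally fits local coefficients per zone); the synthetic data, variable names, and effect sizes are invented for illustration, and AICc is computed from the least-squares residuals.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def aicc(y_true, y_pred, n_params):
    """Corrected Akaike Information Criterion for a least-squares fit."""
    n = len(y_true)
    rss = np.sum((y_true - y_pred) ** 2)
    aic = n * np.log(rss / n) + 2 * n_params
    return aic + 2 * n_params * (n_params + 1) / (n - n_params - 1)

rng = np.random.default_rng(42)
n_taz = 896  # number of Traffic Analysis Zones, as in the study

# Synthetic predictors: a traditional exposure variable and a Twitter-based
# activity variable, both assumed to contribute to crash counts.
exposure = rng.gamma(2.0, 1.0, n_taz)
activity = rng.gamma(2.0, 1.0, n_taz)
crashes = 3.0 * exposure + 2.0 * activity + rng.normal(0, 1.0, n_taz)

X_exp = exposure.reshape(-1, 1)
X_both = np.column_stack([exposure, activity])

m1 = LinearRegression().fit(X_exp, crashes)
m2 = LinearRegression().fit(X_both, crashes)

r2_exp, r2_both = m1.score(X_exp, crashes), m2.score(X_both, crashes)
aicc_exp = aicc(crashes, m1.predict(X_exp), n_params=2)    # slope + intercept
aicc_both = aicc(crashes, m2.predict(X_both), n_params=3)  # 2 slopes + intercept

print(f"exposure only:       R2={r2_exp:.3f}  AICc={aicc_exp:.1f}")
print(f"exposure + activity: R2={r2_both:.3f} AICc={aicc_both:.1f}")
```

As in the paper's comparison, the richer model wins on both criteria here because the activity variable carries real signal in the synthetic data.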

  15. Analog-to-digital conversion of spectrometric data in information-control systems of activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mamonov, E I

    1972-01-01

    Analog-to-digital conversion (ADC) in nuclear radiation spectrometer channels is one of the most important links in the information-control systems of activation analysis. For the development of ADCs for spectrometer channels, logico-structural methods of increasing capacity, procedures for boosting frequency modes, and ways of improving accuracy are promising. Procedures are suggested for increasing ADC capacity. Insufficient stability and noticeable non-linearity of the spectrometer channel can be corrected at the information-processing stage if their regularities are known. Capacity limitations make the development of ADCs featuring high stability, capacity and linearity quite urgent.

  16. The development of the risk-based cost-benefit analysis framework for risk-informed regulation

    International Nuclear Information System (INIS)

    Yang, Z. A.; Hwang, M. J.; Lee, K. S.

    2001-01-01

    The US NRC (Nuclear Regulatory Commission) introduced Risk-Informed Regulation (RIR) to allocate the resources of the NRC effectively and to reduce the unnecessary burden on utilities. This approach inherently includes the concept of cost-benefit analysis (CBA). The CBA method has been widely used to support decision making by analyzing the effectiveness of a proposed plan and/or activity in terms of cost and benefit. In general, however, the conventional CBA method does not use information such as risk, which is the essential element of RIR. We therefore developed a revised CBA framework that incorporates risk information in analyzing the cost and benefit of regulatory and/or operational activities in the nuclear industry

  17. [Gender analysis of primary care professionals' perceptions and attitudes to informal care].

    Science.gov (United States)

    del Mar García-Calvente, María; del Río Lozano, María; Castaño López, Esther; Mateo Rodríguez, Inmaculada; Maroto Navarro, Gracia; Hidalgo Ruzzante, Natalia

    2010-01-01

    To analyze primary care professionals' perceptions of and attitudes to informal care from a gender perspective. We performed a qualitative study using interviews and a discussion group. Eighteen primary care professionals were selected in the Health District of Granada (Spain) by means of intentional sampling. Content analysis was performed with the following categories: a) perceptions: concepts of dependency and informal care, gender differences and impact on health; b) attitudes: not in favor of change, in favor of change, and the right not to provide informal care. The health professionals emphasized the non-professional, unpaid and strongly emotional character of informal care. They assigned the family (especially women) the main responsibility for caregiving and used stereotypes to differentiate between care provided by men and by women. The professionals agreed that women bear a greater psychological burden associated with care, mainly because they more frequently provide caregiving on their own than men do. Three major attitudes to informal care emerged among the health professionals: those who did not question the current situation and idealized the family as the most appropriate framework for caregiving; those who proposed changes toward a more universal dependency system that would relieve families; and those who adopted an intermediate position, favoring education to achieve wellbeing in caregivers and prevent them from ceasing to provide care. We identified perceptions and attitudes that showed little sensitivity to gender equality, such as a conservative attitude that assigned the family the primary responsibility for informal care, and some sexist stereotypes that attributed a greater ability for caregiving to women. Specific training in gender equality is required among health professionals to reduce inequalities in informal care. Copyright © 2009 SESPAS. Published by Elsevier España. All rights reserved.

  18. Textual Analysis of Intangible Information

    NARCIS (Netherlands)

    A.J. Moniz (Andy)

    2016-01-01

    Traditionally, equity investors have relied upon the information reported in firms' financial accounts to make their investment decisions. Due to the conservative nature of accounting standards, firms cannot value their intangible assets such as corporate culture, brand value and

  19. Legal analysis of information displayed on dental material packages: An exploratory research

    Directory of Open Access Journals (Sweden)

    Bhumika Rathore

    2016-01-01

    Full Text Available Introduction: Evidence suggests that some dental materials pose occupational hazards, preprocedural errors, and patient allergies. With due consideration to the safety of patients and dental professionals, it is essential that the trade in these materials conforms with the law. Aim: To perform a legal analysis of the information displayed on the packaging of dental materials. Materials and Methods: The Bureau of Indian Standards sets guidelines for the packaging and marketing of dental products in India. An exploratory cross-sectional study was performed using various search engines and websites to access the existing laws and regulations pertaining to dental material packaging. Based on the data obtained, a unique packaging standardization checklist was developed. Dental laboratory and impression plasters, alginates, and endodontic instruments were surveyed across all available brands; this study considered 16 brands of plasters and alginates and 42 brands of endodontic instruments. Legal analysis was performed using a direct-observation checklist. Descriptive statistics were obtained using SPSS version 19. Results: The guidelines set by the Bureau of Indian Standards exist but are not updated and stand as oblivious guards for marketing standards. Overall compliance with the guidelines was 18.5% for brands of alginates, 4.1% for plaster of Paris, and 11.11% for endodontic instruments. The Wave One™ File reported the maximum adherence to the guidelines, at 66.7%. Conclusion: This study found a low rate of adherence to the guidelines, indicating that insufficient information is being disclosed to consumers.

  20. Improvement of the Accounting System at an Enterprise with the aim of Information Support of the Strategic Analysis

    Directory of Open Access Journals (Sweden)

    Feofanova Iryna V.

    2013-11-01

    Full Text Available The goal of the article is to identify directions for improving the accounting system at an enterprise so that strategic analysis procedures are supplied with trustworthy information. Historical methods (for studying the conditions of the appearance and development of strategic analysis) and logical methods (for identifying directions of improvement of accounting methods) were used during the study. The article establishes that modern conditions require a system of indicators based on both financial and non-financial information. In order to conduct strategic analysis it is necessary to expand the volume of information characterising such resources of an enterprise as research and development, personnel, and the quality of products (services). Among the indicators that provide such information, the article selects innovation activity costs and personnel training costs, the accounting of which is not sufficiently regulated. To meet the information requirements of analysts, it offers to improve accounting in the following directions: identification of the nature and volume of information required by enterprise managers; formation of a system of accounting by cost centres and responsibility centres; and identification and accounting of income or other results received by the enterprise due to personnel advanced training, research and development, and the introduction of innovations. The article offers a form for calculating the savings from cost reduction obtained due to governmental privileges granted to enterprises that introduce innovations and invest in personnel training.

  1. Co-word analysis for the non-scientific information example of Reuters Business Briefings

    Directory of Open Access Journals (Sweden)

    B Delecroix

    2006-01-01

    Full Text Available Co-word analysis is based on a sociological theory developed by the CSI and the SERPIA in the mid-eighties (Callon, Courtial, Turner, 1991). This method, originally dedicated to scientific fields, measures the association strength between terms in documents to reveal and visualise the evolution of scientific fields through the construction of clusters and a strategic diagram, and it has since been successfully applied to investigate the structure of many scientific areas. Nowadays it appears in many software systems that companies use to improve their business and define their strategy, but its relevance to this kind of application has not yet been proved. Using the example of economic and marketing information on DSL technologies from Reuters Business Briefing, this presentation gives an interpretation of co-word analysis for this kind of information. After an overview of the software we used (Sampler) and an outline of the experimental protocol, we investigate and explain each step of the co-word analysis process: terminological extraction, computation of clusters, and the strategic diagram. In particular, we explain the meaning of each parameter of the method: the choice of variables and similarity measures is discussed. Finally, we try to give a global interpretation of the method in an economic context. Further studies will extend this work in order to allow a generalisation of these results.
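    As a minimal illustration of the association-strength step of co-word analysis, the sketch below computes the equivalence index E_ij = c_ij² / (c_i · c_j) commonly used in the Callon-style co-word literature; the toy keyword sets are invented and are not from Reuters Business Briefing.

```python
from collections import Counter
from itertools import combinations

# Toy corpus: each document is represented by its set of indexed keywords
docs = [
    {"dsl", "broadband", "pricing"},
    {"dsl", "broadband", "operator"},
    {"dsl", "pricing", "regulation"},
    {"operator", "regulation"},
]

term_freq = Counter()   # c_i: number of documents containing term i
pair_freq = Counter()   # c_ij: number of documents containing both i and j
for doc in docs:
    term_freq.update(doc)
    pair_freq.update(combinations(sorted(doc), 2))

# Equivalence index: E_ij = c_ij^2 / (c_i * c_j),
# ranging from 0 (never co-occur) to 1 (always occur together)
strength = {
    pair: c ** 2 / (term_freq[pair[0]] * term_freq[pair[1]])
    for pair, c in pair_freq.items()
}

for pair, e in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(pair, round(e, 3))
```

Clustering the resulting term network and plotting cluster density against centrality would then yield the strategic diagram described in the abstract.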

  2. [Habitat factor analysis for Torreya grandis cv. Merrillii based on spatial information technology].

    Science.gov (United States)

    Wang, Xiao-ming; Wang, Ke; Ao, Wei-jiu; Deng, Jin-song; Han, Ning; Zhu, Xiao-yun

    2008-11-01

    Torreya grandis cv. Merrillii, a Tertiary relict plant, is a rare tree species of significant economic value that is expanding rapidly in China. Because the suitable growth conditions for this species are special and strict, analysis of its habitat factors can provide guidance for its planting, management, and sustainable development. In this paper, the habitat factors for T. grandis cv. Merrillii in its core region, i.e., seven villages of Zhuji City, Zhejiang Province, were analyzed with Principal Component Analysis (PCA) and a series of data, such as IKONOS imagery, a Digital Elevation Model (DEM), and field survey data, supported by spatial information technology. The results showed that T. grandis cv. Merrillii exhibits high selectivity for environmental factors such as elevation, slope, and aspect: 96.22% of the trees were located at elevations from 300 to 600 m, 97.52% were found on areas whose slope was less than 30°, and 74.43% were distributed on sunny and half-sunny slopes. The PCA results indicated that the main environmental factors affecting the habitat of T. grandis cv. Merrillii were moisture, heat, and soil nutrients, and that moisture may be one of the most important ecological factors owing to the unique biological and ecological characteristics of the species.
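    The PCA step can be sketched on synthetic habitat variables; the variable names and correlation structure below are assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sites = 200

# Hypothetical habitat variables for sampled plots; the correlated pair mimics
# a dominant "moisture" axis like the one reported in the study.
moisture = rng.normal(0, 1, n_sites)
heat = rng.normal(0, 1, n_sites)
data = np.column_stack([
    moisture + rng.normal(0, 0.2, n_sites),  # rainfall proxy
    moisture + rng.normal(0, 0.2, n_sites),  # soil humidity proxy
    heat + rng.normal(0, 0.2, n_sites),      # mean temperature proxy
    rng.normal(0, 1, n_sites),               # independent soil-nutrient proxy
])

# Standardize, then extract principal components of the habitat variables
pca = PCA().fit(StandardScaler().fit_transform(data))
ratios = pca.explained_variance_ratio_
print("explained variance ratios:", np.round(ratios, 3))
```

The first component absorbs the correlated moisture variables, which is how PCA surfaces a dominant environmental gradient from many raw measurements.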

  3. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitani–Yamada shell model. This is a well-known model able to reproduce fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theoretic analysis to its output tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, being able to discriminate the wrong conclusions to which the "traditional" transfer entropy would lead.
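    A minimal plug-in estimate of (unnormalized) transfer entropy on binarized series illustrates the directionality detection described above; the specific normalization proposed in the paper is not reproduced here, and the coupled series are synthetic.

```python
import numpy as np

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of TE(X -> Y) in bits, using 1-step histories
    of series discretized into equal-probability bins."""
    x = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    y = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    yf, yp, xp = y[1:], y[:-1], x[:-1]  # y future, y past, x past
    te = 0.0
    for a in np.unique(yf):
        for b in np.unique(yp):
            for c in np.unique(xp):
                p_abc = np.mean((yf == a) & (yp == b) & (xp == c))
                if p_abc == 0:
                    continue
                p_bc = np.mean((yp == b) & (xp == c))
                p_ab = np.mean((yf == a) & (yp == b))
                p_b = np.mean(yp == b)
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.1 * rng.normal(size=5000)  # y is driven by lagged x

print("TE(X->Y) =", round(transfer_entropy(x, y), 3))
print("TE(Y->X) =", round(transfer_entropy(y, x), 3))
```

Because y copies x with a one-step delay, the estimate in the driving direction comes out much larger than in the reverse direction, which is the asymmetry the paper's normalization aims to sharpen.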

  4. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  5. The Notion of 'Being Informative' & the Praxiological-Information Perspective on Language

    Directory of Open Access Journals (Sweden)

    Antonio Florio

    2009-11-01

    Full Text Available After a concise introduction on the analysis of truth and meaning in philosophy of language, two notions of information are grasped by the analysis of Situation Semantics and Situation Theory. The first is that of correlation, the second that of constraint; the latter is reducible to the former. Moreover, the phenomenon of the "alethic nature of information" is highlighted and the notion of "being informative" is pointed out. The difference between a meaning-oriented and an information-oriented perspective on language is marked. Messages are recognized as the atomic constituents of the informational perspective of language; the architecture of language is shown; and a praxiological-information perspective on the study of language is outlined.

  6. Development of SNS Stream Analysis Based on Forest Disaster Warning Information Service System

    Science.gov (United States)

    Oh, J.; KIM, D.; Kang, M.; Woo, C.; Kim, D.; Seo, J.; Lee, C.; Yoon, H.; Heon, S.

    2017-12-01

    Forest disasters, such as landslides and wildfires, cause huge economic losses and casualties, and the cost of recovery is increasing every year. While forest disaster mitigation technologies have so far focused on prevention and response, they are now required to evolve toward evacuation and early warning, and to be fused with ICT. In this study, we analyze SNS (Social Network Service) streams and implement a system that detects messages indicating that a forest disaster has occurred or is imminent, by searching keywords related to forest disasters in advance and in real time. More accurate forest disaster messages can be detected by repeatedly learning from the retrieved results using machine learning techniques. To do this, we designed and implemented a system based on Hadoop and Spark, distributed parallel processing platforms, to handle Twitter stream messages from an open SNS. In order to develop a technology for notifying forest disaster risk information, linkages with technologies such as the mobile CBS (Cell Broadcasting System), internet-based civil defense sirens, and SNS, together with the legal and institutional issues for applying these technologies, are examined. A protocol for the forest disaster warning information service system that can deliver the SNS analysis results was also developed. As a result, it was possible to grasp the real-time forest disaster situation by real-time big data analysis of SNS messages generated during forest disasters. In addition, we confirmed that alarms or warnings can be propagated rapidly according to the disaster situation by using the notification function of the forest disaster warning information service. However, the limitations on system deployment due to restrictions on the opening and sharing of SNS data currently in service, and on the disclosure of personal information, remain problems to be solved in the future. Keyword : SNS stream, Big data, Machine

  7. Methods and apparatuses for information analysis on shared and distributed computing systems

    Science.gov (United States)

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
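    The local-then-global term-statistics flow described above can be sketched as follows; the parallel processes are simulated sequentially here, and the document partitions are invented for illustration.

```python
from collections import Counter

def local_term_statistics(doc_set):
    """Each process computes term statistics for its distinct set of documents."""
    stats = Counter()
    for doc in doc_set:
        stats.update(doc.lower().split())
    return stats

# Distinct sets of documents distributed among (here, simulated) processes;
# in the patented approach each partition is handled substantially in parallel.
partitions = [
    ["the quick brown fox", "the lazy dog"],
    ["information analysis at scale", "the global vocabulary"],
]

local_sets = [local_term_statistics(p) for p in partitions]  # "map" step

# Reduction: each process contributes its local statistics to a global set
global_stats = Counter()
for local in local_sets:
    global_stats.update(local)

# A "major term set": the most frequent terms from the global vocabulary
major_terms = [t for t, _ in global_stats.most_common(3)]
print(global_stats["the"], major_terms)
```

The key property is that local counters can be merged associatively, so the global statistics are identical no matter how documents are partitioned across processes.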

  8. A data-informed PIF hierarchy for model-based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Groth, Katrina M.; Mosleh, Ali

    2012-01-01

    This paper addresses three problems associated with the use of Performance Shaping Factors in Human Reliability Analysis. (1) There are more than a dozen Human Reliability Analysis (HRA) methods that use Performance Influencing Factors (PIFs) or Performance Shaping Factors (PSFs) to model human performance, but there is not a standard set of PIFs used among the methods, nor is there a framework available to compare the PIFs used in various methods. (2) The PIFs currently in use are not defined specifically enough to ensure consistent interpretation of similar PIFs across methods. (3) There are few rules governing the creation, definition, and usage of PIF sets. This paper introduces a hierarchical set of PIFs that can be used for both qualitative and quantitative HRA. The proposed PIF set is arranged in a hierarchy that can be collapsed or expanded to meet multiple objectives. The PIF hierarchy has been developed with respect to a set of fundamental principles necessary for PIF sets, which are also introduced in this paper. This paper includes definitions of the PIFs to allow analysts to map the proposed PIFs onto current and future HRA methods. The standardized PIF hierarchy will allow analysts to combine different types of data and will therefore make the best use of the limited data in HRA. The collapsible hierarchy provides the structure necessary to combine multiple types of information without reducing the quality of the information.

  9. A media information analysis for implementing effective countermeasure against harmful rumor

    Science.gov (United States)

    Nagao, Mitsuyoshi; Suto, Kazuhiro; Ohuchi, Azuma

    2010-04-01

    When a large-scale earthquake occurs, the term "harmful rumor" is frequently heard. A harmful rumor denotes the economic damage caused when people regard actually safe foods or areas as dangerous and consequently stop buying or visiting. In the case of harmful rumors caused by earthquakes, the tourism industry in particular receives massive economic damage. Harmful rumors that inflict substantial economic damage have become a serious social issue that must be solved. In this paper, we propose a countermeasure method for harmful rumors on the basis of media trends, in order to enable speedy recovery from them. We investigate the amount and content of the information transmitted to the general public by the media when an earthquake occurs, treating the media coverage of three earthquakes as instances. Finally, we discuss an effective countermeasure method for dispelling harmful rumors based on these analysis results.
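    A coverage indicator of this kind might be sketched as a weighted sum of per-day media counts; the weights and field names below are hypothetical, since the abstract does not give the exact scheme.

```python
# Hypothetical weights -- the paper's actual weighting scheme is not published
WEIGHTS = {
    "headlines": 5.0,
    "illustrations": 3.0,
    "printed_lines": 0.1,
    "editorials": 4.0,
    "authorities_quoted": 2.0,
}

def media_indicator(counts):
    """Collapse one day's media coverage counts into a single scalar indicator."""
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)

# Two invented days of coverage after a hypothetical earthquake
day1 = {"headlines": 4, "printed_lines": 320, "illustrations": 2}
day2 = {"headlines": 1, "printed_lines": 40, "editorials": 1}

print(media_indicator(day1), media_indicator(day2))
```

Tracking this scalar day by day gives the kind of trend curve the abstract suggests using to time countermeasures against a harmful rumor.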

  10. Text Analysis: Critical Component of Planning for Text-Based Discussion Focused on Comprehension of Informational Texts

    Science.gov (United States)

    Kucan, Linda; Palincsar, Annemarie Sullivan

    2018-01-01

    This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…

  11. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    Full Text Available The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging, and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). By using the single Bragg-edge analysis mode of RITS, evaluations of crystal lattice plane spacing (d-spacing), relating to macro-strain, and of the d-spacing distribution's FWHM (full width at half maximum), relating to micro-strain, can be achieved. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast determination method of grain orientation for Bragg-dip neutron transmission spectra. In this paper, these imaging examples with the spectrum analysis methods, and their reliabilities evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.

  12. The Technical Report: An Analysis of Information Design and Packaging for an Inelastic Market.

    Science.gov (United States)

    Pinelli, Thomas E.; And Others

    As part of an evaluation of its scientific and technical information program, the National Aeronautics and Space Administration (NASA) conducted a review and analysis of structural, language, and presentation components of its technical report form. The investigation involved comparing and contrasting NASA's publications standards for technical…

  13. Perception of urban retailing environments : an empirical analysis of consumer information and usage fields

    NARCIS (Netherlands)

    Timmermans, H.J.P.; vd Heijden, R.E.C.M.; Westerveld, J.

    1982-01-01

    This article reports on an empirical analysis of consumer information and usage fields in the city of Eindhoven. The main purposes of this study are to investigate the distance, sectoral and directional biases of these fields, to analyse whether the degree of biases is related to personal

  14. Fusion Energy: Contextual Analysis of the Information Panels Developed by the Scientific Community versus Citizen Discourse

    International Nuclear Information System (INIS)

    Ferri Anglada, S.; Cornejo Alvarez, J. M.

    2014-01-01

    The report presents an exploratory study on the impact of scientific dissemination, particularly a comparative analysis of two discourses on fusion energy as an alternative energy future. It compares the institutional discourse, as portrayed by the scientific jargon used in the European travelling exhibition on nuclear fusion, the Fusion Expo, with the social discourse, as illustrated by a citizen deliberation on this very same exhibition. Through textual analysis, the scientific discourse deployed in the informative panels of the Fusion Expo is compared with the citizen discourse developed in the discussions within the citizen groups. The ConText software was applied for this analysis. The purpose is to analyze how visitors assimilate, capture and understand highly technical information. Results suggest that, in spite of points of convergence, the two discourses present certain differences, showing diverse levels of communication. The scientific discourse shows a great profusion of formalisms and technicalities of scientific jargon. The citizen discourse shows an abundance of words associated with daily life and with more practical aspects (economy, efficiency), together with institutional and evaluative references. In sum, the study shows that although there are a few common communicative spaces, there are still very few points of contact. These data indicate that although exhibitions can be a good tool to disseminate advances in fusion energy in informal learning contexts, public feedback is a powerful tool for improving the quality of social dialogue. (Author)

  15. Radiological accidents: analysis of the information disseminated by media and public acceptance of nuclear technology

    International Nuclear Information System (INIS)

    Delgado, Jose Ubiratan; Tauhata, Luiz; Garcia, Marcia Maria

    1995-01-01

    A methodology to quantitatively analyze the information disseminated by the media about a nuclear or radiological accident is presented. It allows us to combine information, weighted according to amount, importance and manner of presentation, into one indicator, named the Information Equivalent. This establishes a procedure for the analysis of released information that takes into account the number of headlines, illustrations, printed lines, editorials, authorities quoted, and so on. Interpretation becomes easier when the evolution and statistical trend of this indicator are observed. The application of the method to the dissemination of information about the accident which took place in Goiânia, Brazil, in 1987 was satisfactory and allowed us to propose a model. This will aid planning and the decision-making process, and will improve relationships between technical staff and the media during an emergency. (author). 5 refs., 4 figs., 3 tabs

  16. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for the investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and the associated p-value, giving the likelihood that grouped observations were drawn from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; in the hilus, however, significant directional differences were detected, demonstrating the utility of this framework for statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
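    The descriptive side of the Fisher framework (mean direction, resultant length R, and the common concentration approximation κ ≈ (n−1)/(n−R)) can be sketched as follows; the sample of direction vectors is synthetic, not DTI data.

```python
import numpy as np

def fisher_mean_direction(vectors):
    """Mean direction, resultant length R, and concentration parameter kappa
    for a sample of unit direction vectors (Fisher statistics)."""
    v = np.array(vectors, dtype=float)
    v /= np.linalg.norm(v, axis=1, keepdims=True)  # ensure unit length
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)                  # resultant length
    mean_dir = resultant / R
    n = len(v)
    kappa = (n - 1) / (n - R)                      # standard approximation
    return mean_dir, R, kappa

rng = np.random.default_rng(7)
true_dir = np.array([0.0, 0.0, 1.0])
# Tightly clustered synthetic sample around the z-axis
sample = true_dir + 0.05 * rng.normal(size=(50, 3))

mean_dir, R, kappa = fisher_mean_direction(sample)
print(np.round(mean_dir, 3), round(kappa, 1))
```

A tightly clustered sample recovers the true axis and yields a large κ; inferential steps such as Watson's F-test build on these same quantities computed per group.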

  17. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    Full Text Available Abstract This study analyzes the spatial distribution of kindergarten facilities in the study area using Geographic Information Systems (GIS), in order to test the efficiency of GIS technology for redistributing existing kindergartens, choosing the best future locations, and applying standard criteria for selecting suitable kindergarten sites. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. After that, the kindergarten spatial patterns were analyzed in terms of proximity to each other and to other land uses in the surrounding area, such as streets, highways and factories. The study also measured the concentration, dispersion, clustering and distribution direction of the kindergartens, and demonstrated the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage in the number of kindergartens, and the distribution pattern of those kindergartens is dominated by spatial dispersion.
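    One common way to quantify the concentration/dispersion patterns mentioned above is the Clark-Evans nearest-neighbour ratio. A small sketch with hypothetical facility coordinates, not the study's data:

```python
import math

def nearest_neighbour_ratio(points, area):
    """Clark-Evans index: <1 suggests clustering, ~1 randomness, >1 dispersion."""
    n = len(points)
    mean_nn = sum(
        min(math.hypot(x - qx, y - qy)           # distance to closest other point
            for j, (qx, qy) in enumerate(points) if j != i)
        for i, (x, y) in enumerate(points)
    ) / n
    expected = 0.5 / math.sqrt(n / area)         # mean NN distance under spatial randomness
    return mean_nn / expected

# A perfectly regular grid of 16 hypothetical facility locations in a 3x3 area
grid = [(float(x), float(y)) for x in range(4) for y in range(4)]
ratio = nearest_neighbour_ratio(grid, area=9.0)  # well above 1: dispersed
```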

  18. Automatic Classification of Users' Health Information Need Context: Logistic Regression Analysis of Mouse-Click and Eye-Tracker Data.

    Science.gov (United States)

    Pian, Wenjing; Khoo, Christopher Sg; Chi, Jianxing

    2017-12-21

    Users searching for health information on the Internet may be searching for their own health issue, searching for someone else's health issue, or browsing with no particular health issue in mind. Previous research has found that these three categories of users focus on different types of health information. However, most health information websites provide static content for all users. If the three types of user health information need contexts can be identified by the Web application, the search results or information offered to the user can be customized to increase its relevance or usefulness to the user. The aim of this study was to investigate the possibility of identifying the three user health information contexts (searching for self, searching for others, or browsing with no particular health issue in mind) using just hyperlink clicking behavior; using eye-tracking information; and using a combination of eye-tracking, demographic, and urgency information. Predictive models are developed using multinomial logistic regression. A total of 74 participants (39 females and 35 males) who were mainly staff and students of a university were asked to browse a health discussion forum, Healthboards.com. An eye tracker recorded their examining (eye fixation) and skimming (quick eye movement) behaviors on 2 types of screens: summary result screen displaying a list of post headers, and detailed post screen. The following three types of predictive models were developed using logistic regression analysis: model 1 used only the time spent in scanning the summary result screen and reading the detailed post screen, which can be determined from the user's mouse clicks; model 2 used the examining and skimming durations on each screen, recorded by an eye tracker; and model 3 added user demographic and urgency information to model 2. 
An analysis of variance (ANOVA) found that users' browsing durations differed significantly across the three health information contexts.
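A multinomial (softmax) logistic regression of the kind used for these predictive models can be sketched as follows; the screen-time features below are fabricated for illustration and do not come from the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fabricated features: [summary-screen scan time, detailed-post read time] in seconds
def group(center, n=60):
    return np.asarray(center, dtype=float) + rng.normal(0.0, 3.0, size=(n, 2))

X = np.vstack([group([30, 60]),   # context 0: searching for own health issue
               group([40, 30]),   # context 1: searching for someone else
               group([15, 10])])  # context 2: browsing with no issue in mind
y = np.repeat([0, 1, 2], 60)

# Multinomial logistic regression fitted by batch gradient descent on the softmax loss
Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize features
Xb = np.hstack([Xs, np.ones((len(Xs), 1))])    # add intercept column
W = np.zeros((3, Xb.shape[1]))
onehot = np.eye(3)[y]
for _ in range(500):
    z = Xb @ W.T
    z -= z.max(axis=1, keepdims=True)          # numerical stability
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)
    W -= 0.5 * (p - onehot).T @ Xb / len(Xb)   # gradient step

accuracy = float((p.argmax(axis=1) == y).mean())
```

With well-separated synthetic groups like these, the training accuracy should be close to 1; the study's real features are, of course, far noisier.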

  19. Science information to support Missouri River Scaphirhynchus albus (pallid sturgeon) effects analysis

    Science.gov (United States)

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-26

    The Missouri River Pallid Sturgeon Effects Analysis (EA) was commissioned by the U.S. Army Corps of Engineers to develop a foundation of understanding of how pallid sturgeon (Scaphirhynchus albus) population dynamics are linked to management actions in the Missouri River. The EA consists of several steps: (1) development of comprehensive, conceptual ecological models illustrating pallid sturgeon population dynamics and links to management actions and other drivers; (2) compilation and assessment of available scientific literature, databases, and models; (3) development of predictive, quantitative models to explore the system dynamics and population responses to management actions; and (4) analysis and assessment of effects of system operations and actions on species’ habitats and populations. This report addresses the second objective, compilation and assessment of relevant information.

  20. BIM ORIENTATION: GRADES OF GENERATION AND INFORMATION FOR DIFFERENT TYPE OF ANALYSIS AND MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    F. Banfi

    2017-08-01

    Full Text Available The Architecture, Engineering and Construction (AEC) industry is undergoing a great re-engineering of the management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach needs new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM can be described as a new tool capable of collecting and analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research, in addition to reducing production times and physical and financial resources (economic impact), is to demonstrate how technological development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L’Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results, combining different data and modelling techniques into a single hub (cloud service) through the use of a new Grade of Generation (GoG) and Grade of Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI to the type of analysis, which requires a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.

  1. How Did the Information Flow in the #AlphaGo Hashtag Network? A Social Network Analysis of the Large-Scale Information Network on Twitter.

    Science.gov (United States)

    Kim, Jinyoung

    2017-12-01

    As it becomes common for Internet users to use hashtags when posting and searching information on social media, it is important to understand who builds a hashtag network and how information is circulated within the network. This article focused on unlocking the potential of the #AlphaGo hashtag network by addressing the following questions. First, the current study examined whether traditional opinion leadership (i.e., the influentials hypothesis) or grassroots participation by the public (i.e., the interpersonal hypothesis) drove dissemination of information in the hashtag network. Second, several unique patterns of information distribution by key users were identified. Finally, the association between attributes of key users who exerted great influence on information distribution (i.e., the number of followers and follows) and their central status in the network was tested. To answer the proffered research questions, a social network analysis was conducted using a large-scale hashtag network data set from Twitter (n = 21,870). The results showed that the leading actors in the network were actively receiving information from their followers rather than serving as intermediaries between the original information sources and the public. Moreover, the leading actors played several roles (i.e., conversation starters, influencers, and active engagers) in the network. Furthermore, the numbers of their follows and followers were significantly associated with their central status in the hashtag network. Based on the results, the current research explained how the information was exchanged in the hashtag network by proposing the reciprocal model of information flow.
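    The basic degree-centrality bookkeeping behind such an analysis is straightforward. A toy sketch with hypothetical information-flow edges; the node names and the "net receiver" threshold are our illustrative assumptions, not the study's definitions:

```python
from collections import Counter

# Hypothetical directed edges: (source, target) means source passed
# information to target, standing in for the ties in the hashtag network.
edges = [("a", "hub"), ("b", "hub"), ("c", "hub"), ("d", "hub"),
         ("e", "hub"), ("hub", "a"), ("b", "c"), ("f", "b")]

in_deg = Counter(dst for _, dst in edges)    # information received
out_deg = Counter(src for src, _ in edges)   # information passed on

def role(node):
    """Label nodes that mainly receive (the abstract's 'leading actors') vs. relay."""
    if in_deg[node] > 2 * max(out_deg[node], 1):
        return "net receiver"
    return "intermediary"

ranked = sorted(in_deg, key=in_deg.get, reverse=True)  # most central actors first
```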

  2. Probabilistic aspects of analysis and information operation of the system in situations of conflict

    Directory of Open Access Journals (Sweden)

    S. V. Gluschenko

    2012-01-01

    Full Text Available The problems of studying the structure of parameter interactions and the formation of conflict in stochastic systems are considered from the standpoint of general systems theory. The mathematical aspects of the relations of conflict, promotion and indifference are analyzed. An information approach to the analysis of conflict is considered.

  3. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

    Full Text Available The dependency of today’s construction professionals on singular commercial applications for design possibilities creates the risk of being dictated to by the language-tools they use. This unwitting conformance to the constraints of a particular computer application’s style reduces one’s association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, ‘cloud computing’, required further analysis. The Delphi questionnaires that follow analysed the information-exchange opportunities of Building Information Modelling (BIM) as a possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage; (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint; (iii) examine the need for standards to address information exchange between the design team; and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  5. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    ... oceanographic information, human intelligence (HUMINT), open-source intelligence (OSINT), and information provided by other governmental departments [1][5]. Abbreviations: HUMINT, human intelligence; IFT, information foraging theory; LSA, latent semantic similarity; MVT, marginal value theorem; OFT, optimal foraging theory; OSINT, open-source intelligence.

  6. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    Kamenopoulou, V.; Dimitriou, P.; Proukakis, Ch.

    1995-01-01

    The object of this work is the study of the individual film badge annual dose information of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals in the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties and the number of workers that have exceeded any of the established dose limits. Results concerning the annual dose summaries demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)
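    The registry quantities reported here (mean annual dose, collective dose, and counts above a limit) reduce to simple aggregations. A sketch with fabricated dose records and an illustrative 20 mSv annual limit; none of these figures come from the Greek registry:

```python
# Fabricated annual film-badge doses in mSv, grouped by occupational sector
records = {
    "health":   [0.2, 0.5, 1.1, 0.0, 3.2, 0.4],
    "industry": [1.8, 0.9, 4.5, 22.0, 0.7],
}
DOSE_LIMIT_MSV = 20.0  # illustrative annual limit for classified workers

def summarize(doses):
    collective = sum(doses)  # collective dose in person-mSv
    return {
        "mean": collective / len(doses),
        "collective": collective,
        "over_limit": sum(d > DOSE_LIMIT_MSV for d in doses),
    }

summary = {sector: summarize(d) for sector, d in records.items()}
```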

  7. Communicating Risk Information in Direct-to-Consumer Prescription Drug Television Ads: A Content Analysis.

    Science.gov (United States)

    Sullivan, Helen W; Aikin, Kathryn J; Poehlman, Jon

    2017-11-10

    Direct-to-consumer (DTC) television ads for prescription drugs are required to disclose the product's major risks in the audio or audio and visual parts of the presentation (sometimes referred to as the "major statement"). The objective of this content analysis was to determine how the major statement of risks is presented in DTC television ads, including what risk information is presented, how easy or difficult it is to understand the risk information, and the audio and visual characteristics of the major statement. We identified 68 DTC television ads for branded prescription drugs, which included a unique major statement and that aired between July 2012 and August 2014. We used subjective and objective measures to code 50 ads randomly selected from the main sample. Major statements often presented numerous risks, usually in order of severity, with no quantitative information about the risks' severity or prevalence. The major statements required a high school reading level, and many included long and complex sentences. The major statements were often accompanied by competing non-risk information in the visual images, presented with moderately fast-paced music, and read at a faster pace than benefit information. Overall, we discovered several ways in which the communication of risk information could be improved.

  8. Process-aware information systems : design, enactment and analysis

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Wah, B.W.

    2009-01-01

    Process-aware information systems support operational business processes by combining advances in information technology with recent insights from management science. Workflow management systems are typical examples of such systems. However, many other types of information systems are also "process aware".

  9. Strategic information transmission: a Mathematica tool for analysis

    OpenAIRE

    Dickhaut, John; Kaplan, Todd R; Mukherji, Arijit

    1992-01-01

    Economists and other applied researchers use game theory to study industrial organization, financial markets, and the theory of the firm. In an earlier article in the Mathematica Journal, [Dickhaut and Kaplan 1991] present a procedure for solving two-person games of complete information. In many applications, however, "asymmetric information" is a central issue. By asymmetric information, we mean that one party has access to information that the other party lacks. The branch of game theory ...

  10. Newcomer information seeking

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth

    2017-01-01

    Introduction: Research on socialization and learning processes among organizational newcomers offers valuable insight into the role of information seeking in the workplace, and into why and how newcomers seek information when entering a new organization. Analysis: The aim of the paper is to outline and discuss the significance of information seeking in newcomer socialization and learning, and to analyse how different approaches influence our understanding of the role of information seeking in the workplace. Results: It is argued that a development in research on newcomer information seeking ... and corporeal information sources newcomers learn about the organizational practice, and the knowledge needed in order to develop as a competent practitioner and become a full member of the organization.

  11. Online information and support needs of women with advanced breast cancer: a qualitative analysis.

    Science.gov (United States)

    Kemp, Emma; Koczwara, Bogda; Butow, Phyllis; Turner, Jane; Girgis, Afaf; Schofield, Penelope; Hulbert-Williams, Nicholas; Levesque, Janelle; Spence, Danielle; Vatandoust, Sina; Kichenadasse, Ganessan; Roy, Amitesh; Sukumaran, Shawgi; Karapetis, Christos S; Richards, Caroline; Fitzgerald, Michael; Beatty, Lisa

    2018-04-24

    Women with advanced breast cancer (ABC) face significant adjustment challenges, yet few resources provide them with information and support, and attendance barriers can preclude access to face-to-face psychosocial support. This paper reports on two qualitative studies examining (i) whether information and support-seeking preferences of women with ABC could be addressed in an online intervention, and (ii) how an existing intervention for patients with early stage cancer could be adapted for women with ABC. Women with ABC participated in telephone interviews about their information and support-seeking preferences (N = 21) and evaluated an online intervention focused on early-stage cancer (N = 15). Interviews were transcribed and underwent thematic analysis using the framework method to identify salient themes. Participants most commonly sought medical, lifestyle-related, and practical information/support; however, when presented with an online intervention, participants most commonly gave positive feedback on content on coping with emotional distress. Difficulty finding information and barriers to using common sources of information/support including health professionals, family and friends, and peers were reported; however, some women also reported not wanting information or support. All participants evaluating the existing intervention gave positive feedback on various components, with results suggesting an online intervention could be an effective means of providing information/support to women with ABC, given improved specificity/relevance to ABC and increased tailoring to individual circumstances and preferences. Adaptation of an existing online intervention for early stage cancer appears to be a promising avenue to address the information and support needs of women with ABC.

  12. Information dimension analysis of bacterial essential and nonessential genes based on chaos game representation

    International Nuclear Information System (INIS)

    Zhou, Qian; Yu, Yong-ming

    2014-01-01

    Essential genes are indispensable for the survival of an organism. Investigating features associated with gene essentiality is fundamental to predicting and identifying essential genes with computational techniques. We use fractal theory to make a comparative analysis of essential and nonessential genes in bacteria. The information dimensions of the essential and nonessential genes available in the DEG database for 27 bacteria are calculated based on their gene chaos game representations (CGRs). It is found that a weak positive linear correlation exists between information dimension and gene length. Moreover, for genes of similar length, the average information dimension of essential genes is larger than that of nonessential genes. This indicates that essential genes show less regularity and higher complexity than nonessential genes. Our results show that for a bacterium with a similar number of essential and nonessential genes, the CGR information dimension is helpful for their classification. Therefore, the gene CGR information dimension is very probably a useful gene feature for a genetic algorithm predicting essential genes. (paper)
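    A minimal sketch of the approach, assuming the usual CGR corner assignment and a box-counting (entropy-slope) estimate of the information dimension; the sequence below is uniformly random, so its CGR should fill the unit square and give a dimension near 2, whereas real genes give lower values:

```python
import math
import random

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    """Chaos game representation: step halfway toward each base's corner."""
    x, y = 0.5, 0.5
    pts = []
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        pts.append((x, y))
    return pts

def box_entropy(pts, k):
    """Shannon entropy (bits) of point occupancy on a 2^k x 2^k grid."""
    counts = {}
    for x, y in pts:
        cell = (int(x * 2 ** k), int(y * 2 ** k))
        counts[cell] = counts.get(cell, 0) + 1
    n = len(pts)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
pts = cgr_points("".join(random.choice("ACGT") for _ in range(20000)))
# Information dimension ~ slope of entropy vs. grid refinement level
d_info = box_entropy(pts, 5) - box_entropy(pts, 4)
```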

  13. Applications of statistical physics and information theory to the analysis of DNA sequences

    Science.gov (United States)

    Grosse, Ivo

    2000-10-01

    DNA carries the genetic information of most living organisms, and the goal of genome projects is to uncover that genetic information. One basic task in the analysis of DNA sequences is the recognition of protein coding genes. Powerful computer programs for gene recognition have been developed, but most of them are based on statistical patterns that vary from species to species. In this thesis I address the question of whether there exist universal statistical patterns that are different in coding and noncoding DNA of all living species, regardless of their phylogenetic origin. In search of such species-independent patterns I study the mutual information function of genomic DNA sequences, and find that it shows persistent period-three oscillations. To understand the biological origin of the observed period-three oscillations, I compare the mutual information function of genomic DNA sequences to the mutual information function of stochastic model sequences. I find that the pseudo-exon model is able to reproduce the mutual information function of genomic DNA sequences. Moreover, I find that a generalization of the pseudo-exon model can connect the existence and the functional form of long-range correlations to the presence and the length distributions of coding and noncoding regions. Based on these theoretical studies I am able to find an information-theoretical quantity, the average mutual information (AMI), whose probability distributions are significantly different in coding and noncoding DNA, while they are almost identical in all studied species. These findings show that there exist universal statistical patterns that are different in coding and noncoding DNA of all studied species, and they suggest that the AMI may be used to identify genes in different living species, irrespective of their taxonomic origin.
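    The mutual information function at the center of this analysis is easy to compute directly. The sketch below uses a toy codon-biased sequence (an A-rich first codon position, our illustrative assumption rather than the thesis's pseudo-exon model) and reproduces the period-three peaks at lags 3 and 6:

```python
import math
import random

def mutual_information(seq, k):
    """I(k): mutual information (bits) between bases k positions apart."""
    n = len(seq) - k
    joint, left, right = {}, {}, {}
    for i in range(n):
        a, b = seq[i], seq[i + k]
        joint[a, b] = joint.get((a, b), 0) + 1
        left[a] = left.get(a, 0) + 1
        right[b] = right.get(b, 0) + 1
    # sum over pairs of p(a,b) * log2(p(a,b) / (p(a) p(b)))
    return sum(c / n * math.log2(c * n / (left[a] * right[b]))
               for (a, b), c in joint.items())

random.seed(0)
# Toy "coding" sequence: the first codon position is A-biased, which induces
# the period-three structure observed in real coding DNA.
seq = "".join(
    random.choices("ACGT", weights=[0.85, 0.05, 0.05, 0.05])[0] if i % 3 == 0
    else random.choice("ACGT")
    for i in range(30000)
)
mi = {k: mutual_information(seq, k) for k in (1, 2, 3, 4, 5, 6)}
```

On this sequence I(3) and I(6) stand well above the neighbouring lags, mirroring the persistent period-three oscillations described in the abstract.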

  14. Multi-Dimensional Analysis of Dynamic Human Information Interaction

    Science.gov (United States)

    Park, Minsoo

    2013-01-01

    Introduction: This study aims to understand the interactions of perception, effort, emotion, time and performance during the performance of multiple information tasks using Web information technologies. Method: Twenty volunteers from a university participated in this study. Questionnaires were used to obtain general background information and…

  15. Information strategies and energy conservation behavior: A meta-analysis of experimental studies from 1975 to 2012

    International Nuclear Information System (INIS)

    Delmas, Magali A.; Fischlein, Miriam; Asensio, Omar I.

    2013-01-01

    Strategies that provide information about the environmental impact of activities are increasingly seen as effective for encouraging conservation behavior. This article offers the most comprehensive meta-analysis of information-based energy conservation experiments conducted to date. Based on evidence from 156 published field trials and 525,479 study subjects from 1975 to 2012, we quantify the energy savings from information-based strategies. On average, individuals in the experiments reduced their electricity consumption by 7.4%. Our results also show that strategies providing individualized audits and consulting are comparatively more effective for conservation behavior than strategies that provide historical, peer-comparison energy feedback. Interestingly, we find that pecuniary feedback and incentives lead to a relative increase in energy usage rather than induce conservation. We also find that the conservation effect diminishes with the rigor of the study, indicating potential methodological issues in the current literature. - Highlights: • We conduct a meta-analysis of information-based energy conservation experiments. • We analyze 156 published trials and 525,479 study subjects from 1975 to 2012. • On average, individuals in the experiments reduced electricity consumption by 7.4%. • Individualized feedback via audits and consulting results in the largest reductions. • Pecuniary feedback and incentives lead to a relative increase in energy usage.

  16. 2DB: a Proteomics database for storage, analysis, presentation, and retrieval of information from mass spectrometric experiments.

    Science.gov (United States)

    Allmer, Jens; Kuhlgert, Sebastian; Hippler, Michael

    2008-07-07

    The amount of information stemming from proteomics experiments involving (multi dimensional) separation techniques, mass spectrometric analysis, and computational analysis is ever-increasing. Data from such an experimental workflow needs to be captured, related and analyzed. Biological experiments within this scope produce heterogenic data ranging from pictures of one or two-dimensional protein maps and spectra recorded by tandem mass spectrometry to text-based identifications made by algorithms which analyze these spectra. Additionally, peptide and corresponding protein information needs to be displayed. In order to handle the large amount of data from computational processing of mass spectrometric experiments, automatic import scripts are available and the necessity for manual input to the database has been minimized. Information is in a generic format which abstracts from specific software tools typically used in such an experimental workflow. The software is therefore capable of storing and cross analysing results from many algorithms. A novel feature and a focus of this database is to facilitate protein identification by using peptides identified from mass spectrometry and link this information directly to respective protein maps. Additionally, our application employs spectral counting for quantitative presentation of the data. All information can be linked to hot spots on images to place the results into an experimental context. A summary of identified proteins, containing all relevant information per hot spot, is automatically generated, usually upon either a change in the underlying protein models or due to newly imported identifications. The supporting information for this report can be accessed in multiple ways using the user interface provided by the application. We present a proteomics database which aims to greatly reduce evaluation time of results from mass spectrometric experiments and enhance result quality by allowing consistent data handling.

  17. 2DB: a Proteomics database for storage, analysis, presentation, and retrieval of information from mass spectrometric experiments

    Directory of Open Access Journals (Sweden)

    Hippler Michael

    2008-07-01

    Full Text Available Abstract Background The amount of information stemming from proteomics experiments involving (multi dimensional) separation techniques, mass spectrometric analysis, and computational analysis is ever-increasing. Data from such an experimental workflow needs to be captured, related and analyzed. Biological experiments within this scope produce heterogenic data ranging from pictures of one or two-dimensional protein maps and spectra recorded by tandem mass spectrometry to text-based identifications made by algorithms which analyze these spectra. Additionally, peptide and corresponding protein information needs to be displayed. Results In order to handle the large amount of data from computational processing of mass spectrometric experiments, automatic import scripts are available and the necessity for manual input to the database has been minimized. Information is in a generic format which abstracts from specific software tools typically used in such an experimental workflow. The software is therefore capable of storing and cross analysing results from many algorithms. A novel feature and a focus of this database is to facilitate protein identification by using peptides identified from mass spectrometry and link this information directly to respective protein maps. Additionally, our application employs spectral counting for quantitative presentation of the data. All information can be linked to hot spots on images to place the results into an experimental context. A summary of identified proteins, containing all relevant information per hot spot, is automatically generated, usually upon either a change in the underlying protein models or due to newly imported identifications. The supporting information for this report can be accessed in multiple ways using the user interface provided by the application.
Conclusion We present a proteomics database which aims to greatly reduce evaluation time of results from mass spectrometric experiments and enhance result quality by allowing consistent data handling.

  18. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  19. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  20. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
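The time-lag-analysis idea in the record above, community dissimilarity computed at increasing time lags and regressed on the square root of the lag, can be sketched as follows. Euclidean distance stands in for an ecological dissimilarity measure, the community matrix is simulated, and this does not reproduce the RDA-PCNM method, which needs a canonical ordination package:

```python
import numpy as np

def time_lag_analysis(samples):
    """samples: (T, n_species) abundance matrix ordered in time.
    Returns the lags and the mean Euclidean community dissimilarity at each lag."""
    T = len(samples)
    lags = np.arange(1, T)
    mean_diss = np.array([
        np.mean([np.linalg.norm(samples[t + lag] - samples[t])
                 for t in range(T - lag)])
        for lag in lags
    ])
    return lags, mean_diss

# Simulated community with a directional trend: dissimilarity grows with lag
rng = np.random.default_rng(0)
community = np.linspace(0.0, 5.0, 30)[:, None] + rng.normal(0.0, 0.1, (30, 4))
lags, diss = time_lag_analysis(community)

# TLA regresses dissimilarity on the square root of the lag;
# a significant positive slope indicates directional temporal change
slope, intercept = np.polyfit(np.sqrt(lags), diss, 1)
```

A positive slope here plays the role of the "deterministic component of change" discussed in the abstract; a flat regression would suggest fluctuation without direction.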

  1. 78 FR 30846 - Agency Information Collection Activities; Proposed Collection; Comment Request: Information...

    Science.gov (United States)

    2013-05-23

    ... information technology. Comments may be sent to Jon Garcia, Program Analysis and Monitoring Branch, Child... information collection should be directed to Jon Garcia, Program Analysis and Monitoring Branch, Child... CACFP to provide cash reimbursement and commodity assistance, on a per meal basis, for food service to...

  2. Trends in information behaviour research

    DEFF Research Database (Denmark)

    Greifeneder, Elke Susanne

    2014-01-01

    Introduction. This paper traces current trends in information behaviour research, both in terms of methods and topics. Results are put into relation to the previous trend analyses by Julien et al. (2011) and Vakkari (2008). Method. Trends derive from a publication analysis taken from information behaviour related publication venues between 2012 and 2014. Analysis. Publication titles, authors, years, publication venue, methods and topics were collected and quantitatively analysed. Results. Qualitative methods still dominate information behaviour research. Content analysis and participatory designs are gaining terrain. Information seeking is still the major topic of interest. Important newer topics are studies focusing on users’ context and on special needs. Conclusion. Information behaviour research has evolved a great deal over the last years and has taken on new methods and new topics. A discussion...

  3. KBS4FIA: Leveraging advanced knowledge-based systems for financial information analysis

    OpenAIRE

    García-Sánchez, Francisco; Paredes-Valverde, Mario Andrés; Valencia García, Rafael; Alcaraz Mármol, Gema; Almela Sánchez-Lafuente, Ángela

    2017-01-01

    Decision making takes place in an environment of uncertainty. Therefore, it is necessary to have information which is as accurate and complete as possible in order to minimize the risk that is inherent to the decision-making process. In the financial domain, the situation becomes even more critical due to the intrinsic complexity of the analytical tasks within this field. The main aim of the KBS4FIA project is to automate the processes associated with financial analysis by leveraging the tech...

  4. Information systems for mental health in six low and middle income countries: cross country situation analysis.

    Science.gov (United States)

    Upadhaya, Nawaraj; Jordans, Mark J D; Abdulmalik, Jibril; Ahuja, Shalini; Alem, Atalay; Hanlon, Charlotte; Kigozi, Fred; Kizza, Dorothy; Lund, Crick; Semrau, Maya; Shidhaye, Rahul; Thornicroft, Graham; Komproe, Ivan H; Gureje, Oye

    2016-01-01

    Research on information systems for mental health in low and middle income countries (LMICs) is scarce. As a result, there is a lack of reliable information on mental health service needs, treatment coverage and the quality of services provided. With the aim of informing the development and implementation of a mental health information sub-system that includes reliable and measurable indicators on mental health within the Health Management Information Systems (HMIS), a cross-country situation analysis of HMIS was conducted in six LMICs (Ethiopia, India, Nepal, Nigeria, South Africa and Uganda), participating in the 'Emerging mental health systems in low and middle income countries' (Emerald) research programme. A situation analysis tool was developed to obtain and chart information from documents in the public domain. In circumstances when information was inadequate, key government officials were contacted to verify the data collected. In this paper we compare the baseline policy context, human resources situation as well as the processes and mechanisms of collecting, verifying, reporting and disseminating mental health related HMIS data. The findings suggest that countries face substantial policy, human resource and health governance challenges for mental health HMIS, many of which are common across sites. In particular, the specific policies and plans for the governance and implementation of mental health data collection, reporting and dissemination are absent. Across sites there is inadequate infrastructure, few HMIS experts, and inadequate technical support and supervision to junior staff, particularly in the area of mental health. Nonetheless there are also strengths in existing HMIS where a few mental health morbidity, mortality, and system level indicators are collected and reported. Our study indicates the need for greater technical and resources input to strengthen routine HMIS and develop standardized HMIS indicators for mental health, focusing in

  5. Analysis of public consciousness structure and consideration of information supply against the nuclear power generation

    International Nuclear Information System (INIS)

    Shimooka, Hiroshi

    2001-01-01

    The Energy Engineering Research Institute carried out six questionnaire surveys on the structure of public consciousness regarding nuclear power generation between fiscal years 1986 and 1999, obtaining a large amount of information on public perceptions of nuclear power. Because the JCO criticality accident of September 1999, which caused the first radiation casualties in Japan, was expected to change public consciousness of nuclear power after the fiscal 1998 survey, the same questionnaire as in the previous fiscal year was administered to the same respondents after the accident, to analyse how evaluations, behaviour-determining factors and so forth regarding nuclear power generation were changed by the accident. In this paper, referring to the results of past questionnaires, the results and analysis of the questionnaires carried out before and after the JCO criticality accident are presented, and the information provision they suggest is considered. (G.K.)

  6. Comprehension and Analysis of Information in Text: I. Construction and Evaluation of Brief Texts.

    Science.gov (United States)

    Kozminsky, Ely; And Others

    This report describes a series of studies designed to construct and validate a set of text materials necessary to the pursuance of a long-term research project on information analysis and integration in semantically rich, naturalistic domains, primarily in the domain of the stock market. The methods and results of six separate experiments on…

  7. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    Directory of Open Access Journals (Sweden)

    David Nicholas

    2014-06-01

    Full Text Available The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers as producers and consumers of scholarly information resources in respect to how they determine authority and trustworthiness. The research questions for the study arose out of CIBER’s studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publisher’s usage logs, which was undertaken at the start of the project in order to build an evidence base which would help calibrate the main methodological tools used by the project: interviews and a questionnaire. The specific purpose of the log study was to identify and assess the digital usage behaviours that potentially raise trustworthiness and authority questions. Results from the self-report part of the study were additionally used to explain the logs. The main findings were that: (1) logs provide a good indicator of use and information seeking behaviour, albeit in respect to just a part of the information seeking journey; (2) the ‘lite’ form of information seeking behaviour observed in the logs is a sign of users trying to make up their minds, in the face of a tsunami of information, as to what is relevant and to be trusted; (3) Google and Google Scholar are the discovery platforms of choice for academic researchers, which partly points to the fact that they are influenced in what they use and read by ease of access; (4) usage is not a suitable proxy for quality. The paper also provides contextual data from CIBER’s previous studies.

  8. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.
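The entropy-difference measure described in the record above can be sketched in a few lines: estimate the conditional entropy of a sensor's own Markov machine and of an x-Markov machine conditioned on the network information state, and treat the difference as that sensor's information contribution. The symbol sequences below are invented, and this is a simplified reading of the method, not the authors' implementation:

```python
from collections import Counter
from math import log2

def conditional_entropy(pairs):
    """H(next | context) in bits, estimated from (context, next) symbol pairs."""
    joint = Counter(pairs)
    context = Counter(c for c, _ in pairs)
    n = len(pairs)
    return -sum((cnt / n) * log2(cnt / context[c])
                for (c, _), cnt in joint.items())

def information_contribution(sensor, network_state):
    """Markov entropy of the sensor's own symbol dynamics minus its x-Markov
    entropy conditioned on the network information state; a larger value means
    the network state explains more of the sensor's dynamics."""
    markov = list(zip(sensor[:-1], sensor[1:]))
    x_markov = list(zip(network_state[:-1], sensor[1:]))
    return conditional_entropy(markov) - conditional_entropy(x_markov)
```

As a sanity check, a sensor whose next symbol is fully determined by the network state but only weakly by its own past yields an x-Markov entropy of zero and hence a contribution close to its own Markov entropy.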

  9. Technology and Research Requirements for Combating Human Trafficking: Enhancing Communication, Analysis, Reporting, and Information Sharing

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Sean J.; West, Curtis L.; Olson, Jarrod

    2011-03-17

    DHS’ Science & Technology Directorate directed PNNL to conduct an exploratory study on the domain of human trafficking in the Pacific Northwest in order to examine and identify technology and research requirements for enhancing communication, analysis, reporting, and information sharing – activities that directly support efforts to track, identify, deter, and prosecute human trafficking – including identification of potential national threats from smuggling and trafficking networks. This effort was conducted under the Knowledge Management Technologies Portfolio as part of the Integrated Federal, State, and Local/Regional Information Sharing (RISC) and Collaboration Program.

  10. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Science.gov (United States)

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  11. A Study on the Information Analysis and Legal Affairs

    International Nuclear Information System (INIS)

    Chung, W. S.; Yang, M. H.; Yun, S. W.; Lee, D. S.; Kim, H. R.; Noh, B. C.

    2009-02-01

    This report presents the results and contents of a study on nuclear information analysis and legal affairs. Our team worked to secure KAERI's best legal interests in the process of enacting nuclear laws and codes, in international collaborative studies, and in management. Moreover, as an analysis of international trends, we studied the Japanese government's position on nuclear energy from the perspective of mitigating climate change and supplying sustainable energy; improvements in Japan's use of radiation show the increasing contribution of radiation technology to the public. The results of studies on the nuclear policy of Kazakhstan, forecasts of global trends in the nuclear field to 2030, and the new U.S. administration's policy on nuclear energy are also explained. Lastly, we evaluated sources of electricity generation that reduce carbon dioxide emissions from the standpoint of greenhouse gas emission statistics, and assessed the greenhouse gas reduction potential of Korea's green electricity sources.

  12. Cost-Effectiveness and Value of Information Analysis of Brief Interventions to Promote Physical Activity in Primary Care.

    Science.gov (United States)

    Gc, Vijay Singh; Suhrcke, Marc; Hardeman, Wendy; Sutton, Stephen; Wilson, Edward C F

    2018-01-01

    Brief interventions (BIs) delivered in primary care have shown potential to increase physical activity levels and may be cost-effective, at least in the short-term, when compared with usual care. Nevertheless, there is limited evidence on their longer term costs and health benefits. To estimate the cost-effectiveness of BIs to promote physical activity in primary care and to guide future research priorities using value of information analysis. A decision model was used to compare the cost-effectiveness of three classes of BIs that have been used, or could be used, to promote physical activity in primary care: (1) pedometer interventions, (2) advice/counseling on physical activity, and (3) action planning interventions. Published risk equations and data from the available literature or routine data sources were used to inform model parameters. Uncertainty was investigated with probabilistic sensitivity analysis, and value of information analysis was conducted to estimate the value of undertaking further research. In the base case, pedometer interventions yielded the highest expected net benefit at a willingness to pay of £20,000 per quality-adjusted life-year. There was, however, a great deal of decision uncertainty: the expected value of perfect information surrounding the decision problem for the National Health Service Health Check population was estimated at £1.85 billion. Our analysis suggests that the use of pedometer BIs is the most cost-effective strategy to promote physical activity in primary care, and that there is potential value in further research into the cost-effectiveness of brief (i.e., <30 minutes) and very brief (i.e., <5 minutes) pedometer interventions in this setting. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
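The value-of-information calculation behind figures such as the £1.85 billion EVPI estimate in the record above can be sketched as follows: the expected value of perfect information is the mean of the per-draw maximum net monetary benefit minus the maximum of the mean net benefits, scaled to the affected population. All numbers below (QALYs, costs, population size) are invented, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
wtp = 20_000  # willingness-to-pay threshold, GBP per QALY

# Hypothetical probabilistic sensitivity analysis (PSA) samples for three
# strategies (pedometer, advice/counselling, action planning) -- invented numbers
qalys = rng.normal([1.000, 0.995, 0.990], 0.02, size=(10_000, 3))
costs = rng.normal([120.0, 80.0, 60.0], 25.0, size=(10_000, 3))

nb = wtp * qalys - costs  # net monetary benefit per draw, per strategy

# EVPI per person: value of resolving all uncertainty before deciding
evpi_per_person = nb.max(axis=1).mean() - nb.mean(axis=0).max()

# Scale by the eligible population to get a decision-level EVPI
population_evpi = evpi_per_person * 5_000_000  # hypothetical population size
```

By construction the per-person EVPI is non-negative: knowing the true parameters before choosing can never make the decision worse in expectation.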

  13. Carbon Dioxide Information Analysis Center and World Data Center-A for atmospheric trace gases: FY 1993 activities

    International Nuclear Information System (INIS)

    Cushman, R.M.; Stoss, F.W.; Univ. of Tennessee, Knoxville, TN

    1994-01-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provide technical responses to specific inquiries related to carbon dioxide (CO2), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC (including World Data Center-A for Atmospheric Trace Gases) during the period October 1, 1992, to September 30, 1993. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of NDPs, CMPs, technical reports, newsletters, fact sheets, specialty publications, and reprints are provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also presented

  14. Carbon Dioxide Information Analysis Center and World Data Center-A for atmospheric trace gases: FY 1993 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center]; Stoss, F.W. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Univ. of Tennessee, Knoxville, TN (United States). Energy, Environment, and Resources Center]

    1994-01-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provide technical responses to specific inquiries related to carbon dioxide (CO2), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC (including World Data Center-A for Atmospheric Trace Gases) during the period October 1, 1992, to September 30, 1993. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of NDPs, CMPs, technical reports, newsletters, fact sheets, specialty publications, and reprints are provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also presented.

  15. Health-care district management information system plan: Review of operations analysis activities during calendar year 1975 and plan for continued research and analysis activities

    Science.gov (United States)

    Nielson, G. J.; Stevenson, W. G.

    1976-01-01

    Operations research activities developed to identify the information required to manage both the efficiency and effectiveness of the Veterans Administration (VA) health services as these services relate to individual patient care are reported. The clinical concerns and management functions that determine this information requirement are discussed conceptually. Investigations of existing VA data for useful management information are recorded, and a diagnostic index is provided. The age-specific characteristics of diseases and lengths of stay are explored, and recommendations for future analysis activities are articulated. The effect of the introduction of new technology to health care is also discussed.

  16. Analysis of Transaction Costs in Logistics and the Methodologies for Their Information Reflection for Automotive Companies

    OpenAIRE

    Ol’ga Evgen’evna Kovrizhnykh; Polina Aleksandrovna Nechaeva

    2016-01-01

    Transaction costs emerge in different types of logistics activities and influence the material flow and the accompanying financial and information flows; due to this fact, the information support and assessment are important tasks for the enterprise. The paper analyzes transaction costs in logistics for automotive manufacturers; according to the analysis, the level of these costs in any functional area of “logistics supply” ranges from 1.5 to 20%. These are only the official figures of transa...

  17. Origins of modern data analysis linked to the beginnings and early development of computer science and information engineering

    OpenAIRE

    Murtagh, F.

    2008-01-01

    The history of data analysis that is addressed here is underpinned by two themes, -- those of tabular data analysis, and the analysis of collected heterogeneous data. "Exploratory data analysis" is taken as the heuristic approach that begins with data and information and seeks underlying explanation for what is observed or measured. I also cover some of the evolving context of research and applications, including scholarly publishing, technology transfer and the economic relationship of the u...

  18. Geometric theory of information

    CERN Document Server

    2014-01-01

    This book brings together geometric tools and their applications for information analysis. It collects current and emerging uses of information geometry manifolds in the interdisciplinary fields of advanced signal, image and video processing, complex data modeling and analysis, information ranking and retrieval, coding, cognitive systems, optimal control, statistics on manifolds, machine learning, speech/sound recognition and natural language treatment, which are also substantially relevant for industry.

  19. Pharmaceutical information systems and possible implementations of informed consent - developing an heuristic

    DEFF Research Database (Denmark)

    Ploug, Thomas; Holm, Søren

    2012-01-01

    Background Denmark has implemented a comprehensive, nationwide pharmaceutical information system, and this system has been evaluated by the Danish Council of Ethics. The system can be seen as an exemplar of a comprehensive health information system for clinical use. Analysis The paper analyses 1) how informed consent can be implemented in the system and how different implementations create different impacts on autonomy and control of information, and 2) arguments directed towards justifying not seeking informed consent in this context. Results and Conclusion Based on the analysis a heuristic is provided which enables a ranking and estimation of the impact on autonomy and control of information of different options for consent to entry of data into the system and use of data from the system. The danger of routinisation of consent is identified. The Danish pharmaceutical information system raises...

  20. A Development of Nonstationary Regional Frequency Analysis Model with Large-scale Climate Information: Its Application to Korean Watershed

    Science.gov (United States)

    Kim, Jin-Young; Kwon, Hyun-Han; Kim, Hung-Soo

    2015-04-01

    The existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics in estimating areal rainfall. In this regard, this study aims to develop a hierarchical-Bayesian-model-based nonstationary regional frequency analysis in which spatial patterns of the design rainfall with geographical information (e.g. latitude, longitude and altitude) are explicitly incorporated. This study assumes that the parameters of the Gumbel (or GEV) distribution are a function of geographical characteristics within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the distributions by using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model showed similar results compared to L-moment based regional frequency analysis. In addition, the model showed an advantage in terms of quantifying uncertainty of the design rainfall and estimating the areal rainfall considering geographical information. Finally, a comprehensive discussion on design rainfall in the context of nonstationarity will be presented. KEYWORDS: Regional frequency analysis, Nonstationary, Spatial information, Bayesian. Acknowledgement This research was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
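The regression step described in the record above, distribution parameters expressed as a general linear function of latitude, longitude and altitude and then interpolated over a DEM, can be sketched with ordinary least squares standing in for the posterior mean of the Bayesian MCMC fit. Station coordinates and Gumbel location parameters below are invented:

```python
import numpy as np

# Hypothetical stations: [latitude, longitude, altitude (km)] and a Gumbel
# location parameter fitted at each station (all values invented)
geo = np.array([
    [37.5, 127.0, 0.05],
    [37.9, 127.4, 0.80],
    [38.1, 127.9, 0.20],
    [37.2, 128.3, 1.10],
    [37.7, 128.8, 0.45],
])
mu = 10.0 + 0.8 * geo[:, 0] - 0.3 * geo[:, 1] + 5.0 * geo[:, 2]

# General linear model mu = b0 + b1*lat + b2*lon + b3*alt; least squares
# stands in for the posterior mean of the Bayesian MCMC regression
A = np.column_stack([np.ones(len(geo)), geo])
beta, *_ = np.linalg.lstsq(A, mu, rcond=None)

# Spatial interpolation: predict the parameter at an ungauged DEM grid cell
cell = np.array([1.0, 37.6, 127.7, 0.40])
mu_hat = cell @ beta
```

The full method additionally yields posterior spread for `beta` and hence uncertainty bands on `mu_hat`, which is the advantage over a point estimate that the abstract emphasises.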

  1. Information technology for studying the adaptive abilities of the cardiovascular system under physical exercise by morphological, temporal and spectral analysis of oscillograms

    OpenAIRE

    V. P. Martsenyuk; D. V. Vakulenko; L. O. Vakulenko

    2015-01-01

    The authors propose an information technology for the morphological, temporal and spectral analysis of oscillograms (recorded at rest and after exercise), with analytical processing introduced for clinical interpretation of the results, evaluation and decision-making by doctors; this significantly increases the information content of the blood pressure measurement procedure. It can be used for the early detection of prenosological and premorbid states and of the functional reserve of the circulatory system, and helps more effectively to ...

  2. Information contained within the large scale gas injection test (Lasgit) dataset exposed using a bespoke data analysis tool-kit

    International Nuclear Information System (INIS)

    Bennett, D.P.; Thomas, H.R.; Cuss, R.J.; Harrington, J.F.; Vardon, P.J.

    2012-01-01

    Document available in extended abstract form only. The Large Scale Gas Injection Test (Lasgit) is a field scale experiment run by the British Geological Survey (BGS) and is located approximately 420 m underground at SKB's Aespoe Hard Rock Laboratory (HRL) in Sweden. It has been designed to study the impact on safety of gas build-up within a KBS-3V concept high level radioactive waste repository. Lasgit has been in almost continuous operation for approximately seven years and is still underway. An analysis of the dataset arising from the Lasgit experiment, with particular attention to the smaller scale features and phenomena recorded, has been undertaken in parallel to the macro scale analysis performed by the BGS. Lasgit is a highly instrumented, frequently sampled and long-lived experiment, leading to a substantial dataset containing in excess of 14.7 million data points. The data is anticipated to include a wealth of information, including information regarding overall processes as well as smaller scale or 'second order' features. Due to the size of the dataset, coupled with the detailed analysis of the dataset required and the reduction in subjectivity associated with measurement compared to observation, computational analysis is essential. Moreover, due to the length of operation and complexity of experimental activity, the Lasgit dataset is not typically suited to 'out of the box' time series analysis algorithms. In particular, the features that are not suited to standard algorithms include non-uniformities due to (deliberate) changes in sample rate at various points in the experimental history and missing data due to hardware malfunction/failure causing interruption of logging cycles. To address these features a computational tool-kit capable of performing an Exploratory Data Analysis (EDA) on long-term, large-scale datasets with non-uniformities has been developed.
Particular tool-kit abilities include: the parameterization of signal variation in the dataset

  3. Informed consent and placebo effects: a content analysis of information leaflets to identify what clinical trial participants are told about placebos.

    Directory of Open Access Journals (Sweden)

    Felicity L Bishop

    Full Text Available Placebo groups are used in randomised clinical trials (RCTs) to control for placebo effects, which can be large. Participants in trials can misunderstand written information, particularly regarding technical aspects of trial design such as randomisation; the adequacy of written information about placebos has not been explored. We aimed to identify what participants in major RCTs in the UK are told about placebos and their effects. We conducted a content analysis of 45 Participant Information Leaflets (PILs) using quantitative and qualitative methodologies. PILs were obtained from trials on a major registry of current UK clinical trials (the UKCRN database). Eligible leaflets were received from 44 non-commercial trials but only 1 commercial trial. The main limitation is the low response rate (13.5%), but characteristics of included trials were broadly representative of all non-commercial trials on the database. 84% of PILs were for trials with 50:50 randomisation ratios, yet in almost every comparison the target treatments were prioritized over the placebos. Placebos were referred to significantly less frequently than target treatments (7 vs. 27 mentions, p < 0.001) and were significantly less likely than target treatments to be described as triggering either beneficial effects (1 vs. 45, p < 0.001) or adverse effects (4 vs. 39, p < 0.001). 8 PILs (18%) explicitly stated that the placebo treatment was either undesirable or ineffective. PILs from recent high quality clinical trials emphasise the benefits and adverse effects of the target treatment, while largely ignoring the possible effects of the placebo. Thus they provide incomplete and at times inaccurate information about placebos. Trial participants should be more fully informed about the health changes that they might experience from a placebo. To do otherwise jeopardises informed consent and is inconsistent with not only the science of placebos but also the fundamental rationale underpinning placebo controlled

  4. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  5. A computational intelligent approach to multi-factor analysis of violent crime information system

    Science.gov (United States)

    Liu, Hongbo; Yang, Chao; Zhang, Meng; McLoone, Seán; Sun, Yeqing

    2017-02-01

    Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between bi-factors has also been extensively studied including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour and as such there is a need to have a greater level of insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
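
    The reduct idea at the heart of this record can be illustrated with the classical rough-set dependency degree: an attribute subset is a reduct when it determines the decision class as well as the full attribute set does. The sketch below is a textbook illustration with invented names, not the authors' hybrid fuzzy/particle-swarm algorithm:

    ```python
    def partition(rows, attrs):
        """Group row indices by their values on the attribute subset `attrs`."""
        groups = {}
        for i, row in enumerate(rows):
            groups.setdefault(tuple(row[a] for a in attrs), []).append(i)
        return list(groups.values())

    def dependency(rows, decisions, attrs):
        """Rough-set dependency degree: the fraction of objects whose decision
        class is fully determined by `attrs` (the size of the positive region)."""
        pos = sum(len(block) for block in partition(rows, attrs)
                  if len({decisions[i] for i in block}) == 1)
        return pos / len(rows)

    # Toy decision table: three condition attributes, one decision column.
    rows = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
    decisions = [0, 0, 1, 1]
    full = dependency(rows, decisions, [0, 1, 2])  # 1.0: the table is consistent
    # Attribute 0 alone preserves the classification, so {0} is a reduct here;
    # a multi-knowledge method would search for all such minimal subsets.
    print(dependency(rows, decisions, [0]) == full)
    ```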

  6. Health information search to deal with the exploding amount of health information produced.

    Science.gov (United States)

    Müller, H; Hanbury, A; Al Shorbaji, N

    2012-01-01

    This focus theme deals with the various aspects of health information search that are necessary to cope with the challenges of an increasing amount and complexity of medical information currently produced. This editorial reviews the main challenges of health information search and summarizes the five papers of this focus theme. The five papers of the focus theme cover a large part of the current challenges in health information search such as coding standards, information extraction from complex data, user requirements analysis, multimedia data analysis and the access to big data. Several future challenges are identified such as the combination of visual and textual data for information search and the difficulty to scale when analyzing big data.

  7. The Engineering Mechanism in Formation of Informational Basis of Analysis of Financial Sustainability of Enterprise

    Directory of Open Access Journals (Sweden)

    Chumak Oksana V.

    2017-12-01

    Full Text Available The article aims to substantiate the mechanism and instruments of financial and accountancy engineering for the purpose of forming information support for the analysis of financial sustainability in the enterprise management system. The essence of financial and accountancy engineering and the preconditions for its introduction are disclosed. The expediency of applying the financial engineering mechanism at the enterprise when analyzing financial sustainability has been substantiated. An analysis of methods for the formation and use of derivative balance reports was carried out. Models of the conception of mechanisms and instruments of financial and accountancy engineering in analyzing the financial sustainability of an enterprise have been suggested. A mega-accounts system in the working plan of the enterprise’s accounts has been recommended. Seven iterations have been provided, which constitute the basis of the accounting-analytical support of accountancy engineering. The information obtained on the basis of the financial and accountancy engineering mechanism allows a realistic assessment of the enterprise’s financial sustainability to be carried out.

  8. An Information Foraging Analysis of Note Taking and Note Sharing While Browsing Campaign Information

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Robertson, Scott

    2010-01-01

    In this paper, we present an experimental study of political information foraging in the context of e-voting. Participants were observed while searching and browsing the internet for campaign information in a mock-voting situation in three online note-taking conditions: No Notes, Private Notes...... with lack of scent, low value perception, and value depletion of information. Implications for the voter centered design of e-voting portals are discussed....

  9. Information Management System Development for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    National Research Council Canada - National Science Library

    Nelson, Douglas

    2001-01-01

    The purpose of this research is to evaluate and refine a safety information management system that will facilitate data collection, organization, query, analysis and reporting of maintenance errors...

  10. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    Science.gov (United States)

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. Univariate and the first stage multivariate models produced broadly similar point estimates of intervention effects but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on
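
    The multivariate NMA in this record is fitted with MCMC in WinBUGS. As a much simpler point of reference, the univariate fixed-effect inverse-variance pooling that underlies any meta-analysis can be sketched as follows (an illustrative building block only; the function name is invented and the multivariate model additionally handles between-outcome and between-study correlations):

    ```python
    def pooled_effect(effects, variances):
        """Fixed-effect inverse-variance pooling of study-level effect estimates.

        Each study is weighted by the inverse of its variance; the pooled
        variance is the reciprocal of the total weight.
        """
        weights = [1.0 / v for v in variances]
        est = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
        var = 1.0 / sum(weights)
        return est, var

    # Two equally precise studies with effects 0.0 and 2.0 pool to 1.0,
    # with the pooled variance halved relative to a single study.
    est, var = pooled_effect([0.0, 2.0], [1.0, 1.0])
    ```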

  11. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
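
    The exclusion rule described in this record can be sketched in a few lines of Python. This is an illustrative reconstruction from the abstract alone, not the patented implementation; the function name, tokenization, and parameter defaults are invented for the example:

    ```python
    from collections import Counter

    def build_stop_words(corpus, keywords, min_ratio=0.5, max_size=50):
        """Keep a term only if its keyword-adjacency frequency divided by its
        total frequency stays at or above `min_ratio`, then truncate the list."""
        keywords = set(keywords)
        term_freq = Counter()   # keyword frequency: total occurrences of each term
        adj_freq = Counter()    # keyword adjacency frequency: occurrences next to a keyword
        for doc in corpus:
            tokens = doc.lower().split()
            for i, tok in enumerate(tokens):
                term_freq[tok] += 1
                neighbours = tokens[max(i - 1, 0):i] + tokens[i + 1:i + 2]
                if any(n in keywords for n in neighbours):
                    adj_freq[tok] += 1
        # Exclude terms whose adjacency/frequency ratio falls below the
        # threshold, then truncate by descending frequency.
        candidates = [t for t in term_freq
                      if t not in keywords and adj_freq[t] / term_freq[t] >= min_ratio]
        candidates.sort(key=lambda t: term_freq[t], reverse=True)
        return candidates[:max_size]

    corpus = ["the quick model predicts the data",
              "the model fits the data well"]
    print(build_stop_words(corpus, keywords={"model", "data"}))
    ```

    Frequent function words such as "the" co-occur next to domain keywords far more often than content terms do, which is why the adjacency ratio isolates them.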

  12. Analysis and Design Information System Logistics Delivery Service in Pt Repex Wahana

    Directory of Open Access Journals (Sweden)

    Stephanie Surja

    2015-12-01

    Full Text Available Analysis and Design of the Logistics Delivery System in PT Repex Wahana aims to analyze the company’s needs in the existing business process of its logistics delivery service. This will then be used in the development of an integrated system that can address the problems in the running process of sending and tracking the whereabouts or status of the delivered goods, which are the core business processes in the enterprise. The result will then be used as a basis for the development of an integrated information system in pursuit of a corporate solution for business process automation covering delivery, inventory, and logistics delivery tracking, which is the core of the company’s business, and it will be documented using the Unified Modeling Language. The information system is meant to simplify the delivery and tracking process in the company and to minimize the data loss and errors that often occur because of manual and unorganized transaction data processing.

  13. Drug information, misinformation, and disinformation on social media: a content analysis study.

    Science.gov (United States)

    Al Khaja, Khalid A J; AlKhaja, Alwaleed K; Sequeira, Reginald P

    2018-05-24

    Dissemination of misleading drug information through social media can be detrimental to the health of the public. This study, carried out in Bahrain, evaluated the truthfulness of 22 social media claims about drugs (72.7%), dietary supplements (22.7%), and toxic bisphenol-A (4.5%) that circulated on the WhatsApp platform, as case studies. We categorized claims as objectively true, false, or potentially misleading. The content analysis revealed that "potentially misleading" claims were the most frequent messages (59.1%). They tend to exaggerate efficacy or safety without sufficient evidence to substantiate the claims. False claims (27.3%) were likely due to unfair competition or deception. Overall, 13.6% of the messages were objectively true claims that could withstand regulatory scrutiny. The majority of drug-related messages on social media were potentially misleading or false claims that lacked credible evidence to support them. In the public interest, regulatory authorities should monitor such information disseminated via social media platforms.

  14. Information Myopia

    Directory of Open Access Journals (Sweden)

    Nadi Helena Presser

    2016-04-01

    Full Text Available This article reflects on the ways information is appropriated in organizations. The notion of Information Myopia is characterized by a lack of knowledge about the informational capabilities available in organizations, revealing a narrow view of the information environment. This analysis focused on the process of renewing the software licence contracts of a large multinational group, in order to manage its organizational assets in information technology. The information collected, explained and justified allowed the elaboration of an action proposal, which enabled the creation of new organizational knowledge. In its theoretical dimension, the value of information was materialized by its use, in a collective process of organizational learning.

  15. Onco-STS: a web-based laboratory information management system for sample and analysis tracking in oncogenomic experiments.

    Science.gov (United States)

    Gavrielides, Mike; Furney, Simon J; Yates, Tim; Miller, Crispin J; Marais, Richard

    2014-01-01

    Whole genomes, whole exomes and transcriptomes of tumour samples are sequenced routinely to identify the drivers of cancer. The systematic sequencing and analysis of tumour samples, as well as other oncogenomic experiments, necessitates the tracking of relevant sample information throughout the investigative process. These meta-data of the sequencing and analysis procedures include information about the samples and projects as well as the sequencing centres, platforms, data locations, results locations, alignments, analysis specifications and further information relevant to the experiments. The current work presents a sample tracking system for oncogenomic studies (Onco-STS) to store these data and make them easily accessible to the researchers who work with the samples. The system is a web application, which includes a database and a front-end web page that allows the remote access, submission and updating of the sample data in the database. The web application development framework Grails was used for the development and implementation of the system. The resulting Onco-STS solution is efficient, secure and easy to use and is intended to replace the manual data handling of text records. Onco-STS allows simultaneous remote access to the system, making collaboration among researchers more effective. The system stores both information on the samples in oncogenomic studies and details of the analyses conducted on the resulting data. Onco-STS is based on open-source software, is easy to develop and can be modified according to a research group's needs. Hence it is suitable for laboratories that do not require a commercial system.

  16. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this goal the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, the JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) the search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) the selection of items of interest to specific verifications; and c) the mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, the JRC is developing and operating a "Nuclear Security Media Monitor" (NSMM), which is a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper recalls the trade data sources relevant for non-proliferation and then illustrates the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part presents the main aspects of the NSMM, also by illustrating some of the uses made at the JRC. (author)

  17. Quality Inspection and Analysis of Three-Dimensional Geographic Information Model Based on Oblique Photogrammetry

    Science.gov (United States)

    Dong, S.; Yan, Q.; Xu, Y.; Bai, J.

    2018-04-01

    In order to promote the construction of digital geo-spatial framework in China and accelerate the construction of informatization mapping system, three-dimensional geographic information model emerged. The three-dimensional geographic information model based on oblique photogrammetry technology has higher accuracy, shorter period and lower cost than traditional methods, and can more directly reflect the elevation, position and appearance of the features. At this stage, the technology of producing three-dimensional geographic information models based on oblique photogrammetry technology is rapidly developing. The market demand and model results have been emerged in a large amount, and the related quality inspection needs are also getting larger and larger. Through the study of relevant literature, it is found that there are a lot of researches on the basic principles and technical characteristics of this technology, and relatively few studies on quality inspection and analysis. On the basis of summarizing the basic principle and technical characteristics of oblique photogrammetry technology, this paper introduces the inspection contents and inspection methods of three-dimensional geographic information model based on oblique photogrammetry technology. Combined with the actual inspection work, this paper summarizes the quality problems of three-dimensional geographic information model based on oblique photogrammetry technology, analyzes the causes of the problems and puts forward the quality control measures. It provides technical guidance for the quality inspection of three-dimensional geographic information model data products based on oblique photogrammetry technology in China and provides technical support for the vigorous development of three-dimensional geographic information model based on oblique photogrammetry technology.

  18. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  19. QUALITY INSPECTION AND ANALYSIS OF THREE-DIMENSIONAL GEOGRAPHIC INFORMATION MODEL BASED ON OBLIQUE PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    S. Dong

    2018-04-01

    Full Text Available In order to promote the construction of digital geo-spatial framework in China and accelerate the construction of informatization mapping system, three-dimensional geographic information model emerged. The three-dimensional geographic information model based on oblique photogrammetry technology has higher accuracy, shorter period and lower cost than traditional methods, and can more directly reflect the elevation, position and appearance of the features. At this stage, the technology of producing three-dimensional geographic information models based on oblique photogrammetry technology is rapidly developing. The market demand and model results have been emerged in a large amount, and the related quality inspection needs are also getting larger and larger. Through the study of relevant literature, it is found that there are a lot of researches on the basic principles and technical characteristics of this technology, and relatively few studies on quality inspection and analysis. On the basis of summarizing the basic principle and technical characteristics of oblique photogrammetry technology, this paper introduces the inspection contents and inspection methods of three-dimensional geographic information model based on oblique photogrammetry technology. Combined with the actual inspection work, this paper summarizes the quality problems of three-dimensional geographic information model based on oblique photogrammetry technology, analyzes the causes of the problems and puts forward the quality control measures. It provides technical guidance for the quality inspection of three-dimensional geographic information model data products based on oblique photogrammetry technology in China and provides technical support for the vigorous development of three-dimensional geographic information model based on oblique photogrammetry technology.

  20. Library and Information Science Research Areas: A Content Analysis of Articles from the Top 10 Journals 2007-8

    Science.gov (United States)

    Aharony, Noa

    2012-01-01

    The current study seeks to describe and analyze journal research publications in the top 10 Library and Information Science journals from 2007-8. The paper presents a statistical descriptive analysis of authorship patterns (geographical distribution and affiliation) and keywords. Furthermore, it displays a thorough content analysis of keywords and…

  1. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  2. Tire aging: a human factors analysis of failure to warn and inform.

    Science.gov (United States)

    Wogalter, Michael S; Laughery, Kenneth R

    2012-01-01

    A scenario of an automotive accident caused by tire failure is given, followed by a human factors analysis of the information available to consumers on tire aging. Consumers have not been told that the age of a tire is a safety concern. It is not easy to decode the date of manufacture on tires. More publicity and prominent warnings are needed to communicate the dangers of older tires. Also, better ways are needed to present the date of manufacture so that consumers can more easily and accurately assess tire age.

  3. An information-theoretic machine learning approach to expression QTL analysis.

    Directory of Open Access Journals (Sweden)

    Tao Huang

    Full Text Available Expression Quantitative Trait Locus (eQTL) analysis is a powerful tool to study the biological mechanisms linking genotype with gene expression. Such analyses can identify genomic locations where genotypic variants influence the expression of genes, both in close proximity to the variant (cis-eQTL) and on other chromosomes (trans-eQTL). Many traditional eQTL methods are based on a linear regression model. In this study, we propose a novel method to identify eQTL associations with information theory and machine learning approaches. Mutual Information (MI) is used to describe the association between genetic marker and gene expression. MI can detect both linear and non-linear associations. What is more, it can capture the heterogeneity of the population. Advanced feature selection methods, Maximum Relevance Minimum Redundancy (mRMR) and Incremental Feature Selection (IFS), were applied to optimize the selection of the genes affected by the genetic marker. When we applied our method to a study of apoE-deficient mice, it was found that the cis-acting eQTLs are stronger than trans-acting eQTLs but there are more trans-acting eQTLs than cis-acting eQTLs. We compared our results (mRMR.eQTL) with R/qtl and MatrixEQTL (modelLINEAR and modelANOVA). In female mice, 67.9% of mRMR.eQTL results can be confirmed by at least two other methods while only 14.4% of R/qtl results can be confirmed by at least two other methods. In male mice, 74.1% of mRMR.eQTL results can be confirmed by at least two other methods while only 18.2% of R/qtl results can be confirmed by at least two other methods. Our method provides a new way to identify the association between genetic markers and gene expression. Our software is available in the supporting information.
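
    The association measure at the core of this record, mutual information between a discrete genetic marker and (binned) gene expression, can be computed directly from empirical frequencies. The sketch below is a generic illustration of the measure, not the authors' mRMR/IFS pipeline, and all names and data are invented:

    ```python
    import math
    from collections import Counter

    def mutual_information(x, y):
        """Empirical mutual information (in bits) between two discrete
        sequences, e.g. marker genotypes and binned expression levels."""
        n = len(x)
        px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
        return sum((c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
                   for (a, b), c in pxy.items())

    # Toy example: genotypes (0/1/2) for one marker vs. binned expression
    # for two genes; gene_a tracks the marker exactly, gene_b does not.
    marker = [0, 0, 1, 1, 2, 2, 0, 1]
    gene_a = [0, 0, 1, 1, 2, 2, 0, 1]
    gene_b = [0, 1, 0, 1, 0, 1, 0, 1]
    scores = {gene: mutual_information(marker, expr)
              for gene, expr in {"gene_a": gene_a, "gene_b": gene_b}.items()}
    ```

    Ranking genes by such scores is the starting point for an MI-based selection method; unlike a linear-regression test, the score is unchanged if the association is non-linear but deterministic.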

  4. Between theory and quantification: An integrated analysis of metabolic patterns of informal urban settlements

    International Nuclear Information System (INIS)

    Kovacic, Zora; Giampietro, Mario

    2017-01-01

    As informal urban settlements grow in size and population across the developing world, the issue of how to design and implement effective policies to provide for the needs and the aspirations of dwellers becomes ever more pressing. This paper addresses the challenge of how to characterise in quantitative terms the complex and fast-changing phenomenon of informal urban settlements without falling into oversimplification and a narrow focus on the material deficits of informal settlements. Energy policies are taken as an example to illustrate the shortcomings of oversimplification in producing policy-relevant information. We adopt a semantically open representation of informal settlements that can capture the diversity of adaptive strategies used by different settlement typologies, based on the societal metabolism approach. Results show that as settlements grow in size and complexity, they remain economically and politically marginalised and fail to integrate into the city. We argue that in the case of energy policy, the analysis must go beyond the definition of problems such as access to energy at the level of the individual, and focus on a multi-scale assessment including the household and community levels, studying the capacity of the household to increase its energy throughput through exosomatic devices and infrastructure. - Highlights: • The policy challenges of fast-changing informal urban settlements are assessed. • Metabolic patterns are used to assess and compare different typologies of slums. • Semantically open representations are used to capture the complexity of slums.

  5. An analysis method of the press information related with the nuclear activity in Argentina

    International Nuclear Information System (INIS)

    Alsina, Griselda.

    1989-01-01

    The articles published by the newspapers during the year 1987 were analyzed and classified according to their contents. An attribute was assigned to each article (positive, negative or neutral) in agreement with its connotation regarding the nuclear activity in Argentina. An ISIS-based system was developed using these data. The purpose of this analysis was to evaluate the influence of the press on public opinion. The relations between the different variables show the importance and approach (environmental, technical-scientific or political) given by the press to the different subjects. The results show a general lack of knowledge about nuclear activities and a concern among readers associated with environmental risks, which calls for the need to develop an information program for the community. The fundamentals of this program should improve the organization in order to make information reach external demands, to promote educational programs and to continuously provide information to the press. (S.M.) [es

  6. Information as signs: A semiotic analysis of the information concept, determining it's ontological and epistemological commitments

    DEFF Research Database (Denmark)

    Thellefsen, Martin Muderspach; Thellefsen, Torkild Leo; Sørensen, Bent

    2018-01-01

    Purpose The purpose of this paper is to formulate an analytical framework for the information concept based on the semiotic theory. Design/methodology/approach The paper is motivated by the apparent controversy that still surrounds the information concept. Information, being a key concept within...... LIS, suffers from being anchored in various incompatible theories. The paper suggests that information is signs, and it demonstrates how the concept of information can be understood within C.S. Peirce’s phenomenologically rooted semiotic. Hence, from there, certain ontological conditions as well...... epistemological consequences of the information concept can be deduced. Findings The paper argues that an understanding of information, as either objective or subjective/discursive, leads to either objective reductionism and signal processing, that fails to explain how information becomes meaningful at all...

  7. Parametric Analysis of Surveillance Quality and Level and Quality of Intent Information and Their Impact on Conflict Detection Performance

    Science.gov (United States)

    Guerreiro, Nelson M.; Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Lewis, Timothy A.

    2016-01-01

    A loss-of-separation (LOS) is said to occur when two aircraft are spatially too close to one another. A LOS is the fundamental unsafe event to be avoided in air traffic management and conflict detection (CD) is the function that attempts to predict these LOS events. In general, the effectiveness of conflict detection relates to the overall safety and performance of an air traffic management concept. An abstract, parametric analysis was conducted to investigate the impact of surveillance quality, level of intent information, and quality of intent information on conflict detection performance. The data collected in this analysis can be used to estimate the conflict detection performance under alternative future scenarios or alternative allocations of the conflict detection function, based on the quality of the surveillance and intent information under those conditions. Alternatively, these data could also be used to estimate the surveillance and intent information quality required to achieve some desired CD performance as part of the design of a new separation assurance system.
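
    The LOS prediction being parameterised above can be illustrated with the simplest possible conflict probe: project both aircraft along straight lines (i.e., no intent information) and flag a conflict if the predicted separation ever drops below a threshold within the lookahead horizon. This 2-D toy sketch is not NASA's CD implementation; all names are invented:

    ```python
    import math

    def detect_conflict(p1, v1, p2, v2, horizon, min_sep):
        """Straight-line conflict probe: return True if the horizontal distance
        between the two projected trajectories falls below `min_sep` at any
        time t in [0, horizon]."""
        # Work in the relative frame: dp(t) = dp + t * dv.
        dpx, dpy = p2[0] - p1[0], p2[1] - p1[1]
        dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
        dv2 = dvx * dvx + dvy * dvy
        # Time of closest approach, clipped to the lookahead window.
        t_star = 0.0 if dv2 == 0 else max(0.0, min(horizon,
                                                   -(dpx * dvx + dpy * dvy) / dv2))
        dist = math.hypot(dpx + t_star * dvx, dpy + t_star * dvy)
        return dist < min_sep

    # Head-on encounter: predicted separation reaches zero within the horizon.
    print(detect_conflict((0, 0), (1, 0), (10, 0), (-1, 0), horizon=10, min_sep=5))
    ```

    Surveillance quality enters such a probe through errors in `p` and `v`, while intent information replaces the straight-line projection with trajectory plans, which is exactly the trade space the parametric analysis explores.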

  8. Deprival value: information utility analysis

    OpenAIRE

    Pereira, Marco Antonio; Pinto, Alexandre Evaristo; Barbosa Neto, João Estevão; Martins, Eliseu

    2018-01-01

    ABSTRACT This article contributes to the perception that the users’ learning process plays a key role in order to apply an accounting concept and this involves a presentation that fits its informative potential, free of previous accounting fixations. Deprival value is a useful measure for managerial and corporate purposes, it may be applied to the current Conceptual Framework of the International Accounting Standards Board (IASB). This study analyzes its utility, taking into account cognitive...

  9. On the Feature Selection and Classification Based on Information Gain for Document Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Asriyanti Indah Pratiwi

    2018-01-01

    Full Text Available Sentiment analysis of movie reviews meets a need of today's lifestyle. Unfortunately, an enormous number of features makes sentiment analysis slow and less sensitive. Finding the optimal feature selection and classification is still a challenge. In order to handle an enormous number of features and provide better sentiment classification, an information-gain-based feature selection and classification are proposed. The proposed method reduces more than 90% of unnecessary features, while the proposed classification scheme achieves 96% accuracy in sentiment classification. From the experimental results, it can be concluded that the combination of the proposed feature selection and classification achieves the best performance so far.
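The abstract does not reproduce its formulas, but the information gain it relies on is the standard reduction in class entropy given a feature. A minimal sketch (toy documents and term names are hypothetical) of ranking binary term-presence features by information gain:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(class; feature) = H(class) - H(class | feature)."""
    n = len(labels)
    cond = 0.0
    for value in set(feature):
        subset = [lab for f, lab in zip(feature, labels) if f == value]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

# Toy document-term presence matrix: rows = reviews, columns = terms
docs = [
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 1],
    [0, 1, 0],
]
labels = ["pos", "pos", "neg", "neg"]
terms = ["great", "boring", "plot"]

gains = {t: information_gain([d[i] for d in docs], labels)
         for i, t in enumerate(terms)}
# "great" and "boring" perfectly separate the classes; "plot" carries no information
top = sorted(gains, key=gains.get, reverse=True)
print(top)
```

Feature selection then simply keeps the top-ranked terms (here, "plot" would be discarded as uninformative).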

  10. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo G.M.; Van Der Goot, Erik; Verile, Marco; Wolfart, Erik; Rutan Fowler, Marcy; Feldman, Yana; Hammond, William; Schweighardt, John; Ferguson, Mattew

    2013-01-01

    Acquisition and analysis of open source information play an increasingly important role in the IAEA’s move towards safeguards implementation based on all safeguards relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA’s Division of Information Management (SGIM) and the EC’s Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA’s workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, which is a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC’s Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA’s workflow for the production of SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which were focused i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (Nuclear Security Media Monitor, NSMM) and ii) on evaluating the NSMM/NewsDesk for meeting the IAEA’s needs. Finally, it presents the current use of NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  11. Systematic reviews in Library and Information Science: analysis and evaluation of the search process

    Directory of Open Access Journals (Sweden)

    José Antonio Salvador-Oliván

    2018-05-01

    Full Text Available Objective: An essential component of a systematic review is the development and execution of a literature search to identify all available and relevant published studies. The main objective of this study is to analyse and evaluate whether systematic reviews in Library and Information Science (LIS provide complete information on all the elements that make up the search process. Methods: A search covering publications from 2000 to February 2017 was launched in the WOS, Scopus, LISTA, Library Science Database and Medline databases and in a wiki, in order to find and identify systematic reviews. The search was designed to find those records whose titles included the words “systematic review” and/or “meta-analysis”. A list was created with the twelve items recommended by the main publication guides, to assess the degree of reporting on each of them. Results and conclusions: Most of the reviews in LIS are created by information professionals. From the 94 systematic reviews selected for analysis, it was found that only 4.3% provided complete reporting on the search method. The most frequently included item is the name of the database (95.6% and the least frequent is the name of the host (35.8%. It is necessary to improve and complete the information about the search processes in the full reports of LIS systematic reviews to improve reproducibility, updating and quality assessment.

  12. Teaching information seeking

    Directory of Open Access Journals (Sweden)

    Louise Limberg

    2006-01-01

    Full Text Available Introduction. The article argues for a closer association between information seeking research and the practices of teaching information seeking. Findings are presented from a research project on information seeking, didactics and learning (IDOL investigating librarians' and teachers' experiences of teaching information seeking. Method. Thirteen teachers and five librarians, teaching 12-19 year-old students in three schools, participated. Forty-five interviews were conducted over a period of three years. Analysis. The IDOL project adopted a phenomenographic approach with the purpose of describing patterns of variation in experiences. The findings were also analysed by way of relating them to four competing approaches to the mediation of information literacy. Results. A gap was identified between experiences of teaching content that focused on sources and order, and experiences of assessment criteria applied to students' work that focused on the importance of correct facts and the analysis of information. These findings indicate a highly restricted range of teaching contents when compared with the four theoretical approaches to the mediation of information literacy. Conclusion. Teaching information seeking might be enhanced by a wider repertoire of contents reflecting more varied theoretical understanding developed in information seeking research, particularly as regards the importance of content and context related to user perspectives.

  13. ANALYSIS OF TRAIN SHEET IN THE INFORMATION SYSTEM OF JSC «UKRZALIZNYTSIA»: PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    S. M. Ovcharenko

    2016-04-01

    Full Text Available Purpose. The train sheet analysis (TSA) system in the information system of JSC «Ukrzaliznytsia» covers passenger and suburban trains and has considerable potential, so it is necessary to establish the prospects for developing the system. Methodology. Attribution of train delays to departments and causes should be carried out at every station and span where such delays took place. This requires recording deviations of infrastructure conditions from normal, as well as other adverse factors. In the freight transportation sector, analysis of the train schedule alone is insufficient, since it does not account for deviations from delivery terms; therefore the delivery graphs must also be analysed. The basis for monitoring cargo delivery is the method of control time points (CTP of technological operations performed with cargo at railway stations. On the basis of the CTP, to assess the quality of the transport process one should calculate indicators of the cargo delivery schedule (the performance level of the cargo delivery schedule and the coefficient of ahead-of-schedule/delayed delivery. Findings. The article proposes to develop the TSA system by means of on-line input and display of train delay causes by transportation service employees, expansion of the statistical databases, processing of the input delay causes during train sheet analysis of freight trains, and quality assessment of delivery schedule fulfilment. Before new operator companies appear, it is also appropriate to amend instruction TSCHU-TSD-0002 concerning the list of departments to which delayed trains are attributed, by adding the department «The fault of operator companies» and the corresponding causes of delays. Originality. The scheme of automated TSA in the information system of JSC «Ukrzaliznytsia» was improved. The author proposes to determine the cargo delivery quality on a certain polygon using the
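The abstract names two delivery-schedule indicators without defining them. One plausible reading (the definitions below are an assumption, not taken from the article) is the share of consignments delivered no later than scheduled, and the mean signed deviation from the schedule:

```python
from datetime import date

# Hypothetical consignments: (scheduled delivery date, actual delivery date)
consignments = [
    (date(2016, 4, 1), date(2016, 4, 1)),
    (date(2016, 4, 2), date(2016, 4, 4)),
    (date(2016, 4, 3), date(2016, 4, 2)),
    (date(2016, 4, 5), date(2016, 4, 5)),
]

# Performance level of the delivery schedule:
# share of consignments delivered no later than scheduled
on_time = sum(1 for sched, actual in consignments if actual <= sched)
performance_level = on_time / len(consignments)

# Ahead-of-schedule/delay coefficient: mean signed deviation in days
# (negative = delivered ahead of schedule, positive = delayed)
mean_deviation = (sum((actual - sched).days for sched, actual in consignments)
                  / len(consignments))

print(performance_level, mean_deviation)
```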

  14. A Knowledge-Based Information Management System for Watershed Analysis in the Pacific Northwest U.S.

    Science.gov (United States)

    Keith Reynolds; Patrick Cunningham; Larry Bednar; Michael Saunders; Michael Foster; Richard Olson; Daniel Schmoldt; Donald Latham; Bruce Miller; John Steffenson

    1996-01-01

    The Pacific Northwest Research Station (USDA Forest Service) is developing a knowledge-based information management system to provide decision support for watershed analysis. The system includes: (1) a GIS interface that allows users to navigate graphically to specific provinces and watersheds and display a variety of themes (vegetation, streams, roads, topography, etc...

  15. Information Systems Security Audit

    OpenAIRE

    Gheorghe Popescu; Veronica Adriana Popescu; Cristina Raluca Popescu

    2007-01-01

    The article covers: defining an information system; benefits obtained by introducing new information technologies; IT management; defining prerequisites, analysis, design and implementation of IS; the information security management system; aspects regarding IS security policy; a conceptual model of a security system; and auditing information security systems and network infrastructure security.

  16. Analysis of the foreign systems for classification and marking the NPP unit components during their application in information systems

    International Nuclear Information System (INIS)

    Bylkin, B.K.; Shaposhnikov, V.A.; Sadovoj, Yu.K.; Tikhonovskij, V.L.; Chujko, D.V.

    2006-01-01

    The procedures accepted and permitted in an information system (IS) for organizing and ranking the information compiled to support decommissioning of an NPP power unit affect the convenience and efficiency of the IS as a whole. An IS supporting decommissioning efforts stores a large volume of information, which is why the choice of systems for ranking information is an urgent problem. The paper analyses foreign systems for classifying the components used during NPP operation. In addition, it considers the application of these systems for organizing and ranking the information on decommissioning of an NPP power unit presented to users [ru

  17. Congruence analysis of geodetic networks - hypothesis tests versus model selection by information criteria

    Science.gov (United States)

    Lehmann, Rüdiger; Lösler, Michael

    2017-12-01

    Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred. It is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the usage of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution for the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that they work but behave differently, sometimes even producing different results. Hypothesis tests are well-established in geodesy, but may suffer from an unfavourable choice of the decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
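The AIC-based selection described above can be sketched in miniature (this is not the paper's levelling network; the height values are hypothetical, and the Gaussian-error AIC is written up to an additive constant shared by all candidate models):

```python
import math

def aic(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors,
    up to an additive constant common to all candidate models."""
    return n * math.log(rss / n) + 2 * k

# Heights of one point observed in two epochs (hypothetical values, metres)
epoch1 = [10.002, 10.001, 10.003, 9.999, 10.000]
epoch2 = [10.011, 10.013, 10.012, 10.010, 10.014]
obs = epoch1 + epoch2
n = len(obs)

# Null model: one common mean (no deformation), k = 1 parameter
mean_all = sum(obs) / n
rss0 = sum((x - mean_all) ** 2 for x in obs)

# Alternative model: a separate mean per epoch (a height shift), k = 2
m1 = sum(epoch1) / len(epoch1)
m2 = sum(epoch2) / len(epoch2)
rss1 = (sum((x - m1) ** 2 for x in epoch1)
        + sum((x - m2) ** 2 for x in epoch2))

# The model with the smaller AIC is preferred; here the ~11 mm shift
# outweighs the 2k penalty for the extra parameter
print(aic(rss0, n, 1), aic(rss1, n, 2))
```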

  18. Hydrogen Technical Analysis -- Dissemination of Information

    Energy Technology Data Exchange (ETDEWEB)

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination is critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction, and will help to overcome the perceptual barriers, whether of safety, economics, or benefits. As originally proposed SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--Brochures that target the certain segment of the population and will be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--With industries identified to have a stake in hydrogen energy. The three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen (3) Educational Forums--The final audience is students--the future engineers, technicians, and energy consumers. SENTECH will expand on its previous educational work in this area. 
The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  19. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistics and trading companies is the task of improving operational business performance and developing logistics support of sales. The development of sales logistics presupposes the development and implementation of a set of works for the development of the existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculation of the required number of loading-unloading places, development of storage structures, development of pre-sales preparation zones, development of specifications of storage types, selection of loading-unloading equipment, detailed planning of the warehouse logistics system, creation of architectural-planning decisions, selection of information-processing equipment, etc. The currently used ERP and WMS systems do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products taking into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, seems to be a current task. In this paper we suggest a system of statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is designed to improve the efficiency of the operating business and the development of logistics support of sales.
The system is based on the methods of statistical data processing, the methods of assessment and prediction of logistics performance, the methods for the determination and calculation of the data required for registration, storage and processing of metal products, as well as the methods for planning the reconstruction and development

  20. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    Science.gov (United States)

    Yang, F.; Dong, Z. H.

    2017-10-01

    An integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, including almost a hundred kinds of software, and provides various services such as email, short messages, drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during the setup, configuration and operation stages, which seriously affects usage. Because these errors are varied and may occur in different operation phases and stages, TCP/IP communication protocol layers and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, providing strong theoretical and technical support for the running and communication of IIDS.

  1. Information technologies in biomedicine

    CERN Document Server

    Kawa, Jacek; Wieclawek, Wojciech

    2014-01-01

    New computerized approaches to various problems have become critically important in healthcare. Computer-assisted diagnosis has been extended towards support of clinical treatment. Mathematical information analysis and computer applications have become standard tools underpinning the current rapid progress of Computational Intelligence. Computerized support in the analysis of patient information, and the implementation of computer-aided diagnosis and treatment systems, increase the objectivity of the analysis and speed up the response to pathological changes. This book presents a variety of state-of-the-art information technologies and their applications in a networked environment, allowing robust computerized approaches to be introduced throughout the healthcare enterprise. Image analysis and its applications form the traditional part, dealing with the problems of data processing, recognition and classification. Bioinformatics has become a dynamically developing field of computer-assisted biologic...

  2. Analysis of respiratory and muscle activity by means of cross information function between ventilatory and myographic signals.

    Science.gov (United States)

    Alonso, J F; Mañanas, M A; Hoyer, D; Topor, Z L; Bruce, E N

    2004-01-01

    Analysis of respiratory muscle activity is a promising technique for the study of pulmonary diseases such as obstructive sleep apnea syndrome (OSAS). Evaluation of interactions between muscles is very useful in order to determine the muscular pattern during an exercise. These interactions have already been assessed by means of different linear techniques like cross-spectrum, magnitude squared coherence or cross-correlation. The aim of this work is to evaluate interactions between respiratory and myographic signals through nonlinear analysis by means of cross mutual information function (CMIF), and finding out what information can be extracted from it. Some parameters are defined and calculated from CMIF between ventilatory and myographic signals of three respiratory muscles. Finally, differences in certain parameters were obtained between OSAS patients and healthy subjects indicating different respiratory muscle couplings.
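The abstract does not specify how the CMIF is estimated. A minimal sketch (using a simple histogram-based mutual information estimate and synthetic signals, both assumptions on my part) of a cross mutual information function that recovers a known coupling delay between two signals:

```python
import math
import random
from collections import Counter

def mutual_information(x, y, bins=8):
    """Histogram-based estimate of mutual information (bits) between two signals."""
    def digitize(s):
        lo, hi = min(s), max(s)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in s]
    xd, yd = digitize(x), digitize(y)
    n = len(xd)
    px, py, pxy = Counter(xd), Counter(yd), Counter(zip(xd, yd))
    return sum((c / n) * math.log2((c / n) / (px[a] * py[b] / n / n))
               for (a, b), c in pxy.items())

def cross_mif(x, y, max_lag):
    """Cross mutual information function: MI between x and lagged copies of y."""
    return [mutual_information(x[:len(x) - lag], y[lag:])
            for lag in range(max_lag + 1)]

# Synthetic check: y is x delayed by 5 samples, so the CMIF should peak at lag 5
rng = random.Random(42)
x = [rng.random() for _ in range(2000)]
y = x[-5:] + x[:-5]          # circular delay of 5 samples
cmif = cross_mif(x, y, max_lag=10)
print(cmif.index(max(cmif)))
```

Parameters such as the lag at the CMIF peak, or its height relative to the baseline, are the kind of coupling descriptors the abstract derives for ventilatory and myographic signals.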

  3. Overall analysis of meteorological information in the daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byung Woo; Lee, Young Bok; Han, Moon Hee; Kim, Eun Han; Suh, Kyung Suk; Hwang, Won Tae; Hong, Suk Boong [Korea Atomic Energy Res. Inst., Taejon (Korea, Republic of)

    1992-12-01

    Troubleshooting of the tower structure, sensor installation, earthing, and cabling has been carried out, together with integrated field tests, establishment of the data acquisition system, and instrument calibration, since completion of the main tower construction this year. A procedure guide was also prepared for effective management, covering instrument operation, calibration and repair. Real measurements have been made during the two months since this October, after full integration of the equipment. Analysis of the measured data, which well represented the seasonal and regional characteristics of the site, showed the occurrence of nocturnal inversion layers, fogging, and frequently stable atmospheric conditions. Data are transmitted wirelessly to MIPS (Meteorological Information Processing System) after collection in the DAS (data acquisition system), where environmental assessment can be performed by the developed simulation programs for both normal operation and emergency cases. (Author).

  4. A qualitative analysis of Māori and Pacific smokers' views on informed choice and smoking

    Science.gov (United States)

    Gifford, Heather; Tautolo, El-Shadan; Erick, Stephanie; Hoek, Janet; Gray, Rebecca; Edwards, Richard

    2016-01-01

    Objectives Tobacco companies frame smoking as an informed choice, a strategy that holds individuals responsible for harms they incur. Few studies have tested this argument, and even fewer have examined how informed indigenous smokers or those from minority ethnicities are when they start smoking. We explored how young adult Māori and Pacific smokers interpreted ‘informed choice’ in relation to smoking. Participants Using recruitment via advertising, existing networks and word of mouth, we recruited and undertook qualitative in-depth interviews with 20 Māori and Pacific young adults aged 18–26 years who smoked. Analyses Data were analysed using an informed-choice framework developed by Chapman and Liberman. We used a thematic analysis approach to identify themes that extended this framework. Results Few participants considered themselves well informed and none met more than the framework's initial two criteria. Most reflected on their unthinking uptake and subsequent addiction, and identified environmental factors that had facilitated uptake. Nonetheless, despite this context, most agreed that they had made an informed choice to smoke. Conclusions The discrepancy between participants' reported knowledge and understanding of smoking's risks, and their assessment of smoking as an informed choice, reflects their view of smoking as a symbol of adulthood. Policies that make tobacco more difficult to use in social settings could help change social norms around smoking and the ease with which initiation and addiction currently occur. PMID:27188813

  5. Information asymmetries, information externalities, oil companies strategies and oil exploration information efficiency

    International Nuclear Information System (INIS)

    Nyouki, E.

    1998-07-01

    Both for economics in general and for energy economics, it is important to achieve oil exploration efficiency. To this end, a pragmatic approach is to use the concept of information efficiency, which means that the different tracts have to be drilled in decreasing order of estimated profitability, the estimates being made on the basis of the best (most reliable) available information. What does 'best available information' mean? It corresponds either to the information held by the most experienced oil companies (owing to information asymmetries in favour of these companies), or to information revealed by drilling, which allows the probabilities of success on neighbouring tracts with similar geological features to be revised (owing to information externalities). In view of these information asymmetries and externalities, we say that exploration is information efficient when, on the one hand, initial exploration choices are directed by the most experienced companies and, on the other hand, during the drilling phase, in the face of the information externality, companies adopt sequential drilling, i.e. excluding both over-investment and strategic under-investment. The topic addressed in this thesis is then whether oil companies, placed under normal competitive conditions, are likely to bring about a state of information efficiency in exploration; the analysis is conducted both theoretically and empirically. (author)

  6. EQUILIBRIUM ANALYSIS OF FINANCIAL COMPANY BASED ON INFORMATION PROVIDED BY THE BALANCE SHEET

    Directory of Open Access Journals (Sweden)

    Ștefăniță ȘUȘU

    2014-06-01

    Full Text Available This article highlights the importance of indicators (such as net working capital, working capital requirements and net cash by means of which financial balance is assessed, capitalizing on information released by the balance sheet of a tourism-profile entity. Theoretical concepts presented in a logical sequence are combined with a practical example based on the company Turism Covasna. The results of the analysis are interpreted while trying to formulate solutions for the economic and financial viability of the entity.

  7. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Science.gov (United States)

    Malina, Edward; Yoshida, Yukio; Matsunaga, Tsuneo; Muller, Jan-Peter

    2018-02-01

    Atmospheric methane comprises multiple isotopic molecules, with the most abundant being 12CH4 and 13CH4, making up 98 and 1.1 % of atmospheric methane respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic methane, e.g. marshland, or abiogenic methane, e.g. fracking) via a ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between -10 and -80 ‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. Initially, we suggest that a δ13C difference of 10 ‰ is sufficient in order to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.
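The δ13C value mentioned above is conventionally defined against the VPDB standard as δ13C = (R_sample/R_VPDB − 1) × 1000 ‰, where R is the 13C/12C ratio. A small sketch (the 12CH4 column value is hypothetical; 13CH4 uses the abstract's 19.14 ppbv figure):

```python
# VPDB reference 13C/12C ratio; a standard literature value
R_VPDB = 0.0112372

def delta13c(c13, c12):
    """delta-13C in per-mille relative to the VPDB standard."""
    return (c13 / c12 / R_VPDB - 1.0) * 1000.0

# Hypothetical column-averaged mixing ratios (ppbv); the 13CH4 value matches
# the abstract's assumed total-column average of 19.14 ppbv
ch4_12 = 1800.0   # 12CH4 (assumed)
ch4_13 = 19.14    # 13CH4

# A value around -54 per-mille would fall in the biogenic range
print(round(delta13c(ch4_13, ch4_12), 1))
```

Propagating the abstract's 0.2 ppbv precision target on 13CH4 through this formula is what links the retrieval precision to the 10 ‰ source-discrimination threshold.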

  8. The role of proxy information in missing data analysis.

    Science.gov (United States)

    Huang, Rong; Liang, Yuanyuan; Carrière, K C

    2005-10-01

    This article investigates the role of proxy data in dealing with the common problem of missing data in clinical trials using repeated measures designs. In an effort to avoid the missing data situation, some proxy information can be gathered. The question is how to treat proxy information, that is, is it always better to utilize proxy information when there are missing data? A model for repeated measures data with missing values is considered and a strategy for utilizing proxy information is developed. Then, simulations are used to compare the power of a test using proxy to simply utilizing all available data. It is concluded that using proxy information can be a useful alternative when such information is available. The implications for various clinical designs are also considered and a data collection strategy for efficiently estimating parameters is suggested.
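The article's simulation design is not reproduced in the abstract; the following is a minimal illustrative sketch (all parameter values and the missing-at-random mechanism are assumptions) of the underlying idea that substituting a moderately noisy proxy for missing outcomes can beat discarding them:

```python
import random

def mse_of_mean(use_proxy, n=200, miss=0.3, sigma=1.0, proxy_sigma=0.5,
                reps=2000, seed=0):
    """Monte Carlo mean-squared error of the sample mean of an outcome with
    true mean 0, when a fraction of outcomes is missing at random and may be
    replaced by a noisier proxy measurement."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        vals = []
        for _ in range(n):
            true = rng.gauss(0.0, sigma)
            if rng.random() < miss:                    # outcome is missing
                if use_proxy:
                    # proxy = true value plus extra measurement noise
                    vals.append(true + rng.gauss(0.0, proxy_sigma))
            else:
                vals.append(true)
        est = sum(vals) / len(vals)
        total += est * est                             # squared error vs true mean 0
    return total / reps

mse_cc = mse_of_mean(use_proxy=False)   # complete-case analysis
mse_px = mse_of_mean(use_proxy=True)    # proxy substituted for missing values
print(mse_cc, mse_px)
```

With a sufficiently noisy proxy the comparison can reverse, which mirrors the article's conclusion that proxy data are a useful alternative only when such information is of adequate quality.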

  9. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens knowledge about their territory.

  10. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM) such as regression splines or decision tree induction can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA), and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore

  11. INFORMATION TECHNOLOGY FOR STUDYING THE ADAPTATIONAL ABILITIES OF THE CARDIOVASCULAR SYSTEM UNDER PHYSICAL EXERTION BY MORPHOLOGICAL, TEMPORAL AND SPECTRAL ANALYSIS OF OSCILLOGRAMS

    Directory of Open Access Journals (Sweden)

    V. P. Martsenyuk

    2015-12-01

    Full Text Available The author proposes an information technology for morphological, temporal and spectral analysis of oscillograms (recorded at rest and after exercise, with analytical processing of the results for clinical interpretation, evaluation and decision-making by doctors, which significantly increases the information content of the blood pressure measurement procedure. It can be used for early detection of prenosological and premorbid states and of the functional reserve of the circulatory system, and helps to plan the preventive, diagnostic and therapeutic process more effectively.

  12. Design and Analysis: Payroll of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Suryanto Suryanto

    2011-05-01

    Full Text Available Purposes of the research are to analyze, design, and recommend a payroll accounting information system that supports internal control and solves the company's problems. Research methods used are book studies, field studies, and design studies; the field studies were done by survey and interview. The expected results are to review the payroll accounting information system within the company's ongoing business process and to resolve the weaknesses in the payroll system, so that the company can use an integrated information system for payroll calculation. Conclusions that can be drawn from the research are that there is a risk of manipulation of attendance data, that form documentation still uses a manual system with only simple data backup, and that there is also a manipulation risk in the allowance cash system and in all reports included in the payroll. Index Terms - Accounting Information System, Payroll

  13. Corporate political activity of the dairy industry in France: an analysis of publicly available information.

    Science.gov (United States)

    Mialon, Melissa; Mialon, Jonathan

    2017-09-01

    In the present study, we used a structured approach based on publicly available information to identify the corporate political activity (CPA) strategies of three major actors in the dairy industry in France. We collected publicly available information from the industry, government and other sources over a 6-month period, from March to August 2015. Data collection and analysis were informed by an existing framework for classifying the CPA of the food industry. Setting/Subjects Our study included three major actors in the dairy industry in France: Danone, Lactalis and the Centre National Interprofessionnel de l'Economie Laitière (CNIEL), a trade association. During the period of data collection, the dairy industry employed CPA practices on numerous occasions by using three strategies: the 'information and messaging', the 'constituency building' and the 'policy substitution' strategies. The most common practice was the shaping of evidence in ways that suited the industry. The industry also sought involvement in the community, establishing relationships with public health professionals, academics and the government. Our study shows that the dairy industry used several CPA practices, even during periods when there was no specific policy debate on the role of dairy products in dietary guidelines. The information provided here could inform public health advocates and policy makers and help them ensure that commercial interests of industry do not impede public health policies and programmes.

  14. Design and application of pulse information acquisition and analysis ...

    African Journals Online (AJOL)

    ... two-dimensional information acquisition, multiplex signals combination and deep data mining. Conclusions: The newly developed system could translate the pulse signals into digital, visual and measurable motion information of vessel. Keywords: Visualized pulse information; Radial artery; B mode ultrasound; Traditional ...

  15. Conceptual design of an integrated information system for safety related analysis of nuclear power plants (IRIS Phase 1)

    International Nuclear Information System (INIS)

    Hofer, K.; Zehnder, P.; Galperin, A.

    1994-01-01

    This report deals with a conceptual design of an integrated information management system, called PSI-IRIS, as needed to assist the analysts for reactor safety related investigations on Swiss nuclear power plants within the project STARS. Performing complicated engineering analyses of an NPP requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multi-disciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety related analysis of an NPP, can be solved by applying computer aided engineering (CAE) principles. These principles are the basis for the design of the integrated information management system PSI-IRIS presented in this report. The basic idea is to create a computerized environment, which includes both database and functional capabilities. The database of the PSI-IRIS consists of two parts, an NPP generic database (GDB) and a collection of analysis results (CASE_LIB). The GDB includes all technical plant data and information needed to generate input decks for all computer codes utilized within the STARS project. The CASE_LIB storage contains the accumulated knowledge, input decks, and result files of the NPP transient analyses. Considerations and analysis of the data types and the required data manipulation capabilities as well as operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database management systems were found of crucial importance, especially providing the necessary flexibility for different data types and the potential for extensibility. (author) 15 figs., tabs., 20 refs

  16. Geographical information system analysis for oceanographic parameters in the coastal waters of Goa, India - A case study

    Digital Repository Service at National Institute of Oceanography (India)

    Suryanarayana, A.; Joglekar, V.V.

    A geographical information system (GIS) is used to create oceanography database and to do the spatial analysis of physical, chemical and biological characteristics of the coastal waters of Goa, India. Vector maps depicting distributions of currents...

  17. Information management in NACD regimes: a comparative analysis

    International Nuclear Information System (INIS)

    Unger, R.

    1998-01-01

    While all non-proliferation, arms control and disarmament (NACD) regimes must address the issue of information management, this area has remained an under-explored part of the arms control field. This paper compares information management processes across a variety of NACD regimes for the purpose of identifying potential synergies between regimes and suggesting means by which to strengthen future arms control verification efforts. The paper explores the information management systems of the International Atomic Energy Agency (IAEA), the United Nations Special Commission in Iraq (UNSCOM), the Conventional Forces in Europe Agreement (CFE), and the Comprehensive Test Ban Treaty (CTBT). (author)

  18. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    1999-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
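
    The probabilistic-modeling step outlined in this abstract can be sketched as a simple Monte Carlo propagation of a generic parameter distribution to a dose-pathway quantity (the Bayesian-updating step is omitted); the lognormal parameters, geometry, and variable names below are hypothetical, not taken from the report.

```python
# Illustrative sketch: propagate a generic lognormal distribution of
# hydraulic conductivity K to advective groundwater travel time.
import random

random.seed(1)
porosity, gradient, distance = 0.30, 0.01, 100.0   # (-), (-), m

def travel_time_years(K):
    """Advective travel time for seepage velocity v = K*i/n over a fixed path."""
    seconds = distance / (K * gradient / porosity)
    return seconds / (365.25 * 24 * 3600)

# Generic lognormal for K (m/s): median ~1e-5, about an order of magnitude spread.
samples = [travel_time_years(random.lognormvariate(-11.5, 1.15))
           for _ in range(10_000)]
samples.sort()
p05, p50, p95 = (samples[int(q * len(samples))] for q in (0.05, 0.50, 0.95))
print(f"travel time: 5th={p05:.1f}, median={p50:.1f}, 95th={p95:.1f} years")
```

    The spread between the 5th and 95th percentiles is the kind of parameter-driven uncertainty band the report's methodology aims to quantify before site-specific data narrow it.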

  19. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.

    2000-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  20. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Polyakov, P.Y.

    2016-01-01

    Roč. 27, č. 3 (2016), s. 538-550 ISSN 2162-237X R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:67985807 Keywords : associative memory * bars problem (BP) * Boolean factor analysis (BFA) * data mining * dimension reduction * Hebbian learning rule * information gain * likelihood maximization (LM) * neural network application * recurrent neural network * statistics Subject RIV: IN - Informatics, Computer Science Impact factor: 6.108, year: 2016

  1. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical service online via Telecare Medical Information Systems (TMIS), owing to the fast development of computer technology, so the security of communication over the network between users and the server is very significant. Authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we show that Jiang et al.'s scheme suffers from ID uselessness and is vulnerable to off-line password guessing and user impersonation attacks if an attacker compromises the legal user's smart card. It also cannot resist denial-of-service (DoS) attacks in two cases: after a successful impersonation attack, and after wrong password input in the password change phase. We therefore propose an improved mutual authentication scheme for telecare medical information systems, where remote monitoring, checking of patients' past medical history records, and medical consultation can be provided over the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.
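
    The mutual authentication property discussed here can be illustrated, very loosely, with a generic nonce-based challenge-response exchange over a shared secret; this is not the authors' scheme, and all names and message formats are invented.

```python
# Generic sketch of nonce-based mutual authentication with a shared secret,
# the kind of property an improved TMIS scheme aims for.
import hmac
import hashlib
import secrets

SECRET = secrets.token_bytes(32)          # shared between smart card and server

def prove(nonce, role):
    """MAC over the nonce and the prover's role, so a reply cannot simply be
    reflected back at the sender (distinguishes the two directions)."""
    return hmac.new(SECRET, role + nonce, hashlib.sha256).digest()

# Server challenges the user, user responds; then the roles are reversed.
server_nonce = secrets.token_bytes(16)
user_proof = prove(server_nonce, b"user")
server_ok = hmac.compare_digest(user_proof, prove(server_nonce, b"user"))

user_nonce = secrets.token_bytes(16)
server_proof = prove(user_nonce, b"server")
user_ok = hmac.compare_digest(server_proof, prove(user_nonce, b"server"))
print(server_ok and user_ok)  # True: both sides hold the secret
```

    Fresh nonces on each run mean an eavesdropper replaying an old proof fails the check, which is the replay resistance the abstract alludes to.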

  2. ADHD performance reflects inefficient but not impulsive information processing: a diffusion model analysis.

    Science.gov (United States)

    Metin, Baris; Roeyers, Herbert; Wiersema, Jan R; van der Meere, Jaap J; Thompson, Margaret; Sonuga-Barke, Edmund

    2013-03-01

    Attention-deficit/hyperactivity disorder (ADHD) is associated with performance deficits across a broad range of tasks. Although individual tasks are designed to tap specific cognitive functions (e.g., memory, inhibition, planning, etc.), these deficits could also reflect general effects related to either inefficient or impulsive information processing or both. These two components cannot be isolated from each other on the basis of classical analysis in which mean reaction time (RT) and mean accuracy are handled separately. Seventy children with a diagnosis of combined type ADHD and 50 healthy controls (between 6 and 17 years) performed two tasks: a simple two-choice RT (2-CRT) task and a conflict control task (CCT) that required higher levels of executive control. RT and errors were analyzed using the Ratcliff diffusion model, which divides decisional time into separate estimates of information processing efficiency (called "drift rate") and speed-accuracy tradeoff (SATO, called "boundary"). The model also provides an estimate of general nondecisional time. Results were the same for both tasks independent of executive load. ADHD was associated with lower drift rate and less nondecisional time. The groups did not differ in terms of boundary parameter estimates. RT and accuracy performance in ADHD appears to reflect inefficient rather than impulsive information processing, an effect independent of executive function load. The results are consistent with models in which basic information processing deficits make an important contribution to the ADHD cognitive phenotype.
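
    The decomposition of RT and accuracy into drift rate, boundary, and nondecision time can be sketched with the simplified EZ-diffusion model (closed-form, unlike the full Ratcliff fitting used in the study); the input statistics below are made up.

```python
# EZ-diffusion (Wagenmakers et al., 2007): mean RT, RT variance and accuracy
# yield drift rate, boundary separation and nondecision time in closed form.
from math import log, exp, copysign

def ez_diffusion(Pc, VRT, MRT, s=0.1):
    """Return (drift rate v, boundary a, nondecision time Ter)."""
    L = log(Pc / (1 - Pc))                       # logit of accuracy
    x = L * (L * Pc**2 - L * Pc + Pc - 0.5) / VRT
    v = copysign(s * x**0.25, Pc - 0.5)          # information-processing efficiency
    a = s**2 * L / v                             # boundary (speed-accuracy tradeoff)
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - exp(y)) / (1 + exp(y))   # mean decision time
    return v, a, MRT - mdt                       # Ter = the nondecisional remainder

# Hypothetical participant: 90% accuracy, RT variance 0.05 s^2, mean RT 0.6 s.
v, a, ter = ez_diffusion(Pc=0.90, VRT=0.05, MRT=0.60)
print(f"drift={v:.3f}, boundary={a:.3f}, Ter={ter:.3f}")
```

    In the study's terms, a lower drift rate with unchanged boundary is what distinguishes inefficient from impulsive processing.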

  3. Lessons from a comparative (cross-country) study using conjoint analysis: Why not use all the information?

    DEFF Research Database (Denmark)

    Blunch, Niels Johan

    Re-examination of data from two comparative (cross-country) studies using conjoint analysis shows that significant improvement can be achieved by using two often neglected kinds of a priori information: Knowledge of the expected order of preferences for the various levels of one or more attributes...

  4. Manufacturing Technology Information Analysis Center: Knowledge Is Strength

    Science.gov (United States)

    Safar, Michal

    1992-01-01

    The Center's primary function is to facilitate technology transfer within DoD, other government agencies and industry. The DoD has recognized the importance of technology transfer, not only to support specific weapon system manufacture, but to strengthen the industrial base that sustains DoD. MTIAC uses an experienced technical staff of engineers and information specialists to acquire, analyze, and disseminate technical information. Besides ManTech project data, MTIAC collects manufacturing technology from other government agencies, commercial publications, proceedings, and various international sources. MTIAC has various means of disseminating this information. Much of the technical data is on user accessible data bases. The Center researches and writes a number of technical reports each year and publishes a newsletter monthly. Customized research is performed in response to specific inquiries from government and industry. MTIAC serves as a link between Government and Industry to strengthen the manufacturing technology base through the dissemination of advanced manufacturing information.

  5. Quantifying information transfer by protein domains: Analysis of the Fyn SH2 domain structure

    Directory of Open Access Journals (Sweden)

    Serrano Luis

    2008-10-01

    Full Text Available Abstract Background Efficient communication between distant sites within a protein is essential for cooperative biological response. Although often associated with large allosteric movements, more subtle changes in protein dynamics can also induce long-range correlations. However, an appropriate formalism that directly relates protein structural dynamics to information exchange between functional sites is still lacking. Results Here we introduce a method to analyze protein dynamics within the framework of information theory and show that signal transduction within proteins can be considered as a particular instance of communication over a noisy channel. In particular, we analyze the conformational correlations between protein residues and apply the concept of mutual information to quantify information exchange. Mapping out changes of mutual information on the protein structure then allows visualizing how distal communication is achieved. We illustrate the approach by analyzing information transfer by the SH2 domain of Fyn tyrosine kinase, obtained from Monte Carlo dynamics simulations. Our analysis reveals that the Fyn SH2 domain forms a noisy communication channel that couples residues located in the phosphopeptide and specificity binding sites and a number of residues at the other side of the domain near the linkers that connect the SH2 domain to the SH3 and kinase domains. We find that for this particular domain, communication is affected by a series of contiguous residues that connect distal sites by crossing the core of the SH2 domain. Conclusion As a result, our method provides a means to directly map the exchange of biological information on the structure of protein domains, making it clear how binding triggers conformational changes in the protein structure. As such it provides a structural road, next to the existing attempts at sequence level, to predict long-range interactions within protein structures.
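
    The core quantity of this approach, mutual information between the conformational states of two residues, can be estimated from a joint histogram over simulation frames; the sketch below uses synthetic two-state trajectories rather than real Fyn SH2 data.

```python
# Estimate I(X;Y) between two discretized residue-state trajectories from
# their joint histogram; coupled residues share information, independent
# ones do not.
from collections import Counter
from math import log2
import random

def mutual_information(xs, ys):
    """I(X;Y) = sum p(x,y) * log2[ p(x,y) / (p(x) p(y)) ] over observed pairs."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

random.seed(0)
# Residue A flips freely; residue B copies A 90% of the time (coupled sites).
res_a = [random.randint(0, 1) for _ in range(5000)]
res_b = [a if random.random() < 0.9 else 1 - a for a in res_a]
uncoupled = [random.randint(0, 1) for _ in range(5000)]

print(mutual_information(res_a, res_b))      # high for the coupled pair
print(mutual_information(res_a, uncoupled))  # near zero for the independent pair
```

    Mapping such pairwise values onto the structure is what lets the authors visualize which residues form the communication channel.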

  6. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler.

  7. Data Analysis Approaches for the Risk-Informed Safety Margins Characterization Toolkit

    International Nuclear Information System (INIS)

    Mandelli, Diego; Alfonsi, Andrea; Maljovec, Daniel P.; Parisi, Carlo; Cogliati, Joshua J.; Talbot, Paul W.; Smith, Curtis L.; Rabiti, Cristian; Picoco, Claudia

    2016-01-01

    In the past decades, several numerical simulation codes have been employed to simulate accident dynamics (e.g., RELAP5-3D, RELAP-7, MELCOR, MAAP). In order to evaluate the impact of uncertainties on accident dynamics, several stochastic methodologies have been coupled with these codes. These stochastic methods range from classical Monte-Carlo and Latin Hypercube sampling to stochastic polynomial methods. Similar approaches have been introduced into the risk and safety community, where stochastic methods (such as RAVEN, ADAPT, MCDET, ADS) have been coupled with safety analysis codes in order to evaluate the safety impact of timing and sequencing of events. These approaches are usually called Dynamic PRA or simulation-based PRA methods. These uncertainty and safety methods usually generate a large number of simulation runs (database storage may be on the order of gigabytes or higher). The scope of this paper is to present a broad overview of methods and algorithms that can be used to analyze and extract information from large data sets containing time-dependent data. In this context, "extracting information" means constructing input-output correlations, finding commonalities, and identifying outliers. Some of the algorithms presented here have been developed or are under development within the RAVEN statistical framework.

  8. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas.

  9. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method to two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.
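
    The two central quantities of the method, as described in this abstract, can be written schematically as follows; the notation is reconstructed for illustration and may differ from the paper's.

```latex
% Relative entropy rate (RER) between the path distributions of the process
% at parameter vector \theta and at a perturbed vector \theta + \epsilon:
\mathcal{H}\left(Q_{\theta} \,\middle\|\, Q_{\theta+\epsilon}\right)
  = \lim_{T \to \infty} \frac{1}{T}\,
    \mathbb{E}_{Q_{\theta}}\!\left[
      \log \frac{dQ^{[0,T]}_{\theta}}{dQ^{[0,T]}_{\theta+\epsilon}}
    \right]

% For small perturbations, a quadratic expansion of the RER defines the
% pathwise Fisher information matrix (FIM) \mathcal{F}(\theta):
\mathcal{H}\left(Q_{\theta} \,\middle\|\, Q_{\theta+\epsilon}\right)
  = \tfrac{1}{2}\, \epsilon^{\top} \mathcal{F}(\theta)\, \epsilon
    + O\!\left(\lvert\epsilon\rvert^{3}\right)
```

    Because both quantities reduce to ergodic averages of force-field terms, they can be accumulated along a single trajectory, which is the tractability advantage the abstract emphasizes.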

  10. Optimization of rainfall networks using information entropy and temporal variability analysis

    Science.gov (United States)

    Wang, Wenqi; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-04-01

    Rainfall networks are the most direct sources of precipitation data, and their optimization and evaluation are essential. Information entropy can not only represent the uncertainty of rainfall distribution but can also reflect the correlation and information transmission between rainfall stations. Using entropy, this study optimizes rainfall networks of similar size located in two big cities in China, Shanghai (in the Yangtze River basin) and Xi'an (in the Yellow River basin), with respect to temporal variability analysis. Through an easy-to-implement greedy ranking algorithm based on the criterion called Maximum Information Minimum Redundancy (MIMR), stations of the networks in the two areas (each area is further divided into two subareas) are ranked over sliding inter-annual series and under different meteorological conditions. It is found that observation series with different starting days affect the ranking, pointing to temporal variability in network evaluation. We propose a dynamic network evaluation framework that accounts for this temporal variability by ranking stations under different starting days with a fixed time window (1-year, 2-year, and 5-year). We can thereby identify rainfall stations that are temporarily important or redundant and provide useful suggestions for decision makers. The proposed framework can serve as a supplement to the primary MIMR optimization approach. In addition, during different periods (wet season or dry season) the optimal network from MIMR exhibits differences in entropy values, and the optimal network from the wet season tends to produce higher entropy values. Differences in the spatial distribution of the optimal networks suggest that optimizing the rainfall network for changing meteorological conditions is recommended.
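
    A stripped-down version of the greedy, entropy-based ranking described here can be sketched as follows: repeatedly add the station that most increases the joint entropy of the selected set. The full MIMR criterion also penalizes redundancy explicitly, which is omitted for brevity, and the rainfall records below are synthetic.

```python
# Greedy entropy-based station ranking in the spirit of MIMR; rainfall is
# discretized into integer bins, as entropy estimation requires.
from collections import Counter
from math import log2
import random

def joint_entropy(series_list):
    """Shannon entropy (bits) of the joint discretized observations."""
    n = len(series_list[0])
    counts = Counter(zip(*series_list))
    return -sum((c / n) * log2(c / n) for c in counts.values())

def greedy_rank(stations):
    """Order station ids by marginal contribution to joint entropy."""
    remaining, chosen = dict(stations), []
    while remaining:
        best = max(remaining,
                   key=lambda s: joint_entropy(
                       [stations[i] for i in chosen] + [remaining[s]]))
        chosen.append(best)
        del remaining[best]
    return chosen

random.seed(7)
base = [random.randint(0, 3) for _ in range(2000)]
stations = {
    "A": base,                                              # informative
    "B": [min(3, b + random.randint(0, 1)) for b in base],  # mostly redundant with A
    "C": [random.randint(0, 3) for _ in range(2000)],       # independent, informative
}
print(greedy_rank(stations))  # the redundant station ranks last
```

    Re-running the ranking with different window start days, as the paper proposes, would reveal which stations keep their rank and which are only temporarily important.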

  11. Application of information theory for the analysis of cogeneration-system performance

    International Nuclear Information System (INIS)

    Takahashi, Kazuki; Ishizaka, Tadashi

    1998-01-01

    Successful cogeneration system performance depends critically upon the correct estimation of load variation and the accuracy of demand prediction. We need not only aggregated annual heat and electricity demands, but also hourly and monthly patterns, in order to evaluate a cogeneration system's performance by computer simulation. These data are usually obtained from actual measurements of energy demand in existing buildings. However, it is extremely expensive to collect actual energy demand data and store it over a long period for many buildings, and we face the question of whether it is really necessary to survey hourly demands. This paper provides a sensitivity analysis of the influence of demand-prediction error upon the efficiency of a cogeneration system, so as to evaluate the relative importance of various demand components. These components are annual energy demand, annual heat-to-electricity ratio, daily load factor, and so forth. Our approach employs concepts from information theory to construct a mathematical model. The analysis indicates the relative importance of the demand indices and identifies what may serve as a good measure for assessing the efficiency of a cogeneration system for planning purposes. (Author)
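
    The information-theoretic idea of characterizing demand patterns can be illustrated with the Shannon entropy of a normalized hourly load profile (flat profiles maximize it, peaky profiles reduce it); the demand figures are invented and this is not the paper's model.

```python
# Shannon entropy of a daily demand profile as a simple measure of how
# evenly load is spread across the hours.
from math import log2

def profile_entropy(hourly_demand):
    """Entropy (bits) of demand shares across the hours of a day."""
    total = sum(hourly_demand)
    shares = [d / total for d in hourly_demand if d > 0]
    return -sum(p * log2(p) for p in shares)

flat = [100.0] * 24                       # constant demand all day
peaky = [10.0] * 20 + [500.0] * 4         # strong evening peak

print(profile_entropy(flat))   # log2(24), the maximum possible
print(profile_entropy(peaky))  # lower: demand concentrated in few hours
```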

  12. EEG-Informed fMRI: A Review of Data Analysis Methods

    Science.gov (United States)

    Abreu, Rodolfo; Leal, Alberto; Figueiredo, Patrícia

    2018-01-01

    The simultaneous acquisition of electroencephalography (EEG) with functional magnetic resonance imaging (fMRI) is a very promising non-invasive technique for the study of human brain function. Despite continuous improvements, it remains a challenging technique, and a standard methodology for data analysis is yet to be established. Here we review the methodologies that are currently available to address the challenges at each step of the data analysis pipeline. We start by surveying methods for pre-processing both EEG and fMRI data. On the EEG side, we focus on the correction for several MR-induced artifacts, particularly the gradient and pulse artifacts, as well as other sources of EEG artifacts. On the fMRI side, we consider image artifacts induced by the presence of EEG hardware inside the MR scanner, and the contamination of the fMRI signal by physiological noise of non-neuronal origin, including a review of several approaches to model and remove it. We then provide an overview of the approaches specifically employed for the integration of EEG and fMRI when using EEG to predict the blood oxygenation level dependent (BOLD) fMRI signal, the so-called EEG-informed fMRI integration strategy, the most commonly used strategy in EEG-fMRI research. Finally, we systematically review methods used for the extraction of EEG features reflecting neuronal phenomena of interest. PMID:29467634
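
    The EEG-informed fMRI integration strategy summarized here can be sketched as convolving an EEG feature time course with a canonical haemodynamic response function (HRF) to form a GLM regressor; the EEG feature is synthetic, and the double-gamma constants follow the common SPM-style parameterization.

```python
# Build a BOLD regressor from an EEG feature series (e.g., alpha-band power
# per TR) by convolution with a canonical double-gamma HRF.
from math import gamma, exp

def hrf(t, a1=6.0, a2=16.0, c=1/6):
    """Canonical double-gamma haemodynamic response at time t (seconds)."""
    if t <= 0:
        return 0.0
    peak = t ** (a1 - 1) * exp(-t) / gamma(a1)
    undershoot = t ** (a2 - 1) * exp(-t) / gamma(a2)
    return peak - c * undershoot

def build_regressor(feature, tr):
    """Discrete convolution of an EEG feature series with the sampled HRF."""
    kernel = [hrf(i * tr) for i in range(int(30 / tr))]   # 30 s of HRF support
    return [sum(feature[n - k] * kernel[k]
                for k in range(len(kernel)) if 0 <= n - k < len(feature))
            for n in range(len(feature))]

tr = 2.0                                   # repetition time (s)
alpha_power = [0.0] * 50
alpha_power[10] = 1.0                      # a single burst of alpha activity
regressor = build_regressor(alpha_power, tr)
peak_scan = max(range(len(regressor)), key=regressor.__getitem__)
print((peak_scan - 10) * tr)               # BOLD peak lags the burst by ~6 s
```

    The resulting regressor is then entered into a voxel-wise GLM to localize BOLD fluctuations that covary with the EEG feature, which is the integration strategy the review surveys.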

  13. EEG-Informed fMRI: A Review of Data Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rodolfo Abreu

    2018-02-01

    Full Text Available The simultaneous acquisition of electroencephalography (EEG) with functional magnetic resonance imaging (fMRI) is a very promising non-invasive technique for the study of human brain function. Despite continuous improvements, it remains a challenging technique, and a standard methodology for data analysis is yet to be established. Here we review the methodologies that are currently available to address the challenges at each step of the data analysis pipeline. We start by surveying methods for pre-processing both EEG and fMRI data. On the EEG side, we focus on the correction for several MR-induced artifacts, particularly the gradient and pulse artifacts, as well as other sources of EEG artifacts. On the fMRI side, we consider image artifacts induced by the presence of EEG hardware inside the MR scanner, and the contamination of the fMRI signal by physiological noise of non-neuronal origin, including a review of several approaches to model and remove it. We then provide an overview of the approaches specifically employed for the integration of EEG and fMRI when using EEG to predict the blood oxygenation level dependent (BOLD) fMRI signal, the so-called EEG-informed fMRI integration strategy, the most commonly used strategy in EEG-fMRI research. Finally, we systematically review methods used for the extraction of EEG features reflecting neuronal phenomena of interest.

  14. College Students’ Information Needs and Information Seeking Behaviors regarding Personal Information

    Directory of Open Access Journals (Sweden)

    Yu-Wen Liu

    2017-12-01

    This study analyzed college students’ reactions toward the issues of personal information. Students’ needs and seeking behaviors for personal information were assessed. Relevant literature was reviewed to frame the research questions and design the questionnaire items for the survey. Survey subjects were students from a university in northern Taiwan. A set of questionnaire items was used to collect research data. Statistical analysis of 252 valid responses reveals that some items were highly rated: students reported a strong need for knowledge about security threats to personal information (M = 4.29). They reacted strongly on acquiring knowledge and resources through the Internet (M = 4.24). They preferred resources that are clear and easy to understand (M = 4.04). However, most students had little faith in either government or non-governmental organizations securing their personal information (M < 3.0 for most items). More effort from educators and government should be made in the future to improve personal information use and reduce uncertainty in its use.

  15. Analysis and design on airport safety information management system

    Directory of Open Access Journals (Sweden)

    Yan Lin

    2017-01-01

    The airport safety information management system is the foundation for implementing safety operation, risk control, safety performance monitoring, and safety management decisions at an airport. The paper puts forward the architecture of an airport safety information management system based on the B/S model, focuses on the safety information processing flow, designs the functional modules, and proposes the supporting conditions for system operation. Building the system helps establish a long-term mechanism driven by safety information and continually raises the airport's safety management level and control proficiency.

  16. Simple LED spectrophotometer for analysis of color information.

    Science.gov (United States)

    Kim, Ji-Sun; Kim, A-Hee; Oh, Han-Byeol; Goh, Bong-Jun; Lee, Eun-Suk; Kim, Jun-Sik; Jung, Gu-In; Baek, Jin-Young; Jun, Jae-Hoon

    2015-01-01

    A spectrophotometer is the basic measuring equipment essential to most research activity fields requiring samples to be measured, such as physics, biotechnology and food engineering. This paper proposes a system that is able to detect sample concentration and color information by using LED and color sensor. Purity and wavelength information can be detected by CIE diagram, and the concentration can be estimated with purity information. This method is more economical and efficient than existing spectrophotometry, and can also be used by ordinary persons. This contribution is applicable to a number of fields because it can be used as a colorimeter to detect the wavelength and purity of samples.
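    The CIE-diagram step can be sketched as follows. Assuming linear sRGB sensor readings and the D65 white point (both assumptions, since the record does not give the sensor's calibration), chromaticity coordinates and excitation purity are:

```python
import math

# Linear sRGB -> CIE XYZ matrix (D65 reference white); an assumption here,
# since the actual sensor calibration is not specified in the record.
M = [(0.4124, 0.3576, 0.1805),
     (0.2126, 0.7152, 0.0722),
     (0.0193, 0.1192, 0.9505)]
WHITE = (0.3127, 0.3290)  # D65 white point in xy

def rgb_to_xy(r, g, b):
    """Chromaticity coordinates (x, y) from linear RGB readings."""
    X, Y, Z = (m[0] * r + m[1] * g + m[2] * b for m in M)
    s = X + Y + Z
    return X / s, Y / s

def excitation_purity(xy, locus_xy):
    """Ratio of the sample->white distance to the locus->white distance
    along the dominant-wavelength line; the spectral-locus point must be
    supplied externally (e.g. from a table of the CIE spectral locus)."""
    return math.dist(xy, WHITE) / math.dist(locus_xy, WHITE)
```

    The purity value can then be mapped to concentration via a calibration curve, which is the estimation step the record describes.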

  17. Managing Returnable Containers Logistics - A Case Study Part I - Physical and Information Flow Analysis

    Directory of Open Access Journals (Sweden)

    Reza A. Maleki

    2011-05-01

    This case study paper is the result of a project conducted on behalf of a company, referred to here as Midwest Assembly and Manufacturing (MAAN). The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its own facility as well as at the supplier sites. The objective of this project was to develop a proposal to enable MAAN to more effectively track and manage its returnable containers. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach to the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.

  18. The prospective relationship between satisfaction with information and symptoms of depression and anxiety in breast cancer: A structural equation modeling analysis.

    Science.gov (United States)

    Faller, Hermann; Strahl, André; Richard, Matthias; Niehues, Christiane; Meng, Karin

    2017-11-01

    Previous research has demonstrated associations between satisfaction with information and reduced emotional distress in cancer patients. However, as most studies were cross-sectional, the direction of this relationship remained unclear. We therefore aimed to test whether information satisfaction predicted subsequent depression and anxiety levels and, reciprocally, whether depression and anxiety levels predicted subsequent information satisfaction, thus clarifying the direction of impact. We performed a secondary analysis of a prospective cohort study with 436 female breast cancer patients (mean age 51 years). We measured information satisfaction with 2 self-developed items, symptoms of depression with the 2-item Patient Health Questionnaire and symptoms of anxiety with the 2-item Generalized Anxiety Disorder Scale. We created 2 structural equation models, 1 for depression and 1 for anxiety, that examined the prediction of 1-year depression (or anxiety) levels by baseline information satisfaction and, in the same model, 1-year information satisfaction by baseline depression (or anxiety) levels (cross-lagged panel analysis). Baseline information satisfaction predicted 1-year levels of both depression (beta = -0.17) and anxiety; reciprocally, baseline depression and anxiety levels predicted 1-year information satisfaction, adjusting for its baseline score. Our results suggest a bidirectional relationship between information satisfaction and symptoms of depression and anxiety. Thus, provision of information may reduce subsequent depression and anxiety, while reducing depression and anxiety levels may increase satisfaction with received information. Combining the provision of information with emotional support may be particularly beneficial. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Value of information analysis from a societal perspective: a case study in prevention of major depression.

    Science.gov (United States)

    Mohseninejad, Leyla; van Baal, Pieter H M; van den Berg, Matthijs; Buskens, Erik; Feenstra, Talitha

    2013-06-01

    Productivity losses usually have a considerable impact on cost-effectiveness estimates, while their estimated values are often relatively uncertain. Therefore, parameters related to these indirect costs play a role in setting priorities for future research from a societal perspective. Until now, however, value of information analyses have usually applied a health care perspective for economic evaluations. Hence, the effect of productivity losses has rarely been investigated in such analyses. The aim of the current study therefore was to investigate the effects of including or excluding productivity costs in value of information analyses. Expected value of perfect information (EVPI) analysis was performed in a cost-effectiveness evaluation of prevention from both societal and health care perspectives, allowing the two perspectives to be compared. Priorities for future research were determined by partial EVPI. The program to prevent major depression in patients with subthreshold depression was opportunistic screening followed by minimal-contact psychotherapy. The EVPI indicated that, regardless of perspective, further research is potentially worthwhile. Partial EVPI results underlined the importance of productivity losses when a societal perspective was considered. Furthermore, priority setting for future research differed according to perspective. The results illustrated that advice for future research will differ for a health care versus a societal perspective, and hence value of information analysis should be adjusted to the perspective that is relevant for the decision makers involved. The outcomes underlined the need for carefully choosing the suitable perspective for the decision problem at hand. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
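    The core EVPI computation is perspective-agnostic: the perspective only changes which costs enter the net-benefit samples (productivity losses included under a societal perspective, excluded under a health care perspective). A minimal Monte Carlo sketch with illustrative inputs, not the study's model:

```python
def evpi(nb_samples):
    """Per-person expected value of perfect information.
    nb_samples[k][d] = net benefit of decision d in simulation k; under a
    societal perspective the net benefit already nets out productivity losses."""
    n = len(nb_samples)
    n_dec = len(nb_samples[0])
    # Expected net benefit of each decision under current information.
    enb = [sum(s[d] for s in nb_samples) / n for d in range(n_dec)]
    # With perfect information we could pick the best decision per simulation.
    ev_perfect = sum(max(s) for s in nb_samples) / n
    return ev_perfect - max(enb)
```

    Per-person EVPI is then scaled by the size of the affected population to judge whether further research is potentially worthwhile.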

  20. FACTORS INFLUENCING INFORMATION TECHNOLOGY ADOPTION: A CROSS-SECTIONAL ANALYSIS

    OpenAIRE

    Stroade, Jeri L.; Schurle, Bryan W.

    2003-01-01

    This project will explore information technology adoption issues. The unique characteristics of information technology will be discussed. Advantages and disadvantages to adoption will also be identified. Finally, a statistical model of Internet adoption will be developed to estimate the impacts of certain variables on the underlying process of information technology adoption.

  1. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  2. Directed information measures in neuroscience

    CERN Document Server

    Vicente, Raul; Lizier, Joseph

    2014-01-01

    Analysis of information transfer has found rapid adoption in neuroscience, where a highly dynamic transfer of information continuously runs on top of the brain's slowly-changing anatomical connectivity. Measuring such transfer is crucial to understanding how flexible information routing and processing give rise to higher cognitive function. Directed Information Measures in Neuroscience reviews recent developments of concepts and tools for measuring information transfer, their application to neurophysiological recordings and analysis of interactions. Written by the most active researchers in the field, the book discusses the state of the art, future prospects and challenges on the way to an efficient assessment of neuronal information transfer. Highlights include the theoretical quantification and practical estimation of information transfer, description of transfer locally in space and time, multivariate directed measures, information decomposition among a set of stimulus/responses variables, and the relation ...
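    A core tool in this literature is transfer entropy, a directed, model-free measure of information transfer from a source to a target series. A minimal plug-in estimator for binary series with history length 1 (practical analyses use longer histories and bias corrections):

```python
from math import log2
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in transfer entropy TE(src -> dst), history length 1, in bits."""
    # Triples (target next, target past, source past).
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((x_prev, y_prev) for _, x_prev, y_prev in triples)
    p_xy = Counter((x_next, x_prev) for x_next, x_prev, _ in triples)
    p_x = Counter(x_prev for _, x_prev, _ in triples)
    te = 0.0
    for (x_next, x_prev, y_prev), c in p_xyz.items():
        p1 = c / p_yz[(x_prev, y_prev)]            # p(x_next | x_prev, y_prev)
        p2 = p_xy[(x_next, x_prev)] / p_x[x_prev]  # p(x_next | x_prev)
        te += (c / n) * log2(p1 / p2)
    return te
```

    When the target simply copies the source with a one-step lag, the estimate in the forward direction dominates the reverse one, which is the asymmetry that makes the measure directed.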

  3. A qualitative analysis of Māori and Pacific smokers' views on informed choice and smoking.

    Science.gov (United States)

    Gifford, Heather; Tautolo, El-Shadan; Erick, Stephanie; Hoek, Janet; Gray, Rebecca; Edwards, Richard

    2016-05-17

    Tobacco companies frame smoking as an informed choice, a strategy that holds individuals responsible for harms they incur. Few studies have tested this argument, and even fewer have examined how informed indigenous smokers or those from minority ethnicities are when they start smoking. We explored how young adult Māori and Pacific smokers interpreted 'informed choice' in relation to smoking. Recruiting via advertising, existing networks and word of mouth, we undertook qualitative in-depth interviews with 20 Māori and Pacific young adults aged 18-26 years who smoked. Data were analysed using an informed-choice framework developed by Chapman and Liberman. We used a thematic analysis approach to identify themes that extended this framework. Few participants considered themselves well informed and none met more than the framework's initial two criteria. Most reflected on their unthinking uptake and subsequent addiction, and identified environmental factors that had facilitated uptake. Nonetheless, despite this context, most agreed that they had made an informed choice to smoke. The discrepancy between participants' reported knowledge and understanding of smoking's risks, and their assessment of smoking as an informed choice, reflects their view of smoking as a symbol of adulthood. Policies that make tobacco more difficult to use in social settings could help change social norms around smoking and the ease with which initiation and addiction currently occur. Published by the BMJ Publishing Group Limited.

  4. Optimization of information properties of NAA with respect to information content and profitability of results

    International Nuclear Information System (INIS)

    Obrusnik, I.; Eckschlager, K.

    1986-01-01

    Information properties of analytical results, together with other important parameters, especially economic ones, can be used for the optimization of analytical procedures. We have therefore proposed a computational technique for the optimization of multielement neutron activation analysis (NAA) based on information content and profitability. The optimization starts with the prediction of the γ-ray spectra to be expected during analysis under given experimental conditions (sample size, irradiation, decay and counting times, etc.) and with the calculation of detection and determination limits. In the next step, the information contents for the determination of particular elements and for the simultaneous determination of element groups are computed. The information content depends on, or is closely connected with, such properties of the method as selectivity, sensitivity, precision, accuracy and, as in other cases of trace analysis, the detection limit. Then the information profitability (IP), taking into account the information content and relevance (appreciation of specific information according to its contribution to the solution of a given problem) together with economic aspects, can be calculated. This function can be used for the optimization of a particular NAA procedure, for the mutual comparison of different variants of NAA, and also for comparison with other analytical methods. The use of information profitability for the optimization of NAA is shown in a practical example: the INAA analysis of urban particulate matter SRM 1648 produced by NBS (USA). (author)
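    The information content of a quantitative determination can be sketched as the divergence between a uniform prior over the expected concentration range and the normal posterior the analysis delivers, in which case it reduces to a simple closed form. This is a textbook-style simplification in the spirit of Eckschlager's measure, not the paper's full IP function:

```python
import math

def information_content(prior_width, sd, base=2):
    """Information gained when a determination narrows a uniform prior of
    width prior_width down to a normal posterior with standard deviation sd
    (divergence-based measure; returned in bits for base=2)."""
    return math.log(prior_width / (sd * math.sqrt(2 * math.pi * math.e)), base)
```

    A profitability-style weighting would then multiply this content by a relevance factor and account for the cost of the analysis.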

  5. Trading in markets with noisy information: an evolutionary analysis

    Science.gov (United States)

    Bloembergen, Daan; Hennes, Daniel; McBurney, Peter; Tuyls, Karl

    2015-07-01

    We analyse the value of information in a stock market where information can be noisy and costly, using techniques from empirical game theory. Previous work has shown that the value of information follows a J-curve, where averagely informed traders perform below market average, and only insiders prevail. Here we show that both noise and cost can change this picture, in several cases leading to opposite results where insiders perform below market average, and averagely informed traders prevail. Moreover, we investigate the effect of random explorative actions on the market dynamics, showing how these lead to a mix of traders being sustained in equilibrium. These results provide insight into the complexity of real marketplaces, and show under which conditions a broad mix of different trading strategies might be sustainable.
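    The evolutionary analysis can be illustrated with two-strategy replicator dynamics. The payoff function below is purely hypothetical (an informed trader's edge shrinks as more traders buy the same information, and information carries a fixed cost); it is not the paper's empirical game, but it reproduces the qualitative result that a mix of traders is sustained in equilibrium:

```python
def replicator_step(x, payoff, dt=0.01):
    """One Euler step of two-strategy replicator dynamics.
    x: population share of strategy A; payoff(x) -> (fitness_A, fitness_B)."""
    fa, fb = payoff(x)
    avg = x * fa + (1 - x) * fb
    return x + dt * x * (fa - avg)

def payoff(x, cost=0.3):
    """Hypothetical payoffs: the informed trader's edge (1 - x) is diluted
    as the informed share x grows, and information costs a flat fee."""
    return (1.0 - x) - cost, 0.0

x = 0.5
for _ in range(20000):
    x = replicator_step(x, payoff)
# The dynamics settle at the interior equilibrium where (1 - x) - cost = 0,
# i.e. x* = 1 - cost = 0.7: a stable mix of informed and uninformed traders.
```

    Adding noise or raising the information cost shifts this equilibrium, which is the kind of comparative statics the paper explores empirically.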

  6. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    Science.gov (United States)

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

    Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of the omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore the valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge less likely exists between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is available at https://github.com/Zhangxf-ccnu/pDNA. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
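    The nonparanormal relaxation rests on a simple device: replace each variable by a rank-based Gaussian score before fitting the graphical model. A minimal sketch of that transform (published estimators also Winsorize the ranks and handle ties; both are omitted here):

```python
from statistics import NormalDist

def npn_transform(values):
    """Rank-based Gaussianization used by nonparanormal graphical models:
    map each observation to a standard-normal score via its empirical rank.
    (Practical estimators Winsorize the ranks; omitted here for brevity.)"""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, idx in enumerate(order, start=1):
        z[idx] = nd.inv_cdf((rank - 0.5) / n)
    return z
```

    Because the transform is monotone, it preserves the ordering of the data while making any continuous marginal look Gaussian, which is exactly what lets the Gaussian graphical machinery be reused.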

  7. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Directory of Open Access Journals (Sweden)

    E. Malina

    2018-02-01

    Atmospheric methane is comprised of multiple isotopic molecules, the most abundant being 12CH4 and 13CH4, making up 98 and 1.1% of atmospheric methane, respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic, e.g. marshland, or abiogenic, e.g. fracking) via the ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between −10 and −80‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. We suggest that a δ13C difference of 10‰ is sufficient to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that, using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.
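    The δ13C notation can be made concrete: with R the 13C/12C ratio of a sample and R_VPDB the reference-standard ratio, δ13C in per mil is (R/R_VPDB − 1) × 1000. The standard ratio and the classification threshold below are assumptions for illustration, not values from the paper:

```python
R_VPDB = 0.0112372  # commonly quoted 13C/12C ratio of the VPDB standard

def delta13C(r_sample):
    """delta-13C of a sample in per mil (‰)."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def source_type(d13c, threshold=-45.0):
    """Crude illustrative split: values nearer zero lean abiogenic,
    strongly negative values lean biogenic (threshold is arbitrary here)."""
    return "abiogenic" if d13c > threshold else "biogenic"
```

    A 10‰ separation between two δ13C values, the discrimination target above, corresponds to roughly a 1% relative difference in the isotopologue ratio, which is what drives the stringent 13CH4 precision requirement.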

  8. Procedural Information and Behavioral Control: Longitudinal Analysis of the Intention-Behavior Gap in the Context of Recycling

    Directory of Open Access Journals (Sweden)

    Sonny Rosenthal

    2018-01-01

    The theory of planned behavior states that individuals act on their intentions, especially when they have behavioral control. The current study examines how seeking recycling-related procedural information (i.e., information about how and where to recycle) is related to behavioral control. Hypothesis testing used hierarchical ordinary least squares regression analysis of longitudinal data from 553 survey respondents. Results supported seven hypotheses. Most notably, procedural information seeking both mediated and moderated the relationship between intention and behavior. Further, the moderation effect was itself mediated by behavioral control. The argument for this mediated moderation is that information seeking enhances behavioral control, and it is primarily behavioral control that moderates the relationship between intention and behavior. These results have implications for the theory of planned behavior and, more generally, for how individuals use information to support their behaviors.
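    The moderation finding can be illustrated with a toy subgroup comparison: if intention only translates into behavior when procedural information was sought, the intention-behavior slope differs sharply between non-seekers and seekers. The data below are fabricated for illustration, not the study's:

```python
def slope(xs, ys):
    """OLS slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Hypothetical data: behavior = intention * seeking, i.e. intention only
# translates into behavior when procedural information was sought.
intention = [1, 2, 3, 4, 1, 2, 3, 4]
seeking   = [0, 0, 0, 0, 1, 1, 1, 1]
behavior  = [i * s for i, s in zip(intention, seeking)]

low  = slope(intention[:4], behavior[:4])   # non-seekers: flat slope
high = slope(intention[4:], behavior[4:])   # seekers: steep slope
```

    In the actual analysis this subgroup contrast is captured by an intention-by-seeking interaction term in the hierarchical regression.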

  9. An analysis of the role of information and communication technology sectors on Japanese national economy from 1995 through 2005: An application of multiplier analysis

    International Nuclear Information System (INIS)

    Zuhdi, Ubaidillah

    2015-01-01

    The purpose of this study is to continue previous studies that focused on Japanese Information and Communication Technology (ICT) sectors. More specifically, this study aims to analyze the role of ICT sectors in the Japanese national economy using the simple household income multiplier, one of the analysis instruments in Input-Output (IO) analysis. The analysis period of this study is 1995-2005. The results show that the sectors did not have an important role in the national economy of Japan during this period. The results also show that, from the point of view of the multiplier, the Japanese national economy tended to be stable during the period.
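    The household income multiplier comes out of the Leontief inverse of the IO table. A two-sector sketch with a hypothetical coefficient matrix and income coefficients (not Japan's actual IO data):

```python
def leontief_inverse_2x2(A):
    """(I - A)^{-1} for a 2-sector technical coefficient matrix A."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def income_multipliers(A, h):
    """Simple household income multiplier per sector: household income
    generated economy-wide by one unit of final demand for that sector,
    with h the per-unit household income coefficients."""
    L = leontief_inverse_2x2(A)
    return [sum(h[i] * L[i][j] for i in range(2)) for j in range(2)]

# Hypothetical 2-sector economy (e.g. an ICT sector and the rest):
A = [[0.2, 0.3], [0.1, 0.4]]
h = [0.3, 0.5]
m = income_multipliers(A, h)
```

    Comparing the ICT sector's multiplier against other sectors' over successive IO tables is the kind of evidence the study uses to judge the sector's role.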

  10. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has recently been introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on a theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes for the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
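    For quasi-static Rayleigh fading, the Probability of Nonzero Secrecy Capacity has a standard closed form (due to Barros and Rodrigues) that depends only on the average SNRs of the main and wiretap channels; whether this is the exact formula used in the paper is an assumption:

```python
def prob_nonzero_secrecy(snr_main_db, snr_eave_db):
    """P(Cs > 0) for quasi-static Rayleigh fading, given average SNRs (dB)
    of the main and eavesdropper channels: gamma_M / (gamma_M + gamma_W)."""
    gm = 10 ** (snr_main_db / 10)  # dB -> linear average SNR
    gw = 10 ** (snr_eave_db / 10)
    return gm / (gm + gw)
```

    The formula makes the experimental point above concrete: even when the eavesdropper's average SNR exceeds the legitimate receiver's, the probability of a nonzero secrecy capacity stays strictly positive, so secure transmission remains possible some of the time.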

  11. Analysis of the Deployed Military Health Information System and Its Ability to Satisfy Requirements of Public Law 105-85, Section 765

    National Research Council Canada - National Science Library

    Brown, David

    2005-01-01

    .... The information obtained in this analysis will be used to further identify the strengths and weaknesses of the deployed medical information systems in the MRS and determine the ability of the MRS to meet the requirements of Public Law 105-85.

  12. The Quality of Clinical Information in Adverse Drug Reaction Reports by Patients and Healthcare Professionals: A Retrospective Comparative Analysis.

    Science.gov (United States)

    Rolfes, Leàn; van Hunsel, Florence; van der Linden, Laura; Taxis, Katja; van Puijenbroek, Eugène

    2017-07-01

    Clinical information is needed to assess the causal relationship between a drug and an adverse drug reaction (ADR) in a reliable way. Little is known about the level of relevant clinical information in ADRs reported by patients. The aim was to determine to what extent patients report relevant clinical information about an ADR compared with their healthcare professionals. A retrospective analysis was conducted of all ADR reports on the same case, i.e., cases with a report from both the patient and the patient's healthcare professional, selected from the database of the Dutch Pharmacovigilance Center Lareb. The extent to which relevant clinical information was reported was assessed by trained pharmacovigilance assessors using a structured tool. The following four domains were assessed: ADR, chronology, suspected drug, and patient characteristics. For each domain, the proportion of reported information in relation to the information deemed relevant was calculated. An average score across all relevant domains was determined and categorized as poorly (≤45%), moderately (from 46 to 74%) or well (≥75%) reported. Data were analyzed using a paired-sample t test and a Wilcoxon signed rank test. A total of 197 cases were included. In 107 cases (54.3%), patients and healthcare professionals reported a similar level of clinical information. Statistical analysis demonstrated no overall differences between the groups (p = 0.126). In a unique study of cases of ADRs reported by both patients and healthcare professionals, we found that patients report clinical information at a similar level to their healthcare professionals. For optimal pharmacovigilance, both healthcare professionals and patients should be encouraged to report.
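    The scoring procedure described (per-domain proportion of relevant information reported, averaged and bucketed) can be sketched directly; the cut-offs are the abstract's, while the item sets are invented examples:

```python
def domain_score(reported, relevant):
    """Percentage of the domain's relevant items that were reported."""
    return 100.0 * len(set(reported) & set(relevant)) / len(relevant)

def overall_grade(domain_scores):
    """Average the four domain scores and apply the reported categories:
    <=45% poorly, 46-74% moderately, >=75% well reported."""
    avg = sum(domain_scores) / len(domain_scores)
    if avg <= 45:
        return "poorly"
    if avg < 75:
        return "moderately"
    return "well"
```

    One score per domain (ADR, chronology, suspected drug, patient characteristics) feeds into the overall grade, which is then compared between the patient's and the professional's report on the same case.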

  13. Information Aggregation in Organizations

    OpenAIRE

    Schulte, Elisabeth

    2006-01-01

    This dissertation contributes to the analysis of information aggregation procedures within organizations. Facing uncertainty about the consequences of a collective decision, information has to be aggregated before making a choice. Two main questions are addressed. Firstly, how well is an organization suited for the aggregation of decision-relevant information? Secondly, how should an organization be designed in order to aggregate information efficiently? The main part deals with information a...

  14. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.
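    The delegation idea can be shown in a few lines: an object keeps its own responsibility narrow and delegates a decision it should not own to a collaborator. The class names here are invented for illustration, not taken from the book:

```python
class FlatTax:
    """Collaborator that owns the tax rule."""
    def __init__(self, rate):
        self.rate = rate

    def tax_for(self, amount):
        return amount * self.rate

class Order:
    """Owns its items; delegates tax computation instead of embedding it."""
    def __init__(self, items, tax_policy):
        self.items = items          # list of (description, price) pairs
        self.tax_policy = tax_policy

    def total(self):
        subtotal = sum(price for _, price in self.items)
        # Responsibility for tax rules stays with the policy object, so a
        # new tax regime needs a new policy class, not a changed Order.
        return subtotal + self.tax_policy.tax_for(subtotal)
```

    Swapping in a different policy object changes the behavior without touching Order, which is the reuse payoff the book attributes to disciplined responsibility assignment.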

  15. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  16. On the censored cost-effectiveness analysis using copula information

    Directory of Open Access Journals (Sweden)

    Charles Fontaine

    2017-02-01

    Background: Information and theory behind copula concepts are essential to understand the dependence relationship between several marginal covariate distributions. In therapeutic trial data, censoring occurs most of the time. That can lead to a biased interpretation of the dependence relationship between marginal distributions and, furthermore, to a biased inference of the joint probability distribution function. A particular case is cost-effectiveness analysis (CEA), which has shown its utility in many medico-economic studies and where censoring often occurs. Methods: This paper discusses a copula-based modeling of the joint density and an estimation method for costs and quality-adjusted life years (QALY) in a cost-effectiveness analysis in the presence of censoring. The method is not based on any linearity assumption on the inferred variables, but on a point estimate obtained from the marginal distributions together with their dependence link. Results: Our results show that the proposed methodology retains only the bias arising from statistical inference and no longer carries a bias based on an unverified linearity assumption. An acupuncture study for chronic headache in primary care was used to show the applicability of the method, and the obtained ICER remains within the confidence interval of the standard regression methodology. Conclusion: For the cost-effectiveness literature, such a technique without any linearity assumption is progress, since it does not require the specification of a global linear regression model. Hence, estimating the marginal distributions for each therapeutic arm, the concordance measures between these populations and the right copula families is now sufficient to carry out the whole CEA.
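    The copula construction can be sketched as follows: draw dependent uniforms from a Gaussian copula, then push them through arbitrary marginal inverse CDFs, so costs and QALYs are linked without any linearity assumption. The marginals and parameters below are illustrative, not those of the acupuncture study:

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_pair(rho, rng):
    """One (u, v) pair from a Gaussian copula with correlation rho."""
    nd = NormalDist()
    z1 = rng.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
    return nd.cdf(z1), nd.cdf(z2)

def sample_cost_qaly(n, rho, mean_cost, max_qaly, seed=1):
    """Joint (cost, QALY) draws: exponential costs, uniform QALYs, linked
    only through the copula (no linear model between the two variables)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, v = gaussian_copula_pair(rho, rng)
        cost = -mean_cost * math.log(1 - u)   # exponential inverse CDF
        qaly = max_qaly * v                   # uniform inverse CDF
        out.append((cost, qaly))
    return out
```

    With such joint draws per arm, the ICER follows from the differences in mean cost and mean QALY between arms; under censoring, the marginals themselves would come from survival-type estimators before being coupled.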

  17. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    Science.gov (United States)

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

    Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, understanding vocal expression is a potentially important biometric index of information processing, not only across but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and uninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
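    The pause-related variables can be sketched from a per-frame energy envelope: frames below an energy threshold count as silence, and contiguous silent runs count as one pause. This is a simplified stand-in for the automated analysis described above, with an invented thresholding convention:

```python
def pause_stats(energy, threshold, frame_s=0.01):
    """Pause count and total silence (seconds) from a per-frame energy
    envelope. Frames below threshold are silent; each contiguous run of
    silent frames counts as one pause."""
    pauses, silent_frames, in_pause = 0, 0, False
    for e in energy:
        if e < threshold:
            silent_frames += 1
            if not in_pause:        # a new pause starts here
                pauses += 1
                in_pause = True
        else:
            in_pause = False
    return pauses, silent_frames * frame_s
```

    Tracking such statistics across the baseline, medium and high load conditions is what lets pause behavior serve as an index of available processing resources.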

  18. "Whoever increases his knowledge merely increases his heartache." Moral tensions in heart surgery patients' and their spouses' talk about information seeking. Keywords: discourse analysis, social constructionism, heart surgery, information seeking

    Directory of Open Access Journals (Sweden)

    Tuominen Kimmo

    2004-01-01

    The paper analyses accounts of information behaviour produced by 20 heart surgery patients and their spouses. It is shown that patients and their significant others have to act in a context in which health ideologies stressing self-sufficiency and patient compliance play a strong role. Thus, the analysed accounts and narratives of information seeking reflect the moral demands that ill persons and their significant others face in contemporary society. The author uses social constructionist discourse analysis to examine how the interviewees have to relate their descriptions of information practices to existing moral presuppositions about how rational individuals should behave.

  19. Analysis of Climatic and Environmental Changes Using CLEARS Web-GIS Information-Computational System: Siberia Case Study

    Science.gov (United States)

    Titov, A. G.; Gordov, E. P.; Okladnikov, I.; Shulgina, T. M.

    2011-12-01

    Analysis of recent climatic and environmental changes in Siberia performed on the basis of the CLEARS (CLimate and Environment Analysis and Research System) information-computational system is presented. The system was developed using a specialized software framework for rapid development of thematic information-computational systems based on Web-GIS technologies. It comprises structured environmental datasets, a computational kernel, a specialized web portal implementing the web mapping application logic, and a graphical user interface. Functional capabilities of the system include a number of procedures for mathematical and statistical analysis, data processing, and visualization. At present a number of georeferenced datasets are available for processing, including two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 and ERA Interim Reanalyses, meteorological observation data for the territory of the former USSR, and others. First, using the computational kernel's approved statistical methods, it was shown that the most reliable spatio-temporal characteristics of surface temperature and precipitation in Siberia in the second half of the 20th and beginning of the 21st centuries are provided by the ERA-40/ERA Interim Reanalysis and the APHRODITE JMA Reanalysis, respectively; these Reanalyses are statistically consistent with reliable in situ meteorological observations. Analysis of surface temperature and precipitation dynamics for the territory of Siberia performed with the developed information-computational system reveals fine spatial and temporal details in the heterogeneous patterns obtained for the region earlier. The dynamics of bioclimatic indices determining climate change impact on the structure and functioning of regional vegetation cover was investigated as well. The analysis shows significant positive trends in growing season length accompanied by a statistically significant increase in the sum of growing degree days and total
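
    Two of the quantities the abstract refers to, growing degree days and a linear trend over a yearly series, can be sketched as follows. This is a minimal illustration under assumed inputs (a 5 °C base temperature and toy data), not the CLEARS implementation or its datasets.

    ```python
    # Hedged sketch: growing degree days (GDD) from daily mean temperatures,
    # and an ordinary least-squares trend slope for an evenly spaced yearly
    # series. The base temperature and any data are illustrative assumptions.

    def growing_degree_days(daily_mean_temps, base=5.0):
        """Sum of daily exceedances of the mean temperature above `base`."""
        return sum(max(t - base, 0.0) for t in daily_mean_temps)

    def ols_trend(values):
        """Least-squares slope per time step for an evenly spaced series."""
        n = len(values)
        xbar = (n - 1) / 2
        ybar = sum(values) / n
        num = sum((i - xbar) * (y - ybar) for i, y in enumerate(values))
        den = sum((i - xbar) ** 2 for i in range(n))
        return num / den
    ```

    A trend-significance test (e.g. on the slope's standard error, or a nonparametric Mann-Kendall test) would be layered on top of the slope estimate before claiming a statistically significant change.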

  20. AN ECONOMIC ANALYSIS OF THE DETERMINANTS OF ENTREPRENEURSHIP: THE CASE OF MASVINGO INFORMAL BUSINESSES

    Directory of Open Access Journals (Sweden)

    Clainos Chidoko

    2013-03-01

    Full Text Available In the past decade, Zimbabwe experienced its worst economic performance since independence in 1980. Capacity utilization shrank to ten percent and the unemployment rate exceeded eighty percent by 2008 as the private and public sectors witnessed massive retrenchments. As a result, many people found themselves engaging in informal businesses to make ends meet. However, not all people joined the informal sector, as witnessed by the number of people who left the country in droves for neighbouring countries. It is against this background that this research conducted an economic analysis of the determinants of entrepreneurship in Masvingo urban, with an emphasis on informal businesses. The research targeted a sample of 100 informal businesses (30 from the Rujeko Light industrial area, 40 from the Mucheke Light industrial area, and 30 from the Masvingo Central Business District). The businesses included, among others, flea market operators, furniture manufacturers, suppliers and producers of agricultural products, and food vendors. The research found that level of education, gender, age, marital status, number of dependants, type of subjects studied at secondary school, and vocational training are the main determinants influencing the type of business an entrepreneur ventures into. The study recommends formal training for the participants, so that these businesses continue to exist, since they fill the gap left by most formal enterprises.