WorldWideScience

Sample records for analysis geographic information

  1. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is ...

  2. A comparative analysis of information acquisition, information ...

    African Journals Online (AJOL)

    A comparative analysis of information acquisition, information management capacity and administrators' decision-making effectiveness in tertiary institutions in ... communication facilities as well as modernise their methods of storage and processing of information by computerising their management information systems.

  3. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  4. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and skill set needed to conduct a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academic professionals.

  5. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...
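
The non-transitive flow graph described above can be illustrated with a toy sketch (assumed, heavily simplified assignment syntax; not the authors' actual analysis, and `flow_graph` is a hypothetical name): every identifier on the right-hand side of a signal assignment contributes one direct edge into the assigned target, and no transitive closure is taken.

```python
import re

# VHDL operators that should not be treated as signals/variables
_KEYWORDS = {"and", "or", "xor", "not", "nand", "nor", "xnor"}

def flow_graph(assignments):
    """Edges (source -> target) of a direct information-flow graph built
    from simple VHDL-style signal assignments of the form 'target <= expr'.
    Toy sketch: every identifier in expr flows directly into target."""
    edges = set()
    for stmt in assignments:
        target, expr = stmt.split("<=")
        for ident in re.findall(r"[A-Za-z_]\w*", expr):
            if ident.lower() not in _KEYWORDS:
                edges.add((ident, target.strip()))
    return edges
```

Note that for `["d <= not c", "c <= a and b"]` the result contains no edge `("a", "d")`: only direct flows are recorded, which is precisely what makes the resulting graph non-transitive.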

  6. Deprival value: information utility analysis

    Directory of Open Access Journals (Sweden)

    Marco Antonio Pereira

    Full Text Available ABSTRACT This article contributes to the perception that users’ learning processes play a key role in applying an accounting concept, and that this requires a presentation suited to the concept's informative potential, free of prior accounting fixations. Deprival value is a useful measure for managerial and corporate purposes and may be applied within the current Conceptual Framework of the International Accounting Standards Board (IASB). This study analyzes its utility, taking cognitive aspects into account. Also known as value to the business, deprival value is a measurement system that followed a path along which it was misunderstood, confused with other measures, met resistance to implementation, and fell into disuse: everything a standardized measurement method tries to avoid. In contrast, deprival value has found support in academia and in specific applications, such as those related to public service regulation. The accounting area has been affected by the growing sophistication of measurement methods, which increasingly require the ability to analyze accounting facts on an economic basis, at the risk of losing their information content. This development becomes possible only when the potential of a measurement system is known and feasible to achieve. This study consists of a theoretical essay, based on a literature review, discussing the concept's origin, presentation, and application. Considering the concept’s cognitive difficulties, deprival value and its heteronym, value to the business, were analyzed in order to explain some of these changes. The concept’s utility was also explored through a cross-analysis with impairment, and the scheme developed was applied to actual economic situations faced by a company listed on a stock exchange.

  7. Maximum auto-mutual-information factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2017-01-01

    Based on the information theoretical measure mutual information, derived from entropy and Kullback-Leibler divergence, an alternative to maximum autocorrelation factor analysis is sketched.

  8. Marketing Information: A Competitive Analysis

    OpenAIRE

    Miklos Sarvary; Philip M. Parker

    1997-01-01

    Selling information that is later used in decision making constitutes an increasingly important business in modern economies (Jensen [Jensen, Fred O. 1991. Information services. Congram, Friedman, eds., Chapter 22. AMA-COM, New York, 423–443.]). Information is sold in a large variety of forms: industry reports, consulting services, database access, and/or professional opinions given by medical, engineering, accounting/financial, and legal professionals, among others. This paper is the fir...

  9. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    ...combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...
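
As a rough, brute-force illustration of the idea (a sketch under simplifying assumptions, not the authors' algorithm; `mutual_info` and `cia_first_pair` are hypothetical names), one can grid-search unit projection directions for two bivariate variable sets and keep the pair that maximizes a histogram estimate of MI:

```python
import numpy as np

def mutual_info(x, y, bins=16):
    """Histogram estimate of mutual information (nats) of two 1-D samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def cia_first_pair(X, Y, n_angles=36):
    """Grid-search one unit projection direction per bivariate set so that
    the MI between the two projections is maximized."""
    best = (-np.inf, None, None)
    for ta in np.linspace(0, np.pi, n_angles, endpoint=False):
        a = np.array([np.cos(ta), np.sin(ta)])
        u = X @ a
        for tb in np.linspace(0, np.pi, n_angles, endpoint=False):
            b = np.array([np.cos(tb), np.sin(tb)])
            mi = mutual_info(u, Y @ b)
            if mi > best[0]:
                best = (mi, a, b)
    return best
```

Real implementations use better MI estimators and proper optimization; the grid search above is only tractable for two variables per set.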

  10. Information needs analysis principles and practice in information organizations

    CERN Document Server

    Dorner, Daniel G; Calvert, Philip J

    2010-01-01

    If you want to provide an information service that truly fulfils your users' needs, this book is essential reading. The book supports practitioners in developing an information needs analysis strategy and offers the necessary professional skills and techniques to do so.

  11. Textual Analysis of Intangible Information

    NARCIS (Netherlands)

    A.J. Moniz (Andy)

    2016-01-01

    Traditionally, equity investors have relied upon the information reported in firms’ financial accounts to make their investment decisions. Due to the conservative nature of accounting standards, firms cannot value their intangible assets such as corporate culture, brand value and ...

  12. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed, using an ES-1022 computer and problem-oriented software, ''The description information search system''. The main aspects and sources of forming the system's information fund and the characteristics of the system's information retrieval language are reported, and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search.

  13. Cancer Patients' Informational Needs: Qualitative Content Analysis.

    Science.gov (United States)

    Heidari, Haydeh; Mardani-Hamooleh, Marjan

    2016-12-01

    Understanding the informational needs of cancer patients is a requirement to plan any educative care program for them. The aim of this study was to identify Iranian cancer patients' perceptions of informational needs. The study took a qualitative approach. Semi-structured interviews were held with 25 cancer patients in two teaching hospitals in Iran. Transcripts of the interviews underwent conventional content analysis, and categories were extracted. The results came under two main categories: disease-related informational needs and information needs related to daily life. Disease-related informational needs had two subcategories: obtaining information about the nature of disease and obtaining information about disease prognosis. Information needs related to daily life also had two subcategories: obtaining information about healthy lifestyle and obtaining information about regular activities of daily life. The findings provide deep understanding of cancer patients' informational needs in Iran.

  14. Information Acquisition, Analysis and Integration

    Science.gov (United States)

    2016-08-03

    in Harmonic Analysis 4, 2015. 7. G. Sapiro, “Mathematical Image Processing,” in The Princeton Companion to Applied Mathematics, N. J. Higham, Ed...and G. Sapiro, “Computer vision tools for low-cost and non-invasive measurement of autism-related behaviors in infants,” Autism Research and...measuring autism risk behaviors in young children: A technical validity and feasibility study,” MobiHealth 2015, London, October 2015. 80. A. Newson, M

  15. Informational analysis involving application of complex information system

    Science.gov (United States)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. It has been applied in internal audit involving the integration of the accounting field into the information systems field. Technological advancements can improve the work performed by internal audit. We thus aim to find, in complex information systems, priorities for the internal audit work of a high-importance private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative variables with the matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the organization's objectives.

  16. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    ...networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies...

  17. Drainage information analysis and mapping system.

    Science.gov (United States)

    2012-10-01

    The primary objective of this research is to develop a Drainage Information Analysis and Mapping System (DIAMS), with online inspection data submission, which will comply with the necessary requirements mandated by both the Governmental Accounting...

  18. Mathematical Analysis of Evolution, Information, and Complexity

    CERN Document Server

    Arendt, Wolfgang

    2009-01-01

    Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems, including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.

  19. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed un-supervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well known clustering and association techniques. The results show...

  20. Benefit analysis of proposed information systems

    OpenAIRE

    Besore, Mark H.

    1991-01-01

    Approved for public release; distribution is unlimited This thesis reviewed two different approaches to benefit analysis, benefit comparison and user satisfaction, that could be applied to the evaluation of proposed information systems which are under consideration for acquisition by the federal government. Currently the General Services Administration only recommends that present value analysis methods be used in the analysis of alternatives even though the GSA specifies...

  1. Risk Analysis of Accounting Information System Infrastructure

    OpenAIRE

    MIHALACHE, Arsenie-Samoil

    2011-01-01

    National economy and security are fully dependent on information technology and infrastructure. At the core of the information infrastructure society relies on, we have the Internet, a system designed initially as a scientists’ forum for unclassified research. The use of communication networks and systems may lead to hazardous situations that generate undesirable effects such as communication systems breakdown, loss of data or taking the wrong decisions. The paper studies the risk analysis of...

  2. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    Full Text Available We generalize the point information gain (PIG) and derived quantities, i.e., the point information gain entropy (PIE) and the point information gain entropy density (PIED), for the case of the Rényi entropy, and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for real data with the examples of several images and discuss further possible utilization in other fields of data processing.
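
The core quantity can be sketched for 1-D data (a simplified sketch, not the authors' implementation; sign conventions for PIG vary in the literature, and this version uses "entropy without the point minus entropy with it"):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy (bits) of a discrete distribution; Shannon at alpha=1."""
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-(p * np.log2(p)).sum())
    return float(np.log2((p ** alpha).sum()) / (1.0 - alpha))

def point_information_gain(data, index, bins=8, alpha=2.0):
    """PIG of one sample: Renyi entropy of the histogram computed without
    that sample minus the entropy computed with it."""
    counts, edges = np.histogram(data, bins=bins)
    p_with = counts / counts.sum()
    reduced = counts.astype(float).copy()
    # locate the bin holding the examined point and remove it
    j = np.searchsorted(edges, data[index], side="right") - 1
    j = min(max(j, 0), bins - 1)
    reduced[j] -= 1
    p_without = reduced / reduced.sum()
    return renyi_entropy(p_without, alpha) - renyi_entropy(p_with, alpha)
```

Points in rare bins (e.g. outliers) and points in crowded bins get distinctly different PIG values, which is what makes the per-point spectrum informative.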

  3. Information theory applications for biological sequence analysis.

    Science.gov (United States)

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
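
The block-entropy estimation mentioned in the review can be sketched as the Shannon entropy of the overlapping k-mer frequency distribution (a minimal sketch; `block_entropy` is a hypothetical name):

```python
from collections import Counter
from math import log2

def block_entropy(seq, k):
    """Shannon entropy (bits) of the empirical distribution of
    overlapping k-mers (blocks of length k) in a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())
```

For k=1 this is the ordinary symbol entropy (2 bits for a uniform DNA alphabet); plotting it against k is one of the alignment-free genome signatures the review discusses.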

  4. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Full Text Available Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than those of betweenness when a large amount of noisy data is added to an interactome. Finally, we ...
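
A minimal sketch of the resistor-network analogy (assumptions: a symmetric confidence-weighted adjacency matrix whose entries act as conductances, and unit current injected at one source node and removed at one sink; the paper aggregates over many source/sink pairs, which is omitted here):

```python
import numpy as np

def current_flow(weights, source, sink):
    """Node scores for the electrical-circuit analogy: edge confidences are
    conductances; solve the Laplacian system for node voltages under unit
    current from `source` to `sink`, then sum current through each node."""
    W = np.asarray(weights, float)
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian
    b = np.zeros(n)
    b[source], b[sink] = 1.0, -1.0           # unit current in/out
    v = np.linalg.pinv(L) @ b                # node voltages (pseudoinverse)
    # current through a node = half the sum of absolute edge currents at it
    I = np.abs(W * (v[:, None] - v[None, :]))
    scores = I.sum(axis=1) / 2.0
    scores[source] = scores[sink] = 1.0      # the full unit current passes the endpoints
    return scores
```

On a path graph 0-1-2 with unit conductances, all current from 0 to 2 passes through node 1, so its score is 1; a dead-end neighbor hanging off the path carries no current and scores 0.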

  5. A dictionary based informational genome analysis

    Directory of Open Access Journals (Sweden)

    Castellini Alberto

    2012-09-01

    Full Text Available Abstract Background In the post-genomic era, several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. Validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of vivid interest in biology (for example, to detect excessively represented functional sequences, such as promoters), is discussed, and a method to define synthetic genetic networks is suggested. Conclusions We introduce a methodology based on dictionaries and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported to other contexts of computational genomics, as a basis for the discrimination of genomic pathologies.
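
The dictionary idea can be sketched in a few lines (a toy illustration, not the authors' prototype; `genomic_dictionary` and `overrepresented` are hypothetical names): collect every k-mer with its empirical frequency, then flag words that exceed the uniform i.i.d. expectation by some factor, a crude stand-in for the repeat analysis described above.

```python
from collections import Counter

def genomic_dictionary(genome, k):
    """The set of distinct k-mers of a genome with empirical frequencies."""
    total = len(genome) - k + 1
    counts = Counter(genome[i:i + k] for i in range(total))
    return {w: c / total for w, c in counts.items()}

def overrepresented(genome, k, ratio=2.0):
    """k-mers whose frequency exceeds `ratio` times the expectation
    under uniform i.i.d. nucleotides (1 / 4**k)."""
    freqs = genomic_dictionary(genome, k)
    expected = 1.0 / 4 ** k
    return sorted(w for w, f in freqs.items() if f > ratio * expected)
```

Real repeat analysis uses Markov-model backgrounds rather than the uniform expectation, but the dictionary structure is the same.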

  6. Information Superiority via Formal Concept Analysis

    Science.gov (United States)

    Koester, Bjoern; Schmidt, Stefan E.

    This chapter will show how to get more mileage out of information. To achieve that, we first start with an introduction to the fundamentals of Formal Concept Analysis (FCA). FCA is a highly versatile field of applied lattice theory, which allows hidden relationships to be uncovered in relational data. Moreover, FCA provides a distinguished supporting framework to subsequently find and fill information gaps in a systematic and rigorous way. In addition, we would like to build bridges via a universal approach to other communities which can be related to FCA in order for other research areas to benefit from a theory that has been elaborated for more than twenty years. Last but not least, the essential benefits of FCA will be presented algorithmically as well as theoretically by investigating a real data set from the MIPT Terrorism Knowledge Base and also by demonstrating an application in the field of Web Information Retrieval and Web Intelligence.
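
The FCA machinery the chapter builds on can be sketched compactly: given a binary context mapping objects to attribute sets, a formal concept is an (extent, intent) pair closed under the derivation operators. The brute-force enumeration below is a sketch for tiny illustrative contexts only; real FCA software uses algorithms such as NextClosure.

```python
from itertools import combinations

def formal_concepts(context):
    """All (extent, intent) pairs of a formal context given as
    {object: set_of_attributes}. Brute force over attribute subsets."""
    objects = list(context)
    attributes = sorted(set().union(*context.values()))
    concepts = set()
    for r in range(len(attributes) + 1):
        for attrs in combinations(attributes, r):
            attrs = set(attrs)
            # extent: objects having all chosen attributes
            extent = frozenset(o for o in objects if attrs <= context[o])
            # closed intent: attributes shared by every object in the extent
            intent = frozenset(set.intersection(*(context[o] for o in extent))
                               if extent else attributes)
            concepts.add((extent, intent))
    return concepts
```

Ordering the resulting concepts by extent inclusion yields the concept lattice in which FCA uncovers the hidden relationships mentioned above.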

  7. Nondestructive Testing Information Analysis Center, 1982.

    Science.gov (United States)

    1983-03-01

  8. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often unfamiliar to most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
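
One elementary building block of such reliability models is the textbook two-state Markov availability result (not the article's Attack-obstacle model): with exponential attacks at rate λ and exponential recovery at rate μ, a system starting in the up state has instantaneous availability A(t) = μ/(λ+μ) + (λ/(λ+μ))·e^(−(λ+μ)t).

```python
from math import exp

def availability(t, lam, mu):
    """Instantaneous availability A(t) of a two-state up/down Markov model
    starting 'up': failures ~ Exp(lam), repairs ~ Exp(mu). As t grows,
    A(t) decays from 1 to the steady-state value mu / (lam + mu)."""
    s = lam + mu
    return mu / s + (lam / s) * exp(-s * t)
```

The steady-state term μ/(λ+μ) is the fraction of time the system is operational, which is the kind of quantity the recovery-performance analysis above estimates under more realistic queuing assumptions.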

  9. 78 FR 38096 - Fatality Analysis Reporting System Information Collection

    Science.gov (United States)

    2013-06-25

    ... Number NHTSA-2012-0168] Fatality Analysis Reporting System Information Collection AGENCY: National... comments on the following proposed collections of information: (1) Title: Fatal Analysis Reporting System... system that acquires national fatality information directly from existing State files and documents...

  10. Formal Concept Analysis for Information Retrieval

    OpenAIRE

    Qadi, Abderrahim El; Aboutajedine, Driss; Ennouary, Yassine

    2010-01-01

    In this paper we describe a mechanism to improve Information Retrieval (IR) on the web. The method is based on Formal Concept Analysis (FCA), which makes semantic relations explicit during queries and allows the answers provided by a search engine to be reorganized into a lattice of concepts. For IR we propose an incremental algorithm based on the Galois lattice. This algorithm allows a formal clustering of the data sources, and the results it returns are classified by ...

  11. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we have considered the Analytical Hierarchy Process (AHP), which is well suited for the hierarchical data structures (e.g., such that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we have decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
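
Of the candidates listed, TOPSIS is the most compact to sketch (a generic implementation under the standard vector normalization; an assumption-laden illustration, not the AiG project's code):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """TOPSIS closeness scores. `matrix` is alternatives x criteria,
    `weights` the criterion weights, `benefit` a boolean mask marking
    benefit (True, higher is better) vs cost (False) criteria."""
    M = np.asarray(matrix, float)
    w = np.asarray(weights, float) / np.sum(weights)
    V = M / np.linalg.norm(M, axis=0) * w           # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)       # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)        # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                  # closeness; higher is better
```

An alternative dominating all others scores 1, the worst scores 0, and the rest fall in between, which is what makes the closeness coefficient usable for ranking ontologically described resources.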

  12. ANALYSIS DEFINITIONS OF BASIC RESEARCH INFORMATION CULTURE OF FUTURE navigators

    Directory of Open Access Journals (Sweden)

    Mikhailo Sherman

    2016-01-01

    Full Text Available Based on an analysis of data sources and professionally oriented products, using cultural and activity approaches, the basic research definitions of "information activities", "information behavior", "information need", and "world news" are clarified. It was found that the information needs of the individual in fact reflect the information needs of human activities, aimed at eliminating the mismatch between the current and the normal state of the subject's information sphere in the information society. It was established that information needs have a pronounced social and informational nature and are divided into sensory, social, and professional ones; within the structure of information needs, communicative, cognitive, mnemonic, aesthetic, regulatory, and professional components were established, and these are the driving forces of human information activity. The definition "informative activity of future navigators" is theoretically grounded and introduced into scientific circulation; by it we understand the totality of actions directed at satisfying their information needs in the social, cognitive, and practical spheres of professional and personal activity, the aim of which is to provide steady professional development and personal adaptation of future navigators in the conditions of the global information society. Its structure is formed by information-legal, research-information, information-communicative, information-administrative, information-documentary, and information-operator constituents. The analysis of the content of the terms "information literacy", "information form", "information competence", and "information culture" is carried out, and their hierarchy is lined up as a sequence of stages in forming the information culture of future navigators.

  13. ANALYSIS DEFINITIONS OF BASIC RESEARCH INFORMATION CULTURE OF FUTURE navigators

    OpenAIRE

    Mikhailo Sherman; Oleg Bezbah

    2016-01-01

    Based on an analysis of data sources and professionally oriented products, using cultural and activity approaches, the basic research definitions of "information activities", "information behavior", "information need", and "world news" are clarified. It was found that the information needs of the individual in fact reflect the information needs of human activities, aimed at eliminating the mismatch between the current and the normal state of the subject's information sphere in the information society. It...

  14. Spatial information analysis of chemotactic trajectories.

    Science.gov (United States)

    Hoh, Jan H; Heinz, William F; Werbin, Jeffrey L

    2012-03-01

    During bacterial chemotaxis, a cell acquires information about its environment by sampling changes in the local concentration of a chemoattractant, and then uses that information to bias its motion relative to the source of the chemoattractant. The trajectory of a chemotaxing bacterium is thus a spatial manifestation of the information gathered by the cell. Here we show that a recently developed approach for computing spatial information using Fourier coefficient probabilities, the k-space information (kSI), can be used to quantify the information in such trajectories. The kSI is shown to capture expected responses to gradients of a chemoattractant. We then extend the k-space approach by developing an experimental probability distribution (EPD) that is computed from chemotactic trajectories collected under a reference condition. The EPD accounts for connectivity and other constraints that the nature of the trajectories imposes on the k-space computation. The EPD is used to compute the spatial information from any trajectory of interest, relative to the reference condition. The EPD-based spatial information also captures the expected responses to gradients of a chemoattractant, although the results differ in significant ways from the original kSI computation. In addition, the entropy calculated from the EPD provides a useful measure of trajectory space. The methods developed are highly general, and can be applied to a wide range of other trajectory types as well as non-trajectory data.

  15. Comprehensive analysis of information dissemination in disasters

    Science.gov (United States)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. The number of deaths caused by large-scale disasters and accidents in the past 10 years is around 900,000. More than 92.8 percent of these deaths could have been avoided if an effective pre-warning system had been deployed. Knowledge of the information dissemination characteristics of different information media, taking into consideration governmental assistance (information published by a government) during disasters in urban areas, plays a critical role in increasing the time available to respond and in reducing the number of deaths and economic losses. In this paper we develop a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. The model can also be used to disseminate information to evacuees making real-time evacuation plans. We analyzed the information dissemination models for pre-warning in disasters by considering 14 media: short message service (SMS), phone, television, radio, news portals, Wechat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via the visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  16. Function analysis for waste information systems

    Energy Technology Data Exchange (ETDEWEB)

    Sexton, J.L.; Neal, C.T.; Heath, T.C.; Starling, C.D.

    1996-04-01

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study.

  17. Function analysis for waste information systems

    International Nuclear Information System (INIS)

    Sexton, J.L.; Neal, C.T.; Heath, T.C.; Starling, C.D.

    1996-04-01

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study.

  18. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    Directory of Open Access Journals (Sweden)

    Rustu Yilmaz

    2016-05-01

    Full Text Available Universities are leading institutions: they educate the people who both produce information and use it effectively to develop new products and services, and who are needed in every field. Universities are therefore expected to be institutions where information and information management are used efficiently. In the present study, topics such as infrastructure, operations, applications, information, policy, and human-based information security at universities were examined within the scope of the information security standards that every university is now expected to meet, and a comparative analysis was then conducted specific to Turkey. The Microsoft Security Assessment Tool, developed by Microsoft, was used as the risk analysis tool. The analysis enables universities to compare their information systems with those of other universities within the scope of information security awareness, and to make suggestions in this regard.

  19. Information seeking and reciprocity: A transformational analysis

    NARCIS (Netherlands)

    Gallucci, M.; Perugini, M.

    2003-01-01

    The motivation to reciprocate is analyzed within the framework of interdependence theory, with focus on the process of transformation of situations. A model of transformation is presented for the motivation to reciprocate and hypotheses regarding allocation behavior and information seeking are

  20. Value of Information Analysis in Structural Safety

    DEFF Research Database (Denmark)

    Konakli, Katerina; Faber, Michael Havbro

    2014-01-01

    of structural systems. In this context, experiments may refer to inspections or techniques of structural health monitoring. The Value of Information concept provides a powerful tool for determining whether the experimental cost is justified by the expected benefit and for identifying the optimal among different...

  1. Sentiment Analysis Challenges of Informal Arabic Language

    OpenAIRE

    Salihah AlOtaibi; Muhammad Badruddin Khan

    2017-01-01

    Recently, large numbers of users have turned to social networks such as Twitter, Facebook, and MySpace to share various kinds of resources and to express their opinions, thoughts, and messages in real time, increasing the amount of electronic content generated by users. Sentiment analysis has become a very interesting topic in the research community, and Arabic sentiment analysis deserves more attention. This paper discusses the challenges and obstacles encountered when analyzing the sentiment analys...

  2. Analysis and visualization of chromosome information.

    Science.gov (United States)

    Machado, J A Tenreiro; Costa, António C; Quelhas, Maria Dulce

    2012-01-01

    This paper analyzes the DNA code of several species from the perspective of information content. For that purpose several concepts and mathematical tools are selected towards establishing a quantitative method without a priori distorting the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes. Copyright © 2011 Elsevier B.V. All rights reserved.
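    The Gray-code and histogram steps can be sketched as follows; the particular 2-bit base assignment and the word length are illustrative assumptions, since the abstract does not specify them.

```python
from collections import Counter

# Hypothetical 2-bit Gray-code assignment for the four DNA bases; adjacent
# codes differ in a single bit (00, 01, 11, 10).
GRAY = {"A": 0b00, "C": 0b01, "G": 0b11, "T": 0b10}

def gray_histogram(sequence, word_len=3):
    """Histogram of Gray-coded words formed from `word_len` consecutive
    bases; such histograms can then characterize a chromosome as a feature
    vector, e.g. for multidimensional scaling."""
    counts = Counter()
    for i in range(len(sequence) - word_len + 1):
        word = 0
        for base in sequence[i:i + word_len]:
            word = (word << 2) | GRAY[base]
        counts[word] += 1
    return counts
```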

  3. The informational subject in the contemporary context. An analysis from the epistemology of informational community identity

    Directory of Open Access Journals (Sweden)

    Miguel Angel Rendón-Rojas

    2012-04-01

    Full Text Available The Epistemology of Informational Community Identity (ECI-I) is proposed as a toolbox for the analysis of informational reality, with categories such as contextual paradigm, informational subject, and informational entity built expressly for this theoretical-methodological analysis. The concepts of information user and informational subject are distinguished: the latter, situated in a concrete social enclave within a particular community and its interrelationships with others, undergoes a process of self-construction from which specific information needs arise, while the user seeks concrete answers after formally questioning the many facts occurring in a consumerist, unequal, and alienating world. The emphasis is thus put on the need for an interdisciplinary approach bridging social theory and library science in the study of the documentary information world of particular informational subjects, who are often marginalized and excluded

  4. Information granules in image histogram analysis.

    Science.gov (United States)

    Wieclawek, Wojciech

    2017-05-10

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this concept in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like histogram equalization, the method is based on image histogram analysis; unlike histogram equalization, however, it works on a selected range of pixel intensities and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
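    As a sketch of the general idea of histogram processing restricted to a selected intensity range (a plain windowed equalization, not the paper's weighted-granule method), the two control parameters here are the window bounds `lo` and `hi`:

```python
def equalize_range(image, lo, hi):
    """Histogram equalization restricted to pixel intensities in [lo, hi];
    intensities outside the window are left unchanged. `image` is a list of
    rows of integer gray levels. Hypothetical sketch for illustration."""
    flat = [p for row in image for p in row if lo <= p <= hi]
    if not flat:
        return [row[:] for row in image]
    # cumulative histogram over the selected window only
    hist = [0] * (hi - lo + 1)
    for p in flat:
        hist[p - lo] += 1
    cdf, run = [], 0
    for c in hist:
        run += c
        cdf.append(run)
    n = len(flat)
    def remap(p):
        if p < lo or p > hi:
            return p                      # outside the window: untouched
        return lo + round((hi - lo) * cdf[p - lo] / n)
    return [[remap(p) for p in row] for row in image]
```

    Pixels at the top of the window map back to `hi`, so the selected range is stretched in place while the rest of the dynamic range is preserved.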

  5. Full-Information Item Factor Analysis.

    Science.gov (United States)

    Bock, R. Darrell; And Others

    1988-01-01

    A method of item factor analysis is described, which is based on Thurstone's multiple-factor model and implemented by marginal maximum likelihood estimation and the EM algorithm. Also assessed are the statistical significance of successive factors added to the model, provisions for guessing and omitted items, and Bayes constraints. (TJH)

  6. Hydrogen Technical Analysis -- Dissemination of Information

    Energy Technology Data Exchange (ETDEWEB)

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and on various dissemination activities to present the benefits of hydrogen energy to a broad audience, while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination are critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction, and will help to overcome perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--brochures that target certain segments of the population, to be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--with industries identified to have a stake in hydrogen energy; the three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational forums--the final audience is students, the future engineers, technicians, and energy consumers; SENTECH will expand on its previous educational work in this area. The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  7. Information as signs: A semiotic analysis of the information concept, determining its ontological and epistemological commitments

    DEFF Research Database (Denmark)

    Thellefsen, Martin Muderspach; Thellefsen, Torkild Leo; Sørensen, Bent

    2018-01-01

    , or conversely, information is understood only relative to subjective/discursive intentions, agendas, etc. To overcome the limitations of defining information as either objective or subjective/discursive, a semiotic analysis shows that information understood as signs is consistently sensitive to both objective...

  8. Sensor Network Information Analytical Methods: Analysis of Similarities and Differences

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2014-04-01

    Full Text Available In the Sensor Network information engineering literature, few references focus on the definition and design of Sensor Network information analytical methods. Among those that do are Munson, et al. and the ISO standards on functional size analysis. To avoid inconsistent vocabulary and potentially incorrect interpretation of data, Sensor Network information analytical methods must be better designed, including definitions, analysis principles, analysis rules, and base units. This paper analyzes the similarities and differences across three different views of analytical methods, and uses a process proposed for the design of Sensor Network information analytical methods to analyze two examples of such methods selected from the literature.

  9. Army Information Operations Officer Needs Analysis Report

    Science.gov (United States)

    2016-03-01

    educated. There were also some other points of divergence. Open source media, or what some might consider "media monitoring," was considered a... making of adversaries and potential adversaries while protecting our own (JP 3-13). The IO officer serves as the integration specialist for IO. The purpose of the needs analysis was to determine the training, education, and/or other changes to Doctrine, Organization, Training, Materiel, Leadership

  10. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  11. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI) technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP) tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  12. Shuttle Topography Data Inform Solar Power Analysis

    Science.gov (United States)

    2013-01-01

    The next time you flip on a light switch, there's a chance that you could be benefitting from data originally acquired during the Space Shuttle Program. An effort spearheaded by the Jet Propulsion Laboratory (JPL) and the National Geospatial-Intelligence Agency (NGA) in 2000 put together the first near-global elevation map of the Earth ever assembled, which has found use in everything from 3D terrain maps to models that inform solar power production. For the project, called the Shuttle Radar Topography Mission (SRTM), engineers at JPL designed a 60-meter mast that was fitted onto Shuttle Endeavour. Once deployed in space, an antenna attached to the end of the mast worked in combination with another antenna on the shuttle to simultaneously collect data from two perspectives. Just as having two eyes makes depth perception possible, the SRTM data sets could be combined to form an accurate picture of the Earth's surface elevations--the first high-detail, near-global elevation map ever assembled. What made SRTM unique was not just its surface mapping capabilities but the completeness of the data it acquired. Over the course of 11 days, the shuttle orbited the Earth nearly 180 times, covering everything between 60 degrees north and 54 degrees south latitude, or roughly 80 percent of the world's total landmass. Of that targeted land area, 95 percent was mapped at least twice, and 24 percent was mapped at least four times. Following several years of processing, NASA released the data to the public in partnership with NGA. Robert Crippen, a member of the SRTM science team, says that the data have proven useful in a variety of fields. "Satellites have produced vast amounts of remote sensing data, which over the years have been mostly two-dimensional. But the Earth's surface is three-dimensional. Detailed topographic data give us the means to visualize and analyze remote sensing data in their natural three-dimensional structure, facilitating a greater understanding of the features

  13. Propositional Analysis: A Tool for Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce

    1989-01-01

    Reviews the use of propositional analysis in library and information science research. Evidence that different analysts produce similar judgments about texts and use the method consistently over time is presented, and it is concluded that propositional analysis is a reliable and valid research method. An example of an analysis is appended. (32 references)

  14. Ground subsidence information as a valuable layer in GIS analysis

    Directory of Open Access Journals (Sweden)

    Murdzek Radosław

    2018-01-01

    Full Text Available Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to integrate spatial data resources, analyse and process them, and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. These applications utilize many kinds of spatial information, but rarely information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB mining area into a GIS analysis. Monitoring of this phenomenon can be carried out using the differential radar interferometry (DInSAR) method.

  15. Evaluation of health information systems research in information systems research: A meta-analysis.

    Science.gov (United States)

    Haried, Peter; Claybaugh, Craig; Dai, Hua

    2017-04-01

    Given the importance of the health-care industry and the promise of health information systems, researchers are encouraged to build on the shoulders of giants as the saying goes. The health information systems field has a unique opportunity to learn from and extend the work that has already been done by the highly correlated information systems field. As a result, this research article presents a past, present and future meta-analysis of health information systems research in information systems journals over the 2000-2015 time period. Our analysis reviewed 126 articles on a variety of topics related to health information systems research published in the "Senior Scholars" list of the top eight ranked information systems academic journals. Across the selected information systems academic journals, our findings compare research methodologies applied, health information systems topic areas investigated and research trends. Interesting results emerge in the range and evolution of health information systems research and opportunities for health information systems researchers and practitioners to consider moving forward.

  16. Analysis of reliability parameters for complicated information measurement systems

    OpenAIRE

    Sydor, Andriy

    2012-01-01

    A method of analysis of reliability parameters for complicated systems by means of generating functions is developed, taking into account the aging of the systems' output elements. The main reliability parameters of complicated information measurement systems are examined in this paper.

  17. Water Information Management & Analysis System (WIMAS) v 4.0

    Data.gov (United States)

    Kansas Data Access and Support Center — The Water Information Management and Analysis System (WIMAS) is an ArcView based GIS application that allows users to query Kansas water right data maintained by the...

  18. Analysis of multi-interpretable ecological monitoring information

    NARCIS (Netherlands)

    Brazier, F.M.T.; Engelfriet, J.; Treur, J.

    1998-01-01

    In this paper logical techniques developed to formalise the analysis of multi-interpretable information, in particular belief set operators and selection operators, are applied to an ecological domain. A knowledge-based decision support system is

  19. Analysis of safeguards information treatment system at the facility level

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byung Doo; Song, Dae Yong; Kwack, Eun Ho

    2000-12-01

    Safeguards Information Treatment System(SITS) at the facility level is required to implement efficiently the obligations under the Korea-IAEA Safeguards Agreement, bilateral agreements with other countries and domestic law. In this report, the analysis of information, which the SITS treats, and operation environment of SITS including the review of the relationship between safeguards information are described. SITS will be developed to cover the different accounting procedures and methods applied at the various facilities under IAEA safeguards.

  20. Analysis of safeguards information treatment system at the facility level

    International Nuclear Information System (INIS)

    Lee, Byung Doo; Song, Dae Yong; Kwack, Eun Ho

    2000-12-01

    Safeguards Information Treatment System(SITS) at the facility level is required to implement efficiently the obligations under the Korea-IAEA Safeguards Agreement, bilateral agreements with other countries and domestic law. In this report, the analysis of information, which the SITS treats, and operation environment of SITS including the review of the relationship between safeguards information are described. SITS will be developed to cover the different accounting procedures and methods applied at the various facilities under IAEA safeguards.

  1. A Content Analysis of Online HPV Immunization Information

    Science.gov (United States)

    Pappa, Sara T.

    2016-01-01

    The Human Papillomavirus (HPV) can cause some types of cancer and is the most common sexually transmitted infection in the US. Because most people turn to the internet for health information, this study analyzed HPV information found online. A content analysis was conducted on 69 web search results (URLs) from Google, Yahoo, Bing and Ask. The…

  2. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers for convenience, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" analysis and puts forward some solutions.

  3. The informed society: an analysis of the public's information-seeking behavior regarding coastal flood risks.

    Science.gov (United States)

    Kellens, Wim; Zaalberg, Ruud; De Maeyer, Philippe

    2012-08-01

    Recent flood risk management puts an increasing emphasis on the public's risk perception and its preferences. It is now widely recognized that a better knowledge of the public's awareness and concern about risks is of vital importance to outline effective risk communication strategies. Models such as Risk Information Seeking and Processing address this evolution by considering the public's needs and its information-seeking behavior with regard to risk information. This study builds upon earlier information-seeking models and focuses on the empirical relationships between information-seeking behavior and the constructs of risk perception, perceived hazard knowledge, response efficacy, and information need in the context of coastal flood risks. Specific focus is given to the mediating role of information need in the model and to the differences in information-seeking behavior between permanent and temporary residents. By means of a structured on-line questionnaire, a cross-sectional survey was carried out in the city of Ostend, one of the most vulnerable places to coastal flooding on the Belgian coast. Three hundred thirteen respondents participated in the survey. Path analysis reveals that information need does not act as a mediator in contrast to risk perception and perceived knowledge. In addition, it is shown that risk perception and perceived hazard knowledge are higher for permanent than temporary residents, leading to increased information-seeking behavior among the former group. Implications for risk communication are discussed. © 2012 Society for Risk Analysis.

  4. Online Information About Harmful Tobacco Constituents: A Content Analysis.

    Science.gov (United States)

    Margolis, Katherine A; Bernat, Jennifer K; Keely O'Brien, Erin; Delahanty, Janine C

    2017-10-01

    Tobacco products and smoke contain more than 7000 chemicals (ie, constituents). Research shows that consumers have a poor understanding of tobacco constituents and find communication about them confusing. The current content analysis describes how information about tobacco constituents is communicated online in terms of source, target audience, and message. A search was conducted in September 2015 using tobacco constituent and tobacco terms and identified 226 relevant Web sites for coding. Web sites were coded for type, target audience, reading level, constituent information, type of tobacco product, health effects, and emotional valence by two coders who independently coded half of the sample. There was a 20% overlap to assess interrater reliability, which was high (κ = .83). Cancer was the most frequently mentioned health effect (51.3%). Nearly a quarter (23%) of the Web sites did not explicitly state that tobacco constituents or tobacco products are associated with health effects. Large gaps exist in online information about tobacco constituents, including incomplete information about tobacco constituent-related health effects and limited information about tobacco products other than cigarettes and smokeless tobacco. This study highlights opportunities to improve the content and presentation of information related to tobacco constituents. The US Food and Drug Administration (FDA) is required to publicly display a list of tobacco constituents in tobacco products and tobacco smoke by brand. However, little is known about tobacco constituent information available to the public. This is the first systematic content analysis of online information about tobacco constituents. The analysis reveals that although information about tobacco constituents is available online, large information gaps exist, including incomplete information about tobacco constituent-related health effects.

  5. Content Analysis in Library and Information Science Research.

    Science.gov (United States)

    Allen, Bryce; Reser, David

    1990-01-01

    Describes ways in which content analysis is being used in library and information science research. Methodological concerns are addressed, including selection of target documents, selection of samples to be analyzed, selection of categories for analysis, and elimination of researcher bias to assure reliability. (35 references) (LRW)

  6. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    Full Text Available The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability have been determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing attack realization models for the exploitation of particular vulnerabilities. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated. The proposed approach makes it possible to obtain a quantitative estimate of information system security on the basis of the proposed schemes for realizing typical attacks on the distinguished classes of vulnerabilities. The methodological approach is used for choosing among variants for implementing protection mechanisms in information systems, as well as for estimating information security in operating information systems.

  7. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China’s agriculture sector has been transformed from the traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs. Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China’s agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China’s experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  8. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    Full Text Available We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend
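    The contest structure described above can be sketched with a standard ratio-form contest success function; the functional form, parameter values, and grid search below are illustrative assumptions rather than the paper's full model, which also includes the information-sharing terms.

```python
def contest_share(effort_i, total_effort):
    """Ratio-form contest success function: share of the asset won by a
    participant exerting effort_i out of total_effort (assumed form)."""
    return 0.0 if total_effort == 0 else effort_i / total_effort

def hacker_payoff(asset, t_i, t_j, T, unit_cost):
    """Expected payoff of hacker i attacking with effort t_i against the
    other hacker's effort t_j and the firm's defense T."""
    share = contest_share(t_i, t_i + t_j + T)
    return asset * share - unit_cost * t_i

def best_response(asset, t_j, T, unit_cost, grid=1000, t_max=10.0):
    """Grid-search best response of hacker i to opponent effort t_j."""
    return max((k * t_max / grid for k in range(grid + 1)),
               key=lambda t: hacker_payoff(asset, t, t_j, T, unit_cost))
```

    Iterating `best_response` for the two hackers from arbitrary starting efforts approximates a symmetric equilibrium of this toy contest, which is how comparative statics like those in the abstract (e.g. the effect of the unit cost of attack) can be explored numerically.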

  9. Quantifying neurotransmission reliability through metrics-based information analysis.

    Science.gov (United States)

    Brasselet, Romain; Johansson, Roland S; Arleo, Angelo

    2011-04-01

    We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision.
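
The Victor-Purpura metric this record relies on is a small dynamic program over two spike trains: shifting a spike costs q per unit time, while inserting or deleting a spike costs 1. A minimal sketch of the standard formulation (the function name and plain-list interface are illustrative, not the authors' code):

```python
def victor_purpura(a, b, q):
    """Victor-Purpura distance between spike trains a and b (sorted spike times).

    q is the temporal-precision parameter: moving a spike by dt costs q * dt,
    and inserting or deleting a spike costs 1.
    """
    n, m = len(a), len(b)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = float(i)            # delete all remaining spikes of a
    for j in range(1, m + 1):
        D[0][j] = float(j)            # insert all remaining spikes of b
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            shift = q * abs(a[i - 1] - b[j - 1])
            D[i][j] = min(D[i - 1][j] + 1,          # delete a spike
                          D[i][j - 1] + 1,          # insert a spike
                          D[i - 1][j - 1] + shift)  # shift a spike
    return D[n][m]
```

With small q the metric behaves like a rate code (spike counts dominate); with large q exact spike timing dominates, which is the parameter sweep the authors optimize for information transmission.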

  10. Sentiment Analysis Using Common-Sense and Context Information

    OpenAIRE

    Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal

    2015-01-01

    Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific conc...

  11. Analysis and design of nuclear energy information systems

    International Nuclear Information System (INIS)

    Yohanes Dwi Anggoro; Sriyana; Arief Tris Yuliyanto; Wiku Lulus Widodo

    2015-01-01

    Management of research reports and activities of the Center for Nuclear Energy System Assessment (PKSEN), whether in the form of documents or the results of other activities, is an important part of the series of activities for achieving PKSEN's mission. Good document management will facilitate the provision of improved inputs or the maximum use of results. But over the past few years, there have still been some problems in the management of research reports and activities performed by PKSEN. The purpose of this study is to analyze and design the flow and layout of the Nuclear Energy Information System (SIEN) to facilitate its implementation. In addition to being used as a management system for PKSEN research and activities, it can also be used as an information medium for the community. The Nuclear Energy Information System package is expected to be a ''one gate system'' for PKSEN information. The research methodology used is: (i) analysis of organizational systems; (ii) analysis and design of information systems; (iii) analysis and design of software systems; (iv) analysis and design of database systems. The results of this study are: the resources throughout the PKSEN organization were identified; the application of SIEN was analyzed using SWOT analysis; several types of required devices were identified; the hierarchy of SIEN was compiled; a centralized database system was chosen, with MySQL selected as the DBMS. The result is a basic design of the Nuclear Energy Information System, which will be used as a research and activities management system for PKSEN and can also be used as a medium of information for the community. (author)

  12. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
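
The dynamic-tracking idea can be pictured as values carrying taint labels that propagate through operations and are checked at sinks. A toy Python sketch, not the paper's C instrumentation (class, label, and policy names are illustrative):

```python
class Tainted:
    """A value bundled with the set of confidentiality labels that reached it."""

    def __init__(self, value, labels=frozenset()):
        self.value = value
        self.labels = frozenset(labels)

    def __add__(self, other):
        ov = other.value if isinstance(other, Tainted) else other
        ol = other.labels if isinstance(other, Tainted) else frozenset()
        # Information flows from both operands into the result.
        return Tainted(self.value + ov, self.labels | ol)


def check_sink(val, policy=frozenset({"confidential"})):
    """Raise an alarm if labeled data reaches a public sink."""
    if isinstance(val, Tainted) and val.labels & policy:
        raise ValueError("information flow policy violated")
```

An array of such wrappers gives per-element (precise) tracking, as the paper does for arrays of pointers; keeping a single label set for a whole array is the cheaper summarization it applies to non-pointer arrays.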

  13. An Information Foraging Analysis of Note Taking and Note Sharing While Browsing Campaign Information

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Robertson, Scott

    2010-01-01

    In this paper, we present an experimental study of political information foraging in the context of e-voting. Participants were observed while searching and browsing the internet for campaign information in a mock-voting situation in three online note-taking conditions: No Notes, Private Notes, and Shared Notes. Interaction analysis of the study data consisted of applying Information Foraging Theory to understand participants' searching and browsing behaviors. Empirical results show skewed time allocation to activities, a tradeoff between enrichment vs. exploitation of search results and issues...

  14. Information-Pooling Bias in Collaborative Security Incident Correlation Analysis.

    Science.gov (United States)

    Rajivan, Prashanth; Cooke, Nancy J

    2018-03-01

    Incident correlation is a vital step in the cybersecurity threat detection process. This article presents research on the effect of group-level information-pooling bias on collaborative incident correlation analysis in a synthetic task environment. Past research has shown that uneven information distribution biases people to share information that is known to most team members and prevents them from sharing any unique information available with them. The effect of such biases on security team collaborations are largely unknown. Thirty 3-person teams performed two threat detection missions involving information sharing and correlating security incidents. Incidents were predistributed to each person in the team based on the hidden profile paradigm. Participant teams, randomly assigned to three experimental groups, used different collaboration aids during Mission 2. Communication analysis revealed that participant teams were 3 times more likely to discuss security incidents commonly known to the majority. Unaided team collaboration was inefficient in finding associations between security incidents uniquely available to each member of the team. Visualizations that augment perceptual processing and recognition memory were found to mitigate the bias. The data suggest that (a) security analyst teams, when conducting collaborative correlation analysis, could be inefficient in pooling unique information from their peers; (b) employing off-the-shelf collaboration tools in cybersecurity defense environments is inadequate; and (c) collaborative security visualization tools developed considering the human cognitive limitations of security analysts is necessary. Potential applications of this research include development of team training procedures and collaboration tool development for security analysts.

  15. Performance of the Carbon Dioxide Information Analysis Center (CDIAC)

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Univ. of Tennessee, Knoxville, TN (United States). Environment, Energy, and Resources Center; Jones, S.B. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    The Carbon Dioxide Information Analysis Center (CDIAC) provides information and data resources in support of the US Department of Energy's Global Change Research Program. CDIAC also serves as a resource of global change information for a broader international community of researchers, policymakers, managers, educators, and students. The number of requests for CDIAC's data products, information services, and publications has grown over the years and represents multidisciplinary interests in the physical, life, and social sciences and from diverse work settings in government, business, and academia. CDIAC's staff addresses thousands of requests yearly for data and information resources. In response to these requests, CDIAC has distributed tens of thousands of data products, technical reports, newsletters, and other information resources worldwide since 1982. This paper describes CDIAC, examines CDIAC's user community, and describes CDIAC's response to requests for information. The CDIAC Information System, which serves as a comprehensive PC-based inventory and information management tracking system, is also described.

  16. CISAPS: Complex Informational Spectrum for the Analysis of Protein Sequences

    Directory of Open Access Journals (Sweden)

    Charalambos Chrysostomou

    2015-01-01

    Full Text Available Complex informational spectrum analysis for protein sequences (CISAPS) and its web-based server are developed and presented. As recent studies show, use of the absolute spectrum alone in the informational spectrum analysis of protein sequences has proven insufficient. Therefore, CISAPS is developed to consider and provide results in three forms: the absolute, real, and imaginary spectrum. Biologically relevant features in the analysis of influenza A subtypes, presented here as a case study, can also appear individually in either the real or imaginary spectrum. As the results show, protein classes can present similarities or differences according to the features extracted from the CISAPS web server. These associations are likely related to the protein property that the specific amino acid index represents. In addition, various technical issues, such as zero-padding and windowing, that may affect the analysis are also addressed. CISAPS uses an expanded list of 611 unique amino acid indices, where each one represents a different property, to perform the analysis. This web-based server enables researchers with little knowledge of signal processing methods to apply and include complex informational spectrum analysis in their work.
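
The informational spectrum idea can be sketched in a few lines: map each residue to a numeric index value and take its discrete Fourier transform, from which the absolute, real, and imaginary spectra are read off. The index values below are placeholders, not a real amino-acid scale (CISAPS draws on 611 published indices), and the naive DFT stands in for an FFT:

```python
import cmath

# Placeholder residue-to-number mapping (illustrative values only).
INDEX = {"A": 0.37, "C": 0.83, "G": 0.05, "L": 0.0, "K": 0.41, "S": 0.89}

def informational_spectrum(seq, index=INDEX):
    """Numerically encode a sequence, apply a naive DFT, and return the
    absolute, real, and imaginary spectra as three lists."""
    x = [index[ch] for ch in seq]
    n = len(x)
    dft = [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
           for k in range(n)]
    return ([abs(c) for c in dft],
            [c.real for c in dft],
            [c.imag for c in dft])
```

Zero-padding and windowing, the technical issues the abstract mentions, would be applied to `x` before the transform.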

  17. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect hundreds of customers' details on a daily basis. The information is filled in manually by the customers. Hence, it is laborious and time consuming to use a human operator to transfer this customer information into computers manually. Additionally, it is expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications from a speed and accuracy point of view, such as keyword spotting, sorting of postal addresses, script matching, and writer identification. This research deals with different strategies to extract customer information from these scanned forms, its interpretation, and classification. Accordingly, extracted information is segmented into characters for classification and finally stored in the form of records in databases for further processing. This paper presents a detailed discussion of these semantic-based analysis strategies for forms processing. Finally, new directions are also recommended for future research.

  18. Performance Analysis of Information Services in a Grid Environment

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-10-01

    Full Text Available The Information Service is a fundamental component in a grid environment. It has to meet a lot of requirements such as access to static and dynamic information related to grid resources, efficient and secure access to dynamic data, decentralized maintenance, fault tolerance etc., in order to achieve better performance, scalability, security and extensibility. Currently there are two different major approaches. One is based on a directory infrastructure and another one on a novel approach that exploits a relational DBMS. In this paper we present a performance comparison analysis between the Grid Resource Information Service (GRIS) and the Local Dynamic Grid Catalog relational information service (LDGC), providing also information about two projects (iGrid and Grid Relational Catalog) in the grid data management area.

  19. HEALTH INSURANCE INFORMATION-SEEKING BEHAVIORS AMONG INTERNET USERS: AN EXPLORATORY ANALYSIS TO INFORM POLICIES.

    Science.gov (United States)

    Erlyana, Erlyana; Acosta-Deprez, Veronica; O'Lawrence, Henry; Sinay, Tony; Ramirez, Jeremy; Jacot, Emmanuel C; Shim, Kyuyoung

    2015-01-01

    The purpose of this study was to explore the characteristics of Internet users who seek health insurance information online, as well as factors affecting their information-seeking behaviors. Secondary data analysis was conducted using data from the 2012 Pew Internet Health Tracking Survey. Of 2,305 adult Internet users, only 29% sought health insurance information online. Bivariate analyses were conducted to test differences in characteristics between those who seek health insurance information online and those who do not. A logistic regression model was used to determine significant predictors of online health insurance information-seeking behavior. Findings suggested that factors such as being a single parent, having a high school education or less, and being uninsured were significant, and those individuals were less likely to seek health insurance information online. Family caregivers of adults and those who bought private health insurance or were entitled to Medicare were more likely to seek health insurance information online than non-caregivers and the uninsured. The findings suggest that providing quality health insurance information online is critical for both the insured and the uninsured population.

  20. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

    The report is a part of a research assignment carried out by students in the 5 ECTS course “Project Byggeri – [entitled as: Building Information Modeling (BIM) – Modeling & Analysis]”, during the 3rd semester of the master degree in Civil and Architectural Engineering, Department of Engineering, Aarhus University. This includes seven papers describing BIM for Sustainability, concentrating specifically on individual topics regarding Indoor Environmental Performance Analysis.

  1. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to analyzing static document collections because they normally require a large amount of computational resources and a long time to get accurate results. It is very difficult to cluster a dynamically changing text information stream on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes of document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.
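
The flocking idea can be illustrated with a single update step: each document, placed in a 2-D space, drifts toward the centroid of documents whose term vectors are similar to its own, so similar documents gradually form spatial clusters. This is a toy sketch of the decentralized behavior, not the ORNL implementation; the similarity threshold and step size are illustrative parameters:

```python
def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def flock_step(positions, vectors, sim_threshold=0.5, step=0.1):
    """One flocking iteration: each document-agent moves toward the centroid
    of agents whose content vectors are similar to its own."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        neighbors = [positions[j] for j in range(len(positions))
                     if j != i and cosine(vectors[i], vectors[j]) >= sim_threshold]
        if neighbors:
            cx = sum(p[0] for p in neighbors) / len(neighbors)
            cy = sum(p[1] for p in neighbors) / len(neighbors)
            new_positions.append((x + step * (cx - x), y + step * (cy - y)))
        else:
            new_positions.append((x, y))   # no similar documents: stay put
    return new_positions
```

Because each agent only consults its neighbors, the loop body can be distributed across machines, which is the load-balancing point the paper develops.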

  2. On the Application of Information in Time Series Analysis

    Czech Academy of Sciences Publication Activity Database

    Klán, Petr; Wilkie, J.; Ankenbrand, T.

    1998-01-01

    Roč. 8, č. 1 (1998), s. 39-49 ISSN 1210-0552 Grant - others:Fonds National Suisse de la Recherche Scientifique (XE) CP93:9630 Keywords : time series analysis * measurement and application of information Subject RIV: BA - General Mathematics

  3. Gender analysis of sexual and reproductive health information ...

    African Journals Online (AJOL)

    Gender analysis of sexual and reproductive health information access and use: a study of university student communities in Tanzania. ... Finally, the paper makes detailed recommendations on SRH service provision; gender mainstreaming in SRH service provision; family planning; IEC and BCC; marketing and promoting ...

  4. Analysis of multi-interpretable ecological monitoring information

    NARCIS (Netherlands)

    Brazier, F.; Engelfriet, J.; Treur, J.

    In this paper logical techniques developed to formalize the analysis of multi-interpretable information, in particular belief set operators and selection operators, are applied to an ecological domain. A knowledge-based decision support system is described that determines the abiotic (chemical and

  5. Analysis of multi-interpretable ecological monitoring information

    NARCIS (Netherlands)

    Brazier, F.M.; Engelfriet, J.; Treur, J.; Hunter, A.

    1998-01-01

    In this paper logical techniques developed to formalize the analysis of multi-interpretable information, in particular belief set operators and selection operators, are applied to an ecological domain. A knowledge-based decision support system is described that determines the abiotic (chemical and

  6. Semiotic user interface analysis of building information model systems

    NARCIS (Netherlands)

    Hartmann, Timo

    2013-01-01

    To promote understanding of how to use building information (BI) systems to support communication, this paper uses computer semiotic theory to study how user interfaces of BI systems operate as a carrier of meaning. More specifically, the paper introduces a semiotic framework for the analysis of BI

  7. A Gaussian Approximation Approach for Value of Information Analysis.

    Science.gov (United States)

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
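
The GA approach itself requires a metamodel, but the quantity all VOI computations build on is easy to show from a PSA sample of net benefits: the expected value of perfect information (EVPI) is the gap between picking the best strategy per draw and committing to one strategy for all draws. A minimal sketch (EVSI adds the preposterior step on top of this; variable names are illustrative):

```python
def evpi(psa_net_benefit):
    """Expected value of perfect information from PSA samples.

    psa_net_benefit: list of rows, one per PSA draw; each row holds the
    net benefit of every candidate strategy under that draw.
    """
    n = len(psa_net_benefit)
    # With perfect information: choose the best strategy for each draw.
    with_info = sum(max(row) for row in psa_net_benefit) / n
    # With current information: commit to one strategy across all draws.
    n_strategies = len(psa_net_benefit[0])
    current = max(sum(row[d] for row in psa_net_benefit) / n
                  for d in range(n_strategies))
    return with_info - current
```

When one strategy dominates in every draw, EVPI is zero: collecting more information cannot change the decision.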

  8. Scientific Information Analysis of Chemistry Dissertations Using Thesaurus of Chemistry

    Directory of Open Access Journals (Sweden)

    Taghi Rajabi

    2017-09-01

    Full Text Available Concept maps of chemistry can be obtained from a thesaurus of chemistry. Analysis of information in the field of chemistry is done at the graduate level, based on comparing and analyzing chemistry dissertations using these maps. Therefore, the use of a thesaurus for analyzing scientific information is recommended. A major advantage of this method is that it is possible to obtain a detailed map of all academic research across all branches of science. The results of such research analysis in chemical science can play a key role in developing strategic research policies, educational programming, linking universities to industries, and postgraduate educational programming. This paper will first introduce the concept maps of chemistry. Then, emerging patterns from the concept maps of chemistry will be used to analyze the trend in academic dissertations in chemistry, using the data collected and stored in our database at the Iranian Research Institute for Information Science and Technology (IranDoc) over the past 10 years (1998-2009).

  9. ANALYSIS OF INFORMATION FACTORS FOR DESIGNING INTELLECTUAL MECHATRONIC SYSTEM

    Directory of Open Access Journals (Sweden)

    A. V. Gulai

    2016-01-01

    Full Text Available The paper proposes to evaluate the achievement of the main results in the operation of intellectual mechatronic systems with digital control by the obtained information effect. In this respect, common information requirements for intellectual components are considered as a basic information factor which influences the process of mechatronic system design. Therefore, some parameters have been singled out that provide a rather complete description of the processes used for obtaining and using systematic information within the intellectual mechatronic system. The degree of conformity between the control vector parameters synthesized by the system and the identification results of its current states has been selected as an information criterion of control efficiency. A set of expected probability values for the location of each parameter of a control object and a mechatronic system within the required tolerances has been used for the formation of possible states. The paper shows that when a complex information description of the system is used, it is expedient to use an expert assessment of the selection probability for allowable control vectors which ensure a system transfer to favorable states. This approach has made it possible to pinpoint the main information and technical specifications of the intellectual mechatronic system: structural construction (informational and technical compatibility and information matching of its components); control object (uncertainty of its state and information vector, information capacity of the mechatronic system); control actions (their hierarchy and entropic balance of the control process, managerial resource of the mechatronic system); functioning result (informational effect and control efficiency criterion, probabilistic selection of system states). In accordance with the analysis performed, it is possible to note the most effective directions for practical use of the proposed informational approach for creation of the

  10. Regional Analysis of Remote Sensing Based Evapotranspiration Information

    Science.gov (United States)

    Geli, H. M. E.; Hain, C.; Anderson, M. C.; Senay, G. B.

    2017-12-01

    Recent research findings on modeling actual evapotranspiration (ET) using remote sensing data and methods have proven the ability of these methods to address a wide range of hydrological and water resources issues, including river basin water balance for improved water resources management, drought monitoring, drought impact and socioeconomic responses, agricultural water management, optimization of land use for water conservation, and water allocation agreements, among others. However, there is still a critical need to identify the appropriate type of ET information that can address each of these issues. The current trend of increasing demand for water due to population growth, coupled with a variable and limited water supply due to drought, especially in arid and semiarid regions, has highlighted the need for such information. To properly address these issues, different spatial and temporal resolutions of ET information will need to be used. For example, agricultural water management applications require ET information at field (30-m) and daily time scales, while for river basin hydrologic analysis relatively coarser spatial and temporal scales can be adequate for such regional applications. The objective of this analysis is to evaluate the potential of using integrated ET information that can address some of these issues collectively. This analysis will highlight efforts to address some of the issues that are applicable to New Mexico, including assessment of the statewide water budget as well as drought impact and socioeconomic responses, which all require ET information but at different spatial and temporal scales. This analysis will provide an evaluation of four remote sensing based ET models: ALEXI, DisALEXI, SSEBop, and SEBAL3.0. The models will be compared with ground-based observations from eddy covariance towers and water balance calculations. Remote sensing data from Landsat, MODIS, and VIIRS sensors will be used to provide ET

  11. Environmental Quality Information Analysis Center multi-year plan

    International Nuclear Information System (INIS)

    Rivera, R.G.; Das, S.; Walsh, T.E.

    1992-09-01

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network

  12. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
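
The attribute reduction described here can be made concrete with a toy example: compute one summary statistic of the image behind the information barrier and release only a yes/no comparison against an expected value, so the image itself is never stored or disclosed. The specific attribute, names, and tolerance below are illustrative, not one of the three algorithms in the paper:

```python
def verify_attribute(image, threshold, expected_fraction, tolerance):
    """Reduce an image to a single non-sensitive attribute and return only
    a boolean verdict; the image never leaves the information barrier.

    image: 2-D list of pixel intensities (e.g., a radiograph).
    The attribute here is the fraction of pixels above `threshold`.
    """
    pixels = [p for row in image for p in row]
    fraction = sum(p > threshold for p in pixels) / len(pixels)
    # Only this yes/no result is released outside the barrier.
    return abs(fraction - expected_fraction) <= tolerance
```

The design tension the paper studies is exactly here: the attribute must be informative enough to reject a spoofed object, yet coarse enough that it reveals no sensitive geometry.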

  13. Sentiment analysis using common-sense and context information.

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal

    2015-01-01

    Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific concepts which in turn produced the domain specific important features. Further, the polarities of the extracted concepts are determined using the contextual polarity lexicon which we developed by considering the context information of a word. Finally, semantic orientations of domain specific features of the review document are aggregated based on the importance of a feature with respect to the domain. The importance of the feature is determined by the depth of the feature in the ontology. Experimental results show the effectiveness of the proposed methods.
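
The depth-weighted aggregation step can be sketched directly: each extracted concept's contextual polarity is weighted by its depth in the domain ontology (deeper concepts are more domain-specific, hence more important) and the weighted scores are summed. All names, scores, and depths below are illustrative, not the authors' lexicon or ontology:

```python
def aggregate_sentiment(features, polarity, depth):
    """Classify a review from ontology-depth-weighted feature polarities.

    features: concepts extracted from the review
    polarity: feature -> contextual polarity score in [-1, 1]
    depth:    feature -> depth of the concept in the domain ontology
    """
    score = sum(polarity[f] * depth[f] for f in features)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A deep negative concept can thus outweigh a shallow positive one, which is the effect of tying feature importance to ontology depth.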

  14. Sentiment Analysis Using Common-Sense and Context Information

    Directory of Open Access Journals (Sweden)

    Basant Agarwal

    2015-01-01

    Full Text Available Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific concepts which in turn produced the domain specific important features. Further, the polarities of the extracted concepts are determined using the contextual polarity lexicon which we developed by considering the context information of a word. Finally, semantic orientations of domain specific features of the review document are aggregated based on the importance of a feature with respect to the domain. The importance of the feature is determined by the depth of the feature in the ontology. Experimental results show the effectiveness of the proposed methods.

  15. Ethics, effectiveness and population health information interventions: a Canadian analysis.

    Science.gov (United States)

    Greyson, Devon; Knight, Rod; Shoveller, Jean A

    2018-02-19

    Population health information interventions (PHIIs) use information in efforts to promote health. PHIIs may push information to a target audience (communication), pull information from the public (surveillance), or combine both in a bidirectional intervention. Although PHIIs have often been framed as non-invasive and ethically innocuous, in reality they may intrude into people's lives, affecting not only their health but also their sense of security, respect, and self-determination. The ethical acceptability of PHIIs may affect intervention effectiveness, potentially giving rise to unintended consequences. This article examines push, pull, and bidirectional PHIIs using empirical data from an ethnographic study of young mothers in Greater Vancouver, Canada. Data were collected from October 2013 to December 2014 via naturalistic observation and individual interviews with 37 young mothers aged 16-22. Transcribed interviews and field notes were analyzed using inductive qualitative thematic analysis. Both push and pull interventions were experienced as non-neutral by the target population, and implementation factors at structural and individual scales affected intervention ethics and effectiveness. Based on our findings, we suggest that careful ethical consideration be applied to the use of PHIIs as health promotion tools. Advancing the ethics of PHIIs will benefit from empirical data informed by information and computer science theory and methods. Information technologies, digital health promotion services, and integrated surveillance programs are important areas for investigation in terms of their effects and ethics. Health promotion researchers, practitioners, and ethicists should explore these across contexts and populations.

  16. Analysis of Emergency Information Management Research Hotspots Based on Bibliometric and Co-occurrence Analysis

    Directory of Open Access Journals (Sweden)

    Zou Qingyun

    2017-04-01

    Full Text Available [Purpose/significance] Emergency information management is an interdisciplinary field spanning emergency management and information management. Summarizing the major research output helps strengthen the effective use of information resources in emergency management research, and provides references for the follow-up development and practical exploration of emergency information management research. [Method/process] By retrieving the relevant literature from CNKI, this paper used bibliometric and co-word clustering analysis methods to analyze the domestic emergency management research output. [Result/conclusion] Domestic emergency information management research focuses mainly on five hot topics: disaster emergency information management, crisis information disclosure, emergency information management systems, emergency response, and smart emergency management. Future theoretical research in China should strengthen the information base for emergency management and build a theoretical framework for emergency information management.
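
    The co-word clustering the abstract mentions starts from a keyword co-occurrence count; a minimal sketch, with invented keyword lists standing in for the CNKI records:

```python
# Count how often keyword pairs co-occur in article keyword lists,
# the raw input to co-word clustering. The keyword lists are invented.
from collections import Counter
from itertools import combinations

def coword_counts(keyword_lists):
    counts = Counter()
    for kws in keyword_lists:
        # sort so each unordered pair gets one canonical key
        for a, b in combinations(sorted(set(kws)), 2):
            counts[(a, b)] += 1
    return counts

papers = [
    ["emergency management", "information disclosure"],
    ["emergency management", "information disclosure", "surveillance"],
    ["surveillance", "emergency management"],
]
pairs = coword_counts(papers)
```

    Clustering then operates on this pair-count matrix (or a similarity derived from it) to surface the hot-topic groups.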

  17. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new, free computational program for chemometric analysis named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis) are presented. IMMAN is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for computing a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. In addition, a generalization scheme for the previously defined differential Shannon entropy is discussed, and the Jeffreys information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities such as missing-value processing, dataset partitioning, and browsing, and provides single-parameter or ensemble (multi-criteria) ranking options. Consequently, the software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
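
    One of the supervised parameters named above, information gain, reduces to a short entropy computation; a sketch on invented discretized data (not IMMAN's Java implementation):

```python
# Information gain of a discretized feature with respect to a class label:
# IG(F) = H(class) - H(class | F). Data below are invented.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# a perfectly informative binary feature: gain equals the class entropy (1 bit)
ig = information_gain([0, 0, 1, 1], ["neg", "neg", "pos", "pos"])
```

    A feature independent of the class, by contrast, yields a gain of zero.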

  18. Online nutrition information for pregnant women: a content analysis.

    Science.gov (United States)

    Storr, Tayla; Maher, Judith; Swanepoel, Elizabeth

    2017-04-01

    Pregnant women actively seek health information online, including nutrition and food-related topics. However, the accuracy and readability of this information have not been evaluated. The aim of this study was to describe and evaluate pregnancy-related food and nutrition information available online. Four search engines were used to search for pregnancy-related nutrition web pages, and a content analysis of the pages was performed. Web pages were assessed against the 2013 Australian Dietary Guidelines to determine accuracy. The Flesch-Kincaid (F-K), Simple Measure of Gobbledygook (SMOG), Gunning Fog Index (FOG) and Flesch reading ease (FRE) formulas were used to assess readability. Data were analysed descriptively. Spearman's correlation was used to assess relationships between web page characteristics, and the Kruskal-Wallis test was used to check for differences among readability and other web page characteristics. A total of 693 web pages were included. Web page types included commercial (n = 340), not-for-profit (n = 113), blogs (n = 112), government (n = 89), personal (n = 36) and educational (n = 3). The accuracy of online nutrition information varied, with 39.7% of web pages containing accurate information, 22.8% containing mixed information and 37.5% containing inaccurate information. The average reading grade of all pages analysed, as measured by F-K, SMOG and FOG, was 11.8, and the mean FRE was 51.6, a 'fairly difficult to read' score. Only 0.5% of web pages were written at or below grade 6 according to F-K, SMOG and FOG. The findings suggest that the accuracy of pregnancy-related nutrition information on the internet is a problem, and that web pages are generally difficult to read, meaning the information may not be accessible to those who cannot read at a sophisticated level. © 2016 John Wiley & Sons Ltd.
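
    Two of the readability scores used in the study follow the standard published formulas and can be computed directly from word, sentence, and syllable counts; the syllable counter below is a rough heuristic, not the validated tooling the authors used:

```python
# Flesch reading ease and Flesch-Kincaid grade, standard formulas.
import re

def count_syllables(word):
    # crude vowel-group heuristic, for illustration only
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(words, sentences, syllables):
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# a passage of 100 words, 5 sentences, 150 syllables:
fre = flesch_reading_ease(100, 5, 150)   # higher = easier to read
fkg = flesch_kincaid_grade(100, 5, 150)  # approximate US school grade
```

    On this hypothetical passage the FRE is about 59.6 ('fairly difficult') and the grade level about 9.9, illustrating how the study's scores of 51.6 and 11.8 are interpreted.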

  19. Implementation of an information security management system under ISO 27001: information risk analysis

    Directory of Open Access Journals (Sweden)

    José Gregorio Arévalo Ascanio

    2015-11-01

    Full Text Available This article explores the business structure of the city of Ocaña with the aim of expanding information and knowledge about the main variables of the municipality's productive activity: its entrepreneurial spirit, technological development and productive structure. To this end, descriptive research was performed to identify economic activity in its various forms and to promote the implementation of administrative practices consistent with national and international references. The results made it possible to identify business weaknesses, including weaknesses in information management, which once identified can be used to design training programmes, skills development and management practices consistent with the challenges of competitiveness and staying in the market. The results also yielded information on the technological component of the companies that make up the city's productive fabric, for which the application of information systems analysis tools based on ISO 27001:2005 is proposed, using the most appropriate technologies, so that organizations can protect their most important asset: information.

  20. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review of the availability and quality of health information data sources in countries, drawing on experience, observations, the literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. In almost all countries of the Region there is heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources, and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  1. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper analyses which kinds of information currently found on labels are of interest to and actually used by consumers, and which additional kinds could improve consumer choices. We investigate consumers' attitudes towards innovative strategies for the diffusion of product information, such as smart labels read with mobile phones. The empirical analysis was organised as focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamin, energy and fat content. Consumers show a high interest in the origin of products, GMOs, environmental impact, animal welfare and type of breeding.

  2. Intelligent Data Analysis in the EMERCOM Information System

    Science.gov (United States)

    Elena, Sharafutdinova; Tatiana, Avdeenko; Bakaev, Maxim

    2017-01-01

    The paper describes an information system development project for the Russian Ministry of Emergency Situations (MES, whose international operations body is known as EMERCOM), which was attended by the representatives of both the IT industry and the academia. Besides the general description of the system, we put forward OLAP and Data Mining-based approaches towards the intelligent analysis of the data accumulated in the database. In particular, some operational OLAP reports and an example of multi-dimensional information space based on OLAP Data Warehouse are presented. Finally, we outline Data Mining application to support decision-making regarding security inspections planning and results consideration.
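
    The OLAP roll-up the report describes can be illustrated with a toy aggregation along chosen dimensions; the incident records and dimension names are invented, and the real system uses an OLAP Data Warehouse rather than Python dictionaries:

```python
# Aggregate incident counts along a chosen subset of dimensions
# (a "roll-up" in OLAP terms). All data below are invented.
from collections import defaultdict

def roll_up(records, dims):
    cube = defaultdict(int)
    for rec in records:
        key = tuple(rec[d] for d in dims)
        cube[key] += rec["count"]
    return dict(cube)

incidents = [
    {"region": "Siberia", "year": 2016, "count": 3},
    {"region": "Siberia", "year": 2016, "count": 2},
    {"region": "Ural", "year": 2016, "count": 4},
]
by_region = roll_up(incidents, ["region"])
```

    Dropping or adding dimension names in `dims` moves up or down the aggregation hierarchy, which is what an OLAP report does interactively.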

  3. Activation Analysis. Proceedings of an Informal Study Group Meeting

    International Nuclear Information System (INIS)

    1971-01-01

    As part of its programme to promote the exchange of information relating to nuclear science and technology, the International Atomic Energy Agency convened in Bangkok, Thailand, from 6-8 July 1970, an informal meeting to discuss the topic of Activation Analysis. The meeting was attended by participants drawn from the following countries: Australia, Burma, Ceylon, Republic of China, India, Indonesia, France, Japan, Republic of Korea, New Zealand, Philippines, Singapore, Thailand, United States of America and Vietnam. The proceedings consist of the contributions presented at the meeting with minor editorial changes.

  4. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of masses of information and to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end-user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an essential open-source developer toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  5. [Analysis of information sources in the journal, Atencion Primaria].

    Science.gov (United States)

    Aleixandre, R; Giménez Sánchez, J V; Terrada Ferrandis, M L; López Piñero, J M

    1996-03-31

    To contribute to a better understanding of the information bases of Spanish scientific production in the primary care field, and to aid international contacts and the diffusion of information among doctors in our country. A bibliometric analysis of the bibliographic references of the studies published in the journal Atención Primaria during 1991. A database, managed with dBASE IV, was created at the Unit of Documental Analysis and Bibliometry of the Institute of Documental and Historical Studies in Science (University of Valencia-CSIC). 2,615 bibliographic references in 205 studies contained in volume 8 of Atención Primaria were analysed. References to journal articles (68%) and books (26%) predominated. There was a large number of references to Spanish publications (45%) as against North American (27%) and British (15%) ones, and few references to publications from other European countries or Latin America. Information went quickly out of date (4-year semiperiod; Price index 50%), and there was a high proportion of self-citations of the journal. The major role of the journal Atención Primaria in communicating information in the primary care field was underlined. The small amount of information from Latin America and EU countries (except Great Britain) highlights the isolation of Spain from these countries, which can be explained by the limits of the primary care field itself. There was also little use of information that is harder for primary care doctors to access, such as doctoral theses, congress papers or reports. Moreover, this is a field with a high percentage of recent literature and rapidly outdated information.

  6. PACC information management code for common cause failures analysis

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Garcia Gay, J.; Mira McWilliams, J.

    1987-01-01

    The purpose of this paper is to present the PACC code which, through adequate data management, eases the task of computerized common-cause failure analysis. PACC processes and generates information in order to carry out the corresponding qualitative analysis, by means of the Boolean technique of transformation of variables, and the quantitative analysis, using either one of several parametric methods or a direct database. As far as the qualitative analysis is concerned, the code creates several functional forms for the transformation equations according to the user's choice. These equations are subsequently processed by Boolean manipulation codes, such as SETS. The quantitative calculations of the code can be carried out in two different ways: either starting from a common-cause database, or through parametric methods such as the Binomial Failure Rate Method, the Basic Parameter Method or the Multiple Greek Letter Method, among others. (orig.)
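
    One of the parametric methods named above, the Multiple Greek Letter method, has a compact closed form; a sketch for a three-component common-cause group using the standard MGL parameterisation (the numeric values are illustrative, not from the paper):

```python
# Multiple Greek Letter (MGL) method for a common-cause group of three
# components: beta is the fraction of a component's failures that involve
# at least one other component, gamma the fraction of those that involve
# all three.
def mgl_three(qt, beta, gamma):
    """Return (Q1, Q2, Q3): rates of failure events involving exactly
    1, 2, or 3 components, given total failure rate qt per component."""
    q1 = (1 - beta) * qt
    q2 = 0.5 * beta * (1 - gamma) * qt
    q3 = beta * gamma * qt
    return q1, q2, q3

q1, q2, q3 = mgl_three(qt=1e-3, beta=0.1, gamma=0.3)
# consistency check: Q1 + 2*Q2 + Q3 recovers qt for each component
```

    The identity Q1 + 2·Q2 + Q3 = qt is the usual sanity check on the parameterisation.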

  7. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage which still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  8. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments, and we consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call this network, constructed from documents and the social information provided with each of them, a Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
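
    The PageRank computation applied to a Directed Graph of Documents can be sketched with a few lines of power iteration; the three-document graph below is invented:

```python
# Minimal power-iteration PageRank over a toy document graph.
def pagerank(links, d=0.85, iters=100):
    """links: {node: [outgoing neighbours]}; returns a rank per node."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}
        for v, outs in links.items():
            if not outs:                      # dangling node: spread evenly
                for u in nodes:
                    new[u] += d * rank[v] / n
            else:
                for u in outs:
                    new[u] += d * rank[v] / len(outs)
        rank = new
    return rank

# a symmetric 3-cycle of documents: all ranks converge to 1/3
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a"]})
```

    In the DGD setting the edges would carry the social links between documents rather than hyperlinks, but the iteration is the same.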

  9. Information technology portfolio in supply chain management using factor analysis

    Directory of Open Access Journals (Sweden)

    Ahmad Jaafarnejad

    2013-11-01

    Full Text Available The adoption of information technology (IT) along with supply chain management (SCM) has increasingly become a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal component analysis (PCA) of factor analysis (FA), a number of related criteria are divided into smaller groups. Finally, SC managers can develop an IT portfolio in SCM using the mean values of a few extracted components on the relevance-emergency matrix. A numerical example is provided to explain details of the proposed method.
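
    The PCA grouping step can be sketched as an eigendecomposition of the criteria correlation matrix; criteria with large loadings on the same component form one group. The score matrix below is invented (four criteria in two perfectly correlated pairs), not the paper's survey data:

```python
# PCA via the correlation matrix: eigenvalues/vectors, largest first.
import numpy as np

def pca_components(X):
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]   # sort by explained variance
    return eigvals[order], eigvecs[:, order]

# four criteria: columns 0-1 move together, columns 2-3 move together
X = np.array([[1, 2, 1, 2],
              [2, 4, 5, 10],
              [3, 6, 5, 10],
              [4, 8, 1, 2.0]])
eigvals, eigvecs = pca_components(X)
```

    With these data the two leading eigenvalues are both 2, one per correlated pair, so the four criteria collapse into two groups.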

  10. Main determinants of informality: a regional analysis for Mexico

    Directory of Open Access Journals (Sweden)

    David Robles Ortiz

    2018-01-01

    Full Text Available The aim of this study is to examine the factors most frequently cited to explain informality in Mexico and to determine whether they have the same effect across the country. First, through a cross-section analysis of microdata obtained from the Socio-economic Conditions Modules, the features and current state of the phenomenon are described. Then, using logit-type econometric models, the probabilities of being informal in all regions of the country are estimated. The results show that the most commonly cited causes of informality influence it differently in each region. Studying this phenomenon regionally, rather than in an aggregated manner, leads to a better understanding of its causes, which will help in designing state and federal public policies to eradicate these problems in the future.
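
    The logit specification behind such estimates is a logistic function of covariates; the coefficients and covariate values below are invented for illustration, not the paper's estimates:

```python
# Probability of being informal under a logit model:
# P(informal) = 1 / (1 + exp(-x'b)).
from math import exp

def prob_informal(coefs, x):
    """coefs and x include an intercept term at position 0."""
    z = sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + exp(-z))

# hypothetical covariates: intercept, years of schooling, urban dummy
p = prob_informal([1.2, -0.15, -0.6], [1.0, 9.0, 1.0])
```

    Estimating the model separately per region, as the study does, amounts to letting the coefficient vector differ across regions.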

  11. Geriatric information analysis of the molecular properties of mexidole

    Directory of Open Access Journals (Sweden)

    O. A. Gromova

    2017-01-01

    Full Text Available Objective: to use pharmacoinformation profiling to comprehensively assess all possible effects of the molecules of mexidol, choline alfoscerate, piracetam, glycine, and semax in accordance with the Anatomical Therapeutic Chemical (ATC) classification system. Material and methods. Chemoreactomic, pharmacoinformation, and geriatric information analyses of the properties of the molecules are based on chemoreactomic methodology. The chemoreactomic analysis uses information from the PubChem, HMDB, and String databases; the pharmacoinformation analysis applies information from the international ATC classification and a combined sample of data from the Therapeutic Target Database (TTD), SuperTarget, Manually Annotated Targets and Drugs Online Resource (MATADOR), and Potential Drug Target Database (PDTD); the geriatric information analysis employs data on the geroprotective effects of individual substances from the PubChem database and the data on geroprotection available in the literature from the PubMed database, collected through an artificial intelligence system. Results and discussion. Mexidol is characterized by the maximum set of positive effects (the drug is used to treat CNS and cardiovascular diseases and metabolic disorders, and has anti-inflammatory and anti-infective properties, etc.). Mexidol and glycine are predicted to cause the lowest frequency of adverse reactions, such as itching, constipation, paresthesia, and vomiting. Geriatric information assessments of changes in the life span of model organisms have shown that mexidol contributes to the higher life expectancy of C. elegans (by 22.7±10%), Drosophila (by 14.4±15%), and mice (by 14.6±3%); the control drugs do so by no more than 6.1%. Conclusion. The results of the study indicate that mexidol has a high potential to be used as a geroprotector.

  12. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    OpenAIRE

    Chahinez Benkoussas; Patrice Bellot

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user's query. We used different theoretical retrieval models: probabilistic as InL2 (Divergence from Randomness model) and language model and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval ...

  13. Formal Concept Analysis and Information Retrieval – A Survey

    OpenAIRE

    Codocedo , Victor; Napoli , Amedeo

    2015-01-01

    International audience; One of the first models to be proposed as a document index for retrieval purposes was a lattice structure, decades before the introduction of Formal Concept Analysis. Nevertheless, the main notions that we consider so familiar within the community ("extension", "intension", "closure operators", "order") were already an important part of it. In the '90s, as FCA was starting to settle as an epistemic community, lattice-based Information Retrieval (IR) systems ...
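
    The derivation operators and closure mentioned above can be illustrated on a tiny object-attribute context; the documents and attributes below are invented:

```python
# A toy formal context: which (invented) document has which attribute.
objects = {
    "doc1": {"retrieval", "lattice"},
    "doc2": {"retrieval", "logic"},
    "doc3": {"retrieval", "lattice", "logic"},
}
ALL_ATTRS = set().union(*objects.values())

def extent(attrs):
    """Derivation operator: objects having all the given attributes."""
    return {o for o, a in objects.items() if attrs <= a}

def intent(objs):
    """Derivation operator: attributes shared by all the given objects."""
    if not objs:
        return set(ALL_ATTRS)
    return set.intersection(*(objects[o] for o in objs))

# closure of {"lattice"}: every document with "lattice" also has "retrieval",
# so the closed set (a formal concept's intent) is {"lattice", "retrieval"}
closure = intent(extent({"lattice"}))
```

    Composing the two derivations gives the closure operator; its fixed points are exactly the concept intents that order into the lattice used for retrieval.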

  14. Crime Mapping and Geographical Information Systems in Crime Analysis

    OpenAIRE

    Dağlar, Murat; Argun, Uğur

    2016-01-01

    As an essential apparatus in crime analysis, crime mapping and Geographical Information Systems (GIS) are being increasingly accepted by police agencies. Developments in technology and the accessibility of geographic data sources make it feasible for police departments to use GIS and crime mapping. GIS and crime mapping can be utilized as devices to discover the factors contributing to crime, and hence let law enforcement agencies proactively take action against crime problems before they b...

  15. Army Information Technology Procurement: A Business Process Analysis

    Science.gov (United States)

    2015-03-27

    Education and Training Command. In partial fulfillment of the requirements for the degree of Master of Science in Engineering Management. Committee: Brent Langhals, Lt Col, USAF; John Elshaw, PhD. Abstract: The integration of Information and Communication Technology (ICT) ... the lack of transparency in how resources are allocated. This thesis presents a business process analysis of the Army's ICT procurement system.

  16. Carbon Dioxide Information Analysis Center: FY 1992 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Stoss, F.W. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center

    1993-03-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications, including numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books, in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1991 to September 30, 1992. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's responses to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments on and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also included.

  17. Carbon Dioxide Information Analysis Center: FY 1991 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Stoss, F.W.

    1992-06-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications, including numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books, in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1990 to September 30, 1991. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's responses to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments on and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also included.

  19. [Analysis of utilization of information in the journal Medicina Clinica].

    Science.gov (United States)

    Aleixandre, R; Giménez Sánchez, J V; Terrada, M L; López Piñero, J M

    1994-09-10

Knowledge of scientific communication is based largely on the analysis of the bibliographic references contained in publications. The aim of the present study is to investigate the patterns and laws governing information consumption in the articles of the journal Medicina Clinica. An analysis was performed of the 13,286 references drawn from 618 papers published by the journal in 1990. A database for managing this information was created with dBASE IV; the data were distributed across several tables by criteria of age, documentary type, country, journal and Bradford zone. The analysed references belong to 1,241 different journals, 110 of them from Spain. Publications from the United States and the United Kingdom, accounting for two thirds of the total, received more citations than those from Spain, while publications from other European countries, such as France, Germany and Italy, are scarcely present. The Bradford core is constituted by the journals Medicina Clinica and The Lancet. Analysis of the bibliographic references available in the articles of this journal yields knowledge about the information consumed by practitioners; its usefulness as a complement to the Indice de Citas e Indicadores Bibliométricos de Revistas Españolas de Medicina Interna y sus especialidades 1990 should be considered.

  20. Analysis of data as information: quality assurance approach.

    Science.gov (United States)

    Ivankovic, D; Kern, J; Bartolic, A; Vuletic, S

    1993-01-01

Describes a prototype module for data analysis of the healthcare delivery system. It consists of three main parts: data/variable selection; algorithms for the analysis of quantitative and qualitative changes in the system; and interpretation and explanation of the results. Such a module, designed for primary health care, has been installed on a PC in the health manager's office. Data enter the information system through standard DBMS procedures, after which a number of different indicators are calculated, along with time series (the ordered sequences of indicators) according to the manager's demands. The last procedure is "change analysis", with estimation of unexpected differences between and within units, e.g. health-care teams, as well as unexpected variabilities and trends. As examples, the diagnostic pattern of neurotic cases, referral patterns and the preventive behaviour of GPs' teams are presented and discussed.

  1. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and highlights the ability of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when a large number of naturally collinear variables are used.
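The reduction PCA performs can be illustrated with a minimal, self-contained sketch (not taken from the paper): two correlated survey-style variables are centered, their covariance matrix is formed, and power iteration recovers the first principal axis. The data values are hypothetical.

```python
import math

# Hypothetical responses on two strongly correlated variables.
data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0),
        (2.3, 2.7), (2.0, 1.6), (1.0, 1.1), (1.5, 1.6), (1.1, 0.9)]

n = len(data)
means = [sum(col) / n for col in zip(*data)]
centered = [[x - m for x, m in zip(row, means)] for row in data]

# Sample covariance matrix (2 x 2).
def cov(i, j):
    return sum(r[i] * r[j] for r in centered) / (n - 1)

C = [[cov(0, 0), cov(0, 1)],
     [cov(1, 0), cov(1, 1)]]

# Power iteration converges to the leading eigenvector: the first principal axis.
v = [1.0, 1.0]
for _ in range(100):
    w = [C[0][0] * v[0] + C[0][1] * v[1],
         C[1][0] * v[0] + C[1][1] * v[1]]
    norm = math.hypot(*w)
    v = [w[0] / norm, w[1] / norm]

# Share of total variance captured by the first component (Rayleigh quotient).
lam1 = sum(v[i] * sum(C[i][j] * v[j] for j in range(2)) for i in range(2))
total_var = C[0][0] + C[1][1]
explained = lam1 / total_var

# PC1 scores: one uncorrelated summary variable replacing two collinear ones.
scores = [r[0] * v[0] + r[1] * v[1] for r in centered]
```

Because the two variables move together, the single first component retains well over 90% of the total variance, which is exactly the collinearity-removal effect the paper exploits.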

  2. Predominant information quality scheme for the essential amino acids: an information-theoretical analysis.

    Science.gov (United States)

    Esquivel, Rodolfo O; Molina-Espíritu, Moyocoyani; López-Rosa, Sheila; Soriano-Correa, Catalina; Barrientos-Salcedo, Carolina; Kohout, Miroslav; Dehesa, Jesús S

    2015-08-24

In this work we undertake a pioneering information-theoretical analysis of 18 selected amino acids extracted from a natural protein, bacteriorhodopsin (1C3W). The conformational structures of each amino acid are analyzed using various quantum chemistry methodologies at high levels of theory: HF, M062X and CISD(Full). The Shannon entropy, Fisher information and disequilibrium are determined to grasp the spatial spreading features of delocalizability, order and uniformity of the optimized structures. These three entropic measures uniquely characterize all amino acids through a predominant information-theoretic quality scheme (PIQS), which gathers all chemical families by means of three major spreading features: delocalization, narrowness and uniformity. This scheme recognizes four major chemical families: aliphatic (delocalized), aromatic (delocalized), electro-attractive (narrowed) and tiny (uniform). All chemical families recognized by the existing energy-based classifications are embraced by this entropic scheme. Finally, novel chemical patterns are shown in the information planes associated with the PIQS entropic measures. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
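The three entropic measures can be illustrated on a simple discrete distribution. This is only an illustrative sketch of the quantities' qualitative behavior, not the continuous position-space functionals evaluated on electron densities in the paper.

```python
import math

def shannon_entropy(p):
    # Delocalization: maximal for the uniform distribution.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def disequilibrium(p):
    # Departure from uniformity: zero for the uniform distribution.
    n = len(p)
    return sum((pi - 1.0 / n) ** 2 for pi in p)

def fisher_discrete(p):
    # Discrete analogue of Fisher information: sensitivity to local change,
    # large for narrow, rapidly varying distributions.
    return sum((p[i + 1] - p[i]) ** 2 / p[i]
               for i in range(len(p) - 1) if p[i] > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # analogue of the "uniform" family
peaked = [0.85, 0.05, 0.05, 0.05]    # analogue of the "narrowed" family
```

A delocalized (uniform) distribution maximizes Shannon entropy and minimizes disequilibrium and the Fisher-like measure, while a narrowed one does the opposite, mirroring the delocalization/narrowness/uniformity axes of the PIQS.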

  3. Petroleum labour market information supply demand analysis 2009-2020

    International Nuclear Information System (INIS)

    2010-03-01

Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in the development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) used in determining the petroleum industry's medium- and long-term employment needs; labour supply/demand considerations for the industry as a whole and industry-wide cost management; and an analysis of the exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to retirements almost certainly would. In the growth scenario, it is likely that the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  4. HIV Drug-Resistant Patient Information Management, Analysis, and Interpretation.

    Science.gov (United States)

    Singh, Yashik; Mars, Maurice

    2012-06-07

The science of information systems, management, and interpretation plays an important part in the continuity of care of patients. This is becoming more evident in the treatment of human immunodeficiency virus (HIV) and acquired immune deficiency syndrome (AIDS), the leading cause of death in sub-Saharan Africa. The high replication rates, selective pressure, and initial infection by resistant strains of HIV imply that drug resistance will inevitably become an important health care concern. This paper describes proposed research with the aim of developing a physician-administered, artificial intelligence-based decision support system tool to facilitate the management of patients on antiretroviral therapy. This tool will consist of (1) an artificial intelligence computer program that will determine HIV drug resistance information from genomic analysis; (2) a machine-learning algorithm that can predict future CD4 count information given a genomic sequence; and (3) the integration of these tools into an electronic medical record for storage and management. The aim of the project is to create an electronic tool that assists clinicians in managing and interpreting patient information in order to determine the optimal therapy for drug-resistant HIV patients.

  5. Information management and analysis system for groundwater data in Thailand

    Science.gov (United States)

    Gill, D.; Luckananurung, P.

    1992-01-01

The Ground Water Division of the Thai Department of Mineral Resources maintains a large archive of groundwater data, with information on some 50,000 water wells. Each well file contains information on well location, well completion, borehole geology, water levels, water quality, and pumping tests. To enable efficient use of this information, a computer-based system for information management and analysis was created. The project was sponsored by the United Nations Development Program and the Thai Department of Mineral Resources. The system was designed to serve users who lack prior training in automated data processing. Access is through a friendly user/system dialogue. Tasks are segmented into a number of logical steps, each of which is managed by a separate screen. Selective retrieval is possible by four different methods of area definition and by compliance with user-specified constraints on any combination of database variables. The main types of outputs are: (1) files of retrieved data, screened according to users' specifications; (2) an assortment of pre-formatted reports; (3) computed geochemical parameters and various diagrams of water chemistry derived therefrom; (4) bivariate scatter diagrams and linear regression analysis; (5) posting of data and computed results on maps; and (6) hydraulic aquifer characteristics as computed from pumping tests. Data are entered directly from formatted screens. Most records can be copied directly from hand-written documents. The database-management program performs data integrity checks in real time, enabling corrections at the time of input. The system software can be grouped into: (1) database administration and maintenance, functions carried out by the SIR/DBMS software package; (2) the user communication interface for task definition and execution control, written in the operating system command language (VMS/DCL) and in FORTRAN 77; and (3) scientific data-processing programs, written

  6. Implementation of a drainage information, analysis and management system

    Directory of Open Access Journals (Sweden)

    J.N. Meegoda

    2017-04-01

    Full Text Available An integrated drainage information, analysis and management system (DIAMS was developed and implemented for the New Jersey Department of Transportation (NJDOT. The purpose of the DIAMS is to provide a useful tool for managers to evaluate drainage infrastructure, to facilitate the determination of the present costs of preserving those infrastructures, and to make decisions regarding the optimal use of their infrastructure budgets. The impetus for DIAMS is the culvert information management system (CIMS, which is developed to manage the data for culvert pipes. DIAMS maintains and summarizes accumulated inspection data for all types of drainage infrastructure assets, including pipes, inlet/outlet structures, outfalls and manufactured treatment devices. DIAMS capabilities include identifying drainage infrastructure, maintaining inspection history, mapping locations, predicting service life based on the current condition states, and assessing present asset value. It also includes unit cost values of 72 standard items to estimate the current cost for new assets with the ability to adjust for future inflation. In addition, DIAMS contains several different repair, rehabilitation and replacement options to remedy the drainage infrastructure. DIAMS can analyze asset information and determine decisions to inspect, rehabilitate, replace or do nothing at the project and network levels by comparing costs with risks and failures. Costs may be optimized to meet annual maintenance budget allocations by prioritizing drainage infrastructure needing inspection, cleaning and repair. DIAMS functional modules include vendor data uploading, asset identification, system administration and financial analysis. 
Among the significant performance features of DIAMS is its proactive nature, which affords decision makers the means to conduct a comprehensive financial analysis, determine the optimal schedule for proper maintenance actions, and prioritize them.

  7. Minimum Information Loss Cluster Analysis for Categorical Data

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, Jan

    2007-01-01

Vol. 2007, No. 4571 (2007), pp. 233-247. ISSN 0302-9743. [International Conference on Machine Learning and Data Mining MLDM 2007 /5./. Leipzig, 18.07.2007-20.07.2007] R&D Projects: GA MŠk 1M0572; GA ČR GA102/07/1594 Grant - others: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords: Cluster Analysis * Categorical Data * EM algorithm Subject RIV: BD - Theory of Information Impact factor: 0.402, year: 2005

  8. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
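The scoring pipeline described above (candidate phrases split at delimiters and stop words, word scores from co-occurrence degree and frequency, phrase scores as sums of word scores) can be sketched in a few lines. The stop-word list and sample sentence are illustrative; full RAKE implementations use larger stop lists and additional phrase-adjoining rules.

```python
import re
from collections import defaultdict

# Minimal illustrative stop-word list (real implementations use much larger ones).
STOP_WORDS = {"a", "an", "the", "of", "for", "and", "or", "in", "on", "to",
              "is", "are", "by", "with", "can", "be", "both"}

def rake(text):
    # Split into candidate phrases at stop words and punctuation delimiters.
    words = re.split(r"[^a-zA-Z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if not w or w in STOP_WORDS:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)

    # Word score = degree / frequency, where degree counts co-occurring
    # words within each candidate phrase (including the word itself).
    freq = defaultdict(int)
    degree = defaultdict(int)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)
    word_score = {w: degree[w] / freq[w] for w in freq}

    # Keyword score of a candidate phrase = sum of its word scores.
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

keywords = rake("Rapid automatic keyword extraction for information retrieval "
                "and analysis of individual documents")
```

Longer multi-word phrases accumulate higher degree scores, so the four-word candidate ranks first, which is the behavior the extraction method relies on.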

  9. Analysis of Internet Information on Lateral Lumbar Interbody Fusion.

    Science.gov (United States)

    Belayneh, Rebekah; Mesfin, Addisu

    2016-07-01

Lateral lumbar interbody fusion (LLIF) is a surgical technique that is being increasingly used. The authors' objective was to examine information on the Internet pertaining to the LLIF technique. An analysis was conducted of publicly accessible websites pertaining to LLIF. The following search engines were used: Google (www.google.com), Bing (www.bing.com), and Yahoo (www.yahoo.com). DuckDuckGo (www.duckduckgo.com) was an additional search engine used due to its emphasis on generating accurate and consistent results while protecting searchers' privacy and reducing advertisements. The top 35 websites providing information on LLIF from each of the 4 search engines were identified. A total of 140 websites were evaluated. Each website was categorized based on authorship (academic, private, medical industry, insurance company, other) and content of information. Using the search term lateral lumbar interbody fusion, 174,000 Google results, 112,000 Yahoo results, and 112,000 Bing results were obtained. DuckDuckGo does not display the number of results found for a search. From the top 140 websites collected from the search engines, 78 unique websites were identified. Websites were authored by a private medical group in 46.2% of the cases, an academic medical group in 26.9% of the cases, and the biomedical industry in 5.1% of the cases. Sixty-eight percent of websites reported indications, and 24.4% reported contraindications. Benefits of LLIF were reported by 69.2% of websites. Thirty-six percent of websites reported complications of LLIF. Overall, the quality of information regarding LLIF on the Internet is poor. Spine surgeons and spine societies can assist in improving the quality of the information on the Internet regarding LLIF. [Orthopedics. 2016; 39(4):e701-e707.]. Copyright 2016, SLACK Incorporated.

  10. Information management for global environmental change, including the Carbon Dioxide Information Analysis Center

    Energy Technology Data Exchange (ETDEWEB)

    Stoss, F.W. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center

    1994-06-01

The issue of global change is international in scope. A body of international organizations oversees the worldwide coordination of research and policy initiatives. In the US, the National Science and Technology Council (NSTC) was established in November of 1993 to provide coordination of science, space, and technology policies throughout the federal government. NSTC is organized into nine proposed committees. The Committee on Environment and Natural Resources (CENR) oversees the US Global Change Research Program (USGCRP). As part of the USGCRP, the US Department of Energy's Global Change Research Program aims to improve the understanding of Earth systems and to strengthen the scientific basis for the evaluation of policy and government action in response to potential global environmental changes. This paper examines the information and data management roles of several international and national programs, including Oak Ridge National Laboratory's (ORNL's) global change information programs. An emphasis is placed on the Carbon Dioxide Information Analysis Center (CDIAC), which also serves as the World Data Center-A for Atmospheric Trace Gases.

  11. Analysis of College Students' Personal Health Information Activities: Online Survey.

    Science.gov (United States)

    Kim, Sujin; Sinn, Donghee; Syn, Sue Yeon

    2018-04-20

With abundant personal health information at hand, individuals face a critical challenge in evaluating the informational value of health care records, keeping the useful information and discarding what is determined useless. Young, healthy college students who were previously dependents of adult parents or caregivers are less likely to be concerned with disease management. Personal health information management (PHIM) is a special case of personal information management (PIM) that is associated with multiple interactions among varying stakeholders and systems. However, there has been limited evidence with which to understand the informational or behavioral underpinnings of college students' PHIM activities, which can influence their health in general throughout their lifetime. This study aimed to investigate demographic and academic profiles of college students with relevance to PHIM activities. Next, we sought to construct major PHIM-related activity components and perceptions among college students. Finally, we sought to discover major factors predicting core PHIM activities among the college students we sampled. A Web survey was administered to collect responses about PHIM behaviors and perceptions among college students at the University of Kentucky from January through March 2017. A total of 1408 college students were included in the analysis. PHIM perceptions, demographics, and academic variations were used as independent variables to predict diverse PHIM activities using a principal component analysis (PCA) and hierarchical regression analyses (SPSS v.24, IBM Corp, Armonk, NY, USA). The majority of the participants were female (956/1408, 67.90%), and the age distribution of this population included an adequate representation of college students of all ages. The most preferred health information resources were family (612/1408, 43.47%), health care professionals (366/1408, 26.00%), friends (27/1408, 1.91%), and the internet (157/1408, 11.15%). Organizational or

  12. Information findability: An informal study to explore options for improving information findability for the systems analysis group

    Energy Technology Data Exchange (ETDEWEB)

    Stoecker, Nora Kathleen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats, contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge: how can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  13. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call a network constructed from documents and the social information provided by each of them a Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
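The link-analysis step can be sketched with a plain PageRank iteration over a toy directed graph of documents. The node names and edges below are invented for illustration and are not the paper's actual DGD.

```python
def pagerank(graph, damping=0.85, iterations=50):
    # graph: node -> list of outgoing links (e.g. social links between documents).
    nodes = set(graph) | {t for targets in graph.values() for t in targets}
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling node: redistribute its mass evenly.
                for t in nodes:
                    new_rank[t] += damping * rank[node] / n
        rank = new_rank
    return rank

# Hypothetical directed graph of documents linked by shared social metadata.
dgd = {"bookA": ["bookB", "bookC"],
       "bookB": ["bookC"],
       "bookC": ["bookA"],
       "bookD": ["bookC"]}
ranks = pagerank(dgd)
```

The heavily linked-to node accumulates the highest score, which is the signal a reranking stage can interpolate with the probabilistic retrieval scores.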

  14. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

Full Text Available This study evaluates the concept of success in project management as applicable to the IT universe, starting from the classical theory associated with project management techniques. It applies this theoretical analysis to the context of information technology in enterprises, drawing on the classic literature of traditional project management and focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology used in the study was the multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature fits the management environment of IT projects. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects, which depends on the learning acquired over a long and continuous process and on the sponsorship of senior management, ultimately resulting in the model's merger into the company culture.

  15. ARN: Analysis and Visualization System for Adipogenic Regulation Network Information.

    Science.gov (United States)

    Huang, Yan; Wang, Li; Zan, Lin-Sen

    2016-12-16

Adipogenesis is the process of cell differentiation through which preadipocytes become adipocytes. Much research is currently ongoing to identify genes, including their gene products and microRNAs, that correlate with fat cell development. However, information fragmentation hampers the identification of key regulatory genes and pathways. Here, we present a database of literature-curated adipogenesis-related regulatory interactions, designated the Adipogenesis Regulation Network (ARN, http://210.27.80.93/arn/), which currently contains 3101 nodes (genes and microRNAs), 1863 regulatory interactions, and 33,969 expression records associated with adipogenesis, based on 1619 papers. A sentence-based text-mining approach was employed for efficient manual curation of regulatory interactions from approximately 37,000 PubMed abstracts. Additionally, we further determined 13,103 possible node relationships by searching miRGate, BioGRID, PAZAR and TRRUST. ARN also has several useful features: i) regulatory map information; ii) tests to examine the impact of a query node on adipogenesis; iii) tests for the interactions and modes of a query node; iv) prediction of interactions of a query node; and v) analysis of experimental data or the construction of hypotheses related to adipogenesis. In summary, ARN can store, retrieve and analyze adipogenesis-related information as well as support ongoing adipogenesis research and contribute to the discovery of key regulatory genes and pathways.

  16. From paragraph to graph: Latent semantic analysis for information visualization

    Science.gov (United States)

    Landauer, Thomas K.; Laham, Darrell; Derr, Marcia

    2004-01-01

Most techniques for relating textual information rely on intellectually created links such as author-chosen keywords and titles, authority indexing terms, or bibliographic citations. Similarity of the semantic content of whole documents, rather than just titles, abstracts, or overlap of keywords, offers an attractive alternative. Latent semantic analysis provides an effective dimension reduction method for the purpose that reflects synonymy and the sense of arbitrary word combinations. However, latent semantic analysis correlations with human text-to-text similarity judgments are often empirically highest at ≈300 dimensions. Thus, two- or three-dimensional visualizations are severely limited in what they can show, and the first and/or second automatically discovered principal component, or any three such for that matter, rarely capture all of the relations that might be of interest. It is our conjecture that linguistic meaning is intrinsically and irreducibly very high dimensional. Thus, some method to explore a high dimensional similarity space is needed. But the 2.7 × 10^7 projections and infinite rotations of, for example, a 300-dimensional pattern are impossible to examine. We suggest, however, that the use of a high dimensional dynamic viewer with an effective projection pursuit routine and user control, coupled with the exquisite abilities of the human visual system to extract information about objects and from moving patterns, can often succeed in discovering multiple revealing views that are missed by current computational algorithms. We show some examples of the use of latent semantic analysis to support such visualizations and offer views on future needs. PMID:15037748
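A minimal latent-semantic-analysis sketch (with an invented three-document corpus, not the paper's data) shows the core mechanism: documents are embedded in a low-dimensional space derived from the term-document matrix, and the similarity of documents that share a concept through a synonym pair rises above their raw word-overlap similarity.

```python
import math

# Tiny illustrative corpus: docs 0 and 2 describe the same thing via a
# synonym pair ("car" vs. "auto"); LSA should pull them together.
docs = ["car engine repair", "car engine tuning", "auto engine repair"]
vocab = sorted({w for d in docs for w in d.split()})
A = [[d.split().count(w) for d in docs] for w in vocab]  # term-document counts

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def top_eigvec(M, iters=300):
    # Power iteration for the leading eigenpair of a symmetric matrix.
    v = [i + 1.0 for i in range(len(M))]  # asymmetric start avoids zero overlap
    for _ in range(iters):
        w = matvec(M, v)
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(vi * wi for vi, wi in zip(v, matvec(M, v)))
    return lam, v

# Document-document Gram matrix A^T A; its eigenvectors are the right singular
# vectors of A, i.e. document coordinates in the latent semantic space.
cols = list(zip(*A))
G = [[sum(a * b for a, b in zip(c1, c2)) for c2 in cols] for c1 in cols]

lam1, v1 = top_eigvec(G)
G2 = [[G[i][j] - lam1 * v1[i] * v1[j] for j in range(3)] for i in range(3)]  # deflate
lam2, v2 = top_eigvec(G2)

# Two-dimensional latent coordinates for each document (truncated SVD, k = 2).
coords = [(v1[i] * math.sqrt(max(lam1, 0.0)), v2[i] * math.sqrt(max(lam2, 0.0)))
          for i in range(3)]

def cosine(a, b):
    return (a[0] * b[0] + a[1] * b[1]) / (math.hypot(*a) * math.hypot(*b))

raw_sim = 2 / 3                      # raw word-overlap cosine of docs 0 and 2
lsa_sim = cosine(coords[0], coords[2])
```

Real applications replace the hand-rolled power iteration with a proper truncated SVD at a few hundred dimensions, as the abstract notes; the toy example only makes the synonymy effect visible.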

  17. Development of Cost Benefit Methodology for Scientific and Technical Information Communication and Application to Information Analysis Centers. Final Report.

    Science.gov (United States)

    Mason, Robert M.; And Others

This document presents a research effort intended to improve the economic information available for formulating policies and making decisions related to Information Analysis Centers (IACs) and IAC services. The project used a system of IAC information activities to analyze the functional aspects of IAC services, calculate the present value of net…

  18. Sensitivity analysis of an information fusion tool: OWA operator

    Science.gov (United States)

    Zarghaami, Mahdi; Ardakanian, Reza; Szidarovszky, Ferenc

    2007-04-01

    The successful design and application of the Ordered Weighted Averaging (OWA) method as a decision making tool depend on the efficient computation of its order weights. The most popular methods for determining the order weights are the Fuzzy Linguistic Quantifiers approach and the Minimal Variability method which give different behavior patterns for OWA. These methods will be compared by using Sensitivity Analysis on the outputs of OWA with respect to the optimism degree of the decision maker. The theoretical results are illustrated in a water resources management problem. The Fuzzy Linguistic Quantifiers approach gives more information about the behavior of the OWA outputs in comparison to the Minimal Variability method. However, in using the Minimal Variability method, the OWA has a linear behavior with respect to the optimism degree and therefore it has better computation efficiency.
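The quantifier-based route to order weights can be illustrated with a short sketch. The criteria values below are hypothetical, and the Minimal Variability weights (obtained by minimizing the variance of the weights under an orness constraint) are not reproduced here; only the Fuzzy Linguistic Quantifiers approach is shown.

```python
def owa(values, weights):
    # Ordered Weighted Averaging: weights apply to values sorted descending.
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

def rim_quantifier_weights(n, alpha):
    # Regular increasing monotone quantifier Q(r) = r ** alpha;
    # w_i = Q(i / n) - Q((i - 1) / n).
    # alpha < 1 emphasizes high values (optimistic decision maker);
    # alpha > 1 emphasizes low values (pessimistic decision maker).
    return [(i / n) ** alpha - ((i - 1) / n) ** alpha for i in range(1, n + 1)]

criteria = [0.9, 0.6, 0.3, 0.1]  # hypothetical criteria satisfaction degrees

optimistic = owa(criteria, rim_quantifier_weights(4, 0.5))
neutral = owa(criteria, rim_quantifier_weights(4, 1.0))   # plain average
pessimistic = owa(criteria, rim_quantifier_weights(4, 2.0))
```

Sweeping `alpha` is exactly the kind of optimism-degree sensitivity analysis the paper performs: the aggregate moves smoothly from near the maximum toward the minimum as the decision maker becomes more pessimistic.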

  19. How Analysis Informs Regulation: Success and Failure of ...

    Science.gov (United States)

How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  20. A Study on the Information Analysis and Legal Affairs

    International Nuclear Information System (INIS)

    Chung, W. S.; Yang, M. H.; Yun, S. W.; Lee, D. S.; Kim, H. R.; Noh, B. C.

    2009-02-01

This report presents the results and contents of a study on nuclear information analysis and legal affairs. Our team worked to secure KAERI's best legal interests in the process of enacting nuclear laws and codes, in international collaborative studies, and in management. As part of an international trend analysis, we studied the Japanese government's position on nuclear energy with respect to mitigating climate change and supplying sustainable energy; improvements in Japan's use of radiation showed the increasing contribution of radiation technology to the public. Results of studies of the nuclear policy of Kazakhstan, a forecast of global trends in the nuclear area to 2030, and the new U.S. government's policy on nuclear energy are also explained. Lastly, we evaluated sources of electricity generation that reduce carbon dioxide emissions from the standpoint of greenhouse gas emission statistics and assessed the greenhouse gas reduction capability of Korea's green electricity sources.

  1. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of an object's motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters by maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined to be either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
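The final labeling step can be sketched simply: given cluster assignments (here hypothetical labels standing in for the paper's IB clustering result), rarely observed clusters are flagged as abnormal, and the Shannon entropy of the cluster distribution summarizes how concentrated the observed motion patterns are.

```python
import math
from collections import Counter

# Hypothetical cluster labels for 100 observed trajectories.
assignments = ["C1"] * 40 + ["C2"] * 35 + ["C3"] * 20 + ["C4"] * 5

counts = Counter(assignments)
total = sum(counts.values())
probs = {c: n / total for c, n in counts.items()}

# Shannon entropy of the cluster distribution (in bits): high entropy means
# motion patterns are spread across many equally common clusters.
entropy = -sum(p * math.log2(p) for p in probs.values())

# A trajectory counts as "abnormal" when its cluster is observed less often
# than the uniform rate scaled by a sensitivity factor (0.5 is illustrative).
threshold = 0.5 * (1.0 / len(counts))
abnormal = {c for c, p in probs.items() if p < threshold}
```

With four clusters the uniform rate is 0.25, so only the cluster seen 5% of the time falls below the 0.125 threshold and is flagged.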

  2. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  3. Analysis and Visualization of Seismic Data Using Mutual Information

    Directory of Open Access Journals (Sweden)

    António M. Lopes

    2013-09-01

    Full Text Available Seismic data is difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 up to 2011 is analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of the Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, using clustering analysis, visualization maps are generated, providing an intuitive and useful representation of the complex relationships present among seismic data. Such relationships might not be perceived on classical geographic maps. Therefore, the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
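
    The second method (mutual information as a similarity measure between regional magnitude series) can be approximated with a plain histogram estimator. The Gumbel-distributed "magnitude" series below are synthetic stand-ins, not catalogue data:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information estimate (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0  # skip empty cells so the log is defined
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(1)
a = rng.gumbel(5.0, 0.5, 2000)        # stand-in magnitude series, region A
b = a + rng.normal(0.0, 0.1, 2000)    # strongly coupled region B
c = rng.gumbel(5.0, 0.5, 2000)        # independent region C

print(mutual_information(a, b) > mutual_information(a, c))  # True
```

    Pairwise MI values like these can then feed a standard clustering routine to produce the similarity maps the abstract describes.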

  4. Intelligent acoustic data fusion technique for information security analysis

    Science.gov (United States)

    Jiang, Ying; Tang, Yize; Lu, Wenda; Wang, Zhongfeng; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Tone is an essential component of word formation in all tonal languages, and it plays an important role in the transmission of information in speech communication. The study of tone characteristics can therefore be applied to the security analysis of acoustic signals, for example through language identification. In speech processing, fundamental frequency (F0) is often viewed by speech-synthesis researchers as representing tone. However, regular F0 values may lead to low naturalness in synthesized speech; moreover, F0 and tone are not linguistically equivalent, since F0 is just one representation of a tone. Therefore, the electroglottography (EGG) signal was collected for a deeper study of tone characteristics. In this paper, focusing on the Northern Kam language, which has nine tonal contours and five level tone types, we first collected EGG and speech signals from six native male speakers of Northern Kam, and then obtained the clustering distributions of the tone curves. After summarizing the main characteristics of Northern Kam tones, we analyzed the relationship between EGG and speech signal parameters, laying the foundation for further security analysis of acoustic signals.

  5. Scientific and technological information: analysis of periodic publications of information science

    OpenAIRE

    Mayara Cintya do Nascimento Vasconcelos; Gabriela Belmont de Farias

    2017-01-01

    The research analyzes the articles published in national scientific journals of the area of Information Science, classified with Qualis A1, having as parameter the term "scientific and technological information". It presents concepts about scientific and technological information and the processes that involve its uses, as well as scientific communication, information flows and sources of information. The methodology used is a descriptive study with a quantitative-qualitative approach, using ...

  6. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistics and trading companies is improving operational business performance and developing logistics support for sales. Developing logistics support for sales entails designing and implementing a set of works for the development of existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculating the required number of loading-unloading places, developing storage structures, developing pre-sales preparation zones, specifying storage types, selecting loading-unloading equipment, detailed planning of the warehouse logistics system, creating architectural-planning decisions, selecting information-processing equipment, etc. The ERP and WMS systems currently in use do not solve the full list of logistics engineering problems. In this regard, developing specialized software that takes into account the specifics of warehouse logistics, and then integrating it with ERP and WMS systems, is a current task. In this paper we propose a system for the statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning and to improve the efficiency of the operating business and the development of logistics support for sales. The system is based on methods of statistical data processing, methods for assessing and predicting logistics performance, and methods for determining and calculating the data required for the registration, storage and processing of metal products, as well as methods for planning reconstruction and development

  7. Situated student learning and spatial informational analysis for environmental problems

    Science.gov (United States)

    Olsen, Timothy Paul

    Ninth and tenth grade high school Biology student research teams used spatial information analysis tools to site a prairie restoration plot on a 55-acre campus during a four-week environment unit. Students made use of innovative technological practices by applying geographic information systems (GIS) approaches to solving environmental and land use problems. Student learning was facilitated by starting with the students' initial conceptions of computing, the local landscape and the biological environment, and then by guiding them through a problem-based science project process. The project curriculum was framed by the perspective of legitimate peripheral participation (Lave & Wenger, 1991), where students were provided with learning opportunities designed to allow them to act like GIS practitioners. Sociocultural lenses for learning were employed to create accounts of human mental processes that recognize the essential relationship between these processes and their cultural, historical, and institutional settings (Jacob, 1997; Wertsch, 1991). This research investigated how student groups' meaning-making actions were mediated by GIS tools on the periphery of a scientific community of practice. Research observations focused on supporting interpretations of learners' socially constructed actions and the iterative building of assertions from multiple sources. These included the artifacts students produced, the tools they used, the cultural contexts that constrained their activity, and how people begin to adopt ways of speaking (speech genres) of the referent community to negotiate meanings and roles. Students gathered field observations and interpreted attributes of landscape entities from the GIS data to advocate for an environmental decision. However, even while gaining proficiency with GIS tools, most students did not begin to appropriate roles from the GIS community of practice. 
Students continued to negotiate their project actions simply as school exercises motivated by

  8. On the censored cost-effectiveness analysis using copula information

    Directory of Open Access Journals (Sweden)

    Charles Fontaine

    2017-02-01

    Full Text Available Abstract Background Information and theory beyond copula concepts are essential to understand the dependence relationship between several marginal covariate distributions. In a therapeutic trial data scheme, censoring occurs most of the time, which could lead to a biased interpretation of the dependence relationship between marginal distributions and to a biased inference of the joint probability distribution function. A particular case is cost-effectiveness analysis (CEA), which has shown its utility in many medico-economic studies and where censoring often occurs. Methods This paper discusses a copula-based modeling of the joint density and an estimation method for costs and quality-adjusted life years (QALY) in a cost-effectiveness analysis under censoring. This method is not based on any linearity assumption on the inferred variables, but on a point estimate obtained from the marginal distributions together with their dependence link. Results Our results show that the proposed methodology retains only the bias arising from statistical inference and no longer carries a bias based on an unverified linearity assumption. An acupuncture study for chronic headache in primary care was used to show the applicability of the method, and the obtained ICER remains within the confidence interval of the standard regression methodology. Conclusion For the cost-effectiveness literature, such a technique without any linearity assumption is progress, since it does not need the specification of a global linear regression model. Hence, estimating the marginal distributions for each therapeutic arm, the concordance measures between these populations and the right copula families is now sufficient to carry out the whole CEA.
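
    As a rough illustration of the copula idea (not the paper's censored estimator), correlated standard normals can be pushed through monotone transforms: this fixes the marginals while the Gaussian copula carries the cost-QALY dependence. All distributions and parameters below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
rho, n = -0.4, 50_000  # assumed cost-QALY dependence (illustrative only)
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

# Monotone transforms preserve the Gaussian-copula dependence structure
# while giving each margin its own (here invented) distribution.
cost_new = np.exp(7.5 + 0.4 * z[:, 0])             # lognormal costs, new arm
qaly_new = 2.0 + 0.3 * z[:, 1]                     # QALYs, new arm
cost_old = np.exp(7.2 + 0.4 * rng.normal(size=n))  # independent comparator arm
qaly_old = 1.8 + 0.3 * rng.normal(size=n)

# Point estimate of the incremental cost-effectiveness ratio (ICER).
icer = (cost_new.mean() - cost_old.mean()) / (qaly_new.mean() - qaly_old.mean())
print(icer > 0)  # extra cost per QALY gained is positive in this sketch
```

    The paper's contribution is estimating such joint quantities under censoring without a linear regression model; the sketch only shows how a copula separates the marginals from their dependence link.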

  9. Exploring the Interactions Between Network Data Analysis and Security Information/Event Management

    Science.gov (United States)

    2011-01-01

    Exploring the Interactions Between Network Data Analysis and Security Information/Event Management, Carnegie Mellon University, 2011; presented by Timothy J... (report documentation page; topics covered include network data, security information/events, the problem, and analysis leading to events).

  10. Enhancing multilingual latent semantic analysis with term alignment information.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.; Bader, Brett William

    2008-08-01

    Latent Semantic Analysis (LSA) is based on the Singular Value Decomposition (SVD) of a term-by-document matrix for identifying relationships among terms and documents from co-occurrence patterns. Among the multiple ways of computing the SVD of a rectangular matrix X, one approach is to compute the eigenvalue decomposition (EVD) of a square 2 x 2 composite matrix consisting of four blocks with X and X^T in the off-diagonal blocks and zero matrices in the diagonal blocks. We point out that significant value can be added to LSA by filling in some of the values in the diagonal blocks (corresponding to explicit term-to-term or document-to-document associations) and computing a term-by-concept matrix from the EVD. For the case of multilingual LSA, we incorporate information on cross-language term alignments of the same sort used in Statistical Machine Translation (SMT). Since all elements of the proposed EVD-based approach can rely entirely on lexical statistics, hardly any price is paid for the improved empirical results. In particular, the approach, like LSA or SMT, can still be generalized to virtually any language(s); computation of the EVD takes similar resources to that of the SVD since all the blocks are sparse; and the results of the EVD are just as economical as those of the SVD.
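
    The equivalence the abstract builds on is easy to check numerically: the nonzero eigenvalues of the composite matrix (zero diagonal blocks, X and X^T off-diagonal) are plus and minus the singular values of X. A minimal check with a toy matrix standing in for the term-by-document matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((5, 3))  # toy term-by-document matrix

# Composite matrix: zero diagonal blocks, X and X^T off-diagonal.
B = np.block([[np.zeros((5, 5)), X],
              [X.T, np.zeros((3, 3))]])

evals = np.linalg.eigvalsh(B)            # EVD of the symmetric composite
sv = np.linalg.svd(X, compute_uv=False)  # singular values of X

# The positive eigenvalues of B coincide with the singular values of X.
print(np.allclose(np.sort(evals[evals > 1e-9]), np.sort(sv)))  # True
```

    Filling the diagonal blocks with term-term or document-document association weights, as the paper proposes, perturbs this spectrum with the extra co-occurrence information.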

  11. Using Failure Information Analysis to Detect Enterprise Zombies

    Science.gov (United States)

    Zhu, Zhaosheng; Yegneswaran, Vinod; Chen, Yan

    We propose failure information analysis as a novel strategy for uncovering malware activity and other anomalies in enterprise network traffic. A focus of our study is detecting self-propagating malware such as worms and botnets. We begin by conducting an empirical study of transport- and application-layer failure activity using a collection of long-lived malware traces. We dissect the failure activity observed in this traffic in several dimensions, finding that their failure patterns differ significantly from those of real-world applications. Based on these observations, we describe the design of a prototype system called Netfuse to automatically detect and isolate malware-like failure patterns. The system uses an SVM-based classification engine to identify suspicious systems and clustering to aggregate failure activity of related enterprise hosts. Our evaluation using several malware traces demonstrates that the Netfuse system provides an effective means to discover suspicious application failures and infected enterprise hosts. We believe it would be a useful complement to existing defenses.

  12. Multi-Dimensional Analysis of Dynamic Human Information Interaction

    Science.gov (United States)

    Park, Minsoo

    2013-01-01

    Introduction: This study aims to understand the interactions of perception, effort, emotion, time and performance during the performance of multiple information tasks using Web information technologies. Method: Twenty volunteers from a university participated in this study. Questionnaires were used to obtain general background information and…

  13. Design and application of pulse information acquisition and analysis ...

    African Journals Online (AJOL)

    ... two-dimensional information acquisition, multiplex signals combination and deep data mining. Conclusions: The newly developed system could translate the pulse signals into digital, visual and measurable motion information of vessel. Keywords: Visualized pulse information; Radial artery; B mode ultrasound; Traditional ...

  14. FACTORS INFLUENCING INFORMATION TECHNOLOGY ADOPTION: A CROSS-SECTIONAL ANALYSIS

    OpenAIRE

    Stroade, Jeri L.; Schurle, Bryan W.

    2003-01-01

    This project will explore information technology adoption issues. The unique characteristics of information technology will be discussed. Advantages and disadvantages to adoption will also be identified. Finally, a statistical model of Internet adoption will be developed to estimate the impacts of certain variables on the underlying process of information technology adoption.

  15. Information-Theoretic Analysis of Memoryless Deterministic Systems

    Directory of Open Access Journals (Sweden)

    Bernhard C. Geiger

    2016-11-01

    Full Text Available The information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input has a continuous distribution. For systems with infinite information loss, a relative measure is defined and shown to be related to Rényi information dimension. As deterministic signal processing can only destroy information, it is important to know how this information loss affects the solution of inverse problems. Hence, we connect the probability of perfectly reconstructing the input to the information lost in the system via Fano-type bounds. The theoretical results are illustrated by example systems commonly used in discrete-time, nonlinear signal processing and communications.
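
    For a concrete (invented) instance of the finite-loss claim: with X uniform on {0, 1, 2, 3} and the deterministic, memoryless map Y = X mod 2, the information loss H(X|Y) = H(X) - H(Y) is exactly one bit, since two inputs collapse onto each output:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

x_vals = np.arange(4)
px = np.full(4, 0.25)  # uniform input distribution
y_vals = x_vals % 2    # deterministic, memoryless system

# Output distribution: each y collects the mass of the x's mapped onto it.
py = np.array([px[y_vals == y].sum() for y in (0, 1)])

# For a deterministic system, information loss H(X|Y) = H(X) - H(Y).
loss = entropy_bits(px) - entropy_bits(py)
print(loss)  # 1.0 bit: each output merges two equally likely inputs
```

    The paper's point is that this loss stays finite for a large class of systems even with continuous-valued inputs, where H(X) itself is infinite.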

  16. Commercial babassu mesocarp: microbiological evaluation and analysis of label information

    Directory of Open Access Journals (Sweden)

    Laisa Lis Fontinele Sá

    2015-10-01

    Full Text Available The babassu mesocarp is easily found in supermarkets and other commercial establishments in Brazil. Despite its widespread use in both the pharmaceutical and food industries, the literature contains neither scientific studies of microbial contamination in these products nor analyses of the legal information presented on their labels. The aim of this study was to evaluate the level of microbiological contamination in babassu mesocarp sold in commercial establishments in Teresina-PI, Brazil, as well as the conformity of label information with the rules of the Brazilian Sanitary Surveillance Agency (ANVISA). Ten samples of babassu mesocarp powder sold in the region were selected for study. Heterotrophic microorganisms were counted using the Plate Count Agar seeding technique (CFU/g), and Sabouraud Dextrose Agar medium was used for the cultivation of fungi. For the analysis of label information, resolutions RDC 259 of September 20, 2002 and RDC 360 of December 23, 2003, as well as Law 10,674 of May 16, 2003, were used. All samples showed high contamination by heterotrophic bacteria and fungi, while most of the sample labels complied with the rules. The results therefore suggest more comprehensive monitoring of these microorganisms and the development of more effective methods for decontaminating these products sold in Brazil. Keywords: Babassu. Label. Contamination. Food. Pharmacy.

  17. The role of proxy information in missing data analysis.

    Science.gov (United States)

    Huang, Rong; Liang, Yuanyuan; Carrière, K C

    2005-10-01

    This article investigates the role of proxy data in dealing with the common problem of missing data in clinical trials using repeated measures designs. In an effort to avoid the missing data situation, some proxy information can be gathered. The question is how to treat proxy information: is it always better to utilize proxy information when there are missing data? A model for repeated measures data with missing values is considered and a strategy for utilizing proxy information is developed. Simulations are then used to compare the power of a test using proxy information with that of a test simply utilizing all available data. It is concluded that using proxy information can be a useful alternative when such information is available. The implications for various clinical designs are also considered, and a data collection strategy for efficiently estimating parameters is suggested.

  18. The analysis of the nuclear information resources on the internet

    International Nuclear Information System (INIS)

    Zhang Guoqing; Tu Jinchi; Yao Ruiquan

    2014-01-01

    Information resources play an increasingly prominent role in social and economic development and have become a focus of international competition in the political, economic, cultural and military spheres. The level of management, development and utilization of network information resources has become an important indicator of the level of development and degree of informatization of a country or an enterprise. But the exploitation of information resources is more complex than that of natural resources. Facing the sheer volume and uneven quality of network information, we must sort the resources from a broader perspective and build a structured framework for the development of network information resources. This article analysed statistical data on published content, publishing style, renewal period, data type and so on. It also analysed and evaluated the different types and content of nuclear information resources on the Internet by their number and characteristics. Furthermore, it provides a basis for developing and utilizing the Internet-based nuclear information resources of related foreign organizations and for selecting targeted, high-quality, reliable and valuable nuclear information resources. It can make the organization and management of nuclear information resources on the Internet more effective. (authors)

  19. Content of information ethics in the Holy Quran: an analysis

    Directory of Open Access Journals (Sweden)

    Akram Mehrandasht

    2014-06-01

    Full Text Available Background and Objectives: Information ethics, according to Islam, means observing human rights and morals when transmitting and providing information, which must be based on honesty and truthfulness. As Islam highly values society and social issues, it holds that human behaviors and interactions are strongly interconnected in a society. Committing to ethical issues regarding information is a responsibility of all members of a society according to the Quran, and Islam prohibits believers from unmasking people's private information. Methods: All the words and verses related to information ethics were identified in the Quran; the data were then evaluated and analyzed from the perspective of scholars and experts of the Quran. Results: From the viewpoint of Islam, the purpose of information ethics is to observe and protect human rights in society and to show the positive effects of right information on all aspects of human life. Conclusions: To implement information ethics, it is necessary to be aware of the position Islam gives this concept. Following Quranic guidelines and the manners of the family of the Prophet Muhammad (pbuh) results in a society in which information ethics is observed optimally.

  20. Analysis and design on airport safety information management system

    Directory of Open Access Journals (Sweden)

    Yan Lin

    2017-01-01

    Full Text Available An airport safety information management system is the foundation for implementing safe operation, risk control, safety performance monitoring and safety management decisions at an airport. The paper puts forward the architecture of an airport safety information management system based on the B/S model, focuses on the safety information processing flow, designs the functional modules and proposes the supporting conditions for system operation. Building the system helps to perfect the long-term mechanism driven by safety information and to continually raise the airport's safety management level and control proficiency.

  1. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and the locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattices to improve the efficiency of the vehicle information sharing network.
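
    The cluster-and-bridge idea can be sketched with a toy set of vehicle positions and a union-find over range-limited V2V links; the coordinates and the 0.5 km communication range are made up for illustration:

```python
import math

# Toy vehicle positions (km); a V2V link exists within communication range.
vehicles = [(0.0, 0.0), (0.2, 0.1), (0.4, 0.0),   # dense cluster A
            (5.0, 5.0), (5.2, 5.1)]               # distant cluster B
RANGE_KM = 0.5                                    # assumed V2V radio range

parent = list(range(len(vehicles)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path compression
        i = parent[i]
    return i

def union(i, j):
    parent[find(i)] = find(j)

# Wire up every in-range pair, then count connected clusters.
for i in range(len(vehicles)):
    for j in range(i + 1, len(vehicles)):
        if math.dist(vehicles[i], vehicles[j]) <= RANGE_KM:
            union(i, j)

clusters = {find(i) for i in range(len(vehicles))}
print(len(clusters))  # 2: information cannot flow between A and B

# One targeted long-range connection between the clusters merges them.
union(0, 3)
print(len({find(i) for i in range(len(vehicles))}))  # 1
```

    Choosing which cluster pairs to bridge, and at what cost, is the optimization the paper addresses with its lattice-inspired design strategy.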

  2. The threat nets approach to information system security risk analysis

    NARCIS (Netherlands)

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security

  3. Information Foraging Theory: A Framework for Intelligence Analysis

    Science.gov (United States)

    2014-11-01

    information scent cues, and information visualisation techniques. Significance to defence and security: One of the most important tasks of military... Information foraging theory explains human information seeking and exploitation as adaptations...

  4. Analysis of Urban Households' Preference for Informal Access to ...

    African Journals Online (AJOL)

    2016-10-02

    Oct 2, 2016 ... these factors on the urban land market will continue to sustain the informality of land access in Minna. The study recommends the decentralisation of the land administration system and a reduction in planning standards to enhance formal land access. Keywords: Urban Households, Informal Access, ...

  5. Analysis of sources of information for Improved Cassava Production ...

    African Journals Online (AJOL)

    Local Government Area, Cross River State saw radio (41.81%) as major source of information for new technologies in cassava production. The television (30%), extension agent (12.73%), friends/ neighbours (8.18%) and information communication technologies (ICTs)(7.27%) were ranked second, third, fourth and fifth ...

  6. A Situational Analysis of Information Management in Selected ...

    African Journals Online (AJOL)

    Information is the fuel that drives government programmes and services. For Government ministries in Kenya therefore, effectively managing information is key to provision of service delivery, development and growth of the country, especially in successfully implementing strategies articulated in the Kenya Vision 2030, the ...

  7. Title: Gender analysis of sexual and reproductive health information ...

    African Journals Online (AJOL)

    manda

    students could access a wide range of sources of SRH information but the actual use was concentrated and .... Simonelli et al (2002), evaluating sexual and reproductive health education and services for youths in .... of these sources of SRH information and the attitude towards gender equity (see table 1-3 below). Table 1: ...

  8. Analysis of the Interdisciplinary Nature of Library and Information Science

    Science.gov (United States)

    Prebor, Gila

    2010-01-01

    Library and information science (LIS) is highly interdisciplinary by nature and is affected by the incessant evolution of technologies. A recent study surveying research trends in the years 2002-6 at various information science departments worldwide has found that a clear trend was identified in Masters theses and doctoral dissertations of social…

  9. Urban parking information provision: an in-depth effect analysis

    NARCIS (Netherlands)

    Tasseron, G.

    2017-01-01

    Recent advances in wireless communication technologies, such as parking sensors, enable real-time information provision on on-street parking places to drivers. These developments have been embraced by policy makers, as they expect that real-time information may reduce traffic searching for a parking

  10. Cost-volume-profit and net present value analysis of health information systems.

    Science.gov (United States)

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which methods to employ for such a justification: accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
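
    The two methods are straightforward to sketch side by side; the discount rate, cash flows and per-transaction economics below are hypothetical numbers, not figures from the article:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs today, the rest yearly."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def breakeven_volume(fixed_cost, price, variable_cost):
    """Cost-volume-profit break-even: volume at which profit is zero."""
    return fixed_cost / (price - variable_cost)

# Hypothetical system: $100k outlay, $30k net benefit per year for 5 years.
project = npv(0.08, [-100_000, 30_000, 30_000, 30_000, 30_000, 30_000])
print(project > 0)  # True: benefits outweigh costs at an 8% discount rate

# Hypothetical per-transaction economics: $25 fee, $15 variable cost.
print(breakeven_volume(50_000, 25.0, 15.0))  # 5000.0 transactions
```

    Cost-volume-profit answers "how many transactions until the system covers its fixed costs each period", while NPV answers "is the multi-year investment worth more than its cost today"; the article's point is that the two views complement each other.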

  11. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
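
    A drastically simplified stand-in for the fusion idea (plain indicator formulas and averaging instead of the paper's PSO-tuned neural networks): each "expert" forecasts the next value from one indicator family, and the ensemble fuses them by averaging. The synthetic price series and window lengths are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
prices = np.cumsum(rng.normal(0.1, 1.0, 300)) + 100  # synthetic index level

def sma_forecast(p, w=10):        # moving-average family
    return p[-w:].mean()

def momentum_forecast(p, w=5):    # momentum family
    return p[-1] + (p[-1] - p[-w]) / w

def naive_forecast(p):            # last-value baseline
    return p[-1]

experts = (sma_forecast, momentum_forecast, naive_forecast)
errors = {f.__name__: [] for f in experts}
errors["ensemble"] = []

for t in range(50, len(prices)):
    window, actual = prices[:t], prices[t]
    preds = [f(window) for f in experts]
    for f, pred in zip(experts, preds):
        errors[f.__name__].append(abs(pred - actual))
    errors["ensemble"].append(abs(np.mean(preds) - actual))  # fused forecast

for name, errs in errors.items():
    print(name, round(float(np.mean(errs)), 3))
```

    By the triangle inequality, the fused forecast's error at any step cannot exceed the average of the individual errors, which is one (weak) reason fusion tends to help; the paper's trained NN ensemble is of course far more sophisticated than this averaging rule.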

  12. Trading in markets with noisy information: an evolutionary analysis

    Science.gov (United States)

    Bloembergen, Daan; Hennes, Daniel; McBurney, Peter; Tuyls, Karl

    2015-07-01

    We analyse the value of information in a stock market where information can be noisy and costly, using techniques from empirical game theory. Previous work has shown that the value of information follows a J-curve, where averagely informed traders perform below market average, and only insiders prevail. Here we show that both noise and cost can change this picture, in several cases leading to opposite results where insiders perform below market average, and averagely informed traders prevail. Moreover, we investigate the effect of random explorative actions on the market dynamics, showing how these lead to a mix of traders being sustained in equilibrium. These results provide insight into the complexity of real marketplaces, and show under which conditions a broad mix of different trading strategies might be sustainable.

  13. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. Where CCA is ideal for Gaussian data, CIA facilitates...

  14. 78 FR 68463 - Notice of Emergency Approval of an Information Collection: Regional Analysis of Impediments...

    Science.gov (United States)

    2013-11-14

    ... URBAN DEVELOPMENT Notice of Emergency Approval of an Information Collection: Regional Analysis of... and Communities, Department of Housing and Urban Development, 451 7th Street SW., Washington, DC 20410... Information Collection Title of Information Collection: Regional Analysis of Impediments Guidance for...

  15. Independent component analysis of edge information for face recognition

    CERN Document Server

    Karande, Kailash Jagannath

    2013-01-01

    The book presents research work on face recognition using edge information as features for face recognition with ICA algorithms. The independent components are extracted from edge information. These independent components are used with classifiers to match the facial images for recognition purpose. In their study, authors have explored Canny and LOG edge detectors as standard edge detection methods. Oriented Laplacian of Gaussian (OLOG) method is explored to extract the edge information with different orientations of Laplacian pyramid. Multiscale wavelet model for edge detection is also propos

  16. Simple LED spectrophotometer for analysis of color information.

    Science.gov (United States)

    Kim, Ji-Sun; Kim, A-Hee; Oh, Han-Byeol; Goh, Bong-Jun; Lee, Eun-Suk; Kim, Jun-Sik; Jung, Gu-In; Baek, Jin-Young; Jun, Jae-Hoon

    2015-01-01

    A spectrophotometer is the basic measuring equipment essential to most research fields requiring samples to be measured, such as physics, biotechnology and food engineering. This paper proposes a system that can detect sample concentration and color information using an LED and a color sensor. Purity and wavelength information can be read from the CIE diagram, and the concentration can be estimated from the purity information. This method is more economical and efficient than existing spectrophotometry, and can also be used by non-specialists. This contribution is applicable to a number of fields because it can be used as a colorimeter to detect the wavelength and purity of samples.
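
    The colorimetric step described here can be sketched as follows. Assuming the sensor returns linear sRGB values (a real device would need its own calibration matrix), the standard D65 sRGB-to-XYZ matrix maps a reading to CIE 1931 xy chromaticity, from which dominant wavelength and purity are read off the diagram.

```python
import numpy as np

# Linear sRGB -> CIE XYZ (standard D65 matrix), then XYZ -> xy chromaticity.
# A real sensor would substitute its own calibration matrix for M.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])

def rgb_to_xy(rgb):
    """Map a linear-RGB reading to CIE 1931 (x, y) chromaticity."""
    X, Y, Z = M @ np.asarray(rgb, dtype=float)
    s = X + Y + Z
    return X / s, Y / s

print(rgb_to_xy([1.0, 1.0, 1.0]))  # ~ (0.3127, 0.3290), the D65 white point
```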

  17. Integrating Information and Communication Technology for Health Information System Strengthening: A Policy Analysis.

    Science.gov (United States)

    Marzuki, Nuraidah; Ismail, Saimy; Al-Sadat, Nabilla; Ehsan, Fauziah Z; Chan, Chee-Khoon; Ng, Chiu-Wan

    2015-11-01

    Despite the high costs involved and the lack of definitive evidence of sustained effectiveness, many low- and middle-income countries had begun to strengthen their health information system using information and communication technology in the past few decades. Following this international trend, the Malaysian Ministry of Health had been incorporating Telehealth (National Telehealth initiatives) into national health policies since the 1990s. Employing qualitative approaches, including key informant interviews and document review, this study examines the agenda-setting processes of the Telehealth policy using Kingdon's framework. The findings suggested that Telehealth policies emerged through actions of policy entrepreneurs within the Ministry of Health, who took advantage of several simultaneously occurring opportunities--official recognition of problems within the existing health information system, availability of information and communication technology to strengthen health information system and political interests surrounding the national Multimedia Super Corridor initiative being developed at the time. The last was achieved by the inclusion of Telehealth as a component of the Multimedia Super Corridor. © 2015 APJPH.

  18. Information, public policy analysis and sustainable development in ...

    African Journals Online (AJOL)

    scale secrecy in government business, inadequate incentives to statisticians and grand corruption in many, if not all, public organizations in Nigeria. They promote negative effects, including unreliable data and information, inadequate public ...

  19. Market analysis of the commercial traffic information business

    Science.gov (United States)

    1994-03-01

    This document examines the private sector traffic information marketplace in the United States--its beginnings, history, economics, business operations, functions, products, and possible future directions--particularly as it relates to the individual...

  20. Manufacturing Technology Information Analysis Center: Knowledge Is Strength

    Science.gov (United States)

    Safar, Michal

    1992-01-01

    The Center's primary function is to facilitate technology transfer within DoD, other government agencies and industry. The DoD has recognized the importance of technology transfer, not only to support specific weapon system manufacture, but to strengthen the industrial base that sustains DoD. MTIAC uses an experienced technical staff of engineers and information specialists to acquire, analyze, and disseminate technical information. Besides ManTech project data, MTIAC collects manufacturing technology from other government agencies, commercial publications, proceedings, and various international sources. MTIAC has various means of disseminating this information. Much of the technical data is on user accessible data bases. The Center researches and writes a number of technical reports each year and publishes a newsletter monthly. Customized research is performed in response to specific inquiries from government and industry. MTIAC serves as a link between Government and Industry to strengthen the manufacturing technology base through the dissemination of advanced manufacturing information.

  1. Design and Analysis: Payroll of Accounting Information System

    Directory of Open Access Journals (Sweden)

    Suryanto Suryanto

    2011-05-01

    The purposes of the research are to analyze, design, and recommend a payroll accounting information system that supports internal control and solves the identified problems. Research methods used are book studies, field studies, and design studies; the field studies were carried out by survey and interview. The expected results are a review of the payroll accounting information system within the company's ongoing business process and solutions to the weaknesses in the payroll system, so that the company can use an integrated information system for payroll calculation. Conclusions drawn from the research are that there is a risk of manipulation of attendance data, that documentation forms still rely on a manual system with only simple data backup, and that there is also a risk of manipulation in the allowance cash system and in the reports included in the payroll. Index Terms - Accounting Information System, Payroll

  2. Information analysis of hyperspectral images from the hyperion satellite

    Science.gov (United States)

    Puzachenko, Yu. G.; Sandlersky, R. B.; Krenke, A. N.; Puzachenko, M. Yu.

    2017-07-01

    A new method of estimating the outgoing radiation spectra data obtained from the Hyperion EO-1 satellite is considered. In theoretical terms, this method is based on the nonequilibrium thermodynamics concept with corresponding estimates of the entropy and the Kullback information. The obtained information estimates make it possible to assess the effective work of the landscape cover both in general and for its various types and to identify the spectrum ranges primarily responsible for the information increment and, accordingly, for the effective work. The information is measured in the frequency band intervals corresponding to the peaks of solar radiation absorption by different pigments, mesophyll, and water to evaluate the system operation by their synthesis and moisture accumulation. This method is assumed to be effective in investigation of ecosystem functioning by hyperspectral remote sensing.
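
    The Kullback information mentioned here can be illustrated with a short sketch: treating two normalized band spectra as probability distributions, the Kullback-Leibler divergence measures the information gained when one distribution replaces the other. The spectra below are invented five-band examples, not Hyperion data.

```python
import numpy as np

def kullback_information(p, q, eps=1e-12):
    """KL divergence (nats) between two spectra normalized to distributions."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

vegetation = [0.05, 0.08, 0.04, 0.45, 0.38]  # toy 5-band spectrum (red-edge jump)
soil       = [0.10, 0.14, 0.18, 0.22, 0.26]  # toy flat soil spectrum

print(kullback_information(vegetation, soil))
```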

  3. Information delivery manuals to facilitate it supported energy analysis

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    In response to continuing Building Information Modeling (BIM) progress, building performance simulation tools such as IESVE are being utilized to explore construction projects and influence design decisions with increasing frequency. To maximize the potential of these tools, a specification...

  4. Information technology portfolio in supply chain management using factor analysis

    OpenAIRE

    Ahmad Jaafarnejad; Davood Rafierad; Masoumeh Gardeshi

    2013-01-01

    The adoption of information technology (IT) along with supply chain management (SCM) has become increasingly a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve the organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal comp...

  5. Constructing a model of effective information dissemination in a crisis. Keywords: information dissemination, crisis, tuberculosis, meta-ethnographic analysis, social marketing

    Directory of Open Access Journals (Sweden)

    Fiona Duggan

    2004-01-01

    A model of effective information dissemination in a crisis was developed from a Ph.D. study of information dissemination during a suspected TB outbreak. The research aimed to characterise and evaluate the dissemination of information to the community during the incident. A qualitative systematic review of the research literature identified twenty relevant studies. Meta-ethnographic analysis of these studies highlighted the key factors in effective dissemination. Consideration of these factors in relation to dissemination theory provided the links between the key factors. When the resulting model was applied to the specific circumstances of the incident two barriers to effective information dissemination were identified. Incorporating these barriers into the original model enabled the construction of a model of effective information dissemination in a crisis. The implications of this model for information professionals include incorporating social marketing as a core element of education and training and adopting multi-method dissemination strategies.

  6. Information needs of engineers. The methodology developed by the WFEO Committee on Engineering Information and the use of value analysis for improving information services

    International Nuclear Information System (INIS)

    Darjoto, S.W.; Martono, A.; Michel, J.

    1990-05-01

    The World Federation of Engineering Organizations - WFEO - through the work of its Committee on Engineering Information, aims at improving the efficiency of engineers, particularly by developing new attitudes and practices concerning the mastery of specialized information. One important part of the WFEO/CEI programme of activities during recent years, and for the years to come, was and is devoted to a better understanding of the information needs of engineers. It now also seems essential to WFEO/CEI to better evaluate information services in order to correctly adapt them to the identified needs of engineers. The following communication emphasizes these two main and related perspectives: identifying the information needs of engineers; developing Value Analysis approaches for engineering information services. (author). 3 refs

  7. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis... 10 Energy 2 2010-01-01 2010-01-01 false Contents of applications; technical information in final safety analysis report. 52.157 Section 52.157 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSES...

  8. HPC Performance Analysis of a Distributed Information Enterprise Simulation

    National Research Council Canada - National Science Library

    Hanna, James P; Walter, Martin J; Hillman, Robert G

    2004-01-01

    .... The analysis identified several performance limitations and bottlenecks. One critical limitation addressed and eliminated was simultaneously mixing a periodic process model with an event driven model causing rollbacks...

  9. Meaningful mediation analysis : Plausible causal inference and informative communication

    NARCIS (Netherlands)

    Pieters, Rik

    2017-01-01

    Statistical mediation analysis has become the technique of choice in consumer research to make causal inferences about the influence of a treatment on an outcome via one or more mediators. This tutorial aims to strengthen two weak links that impede statistical mediation analysis from reaching its

  10. An Information Geometric Analysis of Entangled Continuous Variable Quantum Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D-H [Institute for the Early Universe, Ewha Womans University, Seoul 120-750 (Korea, Republic of); Ali, S A [Department of Physics, State University of New York at Albany, 1400 Washington Avenue, Albany, NY 12222 (United States); Cafaro, C; Mancini, S [School of Science and Technology, Physics Division, University of Camerino, I-62032 Camerino (Italy)

    2011-07-08

    In this work, using information geometric (IG) techniques, we investigate the effects of micro-correlations on the evolution of maximal probability paths on statistical manifolds induced by systems whose microscopic degrees of freedom are Gaussian distributed. Analytical estimates of the information geometric entropy (IGE) as well as the IG analogue of the Lyapunov exponents are presented. It is shown that the entanglement duration is related to the scattering potential and incident particle energies. Finally, the degree of entanglement generated by an s-wave scattering event between minimum uncertainty wave-packets is computed in terms of the purity of the system.

  11. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State Evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information and information/results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness and in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing deterrence, supporting a safeguards system that is fully information driven, and increasing confidence in Safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under Safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  12. Information

    International Nuclear Information System (INIS)

    Boyard, Pierre.

    1981-01-01

    The fear of nuclear energy, and more particularly of radioactive wastes, is analyzed in its sociological context. Everybody agrees on the need for information; information is available, but there is a problem with its diffusion. Reactions of the public are analyzed, and journalists, scientists and teachers have a role to play [fr]

  13. Temporal Expectation and Information Processing: A Model-Based Analysis

    Science.gov (United States)

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  14. Analysis and modelling of flood risk assessment using information ...

    African Journals Online (AJOL)

    Floods are a serious hazard to life and property. The traditional probability statistical method is acceptable in analysing the flood risk but requires a large sample size of hydrological data. This paper puts forward a composite method based on artificial neural network (ANN) and information diffusion method (IDM) for flood ...
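
    The information diffusion step can be sketched as follows: each observation spreads a unit of probability mass over a grid of monitoring points through a Gaussian kernel, which stabilizes estimates from small hydrological samples. The peak-discharge values and the bandwidth rule below are illustrative assumptions, not data or parameters from the paper.

```python
import numpy as np

# Normal information diffusion sketch for small-sample flood risk:
# each observed peak spreads its "information" over monitoring points u.
peaks = np.array([320., 410., 380., 500., 290., 610., 450.])  # annual peaks (toy)
u = np.linspace(200., 700., 51)                               # monitoring points

# One bandwidth rule quoted in the IDM literature; treat as an assumption.
h = 1.4208 * (peaks.max() - peaks.min()) / (len(peaks) - 1)

diffuse = np.exp(-(peaks[:, None] - u[None, :]) ** 2 / (2 * h ** 2))
diffuse /= diffuse.sum(axis=1, keepdims=True)  # each sample contributes mass 1
p = diffuse.sum(axis=0) / len(peaks)           # probability over u, sums to 1

exceed = np.cumsum(p[::-1])[::-1]              # P(flood >= u), non-increasing
print(float(exceed[np.searchsorted(u, 500.)]))
```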

  15. Analysis of Urban Households' Preference for Informal Access to ...

    African Journals Online (AJOL)

    2016-10-02

    Oct 2, 2016 ... informal rights lacks basic tenure security, hence susceptible to expropriation pro bono or with inadequate ... challenges also includes those of insecure tenure rights and property delineations. (Rakodi and Leduka, 2005) ..... M.Sc thesis submitted to the Department of Real Estate and. Construction, Royal ...

  16. Geographical information system (GIS)–based analysis of road ...

    African Journals Online (AJOL)

    In recent years there has been serious concern over the increasing rate of road traffic accidents in Nigeria. Geographic Information System (GIS), a high performance computer-based tool, is useful in road traffic management and vehicular movement studies. In this work, the GIS software was used to analyze and depict the ...

  17. Library and information science research in Botswana: An analysis ...

    African Journals Online (AJOL)

    This paper analysed library and information science research in Botswana that has been published since 1979, when the library school at the University of Botswana was established. The period considered is from 1980 to 2006, a period of 27 years. The paper linked research and publication trends with the historical, social ...

  18. Analysis of haptic information in the cerebral cortex

    Science.gov (United States)

    2016-01-01

    Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level. PMID:27440247

  19. African Web-Based Animal Health Information: Analysis Of Online ...

    African Journals Online (AJOL)

    This implies that the web is dominated by information from the developed world. The paper recommends that African scientists should utilize both open access repositories and journals to increase the accessibility of local animal health content on the web. University of Dar Es Salaam Library Journal Vol. 9 (1) 2007: pp.

  20. Design and analysis of information model hotel complex

    Directory of Open Access Journals (Sweden)

    Garyaev Nikolai

    2016-01-01

    The article analyzes innovations in 3D modeling and the development of process design approaches based on visualization information technology and computer-aided design systems, and discusses the problems arising in modern design and approaches to addressing them.

  1. Efficiency and credit ratings: a permutation-information-theory analysis

    Science.gov (United States)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.
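
    The entropy axis of the complexity-entropy causality plane is the normalized permutation entropy of Bandt and Pompe, which the sketch below computes: values near 1 indicate a noise-like (informationally efficient) series, lower values indicate exploitable structure. The embedding parameters m and tau are typical choices, not necessarily the paper's.

```python
import math
import numpy as np

def permutation_entropy(x, m=4, tau=1):
    """Normalized Bandt-Pompe permutation entropy: H(ordinal patterns)/log(m!)."""
    x = np.asarray(x, float)
    counts = {}
    n = len(x) - (m - 1) * tau
    for i in range(n):
        pattern = tuple(np.argsort(x[i : i + m * tau : tau]))  # ordinal pattern
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), float) / n
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(m)))

rng = np.random.default_rng(2)
print(permutation_entropy(rng.normal(size=5000)))        # close to 1 (noise-like)
print(permutation_entropy(np.sin(np.arange(5000) / 5)))  # well below 1 (structured)
```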

  2. Analysis of Informal Credit Operations among Farmers In Atisbo ...

    African Journals Online (AJOL)

    The study explored the operations, (savings and loan procurement), as well as the problems and prospects of the informal financial institutions in ATISBO Local Government Area of Oyo State, Southwest Nigeria. The sampling technique used was Stratified Random Sampling to select three villages from the area and ...

  3. Gender Analysis Of Electronic Information Resource Use: The Case ...

    African Journals Online (AJOL)

    This article is based on an empirical study that examined the association between gender and the use of electronic information resources among postgraduate students at the University of Dar es salaam, Tanzania. The study was conducted in December 2005 and integrated both qualitative and quantitative research ...

  4. 511 Spatial Analysis of Soil Fertility Using Geographical Information ...

    African Journals Online (AJOL)

    2011-07-21

    Jul 21, 2011 ... Information Systems Technology. (Pp. 511-524). Njoku, J. D. - Dept of Environmental Technology, Federal University of. Technology PMB 1526 Owerri, Nigeria. Email:dr_jdnjoku@yahoo.com. Nnaji, A. O. - Dept of Environmental Technology, Federal University of. Technology PMB 1526 Owerri, Nigeria. &.

  5. Information sharing for consumption tax purposes : An empirical analysis

    NARCIS (Netherlands)

    Ligthart, Jenny E.

    The paper studies the determinants of information sharing between Swedish tax authorities and 14 EU tax authorities for value-added tax (VAT) purposes. It is shown that trade-related variables (such as the partner country's net trade position and population size), reciprocity, and legal arrangements

  6. Combining Linguistic and Spatial Information for Document Analysis

    NARCIS (Netherlands)

    Aiello, Marco; Monz, Christof; Todoran, Leon

    2000-01-01

    We present a framework to analyze color documents of complex layout. In addition, no assumption is made on the layout. Our framework combines in a content-driven bottom-up approach two different sources of information: textual and spatial. To analyze the text, shallow natural language processing

  7. ANKH: Information Threat Analysis with Actor-NetworK Hypergraphs

    NARCIS (Netherlands)

    Pieters, Wolter

    2010-01-01

    Traditional information security modelling approaches often focus on containment of assets within boundaries. Due to what is called de-perimeterisation, such boundaries, for example in the form of clearly separated company networks, disappear. This paper argues that in a de-perimeterised situation a

  8. Analysis of Public Health Risks From Consumption of Informally ...

    African Journals Online (AJOL)

    Despite an unfavorable policy environment against informal milk markets, these markets account for most milk sales in Kenya. Convenient delivery and lower prices are the principal benefits for poor consumers. Current milk handling and safety regulations in Kenya are derived from models in industrialized countries.

  9. Modeling and Analysis of Information Attack in Computer Networks

    National Research Council Canada - National Science Library

    Pepyne, David

    2003-01-01

    ... (as opposed to physical and other forms of attack) . Information based attacks are attacks that can be carried out from anywhere in the world, while sipping cappuccino at an Internet cafe' or while enjoying the comfort of a living room armchair...

  10. Bridging Levels of Analysis: Learning, Information Theory, and the Lexicon

    Science.gov (United States)

    Dye, Melody

    2017-01-01

    While information theory is typically considered in the context of modern computing and engineering, its core mathematical principles provide a potentially useful lens through which to consider human language. Like the artificial communication systems such principles were invented to describe, natural languages involve a sender and receiver, a…
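
    As a concrete illustration of the information-theoretic lens (the corpus below is invented for the example): the Shannon entropy of a word-frequency distribution gives the average information, in bits, that a receiver gains per word.

```python
import math
from collections import Counter

# Shannon entropy of a toy lexicon's word distribution, in bits per word.
corpus = "the cat sat on the mat the dog sat on the log".split()
freq = Counter(corpus)
n = len(corpus)

entropy = -sum((c / n) * math.log2(c / n) for c in freq.values())
print(round(entropy, 3))  # bits per word; bounded above by log2(vocabulary size)
```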

  11. A Comparative Analysis of Library and Information Science Master's ...

    African Journals Online (AJOL)

    In the wake of technological developments, taking a pragmatic approach towards continual library and information science (LIS) curricula revision becomes inevitable. This paper analyses the existing LIS curricula both in the United States of America (USA) and Uganda. Specific focus is on a comparison between the ...

  12. Chromatic Information and Feature Detection in Fast Visual Analysis.

    Directory of Open Access Journals (Sweden)

    Maria M Del Viva

    The visual system is able to recognize a scene based on a sketch made of very simple features. This ability is likely crucial for survival, when fast image recognition is necessary, and it is believed that a primal sketch is extracted very early in visual processing. Such highly simplified representations can be sufficient for accurate object discrimination, but an open question is the role played by color in this process. Rich color information is available in natural scenes, yet artists' sketches are usually monochromatic, and black-and-white movies provide compelling representations of real-world scenes. Also, the contrast sensitivity of color is low at fine spatial scales. We approach the question from the perspective of optimal information processing by a system endowed with limited computational resources. We show that when such limitations are taken into account, the intrinsic statistical properties of natural scenes imply that the most effective strategy is to ignore fine-scale color features and devote most of the bandwidth to gray-scale information. We find confirmation of these information-based predictions in psychophysics measurements of fast-viewing discrimination of natural scenes. We conclude that the lack of colored features in our visual representation, and our overall low sensitivity to high-frequency color components, are a consequence of an adaptation process, optimizing the size and power consumption of our brain for the visual world we live in.

  13. Fitting the Jigsaw of Citation: Information Visualization in Domain Analysis.

    Science.gov (United States)

    Chen, Chaomei; Paul, Ray J.; O'Keefe, Bob

    2001-01-01

    Discusses the role of information visualization in modeling and representing intellectual structures associated with scientific disciplines and visualizes the domain of computer graphics based on bibliographic data from author cocitation patterns. Highlights include author cocitation maps, citation time lines, animation of a high-dimensional…

  14. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  15. Information Technology: A Community of Practice. A Workplace Analysis

    Science.gov (United States)

    Guerrero, Tony

    2014-01-01

    Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt et al., 2008). The…

  16. Analysis of urban households' preference for informal access to ...

    African Journals Online (AJOL)

    Over the years, the urban land markets in Nigeria have been grappling with conflicts between the formal and informal institutions who have remained the dominant players. Despite the provision of the Land Use Act of 1978, which vested in the states, the power to hold and administer all lands within their territorial ...

  17. Toward an Information Processing Analysis of Field-Independence.

    Science.gov (United States)

    Davis, J. Kent; Cochran, Kathryn F.

    Goodenough's (1976) findings on field dependence/independence are extended here by focusing on the information processing stages of attention, encoding in short term and working memories, and storage and retrieval processes of long term memory. The reviewed research indicates that field independent and dependent individuals differ in the ability…

  18. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  19. System Engineering Analysis For Improved Scout Business Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    Van Slyke, D. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-01-30

    The project uses system engineering principles to address the need of Boy Scout leaders for an integrated system to facilitate advancement and awards records, leader training, and planning for meetings and activities. Existing products that address the needs of Scout leaders and relevant stakeholders support record keeping and some communication functions, but opportunity exists for a better system that fully integrates these functions with training delivery and recording, and activity planning, along with feedback and information gathering from stakeholders. Key stakeholders for the system include Scouts and their families, leaders, training providers, sellers of supplies and awards, content generators, and facilities that serve Scout activities. Key performance parameters for the system are protection of personal information, availability of current information, information accuracy, and information content that has depth. Implementation concepts considered for the system include: (1) owned and operated by the Boy Scouts of America; (2) contracted out to a vendor; (3) a distributed system that functions with BSA-managed interfaces. The selected concept is to contract out to a vendor to maximize the likelihood of successful integration and take advantage of the best technology. Development of requirements considers three key use cases: (1) the system facilitates planning a hike, with needed training satisfied in advance and advancement recorded in real time; (2) scheduling and documenting in-person training; (3) a family interested in Scouting receives information and can request follow-up. Non-functional requirements are analyzed with the Quality Function Deployment tool. Requirements addressing backup frequency, compatibility with legacy and new technology, language support, and software updates are developed to address system reliability and an intuitive interface. System functions analyzed include update of the activity database, maintenance of advancement status, archive of documents, and

  20. Analysis of the quality of hospital information systems Audit Trails.

    Science.gov (United States)

    Cruz-Correia, Ricardo; Boldt, Isabel; Lapão, Luís; Santos-Pereira, Cátia; Rodrigues, Pedro Pereira; Ferreira, Ana Margarida; Freitas, Alberto

    2013-08-06

    Audit Trails (AT) are fundamental to information security in order to guarantee access traceability, but can also be used to improve Health Information Systems' (HIS) quality, namely to assess how they are used or misused. This paper aims at analysing the existence and quality of AT, describing scenarios in hospitals and making some recommendations to improve the quality of information. Those responsible for HIS at eight Portuguese hospitals were contacted in order to arrange an interview about the importance of AT and to collect audit trail data from their HIS. Five institutions agreed to participate in this study; four of them accepted to be interviewed, and four sent AT data. The interviews were performed in 2011 and audit trail data were sent in 2011 and 2012. Each AT was evaluated and compared in relation to data quality standards, namely for completeness, comprehensibility, and traceability, among others. Only one of the AT had enough information for us to apply a consistency evaluation by modelling user behaviour. The interviewees in these hospitals only knew a few AT (an average of 1 AT per hospital out of an estimated 21 existing HIS), although they all recognized some advantages of analysing AT. Four hospitals sent a total of 7 AT: 2 from Radiology Information Systems (RIS), 2 from Picture Archiving and Communication Systems (PACS), and 3 from Patient Records. Three of the AT were understandable and three of the AT were complete. The AT from the patient records are better structured and more complete than those from the RIS/PACS. Existing AT do not have enough quality to guarantee traceability or to be used in HIS improvement. Their quality reflects the importance given to them by the CIOs of healthcare institutions. Existing standards (e.g. ASTM:E2147, ISO/TS 18308:2004, ISO/IEC 27001:2006) are still not broadly used in Portugal.

  1. Analysis of the Effect of Information System Quality on Intention to Reuse of Employee Management Information System (SIMPEG) Based on the Information Systems Success Model

    Directory of Open Access Journals (Sweden)

    Suryanto Tri Lathif Mardi

    2016-01-01

    Full Text Available This study examines the effect of Information Quality, System Quality and Service Quality on users' intention to reuse the Employee Management Information System (SIMPEG) in a university in the city of Surabaya, based on the theoretical foundation of the DeLone and McLean Information Systems Success (ISS) Model. The questionnaire was distributed to 120 employees of different universities by means of stratified random sampling. The results showed that: (1) there is a significant positive effect of System Quality on Information Quality; (2) there is a significant positive effect of Information Quality on the Intention to Reuse, with the information related to the fulfillment of the user’s needs; (3) there is a significant positive effect of System Quality on the Intention to Reuse, with the system related to the fulfillment of the needs of users; (4) there is no effect of Service Quality on the Intention to Reuse. In the end, the results of this study provide analysis and advice to university officials that can be used as a consideration for Information Technology/Information System investment and development in accordance with the Information System Success and Intention to Reuse model.

  2. ERROR ANALYSIS ON INFORMATION AND TECHNOLOGY STUDENTS’ SENTENCE WRITING ASSIGNMENTS

    OpenAIRE

    Rentauli Mariah Silalahi

    2015-01-01

    Students’ error analysis is very important for helping EFL teachers to develop their teaching materials, assessments and methods. However, it takes much time and effort for teachers to carry out such an error analysis of their students’ language. This study seeks to identify the common errors made by one class of 28 freshman students studying English in their first semester at an IT university. The data were collected from their writing assignments over eight consecutive weeks. The errors found...

  3. Proof of patient information: Analysis of 201 judicial decisions.

    Science.gov (United States)

    Dugleux, E; Rached, H; Rougé-Maillart, C

    2018-02-15

    The ruling by the French Court of Cassation dated February 25, 1997 obliged doctors to provide proof of the information given to patients, reversing more than half a century of case law. In October 1997, it was specified that such evidence could be provided by "all means", including presumption. No hierarchy in respect of means of proof has been defined by case law or legislation. The present study analyzed judicial decisions with a view to determining the means of proof liable to carry the most weight in a suit for failure to provide due patient information. A retrospective qualitative study was conducted for the period from January 2010 to December 2015, by a search on the LexisNexis® JurisClasseur website. Two hundred and one judicial decisions relating to failure to provide due patient information were selected and analyzed to study the characteristics of the practitioners involved, the content of the information at issue, and the means of proof provided. The resulting cohort of practitioners was compared with the medical demographic atlas of the French Order of Medicine, considered as exhaustive. Two hundred and one practitioners were investigated for failure to provide information: 45 medical practitioners (22±3%) and 156 surgeons (78±3%), including 45 orthopedic surgeons (29±3.6% of surgeons). One hundred and ninety-three were private sector (96±1.3%) and 8 public sector (4±1.3%). One hundred and one surgeons (65±3.8% of surgeons) and 26 medical practitioners (58±7.4%) were convicted. Twenty-five of the 45 orthopedic surgeons were convicted (55±7.5%). There was no significant difference in conviction rates between surgeons and medical practitioners: odds ratio, 1.339916; 95% CI [0.6393982; 2.7753764] (chi-squared test: p=0.49). Ninety-two practitioners based their defense on a single means of proof, and 74 of these were convicted (80±4.2%). Forty practitioners based their defense on several means of proof, and 16 of these were convicted (40±7.8%). There was
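    The odds-ratio comparison in the record above can be reproduced from a 2x2 conviction table. The sketch below is a hedged illustration: the counts (101 of 156 surgeons convicted, 26 of 45 medical practitioners convicted) are inferred from the abstract, so the result is close to, but not identical with, the reported 1.339916; the confidence interval uses the standard Woolf (log-scale) approximation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Woolf (log) 95% CI.
    a, b = convicted/acquitted in group 1; c, d = convicted/acquitted in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts inferred from the abstract: surgeons 101 convicted / 55 not,
# medical practitioners 26 convicted / 19 not.
or_, lo, hi = odds_ratio_ci(101, 55, 26, 19)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    With these counts the interval spans 1, matching the abstract's conclusion of no significant difference between the two groups.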

  4. Information Operations - Analysis Support and Capability Requirements (Operations d'information - Soutien a l'analyse et exigences de capacites) (CD-ROM)

    National Research Council Canada - National Science Library

    2006-01-01

    ...: The focus of the study "Information Operations - Analysis Support and Capability Requirements" undertaken by the RTO Task Group SAS-057 was to provide recommendations to improve analysis support...

  5. Qualitative website analysis of information on birth after caesarean section.

    Science.gov (United States)

    Peddie, Valerie L; Whitelaw, Natalie; Cumming, Grant P; Bhattacharya, Siladitya; Black, Mairead

    2015-08-19

    The United Kingdom (UK) caesarean section (CS) rate is largely determined by reluctance to attempt trial of labour and vaginal birth. Choice between repeat CS and attempting vaginal birth after CS (VBAC) in the next pregnancy is challenging, with neither offering clear safety advantages. Women may access online information during the decision-making process. Such information is known to vary in its support for either mode of birth when assessed quantitatively. Therefore, we sought to explore qualitatively the content and presentation of web-based health care information on birth after caesarean section (CS) in order to identify the dominant messages being conveyed. The search engine Google™ was used to conduct an internet search using terms relating to birth after CS. The ten most frequently returned websites meeting relevant purposive sampling criteria were analysed. Sampling criteria were based upon funding source, authorship and intended audience. Images and written textual content, together with the presence of links to additional media or external web content, were analysed using descriptive and thematic analyses respectively. Ten websites were analysed: five funded by Government bodies or professional membership; one via charitable donations; and four funded commercially. All sites compared the advantages and disadvantages of both repeat CS and VBAC. Commercially funded websites favoured a question and answer format alongside images, 'pop-ups', social media forum links and hyperlinks to third-party sites. The relationship between the parent sites and those being linked to may not be readily apparent to users, risking perception of endorsement of either VBAC or repeat CS, whether intended or otherwise. Websites affiliated with Government or health services presented referenced clinical information in a factual manner with podcasts of real life experiences. Many imply greater support for VBAC than repeat CS, although this was predominantly conveyed through subtle

  6. Economic analysis of e-waste market under imperfect information

    OpenAIRE

    Prudence Dato

    2015-01-01

    Despite international regulations that prohibit the trans-boundary movement of electronic and electric waste (e-waste), non-reusable e-waste is often illegally mixed with reusable e-waste and ends up being sent to developing countries. As developing countries are not well prepared to properly manage e-waste, this illegal trade has important negative externalities and creates ‘environmental injustice’. The two main information problems in the e-waste market are imperfect monitoring and imp...

  7. Sources of referral information: a marketing analysis of physician behavior.

    Science.gov (United States)

    Powers, T L; Swan, J E; Taylor, J A; Bendall, D

    1998-01-01

    The referral process is an important means of obtaining patients and it is necessary to determine ways of influencing the referral process to increase the patient base. This article reports research based on a survey of the referral habits of 806 primary care physicians. The results are examined in the context of physician receptivity to marketer-controlled versus health services sources of referral information.

  8. Analysis of consumer information brochures on osteoporosis prevention and treatment

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-01-01

    Full Text Available Purpose: Evidence-based consumer information is a prerequisite for informed decision making. So far, there are no reports on the quality of consumer information brochures on osteoporosis. In the present study we analysed brochures on osteoporosis available in Germany. Method: All printed brochures from patient and consumer advocacy groups, physician and governmental organisations, health insurances, and pharmaceutical companies were initially collected in 2001, and updated in December 2004. Brochures were analysed by two independent researchers using 37 internationally proposed criteria addressing evidence-based content, risk communication, transparency of the development process, and layout and design. Results: A total of 165 brochures were identified; 59 were included as they specifically targeted osteoporosis prevention and treatment. Most brochures were provided by pharmaceutical companies (n=25), followed by health insurances (n=11) and patient and consumer advocacy groups (n=11). Quality of brochures did not differ between providers. Only 1 brochure presented a lifetime risk estimate; 4 mentioned the natural course of osteoporosis. A balanced report on benefit versus lack of benefit was presented in 2 brochures, and on benefit versus adverse effects in 8 brochures. Four brochures mentioned relative risk reduction; 1 reported absolute risk reduction through hormone replacement therapy (HRT). Of the 28 brochures accessed in 2004, 10 still recommended HRT without discussing adverse effects. Transparency of the development process was limited: 25 brochures reported the publication date, 26 cited the author, and only 1 cited references. In contrast, readability and design were generally good. Conclusion: The quality of consumer brochures on osteoporosis in Germany is utterly inadequate. They fail to give evidence-based data on diagnosis and treatment options. Therefore, the material is not useful to enhance informed consumer choice.

  9. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method is used to examine the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and report the results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the difference in similarity between different stock markets in different time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain the results of similarity for 10 stock indices in three areas, meaning the method can distinguish the markets of different areas in the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can be used not only on physiologic time series, but also on financial time series.
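    The abstract does not give the exact distance definition, so the sketch below substitutes the widely used correlation-based distance d = sqrt(2(1 - rho)) (Mantegna-style) to show how pairwise distances between return series can feed a phylogenetic-tree style grouping of markets. All series names and numbers are synthetic stand-ins, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for daily index returns (the paper uses US and Chinese indices).
common = rng.normal(0, 1, 500)
returns = {
    "US_1": common + rng.normal(0, 0.5, 500),  # two markets sharing a factor
    "US_2": common + rng.normal(0, 0.5, 500),
    "CN_1": rng.normal(0, 1, 500),             # an unrelated market
}

def corr_distance(x, y):
    """Mantegna-style distance d = sqrt(2(1 - rho)); small d = similar markets.
    (A stand-in for the paper's information-categorization distance, whose
    exact definition is not given in the abstract.)"""
    rho = np.corrcoef(x, y)[0, 1]
    return float(np.sqrt(2.0 * (1.0 - rho)))

names = list(returns)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, b, round(corr_distance(returns[a], returns[b]), 3))
```

    The two correlated series end up much closer to each other than to the independent one, which is the property a distance-based market grouping relies on.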

  10. Tuberculosis diagnosis support analysis for precarious health information systems.

    Science.gov (United States)

    Orjuela-Cañón, Alvaro David; Camargo Mendoza, Jorge Eliécer; Awad García, Carlos Enrique; Vergara Vela, Erika Paola

    2018-04-01

    Pulmonary tuberculosis is a world emergency for the World Health Organization. Techniques and new diagnostic tools are important to battle this bacterial infection. There have been many advances in all those fields, but in developing countries such as Colombia, where resources and infrastructure are limited, new, fast and less expensive strategies are increasingly needed. Artificial neural networks are computational intelligence techniques that can be used in this kind of problem and offer additional support in the tuberculosis diagnosis process, providing a tool for medical staff to make decisions about the management of subjects under suspicion of tuberculosis. A database extracted from 105 subjects with precarious information on people under suspicion of pulmonary tuberculosis was used in this study. Data on sex, age, diabetes, homelessness, AIDS status, and a variable capturing clinical knowledge from the medical personnel were used. Models based on artificial neural networks were used, exploring supervised learning to detect the disease. Unsupervised learning was used to create three risk groups based on the available information. The obtained results are comparable with traditional techniques for the detection of tuberculosis, showing advantages such as speed and low implementation costs. Sensitivity of 97% and specificity of 71% were achieved. The techniques used yielded valuable information that can be useful for physicians who treat the disease in decision-making processes, especially under limited infrastructure and data. Copyright © 2018 Elsevier B.V. All rights reserved.
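    As a much-simplified stand-in for the paper's neural network models, the sketch below trains a single-layer network (logistic regression) by gradient descent on synthetic data shaped like the described 105-subject dataset (sex, age, diabetes, homelessness, AIDS status, clinical score). The coefficients, prevalence, and performance here are invented for illustration, not the paper's results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the 105-subject dataset described in the abstract.
n = 105
X = np.column_stack([
    rng.integers(0, 2, n),            # sex
    rng.normal(45, 15, n) / 100,      # age (scaled)
    rng.integers(0, 2, n),            # diabetes
    rng.integers(0, 2, n),            # homeless
    rng.integers(0, 2, n),            # AIDS status
    rng.normal(0, 1, n),              # clinical-knowledge score
])
true_w = np.array([0.5, 2.0, 0.8, 0.6, 1.2, 1.5])   # invented ground truth
y = (X @ true_w + rng.normal(0, 0.5, n) > 1.2).astype(float)

# A minimal single-layer network (logistic regression) trained by gradient
# descent; the paper uses fuller artificial neural network architectures.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
sens = (pred & (y == 1)).sum() / (y == 1).sum()
spec = (~pred & (y == 0)).sum() / (y == 0).sum()
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

    The same feature matrix could feed an unsupervised step (e.g. clustering into three risk groups), mirroring the two-stage use of learning described in the abstract.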

  11. Dynamics analysis of epidemic and information spreading in overlay networks.

    Science.gov (United States)

    Liu, Guirong; Liu, Zhimei; Jin, Zhen

    2018-05-07

    We establish an SIS-UAU model to present the dynamics of epidemic and information spreading in overlay networks. The overlay network is represented by two layers: one where the dynamics of the epidemic evolves and another where the information spreads. We theoretically derive explicit formulas for the basic reproduction number of awareness, R0^a, by analyzing the self-consistent equation, and for the basic reproduction number of disease, R0^d, by using the next generation matrix. The formula for R0^d shows that the effect of awareness can reduce the basic reproduction number of disease. In particular, when awareness does not affect epidemic spreading, R0^d is shown to match the existing theoretical results. Furthermore, we demonstrate that the disease-free equilibrium is globally asymptotically stable if R0^d < 1. Finally, numerical simulations show that information plays a vital role in preventing and controlling disease and effectively reduces the final disease scale. Copyright © 2018 Elsevier Ltd. All rights reserved.
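    The role of awareness in reducing the reproduction number can be illustrated with a much-simplified mean-field SIS model (not the paper's two-layer SIS-UAU system; the rates and the awareness factor below are invented for illustration).

```python
def final_sis_fraction(beta, gamma, awareness_reduction, steps=20000, dt=0.01):
    """Mean-field SIS via Euler integration: dI/dt = beta_eff*I*(1-I) - gamma*I.
    Awareness scales the transmission rate down, a much-simplified stand-in
    for the paper's two-layer SIS-UAU dynamics."""
    beta_eff = beta * (1.0 - awareness_reduction)
    i = 0.01                      # initial infected fraction
    for _ in range(steps):
        i += dt * (beta_eff * i * (1 - i) - gamma * i)
    return i

# R0 = beta_eff / gamma; awareness can push R0 below 1 and clear the disease.
print(final_sis_fraction(0.6, 0.3, 0.0))   # R0 = 2.0 -> endemic (~0.5)
print(final_sis_fraction(0.6, 0.3, 0.6))   # R0 = 0.8 -> dies out
```

    When awareness scales the effective transmission rate so that R0 = beta_eff/gamma drops below 1, the infection dies out, mirroring the stability result quoted in the abstract.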

  12. Deactivation and Decommissioning Planning and Analysis with Geographic Information Systems

    International Nuclear Information System (INIS)

    Bollinger, James S.; Koffman, Larry D.; Austin, William E.

    2008-01-01

    From the mid-1950's through the 1980's, the U.S. Department of Energy's Savannah River Site produced nuclear materials for the weapons stockpile, for medical and industrial applications, and for space exploration. Although SRS has a continuing defense-related mission, the overall site mission is now oriented toward environmental restoration and management of legacy chemical and nuclear waste. With the change in mission, SRS no longer has a need for much of the infrastructure developed to support the weapons program. This excess infrastructure, which includes over 1000 facilities, will be decommissioned and demolished over the forthcoming years. Dispositioning facilities for decommissioning and deactivation requires significant resources to determine hazards, structure type, and a rough-order-of-magnitude estimate for the decommissioning and demolition cost. Geographic information systems (GIS) technology was used to help manage the process of dispositioning infrastructure and for reporting the future status of impacted facilities. Several thousand facilities of various ages and conditions are present at SRS. Many of these facilities, built to support previous defense-related missions, now represent a potential hazard and cost for maintenance and surveillance. To reduce costs and the hazards associated with this excess infrastructure, SRS has developed an ambitious plan to decommission and demolish unneeded facilities in a systematic fashion. GIS technology was used to assist development of this plan by: providing locational information for remote facilities, identifying the location of known waste units adjacent to buildings slated for demolition, and providing a powerful visual representation of the impact of the overall plan. Several steps were required for the development of the infrastructure GIS model. The first step involved creating an accurate and current GIS representation of the infrastructure data. This data is maintained in a Computer Aided Design

  13. A collection and information analysis of the experiment with microcomputer

    International Nuclear Information System (INIS)

    Mohd Ariffin bin Aton; Ler Leong Tat

    1985-01-01

    A microcomputer-based system for the continuous collection and analysis of data from a fermentor is described. The system was designed around commercially available hardware and interfaces, and software packages written for microcomputers. Additional programmes were written in BASIC to allow the results to be printed in a specific format. The data read from the fermentor were automatically stored on a floppy disc, and analysis of the data can be performed at any convenient time. Such a method of data collection is not limited to a bioreactor, however, since instruments that require continuous, accurate readings, such as GLC, HPLC, etc., could be coupled to a microcomputer system. (author)

  14. A Market Analysis Of The Commercial Traffic Information Business

    Science.gov (United States)

    1994-03-01

    The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding user response and market de...

  15. Spatial Analysis of Soil Fertility Using Geographical Information ...

    African Journals Online (AJOL)

    The research evaluated soil fertility condition of River Otamiri watershed in southeastern Nigeria in relation to topographic heterogeneity using GIS technique. GPS was used to determine the geodetic coordinate of the sampling points and site elevation. Soil samples were collected and analyzed using standard soil analysis ...

  16. Transport network extensions for accessibility analysis in geographic information systems

    NARCIS (Netherlands)

    Jong, Tom de; Tillema, T.

    2005-01-01

    In many developed countries high quality digital transport networks are available for GIS based analysis. Partly this is due to the requirements of route planning software for internet and car navigation systems. Properties of these networks include, among others, road quality attributes,

  17. A database analysis of information on multiply charged ions

    International Nuclear Information System (INIS)

    Delcroix, J.L.

    1989-01-01

    A statistical analysis of data related to multiply charged ions is performed in the GAPHYOR database: overall statistics by ionization degree from q=1 to q=99, 'historical' development from 1975 to 1987, and distribution (for q ≥ 5) over physical processes (energy levels, charge exchange, ...) and chemical elements

  18. The analysis of network transmission method for welding robot information

    Science.gov (United States)

    Cheng, Weide; Zhang, Hua; Liu, Donghua; Wang, Hongbo

    2012-01-01

    On the basis of the User Datagram Protocol (UDP), a welding robot network communication protocol (WRNCP) is designed with improvements at the transport and application layers of the TCP/IP stack. According to the characteristics of video data, a broadcast push-type transmission method (Broadcast Push Model, BPM) is designed, improving the efficiency and stability of video transmission, and a network information transmission system is designed for real-time network control of the welding robot.
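    The abstract gives no protocol details, so the sketch below is a hypothetical minimal UDP push exchange: a sender pushes sequence-numbered datagrams (standing in for video frame chunks) and a receiver checks them for loss. The 4-byte sequence header is an assumption for illustration, not the WRNCP format; loopback unicast stands in for broadcast.

```python
import socket

# Receiver: bind to the loopback interface on an OS-assigned free port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
addr = receiver.getsockname()

# Sender: push sequence-numbered datagrams, each carrying a payload chunk.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(3):
    payload = seq.to_bytes(4, "big") + b"frame-chunk"
    sender.sendto(payload, addr)

# Receiver: read the datagrams back and extract the sequence numbers,
# which a real push protocol would use to detect lost frames.
received = []
receiver.settimeout(2.0)
for _ in range(3):
    data, _ = receiver.recvfrom(2048)
    received.append(int.from_bytes(data[:4], "big"))

print("received sequence numbers:", received)
sender.close()
receiver.close()
```

    Because UDP offers no delivery guarantee, a push-style design like the one described typically layers sequence numbers (and possibly retransmission or redundancy) on top, trading reliability overhead for the low latency that real-time robot video requires.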

  19. Analysis of Information Quality in event triggered Smart Grid Control

    DEFF Research Database (Denmark)

    Kristensen, Thomas le Fevre; Olsen, Rasmus Løvenstein; Rasmussen, Jakob Gulddahl

    2015-01-01

    The integration of renewable energy sources into the power grid requires added control intelligence which imposes new communication requirements onto the future power grid. Since large scale implementation of new communication infrastructure is infeasible, we consider methods of increasing depend...... access scheme depends on network conditions as well as trade-offs between information quality, network resources and control reactivity.

  20. Crowdsourcing prior information to improve study design and data analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Chrabaszcz

    Full Text Available Though Bayesian methods are being used more frequently, many still struggle with the best method for setting priors with novel measures or task environments. We propose a method for setting priors by eliciting continuous probability distributions from naive participants. This allows us to include any relevant information participants have for a given effect. Even when prior means are near zero, this method provides a principled way to estimate dispersion and produce shrinkage, reducing the occurrence of overestimated effect sizes. We demonstrate this method with a number of published studies and compare the effect of different prior estimation and aggregation methods.
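    A minimal sketch of the idea, assuming each elicited distribution is summarized as a normal (mean, sd) pair and pooled by moment matching on an equal-weight mixture; the numbers are invented, and the paper elicits full continuous distributions rather than normal summaries.

```python
import numpy as np

# Elicited priors from naive participants, each summarized as (mean, sd).
# (Hypothetical values for illustration.)
elicited = [(0.1, 0.8), (0.0, 0.5), (0.3, 1.0), (-0.1, 0.6)]

# Pool by moment matching on an equal-weight mixture of the elicited normals.
means = np.array([m for m, _ in elicited])
sds = np.array([s for _, s in elicited])
prior_mean = means.mean()
prior_var = (sds**2 + means**2).mean() - prior_mean**2

# Conjugate normal update with a small noisy sample: the informative prior
# shrinks the estimate toward the pooled prior mean.
data = np.array([1.2, 0.9, 1.5, 0.8, 1.1])
obs_var = 1.0
post_var = 1.0 / (1.0 / prior_var + len(data) / obs_var)
post_mean = post_var * (prior_mean / prior_var + data.sum() / obs_var)

print(f"sample mean {data.mean():.3f} -> posterior mean {post_mean:.3f}")
```

    The posterior mean lands between the pooled prior mean and the sample mean, which is the shrinkage effect the abstract describes: near-zero elicited means with finite dispersion pull exaggerated sample effects back toward plausible values.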

  1. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  2. Knowledge-Based Information Management for Watershed Analysis in the Pacific Northwest U.S.

    Science.gov (United States)

    Keith Reynolds; Richard Olson; Michael Saunders; Donald Latham; Michael Foster; Bruce Miller; Lawrence Bednar; Daniel Schmoldt; Patrick Cunningham; John Steffenson

    1996-01-01

    We are developing a knowledge-based information management system to provide decision support for watershed analysis in the Pacific Northwest region of the U.S. The system includes: (1) a GIS interface that allows users to graphically navigate to specific provinces and watersheds and display a variety of themes and other area-specific information, (2) an analysis...

  3. Components of spatial information management in wildlife ecology: Software for statistical and modeling analysis [Chapter 14

    Science.gov (United States)

    Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman

    2010-01-01

    Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...

  4. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Science.gov (United States)

    2010-09-24

    ... Decision Information System) AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of public... 2010 version of the Causal Analysis/Diagnosis Decision Information System (CADDIS). This Web site was developed to help scientists find, develop, organize, and use environmental information to improve causal...

  5. Temporal abstraction for the analysis of intensive care information

    International Nuclear Information System (INIS)

    Hadad, Alejandro J; Evin, Diego A; Drozdowicz, Bartolome; Chiotti, Omar

    2007-01-01

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. This scheme is oriented to obtaining a description of the evolution of the patient's state in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset a trend-based Temporal Abstraction mechanism is proposed, by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme allows descriptions of the intermediate states through which the patient passes to be obtained, which could be used to anticipate alert situations.
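    Only the trend-abstraction step is sketched below (the fuzzy-logic and self-organizing-map stages are omitted): windows of a monitored series are labelled by the sign of a least-squares slope. The window size and slope threshold are arbitrary choices for illustration, not the paper's parameters.

```python
def trend_abstraction(values, window=5, threshold=0.1):
    """Label consecutive windows of a series as increasing / steady /
    decreasing from the slope of a least-squares line fit to each window."""
    labels = []
    xs = list(range(window))
    x_mean = sum(xs) / window
    denom = sum((x - x_mean) ** 2 for x in xs)
    for start in range(0, len(values) - window + 1, window):
        ys = values[start:start + window]
        y_mean = sum(ys) / window
        slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / denom
        if slope > threshold:
            labels.append("increasing")
        elif slope < -threshold:
            labels.append("decreasing")
        else:
            labels.append("steady")
    return labels

# e.g. a monitored vital sign rising, holding steady, then falling
signal = [1, 2, 3, 4, 5, 5, 5, 5, 5, 5, 5, 4, 3, 2, 1]
print(trend_abstraction(signal))  # -> ['increasing', 'steady', 'decreasing']
```

    Sequences of such symbolic labels are the kind of abstraction a downstream classifier can consume in place of the raw time-stamped values.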

  6. Demand Analysis of Logistics Information Matching Platform: A Survey from Highway Freight Market in Zhejiang Province

    Science.gov (United States)

    Chen, Daqiang; Shen, Xiahong; Tong, Bing; Zhu, Xiaoxiao; Feng, Tao

    With increasing competition in the logistics industry and the demand for lower logistics costs, the construction of a logistics information matching platform for highway transportation plays an important role, and the accuracy of the platform design is key to its successful operation. Based on survey results from logistics service providers, customers and regulatory authorities on access to information, and on an in-depth analysis of the information demands on a logistics information matching platform for highway transportation in Zhejiang province, a survey-based analysis of the framework of the platform is provided.

  7. Directory of Federally Supported Information Analysis Centers, 1979. Fourth Edition.

    Science.gov (United States)

    1979-01-01


  8. Analysis of the interests of Google users on toothache information.

    Directory of Open Access Journals (Sweden)

    Matheus Lotto

    Full Text Available Knowledge of the health interests of a given population of Internet users might contribute to increasing the evidence on a community's dental needs and, consequently, to improving public health planning. The frequency of searches for specific issues on Google can be analyzed with Google Trends. In this study, we aimed to characterize the interest in toothache information among Google users from the United States, United Kingdom, Australia and Brazil. The monthly variation of relative search volume (RSV) and the lists of the main toothache-related queries were determined from January 2004 through April 2016 using Google Trends. Autoregressive Integrated Moving Average (ARIMA) forecasting models were developed to determine predictive RSV values for toothache for an additional 12 months. Autocorrelation plots and a general additive model (GAM) were applied to determine trends and seasonality in the RSV curves. Through linear regression models, we assessed the association between the variation of annual means of RSV values and national statistics for toothache in the U.S. and U.K. Subsequently, the distribution of the main queries according to the identification of endodontic pain, the type of information searched, and the interest in self-management of toothache was evaluated for the four countries. The autocorrelation plots showed patterns of non-stationary time series. Monthly variation influenced the data of the U.S. and U.K., with the highest RSV values found in January/July and December, respectively. Also, the interest in toothache in the U.K. increases in the second semester and in the fourth quarter, especially in December. Additionally, an annual variation significantly affected all time series, with RSV means increasing over the years, varying from 265% in the U.S. to 745% in Brazil. In parallel, increments in RSV values were also observed in all predictive curves. The annual variation of observed and fitted RSV values was
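    The seasonality check via autocorrelation plots mentioned above reduces to computing the sample autocorrelation at candidate lags: in a monthly series, a peak at lag 12 signals annual seasonality. The series below is a synthetic stand-in with period-12 spikes, not the study's RSV data.

```python
# Sample autocorrelation at lag k (a standard estimator; the data are a
# synthetic monthly series, not actual relative-search-volume values).

def autocorr(xs, k):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[t] - mean) * (xs[t + k] - mean) for t in range(n - k))
    return cov / var

# A toy 5-year monthly series with a spike every 12th month.
series = [1.0 if t % 12 == 0 else 0.0 for t in range(60)]
print(round(autocorr(series, 12), 2), round(autocorr(series, 6), 2))  # → 0.8 -0.08
```

    A strong positive value at lag 12 next to a near-zero (or negative) value at other lags is the signature an autocorrelation plot makes visible.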

  9. Overall analysis of meteorological information in the Daeduk nuclear complex

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eun Han; Lee, Yung Bok; Han, Moon Heui; Suh, Kyung Suk; Hwang, Won Tae [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1995-01-01

    Inspection and repair of the tower structure and lift, and instrument calibration, have been carried out. Wireless data transmission to MIPS (Meteorological Information Processing System) follows collection in the DAS, where environmental assessment can be performed by the developed simulation programs for both normal operation and emergency cases. Wind direction, wind speed, temperature and humidity at 67 m, 27 m and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation and visibility at the surface, have been measured and analyzed with statistical methods. At the site, the prevailing wind directions were SW in the spring and summer seasons, and N and NW in the autumn and winter seasons. Calm conditions accounted for 13.6% of observations at 67 m, 24.5% at 27 m, and 40.8% at 10 m height. 4 figs, 9 tabs, 6 refs. (Author).

  10. Overall analysis of meteorological information in the Daeduk nuclear complex

    International Nuclear Information System (INIS)

    Kim, Byung Woo; Lee, Young Bok; Han, Moon Hee; Kim, Eun Han; Suh, Kyung Suk; Hwang, Won Tae

    1994-01-01

    Inspection and repair of the tower structure and lift, and instrument calibration, have been carried out along with an update of the DAS (data acquisition system). Wind direction, wind speed, temperature and humidity at 67 m, 27 m and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation and visibility at the surface, have been measured and analyzed with statistical methods. Wireless data transmission to MIPS (Meteorological Information Processing System) follows collection in the DAS, where environmental assessment can be performed by the developed simulation programs for both normal operation and emergency cases. The meteorological data resulting from this project were used in the reports 'Environmental Impact Assessment of the Korean Multi-purpose Research Reactor' and 'Site Selection of Meteorological Tower and Environment Impact Assessment of the Cooling Tower of the Korean Multi-purpose Research Reactor'. (Author)

  11. Developing a Value of Information (VoI) Enabled System from Collection to Analysis

    Science.gov (United States)

    2016-11-01

    ARL-TN-0797 ● NOV 2016 ● US Army Research Laboratory. Developing a Value of Information (VoI)-Enabled System from Collection to Analysis, by Mark R Mittrick, John...

  12. HIV/AIDS information by African companies: an empirical analysis.

    Science.gov (United States)

    Barako, Dulacha G; Taplin, Ross H; Brown, Alistair M

    2010-01-01

    This article investigates the extent of Human Immunodeficiency Virus/Acquired Immune Deficiency Syndrome Disclosures (HIV/AIDSD) in online annual reports by 200 listed companies from 10 African countries for the year ending 2006. Descriptive statistics reveal a very low level of overall HIV/AIDSD practices with a mean of 6 per cent disclosure, with half (100 out of 200) of the African companies making no disclosures at all. Logistic regression analysis reveals that company size and country are highly significant predictors of any disclosure of HIV/AIDS in annual reports. Profitability is also statistically significantly associated with the extent of disclosure.

  13. 2004/2008 labour market information comparative analysis report

    International Nuclear Information System (INIS)

    2009-01-01

    The electricity sector has entered into a phase of both challenges and opportunities. Challenges include workforce retirement, labour shortages, and increased competition from other employers to attract and retain the skilled people required to deliver on the increasing demand for electricity in Canada. The electricity sector in Canada is also moving into a new phase, whereby much of the existing infrastructure is either due for significant upgrades, or complete replacement. The increasing demand for electricity means that increased investment and capital expenditure will need to be put toward building new infrastructure altogether. The opportunities for the electricity industry will lie in its ability to effectively and efficiently react to these challenges. The purpose of this report was to provide employers and stakeholders in the sector with relevant and current trend data to help them make appropriate policy and human resource decisions. The report presented a comparative analysis of a 2004 Canadian Electricity Association employer survey with a 2008 Electricity Sector Council employer survey. The comparative analysis highlighted trends and changes that emerged between the 2004 and 2008 studies. Specific topics that were addressed included overall employment trends; employment diversity in the sector; age of non-support staff; recruitment; and retirements and pension eligibility. Recommendations were also offered. It was concluded that the electricity sector could benefit greatly from implementing on-going recruitment campaigns. refs., tabs., figs

  14. Value of Information Analysis Project Gnome Site, New Mexico

    International Nuclear Information System (INIS)

    Pohll, Greg; Chapman, Jenny

    2010-01-01

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.

  15. ERROR ANALYSIS ON INFORMATION AND TECHNOLOGY STUDENTS’ SENTENCE WRITING ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Rentauli Mariah Silalahi

    2015-03-01

    Full Text Available Students' error analysis is very important for helping EFL teachers develop their teaching materials, assessments and methods. However, it takes much time and effort for teachers to carry out such an error analysis of their students' language. This study seeks to identify the common errors made by one class of 28 freshman students studying English in their first semester at an IT university. The data were collected from their writing assignments over eight consecutive weeks. The errors found were classified into 24 types, and the ten most common errors committed by the students concerned articles, prepositions, spelling, word choice, subject-verb agreement, auxiliary verbs, plural forms, verb forms, capital letters, and meaningless sentences. The findings about the students' frequency of committing errors were then contrasted with their midterm test results, and, to find out the reasons behind the error recurrence, the students were asked to answer a questionnaire. Most of the students admitted that carelessness was the major reason for their errors, with lack of understanding coming next. This study suggests that EFL teachers devote time to continuously checking the students' language and giving corrections, so that the students can learn from their errors and stop committing the same ones.

  16. The Readability of Electronic Cigarette Health Information and Advice: A Quantitative Analysis of Web-Based Information.

    Science.gov (United States)

    Park, Albert; Zhu, Shu-Hong; Conway, Mike

    2017-01-06

    The popularity and use of electronic cigarettes (e-cigarettes) has increased across all demographic groups in recent years. However, little is currently known about the readability of health information and advice aimed at the general public regarding the use of e-cigarettes. The objective of our study was to examine the readability of publicly available health information as well as advice on e-cigarettes. We compared information and advice available from US government agencies, nongovernment organizations, English speaking government agencies outside the United States, and for-profit entities. A systematic search for health information and advice on e-cigarettes was conducted using search engines. We manually verified search results and converted to plain text for analysis. We then assessed readability of the collected documents using 4 readability metrics followed by pairwise comparisons of groups with adjustment for multiple comparisons. A total of 54 documents were collected for this study. All 4 readability metrics indicate that all information and advice on e-cigarette use is written at a level higher than that recommended for the general public by National Institutes of Health (NIH) communication guidelines. However, health information and advice written by for-profit entities, many of which were promoting e-cigarettes, were significantly easier to read. A substantial proportion of potential and current e-cigarette users are likely to have difficulty in fully comprehending Web-based health information regarding e-cigarettes, potentially hindering effective health-seeking behaviors. To comply with NIH communication guidelines, government entities and nongovernment organizations would benefit from improving the readability of e-cigarettes information and advice. ©Albert Park, Shu-Hong Zhu, Mike Conway. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 06.01.2017.
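    Readability metrics of the kind used in this study are simple functions of word, sentence, and syllable counts. As an illustration (the record does not name its four metrics), here is the widely used Flesch-Kincaid grade level with a crude vowel-group syllable counter:

```python
import re

# Flesch-Kincaid grade level, one common readability metric. The
# vowel-group syllable counter is a rough approximation, adequate
# only for illustrating how such metrics are computed.

def count_syllables(word):
    # Count runs of consecutive vowels as syllables (crude heuristic).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

print(round(fk_grade("The cat sat on the mat."), 2))  # → -1.45
```

    Higher grades mean harder text; health-communication guidelines typically target well below the grade levels such metrics report for the documents studied here.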

  17. Value of information analysis for corrective action unit No. 98: Frenchman Flat

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    A value of information analysis has been completed as part of the corrective action process for Frenchman Flat, the first Nevada Test Site underground test area to be scheduled for the corrective action process. A value of information analysis is a cost-benefit analysis applied to the acquisition of new information which is needed to reduce the uncertainty in the prediction of a contaminant boundary surrounding underground nuclear tests in Frenchman Flat. The boundary location will be established to protect human health and the environment from the consequences of using contaminated groundwater on the Nevada Test Site. Uncertainties in the boundary predictions are assumed to be the result of data gaps. The value of information analysis in this document compares the cost of acquiring new information with the benefit of acquiring that information during the corrective action investigation at Frenchman Flat. Methodologies incorporated into the value of information analysis include previous geological modeling, groundwater flow modeling, contaminant transport modeling, statistics, sensitivity analysis, uncertainty analysis, and decision analysis.
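    The cost-benefit core of a value of information analysis can be sketched with the textbook expected value of perfect information (EVPI): new data are worth collecting only if their expected benefit exceeds their cost. The states, probabilities, and utilities below are hypothetical, not Frenchman Flat figures.

```python
# Expected value of perfect information (EVPI) for a two-action decision,
# the textbook core of a value-of-information analysis.
# States, probabilities, and utilities below are hypothetical.

def evpi(probs, utilities):
    """utilities[a][s]: utility of action a in state s; probs[s]: P(state s)."""
    # Best expected utility acting now, without more information.
    prior_best = max(sum(p * u for p, u in zip(probs, row)) for row in utilities)
    # Expected utility if the true state were known before acting.
    perfect = sum(p * max(row[s] for row in utilities) for s, p in enumerate(probs))
    return perfect - prior_best

probs = [0.75, 0.25]                  # e.g. P(no migration), P(migration)
utilities = [[0, -100], [-20, -20]]   # e.g. act now vs. mitigate regardless
print(evpi(probs, utilities))         # → 15.0
```

    Any field campaign costing less than the EVPI (here, 15 utility units) could in principle pay for itself; a full analysis replaces "perfect" information with the imperfect information a specific experiment would deliver.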

  18. Value of information analysis for corrective action unit No. 98: Frenchman Flat

    International Nuclear Information System (INIS)

    1997-06-01

    A value of information analysis has been completed as part of the corrective action process for Frenchman Flat, the first Nevada Test Site underground test area to be scheduled for the corrective action process. A value of information analysis is a cost-benefit analysis applied to the acquisition of new information which is needed to reduce the uncertainty in the prediction of a contaminant boundary surrounding underground nuclear tests in Frenchman Flat. The boundary location will be established to protect human health and the environment from the consequences of using contaminated groundwater on the Nevada Test Site. Uncertainties in the boundary predictions are assumed to be the result of data gaps. The value of information analysis in this document compares the cost of acquiring new information with the benefit of acquiring that information during the corrective action investigation at Frenchman Flat. Methodologies incorporated into the value of information analysis include previous geological modeling, groundwater flow modeling, contaminant transport modeling, statistics, sensitivity analysis, uncertainty analysis, and decision analysis

  19. Characteristics analysis of CANDU PSA for risk-informed application

    International Nuclear Information System (INIS)

    Yang, Joon Eon; Jeong, Jong Tae; Kim, See Dal

    2005-01-01

    Recently, risk-informed applications (RIA) have become a worldwide issue for the nuclear industry. In this area, the U.S.A. plays a leading role in developing the present RIA framework, and other countries have adopted and/or modified the U.S. RIA framework for their own purposes. Nowadays, the Korean nuclear industry is trying to introduce RIA into Korea, including for the CANDU reactor. The present RIA framework has been developed in the U.S.A. for light water reactors such as the PWR (Pressurized Water Reactor) and/or BWR (Boiling Water Reactor). So, if we want to use this RIA framework for other types of reactors, such as the CANDU reactor, we have to review the applicability of the present RIA framework to those reactor types. In this respect, we have to consider two factors: (1) the definition of risk measures and (2) the PSA techniques used. In this paper, we have reviewed the characteristics of the CANDU PSA, i.e. the Wolsong 2/3/4 PSA, and we have performed sensitivity analyses to identify the issues to be resolved for a CANDU RIA framework

  20. Information entropy analysis of leopard seal vocalization bouts

    Science.gov (United States)

    Buck, John R.; Rogers, Tracey L.; Cato, Douglas H.

    2004-05-01

    Leopard seals (Hydrurga leptonyx) are solitary pinnipeds who are vocally active during their brief breeding season. The seals produce vocal bouts consisting of a sequence of distinct sounds, with an average length of roughly ten sounds. The sequential structure of the bouts is thought to be individually distinctive. Bouts recorded from five leopard seals during 1992-1994 were analyzed using information theory. The first-order Markov model entropy estimates were substantially smaller than the independent, identically distributed model entropy estimates for all five seals, indicative of constraints on the sequential structure of each seal's bouts. Each bout in the data set was classified using maximum-likelihood estimates from the first-order Markov model for each seal. This technique correctly classified 85% of the bouts, comparable to results in Rogers and Cato [Behaviour (2002)]. The relative entropies between the Markov models were found to be infinite in 18/20 possible cross-comparisons, indicating there is no probability of misclassifying the bouts in these 18 comparisons in the limit of long data sequences. One seal has sufficient data to compare a nonparametric entropy estimate with the Markov entropy estimate, finding only a small difference. This suggests that the first-order Markov model captures almost all the sequential structure in this seal's bouts.
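    The entropy comparison described in this record can be reproduced in miniature: estimate the zero-order (i.i.d.) entropy and the first-order Markov entropy of a symbol sequence and compare them. A large gap indicates strong sequential constraints. The call sequence below is invented.

```python
from collections import Counter
from math import log2

# Zero-order (i.i.d.) vs. first-order Markov entropy estimates for a
# symbol sequence, as in the bout analysis above (the sequence is invented).

def iid_entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

def markov_entropy(seq):
    # H = sum over pairs of P(a,b) * -log2 P(b | a), from bigram counts.
    pairs = Counter(zip(seq, seq[1:]))
    starts = Counter(seq[:-1])
    h = 0.0
    for (a, b), c in pairs.items():
        p_pair = c / (len(seq) - 1)
        p_cond = c / starts[a]
        h -= p_pair * log2(p_cond)
    return h

bout = "ABABABABAB"  # strictly alternating calls: maximal sequential constraint
print(iid_entropy(bout), markov_entropy(bout))  # → 1.0 0.0
```

    For the alternating toy bout, the i.i.d. estimate is 1 bit per symbol while the Markov estimate is 0: each call is fully determined by its predecessor, the extreme case of the constraint the study reports.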

  1. URBAN TRAFFIC ACCIDENT ANALYSIS BY USING GEOGRAPHIC INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Meltem SAPLIOĞLU

    2006-03-01

    Full Text Available In recent years, traffic accidents, which cause greater social and economic losses than natural disasters, have become a national problem in Turkey. To solve this problem and reduce casualties, road safety programs are being developed. Because of the limited budgets allocated to such programs, it is necessary to develop the most effective measures at low investment cost. The most important step is to identify the dangerous locations of traffic accidents and to improve these sections from a road safety point of view. New technologies are driving a cycle of continuous improvement that causes rapid changes in traffic engineering and the engineering services within it, and these developing services have the potential to give forward-thinking engineering studies a more influential role. In this study, a Geographic Information System (GIS) was used to identify the hazardous locations of traffic accidents in Isparta. The Isparta city map was digitized using ArcInfo 7.21. Traffic accident reports from 1998-2002, obtained from the Directory of the Isparta Traffic Region, were used to form the database. The topology was set up using crash diagrams and geographic position reference systems. Tables were formed according to the results obtained and interpreted.
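    The hazardous-location identification described above amounts, in its simplest form, to binning accident coordinates into grid cells and ranking the cells by count. The coordinates and cell size below are invented, not Isparta data.

```python
from collections import Counter

# Grid-based hotspot counting, a minimal stand-in for the GIS
# hazardous-location analysis above (all coordinates are invented).

def hotspots(points, cell=0.01, top=2):
    """Bin (lat, lon) points into a square grid and rank cells by count."""
    cells = Counter((round(lat / cell), round(lon / cell)) for lat, lon in points)
    return cells.most_common(top)

accidents = [(37.76, 30.55), (37.76, 30.55), (37.764, 30.553), (37.80, 30.60)]
print(hotspots(accidents))  # top cell collects the three clustered accidents
```

    A real GIS workflow would additionally snap points to the road network and weight by severity, but the ranking idea is the same.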

  2. [Visualization analysis of literature information in Journal of Hygiene Research].

    Science.gov (United States)

    Pan, Xiaojie; Zhang, Chichen; Zheng, Jianzhong; Su, Chunhui; Hu, Weihong; Zhao, Haifeng; Mi, Yuqian; Hao, Congying

    2016-03-01

    To grasp the research status and evolution of the Journal of Hygiene Research since it started publishing, and to track research hot spots and developing trends in this field. Using bibliometric methods and the information visualization software CiteSpace III, the quantity of published literature, supporting funds, institutions, authors and keywords from 6775 articles published in the Journal of Hygiene Research from 1972 to 2014 were analyzed. The number of articles published in the Journal of Hygiene Research increased in waves, with a peak in 1995. Institutions of the Chinese Center for Disease Control and Prevention, such as the National Institute for Nutrition and Health, the Institute of Environmental Health and Related Product Safety, and the National Institute of Occupational Health and Poison Control, together with some medical colleges, were the most productive. The scholars with the largest number of publications were YANG Xiaoguang, YIN Shian and PIAO Jianhua, researchers at the National Institute for Nutrition and Health, Chinese Center for Disease Control and Prevention, each with more than 75 articles published. The research contents included influencing factors, related concepts, diseases, methods and objects. "Mutagenicity", "apoptosis", "lead poisoning", "HPLC" and "rat" were the research focuses in this field. There was substantial collaboration across the articles published in the Journal of Hygiene Research. The centers for disease control and prevention in different regions and the universities pay attention to cooperation; research teams with members of various ages collaborate and grow together, forming a close, complex collaborative network among authors, which promotes the development of the journal and its research fields.

  3. An interview-informed synthesized contingency analysis to inform the treatment of challenging behavior in a young child with autism.

    Science.gov (United States)

    Herman, Ciara; Healy, Olive; Lydon, Sinéad

    2018-04-01

    Experimental functional analysis (EFA) is considered the "gold standard" of behavioural assessment, and its use is predictive of treatment success. However, EFA has a number of limitations, including its lengthy nature, the high level of expertise required, and the reinforcement of challenging behaviour. This study aimed to further validate a novel interview-informed synthesised contingency analysis (IISCA). An open-ended interview and brief direct observation informed an IISCA for a young boy with autism who engaged in challenging behaviour. The resulting data supported the hypothesis that the target behaviour was multiply controlled by escape from demands and access to tangible items. An intervention comprising most-to-least prompting, escape extinction, differential reinforcement and a high-probability instruction sequence was evaluated using a reversal design. This intervention reduced challenging behaviour to low levels and resulted in increased compliance. The findings support the status of the IISCA as a valid, practical, and effective process for designing function-based interventions.

  4. AN ANALYSIS OF SALES INFORMATION SYSTEM AND COMPETITIVE ADVANTAGE (Study Case of UD. Citra Helmet)

    Directory of Open Access Journals (Sweden)

    Hendra Alianto

    2012-10-01

    Full Text Available Business development in this era of globalization leads companies to use information systems in running business relationships, changing the traditional way of working with non-integrated information systems into integrated information systems. The intended use of an integrated information system improves effectiveness and efficiency, such as the availability of real-time, accurate and informative information for decision-making, both for operational activities and for strategic interests and the company's business development. In particular, applying a sales information system will improve the company's performance and affect its competitiveness, which can ultimately maximize profit. In reality, however, it is not easy to implement an integrated information system, because implementation is influenced by the customs, culture and mindset of the users in the company. It is therefore necessary to carry out systems analysis and build an integrated information system that addresses the needs of users, management, customers and stakeholders. The implementation of an integrated information system will increase productivity and bring the company's operations to effective and efficient levels, and the analysis of the sales information system will affect the competitiveness of the company. Keywords: Sales Information System Analysis

  5. Information Switching Processor (ISP) contention analysis and control

    Science.gov (United States)

    Inukai, Thomas

    1995-01-01

    In designing a satellite system with on-board processing, the selection of a switching architecture is often critical. The on-board switching function can be implemented by circuit switching or packet switching. Destination-directed packet switching has several attractive features, such as self-routing without on-board switch reconfiguration, no switch control memory requirement, efficient bandwidth utilization for packet-switched traffic, and accommodation of circuit-switched traffic. Destination-directed packet switching, however, has two potential concerns: (1) contention and (2) congestion; this report deals specifically with the first. It includes a description and analysis of various self-routing switch structures, the nature of contention problems, and contention resolution techniques.
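    The output-port contention problem described above can be illustrated with a toy Monte Carlo model: each input addresses a packet to a uniformly random output, and any packet beyond the first per output contends for buffering or is dropped. The port count and round count below are arbitrary.

```python
import random

# Toy model of output-port contention in a destination-directed packet
# switch: each of N inputs addresses a packet to a uniform random output,
# and packets beyond the first per output must be buffered or dropped.

def contention_rate(n_ports, rounds, seed=1):
    rng = random.Random(seed)
    blocked = 0
    for _ in range(rounds):
        dests = [rng.randrange(n_ports) for _ in range(n_ports)]
        # One packet per output wins arbitration; the rest contend.
        blocked += n_ports - len(set(dests))
    return blocked / (n_ports * rounds)

# Under uniform traffic the contending fraction approaches 1 - (1 - 1/N)^N,
# which tends to 1/e ≈ 0.37 for large N.
print(round(contention_rate(16, 2000), 2))
```

    Roughly a third of the packets contend per slot even at full uniform load, which is why self-routing fabrics pair the routing scheme with output buffering or other contention-resolution techniques.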

  6. Search Strategies in Nanotechnology Databases: Log Analysis (Journal of Information Processing and Management)

    Directory of Open Access Journals (Sweden)

    Nadjla Hariri

    2014-03-01

    Full Text Available A major concern of users of information systems is retrieving information related to their information needs; the query used by a user is a manifestation of those needs. The purpose of this research is to analyze nanotechnology databases through query analysis and the follow-up of users' navigation. The research method is transaction log analysis. The study was performed by analyzing the transaction files of the interaction between users and the databases. Results show that users of nanotechnology databases use three methods to retrieve the information they need: search engines, referral sites, and direct use. For direct use, the bounce rate was lower and more pages were visited. The average length of a query is 3.36 words, and simple search strategies are used to retrieve information.
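    Transaction log analysis of the kind described here starts from simple aggregates such as average query length and most frequent queries. The log lines below are invented examples, not records from the studied databases.

```python
from collections import Counter

# Minimal transaction-log style analysis: average query length in words
# and the most frequent query (the log entries are invented examples).

log = [
    "carbon nanotube synthesis",
    "graphene",
    "nano drug delivery systems",
    "graphene",
    "quantum dot toxicity",
]

lengths = [len(q.split()) for q in log]
avg_len = sum(lengths) / len(lengths)
top = Counter(log).most_common(1)

print(round(avg_len, 2), top)  # → 2.4 [('graphene', 2)]
```

    Real logs add timestamps, session IDs and referrers, which is what makes bounce-rate and navigation follow-up possible.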

  7. Thinking about information work of nuclear science and technology in the age of big data: speaking of the information analysis and research

    International Nuclear Information System (INIS)

    Chen Tieyong

    2014-01-01

    Human society is entering a new era in which structured and unstructured data are measured in petabytes (1 PB = 1024 TB). In the network era, with the development of mobile communications and electronic commerce and the emergence and growth of social networks, an era of large-scale production, sharing and application of data is opening. How to explore the value of data, master big data and extract useful information is an important task for science and technology information workers. This paper analyzes the development of nuclear science and technology information work in terms of acquiring, analyzing and applying big data. Information analysis and research will increasingly be based on all of the data rather than on random sampling, making it possible to let the data speak, and many results of information analysis and research can be expressed quantitatively. We should attach great importance to data collection and careful analysis of big data. The process of nuclear science and technology information analysis and research involves both professional division of labor and cooperation. In addition, we should strengthen the construction of nuclear science and technology information resources and improve information supply; strengthen the analysis and research of nuclear science and technology information and improve information services; strengthen the information management of nuclear science and technology, paying attention to security problems and intellectual property rights in information sharing; and strengthen personnel training to continuously improve the efficiency and performance of nuclear science and technology information work. In the age of big data, nuclear science and technology information workers should take information analysis and study as the core, grasping information collection with one hand and information service with the other, forging ahead with innovation and continuously improving their working ability

  8. Comparative analysis of regulatory information and circuits across distant species.

    Science.gov (United States)

    Boyle, Alan P; Araya, Carlos L; Brdlik, Cathleen; Cayting, Philip; Cheng, Chao; Cheng, Yong; Gardner, Kathryn; Hillier, LaDeana W; Janette, Judith; Jiang, Lixia; Kasper, Dionna; Kawli, Trupti; Kheradpour, Pouya; Kundaje, Anshul; Li, Jingyi Jessica; Ma, Lijia; Niu, Wei; Rehm, E Jay; Rozowsky, Joel; Slattery, Matthew; Spokony, Rebecca; Terrell, Robert; Vafeados, Dionne; Wang, Daifeng; Weisdepp, Peter; Wu, Yi-Chieh; Xie, Dan; Yan, Koon-Kiu; Feingold, Elise A; Good, Peter J; Pazin, Michael J; Huang, Haiyan; Bickel, Peter J; Brenner, Steven E; Reinke, Valerie; Waterston, Robert H; Gerstein, Mark; White, Kevin P; Kellis, Manolis; Snyder, Michael

    2014-08-28

    Despite the large evolutionary distances between metazoan species, they can show remarkable commonalities in their biology, and this has helped to establish fly and worm as model organisms for human biology. Although studies of individual elements and factors have explored similarities in gene regulation, a large-scale comparative analysis of basic principles of transcriptional regulatory features is lacking. Here we map the genome-wide binding locations of 165 human, 93 worm and 52 fly transcription regulatory factors, generating a total of 1,019 data sets from diverse cell types, developmental stages, or conditions in the three species, of which 498 (48.9%) are presented here for the first time. We find that structural properties of regulatory networks are remarkably conserved and that orthologous regulatory factor families recognize similar binding motifs in vivo and show some similar co-associations. Our results suggest that gene-regulatory properties previously observed for individual factors are general principles of metazoan regulation that are remarkably well-preserved despite extensive functional divergence of individual network connections. The comparative maps of regulatory circuitry provided here will drive an improved understanding of the regulatory underpinnings of model organism biology and how these relate to human biology, development and disease.

  9. Graph analysis of dream reports is especially informative about psychosis

    Science.gov (United States)

    Mota, Natália B.; Furtado, Raimundo; Maia, Pedro P. C.; Copelli, Mauro; Ribeiro, Sidarta

    2014-01-01

    Early psychiatry investigated dreams to understand psychopathologies. Contemporary psychiatry, which neglects dreams, has been criticized for lack of objectivity. In search of quantitative insight into the structure of psychotic speech, we investigated speech graph attributes (SGA) in patients with schizophrenia, bipolar disorder type I, and non-psychotic controls as they reported waking and dream contents. Schizophrenic subjects spoke with reduced connectivity, in tight correlation with negative and cognitive symptoms measured by standard psychometric scales. Bipolar and control subjects were undistinguishable by waking reports, but in dream reports bipolar subjects showed significantly less connectivity. Dream-related SGA outperformed psychometric scores or waking-related data for group sorting. Altogether, the results indicate that online and offline processing, the two most fundamental modes of brain operation, produce nearly opposite effects on recollections: While dreaming exposes differences in the mnemonic records across individuals, waking dampens distinctions. The results also demonstrate the feasibility of the differential diagnosis of psychosis based on the analysis of dream graphs, pointing to a fast, low-cost and language-invariant tool for psychiatric diagnosis and the objective search for biomarkers. The Freudian notion that "dreams are the royal road to the unconscious" is clinically useful, after all.
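    Speech graph attributes reduce a verbal report to a graph whose nodes are words and whose edges are consecutive-word transitions; reduced connectivity then appears as fewer distinct edges relative to nodes. The sketch below uses an invented report, not the study's clinical data.

```python
# A minimal word-graph sketch in the spirit of speech graph attributes:
# each word is a node and each consecutive word pair a directed edge
# (repeated transitions count once). The report text is invented.

def speech_graph(words):
    nodes = set(words)
    edges = set(zip(words, words[1:]))
    return len(nodes), len(edges)

report = "i dreamed i was walking and walking and then i was home".split()
n, e = speech_graph(report)
print(n, e, round(e / n, 2))  # → 7 9 1.29
```

    The published analyses go further (largest connected component, loops, graph density over fixed-size word windows), but all start from exactly this word-to-graph reduction.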

  10. Numerical Investigations into the Value of Information in Lifecycle Analysis of Structural Systems

    DEFF Research Database (Denmark)

    Konakli, Katerina; Sudret, Bruno; Faber, Michael Havbro

    2015-01-01

    Preposterior analysis can be used to assess the potential of an experiment to enhance decision-making by providing information on parameters of the decision problem that are surrounded by epistemic uncertainties. The present paper describes a framework for preposterior analysis for support of decisions related to maintenance of structural systems. In this context, experiments may refer to inspections or structural health monitoring. The value-of-information concept comprises a powerful tool for determining whether the experimental cost is justified by the expected gained benefit during … dependencies between the components of a system. Furthermore, challenges and potentials in value-of-information analysis for structural systems are discussed.
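
    The value-of-information idea can be illustrated with the simplest member of the family, the expected value of perfect information (EVPI) for a discrete decision problem. The function and the payoff structure below are a hypothetical sketch, not the paper's structural-maintenance formulation.

```python
def expected_value_of_perfect_information(p_states, utilities):
    """EVPI = expected utility when deciding after learning the true state,
    minus the expected utility of the best action chosen a priori.
    p_states: prior probabilities of the states of nature.
    utilities[a][s]: utility of action a if state s obtains."""
    # Best single action under the prior.
    best_prior = max(sum(p * u for p, u in zip(p_states, row))
                     for row in utilities)
    # With perfect information, pick the best action state by state.
    with_info = sum(p * max(row[s] for row in utilities)
                    for s, p in enumerate(p_states))
    return with_info - best_prior
```

    An experiment such as an inspection can be worth at most its EVPI; preposterior analysis generalizes this bound to imperfect (noisy) information.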

  11. Analysis and evolution of air quality monitoring networks using combined statistical information indexes

    Directory of Open Access Journals (Sweden)

    Axel Osses

    2013-10-01

    Full Text Available In this work, we present combined statistical indexes for evaluating air quality monitoring networks based on concepts derived from information theory and the Kullback–Leibler divergence. More precisely, we introduce: (1) the standard measure of complementary mutual information, or 'specificity' index; (2) a new measure of information gain, or 'representativity' index; (3) the information gaps associated with the evolution of a network; and (4) the normalised information distance used in clustering analysis. All these information concepts are illustrated by applying them to 14 years of data collected by the air quality monitoring network in Santiago de Chile (33.5° S, 70.5° W, 500 m a.s.l.). We find that downtown stations, located in a relatively flat area of the Santiago basin, generally show high 'representativity' and low 'specificity', whereas the contrary is found for a station located in a canyon to the east of the basin, consistent with known emission and circulation patterns of Santiago. We also show interesting applications of information gain to the analysis of the evolution of a network, where the choice of background information is also discussed, and of mutual information distance to the classification of stations. Our analyses show that indexes such as those presented here should be used in a complementary way when addressing the analysis of an air quality network for planning and evaluation purposes.
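
    The information-theoretic building blocks behind such indexes are straightforward to compute from discretized concentration data. The sketch below, with an assumed bin count and helper names of my own choosing, estimates an empirical distribution per station and the Kullback–Leibler divergence between two stations; the paper's combined indexes build on quantities of exactly this kind.

```python
import math
from collections import Counter

def histogram(series, bins, lo, hi):
    """Empirical probability distribution of a series over equal-width bins."""
    width = (hi - lo) / bins
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in series)
    n = len(series)
    return [counts.get(b, 0) / n for b in range(bins)]

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) in bits; eps regularizes empty bins against log(0)."""
    return sum(pi * math.log2((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))
```

    A station whose distribution diverges strongly from the rest of the network carries more station-specific information, which is the intuition behind a 'specificity'-style index.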

  12. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    Johnson, P.E.; Lester, P.B.

    1998-05-01

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model.

  13. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the temporal … to a fully realised example of information seeking activity. The core contribution of the proposed method is in supporting the analysis of activity with respect to both macro and micro level temporal interactions between variables.

  14. User Information Fusion Decision Making Analysis with the C-OODA Model

    Science.gov (United States)

    2011-07-01

    Observe-Orient-Decide-Act (C-OODA) model as a method of user and team analysis in the context of the Data Fusion Information Group (DFIG) Information Fusion Model. From the DFIG model [as an update to the Joint Directors of Laboratories (JDL) model], we look at Level 5 Fusion of "user refinement" in … OODA comparisons to the DFIG model support systems evaluation and analysis, as well as coordinating the time interval of interaction between the machine …

  15. Software Engineers' Information Seeking Behavior in Change Impact Analysis: An Interview Study

    OpenAIRE

    Borg, Markus; Alégroth, Emil; Runeson, Per

    2017-01-01

    Software engineers working in large projects must navigate complex information landscapes. Change Impact Analysis (CIA) is a task that relies on engineers' successful information seeking in databases storing, e.g., source code, requirements, design descriptions, and test case specifications. Several previous approaches to support information seeking are task-specific; thus, understanding engineers' seeking behavior in specific tasks is fundamental. We present an industrial case study on how en...

  16. Research of Classical and Intelligent Information System Solutions for Criminal Intelligence Analysis

    OpenAIRE

    Šimović, Vladimir

    2001-01-01

    The objective of this study is to present research on classical and intelligent information system solutions used in criminal intelligence analysis in Croatian security system theory. The study analyses objective and classical methods of information science, including artificial intelligence and other scientific methods. The intelligence and classical software solutions researched, proposed, and presented in this study were used in developing the integrated information system for the Croatian...

  17. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  18. Experiments in Discourse Analysis Impact on Information Classification and Retrieval Algorithms.

    Science.gov (United States)

    Morato, Jorge; Llorens, J.; Genova, G.; Moreiro, J. A.

    2003-01-01

    Discusses the inclusion of contextual information in indexing and retrieval systems to improve results and the ability to carry out text analysis by means of linguistic knowledge. Presents research that investigated whether discourse variables have an impact on information retrieval and classification algorithms. (Author/LRW)

  19. Needs of informal caregivers across the caregiving course in amyotrophic lateral sclerosis: a qualitative analysis.

    LENUS (Irish Health Repository)

    Galvin, Miriam

    2018-01-27

    Amyotrophic lateral sclerosis (ALS), also known as motor neuron disease (MND), is a debilitating terminal condition. Informal caregivers are key figures in ALS care provision. The physical, psychological and emotional impact of providing care in the home requires appropriate assistance and support. The objective of this analysis is to explore the needs of informal ALS caregivers across the caregiving course.

  20. Industry subsector analysis Australia: Air pollution control equipment. Export trade information

    Energy Technology Data Exchange (ETDEWEB)

    1992-12-01

    The market survey covers the air pollution control equipment market in Australia. The analysis contains statistical and narrative information on projected market demand; end-users; receptivity of Australian consumers to U.S. products; the competitive situation; and market access (tariffs, non-tariff barriers, standards, taxes, distribution channels). It also contains key contact information.

  1. Using Meta-Analysis to Inform the Design of Subsequent Studies of Diagnostic Test Accuracy

    Science.gov (United States)

    Hinchliffe, Sally R.; Crowther, Michael J.; Phillips, Robert S.; Sutton, Alex J.

    2013-01-01

    An individual diagnostic accuracy study rarely provides enough information to make conclusive recommendations about the accuracy of a diagnostic test; particularly when the study is small. Meta-analysis methods provide a way of combining information from multiple studies, reducing uncertainty in the result and hopefully providing substantial…

  2. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  3. Sheltering Children from the Whole Truth: A Critical Analysis of an Informational Picture Book.

    Science.gov (United States)

    Lamme, Linda; Fu, Danling

    2001-01-01

    Uses Orbis Pictus Award Committee criteria (accuracy, organization, design, and style) to examine an informational book, "Rice Is Life," by Rita Golden Gelman. Subjects the book to a deeper critical analysis. Suggests that it is important to help students become critical thinkers about everything they read, including informational books.…

  4. Self-Informant Agreement in Well-Being Ratings: A Meta-Analysis

    Science.gov (United States)

    Schneider, Leann; Schimmack, Ulrich

    2009-01-01

    A meta-analysis of published studies that reported correlations between self-ratings and informant ratings of well-being (life-satisfaction, happiness, positive affect, negative affect) was performed. The average self-informant correlation based on 44 independent samples and 81 correlations for a total of 8,897 participants was r = 0.42 [99%…
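
    Averaging correlations across samples, as in this meta-analysis, is conventionally done on Fisher's z scale with weights n − 3. The fixed-effect sketch below illustrates that standard approach; it is not necessarily the exact model the authors used.

```python
import math

def fisher_z(r):
    """Fisher's variance-stabilizing transform of a correlation."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    """Back-transform from the z scale to a correlation."""
    return math.tanh(z)

def pooled_correlation(rs, ns):
    """Fixed-effect pooled correlation: average Fisher-z values
    weighted by n - 3 (the inverse of the z-scale variance),
    then back-transform to the correlation scale."""
    ws = [n - 3 for n in ns]
    zbar = sum(w * fisher_z(r) for w, r in zip(ws, rs)) / sum(ws)
    return inv_fisher_z(zbar)
```

    Pooling on the z scale rather than averaging raw correlations avoids the bias introduced by the bounded, skewed sampling distribution of r.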

  5. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    Pericou-Cayere, M.; Lemaire, P.; Pace, J.-M.; Baude, S.; Samson, N.

    2015-01-01

    While looking for information in scientific databases, we are overwhelmed by the amount of information that we encounter. In this big data collection, getting information with added value could be strategic for nuclear verification. In our study, we have worked on "best practices" in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is made with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store data generated by bibliographic research, and analyzing data with selected tools. With analysis of bibliographic data alone, we are able to get: a panoramic view of countries that publish in the paradigm; co-publication networks; organizations that contribute to scientific publications; countries with which a country collaborates; areas of interest of a country; … So we are able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are important; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness in the use of open source S&T information and in the management of that information over time. Examples are shown. (author)

  6. Automatic Content Analysis; Part I of Scientific Report No. ISR-18, Information Storage and Retrieval...

    Science.gov (United States)

    Cornell Univ., Ithaca, NY. Dept. of Computer Science.

    Four papers are included in Part One of the eighteenth report on Salton's Magical Automatic Retriever of Texts (SMART) project. The first paper: "Content Analysis in Information Retrieval" by S. F. Weiss presents the results of experiments aimed at determining the conditions under which content analysis improves retrieval results as well…

  7. Auditing information structures in organizations: A review of data collection techniques for network analysis

    NARCIS (Netherlands)

    Koning, K.H.; de Jong, Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  8. Mobile Geospatial Information Systems for Land Force Operations: Analysis of Operational Needs and Research Opportunities

    Science.gov (United States)

    2010-03-01

    Hierarchical Task Analysis (HTA), Cognitive Work Analysis (CWA); military battlefield awareness, wayfind*, terrain navigation, route guidance, reconnaissance … where specific attention was paid to how the device met the military operational tasks and physical and technical considerations. The information … location as well as separating different geographical features on mobile GIS systems. In military settings, much attention, training and …

  9. BGI-RIS: an integrated information resource and comparative analysis workbench for rice genomics

    DEFF Research Database (Denmark)

    Zhao, Wenming; Wang, Jing; He, Ximiao

    2004-01-01

    To facilitate the application of the rice genomic information and to provide a foundation for functional and evolutionary studies of other important cereal crops, we implemented our Rice Information System (BGI-RIS), the most up-to-date integrated information resource as well as a workbench for comparative genomic analysis. In addition to comprehensive data from Oryza sativa L. ssp. indica sequenced by BGI, BGI-RIS also hosts carefully curated genome information from Oryza sativa L. ssp. japonica and EST sequences available from other cereal crops. In this resource, sequence contigs of indica (93-11) have been further assembled …

  10. Analysis of the accuracy and readability of herbal supplement information on Wikipedia.

    Science.gov (United States)

    Phillips, Jennifer; Lam, Connie; Palmisano, Lisa

    2014-01-01

    To determine the completeness and readability of information found in Wikipedia for leading dietary supplements and assess the accuracy of this information with regard to safety (including use during pregnancy/lactation), contraindications, drug interactions, therapeutic uses, and dosing. Cross-sectional analysis of Wikipedia articles. The contents of Wikipedia articles for the 19 top-selling herbal supplements were retrieved on July 24, 2012, and evaluated for organization, content, accuracy (as compared with information in two leading dietary supplement references) and readability. The main outcome measure was the accuracy of Wikipedia articles. No consistency was noted in how much information was included in each Wikipedia article, how the information was organized, what major categories were used, and where safety and therapeutic information was located in the article. All articles in Wikipedia contained information on therapeutic uses and adverse effects but several lacked information on drug interactions, pregnancy, and contraindications. Wikipedia articles had 26%-75% of therapeutic uses and 76%-100% of adverse effects listed in the Natural Medicines Comprehensive Database and/or Natural Standard. Overall, articles were written at a 13.5-grade level, and all were at a ninth-grade level or above. Articles in Wikipedia in mid-2012 for the 19 top-selling herbal supplements were frequently incomplete, of variable quality, and sometimes inconsistent with reputable sources of information on these products. Safety information was particularly inconsistent among the articles. Patients and health professionals should not rely solely on Wikipedia for information on these herbal supplements when treatment decisions are being made.
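
    Grade-level figures like the 13.5 reported above are typically produced by formulas such as Flesch-Kincaid. The study does not specify its readability tool, so the sketch below, which implements the standard Flesch-Kincaid grade formula with a crude vowel-run syllable counter, should be read as illustrative.

```python
def count_syllables(word):
    """Crude syllable estimate: count runs of vowels, minimum one."""
    vowels = "aeiouy"
    word = word.lower().strip(".,;:!?\"'()")
    runs, prev = 0, False
    for ch in word:
        is_vowel = ch in vowels
        if is_vowel and not prev:
            runs += 1
        prev = is_vowel
    return max(runs, 1)

def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = [s for s in
                 text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = text.split()
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)
```

    Long sentences and polysyllabic words both push the grade level up, which is why dense medical prose scores well above the ninth-grade threshold noted in the study.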

  11. The US Support Program Assistance to the IAEA Safeguards Information Technology, Collection, and Analysis 2008

    Energy Technology Data Exchange (ETDEWEB)

    Tackentien, J.

    2008-06-12

    One of the United States Support Program's (USSP) priorities for 2008 is to support the International Atomic Energy Agency's (IAEA) development of an integrated and efficient safeguards information infrastructure, including reliable and maintainable information systems, and effective tools and resources to collect and analyze safeguards-relevant information. The USSP has provided funding in support of this priority for the ISIS Re-engineering Project (IRP), and for human resources support to the design and definition of the enhanced information analysis architecture project (nVision). Assistance for several other information technology efforts is provided. This paper will report on the various ongoing support measures undertaken by the USSP to support the IAEA's information technology enhancements and will provide some insights into activities that the USSP may support in the future.

  12. The Impact of the Introduction of Web Information Systems (WIS) on Information Policies: An Analysis of the Canadian Federal Government Policies Related to WIS.

    Science.gov (United States)

    Dufour, Christine; Bergeron, Pierette

    2002-01-01

    Presents results of an analysis of the Canadian federal government information policies that govern its Web information systems (WIS) that was conducted to better understand how the government has adapted its information policies to the WIS. Discusses results that indicate new policies have been crafted to take into account the WIS context.…

  13. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    Directory of Open Access Journals (Sweden)

    Voicu-Dan Dragomir

    2007-01-01

    Full Text Available This paper disseminates the results of the development process for a financial analysis information system. The system has been subject to conceptual design using the Unified Modeling Language (UML) and has been implemented in object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature is focused on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this technique on the DuPont System of analysis concerning the Return on Equity ratio, by designing several structural UML diagrams depicting the breakdown and analysis of each financial ratio involved. The resulting computer application offers a flexible approach to the analytical tools: the user is required to introduce the raw data and the system provides both table-style and charted information on the output of computation. User-friendliness is also a key feature of this particular financial analysis application.
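
    The chain-substitution method on the DuPont identity, which the system described above implements, attributes the year-over-year change in ROE to each factor by substituting current-year values one at a time. A minimal sketch, with factor names taken from the standard three-factor DuPont model rather than the paper itself:

```python
def roe(net_margin, asset_turnover, equity_multiplier):
    """DuPont identity: ROE = net margin x asset turnover x equity multiplier."""
    return net_margin * asset_turnover * equity_multiplier

def chain_substitution(prior, current):
    """Attribute the ROE change to each factor by substituting
    current-year values one at a time, in the classic order."""
    m0, t0, e0 = prior
    m1, t1, e1 = current
    return {
        "net_margin": roe(m1, t0, e0) - roe(m0, t0, e0),
        "asset_turnover": roe(m1, t1, e0) - roe(m1, t0, e0),
        "equity_multiplier": roe(m1, t1, e1) - roe(m1, t1, e0),
    }
```

    The defining property of chain substitution is that the factor effects sum exactly to the total ROE change, though the attribution depends on the substitution order.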

  14. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    Science.gov (United States)

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

    This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite-time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both the simulated data and experimental fMRI data, and compare the results with that of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationship, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC to quantify the overall causal relationship.
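
    A first-order flavor of such directed, model-free causality measures can be estimated directly from binned sequences. The sketch below computes transfer entropy with plug-in probability estimates; it is closely related to, but simpler than, the full directed information framework the paper develops, and the function name is my own.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """First-order directed estimate from binned sequence x to y, in bits:
    sum over t of p(y_t, y_{t-1}, x_{t-1}) *
    log2[ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ],
    with all probabilities estimated by plug-in counts."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    c_tri = Counter(triples)
    c_pair = Counter((yp, xp) for _, yp, xp in triples)
    c_ypair = Counter((yt, yp) for yt, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yt, yp, xp), c in c_tri.items():
        p_joint = c / n
        p_cond_full = c / c_pair[(yp, xp)]   # p(y_t | y_{t-1}, x_{t-1})
        p_cond_y = c_ypair[(yt, yp)] / c_y[yp]  # p(y_t | y_{t-1})
        te += p_joint * math.log2(p_cond_full / p_cond_y)
    return te
```

    When x drives y, conditioning on x's past sharpens the prediction of y and the measure is large; for independent sequences it stays near zero, capturing nonlinear as well as linear dependence, unlike Granger causality's linear prediction.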

  15. Alice and Bob meet Banach the interface of asymptotic geometric analysis and quantum information theory

    CERN Document Server

    Aubrun, Guillaume

    2017-01-01

    The quest to build a quantum computer is arguably one of the major scientific and technological challenges of the twenty-first century, and quantum information theory (QIT) provides the mathematical framework for that quest. Over the last dozen or so years, it has become clear that quantum information theory is closely linked to geometric functional analysis (Banach space theory, operator spaces, high-dimensional probability), a field also known as asymptotic geometric analysis (AGA). In a nutshell, asymptotic geometric analysis investigates quantitative properties of convex sets, or other geometric structures, and their approximate symmetries as the dimension becomes large. This makes it especially relevant to quantum theory, where systems consisting of just a few particles naturally lead to models whose dimension is in the thousands, or even in the billions. Alice and Bob Meet Banach is aimed at multiple audiences connected through their interest in the interface of QIT and AGA: at quantum information resea...

  16. Methods of sports genetics: dermatoglyphic analysis of human palmarprints (information 2)

    OpenAIRE

    Serhiyenko L.P.; Lyshevskaya V.M.

    2010-01-01

    This paper summarizes information on the dermatoglyphic analysis of human palm prints. Quantitative palmar dermatoglyphic indexes are presented for young men and women of the Podol region of Ukraine, and for young men and women of Ukrainian and Russian nationality in Kharkov. The most informative palmar dermatoglyphic indexes for use in sports genetics are identified. Recommendations are formulated on techn...

  17. Analysis of Transaction Costs in Logistics and the Methodologies for Their Information Reflection for Automotive Companies

    OpenAIRE

    Ol’ga Evgen’evna Kovrizhnykh; Polina Aleksandrovna Nechaeva

    2016-01-01

    Transaction costs emerge in different types of logistics activities and influence the material flow and the accompanying financial and information flows; due to this fact, the information support and assessment are important tasks for the enterprise. The paper analyzes transaction costs in logistics for automotive manufacturers; according to the analysis, the level of these costs in any functional area of “logistics supply” ranges from 1.5 to 20%. These are only the official figures of transa...

  18. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Barletta, M.; Zarimpas, N.; Zarucki, R.

    2010-10-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  19. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Barletta, M.; Zarimpas, N.; Zarucki, R., E-mail: M.Barletta@iaea.or [IAEA, Wagramerstrasse 5, P.O. Box 100, 1400 Vienna (Austria)

    2010-10-15

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  20. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets.

    Science.gov (United States)

    Yang, Jianji; Cohen, Aaron; Hersh, William

    2009-02-05

    Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted during the process of genomic researchers analyzing their own experimental microarray datasets. The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences in the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. The evaluation results suggest that GICSS can be useful for researchers in the genomics area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for the scientists in their natural analysis workflow and also elicit suggestions for future enhancements. GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html.

  1. Analysis of Russian Federation Foreign Policy in the Field of International Information Security

    Directory of Open Access Journals (Sweden)

    Elena S. Zinovieva

    2014-01-01

    Full Text Available Information and communication technologies (ICT) play an essential role in improving the quality of life and the economic and socio-political development of individual countries and humanity in general. However, ICT development is fraught with new challenges and threats to international and national security. Interstate rivalry in the information sphere generates conflicts, an extreme form of which is information war. Since 1998, Russia has promoted international cooperation on information security at the global and regional levels, as well as within the framework of bilateral relations. The article analyzes the characteristics of the global information society, which has a decisive influence on international security in the information age, as well as international cooperation in this field. An analysis of Russian foreign policy initiatives in the field of international information security is also presented. Today more than 130 countries develop cyber capabilities, both defensive and offensive, that pose serious threats to international stability. It is difficult to trace the source of information attacks, and their consequences can be devastating and cause retaliation, including the use of conventional weapons. In this situation the Russian approach, advocating the development of rules of conduct for States and the demilitarization of information space in order to ensure its safety, appears urgent and relevant to the current international situation.

  2. Management Information Systems. Analysis of Literature and Selected Bibliography. Analysis and Bibliography Series, No. 4.

    Science.gov (United States)

    ERIC Clearinghouse on Educational Management, Eugene, OR.

    This review analyzes literature dealing with applications of management information system (MIS) tools to educational management. Of the three levels of management--operational control, management control, and strategic planning--the literature suggests that most activity is taking place at the operational control level. Fewest applications have…

  3. Methods of sports genetics: dermatoglyphic analysis of human palmarprints (information 2

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-01-01

    Full Text Available Information on the dermatoglyphic analysis of the human palm is generalized. Quantitative dermatoglyphic indexes of the palm are presented for young men and women of the Podol region of Ukraine, and quantitative indexes of palmar dermatoglyphics are shown for young men and women of Ukrainian and Russian nationality in Kharkov. The most informative dermatoglyphic indexes of the palm, which can be used in sports genetics, are determined. Recommendations on the technology of dermatoglyphic analysis of human palm prints in sports genetics are formulated.

  4. RACLOUDS - Model for Clouds Risk Analysis in the Information Assets Context

    Directory of Open Access Journals (Sweden)

    SILVA, P. F.

    2016-06-01

    Full Text Available Cloud computing offers benefits in terms of availability and cost, but transfers the responsibility for information security management to the cloud service provider. Thus, the consumer loses control over the security of their information and services, a factor that has prevented migration to cloud computing in many businesses. This paper proposes a model in which the cloud consumer can perform a risk analysis of providers both before and after contracting the service. The proposed model establishes the responsibilities of three actors: Consumer, Provider and Security Labs. The inclusion of the Security Labs actor lends more credibility to the risk analysis, making the results more consistent for the consumer.

  5. ANALYSIS OF THE COMPETITIVENESS OF UKRAINIAN ECONOMY IN THE LIGHT OF INFORMATION AND COMMUNICATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    Nataliia V. Morze

    2015-10-01

    Full Text Available The article analyzes the development of the information society in Ukraine through a retrospective comparison of the Global Competitiveness Index, the Networked Readiness Index, and the E-Government Development Index, reflecting the major trends of ICT use in Ukrainian society. Based on the results obtained, a SWOT analysis of information society development in Ukraine was made; the main problems and perspectives of its development were determined, and recommendations for increasing the competitiveness of the Ukrainian economy through the use of information and communication technologies were developed.

  6. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptographic key. Estimates of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. The parameters of the corresponding wavelet transforms were then optimized to obtain the highest possible entropy and the lowest possible mutual information in the transform domain, which sets the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics, without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
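The entropy and mutual-information estimates at the heart of such an analysis can be sketched as follows (an illustrative Python sketch for binary iris-code sequences; the function names are ours, and this is not the authors' wavelet-domain implementation):

```python
import numpy as np

def entropy(bits):
    """Shannon entropy (bits per symbol) of a 0/1 sequence."""
    p = np.mean(bits)
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def mutual_information(x, y):
    """I(X;Y) in bits between two equal-length 0/1 sequences."""
    # joint distribution over the four (x, y) bit combinations
    joint = np.histogram2d(x, y, bins=[[-0.5, 0.5, 1.5]] * 2)[0] / len(x)
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for i in range(2):
        for j in range(2):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    return mi
```

For key extraction, one would prefer iris segments whose codes show high entropy (randomness) and low mutual information between independent samples, exactly the trade-off the abstract describes.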

  7. Analysis of Web Server Log Files: Website of Information Management Department of Hacettepe University

    Directory of Open Access Journals (Sweden)

    Mandana Mir Moftakhari

    2015-09-01

    Full Text Available Over the last decade, the importance of analysing information management system logs has grown, because analysis of log data has proved helpful in improving information system design and the interface and architecture of websites. Log file analysis is one of the best ways to understand the information-searching processes of online searchers and users' needs, interests, knowledge, and prejudices. The use of data collected in the transaction logs of web search engines helps designers, researchers and website managers to understand the complex interactions of users' goals and behaviours and to increase the efficiency and effectiveness of websites. Before starting any analysis, it should be verified that the website's log files contain enough information; otherwise, the analyst will not be able to create a complete report. In this study we evaluate the website of the Information Management Department of Hacettepe University by analysing its server log files. Results show that the log files provided by the website's server do not contain an adequate amount of information. The reports we created offer some information about users' behaviour and needs, but they are not sufficient for making ideal decisions about the content and hyperlink structure of the website. The analysis also shows that creating an extended log file is essential for the website. Finally, we believe that the results can be helpful in improving, redesigning and creating a better website.
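The kind of server-log analysis described here can be sketched in a few lines (a minimal sketch assuming the Apache Common Log Format; the regex and helper name are illustrative, not part of the study):

```python
import re
from collections import Counter

# Common Log Format: host ident authuser [date] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def top_pages(lines, n=3):
    """Count successful (2xx) requests per path and return the n most common."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group('status').startswith('2'):
            hits[m.group('path')] += 1
    return hits.most_common(n)
```

An extended log format, as the study recommends, would add fields such as referrer and user agent to each line, allowing richer questions (entry pages, navigation paths, client types) than the hit counts shown here.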

  8. Management of organizations in Serbia from the aspect of the maturity analysis of information security

    Directory of Open Access Journals (Sweden)

    Trivan Dragan

    2016-01-01

    Full Text Available This work focuses on research into information security in organizations, with an emphasis on cybersecurity. In accordance with the theoretical analysis, the subject of the empirical part of the work is the analysis of information security in Serbia, in order to better understand the information security programs and management structures in organizations in Serbia. The survey covers a variety of industries and discusses how organizations assess, develop, create and support their programs to ensure information security. The survey included 53 companies. The results obtained enabled us to identify five core findings on the state of information security and cybersecurity in Serbian companies: most companies had not been exposed to cybersecurity incidents; in most companies, policies, procedures and spheres of responsibility for information security exist; there are not enough controls to ensure compliance with relevant security standards by third parties; top management and end users are insufficiently familiar with cybersecurity risks, although they apply basic measures of protection; and dedicated security protection systems are very rare. The scientific goal of this work is to draw conclusions from the results obtained that can contribute to the study of corporate information security, with special emphasis on cybersecurity. The practical aim of the research is the application of the results for a more efficient implementation of protection against cyber attacks in Serbian organizations.

  9. Functional analysis of ultra high information rates conveyed by rat vibrissal primary afferents

    Directory of Open Access Journals (Sweden)

    André Maia Chagas

    2013-12-01

    Full Text Available Sensory receptors determine the type and the quantity of information available for perception. Here, we quantified and characterized the information transferred by primary afferents in the rat whisker system using neural system identification. Quantification of ‘how much’ information is conveyed by primary afferents, using the direct method, a classical information theoretic tool, revealed that primary afferents transfer huge amounts of information (up to 529 bits/s). Information theoretic analysis of instantaneous spike-triggered kinematic stimulus features was used to gain functional insight on ‘what’ is coded by primary afferents. Amongst the kinematic variables tested - position, velocity, and acceleration - primary afferent spikes encoded velocity best. The other two variables contribute to information transfer, but only if combined with velocity. We further revealed three additional characteristics that play a role in information transfer by primary afferents. Firstly, primary afferent spikes show preference for well separated multiple stimuli (i.e. well separated sets of combinations of the three instantaneous kinematic variables). Secondly, spikes are sensitive to short strips of the stimulus trajectory (up to 10 ms pre-spike time), and thirdly, they show spike patterns (precise doublet and triplet spiking). In order to deal with these complexities, we used a flexible probabilistic neuron model fitting mixtures of Gaussians to the spike-triggered stimulus distributions, which quantitatively captured the contribution of the mentioned features and allowed us to achieve a full functional analysis of the total information rate indicated by the direct method. We found that instantaneous position, velocity, and acceleration explained about 50% of the total information rate. Adding a 10 ms pre-spike interval of stimulus trajectory achieved 80-90%. The final 10-20% were found to be due to non-linear coding by spike bursts.
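The 'direct method' quantification mentioned above can be sketched as follows (an illustrative Python sketch of the classical word-entropy estimator on binned spike trains; the parameter choices and names are ours, and real use requires bias corrections and extrapolations that are omitted here):

```python
import numpy as np
from collections import Counter

def word_entropy(words):
    """Shannon entropy (bits) of the empirical distribution of words."""
    counts = Counter(words)
    total = sum(counts.values())
    p = np.array([c / total for c in counts.values()])
    return float(-(p * np.log2(p)).sum())

def direct_method_rate(trials, word_len=4, dt=0.001):
    """Information rate (bits/s) of binned spike trains via the direct method.

    trials: list of equal-length 0/1 lists, one per repeated presentation
    of the same stimulus; dt is the bin width in seconds.
    """
    trials = np.asarray(trials)
    n_trials, n_bins = trials.shape
    # total entropy: pool words over all trials and all time offsets
    all_words = [tuple(tr[i:i + word_len])
                 for tr in trials for i in range(n_bins - word_len + 1)]
    h_total = word_entropy(all_words)
    # noise entropy: across-trial word variability, averaged over time
    h_noise = np.mean([word_entropy([tuple(tr[i:i + word_len]) for tr in trials])
                       for i in range(n_bins - word_len + 1)])
    return (h_total - h_noise) / (word_len * dt)
```

The information rate is the total word entropy minus the entropy attributable to trial-to-trial noise, normalized by the word duration; perfectly reproducible responses have zero noise entropy, so all response variability counts as signal.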

  10. Methods of Sports Genetics: dermatoglyphic analysis of human fingerprints (information 1

    Directory of Open Access Journals (Sweden)

    Serhiyenko L.P.

    2010-02-01

    Full Text Available The article provides data on the dermatoglyphic analysis of human fingerprints. The most informative dermatoglyphic traits of fingerprints are defined. They can be used as genetic markers to prognosticate sporting talent. Recommendations on using the technology of dermatoglyphic analysis of human fingerprints in sports genetics are given. There are certain national and racial differences in the phenotypic expression of the dermatoglyphics of digit patterns.

  11. Near-Real-Time Analysis of Publicly Communicated Disaster Response Information

    Science.gov (United States)

    Girard, Trevor

    2015-04-01

    During a disaster situation the public needs to take critical decisions regarding what to do, where to go, how to get there, and so on. The more informed the public is, the better the actions they are able to take, resulting in reduced disaster impacts. The criteria for what information to provide the public need to change depending on the specific needs of the disaster-affected population. The method of dissemination also needs to match the communication channels that the public typically uses in disaster situations. This research project investigates the dynamic information needs of disaster-affected populations and how information leads to actions. The purpose of the research project is to identify key indicators for measuring how well informed the public is during disasters. The indicators are limited to those which can be observed as communication is happening (i.e., in near-real-time). By doing so, the indicators can be analyzed as disaster situations unfold, deficiencies can be identified, and recommendations can be made to potentially improve communication while the response is still underway. The end goal of the research is to improve the ability of communicators to inform disaster-affected communities. A classification scheme has been developed to categorize the information provided to the public during disasters. Under each category is a set of typical questions that the information should answer. These questions are the result of a best-observed-practice review of the information available during 11 disasters. For example, under the category 'Life Saving Response', the questions which should be answered are who is doing what (evacuation, SAR), where and when, and how much of the affected communities' needs is being covered by these actions. Review of which questions remain unanswered acts as the first indicator, referred to as an 'Information Gap Analysis'.
Comparative analysis of the information within categories, between categories, and between similar

  12. Relating Maxwell’s demon and quantitative analysis of information leakage for practical imperative programs

    International Nuclear Information System (INIS)

    Anjaria, Kushal; Mishra, Arun

    2017-01-01

    Shannon observed the relation between information entropy and the Maxwell's demon experiment in arriving at his information entropy formula. Since then, Shannon's entropy formula has been widely used to measure information leakage in imperative programs. In the present work, however, our aim is to go in the reverse direction and to find possible Maxwell's demon experimental setups for contemporary practical imperative programs in which variations of Shannon's entropy formula have been applied to measure information leakage. To establish the relation between the second principle of thermodynamics and the quantitative analysis of information leakage, the present work models contemporary variations of imperative programs in terms of Maxwell's demon experimental setups. Five contemporary variations of imperative programs related to information quantification are identified: (i) information leakage in an imperative program, (ii) an imperative multithreaded program, (iii) point-to-point leakage in an imperative program, (iv) an imperative program with infinite observation, and (v) an imperative program in an SOA-based environment. For these variations, the minimal work required by an attacker to gain the secret is also calculated using the historical Maxwell's demon experiment. To model the experimental setup of Maxwell's demon, a non-interference security policy is used. In the present work, imperative programs with one-bit secret information have been considered to avoid complexity. The findings of the present work from the history of physics can be utilized in many areas related to the information flow of physical computing, nano-computing, quantum computing, biological computing, energy dissipation in computing, and computing power analysis. (paper)
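As a minimal sketch of the Shannon-entropy leakage quantification discussed here (illustrative only; it assumes a deterministic program and a uniformly distributed secret, with a one-bit password-check-style program as the example):

```python
import math
from collections import defaultdict

def leakage(program, secrets):
    """Shannon leakage H(S) - H(S|O) for a deterministic program and a
    uniformly distributed secret drawn from `secrets`."""
    n = len(secrets)
    h_s = math.log2(n)
    # group the secrets by the observable output they produce
    groups = defaultdict(list)
    for s in secrets:
        groups[program(s)].append(s)
    # within a group the posterior is uniform, so H(S|O=o) = log2(|group|)
    h_s_given_o = sum(len(g) / n * math.log2(len(g)) for g in groups.values())
    return h_s - h_s_given_o

# a one-bit secret observed through a password-checker-style program
check = lambda s: int(s == 1)
```

Here `leakage(check, [0, 1])` is 1.0 bit: observing the check's outcome fully reveals the one-bit secret, whereas a constant-output program leaks 0 bits.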

  13. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This renders difficult the identification of sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways. Doing so will help us to identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network, and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for the simultaneous network minimization and information flow analysis across various "omics" levels.
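The network-minimization task (a) can be sketched with a simple forward/backward reachability intersection (an illustrative Python sketch, not the authors' flooding algorithm, which additionally quantifies information flow):

```python
from collections import deque

def reachable(adj, starts):
    """BFS: all nodes reachable from `starts` in the directed graph `adj`."""
    seen, queue = set(starts), deque(starts)
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def minimal_subnetwork(edges, sources, sinks):
    """Nodes lying on some source-to-sink path: forward-reachable from the
    sources AND backward-reachable from the sinks."""
    fwd, rev = {}, {}
    for u, v in edges:
        fwd.setdefault(u, []).append(v)
        rev.setdefault(v, []).append(u)
    return reachable(fwd, sources) & reachable(rev, sinks)
```

Nodes outside the intersection cannot carry regulatory signal from any input to any output, so they can be pruned before the more expensive information-flow analysis.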

  14. Information Use in History Research: A Citation Analysis of Master's Level Theses

    Science.gov (United States)

    Sherriff, Graham

    2010-01-01

    This article addresses the need for quantitative investigation into students' use of information resources in historical research. It reports the results of a citation analysis of more than 3,000 citations from master's level history theses submitted between 1998 and 2008 at a mid-sized public university. The study's results support the hypotheses…

  15. Chain Analysis for Large-scale Communication Systems: A Methodology for Information Exchange in Chains

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2010-01-01

    The chain concept is introduced to explain how large-scale information infrastructures so often fail and sometimes even backfire. Next, the assessment framework of the doctrine of Chain-computerisation and its chain analysis procedure are outlined. In this procedure chain description precedes

  16. Design and Implementation of Marine Information System, and Analysis of Learners' Intention toward

    Science.gov (United States)

    Pan, Yu-Jen; Kao, Jui-Chung; Yu, Te-Cheng

    2016-01-01

    The goal of this study is to conduct further research and discussion on applying the internet on marine education, utilizing existing technologies such as cloud service, social network, data collection analysis, etc. to construct a marine environment education information system. The content to be explored includes marine education information…

  17. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA's regulations in part 120 (21 CFR part 120) mandate the application of HACCP procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that can...

  18. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA regulations in part 120 (21 CFR part 120) mandate the application of HACCP principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard control...

  19. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  20. Sensitivity Analysis of Multiple Informant Models When Data Are Not Missing at Random

    Science.gov (United States)

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae M.; Scaramella, Laura V.; Leve, Leslie D.; Reiss, David

    2013-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups can be retained for analysis even if only 1 member of a group contributes…

  1. Laboratory practical work on mathematical analysis with the application of information technologies

    Directory of Open Access Journals (Sweden)

    Рамиз Муталлимович Асланов

    2014-12-01

    Full Text Available In this article, the structure and contents of the manual “Laboratory Workshop on Mathematical Analysis with Application of Information Technologies”, whose authors are R.M. Aslanov and O.V. Li, are considered. The manual can be used for practical training in both on-campus and distance forms of education.

  2. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  3. PAVLOV: An Information Retrieval Program for the Analysis of Learning Data.

    Science.gov (United States)

    Kjeldergaard, Paul M.

    1967-01-01

    PAVLOV (Paired Associate Verbal Learning Organizational Vehicle) is a Fortran-coded program designed to facilitate the analysis of learning data. The program analyzes four classes of information: parameters, list order, data format, and data. Utilizing this input, the program performs an in-depth measurement of several dependent variables for each…

  4. The Technical Report: An Analysis of Information Design and Packaging for an Inelastic Market.

    Science.gov (United States)

    Pinelli, Thomas E.; And Others

    As part of an evaluation of its scientific and technical information program, the National Aeronautics and Space Administration (NASA) conducted a review and analysis of structural, language, and presentation components of its technical report form. The investigation involved comparing and contrasting NASA's publications standards for technical…

  5. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities which are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  6. Cultural-Historical Activity Theory and Domain Analysis: Metatheoretical Implications for Information Science

    Science.gov (United States)

    Wang, Lin

    2013-01-01

    Background: Cultural-historical activity theory is an important theory in modern psychology. In recent years, it has drawn more attention from related disciplines including information science. Argument: This paper argues that activity theory and domain analysis which uses the theory as one of its bases could bring about some important…

  7. Comprehension and Analysis of Information in Text: I. Construction and Evaluation of Brief Texts.

    Science.gov (United States)

    Kozminsky, Ely; And Others

    This report describes a series of studies designed to construct and validate a set of text materials necessary to the pursuance of a long-term research project on information analysis and integration in semantically rich, naturalistic domains, primarily in the domain of the stock market. The methods and results of six separate experiments on…

  8. Sewer-system analysis with the aid of a geographical information ...

    African Journals Online (AJOL)

    2002-07-03

    Jul 3, 2002 ... Geographical information system (GIS)-supported sewer-system analysis has major advantages over the use of traditional stand-alone sewer programs, especially with regard to establishing network topology, input of sewage contribution data, querying, displaying and mapping of results. This paper ...

  9. Three dimensional visualization breakthrough in analysis and communication of technical information for nuclear waste management

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, D.H.; Cerny, B.A. [USDOE, Washington, DC (USA); Hill, E.R.; Krupka, K.M. [Pacific Northwest Lab., Washington, DC (USA); Smoot, J.L. [Pacific Northwest Lab., Richland, WA (USA); Smith, D.R.; Waldo, K. [Dynamic Graphics, Inc., Bethesda, MD (USA)

    1990-11-01

    Computer graphics systems that provide interactive display and manipulation of three-dimensional data are powerful tools for the analysis and communication of technical information required for characterization and design of a geologic repository for nuclear waste. Greater understanding of site performance and repository design information is possible when performance-assessment modeling results can be visually analyzed in relation to site geologic and hydrologic information and engineering data for surface and subsurface facilities. In turn, this enhanced visualization capability provides better communication between technical staff and program management with respect to analysis of available information and prioritization of program planning. A commercially-available computer system was used to demonstrate some of the current technology for three-dimensional visualization within the architecture of systems for nuclear waste management. This computer system was used to interactively visualize and analyze the information for two examples: (1) site-characterization and engineering data for a potential geologic repository at Yucca Mountain, Nevada; and (2) three-dimensional simulations of a hypothetical release and transport of contaminants from a source of radionuclides to the vadose zone. Users may assess the three-dimensional distribution of data and modeling results by interactive zooming, rotating, slicing, and peeling operations. For those parts of the database where information is sparse or not available, the software incorporates models for the interpolation and extrapolation of data over the three-dimensional space of interest. 12 refs., 4 figs.

  10. The information needs and behaviour of clinical researchers: a user-needs analysis.

    Science.gov (United States)

    Korjonen-Close, Helena

    2005-06-01

    As part of the strategy to set up a new information service, including a physical Resource Centre, an analysis of the information needs of professionals involved with clinical research and development in the UK and Europe was required. It also aimed to identify differences in requirements between the various professional roles and to establish which information resources are currently used. An online user-needs survey of the members of The Institute was conducted, along with group discussions with specialist subcommittees of members. Two hundred and ninety members responded to the 20-question online survey, a response rate of 7.9%. Members expressed a lack of information in their particular professional areas, and lack the skills to retrieve and appraise information. The results of the survey are discussed in more detail, giving indications of what the information service should collect, what types of materials should be provided to members and what services should be on offer. These recommendations were developed from the results of the needs analysis and submitted to management for approval. Issues of concern, such as financial and staffing constraints, are also discussed. There is an opportunity to build a unique collection of clinical research material, which will promote The Institute not only to members, but also to the wider health sector. Members stated that most physical medical libraries do not provide what they need, but the main finding from the survey and discussions is that it is pointless to set up 'yet another medical library'.

  11. Information Flow Analysis for Human-System Interaction in the SG Level Control

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Shin, Yeong Cheol

    2008-01-01

    Interaction between automatic control and operators is one of the main issues in the application of automation technology. Inappropriate information from automatic control systems causes unexpected problems in human-automation collaboration. Poor information becomes critical, especially when the operator takes over control from an automation system. If the operator is out of the loop and the automatic control system fails, operators cannot properly handle the situation transferred from the automatic mode because of inadequate situation awareness. Some cases of unplanned reactor trips during the transition between the manual mode and the automatic mode have been reported in nuclear power plants (NPPs). Among the unplanned reactor trips since 2002, two cases were partially caused by automation-related failures of steam generator (SG) level control. This paper conducts an information flow analysis to identify the information and control requirements for human-system interaction in SG level control. First, the paper identifies the level of automation in SG level control systems and the function allocation between system control and human operators. Then, an information flow analysis for monitoring and transition of automation is performed by adapting the job process chart. The information and control requirements will be useful as an input for the human-system interface (HSI) design of SG level control

  13. A risk-informed perspective on deterministic safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Wan, P.T.

    2009-01-01

    In this work, the deterministic safety analysis (DSA) approach to nuclear safety is examined from a risk-informed perspective. One objective of the safety analysis of a nuclear power plant is to demonstrate via analysis that the risks to the public from events or accidents that are within the design basis of the power plant are within acceptable levels with a high degree of assurance. This nuclear safety analysis objective can be translated into two requirements on the risk estimates of design basis events or accidents: the nominal risk estimate to the public must be shown to be within acceptable levels, and the uncertainty in the risk estimates must be shown to be small on an absolute or relative basis. The DSA approach combined with the defense-in-depth (DID) principle is a simplified safety analysis approach that attempts to achieve the above safety analysis objective in the face of potentially large uncertainties in the risk estimates of a nuclear power plant by treating the various uncertainty contributors using a stylized conservative binary (yes-no) approach, and by applying multiple overlapping physical barriers and defense levels to protect against the release of radioactivity from the reactor. It is shown that by focusing on the consequence aspect of risk, the previous two nuclear safety analysis requirements on risk can be satisfied with the DSA-DID approach to nuclear safety. It is also shown that the use of multiple overlapping physical barriers and defense levels in the traditional DSA-DID approach to nuclear safety is risk-informed in the sense that it provides a consistently high level of confidence in the validity of the safety analysis results for various design basis events or accidents with a wide range of frequencies of occurrence. It is hoped that by providing a linkage between the consequence analysis approach in DSA and a risk-informed perspective, greater understanding of the limitations and capabilities of the DSA approach is obtained. (author)

  14. Information Flow Through Stages of Complex Engineering Design Projects: A Dynamic Network Analysis Approach

    DEFF Research Database (Denmark)

    Parraguez, Pedro; Eppinger, Steven D.; Maier, Anja

    2015-01-01

    The pattern of information flow through the network of interdependent design activities is thought to be an important determinant of engineering design process results. A previously unexplored aspect of such patterns relates to the temporal dynamics of information transfer between activities...... information flows between activities in complex engineering design projects; 2) we show how the network of information flows in a large-scale engineering project evolved over time and how network analysis yields several managerial insights; and 3) we provide a useful new representation of the engineering...... as those activities are implemented through the network of people executing the project. To address this gap, we develop a dynamic modeling method that integrates both the network of people and the network of activities in the project. We then employ a large dataset collected from an industrial setting...

  15. Maximizing information obtained from secondary ion mass spectra of organic thin films using multivariate analysis

    Science.gov (United States)

    Wagner, M. S.; Graham, D. J.; Ratner, B. D.; Castner, David G.

    2004-10-01

    Time-of-flight secondary ion mass spectrometry (ToF-SIMS) can give a detailed description of the surface chemistry and structure of organic materials. The high mass resolution and high mass range mass spectra obtainable from modern ToF-SIMS instruments offer the ability to rapidly obtain large amounts of data. Distillation of that data into usable information presents a significant problem in the analysis of ToF-SIMS data from organic materials. Multivariate data analysis techniques have become increasingly common for assisting with the interpretation of complex ToF-SIMS data sets. This study presents an overview of principal component analysis (PCA) and partial least squares regression (PLSR) for analyzing the ToF-SIMS spectra of alkanethiol self-assembled monolayers (SAMs) adsorbed onto gold substrates and polymer molecular depth profiles obtained using an SF5+ primary ion beam. The effect of data pretreatment on the information obtained from multivariate analysis of these data sets has been explored. Multivariate analysis is an important tool for maximizing the information obtained from the ToF-SIMS spectra of organic thin films.
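
    As a rough illustration of the kind of multivariate reduction described above, the sketch below extracts the first principal component of a set of toy "spectra" with plain power iteration. The three-channel intensity rows and all names are invented for the example; real ToF-SIMS work would use a full PCA/PLSR toolchain rather than this minimal stand-in.

```python
import random
from math import sqrt

def first_pc(spectra, iters=200, seed=0):
    """First principal component of mean-centred rows via power iteration.

    spectra: list of equal-length intensity lists (one row per spectrum).
    Returns (direction, scores): a unit loading vector and per-spectrum scores.
    """
    n, d = len(spectra), len(spectra[0])
    means = [sum(row[j] for row in spectra) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in spectra]
    rng = random.Random(seed)
    v = [rng.gauss(0, 1) for _ in range(d)]
    for _ in range(iters):
        # w = (X^T X) v, computed without forming the covariance matrix
        t = [sum(X[i][j] * v[j] for j in range(d)) for i in range(n)]
        w = [sum(X[i][j] * t[i] for i in range(n)) for j in range(d)]
        norm = sqrt(sum(c * c for c in w)) or 1.0
        v = [c / norm for c in w]
    scores = [sum(X[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Hypothetical data: two groups of "spectra" differing in which peak dominates.
peaks_a = [[4.0, 1.0, 0.0], [4.2, 0.9, 0.1], [3.9, 1.1, 0.0]]
peaks_b = [[0.1, 1.0, 4.1], [0.0, 1.2, 3.8], [0.2, 0.8, 4.0]]
direction, scores = first_pc(peaks_a + peaks_b)
```

    With data this clearly separated, the first-component scores split the two groups by sign, which is the kind of chemically interpretable contrast PCA is used to surface in SIMS spectra.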

  16. The Political Economy of Information Management : A Theoretical and Empirical Analysis of Decision Making regarding Interorganizational Information Systems

    NARCIS (Netherlands)

    V.M.F. Homburg (Vincent)

    1999-01-01

    textabstractDesigning and using interorganizational information systems requires cooperation and coordination among organizations that are, to a certain degree, competitors. This thesis analyzes the design and use of interorganizational information systems from the points of view of political

  17. An Analysis of Information Asset Valuation (IAV) Quantification Methodology for Application with Cyber Information Mission Impact Assessment (CIMIA)

    National Research Council Canada - National Science Library

    Hellesen, Denzil L

    2008-01-01

    .... The IAV methodology proposes that accurate valuation for an Information Asset (InfoA) is the convergence of information tangible, intangible, and flow attributes to form a functional entity that enhances mission capability...

  18. Factor analysis of sources of information on organ donation and transplantation in journalism students.

    Science.gov (United States)

    Martínez-Alarcón, L; Ríos, A; Ramis, G; López-Navas, A; Febrero, B; Ramírez, P; Parrilla, P

    2013-01-01

    Journalists and the information they disseminate are essential to promote health and organ donation and transplantation (ODT). The attitude of journalism students toward ODT could influence public opinion and help promote this treatment option. The aim of this study was to determine the media through which journalism students receive information on ODT and to analyze the association between the sources of information and psychosocial variables. We surveyed journalism students (n = 129) recruited in compulsory classes. A validated psychosocial questionnaire (self-administered, anonymous) about ODT was used. The Student t test and the χ² test were applied. The questionnaire completion rate was 98% (n = 126). The medium with the greatest influence on students was television (TV), followed by the press and magazines/books. In the factor analysis to determine the impact of the information by its source, the first factor was talks with friends and family; the second was shared by hoardings/publicity posters, health professionals, and college/school; and the third was TV and radio. In the factor analysis between information sources and psychosocial variables, the associations were between information about organ donation transmitted by friends and family and having spoken about ODT with them; by TV, radio, and hoardings and not having spoken about it in the family; and by TV/radio and the father's and mother's opinions about ODT. The medium with the greatest influence on students is TV, and the medium with the greatest impact on broadcasting information was conversations with friends, family, and health professionals. This could be useful for society, because people should be provided with clear and concise information. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Empowering Students to Make Sense of an Information-Saturated World: The Evolution of "Information Searching and Analysis"

    Science.gov (United States)

    Wittebols, James H.

    2016-01-01

    How well students conduct research online is an increasing concern for educators at all levels, especially higher education. This paper describes the evolution of a course that examines confirmation bias, information searching, and the political economy of information as keys to becoming more information and media literate. After a key assignment…

  20. Information Communication Technology and Politics: A Synthesized Analysis of the Impacts of Information Technology on Voter Participation in Kenya

    Science.gov (United States)

    Tsuma, Clive Katiba

    2011-01-01

    The availability of political information throughout society made possible by the evolution of contemporary information communication technology has precipitated conflicting debate regarding the effects of technology use on real life political participation. Proponents of technology argue that the use of new information technology stimulates…

  1. NASA Informal Education: Final Report. A Descriptive Analysis of NASA's Informal Education Portfolio: Preliminary Case Studies

    Science.gov (United States)

    Rulf Fountain, Alyssa; Levy, Abigail Jurist

    2010-01-01

    This report was requested by the National Aeronautics and Space Administration's (NASA), Office of Education in July 2009 to evaluate the Informal Education Program. The goals of the evaluation were twofold: (1) to gain insight into its investment in informal education; and (2) to clarify existing distinctions between its informal education…

  2. The dynamics of information-driven coordination phenomena: A transfer entropy analysis.

    Science.gov (United States)

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-04-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.
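
    For readers curious what a symbolic transfer entropy estimate looks like mechanically, here is a minimal pure-Python sketch using ordinal patterns. The series, the embedding length m=2, and the plug-in probability estimates are illustrative choices for the demo, not the exact estimator used in the study.

```python
from collections import Counter
from math import log2
import random

def symbolize(series, m=2):
    """Map each length-m window to its ordinal pattern (rank order)."""
    return [tuple(sorted(range(m), key=lambda i: w[i]))
            for w in zip(*(series[i:] for i in range(m)))]

def transfer_entropy(x, y, m=2):
    """Plug-in symbolic transfer entropy from y to x, in bits."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))   # (x_next, x_now, y_now)
    pairs_xx = Counter(zip(sx[1:], sx[:-1]))
    pairs_xy = Counter(zip(sx[:-1], sy[:-1]))
    singles = Counter(sx[:-1])
    n = len(sx) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_x1_given_both = c / pairs_xy[(x0, y0)]
        p_x1_given_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_x1_given_both / p_x1_given_x)
    return te

# Hypothetical coupled pair: x copies y with a one-step lag, so y "drives" x.
random.seed(1)
y = [random.random() for _ in range(500)]
x = [0.0] + y[:-1]
te_yx = transfer_entropy(x, y)
```

    In this construction the transfer entropy from y to x comes out clearly positive, while the plug-in estimate is never negative; detecting a jump in such directed measures between geolocalized subunits is the flagging idea the abstract describes.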

  3. Bim Orientation: Grades of Generation and Information for Different Type of Analysis and Management Process

    Science.gov (United States)

    Banfi, F.

    2017-08-01

    The Architecture, Engineering and Construction (AEC) industry is facing a major process re-engineering of the management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach needs new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM could be described as a new tool capable of collecting and analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research is, in addition to reducing production times and physical and financial resources (economic impact), to demonstrate how technological development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L'Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results by combining different data and modelling techniques into a single hub (cloud service) through the use of a new Grade of Generation (GoG) and Grade of Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI to different types of analysis, which require a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.

  4. Multihop Capability Analysis in Wireless Information and Power Transfer Multirelay Cooperative Networks

    Directory of Open Access Journals (Sweden)

    Qilin Wu

    2018-01-01

    We study simultaneous wireless information and power transfer (SWIPT) in multihop wireless cooperative networks, where the multihop capability, i.e., the largest number of transmission hops, is investigated. By utilizing the broadcast nature of multihop wireless networks, we first propose a cooperative forwarding power (CFP) scheme. In the CFP scheme, the multiple relays and the receiver have distinctly different tasks. Specifically, multiple relays close to the transmitter first harvest power from the transmitter and then cooperatively forward the power (not the information) towards the receiver. The receiver first receives the information (not the power) from the transmitter, then harvests the power from the relays, and is taken as the transmitter of the next hop. Furthermore, for performance comparison, we suggest two schemes: cooperative forwarding information and power (CFIP) and direct receiving information and power (DFIP). Also, we construct an analysis model to investigate the multihop capabilities of the CFP, CFIP, and DFIP schemes under a given targeted throughput requirement. Finally, simulation results validate the analysis model and show that the multihop capability of CFP is better than that of CFIP and DFIP, and that increasing the average number of relay nodes in the cooperative set is the most effective way to improve the multihop capabilities.

  5. Information architecture: study and analysis of the Public Medical (PubMed) database

    Directory of Open Access Journals (Sweden)

    Odete Máyra Mesquita Sales

    2016-07-01

    Objective. Based on the principles proposed by Rosenfeld and Morville (2006), the present study examined the PubMed database interface, since a well-structured information architecture contributes to good usability in any digital environment. Method. The research was developed through literature review techniques and an empirical study analysing the information architecture in terms of the organization, navigation, labeling, and search systems recommended by Rosenfeld and Morville (2006) for the usability of the PubMed database. For better understanding and description of these principles, the content analysis technique was used. Results. The results showed that the database interface meets the criteria established by the elements of information architecture, such as organization based on a hypertext structure, a horizontal menu and local content divided into categories, identification of active links, global navigation, breadcrumbs, textual and iconographic labeling, and a prominent search engine. Conclusions. This research showed that the PubMed database interface is well structured, friendly, and objective, with numerous possibilities for search and information retrieval. However, there is a need to adopt accessibility standards on this website so that it more efficiently reaches its purpose of facilitating access to the information organized and stored in the PubMed database.

  6. Social inclusion and its approach at Information Science: scientific production analysis in the area of information science periodicals between 2001 and 2010

    Directory of Open Access Journals (Sweden)

    Alex Serrano Almeida

    2013-08-01

    This study aims to examine how social inclusion has been approached in the Information Science area, based on the scientific production published in the area's national periodicals; in addition, it seeks to verify which forms of inclusion are recurrently approached in the Information Science area, to show trends in the use of the social inclusion concept in the area's scientific articles, to find how the social inclusion concept is presented in connection with the information professional, and to analyze whether it is associated with other themes. Searches were carried out in six periodicals covering the period between 2001 and 2010. Bardin's content analysis was used as the analysis method. The analysis corpus consisted of 30 articles that approached the social inclusion theme. The results showed that social inclusion in Information Science publications is, in general, oriented towards digital inclusion and the uses of the area's publications. In addition, connections were identified with information professionals, who must serve as mediators between information and the environment where information and users are situated.

  7. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets

    Directory of Open Access Journals (Sweden)

    Cohen Aaron

    2009-02-01

    Background: Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted while genomic researchers analyzed their own experimental microarray datasets. Results: The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences from the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. Conclusion: The evaluation results suggest that GICSS can be useful for researchers in the genomic area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for scientists in their natural analysis workflow and also elicit suggestions for future enhancements. Availability: GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html

  8. Concept similarity and related categories in information retrieval using formal concept analysis

    Science.gov (United States)

    Eklund, P.; Ducrou, J.; Dau, F.

    2012-11-01

    The application of formal concept analysis to the problem of information retrieval has been shown useful but has lacked any real analysis of the idea of relevance ranking of search results. SearchSleuth is a program developed to experiment with the automated local analysis of Web search using formal concept analysis. SearchSleuth extends a standard search interface to include a conceptual neighbourhood centred on a formal concept derived from the initial query. This neighbourhood of the concept derived from the search terms is decorated with its upper and lower neighbours representing more general and special concepts, respectively. SearchSleuth is in many ways an archetype of search engines based on formal concept analysis with some novel features. In SearchSleuth, the notion of related categories - which are themselves formal concepts - is also introduced. This allows the retrieval focus to shift to a new formal concept called a sibling. This movement across the concept lattice needs to relate one formal concept to another in a principled way. This paper presents the issues concerning exploring, searching, and ordering the space of related categories. The focus is on understanding the use and meaning of proximity and semantic distance in the context of information retrieval using formal concept analysis.
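
    A toy version of the formal concept analysis machinery behind a SearchSleuth-style conceptual neighbourhood can be sketched in a few lines. The document/term context below and the naive intent-intersection enumeration are purely illustrative; real retrieval systems use efficient lattice-construction algorithms.

```python
from itertools import combinations

def derive_objects(context, attrs):
    """Objects having all given attributes (the ' operator on attribute sets)."""
    return frozenset(g for g, a in context.items() if attrs <= a)

def concepts(context):
    """All formal concepts (extent, intent) of a small binary context.

    Valid intents are exactly the intersections of object intents (plus the
    full attribute set), so a naive enumeration suffices for tiny contexts.
    """
    all_attrs = frozenset(a for attrs in context.values() for a in attrs)
    intents = {all_attrs}
    for r in range(1, len(context) + 1):
        for group in combinations(context.values(), r):
            intents.add(frozenset.intersection(*group))
    return sorted(((derive_objects(context, i), i) for i in intents),
                  key=lambda c: (len(c[0]), sorted(c[1])))

# Hypothetical retrieval context: which documents mention which terms.
ctx = {
    "doc1": frozenset({"search", "lattice"}),
    "doc2": frozenset({"search", "ranking"}),
    "doc3": frozenset({"lattice", "ranking"}),
}
cs = concepts(ctx)
```

    Each concept pairs a set of documents with the exact terms they share; the upper and lower neighbours of a query concept in this lattice are what SearchSleuth displays as more general and more special concepts.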

  9. Analog-to-digital conversion of spectrometric data in information-control systems of activation analysis

    International Nuclear Information System (INIS)

    Mamonov, E.I.

    1972-01-01

    Analogue-to-digital conversion (ADC) in nuclear radiation spectrometer channels is one of the most important links of information-control systems in activation analysis. For the development of ADCs for spectrometer channels, logico-structural methods of increasing capacity and procedures for boosting frequency modes and improving accuracy are promising. Procedures are suggested for increasing ADC capacity. Insufficient stability and noticeable non-linearity of the spectrometer channel can be corrected at the information-processing stage if their regularities are known. Capacity limitations make the development of ADCs featuring high stability, capacity and linearity quite urgent.

  10. Technology and Research Requirements for Combating Human Trafficking: Enhancing Communication, Analysis, Reporting, and Information Sharing

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Sean J.; West, Curtis L.; Olson, Jarrod

    2011-03-17

    DHS’ Science & Technology Directorate directed PNNL to conduct an exploratory study on the domain of human trafficking in the Pacific Northwest in order to examine and identify technology and research requirements for enhancing communication, analysis, reporting, and information sharing – activities that directly support efforts to track, identify, deter, and prosecute human trafficking – including identification of potential national threats from smuggling and trafficking networks. This effort was conducted under the Knowledge Management Technologies Portfolio as part of the Integrated Federal, State, and Local/Regional Information Sharing (RISC) and Collaboration Program.

  11. APPLICATION OF OLAP SYSTEM IN INFORMATION SUB-SYSTEM OF QMS INCONSISTENCY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alempije Veljovic

    2008-03-01

    Records of inconsistencies arise as a result of non-compliance with certain requirements during the execution of processes within a functioning quality management system (QMS). In this study, the established connection between the QMS and the designed information sub-system for inconsistency management is presented. The information model of inconsistency management provides the possibility to analyse inconsistencies from the aspect of interactive analytical data processing (OLAP) systems, on the basis of multi-dimensional tables (OLAP cubes) created in the MS SQL Server Analysis Services programme.
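
    To make the OLAP idea concrete, the sketch below implements a tiny CUBE-style aggregation over hypothetical inconsistency records in plain Python. The field names (process, severity, count) are invented for illustration and stand in for the multi-dimensional tables the abstract attributes to MS SQL Server Analysis Services.

```python
from collections import defaultdict
from itertools import combinations

def cube(records, dims, measure):
    """Aggregate a measure over every subset of dimensions (a CUBE operator)."""
    out = {}
    for r in range(len(dims) + 1):
        for group in combinations(dims, r):
            totals = defaultdict(int)
            for rec in records:
                key = tuple((d, rec[d]) for d in group)
                totals[key] += rec[measure]
            out[group] = dict(totals)
    return out

# Hypothetical inconsistency records from a QMS audit log.
records = [
    {"process": "purchasing", "severity": "minor", "count": 3},
    {"process": "purchasing", "severity": "major", "count": 1},
    {"process": "production", "severity": "minor", "count": 2},
]
result = cube(records, ("process", "severity"), "count")
```

    The grand total, the per-process and per-severity roll-ups, and the full cross-tabulation are all materialized at once, which is exactly the slicing an analyst performs interactively on an OLAP cube.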

  12. Fusion Energy: Contextual Analysis of the Information Panels Developed by the Scientific Community versus Citizen Discourse

    International Nuclear Information System (INIS)

    Ferri Anglada, S.; Cornejo Alvarez, J. M.

    2014-01-01

    The report presents an exploratory study on the impact of scientific dissemination, particularly a comparative analysis of two discourses on fusion energy as an alternative energy future. The report introduces a comparative analysis of the institutional discourse, as portrayed by the scientific jargon used in a European travelling exhibition on nuclear fusion, the Fusion Expo, and the social discourse, as illustrated by a citizen deliberation on this very same exhibition. Through textual analysis, the scientific discourse deployed in the informative panels at the Fusion Expo is compared with the citizen discourse developed in the discussions within the citizen groups. The ConText software was applied for this analysis. The purpose is to analyze how visitors assimilate, capture, and understand highly technical information. Results suggest that, despite points of convergence, the two discourses present certain differences, showing diverse levels of communication. The scientific discourse shows a great profusion of formalisms and technicalities of scientific jargon. The citizen discourse shows an abundance of words associated with daily life and more practical aspects (economy, efficiency), concerning institutional and evaluative references. In sum, the study shows that although there are a few common communicative spaces, there are still very few points of contact. These data indicate that although exhibitions can be a good tool to disseminate advances in fusion energy in informal learning contexts, public feedback is a powerful tool for improving the quality of social dialogue. (Author)

  13. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo G.M.; Van Der Goot, Erik; Verile, Marco; Wolfart, Erik; Rutan Fowler, Marcy; Feldman, Yana; Hammond, William; Schweighardt, John; Ferguson, Mattew

    2013-01-01

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA’s move towards safeguards implementation based on all safeguards relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA’s Division of Information Management (SGIM) and the EC’s Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA’s workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, which is a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC’s Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA’s workflow for the production of SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which were focused i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (Nuclear Security Media Monitor, NSMM) and ii) on evaluating the NSMM/NewsDesk for meeting the IAEA’s needs. Finally, it presents the current use NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  14. Information systems for mental health in six low and middle income countries: cross country situation analysis.

    Science.gov (United States)

    Upadhaya, Nawaraj; Jordans, Mark J D; Abdulmalik, Jibril; Ahuja, Shalini; Alem, Atalay; Hanlon, Charlotte; Kigozi, Fred; Kizza, Dorothy; Lund, Crick; Semrau, Maya; Shidhaye, Rahul; Thornicroft, Graham; Komproe, Ivan H; Gureje, Oye

    2016-01-01

    Research on information systems for mental health in low and middle income countries (LMICs) is scarce. As a result, there is a lack of reliable information on mental health service needs, treatment coverage and the quality of services provided. With the aim of informing the development and implementation of a mental health information sub-system that includes reliable and measurable indicators on mental health within the Health Management Information Systems (HMIS), a cross-country situation analysis of HMIS was conducted in six LMICs (Ethiopia, India, Nepal, Nigeria, South Africa and Uganda), participating in the 'Emerging mental health systems in low and middle income countries' (Emerald) research programme. A situation analysis tool was developed to obtain and chart information from documents in the public domain. In circumstances when information was inadequate, key government officials were contacted to verify the data collected. In this paper we compare the baseline policy context, human resources situation as well as the processes and mechanisms of collecting, verifying, reporting and disseminating mental health related HMIS data. The findings suggest that countries face substantial policy, human resource and health governance challenges for mental health HMIS, many of which are common across sites. In particular, the specific policies and plans for the governance and implementation of mental health data collection, reporting and dissemination are absent. Across sites there is inadequate infrastructure, few HMIS experts, and inadequate technical support and supervision to junior staff, particularly in the area of mental health. Nonetheless there are also strengths in existing HMIS where a few mental health morbidity, mortality, and system level indicators are collected and reported. Our study indicates the need for greater technical and resources input to strengthen routine HMIS and develop standardized HMIS indicators for mental health, focusing in

  15. Quality analysis of patient information about knee arthroscopy on the World Wide Web.

    Science.gov (United States)

    Sambandam, Senthil Nathan; Ramasamy, Vijayaraj; Priyanka, Priyanka; Ilango, Balakrishnan

    2007-05-01

    This study was designed to ascertain the quality of patient information available on the World Wide Web on the topic of knee arthroscopy. For the purpose of quality analysis, we used a pool of 232 search results obtained from 7 different search engines. We used a modified assessment questionnaire to assess the quality of these Web sites. This questionnaire was developed based on similar studies evaluating Web site quality and includes items on illustrations, accessibility, availability, accountability, and content of the Web site. We also compared results obtained with different search engines and tried to establish the best possible search strategy to attain the most relevant, authentic, and adequate information with minimum time consumption. For this purpose, we first compared 100 search results from the single most commonly used search engine (AltaVista) with the pooled sample containing 20 search results from each of the 7 different search engines. The search engines used were metasearch (Copernic and Mamma), general search (Google, AltaVista, and Yahoo), and health topic-related search engines (MedHunt and Healthfinder). The phrase "knee arthroscopy" was used as the search terminology. Excluding the repetitions, there were 117 Web sites available for quality analysis. These sites were analyzed for accessibility, relevance, authenticity, adequacy, and accountability by use of a specially designed questionnaire. Our analysis showed that most of the sites providing patient information on knee arthroscopy contained outdated information, were inadequate, and were not accountable. Only 16 sites were found to be providing reasonably good patient information and hence can be recommended to patients. Understandably, most of these sites were from nonprofit organizations and educational institutions. Furthermore, our study revealed that using multiple search engines increases patients' chances of obtaining more relevant information rather than using a single search

  16. Celebrity Health Announcements and Online Health Information Seeking: An Analysis of Angelina Jolie's Preventative Health Decision.

    Science.gov (United States)

    Dean, Marleah

    2016-01-01

    On May 14, 2013, Angelina Jolie disclosed she carries BRCA1, which means she has an 87% risk of developing breast cancer during her lifetime. Jolie decided to undergo a preventative bilateral mastectomy (PBM), reducing her risk to 5%. The purpose of this study was to analyze the type of information individuals are exposed to when using the Internet to search for health information regarding Jolie's decision. Qualitative content analysis revealed four main themes: information about genetics, information about a PBM, information about health care, and information about Jolie's gender identity. Broadly, the identified websites mention Jolie's high risk of developing cancer due to the genetic mutation BRCA1; describe a PBM, occasionally noting reasons why she had this surgery and providing alternatives to it; discuss issues related to health care services, costs, and insurance surrounding Jolie's health decision; and portray Jolie as a sexual icon, a partner to Brad Pitt, a mother of six children, and an inspirational humanitarian. The websites also depict Jolie's health decision in positive, negative, and/or both ways. Discussion centers on how this actress' health decision impacts the public.

  17. Information leakage analysis of software: How to make it useful to IT industries?

    Directory of Open Access Journals (Sweden)

    Kushal Anjaria

    2017-06-01

    Nowadays software is becoming more complex as clients expect numerous functionalities, and in such a scenario information leakage cannot be avoided. As a result, a great deal of research is devoted to developing tools, methods and policies to find and minimize leakage. The paper proposes a method that provides a measure, aimed especially at IT organizations, of how information leakage in one portion of the software can propagate leakage risk to other portions or to the software as a whole. The paper uses quantitative analysis of information leakage and a cost-function-based statistical method to find the leakage-risk propagation in the software. The proposed method accommodates organizations by allowing them to set organization-specific parameters. It has been applied to a function of Linux to demonstrate information-leakage risk propagation. When organizations find information leakage in software, their sustaining engineering or quality management teams simply rectify the affected portion, but it is difficult for the organizations to document the overall mitigation of leakage risk. Using the proposed method, organizations will be able to quantify the mitigation of information-leakage risk.
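
    The paper's own leakage metric is not reproduced in the abstract, but the general idea of quantifying leakage in bits can be illustrated with a minimal sketch (the `shannon_leakage` helper and the parity example below are illustrative assumptions, not the paper's method): for a deterministic program and a uniformly distributed secret, leakage is the drop in Shannon entropy, H(S) - H(S|O).

```python
import math
from collections import defaultdict

def shannon_leakage(secrets, program):
    """Leakage in bits of a deterministic `program` about a uniformly
    distributed secret: H(S) - H(S | O)."""
    secrets = list(secrets)
    n = len(secrets)
    h_prior = math.log2(n)
    # Group secrets by the output an observer would see.
    groups = defaultdict(list)
    for s in secrets:
        groups[program(s)].append(s)
    # Given output o, the secret is uniform over its group,
    # so H(S|O) = sum_o p(o) * log2(|group_o|).
    h_post = sum(len(g) / n * math.log2(len(g)) for g in groups.values())
    return h_prior - h_post

# A 4-bit secret observed only through its parity leaks exactly 1 bit.
leak = shannon_leakage(range(16), lambda s: s % 2)
```

Revealing the secret entirely (`lambda s: s`) would yield the full 4 bits, while a constant output leaks 0 bits, which matches the intuition behind risk propagation across software portions.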

  18. Applying a sociolinguistic model to the analysis of informed consent documents.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  19. Analysis of Transaction Costs in Logistics and the Methodologies for Their Information Reflection for Automotive Companies

    Directory of Open Access Journals (Sweden)

    Ol’ga Evgen’evna Kovrizhnykh

    2016-12-01

    Transaction costs emerge in different types of logistics activities and influence the material flow and the accompanying financial and information flows; their information support and assessment are therefore important tasks for the enterprise. The paper analyzes transaction costs in logistics for automotive manufacturers; according to the analysis, the level of these costs in any functional area of “logistics supply” ranges from 1.5 to 20%. These are only the official figures of enterprises' transaction costs, which do not take implicit costs into consideration. Despite the growing interest in transaction costs in logistics over the last fifteen years, the topic is covered rather poorly in the Russian literature: the definition of “transaction costs” is unclear, and there is no technique for their information reflection and assessment. We have developed methods for the information reflection of transaction costs that can be used by automotive enterprises. Each enterprise will be able to choose the technique for the information reflection of transaction costs that suits it best, or to compare the level of transaction costs under different techniques. Applying techniques for the information reflection of transaction costs allows enterprises to increase profits by optimizing and reducing costs and using their assets more effectively; to identify possible ways to improve the cost parameters of their performance; to improve their efficiency and productivity; and to cut out unnecessary or duplicate activities and optimize the number of staff involved in a particular activity.

  20. Communicating Risk Information in Direct-to-Consumer Prescription Drug Television Ads: A Content Analysis.

    Science.gov (United States)

    Sullivan, Helen W; Aikin, Kathryn J; Poehlman, Jon

    2017-11-10

    Direct-to-consumer (DTC) television ads for prescription drugs are required to disclose the product's major risks in the audio or audio and visual parts of the presentation (sometimes referred to as the "major statement"). The objective of this content analysis was to determine how the major statement of risks is presented in DTC television ads, including what risk information is presented, how easy or difficult it is to understand the risk information, and the audio and visual characteristics of the major statement. We identified 68 DTC television ads for branded prescription drugs, each with a unique major statement, that aired between July 2012 and August 2014. We used subjective and objective measures to code 50 ads randomly selected from the main sample. Major statements often presented numerous risks, usually in order of severity, with no quantitative information about the risks' severity or prevalence. The major statements required a high school reading level, and many included long and complex sentences. The major statements were often accompanied by competing non-risk information in the visual images, presented with moderately fast-paced music, and read at a faster pace than benefit information. Overall, we discovered several ways in which the communication of risk information could be improved.

  1. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, surveying and mapping in China is inevitably shifting from 4D product services to services centered on NCGISPC (National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources and agriculture, the shortcomings of the traditional service mode are becoming more and more obvious, owing to the rapidly emerging requirements of e-government construction, the remarkable development of IT technology, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services, and APIs for further development, to government, business and the public, is now the strategic core of SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using SWOT (Strength, Weakness, Opportunity and Threat) analysis and comparing it to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode for geospatial information in China, and it will surely have a great impact not only on the construction of digital China but also on the way everyone uses geospatial information services.

  2. ANALYSIS OF INFORMATION SYSTEM IMPLEMENTATION IN BINUS UNIVERSITY USING DELONE AND MCLEAN INFORMATION SYSTEM SUCCESS MODEL AND COBIT FRAMEWORK

    OpenAIRE

    Johan Muliadi Kerta; Angellia Debora Suryawan

    2013-01-01

    The success of information system implementation in an organization will support the organization in achieving its goals. A successful information system will support the organization's day-to-day operations, so that problems can be resolved more quickly and easily. It is also necessary to measure the maturity level of the information system that has been developed and implemented, in order to determine whether the implementation of the information system is in accordance with the goals...

  3. Analysis of Factors Affect to Organizational Performance In Using Accounting Information Systems Through Users Satisfaction and Integration Information Systems

    Directory of Open Access Journals (Sweden)

    Anton Arisman

    2017-09-01

    The aim of this research is to investigate the factors affecting organizational performance in using accounting information systems, through user satisfaction and the integration of information systems. The research respondents were 447 companies listed on the Indonesian Stock Exchange. The data were gathered through a census method, and in total there were 176 responses with complete data. Structural Equation Modeling (SEM) was used in analyzing the data, and systems theory is utilized in this research. The result shows that knowledge management systems and management control systems have a significant influence on user satisfaction and the integration of information systems. The integration of information systems and user satisfaction have a significant positive effect on organizational performance.

  4. Methodological choices for research in Information Science: Contributions to domain analysis

    Directory of Open Access Journals (Sweden)

    Juliana Lazzarotto FREITAS

    The article focuses on ways of organizing studies according to their methodological choices in the Base Referencial de Artigos de Periódicos em Ciência da Informação (Reference Database of Journal Articles in Information Science). We highlight how organizing scientific production by methodological choices in Information Science contributes to identifying the features of that production and to domain analysis. We studied research categories and proposed five classification criteria: research purpose, approach, focus, techniques, and type of analysis. The proposal is applied empirically to a corpus in Information Science of 689 articles, 10% of the production indexed in the Base Referencial de Artigos de Periódicos em Ciência da Informação from 1972 to 2010. We adopt content analysis to interpret the methodological choices of the authors identified in the corpus. The results point out that exploratory studies predominate with respect to research purpose; regarding research approach, bibliographic and documentary studies predominate; systematic observation, questionnaires and interviews were the most widely used techniques; document analysis and content analysis are the most widely used types of analysis; and theoretical, historical and bibliometric studies predominate with respect to research focus. We found that some studies combine two methodological choices and make their epistemological approaches explicit, such as the studies following the positivist approach in the 1970s and those influenced by the phenomenological approach in the 1980s, which increased the use of qualitative research methods.

  5. Information Management System Development for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    National Research Council Canada - National Science Library

    Nelson, Douglas

    2001-01-01

    The purpose of this research is to evaluate and refine a safety information management system that will facilitate data collection, organization, query, analysis and reporting of maintenance errors...

  6. Medication errors in residential aged care facilities: a distributed cognition analysis of the information exchange process.

    Science.gov (United States)

    Tariq, Amina; Georgiou, Andrew; Westbrook, Johanna

    2013-05-01

    Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. The existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, in order to identify gaps in medication-related information exchange that lead to medication errors in RACFs. The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May-September 2011). Triangulated analysis of the data focused primarily on examining the transformation and exchange of information between different media across the process. The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through the identification of gaps in information exchange. Understanding

  7. The 2006 Analysis of Information Remaining on Disks Offered for Sale on the Second Hand Market

    Directory of Open Access Journals (Sweden)

    Andy Jones

    2006-09-01

    All organisations, whether in the public or private sector, use computers for the storage and processing of information relating to their business or services, their employees and their customers. A large proportion of families and individuals in their homes now also use personal computers and, both intentionally and inadvertently, often store on those computers personal information. It is clear that most organisations and individuals continue to be unaware of the information that may be stored on the hard disks that the computers contain, and have not considered what may happen to the information after the disposal of the equipment. In 2005, joint research was carried out by the University of Glamorgan in Wales and Edith Cowan University in Australia to determine whether second hand computer disks that were purchased from a number of sources still contained any information or whether the information had been effectively erased. The research revealed that, for the majority of the disks that were examined, the information had not been effectively removed and as a result, both organisations and individuals were potentially exposed to a range of potential crimes. It is worthy of note that in the disposal of this equipment, the organisations involved had failed to meet their statutory, regulatory and legal obligations. This paper describes a second research project that was carried out in 2006 which repeated the research carried out the previous year and also extended the scope of the research to include additional countries. The methodology used was the same as that in the previous year and the disks that were used for the research were again supplied blind by a third party. The research involved the forensic imaging of the disks, which was followed by an analysis of the disks to determine what information remained and whether it could be easily recovered using publicly available tools and techniques.

  8. Development of efficient system for collection-analysis-application of information using system for technology and information in field of RI-biomics

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-08-15

    RI-Biomics is a new radiation fusion technology in which the characteristics of radioisotopes are applied to biomics. In order to share and comprehensively analyze data between institutions through the total management of information in the field of RI-Biomics, the RI-Biomics information portal ‘RIBio-Info’ was constructed by KARA (Korean Association for Radiation Application) in February 2015. For the systematic operation of the ‘RIBio-Info’ system, a system for the collection, analysis and application of information is required. In this paper, we therefore summarize the development of document forms for each process of information collection, analysis and application; the systematization of information collection methods; and the establishment of methods for the characteristic analysis of reports such as issue papers, policy reports, global market reports and watch reports. These are expected to improve practical applicability in this field by invigorating users' technology development through a circular structure of information collection, analysis and application.

  9. Information Gap Analysis: near real-time evaluation of disaster response

    Science.gov (United States)

    Girard, Trevor

    2014-05-01

    Disasters, such as major storm events or earthquakes, trigger an immediate response by the disaster management system of the nation in question. The quality of this response is a large factor in its ability to limit the impacts on the local population. Improving the quality of disaster response therefore reduces disaster impacts. Studying past disasters is a valuable exercise for understanding what went wrong, identifying measures which could have mitigated these issues, and making recommendations to improve future disaster planning and response. While such ex post evaluations can lead to improvements in the disaster management system, there are limitations. The main limitation that has influenced this research is that ex post evaluations cannot inform the disaster response being assessed, for the obvious reason that they are carried out long after the response phase is over. The result is that lessons learned can only be applied to future disasters. In the field of humanitarian relief, this limitation has led to the development of real-time evaluations. The key aspect of real-time humanitarian evaluations is that they are completed while the operation is still underway, so that findings are delivered at a time when they can still make a difference to the humanitarian response. Applying such an approach to the immediate disaster response phase requires an even shorter time-frame, as well as a shift in focus from international actors to the government of the nation in question. As such, a pilot study was started, and a methodology developed, to analyze disaster response in near real-time. The analysis uses the information provided by the disaster management system within the first 0 - 5 days of the response. The data is collected from publicly available sources such as ReliefWeb and sorted under various categories which represent each aspect of disaster response. This process was carried out for 12 disasters. The quantity and timeliness of information

  10. Integrating semantic annotation and information visualization for the analysis of multichannel fluorescence micrographs from pancreatic tissue.

    Science.gov (United States)

    Herold, Julia; Zhou, Luxian; Abouna, Sylvie; Pelengaris, Stella; Epstein, David; Khan, Michael; Nattkemper, Tim W

    2010-09-01

    The challenging problem of computational bioimage analysis is receiving growing attention from the life sciences. Fluorescence microscopy is capable of simultaneously visualizing multiple molecules by staining with different fluorescent dyes. In the analysis of the resulting multichannel images, segmentation of ROIs is only a first step, which must be followed by a second step analyzing each ROI's signals in the different channels. In this paper we present a system that combines image segmentation and information visualization principles for an integrated analysis of fluorescence micrographs of tissue samples. The analysis aims at the detection and annotation of cells of the islets of Langerhans and the whole pancreas, which is of great importance in diabetes studies and in the search for new anti-diabetes treatments. The system operates with two modules. The automatic annotation module applies supervised machine learning for cell detection and segmentation. The second, information visualization module can be used for interactive classification and visualization of cell types, following the link-and-brush principle for filtering. We compared the results obtained with our system with results obtained manually by an expert, who evaluated a set of example images three times to account for his intra-observer variance. The comparison shows that with our system the images can be evaluated with high accuracy, which allows a considerable speed-up of the time-consuming evaluation process.

  11. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging, and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). By using the single Bragg-edge analysis mode of RITS, the crystal lattice plane spacing (d-spacing), relating to macro-strain, and the FWHM (full width at half maximum) of the d-spacing distribution, relating to micro-strain, can be evaluated. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast method for determining grain orientation from the Bragg-dip neutron transmission spectrum. In this paper, these imaging examples, together with the spectrum analysis methods and their reliability as evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.
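
    As background to the single Bragg-edge analysis described above, the relations that make the method work can be stated compactly; these are the standard Bragg-edge and strain formulas, not expressions taken from the RITS code itself:

```latex
% Bragg's law; the transmission edge for the {hkl} planes appears where
% backscattering (theta = 90 degrees) ceases:
\lambda = 2 d_{hkl} \sin\theta
\quad\Rightarrow\quad
\lambda_{\mathrm{edge}} = 2 d_{hkl}

% Macro-strain from the measured edge shift relative to a strain-free
% reference spacing d_0 (equivalently, reference edge wavelength):
\varepsilon = \frac{d_{hkl} - d_0}{d_0}
            = \frac{\lambda_{\mathrm{edge}} - \lambda_{\mathrm{edge},0}}{\lambda_{\mathrm{edge},0}}
```

    A shift of the edge position thus maps directly to macro-strain, while broadening of the d-spacing distribution (its FWHM) reflects micro-strain.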

  12. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM), such as regression splines or decision tree induction, can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA) and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore

  13. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    Implementation of a neuron-like information processing structure at the hardware level is a pressing research problem. In this article, we analyze the modified hybrid spiking neuron model (the MHSN model) in the distributed delay framework (DDF) from the point of view of hardware-level implementation. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, steady-state solutions, and eigenvalues corresponding to the MHSN model. During phase-plane analysis, we notice that the MHSN model generates limit-cycle oscillations, an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, delay affects the spiking activity, and the duration of the cycle is altered.

  14. Analysis of public consciousness structure and consideration of information supply against the nuclear power generation

    International Nuclear Information System (INIS)

    Shimooka, Hiroshi

    2001-01-01

    The Energy Engineering Research Institute carried out six questionnaire surveys analyzing the structure of public consciousness in fiscal years 1986 to 1999, obtaining a great deal of information on public perception of nuclear power generation. Because the JCO criticality accident of September 1999, which produced the first victims of such an accident in Japan, occurred after the fiscal 1998 survey and was expected to change public consciousness of nuclear power generation, the same questionnaire as in the previous fiscal year was administered to the same respondents after the accident, to analyze how evaluations of nuclear power generation, the factors determining behavior, and so forth were changed by the accident. In this paper, referring to the results of past questionnaires, the questionnaire results and their analysis before and after the JCO criticality accident are presented, and the information supply they suggest is considered. (G.K.)

  15. Analysis of information for cerebrovascular disorders obtained by 3D MR imaging

    International Nuclear Information System (INIS)

    Yoshikawa, Kohki; Yoshioka, Naoki; Watanabe, Fumio; Shiono, Takahiro; Sugishita, Morihiro; Umino, Kazunori.

    1995-01-01

    Recently, it has become easy to analyze information obtained by 3D MR imaging, owing to remarkable progress in fast MR imaging techniques and analysis tools. Six patients suffering from aphasia (4 cerebral infarctions and 2 hemorrhages) underwent 3D MR imaging (3D FLASH; TR/TE/flip angle: 20-50 msec/6-10 msec/20-30 degrees), and the volume information was analyzed by multiple projection reconstruction (MPR), surface-rendering 3D reconstruction, and volume-rendering 3D reconstruction using Volume Design PRO (Medical Design Co., Ltd.). Four of the patients were diagnosed clinically with Broca's aphasia, and their lesions could be detected around the cortices of the left inferior frontal gyrus. The other 2 patients were diagnosed with Wernicke's aphasia, and their lesions could be detected around the cortices of the left supramarginal gyrus. This technique for 3D volume analysis would provide quite exact locational information about cerebral cortical lesions. (author)

  17. An analysis method of the press information related with the nuclear activity in Argentina

    International Nuclear Information System (INIS)

    Alsina, Griselda.

    1989-01-01

    The articles published in the newspapers during 1987 were analyzed and classified according to their contents. An attribute (positive, negative or neutral) was assigned to each article in agreement with its connotation regarding nuclear activity in Argentina. An ISIS database system was developed using these data. The purpose of this analysis was to evaluate the influence of the press on public opinion. The relations between the different variables show the importance given by the press to the different subjects, and the approach taken (environmental, technico-scientific or political). The results show a general lack of knowledge about nuclear activities and a concern among readers about environmental risks, which calls for the development of an information program for the community. The fundamentals of this program should be to improve the organization so that information reaches external demands, to promote educational programs, and to provide information to the press continuously. (S.M.)

  18. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    Full Text Available This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has recently been introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
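
For quasi-static Rayleigh fading, the Probability of Nonzero Secrecy Capacity mentioned above has a well-known closed form that depends only on the average SNRs of the main and wiretap channels (Barros and Rodrigues, 2006): P(Cs > 0) = gamma_M / (gamma_M + gamma_W), with SNRs in linear scale. A minimal sketch with illustrative SNR values, not the paper's measurements:

```python
# Probability of nonzero secrecy capacity under quasi-static Rayleigh
# fading (Barros & Rodrigues, 2006). SNR values below are illustrative.

def prob_nonzero_secrecy(snr_main_db: float, snr_wiretap_db: float) -> float:
    """P(Cs > 0) = gamma_M / (gamma_M + gamma_W), average SNRs in linear scale."""
    g_m = 10 ** (snr_main_db / 10.0)
    g_w = 10 ** (snr_wiretap_db / 10.0)
    return g_m / (g_m + g_w)

if __name__ == "__main__":
    # Equal average SNRs give a 50% chance of positive secrecy capacity.
    print(round(prob_nonzero_secrecy(20.0, 20.0), 3))  # 0.5
    # A 10 dB advantage for the legitimate link pushes it above 90%.
    print(round(prob_nonzero_secrecy(20.0, 10.0), 3))
```

The formula makes the experimental point directly: the legitimate link only needs an SNR advantage on average, not at every instant, for a positive secrecy capacity to be likely.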

  19. Information search behaviour among new car buyers: A two-step cluster analysis

    Directory of Open Access Journals (Sweden)

    S.M. Satish

    2010-03-01

    Full Text Available A two-step cluster analysis of new car buyers in India was performed to identify taxonomies of search behaviour using personality and situational variables, apart from sources of information. Four distinct groups were found—broad moderate searchers, intense heavy searchers, low broad searchers, and low searchers. Dealers can identify the members of each segment by measuring the variables used for clustering, and can then design appropriate communication strategies.

  20. Using pattern structures to support information retrieval with Formal Concept Analysis

    OpenAIRE

    Codocedo , Victor; Lykourentzou , Ioanna; Astudillo , Hernan; Napoli , Amedeo

    2013-01-01

    International audience; In this paper we introduce a novel approach to information retrieval (IR) based on Formal Concept Analysis (FCA). The use of concept lattices to support the task of document retrieval in IR has proven effective since they allow querying in the space of terms modelled by concept intents and navigation in the space of documents modelled by concept extents. However, current approaches use binary representations to illustrate the relations between documents and terms (''do...

  1. Information Operations Versus Civilian Marketing and Advertising: A Comparative Analysis to Improve IO Planning and Strategy

    Science.gov (United States)

    2008-03-01

    Successful marketing strategy includes the basic "4Ps of marketing" and other concepts, relevant to IO, which are known successful marketing practices. (Master's thesis by Dan Chilton, March 2008.)

  2. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  3. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Polyakov, P.Y.

    2016-01-01

    Roč. 27, č. 3 (2016), s. 538-550 ISSN 2162-237X R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:67985807 Keywords : associative memory * bars problem (BP) * Boolean factor analysis (BFA) * data mining * dimension reduction * Hebbian learning rule * information gain * likelihood maximization (LM) * neural network application * recurrent neural network * statistics Subject RIV: IN - Informatics, Computer Science Impact factor: 6.108, year: 2016

  4. EQUILIBRIUM ANALYSIS OF FINANCIAL COMPANY BASED ON INFORMATION PROVIDED BY THE BALANCE SHEET

    Directory of Open Access Journals (Sweden)

    Ștefăniță ȘUȘU

    2014-06-01

    Full Text Available This article highlights the importance of indicators (net working capital, working capital requirements, and net cash) by means of which financial balance is assessed, capitalizing on the information released by the balance sheet of a tourism-sector entity. Theoretical concepts presented in a logical sequence are combined with a practical example based on the Turism Covasna company. The results of the analysis are interpreted while formulating solutions for the economic and financial viability of the entity.

  5. Automating annotation of information-giving for analysis of clinical conversation.

    Science.gov (United States)

    Mayfield, Elijah; Laws, M Barton; Wilson, Ira B; Penstein Rosé, Carolyn

    2014-02-01

    Coding of clinical communication for fine-grained features such as speech acts has produced a substantial literature. However, annotation by humans is laborious and expensive, limiting application of these methods. We aimed to show that through machine learning, computers could code certain categories of speech acts with sufficient reliability to make useful distinctions among clinical encounters. The data were transcripts of 415 routine outpatient visits of HIV patients which had previously been coded for speech acts using the Generalized Medical Interaction Analysis System (GMIAS); 50 had also been coded for larger scale features using the Comprehensive Analysis of the Structure of Encounters System (CASES). We aggregated selected speech acts into information-giving and requesting, then trained the machine to automatically annotate using logistic regression classification. We evaluated reliability by per-speech act accuracy. We used multiple regression to predict patient reports of communication quality from post-visit surveys using the patient and provider information-giving to information-requesting ratio (briefly, information-giving ratio) and patient gender. Automated coding produces moderate reliability with human coding (accuracy 71.2%, κ=0.57), with high correlation between machine and human prediction of the information-giving ratio (r=0.96). The regression significantly predicted four of five patient-reported measures of communication quality (r=0.263-0.344). The information-giving ratio is a useful and intuitive measure for predicting patient perception of provider-patient communication quality. These predictions can be made with automated annotation, which is a practical option for studying large collections of clinical encounters with objectivity, consistency, and low cost, providing greater opportunity for training and reflection for care providers.
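
The classification step above can be sketched with a toy logistic regression; the features, labels, and data below are invented for illustration (the GMIAS-coded transcripts are not public), and the paper reports 71.2% per-speech-act accuracy on its real data:

```python
import numpy as np

# Toy stand-in for the automated annotation pipeline: label utterances as
# information-giving (1) vs information-requesting (0) with logistic
# regression, then form the information-giving ratio used to predict
# communication quality. Features and labels are invented.

def train_logreg(X, y, lr=0.5, steps=2000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # sigmoid
        w -= lr * X.T @ (p - y) / len(y)     # gradient step on log-loss
    return w

# Columns (hypothetical): [bias, has question mark, word count / 10]
X = np.array([[1, 1, 0.3], [1, 1, 0.5], [1, 0, 1.2],
              [1, 0, 0.9], [1, 1, 0.4], [1, 0, 1.5]])
y = np.array([0, 0, 1, 1, 0, 1])             # 1 = information-giving

w = train_logreg(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w))) > 0.5).astype(int)
ratio = pred.sum() / max((pred == 0).sum(), 1)  # giving : requesting
print(pred.tolist(), round(float(ratio), 2))
```

With predictions in hand, the information-giving ratio is just the count of giving acts over requesting acts, aggregated per encounter.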

  6. Examining the Reasoning of Conflicting Science Information from the Information Processing Perspective--An Eye Movement Analysis

    Science.gov (United States)

    Yang, Fang-Ying

    2017-01-01

    The main goal of this study was to investigate how readers' visual attention distribution during reading of conflicting science information is related to their scientific reasoning behavior. A total of 25 university students voluntarily participated in the study. They were given conflicting science information about earthquake predictions to read…

  7. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  8. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  9. ANALYSIS OF INFORMATION SYSTEM IMPLEMENTATION IN BINUS UNIVERSITY USING DELONE AND MCLEAN INFORMATION SYSTEM SUCCESS MODEL AND COBIT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Johan Muliadi Kerta

    2013-05-01

    Full Text Available The success of the implementation of an information system in an organization will support the organization in the process of achieving its goals. A successful information system will support the organization's day-to-day operations, so that problems can be resolved more quickly and easily. It is also necessary to measure the maturity level of the information system which has been developed and implemented, in order to determine whether the implementation of the information system is in accordance with the goals of the organization. To measure the success of information systems, the DeLone and McLean IS success model was used. To measure the maturity level of information systems, the COBIT (Control Objectives for Information and related Technology) framework, which provides best practices for IT governance and control, was used. The results of this analysis will assist and support the IT team in developing and building information systems that better fit the needs and goals of the organization.

  10. Exploratory spatial analysis of pilot fatality rates in general aviation crashes using geographic information systems.

    Science.gov (United States)

    Grabowski, Jurek G; Curriero, Frank C; Baker, Susan P; Li, Guohua

    2002-03-01

    Geographic information systems and exploratory spatial analysis were used to describe the geographic characteristics of pilot fatality rates in 1983-1998 general aviation crashes within the continental United States. The authors plotted crash sites on a digital map; rates were computed at regular grid intersections and then interpolated by using geographic information systems. A test for significance was performed by using Monte Carlo simulations. Further analysis compared low-, medium-, and high-rate areas in relation to pilot characteristics, aircraft type, and crash circumstance. Of the 14,051 general aviation crashes studied, 31% were fatal. Seventy-four geographic areas were categorized as having low fatality rates and 53 as having high fatality rates. High-fatality-rate areas tended to be mountainous, such as the Rocky Mountains and the Appalachian region, whereas low-rate areas were relatively flat, such as the Great Plains. Further analysis comparing low-, medium-, and high-fatality-rate areas revealed that crashes in high-fatality-rate areas were more likely than crashes in other areas to have occurred under instrument meteorologic conditions and to involve aircraft fire. This study demonstrates that geographic information systems are a valuable tool for injury prevention and aviation safety research.

  11. Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steve [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ma, Zhegang [Idaho National Lab. (INL), Idaho Falls, ID (United States); Spears, Bob [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kosbab, Ben [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-07-26

    This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements achieved during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.

  12. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    Science.gov (United States)

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

    Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, understanding vocal expression is a potentially important biometric index of information processing, not only across but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and uninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
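
The pause and utterance measures reported above can be approximated from a frame-level energy contour. A minimal sketch, assuming 10 ms frames, a simple energy threshold, and a 150 ms minimum pause duration (the study's actual software and thresholds differ):

```python
import numpy as np

# Toy sketch of automated pause/silence measures from a speech energy
# contour. Frame length, threshold, and minimum pause duration are
# assumptions for illustration only.
def pause_stats(frame_energy, threshold=0.01, frame_s=0.01, min_pause_s=0.15):
    silent = frame_energy < threshold
    pauses, run = [], 0
    for s in silent:                      # group consecutive silent frames
        if s:
            run += 1
        else:
            if run * frame_s >= min_pause_s:
                pauses.append(run * frame_s)
            run = 0
    if run * frame_s >= min_pause_s:      # trailing pause, if any
        pauses.append(run * frame_s)
    # pause count, total pause duration (s), percent silence overall
    return len(pauses), sum(pauses), silent.mean() * 100

# Synthetic contour: speech, a 0.3 s pause, speech, a 0.2 s pause.
energy = np.concatenate([np.full(50, 0.2), np.full(30, 0.0),
                         np.full(40, 0.3), np.full(20, 0.0)])
print(pause_stats(energy))
```

Measures like these (pause count, total silence, utterance counts) are exactly the kind of objective, low-cost outputs the study links to processing load.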

  13. Value of information analysis for interventional and counterfactual Bayesian networks in forensic medical sciences.

    Science.gov (United States)

    Constantinou, Anthony Costa; Yet, Barbaros; Fenton, Norman; Neil, Martin; Marsh, William

    2016-01-01

    Inspired by real-world examples from the forensic medical sciences domain, we seek to determine whether a decision about an interventional action could be subject to amendments on the basis of some incomplete information within the model, and whether it would be worthwhile for the decision maker to seek further information prior to suggesting a decision. The method is based on the underlying principle of Value of Information to enhance decision analysis in interventional and counterfactual Bayesian networks. The method is applied to two real-world Bayesian network models (previously developed for decision support in forensic medical sciences) to examine the average gain in terms of both Value of Information (average relative gain ranging from 11.45% to 59.91%) and decision making (potential amendments in decision making ranging from 0% to 86.8%). We have shown how the method becomes useful for decision makers, not only when decision making is subject to amendments on the basis of some unknown risk factors, but also when it is not. Knowing that a decision outcome is independent of one or more unknown risk factors saves us from the trouble of seeking information about the particular set of risk factors. Further, we have also extended the assessment of this implication to the counterfactual case and demonstrated how answers about interventional actions are expected to change when some unknown factors become known, and how useful this becomes in forensic medical science. Copyright © 2015 Elsevier B.V. All rights reserved.
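
The underlying Value of Information principle can be illustrated with the classic expected value of perfect information (EVPI) calculation: the expected gain from learning an unknown risk factor before committing to a decision. A generic sketch with invented probabilities and utilities, not the paper's forensic models:

```python
# Generic value-of-information sketch (not the paper's Bayesian networks):
# a binary decision under one unknown risk factor. EVPI is the expected
# gain from observing the factor before deciding.

p_risk = 0.3  # P(risk factor present); illustrative
# utility[decision][risk_state]; illustrative values
utility = {"intervene": {True: 0.9, False: 0.6},
           "wait":      {True: 0.2, False: 0.8}}

def expected_utility(decision, p):
    u = utility[decision]
    return p * u[True] + (1 - p) * u[False]

# Decide now, without further information:
best_now = max(expected_utility(d, p_risk) for d in utility)

# Decide after observing the factor (perfect information):
with_info = (p_risk * max(utility[d][True] for d in utility)
             + (1 - p_risk) * max(utility[d][False] for d in utility))

evpi = with_info - best_now
print(round(best_now, 3), round(with_info, 3), round(evpi, 3))  # 0.69 0.83 0.14
```

When EVPI is zero, the decision is independent of the unknown factor, which is precisely the case the paper highlights as saving the decision maker the trouble of seeking that information.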

  14. Information dimension analysis of bacterial essential and nonessential genes based on chaos game representation

    International Nuclear Information System (INIS)

    Zhou, Qian; Yu, Yong-ming

    2014-01-01

    Essential genes are indispensable for the survival of an organism. Investigating features associated with gene essentiality is fundamental to the prediction and identification of the essential genes. Selecting features associated with gene essentiality is fundamental to predict essential genes with computational techniques. We use fractal theory to make comparative analysis of essential and nonessential genes in bacteria. The information dimensions of essential genes and nonessential genes available in the DEG database for 27 bacteria are calculated based on their gene chaos game representations (CGRs). It is found that weak positive linear correlation exists between information dimension and gene length. Moreover, for genes of similar length, the average information dimension of essential genes is larger than that of nonessential genes. This indicates that essential genes show less regularity and higher complexity than nonessential genes. Our results show that for bacterium with a similar number of essential genes and nonessential genes, the CGR information dimension is helpful for the classification of essential genes and nonessential genes. Therefore, the gene CGR information dimension is very probably a useful gene feature for a genetic algorithm predicting essential genes. (paper)

  15. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    Full Text Available OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
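
The fixed-effect model used here is standard inverse-variance pooling: each study's effect size is weighted by the reciprocal of its variance, and the SES is the weighted mean. A sketch with made-up study effects (log relative risks), not values from the citrus fruit review:

```python
import math

# Fixed-effect (inverse-variance) pooling of study effect sizes. The
# per-study log relative risks and standard errors below are invented
# for illustration.
studies = [  # (log relative risk, standard error)
    (-0.22, 0.10),
    (-0.11, 0.15),
    (-0.31, 0.20),
]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Report on the relative-risk scale with a 95% confidence interval.
print(round(math.exp(pooled), 3),
      (round(math.exp(lo), 3), round(math.exp(hi), 3)))
```

Because the ICM collapses all intake categories and so contributes more (and less noisy) ES values per study, the pooled variance term shrinks, which is why the paper observes narrower confidence intervals than with the HLM.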

  16. Information dimension analysis of bacterial essential and nonessential genes based on chaos game representation

    Science.gov (United States)

    Zhou, Qian; Yu, Yong-ming

    2014-11-01

    Essential genes are indispensable for the survival of an organism. Investigating features associated with gene essentiality is fundamental to the prediction and identification of the essential genes. Selecting features associated with gene essentiality is fundamental to predict essential genes with computational techniques. We use fractal theory to make comparative analysis of essential and nonessential genes in bacteria. The information dimensions of essential genes and nonessential genes available in the DEG database for 27 bacteria are calculated based on their gene chaos game representations (CGRs). It is found that weak positive linear correlation exists between information dimension and gene length. Moreover, for genes of similar length, the average information dimension of essential genes is larger than that of nonessential genes. This indicates that essential genes show less regularity and higher complexity than nonessential genes. Our results show that for bacterium with a similar number of essential genes and nonessential genes, the CGR information dimension is helpful for the classification of essential genes and nonessential genes. Therefore, the gene CGR information dimension is very probably a useful gene feature for a genetic algorithm predicting essential genes.
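
The CGR information dimension described in this record can be estimated by box counting: map the sequence to points in the unit square, compute the information entropy I(ε) of the box occupancy probabilities at several box sizes ε, and take the slope of I(ε) against log(1/ε). A sketch with assumed corner assignments and box sizes (the paper's exact settings are not given here):

```python
import numpy as np

# Sketch of a CGR information-dimension estimate. Corner assignment and
# box sizes are illustrative assumptions, not the paper's settings.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    pts, x, y = [], 0.5, 0.5
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0  # midpoint toward the corner
        pts.append((x, y))
    return np.array(pts)

def information_dimension(pts, sizes=(2, 4, 8, 16, 32)):
    xs, ys = [], []
    for n in sizes:                        # n x n grid, box size eps = 1/n
        idx = np.minimum((pts * n).astype(int), n - 1)
        _, counts = np.unique(idx[:, 0] * n + idx[:, 1], return_counts=True)
        p = counts / len(pts)
        xs.append(np.log(n))               # log(1/eps)
        ys.append(-(p * np.log(p)).sum())  # information entropy I(eps)
    slope, _ = np.polyfit(xs, ys, 1)       # D1 = dI / d log(1/eps)
    return slope

rng = np.random.default_rng(0)
seq = "".join(rng.choice(list("ACGT"), size=20000))
print(round(information_dimension(cgr_points(seq)), 2))
```

For a random sequence the CGR fills the unit square roughly uniformly, so the estimate approaches 2; real gene sequences deviate from this, which is what makes the dimension a usable classification feature.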

  17. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data relating to the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back-substitution estimation method, showing high accuracy and no errors, and were the same as the prediction results of the uncertainty measure. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
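
The distance discriminant step can be sketched as follows: each sample is assigned to the class whose mean is nearest in Mahalanobis distance, computed with a pooled covariance matrix. The toy data and three features below are invented (the paper's nine collapse indexes and entropy-weighted training set are not public), and the entropy-based index selection is omitted:

```python
import numpy as np

# Sketch of Mahalanobis distance discriminant classification on toy data.
# Classes, features, and samples are invented for illustration.
rng = np.random.default_rng(1)
stable = rng.normal(0.0, 1.0, size=(40, 3))    # toy "low-risk" samples
unstable = rng.normal(2.0, 1.0, size=(40, 3))  # toy "high-risk" samples

m0, m1 = stable.mean(axis=0), unstable.mean(axis=0)
centered = np.vstack([stable - m0, unstable - m1])
cov_inv = np.linalg.inv(np.cov(centered, rowvar=False))  # pooled covariance

def mahalanobis_sq(x, mean):
    d = x - mean
    return float(d @ cov_inv @ d)

def classify(x):
    # assign to the class whose centre is nearest in Mahalanobis distance
    return 0 if mahalanobis_sq(x, m0) < mahalanobis_sq(x, m1) else 1

print(classify(np.array([0.1, -0.2, 0.0])),  # near the low-risk centre
      classify(np.array([2.1, 1.8, 2.2])))   # near the high-risk centre
```

Back-substitution estimation, as used in the paper, simply reclassifies the training samples with the fitted discriminant and counts misclassifications.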

  18. Empowering Students to Make Sense of an Information Saturated World: The Evolution of Information Searching and Analysis

    Directory of Open Access Journals (Sweden)

    James H. Wittebols

    2016-06-01

    Full Text Available How well students conduct research online is an increasing concern for educators at all levels but especially higher education. The paper describes the evolution of a course that grew from a unit within a course to a whole course that examines confirmation bias, information searching and the political economy of information as keys to becoming more information and media literate. After a key assignment in which students assess their own tendency to engage in confirmation bias, students choose a social justice issue to investigate across web, news and academic research resources. Designed to build good analytical skills in assessing the trustworthiness of a variety of sources of information, the course empowers students as researchers, citizens and consumers.

  19. Spatiotemporal Analysis of the Formation of Informal Settlements in a Metropolitan Fringe: Seoul (1950–2015)

    Directory of Open Access Journals (Sweden)

    Yiwen Han

    2017-07-01

    Full Text Available In many metropolitan areas, the urban fringe is defined by highly sensitive habitats such as forests and wetlands. However, the explosive growth of urban areas has led to the formation of informal settlements in the urban fringe, subsequently threatening these sensitive habitats and exaggerating several social and environmental problems. We seek to improve the current understanding of informal settlements and their formation in the metropolitan fringe through a comprehensive spatiotemporal analysis of the Guryong Area (GA) in the Gangnam District, Seoul, South Korea. We measured the land-use and land-cover (LULC) changes in the entire GA from 1950 to 2015, and then analyzed the changes in one specific land-use type defined as “spontaneous settlements”. We then combined these changes with landform and slope data in 600-m-wide bands along the gradient of urbanization. The results showed spontaneous settlements distributed in small clusters in 1975, and the growth of this distribution into larger, more condensed clusters beginning in 1985. Between 1950 and 2015, the total area of spontaneous settlements decreased, while the settlement locations shifted from the urban core to the marginal area of the GA. Meanwhile, the locations selected for spontaneous settlements moved from plain areas with slopes of 2–7%, to more steeply sloped, remote areas such as the mountain foothills with slopes of 15–30%. These results suggest that the spatial characteristics of informal settlements are shown in the degree of aggregation and the marginalized trend indicated by the analysis of spontaneous settlements. Finally, we hope the spatial analysis can be used as a basis and starting point for the evaluation process of informal settlement redevelopments in other areas of Seoul, as well as in other Asian cities.

  20. FACTORS OF INFLUENCE ON THE ENTREPRENEURIAL INTEREST: AN ANALYSIS WITH STUDENTS OF INFORMATION TECHNOLOGY RELATED COURSES

    Directory of Open Access Journals (Sweden)

    Diego Guilherme Bonfim

    2009-10-01

    Full Text Available The purpose of the research was to analyze the entrepreneurial interest of students in information technology related courses. A literature review was performed, from which four hypotheses were announced, affirming that student interest in entrepreneurial activity is influenced by (1) the perceived vocation of the area, (2) the ownership of a company, (3) the perceived social support from friends and family, and (4) entrepreneurial skills mastery. A field study was developed, with data collected from 171 students of higher education institutions in Fortaleza. The data were analyzed using the statistical techniques of descriptive analysis, analysis of variance, and multiple regression analysis. It was found that: (1) students, in general, have a moderate predisposition to engage in entrepreneurial activities; and (2) entrepreneurial interest is influenced by the perceived entrepreneurial vocation of the area, social support, and perceived strategic entrepreneurial skills mastery.

  1. BIM ORIENTATION: GRADES OF GENERATION AND INFORMATION FOR DIFFERENT TYPE OF ANALYSIS AND MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    F. Banfi

    2017-08-01

    Full Text Available The Architecture, Engineering and Construction (AEC) industry is facing a great re-engineering of the management procedures for new constructions, and recent studies show a significant increase in the benefits obtained through the use of Building Information Modelling (BIM) methodologies. This innovative approach needs new developments in information and communication technologies (ICT) in order to improve cooperation and interoperability among different actors and scientific disciplines. Accordingly, BIM could be described as a new tool capable of collecting and analysing a great quantity of information (big data) and improving the management of a building during its life cycle (LC). The main aim of this research is, in addition to a reduction in production times and in physical and financial resources (economic impact), to demonstrate how technology development can support a complex generative process with new digital tools (modelling impact). This paper reviews recent BIMs of different historical Italian buildings, such as the Basilica of Collemaggio in L’Aquila, Masegra Castle in Sondrio, the Basilica of Saint Ambrose in Milan and the Visconti Bridge in Lecco, and carries out a methodological analysis to optimize output information and results, combining different data and modelling techniques into a single hub (cloud service) through the use of new Grades of Generation (GoG) and Information (GoI) (management impact). Finally, this study shows the need to orient GoG and GoI to the type of analysis, which requires a high Grade of Accuracy (GoA) and an Automatic Verification System (AVS) at the same time.

  2. Historic Surface Rupture Informing Probabilistic Fault Displacement Analysis: New Zealand Case Studies

    Science.gov (United States)

    Villamor, P.; Litchfield, N. J.; Van Dissen, R. J.; Langridge, R.; Berryman, K. R.; Baize, S.

    2016-12-01

    Surface rupture associated with the 2010 Mw7.1 Darfield Earthquake (South Island, New Zealand) was extremely well documented, thanks to an immediate field mapping response and the acquisition of LiDAR data within days of the event. With respect to informing Probabilistic Fault Displacement Hazard Analysis (PFDHA), the main insights and outcomes from this rupture through Quaternary gravel are: 1) significant distributed deformation on either side of the main trace (a 30 to 300 m wide deformation zone) and how the deformation is distributed away from the main trace; 2) a thorough analysis of uncertainty in the displacement measures obtained using the LiDAR data and repeated measurements by several scientists; and 3) the short surface rupture length for the reported magnitude, resulting from a complex fault rupture with 5-6 reverse and strike-slip strands, most of which had no surface rupture. While the 2010 event is extremely well documented and will be an excellent case to add to the Surface Rupture during Earthquakes (SURE) database, other NZ historical earthquakes are not so well documented but can still provide important information for PFDHA. New Zealand has experienced about 10 historical surface fault ruptures since 1848, comprising ruptures on strike-slip, reverse and normal faults. Mw associated with these ruptures ranges between 6.3 and 8.1. From these ruptures we observed that the surface expression of deformation can be influenced by: fault maturity; the type of Quaternary sedimentary cover; fault history (e.g., influence of inversion tectonics, flexural slip); fault complexity; and primary versus secondary rupture. Other recent >Mw 6.6 earthquakes post-2010 that did not rupture the ground surface have been documented with InSAR and can inform Mw thresholds for surface fault rupture. It will be important to capture all this information and that of similar events worldwide to inform the SURE database and ultimately PFDHA.

  3. Improving access to health information for older migrants by using grounded theory and social network analysis to understand their information behaviour and digital technology use.

    Science.gov (United States)

    Goodall, K T; Newman, L A; Ward, P R

    2014-11-01

    Migrant well-being can be strongly influenced by the migration experience and subsequent degree of mainstream language acquisition. There is little research on how older Culturally And Linguistically Diverse (CALD) migrants who have 'aged in place' find health information, and the role which digital technology plays in this. Although the research for this paper was not focused on cancer, we draw out implications for providing cancer-related information to this group. We interviewed 54 participants (14 men and 40 women) aged 63-94 years, who were born in Italy or Greece, and who migrated to Australia mostly as young adults after World War II. Constructivist grounded theory and social network analysis were used for data analysis. Participants identified doctors, adult children, local television, spouse, local newspaper and radio as the most important information sources. They did not generally use computers, the Internet or mobile phones to access information. Literacy in their birth language, and the degree of proficiency in understanding and using English, influenced the range of information sources accessed and the means used. The ways in which older CALD migrants seek and access information has important implications for how professionals and policymakers deliver relevant information to them about cancer prevention, screening, support and treatment, particularly as information and resources are moved online as part of e-health. © 2014 John Wiley & Sons Ltd.

  4. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    Science.gov (United States)

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov chain Monte Carlo (MCMC) simulations. Univariate and the first-stage multivariate models produced broadly similar point estimates of intervention effects, but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second-stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on

  5. Risk analysis of information security in a mobile instant messaging and presence system for healthcare.

    Science.gov (United States)

    Bønes, Erlend; Hasvold, Per; Henriksen, Eva; Strandenaes, Thomas

    2007-09-01

    Instant messaging (IM) is suited for immediate communication because messages are delivered almost in real time. Results from studies of IM use in enterprise work settings make us believe that IM based services may prove useful also within the healthcare sector. However, today's public instant messaging services do not have the level of information security required for adoption of IM in healthcare. We proposed MedIMob, our own architecture for a secure enterprise IM service for use in healthcare. MedIMob supports IM clients on mobile devices in addition to desktop based clients. Security threats were identified in a risk analysis of the MedIMob architecture. The risk analysis process consists of context identification, threat identification, analysis of consequences and likelihood, risk evaluation, and proposals for risk treatment. The risk analysis revealed a number of potential threats to the information security of a service like this. Many of the identified threats are general when dealing with mobile devices and sensitive data; others are threats which are more specific to our service and architecture. Individual threats identified in the risk analysis are discussed and possible countermeasures presented. The risk analysis showed that most of the proposed risk treatment measures must be implemented to obtain an acceptable risk level; among others, blocking much of the additional functionality of the smartphone. To conclude on the usefulness of this IM service, it will be evaluated in a trial study of the human-computer interaction. Further work also includes an improved design of the proposed MedIMob architecture. © 2006 Elsevier Ireland Ltd.

  6. Computed ABC Analysis for Rational Selection of Most Informative Variables in Multivariate Data

    Science.gov (United States)

    Ultsch, Alfred; Lötsch, Jörn

    2015-01-01

    Objective Multivariate data sets often differ in several factors or derived statistical parameters, which have to be selected for a valid interpretation. Basing this selection on traditional statistical limits leads occasionally to the perception of losing information from a data set. This paper proposes a novel method for calculating precise limits for the selection of parameter sets. Methods The algorithm is based on an ABC analysis and calculates these limits on the basis of the mathematical properties of the distribution of the analyzed items. The limits implement the aim of any ABC analysis, i.e., comparing the increase in yield to the required additional effort. In particular, the limit for set A, the “important few”, is optimized in a way that both the effort and the yield for the other sets (B and C) are minimized and the additional gain is optimized. Results As a typical example from biomedical research, the feasibility of the ABC analysis as an objective replacement for classical subjective limits to select highly relevant variance components of pain thresholds is presented. The proposed method improved the biological interpretation of the results and increased the fraction of valid information that was obtained from the experimental data. Conclusions The method is applicable to many further biomedical problems including the creation of diagnostic complex biomarkers or short screening tests from comprehensive test batteries. Thus, the ABC analysis can be proposed as a mathematically valid replacement for traditional limits to maximize the information obtained from multivariate research data. PMID:26061064
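    The break-even idea described in the record above, trading the increase in yield against the required additional effort, can be illustrated with a small sketch. This is a generic cumulative-contribution ABC partition using a closest-to-ideal-point criterion on the (effort, yield) curve; the function name and the exact selection criterion are illustrative assumptions, not the authors' published algorithm.

    ```python
    def abc_partition(values):
        """Return the indices of set A, the "important few".

        Items are sorted by descending value; each prefix of length k gives a
        point (effort, yield) = (k/n, cumulative share of the total). Set A is
        the prefix whose point lies closest to the ideal point (0, 1) -- a
        common break-even heuristic (assumed here for illustration).
        """
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
        total = float(sum(values))
        n = len(values)
        best_k, best_d = 1, float("inf")
        cum = 0.0
        for k, i in enumerate(order, start=1):
            cum += values[i]
            effort, yld = k / n, cum / total
            d = effort ** 2 + (1.0 - yld) ** 2  # squared distance to (0, 1)
            if d < best_d:
                best_k, best_d = k, d
        return order[:best_k]

    # Two items carry 75% of the total: they form the "important few".
    print(abc_partition([50, 25, 10, 5, 4, 3, 2, 1]))  # → [0, 1]
    ```

    The same curve also yields the B/C boundary if the criterion is applied recursively to the remaining items.
    
    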

  7. Patient information on breast reconstruction in the era of the world wide web. A snapshot analysis of information available on youtube.com.

    Science.gov (United States)

    Tan, M L H; Kok, K; Ganesh, V; Thomas, S S

    2014-02-01

    Breast cancer patients' expectations and choice of reconstruction are increasing, and patients often satisfy their information needs outside clinic time by searching the world wide web. The aim of our study was to analyse the quality of content and extent of information regarding breast reconstruction available in YouTube videos and whether this is an appropriate additional source of information for patients. A snapshot qualitative and quantitative analysis of the first 100 videos was performed after the term 'breast reconstruction' was input into the search window of the video sharing website www.youtube.com on the 1st of September 2011. Qualitative categorical analysis included patient, oncological and reconstruction factors. It was concluded that although videos uploaded onto YouTube do not provide comprehensive information, it is a useful resource that can be utilised in patient education provided comprehensive and validated videos are made available. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    Science.gov (United States)

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.

  9. #FluxFlow: Visual Analysis of Anomalous Information Spreading on Social Media.

    Science.gov (United States)

    Zhao, Jian; Cao, Nan; Wen, Zhen; Song, Yale; Lin, Yu-Ru; Collins, Christopher

    2014-12-01

    We present FluxFlow, an interactive visual analysis system for revealing and analyzing anomalous information spreading in social media. Every day, millions of messages are created, commented, and shared by people on social media websites, such as Twitter and Facebook. This provides valuable data for researchers and practitioners in many application domains, such as marketing, to inform decision-making. Distilling valuable social signals from the huge crowd's messages, however, is challenging, due to the heterogeneous and dynamic crowd behaviors. The challenge is rooted in data analysts' capability of discerning the anomalous information behaviors, such as the spreading of rumors or misinformation, from the rest that are more conventional patterns, such as popular topics and newsworthy events, in a timely fashion. FluxFlow incorporates advanced machine learning algorithms to detect anomalies, and offers a set of novel visualization designs for presenting the detected threads for deeper analysis. We evaluated FluxFlow with real datasets containing the Twitter feeds captured during significant events such as Hurricane Sandy. Through quantitative measurements of the algorithmic performance and qualitative interviews with domain experts, the results show that the back-end anomaly detection model is effective in identifying anomalous retweeting threads, and its front-end interactive visualizations are intuitive and useful for analysts to discover insights in data and comprehend the underlying analytical model.

  10. Breast cancer information communicated on a public online platform: an analysis of 'Yahoo! Answer Japan'.

    Science.gov (United States)

    Ohigashi, An; Ahmed, Salim; Afzal, Arfan R; Shigeta, Naoko; Tam-Tham, Helen; Kanda, Hideyuki; Ishikawa, Yoshihiro; Turin, Tanvir C

    2017-06-01

    INTRODUCTION Japan is a developed country with high use of the Internet and online platforms for health information. 'Yahoo! Answer Japan' is the most commonly used question-and-answer service in Japan. AIM To explore the information users seek regarding breast cancer on the 'Yahoo! Answer Japan' web portal. METHODS The 'Yahoo! Answer Japan' portal was searched for the keyword 'breast cancer' and all questions posted in the period 1 January to 31 December 2014 were obtained. The selected questions related to human breast cancer and were not advertisements or promotional material. The questions were categorized using a coding schema. High and low access of the questions were defined by the number of view-counts. RESULTS Among the 2392 selected questions, six major categories were identified: (1) suspected breast cancer, (2) breast cancer screening, (3) treatment of breast cancer, (4) life with breast cancer, (5) prevention of breast cancer and (6) others. The highest number of questions was treatment-related (28.8%), followed by suspected breast cancer-related questions (23.4%) and screening-related questions (20%). Statistical analysis revealed that treatment-related questions were more likely to be highly accessed. CONCLUSION Content analysis of Internet question-answer communities is important, as questions posted on these sites serve as a rich source of direct reflection of the health-related information needs of the general population.

  11. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    Full Text Available This paper describes an analysis and shows its use in analysing strengths, weaknesses, opportunities and threats (risks) within the health care system. The aim is furthermore to show the strengths, weaknesses, opportunities and threats of using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies in the health care business, the implementers will encounter numerous problems, such as: the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting the existing data in the database and the data that will be collected during diagnosis. Using SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on samples (examples from abroad). The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering it himself/herself.

  12. Combining Global and Local Information for Knowledge-Assisted Image Analysis and Classification

    Directory of Open Access Journals (Sweden)

    Mezaris V

    2007-01-01

    Full Text Available A learning approach to knowledge-assisted image analysis and classification is proposed that combines global and local information with explicitly defined knowledge in the form of an ontology. The ontology specifies the domain of interest, its subdomains, the concepts related to each subdomain as well as contextual information. Support vector machines (SVMs) are employed in order to provide image classification to the ontology subdomains based on global image descriptions. In parallel, a segmentation algorithm is applied to segment the image into regions and SVMs are again employed, this time for performing an initial mapping between region low-level visual features and the concepts in the ontology. Then, a decision function, that receives as input the computed region-concept associations together with contextual information in the form of concept frequency of appearance, realizes image classification based on local information. A fusion mechanism subsequently combines the intermediate classification results, provided by the local- and global-level information processing, to decide on the final image classification. Once the image subdomain is selected, final region-concept association is performed using again SVMs and a genetic algorithm (GA) for optimizing the mapping between the image regions and the selected subdomain concepts, taking into account contextual information in the form of spatial relations. Application of the proposed approach to images of the selected domain results in their classification (i.e., their assignment to one of the defined subdomains) and the generation of a fine-granularity semantic representation of them (i.e., a segmentation map with semantic concepts attached to each segment). Experiments with images from the personal collection domain, as well as comparative evaluation with other approaches from the literature, demonstrate the performance of the proposed approach.

  13. Constructing osteoarthritis through discourse – a qualitative analysis of six patient information leaflets on osteoarthritis

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2007-04-01

    Full Text Available Abstract Background Health service policy in the United Kingdom emphasises the importance of self-care by patients with chronic conditions. Written information for patients about their condition is seen as an important aid to help patients look after themselves. From a discourse analysis perspective, written texts such as patient information leaflets do not simply describe the reality of a medical condition and its management but by drawing on some sorts of knowledge and evidence rather than others help construct the reality of that condition. This study explored patient information leaflets on osteoarthritis (OA) to see how OA was constructed and to consider the implications for self-care. Methods Systematic and repeated readings of six patient information leaflets on osteoarthritis to look for similarities and differences across leaflets, contradictions within leaflets and the resources called on to make claims about the nature of OA and its management. Results Biomedical discourse of OA as a joint disease dominated. Only one leaflet included an illness discourse albeit limited, and was also the only one to feature patient experiences of living with OA. The leaflets had different views on the causes of OA including the role of lifestyle and ageing. Most emphasised patient responsibility for preventing the progression of OA. Advice about changing behaviour such as diet and exercise was not grounded in lived experience. There were inconsistent messages about using painkillers, exercise and the need to involve professionals when making changes to lifestyle. Conclusion The nature of the discourse impacted on how OA and the respective roles of patients and professionals were depicted. Limited discourse on illness meant that the complexity of living with OA and its consequences was underestimated. Written information needs to shift from joint biology to helping patients live with osteoarthritis. Written information should incorporate patient

  14. Parametric sensitivity analysis for biochemical reaction networks based on pathwise information theory.

    Science.gov (United States)

    Pantazis, Yannis; Katsoulakis, Markos A; Vlachos, Dionisios G

    2013-10-22

    Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches such as variants of the finite difference method can have an overwhelming computational cost in models with a high-dimensional parameter space. We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on Information Theory methods and relies on the quantification of information loss due to parameter perturbations between time-series distributions. For this reason, we need to work on path-space, i.e., the set consisting of all stochastic trajectories, hence the proposed approach is referred to as "pathwise". The pathwise sensitivity analysis method is realized by employing the rigorously-derived Relative Entropy Rate, which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks. As a gradient-free method, the proposed sensitivity analysis provides a significant advantage when dealing with complex stochastic systems with a large number of parameters. In addition, the knowledge of the structure of the FIM can allow to efficiently address
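    The two pathwise quantities named in the record above are commonly written as follows for a reaction network with propensities a_j(x; θ); this is the standard form of the relative entropy rate for Markov jump processes, sketched here with illustrative notation rather than quoted from the paper:

    ```latex
    % Relative Entropy Rate (RER) between path distributions at parameter
    % values \theta and \theta+\epsilon, averaged over the stationary
    % distribution \mu^{\theta}:
    \mathcal{H}\!\left(P^{\theta}\,\middle\|\,P^{\theta+\epsilon}\right)
      = \mathbb{E}_{\mu^{\theta}}\!\left[\sum_{j}
          a_j(x;\theta)\log\frac{a_j(x;\theta)}{a_j(x;\theta+\epsilon)}
          - \bigl(a_j(x;\theta) - a_j(x;\theta+\epsilon)\bigr)\right]

    % The pathwise Fisher Information Matrix arises as the Hessian of the
    % RER at \epsilon = 0, computable directly from the propensities:
    \mathcal{F}(\theta)
      = \mathbb{E}_{\mu^{\theta}}\!\left[\sum_{j}
          a_j(x;\theta)\,\nabla_{\theta}\log a_j(x;\theta)\,
          \bigl(\nabla_{\theta}\log a_j(x;\theta)\bigr)^{\!\top}\right]
    ```

    Since each reaction j typically depends on only a few parameters, the outer products above vanish for unrelated parameter pairs, which is what produces the block-diagonal FIM structure the record mentions.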

  15. Theoretical analysis of three research apparatuses about media and information literacy in France

    Directory of Open Access Journals (Sweden)

    Jacques Kerneis

    2013-11-01

    Full Text Available In this article, we compare three projects about mapping digital-, media- and information literacy in France. For this study, we first used the concept of “apparatus” in the Foucauldian (1977) and Agambenian (2009) senses. After this analysis, we called on Bachelard (1932) and his distinction between phénoménotechnique and phénoménographie. The first project began in 2006 around a professional association (Fadben: http://www.fadben.asso.fr/), with the main goal being to distinguish 64 main concepts in information literacy. This work is now completed, and we can observe it quietly through publications. The second project emanates from a research group (GRCDI: http://culturedel.info/grcdi/) that is still active. In 2011, GRCDI produced a status report, including future perspectives, which introduced the idea of transliteracy (media and information culture). The third project (Limin-R: http://www.iscc.cnrs.fr/spip.php?article1115) is an open group (media, information, computer science) with support from CNRS. We aim at mapping the web around these concepts, and in all three projects wiki tools have been used, which has been important for the success and limits of the collective action. This paper presents highlights and lessons learned, as well as ideas for further development.

  16. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    Full Text Available This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical- and medicinal chemistry including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.

  17. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparisons via alignment are widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences of non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information-theoretic distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison involves a rearrangement that reduces the different concepts to a unique formalism, so that it has been possible to construct a phylogenetic tree for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically valid. In general, the results provided more evidence of the reliability of the alignment-free distance models. Also, we observe that one of the metrics appeared to be more robust than the other three. We believe that this result can be the object of further research and observations. Many of the results of the experimentation, the graphics and the tables are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO
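    As an illustration of the alignment-free, information-theoretic idea discussed in the record above, the sketch below implements the Normalized Compression Distance (NCD), a well-known compression-based relative of such measures; it is an illustrative example, not one of the four mutual-information metrics analyzed in the paper.

    ```python
    import zlib

    def compressed_size(s: bytes) -> int:
        """Approximate the information content of s by its zlib-compressed size."""
        return len(zlib.compress(s, 9))

    def ncd(x: bytes, y: bytes) -> float:
        """Normalized Compression Distance: near 0 for very similar strings,
        larger for unrelated ones (up to compressor imperfections)."""
        cx, cy = compressed_size(x), compressed_size(y)
        cxy = compressed_size(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Toy "genomic" strings: a shares most of its content with b, none with c.
    a = b"ACGTACGGTCA" * 120
    b = b"ACGTACGGTCA" * 90 + b"TTAGGCATTAG" * 30
    c = b"GGCCGGAATCC" * 120

    print(ncd(a, b) < ncd(a, c))  # the related pair should be closer
    ```

    Because NCD needs no alignment, it is insensitive to the length mismatches and substring shuffling that the record identifies as problematic for alignment-based comparison.
    
    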

  18. Bringing trauma-informed practice to domestic violence programs: A qualitative analysis of current approaches.

    Science.gov (United States)

    Wilson, Joshua M; Fauci, Jenny E; Goodman, Lisa A

    2015-11-01

    Three out of 10 women and 1 out of 10 men in the United States experience violence at the hands of an intimate partner-often with devastating costs. In response, hundreds of residential and community-based organizations have sprung up to support survivors. Over the last decade, many of these organizations have joined other human service systems in adopting trauma-informed care (TIC), an approach to working with survivors that responds directly to the effects of trauma. Although there have been various efforts to describe TIC in domestic violence (DV) programs, there is a need to further synthesize this discourse on trauma-informed approaches to better understand specific applications and practices for DV programs. This study aimed to address this gap. The authors of this study systematically identified key documents that describe trauma-informed approaches in DV services and then conducted a qualitative content analysis to identify core themes. Results yielded 6 principles (Establishing emotional safety, Restoring choice and control, Facilitating connection, Supporting coping, Responding to identity and context, and Building strengths), each of which comprised a set of concrete practices. Despite the common themes articulated across descriptions of DV-specific trauma-informed practices (TIP), we also found critical differences, with some publications focusing narrowly on individual healing and others emphasizing the broader community and social contexts of violence and oppression. Implications for future research and evaluation are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  19. The mediated information in speech Edir Macedo: analysis of publishers of Universal Leaf

    Directory of Open Access Journals (Sweden)

    Ciro Athayde Barros Monteiro

    2017-04-01

    Full Text Available Introduction: The information mediated in the discourse of Edir Macedo holds a prominent position amid the transformations of contemporary society. The study set out to analyze the strategies used in his discourse to mediate information through the editorials of the newspaper Folha Universal (FU), the journal of the Igreja Universal do Reino de Deus (IURD). Objective: To identify the discursive strategies used by Edir Macedo in order to understand how this information is mediated and how he manages to expand his influence daily, making him one of the major mediators in Brazil. Methodology: Four editorials published in the newspaper between 2009 and 2011 were selected and examined using Discourse Analysis methodology. Results: The editorials analyzed show that the bishop primarily uses persuasive discourse to win public support, almost always appealing to the emotive function and the imperative mood. Conclusions: We highlight the need for Information Science to understand this discourse, since this information is responsible for influencing a large number of people, allowing the IURD to expand its space in the press and in society every day.

  20. AN ECONOMIC ANALYSIS OF THE DETERMINANTS OF ENTREPRENEURSHIP: THE CASE OF MASVINGO INFORMAL BUSINESSES

    Directory of Open Access Journals (Sweden)

    Clainos Chidoko

    2013-03-01

    Full Text Available In the past decade, Zimbabwe experienced its worst economic performance since independence in 1980. Capacity utilization shrank to ten percent, and by 2008 the unemployment rate exceeded eighty percent as the private and public sectors witnessed massive retrenchments. As a result, many people turned to informal businesses to make ends meet. However, not everyone joined the informal sector, as witnessed by the number of people who left the country in droves for neighbouring countries. Against this background, this research conducted an economic analysis of the determinants of entrepreneurship in urban Masvingo, with an emphasis on informal businesses. The research targeted a sample of 100 informal businesses (30 from the Rujeko light industrial area, 40 from the Mucheke light industrial area and 30 from the Masvingo Central Business District). The businesses included, among others, flea market operators, furniture manufacturers, suppliers and producers of agricultural products, and food vendors. The research found that level of education, gender, age, marital status, number of dependants, type of subjects studied at secondary school and vocational training are the main determinants of the type of business an entrepreneur ventures into. The study recommends formal training for the participants so that these businesses remain in existence, since they fill the gap left vacant by most formal enterprises.

  1. Obtaining informed consent for genomics research in Africa: analysis of H3Africa consent documents.

    Science.gov (United States)

    Munung, Nchangwi Syntia; Marshall, Patricia; Campbell, Megan; Littler, Katherine; Masiye, Francis; Ouwe-Missi-Oukem-Boyer, Odile; Seeley, Janet; Stein, D J; Tindana, Paulina; de Vries, Jantina

    2016-02-01

    The rise in genomic and biobanking research worldwide has led to the development of different informed consent models for use in such research. This study analyses consent documents used by investigators in the H3Africa (Human Heredity and Health in Africa) Consortium. A qualitative method for text analysis was used to analyse consent documents used in the collection of samples and data in H3Africa projects. Thematic domains included type of consent model, explanations of genetics/genomics, data sharing and feedback of test results. Informed consent documents for 13 of the 19 H3Africa projects were analysed. Seven projects used broad consent, five projects used tiered consent and one used specific consent. Genetics was mostly explained in terms of inherited characteristics, heredity and health, genes and disease causation, or disease susceptibility. Only one project made provisions for the feedback of individual genetic results. H3Africa research makes use of three consent models-specific, tiered and broad consent. We outlined different strategies used by H3Africa investigators to explain concepts in genomics to potential research participants. To further ensure that the decision to participate in genomic research is informed and meaningful, we recommend that innovative approaches to the informed consent process be developed, preferably in consultation with research participants, research ethics committees and researchers in Africa. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  2. An Analysis of the Influence of Controlling Shareholder Identity over Earnings Informativeness on Brazilian Capital Market

    Directory of Open Access Journals (Sweden)

    Rodrigo Vicente Prazeres

    2017-09-01

    Full Text Available This paper investigates the influence of controlling shareholder identity on earnings informativeness and contributes empirically to understanding the agency conflict between controlling shareholders and minority investors through the lens of value relevance. The research sample comprised 104 shares of non-financial firms traded on BM&FBovespa from 2011 to 2016. The methodology was panel data regression analysis. The paper concludes with the following findings: (i) the higher the control/voting power of the controlling shareholder (ownership concentration) and the lower the stock liquidity, the less informative are earnings and the greater the probability of entrenchment and wealth expropriation by controlling shareholders; (ii) larger firms and highly leveraged firms have more informative earnings; (iii) stock prices reflect the controlling shareholder's identity; (iv) firms controlled by financial institutions, non-financial institutions and the government are much more likely to expropriate minority investors' wealth and have less informative earnings; (v) family firms are positively priced by the market.

  3. Security analysis and improvement of a privacy authentication scheme for telecare medical information systems.

    Science.gov (United States)

    Wu, Fan; Xu, Lili

    2013-08-01

    Nowadays, patients can obtain many kinds of medical service online via Telecare Medical Information Systems (TMIS), owing to the fast development of computer technology, so the security of communication over the network between users and the server is very significant. Authentication plays an important part in protecting information from malicious attackers. Recently, Jiang et al. proposed a privacy-enhanced scheme for TMIS using smart cards and claimed their scheme was better than Chen et al.'s. However, we show that Jiang et al.'s scheme suffers from ID uselessness and is vulnerable to off-line password guessing and user impersonation attacks if an attacker compromises the legal user's smart card. It also cannot resist denial-of-service (DoS) attacks in two cases: after a successful impersonation attack, and on wrong password input in the password change phase. We then propose an improved mutual authentication scheme for a telecare medical information system. Remote monitoring, checking patients' past medical history records and medical consultation can be supported by the system, with information transmitted via the Internet. Finally, our analysis indicates that the suggested scheme overcomes the disadvantages of Jiang et al.'s scheme and is practical for TMIS.

  4. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    Science.gov (United States)

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital IS (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
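Importance-performance analysis places each attribute on a grid according to its importance to users and its rated performance; attributes of high importance but low performance are flagged as priorities for intervention. A minimal sketch of the quadrant classification follows; the mean-split rule and quadrant labels are common IPA conventions, not taken from this study, which derived importance via partial least squares:

```python
import numpy as np

def ipa_quadrants(importance, performance):
    """Importance-performance analysis: classify each attribute by
    whether its importance and performance fall above or below the means."""
    imp = np.asarray(importance, dtype=float)
    perf = np.asarray(performance, dtype=float)
    hi_imp = imp >= imp.mean()
    hi_perf = perf >= perf.mean()
    labels = []
    for i in range(len(imp)):
        if hi_imp[i] and not hi_perf[i]:
            labels.append("concentrate here")      # high priority for intervention
        elif hi_imp[i] and hi_perf[i]:
            labels.append("keep up the good work")
        elif not hi_imp[i] and not hi_perf[i]:
            labels.append("low priority")
        else:
            labels.append("possible overkill")
    return labels
```

An attribute such as "records never missing" with high importance but poor rated performance would land in the "concentrate here" quadrant.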

  5. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy that better detects the direction of information transfer. This new normalized transfer entropy is applied to detecting the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitana–Yamada shell model. This well-known model reproduces fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so applying the information-theory analysis to its output tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in wavenumber space emerge as expected, indicating the validity of this data analysis tool. In this context, the normalized version of transfer entropy, which accounts for the difference in intrinsic randomness of the interacting processes, appears to perform better, being able to rule out the wrong conclusions to which the "traditional" transfer entropy would lead.
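The quantity underlying the abstract, transfer entropy TE(Y→X), measures how much knowing the past of Y reduces the uncertainty of the next value of X beyond what X's own past already provides. Below is a minimal sketch for symbolic (discretized) series, with one plausible normalization: dividing by the conditional entropy H(X_{t+1} | X_t). The paper's specific normalization may differ from this choice:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """TE(Y -> X) in bits for two equal-length symbolic series:
    sum over (x1, x0, y0) of p(x1,x0,y0) * log2[ p(x1|x0,y0) / p(x1|x0) ]."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_full = c / pairs_xy[(x0, y0)]            # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += (c / n) * log2(p_cond_full / p_cond_self)
    return te

def normalized_te(x, y):
    """TE(Y -> X) divided by H(X1 | X0): the fraction of X's remaining
    one-step uncertainty that is explained by Y's past."""
    n = len(x) - 1
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    h = -sum((c / n) * log2(c / singles[x0])
             for (_, x0), c in pairs_xx.items())
    return transfer_entropy(x, y) / h if h > 0 else 0.0
```

Comparing TE(Y→X) against TE(X→Y) then indicates the dominant direction of driving, which in the shell-model application corresponds to the direction of the energy cascade.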

  6. Co-word analysis for the non-scientific information example of Reuters Business Briefings

    Directory of Open Access Journals (Sweden)

    B Delecroix

    2006-01-01

    Full Text Available Co-word analysis is based on a sociological theory developed by the CSI and SERPIA (Callon, Courtial, Turner, 1991) in the mid eighties. This method, originally dedicated to scientific fields, measures the association strength between terms in documents to reveal and visualise the evolution of scientific fields through the construction of clusters and a strategic diagram. The method has since been successfully applied to investigate the structure of many scientific areas. Nowadays it appears in many software systems used by companies to improve their business and define their strategy, but its relevance to this kind of application has not yet been proved. Using the example of economic and marketing information on DSL technologies from Reuters Business Briefing, this presentation gives an interpretation of co-word analysis for this kind of information. After an overview of the software we used (Sampler) and an outline of the experimental protocol, we investigate and explain each step of the co-word analysis process: terminological extraction, computation of clusters and the strategic diagram. In particular, we explain the meaning of each parameter of the method: the choice of variables and similarity measures is discussed. Finally, we try to give a global interpretation of the method in an economic context. Further studies will extend this work in order to allow a generalisation of these results.
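Co-word analysis typically scores a pair of terms with the equivalence index E(i, j) = c_ij² / (c_i · c_j), where c_ij counts documents containing both terms and c_i, c_j count documents containing each term alone; clusters and the strategic diagram are then built from these strengths. A minimal sketch of the association-strength step (the exact similarity measure used in Sampler is not specified in the abstract):

```python
from collections import Counter
from itertools import combinations

def coword_strengths(documents):
    """Association strength (equivalence index) for each keyword pair:
    E(i, j) = c_ij**2 / (c_i * c_j), where c_ij is the number of documents
    containing both terms and c_i, c_j the per-term document counts."""
    term_count = Counter()
    pair_count = Counter()
    for terms in documents:
        unique = sorted(set(terms))       # count each term once per document
        term_count.update(unique)
        pair_count.update(combinations(unique, 2))
    return {
        (a, b): c ** 2 / (term_count[a] * term_count[b])
        for (a, b), c in pair_count.items()
    }
```

The index ranges from 0 (terms never co-occur) to 1 (terms always appear together), and thresholding it yields the clusters plotted on the strategic diagram.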

  7. Analysis of Active Learning Suitability of Subjects in Information and Electronics

    Directory of Open Access Journals (Sweden)

    Yoshikatsu Kubota

    2017-09-01

    Full Text Available “Active Learning” (AL), a teaching method that emphasizes students’ active participation in class and their ability to discover and solve problems, has come under the spotlight worldwide. NIT, Sendai College, Hirose Campus is promoting AL in the field of information and electronics. In particular, we are practicing the “A3 Learning System”, which specializes in utilizing computers. We have aggressively introduced AL in class and seen good effects. However, problems are emerging in certain types of subjects, which may mean that some subjects are unsuitable for AL. In this paper we report a large-scale analysis of the introduction of AL in the information and electronics field and suggest that the successful introduction of AL in class depends on the type of subject.

  8. Duopoly Market Analysis within One-Shot Decision Framework with Asymmetric Possibilistic Information

    Directory of Open Access Journals (Sweden)

    Peijun Guo

    2010-12-01

    Full Text Available In this paper, a newly emerging duopoly market with a short life cycle is analyzed. The partially known information of market is characterized by the possibility distribution of the parameter in the demand function. Since the life cycle of the new product is short, how many products should be produced by two rival firms is a typical one-shot decision problem. Within the one-shot decision framework, the possibilistic Cournot equilibrium is obtained for the optimal production level of each firm in a duopoly market with asymmetrical possibilistic information. The analysis results show that the proposed approaches are reasonable for one-shot decision problems, which are extensively encountered in business and economics.
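For reference, in the classical (non-possibilistic) duopoly with linear inverse demand p = a − b(q₁ + q₂) and constant marginal costs, the Cournot equilibrium quantities have a closed form; the paper's possibilistic variant replaces the demand parameter with a possibility distribution and selects production levels within the one-shot decision framework. A deterministic baseline sketch:

```python
def cournot_equilibrium(a, b, c1, c2):
    """Nash equilibrium quantities for a duopoly with inverse demand
    p = a - b*(q1 + q2) and constant marginal costs c1, c2.
    Derived from each firm's best response q_i = (a - c_i - b*q_j) / (2b)."""
    q1 = (a - 2 * c1 + c2) / (3 * b)
    q2 = (a - 2 * c2 + c1) / (3 * b)
    return q1, q2
```

With symmetric costs this reduces to the familiar q = (a − c) / (3b) for each firm; the possibilistic Cournot equilibrium of the paper generalizes this point solution under partially known demand.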

  9. Methods and apparatuses for information analysis on shared and distributed computing systems

    Science.gov (United States)

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
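The local-then-global flow described in the abstract can be sketched as follows: each worker computes term statistics for its distinct document set in parallel, and the local counts are then merged into a global set. The sketch below uses threads purely for illustration; the patent targets shared and distributed computing systems, where the workers would be separate processes or nodes:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def local_term_stats(docs):
    """Per-worker step: term frequency and document frequency
    for one distinct set of documents."""
    tf, df = Counter(), Counter()
    for doc in docs:
        terms = doc.lower().split()
        tf.update(terms)          # every occurrence
        df.update(set(terms))     # once per document
    return tf, df

def global_term_stats(doc_sets):
    """Distribute distinct document sets across workers, then merge
    the local term statistics into a single global set."""
    global_tf, global_df = Counter(), Counter()
    with ThreadPoolExecutor() as pool:
        for tf, df in pool.map(local_term_stats, doc_sets):
            global_tf.update(tf)
            global_df.update(df)
    return global_tf, global_df
```

A major term set could then be derived by having each worker rank its assigned slice of the global vocabulary, e.g. by the merged frequencies.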

  10. Niche Overlap and Discrediting Acts: An Empirical Analysis of Informing in Hollywood

    Directory of Open Access Journals (Sweden)

    Giacomo Negro

    2015-06-01

    Full Text Available This article examines informing on others as a discrediting act between individual agents in a labor market. We conduct an empirical analysis of artists called to testify during the 1950s Congressional hearings into Communism in Hollywood, and multi-level regression models reveal that the odds of an artist informing on another increase when their career histories are more similar. The similarity reflects levels of niche overlap in the labor market. The finding that similarity contributes to discredit in the context of resource competition is compatible with a social comparison process, whereby uncertainty about performance leads more similar people to attend to and exclude one another to a greater extent.

  11. A media information analysis for implementing effective countermeasure against harmful rumor

    Science.gov (United States)

    Nagao, Mitsuyoshi; Suto, Kazuhiro; Ohuchi, Azuma

    2010-04-01

    When a large-scale earthquake occurs, the term "harmful rumor" is frequently heard. A harmful rumor is economic damage caused by people regarding actually safe foods or areas as dangerous and then refraining from consumption or sightseeing. In the case of harmful rumors caused by earthquakes, the tourism industry in particular suffers massive economic damage. Harmful rumors that cause substantial economic damage have become a serious social issue that must be solved. In this paper, we propose a countermeasure method for harmful rumors on the basis of media trends, in order to achieve speedy recovery from them. We investigate the amount and content of information transmitted to the general public by the media when an earthquake occurs, treating the media coverage of three earthquakes as instances. Finally, we discuss an effective countermeasure method for dispelling harmful rumors based on these analysis results.

  12. Threats and risks to information security: a practical analysis of free access wireless networks

    Science.gov (United States)

    Quirumbay, Daniel I.; Coronel, Iván. A.; Bayas, Marcia M.; Rovira, Ronald H.; Gromaszek, Konrad; Tleshova, Akmaral; Kozbekova, Ainur

    2017-08-01

    Nowadays, there is an ever-growing need to investigate, consult and communicate through the internet. This need leads to the expansion of free web access at strategic and functional points for the benefit of the community. However, this open access is also associated with increased information insecurity. Existing work on computer security focuses primarily on developing techniques to reduce cyber-attacks, but these approaches do not address inexperienced users who have difficulty understanding browser settings. Two methods can address this problem: first, the development of friendly browsers with intuitive setups for new users, and second, the implementation of awareness programs on essential security that do not delve into technical detail. This article presents an analysis of the vulnerabilities of the wireless equipment that provides internet service in open access zones and the potential risks of using these networks.

  13. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    Kamenopoulou, V.; Dimitriou, P.; Proukakis, Ch.

    1995-01-01

    The object of this work is the study of the individual film badge annual dose information of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals in the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties and the number of workers that have exceeded any of the established dose limits. Results concerning the annual dose summaries, demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)

  14. The value of information as applied to the Landsat Follow-on benefit-cost analysis

    Science.gov (United States)

    Wood, D. B.

    1978-01-01

    An econometric model was run to compare the current forecasting system with a hypothetical (Landsat Follow-on) space-based system. The baseline current system was a hybrid of USDA SRS domestic forecasts and the best known foreign data. The space-based system improved upon the present Landsat by the higher spatial resolution capability of the thematic mapper. This satellite system is a major improvement for foreign forecasts but no better than SRS for domestic forecasts. The benefit analysis was concentrated on the use of Landsat Follow-on to forecast world wheat production. Results showed that it was possible to quantify the value of satellite information and that there are significant benefits in more timely and accurate crop condition information.

  15. Analysis and Design Information System Logistics Delivery Service in Pt Repex Wahana

    Directory of Open Access Journals (Sweden)

    Stephanie Surja

    2015-12-01

    Full Text Available The analysis and design of the logistics delivery system in PT Repex Wahana aims to analyze the company’s needs in the existing business process of its logistics delivery service. This is then used in the development of an integrated system that addresses the problems in the running processes of sending goods and tracking their whereabouts or delivery status, which are the core business processes in the enterprise. The result is used as the basis for developing an integrated information system, a corporate solution for the automation of the delivery, inventory and logistics tracking processes at the core of the company’s business, documented using the Unified Modeling Language. The information system is meant to simplify the delivery and tracking processes in the company and to minimize the data loss and errors that often occur with manual, unorganized processing of transaction data.

  16. Morphometric analysis in geographic information systems: applications of free software GRASS and R

    Science.gov (United States)

    Grohmann, Carlos Henrique

    2004-11-01

    Development and interpretation of morphometric maps are important tools in studies related to neotectonics and geomorphology; Geographic Information Systems (GIS) allows speed and precision to this process, but applied methodology will vary according to available tools and degree of knowledge of each researcher about involved software. A methodology to integrate GIS and statistics in morphometric analysis is presented for the most usual morphometric parameters—hypsometry, slope, aspect, swath profiles, lineaments and drainage density, surface roughness, isobase and hydraulic gradient. The GIS used was the Geographic Resources Analysis Support System (GRASS-GIS), an open-source project that offers an integrated environment for raster and vector analysis, image processing and maps/graphics creation. Statistical analysis of parameters can be carried out on R, a system for statistical computation and graphics, through an interface with GRASS that allows raster maps and points files to be treated as variables for analysis. The basic element for deriving morphometric maps is the digital elevation model (DEM). It can be interpolated from scattered points or contours, either in raster or vector format; it is also possible to use DEMs from NASA's Shuttle Radar Topographic Mission, with 30 m of ground resolution for the USA and 90 m for other countries. Proposed methodology can be adapted according to necessities and available tools. The use of free and open-source tools guarantees access to everyone, and its increasing popularization opens new development perspectives in this research field.

  17. Bird Activity Analysis Using Avian Radar Information in Naval Air Station airport, WA

    Science.gov (United States)

    Wang, J.; Herricks, E.

    2010-12-01

    The number of bird strikes on aircraft has increased sharply in recent years, and airport bird hazard management has gained increasing attention in wildlife management and control. Evaluating bird activity near an airport is critical for analyzing the hazard of bird strikes. Traditional methods of bird activity analysis using visual counting provide a direct approach to bird hazard assessment, but are limited to daylight and good visual conditions. Radar has proven to be a useful and effective tool for bird detection and movement analysis: it eliminates observation bias and supports consistent data collection. In this study, bird activity data from Naval Air Station Whidbey Island were collected by the Accipiter Avian Radar System. The radar data were pre-processed by filtering out non-bird returns, including road traffic, aircraft, insects, wind, rainfall and ocean waves. The filtered data were then statistically analyzed using MATLAB programs. The results revealed bird movement dynamics in target areas near the airport: (1) daily activity varied at dawn and dusk; (2) bird activity varied by target area owing to habitat differences; and (3) both temporal and spatial movement patterns varied by bird species. This bird activity analysis supports bird hazard evaluation and related modeling, providing useful information for airport bird hazard management planning.

  18. Value of information analysis from a societal perspective: a case study in prevention of major depression.

    Science.gov (United States)

    Mohseninejad, Leyla; van Baal, Pieter H M; van den Berg, Matthijs; Buskens, Erik; Feenstra, Talitha

    2013-06-01

    Productivity losses usually have a considerable impact on cost-effectiveness estimates, while their estimated values are often relatively uncertain. Parameters related to these indirect costs therefore play a role in setting priorities for future research from a societal perspective. Until now, however, value of information analyses have usually applied a health care perspective, so the effect of productivity losses has rarely been investigated in such analyses. The aim of the current study was therefore to investigate the effects of including or excluding productivity costs in value of information analyses. Expected value of perfect information (EVPI) analysis was performed in the cost-effectiveness evaluation of prevention from both the societal and the health care perspective, allowing the two perspectives to be compared. Priorities for future research were determined by partial EVPI. The program to prevent major depression in patients with subthreshold depression was opportunistic screening followed by minimal-contact psychotherapy. The EVPI indicated that, regardless of perspective, further research is potentially worthwhile. Partial EVPI results underlined the importance of productivity losses when a societal perspective is considered. Furthermore, priority setting for future research differed by perspective. The results illustrate that advice for future research will differ between a health care and a societal perspective, and hence value of information analysis should be adjusted to the perspective relevant to the decision makers involved. The outcomes underline the need to carefully choose the perspective suitable for the decision problem at hand. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
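The EVPI referred to above is the difference between the expected net benefit attainable with perfect information about the uncertain parameters and the expected net benefit of the best strategy under current information. A minimal Monte Carlo sketch (the strategy set and net-benefit inputs here are illustrative, not those of the depression-prevention study):

```python
import numpy as np

def evpi(nb_samples):
    """Expected value of perfect information from simulated net benefits.

    nb_samples: array of shape (n_draws, n_strategies); each row holds
    the net benefit of every strategy under one draw of the uncertain
    parameters (e.g. with or without productivity losses included).

    EVPI = E_theta[ max_d NB(d, theta) ] - max_d E_theta[ NB(d, theta) ]
    """
    with_perfect_info = nb_samples.max(axis=1).mean()  # pick best per draw
    with_current_info = nb_samples.mean(axis=0).max()  # pick best on average
    return with_perfect_info - with_current_info
```

Running the same draws through a societal net-benefit model (adding productivity costs) and a health-care-only model shows directly how the perspective changes the EVPI.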

  19. Applying information network analysis to fire-prone landscapes: implications for community resilience

    Directory of Open Access Journals (Sweden)

    Derric B. Jacobs

    2017-03-01

    Full Text Available Resilient communities promote trust, have well-developed networks, and can adapt to change. For rural communities in fire-prone landscapes, current resilience strategies may prove insufficient in light of increasing wildfire risks due to climate change. It is argued that, given the complexity of climate change, adaptations are best addressed at local levels where specific social, cultural, political, and economic conditions are matched with local risks and opportunities. Despite the importance of social networks as key attributes of community resilience, research using social network analysis on coupled human and natural systems is scarce. Furthermore, the extent to which local communities in fire-prone areas understand climate change risks, accept the likelihood of potential changes, and have the capacity to develop collaborative mitigation strategies is underexamined, yet these factors are imperative to community resiliency. We apply a social network framework to examine information networks that affect perceptions of wildfire and climate change in Central Oregon. Data were collected using a mailed questionnaire. Analysis focused on the residents' information networks that are used to gain awareness of governmental activities and measures of community social capital. A two-mode network analysis was used to uncover information exchanges. Results suggest that the general public develops perceptions about climate change based on complex social and cultural systems rather than as patrons of scientific inquiry and understanding. It appears that perceptions about climate change itself may not be the limiting factor in these communities' adaptive capacity, but rather how they perceive local risks. We provide a novel methodological approach in understanding rural community adaptation and resilience in fire-prone landscapes and offer a framework for future studies.

  20. Information Analysis Methodology for Border Security Deployment Prioritization and Post Deployment Evaluation

    International Nuclear Information System (INIS)

    Booker, Paul M.; Maple, Scott A.

    2010-01-01

    Due to international commerce, cross-border conflicts, and corruption, a holistic, information-driven approach to border security is required to best understand how resources should be applied to effect sustainable improvements in border security. The ability to transport goods and people by land, sea, and air across international borders with relative ease for legitimate commercial purposes creates a challenging environment for detecting the illicit smuggling activities that destabilize national border security. Smuggling operated for profit, or driven by cross-border conflicts in which militant or terrorist organizations facilitate the transport of materials or extremists to advance a cause, adds complexity to smuggling interdiction efforts. Border security efforts are further hampered when corruption thwarts interdiction or reduces the effectiveness of technology deployed to enhance border security. These issues necessitate the implementation of a holistic approach to border security that leverages all available data. Large amounts of information found in hundreds of thousands of documents can be compiled to assess national or regional borders and identify variables that influence border security. Location data associated with border topics of interest may be extracted and plotted to better characterize the current border security environment for a given country or region. This baseline assessment enables further analysis and also documents the initial state of border security, against which progress can be evaluated after improvements are made. Border security threats are then prioritized via a systems analysis approach, and mitigation factors to address risks can be developed and evaluated against inhibiting factors such as corruption. This holistic approach helps address the dynamic smuggling interdiction environment, in which illicit activities divert to whatever new location provides the least resistance.

  1. Incorporating twitter-based human activity information in spatial analysis of crashes in urban areas.

    Science.gov (United States)

    Bao, Jie; Liu, Pan; Yu, Hao; Xu, Chengcheng

    2017-09-01

    The primary objective of this study was to investigate how to incorporate human activity information into the spatial analysis of crashes in urban areas using Twitter check-in data. The study used data collected from the City of Los Angeles in the United States to illustrate the procedure. Five types of data were collected: crash data, human activity data, traditional traffic exposure variables, road network attributes and socio-demographic data. A web crawler written in Python was developed to collect venue type information from the Twitter check-in data automatically, and human activities were classified into seven categories by the obtained venue types. The collected data were aggregated into 896 Traffic Analysis Zones (TAZ). Geographically weighted regression (GWR) models were developed to establish a relationship between the crash counts reported in a TAZ and various contributing factors. Comparative analyses were conducted on GWR models that considered traditional traffic exposure variables only, Twitter-based human activity variables only, and both together. The model specification results suggested that human activity variables significantly affected crash counts in a TAZ. The comparative analyses suggested that models considering both traditional traffic exposure and human activity variables had the best goodness of fit, with the highest R² and lowest AICc values. This finding confirms the benefits of incorporating human activity information into the spatial analysis of crashes using Twitter check-in data. Copyright © 2017 Elsevier Ltd. All rights reserved.
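GWR fits a separate weighted least-squares regression at each location, with observations weighted by a distance-decay kernel so that the coefficients vary across space (here, across TAZ centroids). A minimal sketch using a Gaussian kernel with a fixed bandwidth; the study's kernel and bandwidth-selection choices are not given in the abstract:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Geographically weighted regression: at each location i, solve a
    weighted least-squares problem where nearby observations receive
    higher Gaussian kernel weights, yielding locally varying coefficients."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])   # add an intercept column
    betas = np.empty((n, X1.shape[1]))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)   # distances to site i
        w = np.exp(-0.5 * (d / bandwidth) ** 2)          # Gaussian kernel
        W = np.diag(w)
        betas[i] = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
    return betas
```

Mapping a column of `betas` then shows where a given exposure or activity variable is most strongly associated with crash counts.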

  2. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  3. Perceptual Normalized Information Distance for Image Distortion Analysis Based on Kolmogorov Complexity

    Science.gov (United States)

    Nikvand, Nima; Wang, Zhou

    2011-11-01

    Image distortion analysis is a fundamental issue in many image processing problems, including compression, restoration, recognition, classification, and retrieval. In this work, we investigate the problem of image distortion measurement based on the theories of Kolmogorov complexity and normalized information distance (NID), which have rarely been studied in the context of image processing. Based on a wavelet domain Gaussian scale mixture model of images, we approximate NID using a Shannon entropy based method. This leads to a series of novel distortion measures that are competitive with state-of-the-art image quality assessment approaches.
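The NID itself is uncomputable because Kolmogorov complexity is; the abstract's wavelet-domain Gaussian scale mixture approximation is specific to images. The general idea can be illustrated with the closely related normalized compression distance (NCD), which substitutes a real compressor's output length for K(·). This is a generic sketch of that substitution, not the authors' method.

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable proxy for the NID,
    with Kolmogorov complexity K(.) replaced by zlib's compressed length.

    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    """
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

Similar inputs compress well together, so their NCD is near 0; unrelated inputs share little structure and score near 1.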

  4. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
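One reading of the procedure described in this abstract can be sketched as follows. The tokenization, the one-word adjacency window, and the thresholds are all assumptions for illustration, not the patented implementation.

```python
def generate_stop_words(documents, keywords, min_ratio=1.0, max_size=50):
    """Sketch of the described method: terms that occur adjacent to known
    keywords more often than inside them become candidate stop words."""
    keyword_words = {w for kw in keywords for w in kw.lower().split()}
    adjacency = {}   # keyword adjacency frequency per term
    frequency = {}   # keyword frequency per term (term occurs in a keyword)
    for doc in documents:
        words = doc.lower().split()
        for i, w in enumerate(words):
            if w in keyword_words:
                frequency[w] = frequency.get(w, 0) + 1
            neighbors = words[max(0, i - 1):i] + words[i + 1:i + 2]
            if any(n in keyword_words for n in neighbors):
                adjacency[w] = adjacency.get(w, 0) + 1
    # exclude terms whose adjacency/keyword-frequency ratio is below threshold
    candidates = [t for t in adjacency
                  if adjacency[t] / max(frequency.get(t, 0), 1) >= min_ratio]
    # truncate to the most frequent candidates to form the stop word list
    candidates.sort(key=lambda t: -adjacency[t])
    return candidates[:max_size]
```

The intuition is that function words ("the", "of") sit next to content keywords but rarely inside them, so the adjacency-to-frequency ratio separates them from domain terms.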

  5. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  6. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’, which offers an account of structure and agency, is developed; social practices...... from whom this paper draws particular attention. Structurational concepts (system integration, time-space distanciation and routinization) as well as Giddens’ conceptualization of social change are further developed to help explain IS phenomena. Some fifty interviews were conducted at every level...

  7. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this goal the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) the search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) the selection of items of interest to specific verifications; and c) the mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a 'Nuclear Security Media Monitor' (NSMM), which is a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. In the first part, the paper will recall the trade data sources relevant for non-proliferation and will then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. In the second part it will present the main aspects of the NSMM, also by illustrating some of the uses made of it at JRC. (author)

  8. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2015-10-01

Full Text Available The dependence of today’s construction professionals on single commercial applications for design possibilities creates the risk of being dictated by the language-tools they use. This unwitting conformance to the constraints of a particular computer application’s style reduces one’s association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, ‘cloud computing’, required further analysis. The following Delphi questionnaires analysed the information-exchange opportunities of Building Information Modelling (BIM) as the possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage, (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint, (iii) examine the need for standards to address information exchange between the design team, and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  9. Social Network Analysis of Elders' Health Literacy and their Use of Online Health Information.

    Science.gov (United States)

    Jang, Haeran; An, Ji-Young

    2014-07-01

    Utilizing social network analysis, this study aimed to analyze the main keywords in the literature regarding the health literacy of and the use of online health information by aged persons over 65. Medical Subject Heading keywords were extracted from articles on the PubMed database of the National Library of Medicine. For health literacy, 110 articles out of 361 were initially extracted. Seventy-one keywords out of 1,021 were finally selected after removing repeated keywords and applying pruning. Regarding the use of online health information, 19 articles out of 26 were selected. One hundred forty-four keywords were initially extracted. After removing the repeated keywords, 74 keywords were finally selected. Health literacy was found to be strongly connected with 'Health knowledge, attitudes, practices' and 'Patient education as topic.' 'Computer literacy' had strong connections with 'Internet' and 'Attitude towards computers.' 'Computer literacy' was connected to 'Health literacy,' and was studied according to the parameters 'Attitude towards health' and 'Patient education as topic.' The use of online health information was strongly connected with 'Health knowledge, attitudes, practices,' 'Consumer health information,' 'Patient education as topic,' etc. In the network, 'Computer literacy' was connected with 'Health education,' 'Patient satisfaction,' 'Self-efficacy,' 'Attitude to computer,' etc. Research on older citizens' health literacy and their use of online health information was conducted together with study of computer literacy, patient education, attitude towards health, health education, patient satisfaction, etc. In particular, self-efficacy was noted as an important keyword. Further research should be conducted to identify the effective outcomes of self-efficacy in the area of interest.
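The keyword co-occurrence networks this study analyzes can be built directly from per-article keyword lists: keywords that appear in the same article are linked, and centrality falls out of the resulting graph. A minimal pure-Python sketch with hypothetical data (the original study used MeSH terms and dedicated network software):

```python
from collections import Counter
from itertools import combinations

def keyword_network(articles):
    """Build a keyword co-occurrence network and rank terms by degree.

    articles: list of keyword lists (e.g. MeSH terms per PubMed article).
    Returns (edge_weights, degree): edge_weights counts co-occurrences of
    each keyword pair; degree counts each keyword's distinct neighbors.
    """
    edges = Counter()
    neighbors = {}
    for kws in articles:
        for a, b in combinations(sorted(set(kws)), 2):
            edges[(a, b)] += 1
            neighbors.setdefault(a, set()).add(b)
            neighbors.setdefault(b, set()).add(a)
    degree = {k: len(v) for k, v in neighbors.items()}
    return edges, degree
```

Keywords with high degree (many distinct neighbors) correspond to the strongly connected concepts the abstract highlights, such as 'Health knowledge, attitudes, practices'.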

  10. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2012-12-01

    Full Text Available The dependency of today’s construction professionals to use singular commercial applications for design possibilities creates the risk of being dictated by the language-tools they use. This unknowingly approach to converting to the constraints of a particular computer application’s style, reduces one’s association with cutting-edge design as no single computer application can support all of the tasks associated with building-design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through using remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform ‘cloud computing’ required further analysis. The following Delphi questionnaires analysed the exchanging information opportunities of Building Information Modelling (BIM as the possible solution for the integration of applications on a cloud platform. The survey structure is modelled to; (i identify the most appropriate applications for advancing interoperability at the early design stage, (ii detect the most severe barriers of BIM implementation from a business and legal viewpoint, (iii examine the need for standards to address information exchange between design team, and (iv explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  11. Exploring the presentation of HPV information online: A semantic network analysis of websites.

    Science.gov (United States)

    Ruiz, Jeanette B; Barnett, George A

    2015-06-26

Negative vaccination-related information online leads some to opt out of recommended vaccinations. This study aimed to determine how HPV vaccine information is presented online and which concepts co-occur. A semantic network analysis of the words in first-page Google search results was conducted using three negative, three neutral, and three positive search terms for 10 base concepts such as HPV vaccine and HPV immunizations. In total, 223 of the 300 websites retrieved met inclusion requirements. Website information was analyzed using network statistics to determine which words appear most frequently, which words co-occur, and the sentiment of the words. High levels of word interconnectivity were found, suggesting a rich set of semantic links and a very integrated set of concepts. A limited number of words held centrality, indicating limited concept prominence. This dense network signifies concepts that are well connected. Negative words were most prevalent and were associated with describing the HPV vaccine's side effects as well as the negative effects of HPV and cervical cancer. A smaller cluster focuses on reporting negative vaccine side effects. Clustering shows the words women and girls closely located to the words sexually, virus, and infection. Information about the HPV vaccine online centered on a limited number of concepts. HPV vaccine benefits as well as the risks of HPV, including severity and susceptibility, were centrally presented. The word cluster results imply that HPV vaccine information for women and girls is discussed in more sexual terms than for men and boys. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    Directory of Open Access Journals (Sweden)

    David Nicholas

    2014-06-01

Full Text Available The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers as producers and consumers of scholarly information resources in respect to how they determine authority and trustworthiness. The research questions for the study arose out of CIBER’s studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publisher’s usage logs, which was undertaken at the start of the project in order to build an evidence base which would help calibrate the main methodological tools used by the project: interviews and questionnaires. The specific purpose of the log study was to identify and assess the digital usage behaviours that potentially raise trustworthiness and authority questions. Results from the self-report part of the study were additionally used to explain the logs. The main findings were that: (1) logs provide a good indicator of use and information-seeking behaviour, albeit in respect to just a part of the information-seeking journey; (2) the ‘lite’ form of information-seeking behaviour observed in the logs is a sign of users trying to make up their minds, in the face of a tsunami of information, as to what is relevant and to be trusted; (3) Google and Google Scholar are the discovery platforms of choice for academic researchers, which partly points to the fact that they are influenced in what they use and read by ease of access; (4) usage is not a suitable proxy for quality. The paper also provides contextual data from CIBER’s previous studies.

  13. Development of Design Information Template for Nuclear Power Plants for Electromagnetic Pulse (EMP) Effect Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Minyi; Ryu, Hosan; Ye, Songhae; Lee, Euijong [KNHP CRI, Daejeon (Korea, Republic of)

    2016-10-15

An electromagnetic pulse (EMP) is a transient electromagnetic shock wave with powerful electric and magnetic fields that can destroy electronic equipment. It is generally well known that EMPs can cause the malfunction and disorder of electronic equipment and serious damage to electric power systems and communication networks. Research is being carried out to protect nuclear power plants (NPPs) from EMP threats. Penetration routes of EMPs can be roughly categorized into two groups: radiated and conducted. The radiated effect refers to an impact transmitted to the ground from high-altitude electromagnetic pulses (HEMP). Such an impact may affect target equipment through the points of entry (POE) of the concrete structure of an NPP. The conducted effect refers to induced voltage or current coupled to the NPP's cable structure. The induced voltage and current affect the target equipment via connected cables. All these factors must be considered in EMP effect analysis for NPPs. To examine all factors, it is necessary to fully understand the schemes of NPPs. This paper presents a design information template of four types that can be used to analyze the EMP effect in operating nuclear power plants. In order to analyze the effects of EMPs on operating NPPs, we must consider both the conducted and radiated effects on the target (system, equipment, structure). For these reasons, not only equipment information but also information about the structure and external penetration points is required. We are developing a design information template for robust nuclear design information acquisition. We expect to develop a block diagram on the basis of the template.

  14. Poisson mixture distribution analysis for North Carolina SIDS counts using information criteria

    Directory of Open Access Journals (Sweden)

    Tyler Massaro

    2017-09-01

Full Text Available Mixture distribution analysis provides us with a tool for identifying unlabeled clusters that naturally arise in a data set. In this paper, we demonstrate how to use the information criteria AIC and BIC to choose the optimal number of clusters for a given set of univariate Poisson data. We give an empirical comparison between minimum Hellinger distance (MHD) estimation and EM estimation for finding parameters in a mixture of Poisson distributions with artificial data. In addition, we discuss Bayes error in the context of classification problems with mixtures of 2, 3, 4, and 5 Poisson models. Finally, we provide an example with real data, taken from a study that examined sudden infant death syndrome (SIDS) count data from 100 North Carolina counties (Symons et al., 1983). This gives us an opportunity to demonstrate the advantages of the proposed model framework in comparison with the original analysis.
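The workflow above, fitting Poisson mixtures of increasing order and scoring them with AIC/BIC, can be sketched in a few lines. This is a generic illustration using standard EM (not the paper's MHD estimator); the data and initialization are hypothetical.

```python
import math

def poisson_logpmf(x, lam):
    """Log probability mass of a Poisson(lam) distribution at count x."""
    return x * math.log(lam) - lam - math.lgamma(x + 1)

def fit_poisson_mixture(data, k, iters=200):
    """EM for a k-component Poisson mixture; returns (weights, rates, loglik)."""
    n = len(data)
    lo, hi = min(data), max(data)
    lams = [lo + (j + 1) * (hi - lo + 1) / (k + 1) for j in range(k)]
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities via the log-sum-exp trick
        resp = []
        for x in data:
            logp = [math.log(w[j]) + poisson_logpmf(x, lams[j]) for j in range(k)]
            m = max(logp)
            p = [math.exp(lp - m) for lp in logp]
            s = sum(p)
            resp.append([pi / s for pi in p])
        # M-step: update weights and rates from the responsibilities
        for j in range(k):
            nj = sum(r[j] for r in resp)
            w[j] = nj / n
            lams[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
    loglik = sum(
        math.log(sum(w[j] * math.exp(poisson_logpmf(x, lams[j])) for j in range(k)))
        for x in data)
    return w, lams, loglik

def aic_bic(loglik, k, n):
    """Information criteria: k rates plus (k - 1) free mixture weights."""
    p = 2 * k - 1
    return 2 * p - 2 * loglik, p * math.log(n) - 2 * loglik
```

Fitting each candidate k and choosing the model with the lowest AIC or BIC implements the cluster-count selection the paper describes.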

  15. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.

  16. Study on Network Error Analysis and Locating based on Integrated Information Decision System

    Science.gov (United States)

    Yang, F.; Dong, Z. H.

    2017-10-01

The integrated information decision system (IIDS) integrates multiple sub-systems developed by many facilities, comprising almost a hundred kinds of software, which provide various services such as email, short messages, and drawing and sharing. Because the underlying protocols differ and user standards are not unified, many errors occur during the setup, configuration, and operation stages, which seriously affect usage. Because the errors are varied and may happen in different operation phases and stages, TCP/IP communication protocol layers, and sub-system software, it is necessary to design a network error analysis and locating tool for IIDS to solve the above problems. This paper studies network error analysis and locating based on IIDS, which provides strong theoretical and technical support for the running and communication of IIDS.

  17. Competitiveness analysis on the grounds of information from the annual financial statements of the enterprise

    Directory of Open Access Journals (Sweden)

    Ivanova Rositsa

    2010-01-01

Full Text Available Competitiveness is a category innate in every economic subject. It shows itself under the conditions of market competition. Mutual relations and dependencies objectively exist between competition and competitiveness. The presence of competition is a prerequisite for the rise of competitiveness and for its manifestation in real business. In the scientific literature, there is no single agreed definition of the nature of competitiveness. This paper represents an attempt at systematization of the views on this scientific issue. The methodology and the methods of competitiveness analysis should be based on a system of methods. The purpose of this paper is to present possibilities for the use of information from the annual financial statements of enterprises for the analysis of their competitiveness, taking into consideration the legislative decisions and the specific business conditions in the Republic of Bulgaria.

  18. The application of ontology information system based on rough formal concept analysis

    Science.gov (United States)

    Wang, Yi

    2011-10-01

The role of this ontology is to provide a shared vocabulary and semantics between domain knowledge and vision. Ontology presents semantics for interpreting data instances. Formal concept lattices and rough set theory are two complementary mathematical tools for data analysis and data processing. The paper proposes a method for constructing an ontology-node information system based on rough formal concept analysis, in order to construct an ontology model for knowledge management. Experimental studies are carried out with rough sets and FCA in constructing the numbers of ontology nodes. The experimental results indicate that this method has great promise, showing that rough sets are superior to FCA in building the number of ontology nodes.
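For reference, formal concept analysis pairs each maximal set of objects with the attributes they all share; these (extent, intent) pairs are the lattice nodes the abstract counts. A brute-force sketch on a toy binary context (exponential in the number of objects, so purely illustrative; the paper's rough-set refinement is not shown):

```python
from itertools import combinations

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) of a binary context.

    context: dict mapping each object to a frozenset of its attributes.
    A concept pairs a maximal object set with the attributes they all share.
    """
    objects = set(context)
    attributes = set().union(*context.values()) if context else set()

    def common_attrs(objs):
        # attributes shared by every object in objs (all attributes if empty)
        if not objs:
            return frozenset(attributes)
        return frozenset(attributes.intersection(*(context[o] for o in objs)))

    def common_objs(attrs):
        # objects possessing every attribute in attrs
        return frozenset(o for o in objects if attrs <= context[o])

    concepts = set()
    # close every object subset: (extent, intent) is stable under the maps
    for r in range(len(objects) + 1):
        for objs in combinations(sorted(objects), r):
            intent = common_attrs(objs)
            extent = common_objs(intent)
            concepts.add((extent, intent))
    return concepts
```

The number of concepts returned corresponds to the ontology-node counts that the paper compares between the FCA and rough-set constructions.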

  19. Science information to support Missouri River Scaphirhynchus albus (pallid sturgeon) effects analysis

    Science.gov (United States)

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-26

    The Missouri River Pallid Sturgeon Effects Analysis (EA) was commissioned by the U.S. Army Corps of Engineers to develop a foundation of understanding of how pallid sturgeon (Scaphirhynchus albus) population dynamics are linked to management actions in the Missouri River. The EA consists of several steps: (1) development of comprehensive, conceptual ecological models illustrating pallid sturgeon population dynamics and links to management actions and other drivers; (2) compilation and assessment of available scientific literature, databases, and models; (3) development of predictive, quantitative models to explore the system dynamics and population responses to management actions; and (4) analysis and assessment of effects of system operations and actions on species’ habitats and populations. This report addresses the second objective, compilation and assessment of relevant information.

  20. Combined Analysis of GRIDICE and BOSS Information Recorded During CMS-LCG0 Production

    CERN Document Server

    Coviello, Tommaso; Donvito, Giacinto; Maggi, Giorgio; Maggi, Marcello; Pierro, A

    2004-01-01

Input parameters needed for a CMS data analysis computing model simulation were determined by a combined analysis of the data stored by the GridICE and BOSS monitoring systems from July to October 2003, while simulating two million events for CMS Data Challenge 2004 in a Grid-distributed environment on a dedicated CMS testbed (CMS-LCG0). During the production, the two monitoring systems recorded complementary information for each submitted job. In particular, by integrating data from both monitoring system databases, measurements of the job execution performance on the different processors used in the CMS-LCG0 testbed were obtained. First results of the simulation of the CMS-LCG0 testbed using the Ptolemy program are also reported.

  1. The Wind ENergy Data and Information (WENDI) Gateway: New Information and Analysis Tools for Wind Energy Stakeholders

    Science.gov (United States)

    Kaiser, D.; Palanisamy, G.; Santhana Vannan, S.; Wei, Y.; Smith, T.; Starke, M.; Wibking, M.; Pan, Y.; Devarakonda, Ranjeet; Wilson, B. E.; Wind Energy Data; Information (WENDI) Gateway Team

    2010-12-01

    In support of the U.S. Department of Energy’s (DOE) Energy Efficiency and Renewable Energy (EERE) Office, DOE's Oak Ridge National Laboratory (ORNL) has launched the Wind ENergy Data & Information (WENDI) Gateway. The WENDI Gateway is intended to serve a broad range of wind-energy stakeholders by providing easy access to a large amount of wind energy-related data and information through its two main interfaces: the Wind Energy Metadata Clearinghouse and the Wind Energy Geographic Information System (WindGIS). The Metadata Clearinghouse is a powerful, customized search tool for discovering, accessing, and sharing wind energy-related data and information. Its database of metadata records points users to a diverse array of wind energy-related resources: from technical and scientific journal articles to mass media news stories; from annual government and industry reports to downloadable datasets, and much more. Through the WindGIS, users can simultaneously visualize a wide spectrum of United States wind energy-related spatial data, including wind energy power plant locations; wind resource maps; state-level installed wind capacity, generation, and renewable portfolio standards; electric transmission lines; transportation infrastructure; interconnection standards; land ownership, designation, and usage; and various ecological data layers. In addition, WindGIS allows users to download much of the data behind the layers. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Wilson, Bruce E., et al. "Mercury Toolset for Spatiotemporal Metadata." (2010).

  2. Managing Information Uncertainty in Wave Height Modeling for the Offshore Structural Analysis through Random Set

    Directory of Open Access Journals (Sweden)

    Keqin Yan

    2017-01-01

Full Text Available This chapter presents a reliability study for an offshore jacket structure with emphasis on the features of nonconventional modeling. Firstly, a random set model is formulated for modeling the random waves at an ocean site. Then, a jacket structure is investigated in a pushover analysis to identify the critical wave direction and key structural elements, based on the ultimate base shear strength. The selected probabilistic models are adopted for the important structural members, and the wave direction is specified in the weakest direction of the structure for a conservative safety analysis. The wave height model is processed in a P-box format when it is used in the numerical analysis. The models are applied to find the bounds of the failure probabilities for the jacket structure. The propagation of this wave model's uncertainty into the results is investigated in both an interval analysis and a Monte Carlo simulation. The results are compared in the context of information content and numerical accuracy. Further, the failure probability bounds are compared with the conventional probabilistic approach.
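The interval/Monte Carlo propagation described above can be sketched as follows: when epistemic uncertainty leaves each sampled wave height known only to an interval (the P-box view), and the structural demand is monotone in wave height, propagating the two interval endpoints yields lower and upper bounds on the failure probability. Everything here (the exponential height model, the quadratic demand, the capacity) is a hypothetical stand-in for the chapter's jacket-structure models.

```python
import random

def failure_probability_bounds(n, capacity, draw_interval, demand, seed=1):
    """Interval Monte Carlo sketch: each sample yields an interval of wave
    heights; a monotone demand model maps it to bounds on the failure event.

    Returns (lower, upper) bounds on P(demand > capacity).
    """
    rng = random.Random(seed)
    lower = upper = 0
    for _ in range(n):
        h_lo, h_hi = draw_interval(rng)
        # demand is monotone in wave height, so the endpoints suffice
        if demand(h_lo) > capacity:   # even the optimistic height fails
            lower += 1
        if demand(h_hi) > capacity:   # the pessimistic height fails
            upper += 1
    return lower / n, upper / n
```

The gap between the two bounds reflects the epistemic (interval) uncertainty, while the Monte Carlo sampling captures the aleatory wave variability.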

  3. Lexical analysis in schizophrenia: how emotion and social word use informs our understanding of clinical presentation.

    Science.gov (United States)

    Minor, Kyle S; Bonfils, Kelsey A; Luther, Lauren; Firmin, Ruth L; Kukla, Marina; MacLain, Victoria R; Buck, Benjamin; Lysaker, Paul H; Salyers, Michelle P

    2015-05-01

    The words people use convey important information about internal states, feelings, and views of the world around them. Lexical analysis is a fast, reliable method of assessing word use that has shown promise for linking speech content, particularly in emotion and social categories, with psychopathological symptoms. However, few studies have utilized lexical analysis instruments to assess speech in schizophrenia. In this exploratory study, we investigated whether positive emotion, negative emotion, and social word use was associated with schizophrenia symptoms, metacognition, and general functioning in a schizophrenia cohort. Forty-six participants generated speech during a semi-structured interview, and word use categories were assessed using a validated lexical analysis measure. Trained research staff completed symptom, metacognition, and functioning ratings using semi-structured interviews. Word use categories significantly predicted all variables of interest, accounting for 28% of the variance in symptoms and 16% of the variance in metacognition and general functioning. Anger words, a subcategory of negative emotion, significantly predicted greater symptoms and lower functioning. Social words significantly predicted greater metacognition. These findings indicate that lexical analysis instruments have the potential to play a vital role in psychosocial assessments of schizophrenia. Future research should replicate these findings and examine the relationship between word use and additional clinical variables across the schizophrenia-spectrum. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Benefit of hindsight: systematic analysis of coronial inquest data to inform patient safety in hospitals.

    Science.gov (United States)

    Pudney, Val; Grech, Carol

    2016-09-01

    Objective The aim of the present study was to explore the potential of coronial inquest data to inform patient safety improvement in hospitals at a system level. Methods A retrospective analysis of 20 years of South Australian (SA) coronial inquest findings was performed using both qualitative content analysis methods and statistical descriptive analyses. Results In all, 113 cases were analysed. More than one-third of deaths (39%) were associated with emergency care. Analysis revealed 11 recurrent themes and two notable contributing factors that highlighted specific areas of concern for SA hospitals over that time period. The most common action recommended by coroners (49.6%; n=56 cases) was the review or development of policy, protocol, procedure or guidelines designed to improve patient care. In almost one-quarter (24%) of deaths reviewed, coroners alerted health authorities to poor standards of care and/or instructed individual clinicians to review the standard of their clinical practice. Conclusions The analysis provided a retrospective review of coronial inquest data associated with hospital care over a 20-year period. The findings highlight specific areas of concern for patient safety over that time. More broadly, this analysis contributes to an emerging body of evidence in the Australian academic literature that demonstrates the value of systematic analysis of coronial data at a system level to inform patient safety improvement in Australian healthcare. What is known about the topic? Australian coroners have an important role to play in public health and safety. Many areas of social inquiry across Australia use coronial inquest data to identify recurrent hazards and assist in the development of relevant social policy. However, there is very little research reported in the academic literature that associates analyses of coronial data with patient safety improvement in healthcare. 
Although coronial recommendations made from individual cases of avoidable death

  5. 76 FR 65317 - 60-Day Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM)

    Science.gov (United States)

    2011-10-20

..., Risk Management and Analysis (RAM) ACTION: Notice of request for public comments. SUMMARY: The... of 1995. Title of Information Collection: Risk Analysis and Management. OMB Control Number: None.... Methodology: The State Department is implementing a Risk Analysis and Management Program to vet potential...

  6. A Review of Methods for Analysis of the Expected Value of Information.

    Science.gov (United States)

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2017-10-01

    In recent years, value-of-information analysis has become more widespread in health economic evaluations, specifically as a tool to guide further research and perform probabilistic sensitivity analysis. This is partly due to methodological advancements allowing for the fast computation of a typical summary known as the expected value of partial perfect information (EVPPI). A recent review discussed some approximation methods for calculating the EVPPI, but as the research has been active over the intervening years, that review does not discuss some key estimation methods. Therefore, this paper presents a comprehensive review of these new methods. We begin by providing the technical details of these computation methods. We then present two case studies in order to compare the estimation performance of these new methods. We conclude that a method based on nonparametric regression offers the best method for calculating the EVPPI in terms of accuracy, computational time, and ease of implementation. This means that the EVPPI can now be used practically in health economic evaluations, especially as all the methods are developed in parallel with R functions and a web app to aid practitioners.
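The regression-based approach the review favours can be sketched in a few lines. This is a minimal illustration with invented numbers, not the authors' implementation: net-benefit samples for two hypothetical treatments are regressed on the parameter of interest (a simple polynomial fit standing in for the nonparametric smoothers discussed in the paper), and the EVPPI is the gap between the mean of the per-sample best fitted net benefit and the best overall mean net benefit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Illustrative decision model: two treatments whose net benefit depends
# on an uncertain parameter theta plus all remaining uncertainty (noise).
theta = rng.normal(0.0, 1.0, n)           # parameter of interest
noise = rng.normal(0.0, 1.0, (n, 2))      # all other uncertain inputs
nb = np.column_stack([                    # net benefit per treatment
    1000 + 500 * theta + 300 * noise[:, 0],
    1200 - 400 * theta + 300 * noise[:, 1],
])

# Regression-based EVPPI: regress each treatment's net benefit on theta,
# then compare "mean of the best fitted value" with "best of the means".
fitted = np.column_stack([
    np.polyval(np.polyfit(theta, nb[:, d], 3), theta) for d in range(2)
])
evppi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
print(round(evppi, 1))
```

Here a positive EVPPI indicates that learning theta exactly would, on average, change which treatment is optimal often enough to be worth something.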

  7. Managing Returnable Containers Logistics - A Case Study Part I - Physical and Information Flow Analysis

    Directory of Open Access Journals (Sweden)

    Reza A. Maleki

    2011-05-01

Full Text Available This case study paper is the result of a project conducted on behalf of a company, hereon referred to as Midwest Assembly and Manufacturing or MAAN. The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its facility as well as at the supplier sites. The objective of this project was to develop a proposal to enable MAAN to more effectively track and manage its returnable containers. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach to the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.

  8. Information system analysis of an e-learning system used for dental restorations simulation.

    Science.gov (United States)

    Bogdan, Crenguţa M; Popovici, Dorin M

    2012-09-01

The goal of using virtual and augmented reality technologies for simulating therapeutic interventions in fixed prosthodontics (the VirDenT project) is to increase the quality of the educational process in dental faculties by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning, virtual reality-based software system that will be used to develop the tooth-grinding skills needed for all-ceramic restorations. The complexity of the domain problem that the software system deals with made an analysis of the information system supported by VirDenT necessary. The analysis comprises the following activities: identification and classification of the system stakeholders, description of the business processes, formulation of the business rules, and modelling of business objects. During this stage, we constructed the context diagram, the business use case diagram, the activity diagrams and the class diagram of the domain model. These models are useful for the further development of the software system that implements the VirDenT information system. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Understanding and Managing Our Earth through Integrated Use and Analysis of Geo-Information

    Directory of Open Access Journals (Sweden)

    Wolfgang Kainz

    2011-09-01

Full Text Available All things in our world are related to some location in space and time, and according to Tobler's first law of geography "everything is related to everything else, but near things are more related than distant things" [1]. For as long as humans have existed, they have contemplated space and time and tried to depict and manage the geographic space they live in. We know of graphic representations of the land from various regions of the world dating back several thousand years. The processing and analysis of spatial data has a long history in the disciplines that deal with spatial data, such as geography, surveying engineering, cartography, photogrammetry, and remote sensing. Until recently, all these activities were analog in nature; only since the invention of the computer in the second half of the 20th century, and the use of computers for the acquisition, storage, analysis, and display of spatial data starting in the 1960s, do we speak of geo-information and geo-information systems. [...

  10. Content Analysis of Papers Submitted to Communications in Information Literacy, 2007-2013

    Directory of Open Access Journals (Sweden)

    Christopher V. Hollister

    2014-07-01

    Full Text Available The author conducted a content analysis of papers submitted to the journal, Communications in Information Literacy, from the years 2007-2013. The purpose was to investigate and report on the overall quality characteristics of a statistically significant sample of papers submitted to a single-topic, open access, library and information science (LIS journal. Characteristics of manuscript submissions, authorship, reviewer evaluations, and editorial decisions were illuminated to provide context; particular emphasis was given to the analysis of major criticisms found in reviewer evaluations of rejected papers. Overall results were compared to previously published research. The findings suggest a trend in favor of collaborative authorship, and a possible trend toward a more practice-based literature. The findings also suggest a possible deterioration in some of the skills that are required of LIS authors relative to the preparation of scholarly papers. The author discusses potential implications for authors and the disciplinary literature, recommends directions for future research, and where possible, provides recommendations for the benefit of the greater community of LIS scholars.

  11. A systematic approach for analysis and design of secure health information systems.

    Science.gov (United States)

    Blobel, B; Roger-France, F

    2001-06-01

A toolset using object-oriented techniques, including the now-popular unified modelling language (UML) approach, has been developed to facilitate the different users' views for the security analysis and design of health care information systems. The paradigm and concepts used are based on the component architecture of information systems and on a general layered security model. The toolset was developed in 1996/1997 within the ISHTAR project funded by the European Commission, as well as through international standardisation activities. Analysing and systematising real health care scenarios, only six and nine use case types could be found in the health and the security-related view, respectively. By combining these use case types, the analysis and design of any conceivable system architecture can be simplified significantly. Based on generic schemes, the environment needed for both communication and application security can be established by appropriate sets of security services and mechanisms. Because of the importance and the basic character of electronic health care record (EHCR) systems, understanding of the approach is facilitated by (incomplete) examples for this application.

  12. Congruence analysis of geodetic networks - hypothesis tests versus model selection by information criteria

    Science.gov (United States)

    Lehmann, Rüdiger; Lösler, Michael

    2017-12-01

Geodetic deformation analysis can be interpreted as a model selection problem. The null model indicates that no deformation has occurred; it is opposed to a number of alternative models, which stipulate different deformation patterns. A common way to select the right model is the use of a statistical hypothesis test. However, since we have to test a series of deformation patterns, this must be a multiple test. As an alternative solution to the test problem, we propose the p-value approach. Another approach arises from information theory. Here, the Akaike information criterion (AIC) or some alternative is used to select an appropriate model for a given set of observations. Both approaches are discussed and applied to two test scenarios: a synthetic levelling network and the Delft test data set. It is demonstrated that both work but behave differently, sometimes even producing different results. Hypothesis tests are well established in geodesy, but may suffer from an unfavourable choice of decision error rates. The multiple test also suffers from statistical dependencies between the test statistics, which are neglected. Both problems are overcome by applying information criteria such as the AIC.
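The information-criterion route can be illustrated with a toy levelling example (the numbers are invented, not the Delft data set): compute the Gaussian AIC, n·ln(RSS/n) + 2k, for the null model (no deformation) and for an alternative model stipulating a block shift, and select whichever has the lower value.

```python
import numpy as np

# Illustrative data: heights of 6 points in two epochs, where points 4-6
# have subsided by ~5 mm between epochs (plus 1 mm measurement noise).
rng = np.random.default_rng(1)
h0 = np.array([10.0, 12.0, 11.0, 13.0, 12.5, 11.5])             # epoch 1 [m]
h1 = h0 + np.array([0, 0, 0, -0.005, -0.005, -0.005]) + rng.normal(0, 0.001, 6)
d = h1 - h0                                                     # displacements

def aic(rss, n, k):
    # Gaussian AIC up to an additive constant: n * ln(RSS / n) + 2k
    return n * np.log(rss / n) + 2 * k

n = len(d)
# Null model: no deformation (d = 0 everywhere), k = 0 parameters.
aic_null = aic(np.sum(d ** 2), n, 0)
# Alternative model: common shift on points 4-6, one parameter (k = 1).
shift = d[3:].mean()
rss_alt = np.sum(d[:3] ** 2) + np.sum((d[3:] - shift) ** 2)
aic_alt = aic(rss_alt, n, 1)

print(aic_alt < aic_null)  # → True (lower AIC wins: deformation model selected)
```

Unlike a hypothesis test, no decision error rate has to be chosen; the 2k term penalises the extra shift parameter automatically.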

  13. Mapping informative clusters in a hierarchical [corrected] framework of FMRI multivariate analysis.

    Directory of Open Access Journals (Sweden)

    Rui Xu

Full Text Available Pattern recognition methods have become increasingly popular in fMRI data analysis; they are powerful in discriminating between multi-voxel patterns of brain activity associated with different mental states. However, when they are used in functional brain mapping, the location of discriminative voxels varies significantly, raising difficulties in interpreting the locus of the effect. Here we propose a hierarchical framework of multivariate analysis that maps informative clusters rather than voxels, to achieve reliable functional brain mapping without compromising discriminative power. In particular, we first search for local homogeneous clusters consisting of voxels with similar response profiles. Then, a multi-voxel classifier is built for each cluster to extract discriminative information from the multi-voxel patterns. Finally, through multivariate ranking, outputs from the classifiers serve as a multi-cluster pattern used to identify informative clusters by examining interactions among clusters. Results from both simulated and real fMRI data demonstrate that this hierarchical approach yields more robust functional brain mapping than traditional voxel-based multivariate methods. In addition, the mapped clusters were highly overlapping for two perceptually equivalent object categories, further confirming the validity of our approach. In short, the hierarchical framework of multivariate analysis is suitable for both pattern classification and brain mapping in fMRI studies.

  14. Evidence of global demand for medication abortion information: an analysis of www.medicationabortion.com.

    Science.gov (United States)

    Foster, Angel M; Wynn, L L; Trussell, James

    2014-03-01

    The worldwide expansion of the Internet offers an important modality of disseminating medically accurate information about medication abortion. We chronicle the story of www.medicationabortion.com, an English-, Spanish-, Arabic- and French-language website dedicated to three early abortion regimens. We evaluated the website use patterns from 2005 through 2009. We also conducted a content and thematic analysis of 1910 emails submitted during this period. The website experienced steady growth in use. In 2009, it received 35,000 visits each month from more than 20,000 unique visitors and was accessed by users in 208 countries and territories. More than half of all users accessed the website from a country in which abortion is legally restricted. Users from more than 40 countries sent emails with individual questions. Women often wrote in extraordinary detail about the circumstances of their pregnancies and attempts to obtain an abortion. These emails also reflect considerable demand for information about the use of misoprostol for self-induction. The use patterns of www.medicationabortion.com indicate that there is significant demand for online information about abortion, and the findings suggest future priorities for research, collaboration and educational outreach. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Value of information analysis for groundwater quality monitoring network design Case study: Eocene Aquifer, Palestine

    Science.gov (United States)

    Khader, A.; McKee, M.

    2010-12-01

Value of information (VOI) analysis evaluates the benefit of collecting additional information to reduce or eliminate uncertainty in a specific decision-making context. It makes explicit any expected potential losses from errors in decision making due to uncertainty and identifies the "best" information collection strategy as the one that leads to the greatest expected net benefit to the decision maker. This study investigates the willingness to pay for groundwater quality monitoring in the Eocene Aquifer, Palestine, an unconfined aquifer located in the northern part of the West Bank that is used by 128,000 Palestinians to meet domestic and agricultural demands. The study takes into account the consequences of pollution and the options the decision maker might face. Since nitrate is the major pollutant in the aquifer, the consequences of nitrate pollution, chiefly the possibility of methemoglobinemia (blue baby syndrome), were analyzed. In this case, the value of monitoring was compared to the costs of treating methemoglobinemia and the costs of other options, such as water treatment, using bottled water, or importing water from outside the aquifer. Finally, an optimal monitoring network that takes into account the uncertainties in recharge (climate), aquifer properties (hydraulic conductivity) and pollutant chemical reaction (decay factor), as well as the value of monitoring, is designed by utilizing a sparse Bayesian modeling algorithm called a relevance vector machine.
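The core calculation behind such an analysis is the expected value of perfect information: the gap between the expected cost of the best action chosen without monitoring and the expected cost when the aquifer's state is known before acting. A minimal sketch with invented numbers, not the study's figures:

```python
import numpy as np

# Hypothetical state probabilities and costs, for illustration only.
p = np.array([0.3, 0.7])                  # P(contaminated), P(clean)
# Cost ($/yr) for each (action, state):
#                 contaminated   clean
costs = np.array([[ 50_000,     50_000],  # treat the water regardless
                  [500_000,          0]]) # do nothing (health costs if contaminated)

# Without monitoring: pick the single action with the lowest expected cost.
best_without = (costs @ p).min()

# With perfect information: observe the state, then pick the best action.
best_with = (costs.min(axis=0) * p).sum()

evpi = best_without - best_with   # the most a perfect monitor is worth
print(round(evpi))  # → 35000
```

Any real monitoring network delivers imperfect information, so its value is bounded above by this EVPI; the study's Bayesian model refines this bound per candidate network design.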

  16. [Analysis of health terminologies for use as ontologies in healthcare information systems].

    Science.gov (United States)

    Romá-Ferri, Maria Teresa; Palomar, Manuel

    2008-01-01

Ontologies are a resource that allows meaning to be represented computationally, thus avoiding the limitations imposed by standardized terms. The objective of this study was to establish the extent to which terminologies could be used for the design of ontologies, which could serve as an aid in resolving problems such as semantic interoperability and knowledge reusability in healthcare information systems. To determine the extent to which terminologies could be used as ontologies, six of the most important terminologies in the clinical, epidemiologic, documentation and administrative-economic contexts were analyzed. The following characteristics were verified: conceptual coverage, hierarchical structure, conceptual granularity of the categories, conceptual relations, and the language used for conceptual representation. The MeSH, DeCS and UMLS ontologies were considered lightweight. The main differences among these ontologies concern conceptual specification, the types of relation, and the restrictions among the associated concepts. The SNOMED and GALEN ontologies have a declarative formalism based on description logics. These ontologies include explicit qualities, show greater restrictions among associated concepts and rule combinations, and were consequently considered heavyweight. Analysis of the declared representation of the terminologies shows the extent to which they could be reused as ontologies. Their degree of usability depends on whether the aim is for healthcare information systems to solve problems of semantic interoperability (lightweight ontologies) or to reuse the systems' knowledge as an aid to decision making (heavyweight ontologies) and for non-structured information retrieval, extraction, and classification.

  17. Personal information of adolescents on the Internet: A quantitative content analysis of MySpace.

    Science.gov (United States)

    Hinduja, Sameer; Patchin, Justin W

    2008-02-01

    Many youth have recently embraced online social networking sites such as MySpace (myspace.com) to meet their social and relational needs. While manifold benefits stem from participating in such web-based environments, the popular media has been quick to demonize MySpace even though an exponentially small proportion of its users have been victimized due to irresponsible or naive usage of the technology it affords. Major concerns revolve around the possibility of sexual predators and pedophiles finding and then assaulting adolescents who carelessly or unwittingly reveal identifiable information on their personal profile pages. The current study sought to empirically ascertain the type of information youth are publicly posting through an extensive content analysis of randomly sampled MySpace profile pages. Among other findings, 8.8% revealed their full name, 57% included a picture, 27.8% listed their school, and 0.3% provided their telephone number. When considered in its proper context, these results indicate that the problem of personal information disclosure on MySpace may not be as widespread as many assume, and that the overwhelming majority of adolescents are responsibly using the web site. Implications for Internet safety among adolescents and future research regarding adolescent Internet use are discussed.

  18. Development of SNS Stream Analysis Based on Forest Disaster Warning Information Service System

    Science.gov (United States)

    Oh, J.; KIM, D.; Kang, M.; Woo, C.; Kim, D.; Seo, J.; Lee, C.; Yoon, H.; Heon, S.

    2017-12-01

Forest disasters, such as landslides and wildfires, cause huge economic losses and casualties, and the cost of recovery is increasing every year. While forest disaster mitigation technologies have so far focused on prevention and response, they are now required to support evacuation and pre-emptive evacuation, and to be fused with ICT. In this study, we analyze the SNS (Social Network Service) stream and implement a system that detects messages reporting that a forest disaster has occurred, or is likely to occur, by searching in real time for keywords related to forest disasters. More accurate forest disaster messages can be detected by repeatedly training on the retrieved results with machine learning techniques. To do this, we designed and implemented a system based on Hadoop and Spark, distributed parallel processing platforms, to handle messages from the open Twitter stream. To develop the technology for delivering forest disaster risk information, we examined the linkage of technologies such as the mobile CBS (Cell Broadcasting System), internet-based civil defense sirens and SNS, along with the legal and institutional issues in applying these technologies. We also developed the protocol for a forest disaster warning information service system that can deliver the SNS analysis results. As a result, real-time big data analysis of SNS messages posted during forest disasters made it possible to grasp the disaster situation in real time. In addition, we confirmed that alarms or warnings can be propagated rapidly according to the disaster situation by using the notification function of the forest disaster warning information service. However, restrictions on the opening and sharing of SNS data from services currently in operation, and the disclosure of personal information, limit the system's application and remain problems to be solved in the future. Keyword : SNS stream, Big data, Machine
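The first stage described above, keyword-based candidate detection on a message stream, can be sketched as follows. The keywords and messages are illustrative; in the paper's system a filter of this kind runs over the Twitter stream on Hadoop/Spark before the machine learning refinement step:

```python
# Minimal sketch of the keyword pre-filter: scan a message stream for
# forest-disaster keywords, keeping matches for later classifier training.
KEYWORDS = {"landslide", "wildfire", "forest fire", "debris flow", "evacuation"}

def is_candidate(message: str) -> bool:
    """True if the message mentions any forest-disaster keyword."""
    text = message.lower()
    return any(kw in text for kw in KEYWORDS)

stream = [
    "Huge wildfire spreading near the ridge, smoke everywhere",
    "Great coffee at the new cafe downtown",
    "Landslide blocked the mountain road after heavy rain",
]
candidates = [m for m in stream if is_candidate(m)]
print(len(candidates))  # → 2
```

The pre-filter deliberately over-collects; the repeated machine-learning pass then prunes false positives such as figurative uses of the keywords.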

  19. Informational and Linguistic Analysis of Large Genomic Sequence Collections via Efficient Hadoop Cluster Algorithms.

    Science.gov (United States)

    Ferraro Petrillo, Umberto; Roscigno, Gianluca; Cattaneo, Giuseppe; Giancarlo, Raffaele

    2018-01-12

Information theoretic and compositional/linguistic analysis of genomes has a central role in bioinformatics, even more so since the associated methodologies are becoming very valuable for epigenomic and meta-genomic studies as well. The kernel of those methods is the collection of k-mer statistics, i.e., how many times each k-mer in {A, C, G, T}^k occurs in a DNA sequence. Although this problem is computationally very simple and efficiently solvable on a conventional computer, the sheer amount of data now available in applications demands a resort to parallel and distributed computing. Indeed, algorithms of this type have been developed to collect k-mer statistics in the realm of genome assembly. However, they are so specialized to this domain that they do not extend easily to the computation of informational and linguistic indices, concurrently on sets of genomes. Following the well-established approach in many disciplines, with growing success also in bioinformatics, of resorting to MapReduce and Hadoop to deal with "Big Data" problems, we present KCH, the first set of MapReduce algorithms able to perform concurrent informational and linguistic analysis of large collections of genomic sequences on a Hadoop cluster. The benchmarking of KCH that we provide indicates that it is quite effective and versatile. It is also competitive with respect to the parallel and distributed algorithms highly specialized to k-mer statistics collection for genome assembly problems. In conclusion, KCH is a much needed addition to the growing number of algorithms and tools that use MapReduce for bioinformatics core applications. The software, including instructions for running it over Amazon AWS, as well as the datasets, are available at http://www.di-srv.unisa.it/KCH. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved.
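The serial kernel that KCH parallelises, collecting k-mer statistics and deriving an informational index from them, can be sketched in a few lines (this illustrates the single-machine idea only, not the MapReduce implementation):

```python
from collections import Counter
import math

def kmer_counts(seq: str, k: int) -> Counter:
    """Count every k-mer occurring in a DNA sequence (sliding window)."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = kmer_counts("ACGTACGTAC", 2)
print(counts["AC"], counts["CG"])  # → 3 2

# A simple informational index: Shannon entropy of the k-mer distribution.
total = sum(counts.values())
entropy = -sum(c / total * math.log2(c / total) for c in counts.values())
print(round(entropy, 2))  # → 1.97
```

In the MapReduce setting, mappers emit (k-mer, 1) pairs over sequence chunks and reducers sum them, after which indices like this entropy are computed from the aggregated counts.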

  20. A path analysis study of retention of healthcare professionals in urban India using health information technology.

    Science.gov (United States)

    Bhattacharya, Indrajit; Ramachandran, Anandhi

    2015-07-31

    Healthcare information technology (HIT) applications are being ubiquitously adopted globally and have been indicated to have effects on certain dimensions of recruitment and retention of healthcare professionals. Retention of healthcare professionals is affected by their job satisfaction (JS), commitment to the organization and intention to stay (ITS) that are interlinked with each other and influenced by many factors related to job, personal, organization, etc. The objectives of the current study were to determine if HIT was one among the factors and, if so, propose a probable retention model that incorporates implementation and use of HIT as a strategy. This was a cross-sectional survey study covering 20 hospitals from urban areas of India. The sample (n = 586) consisted of doctors, nurses, paramedics and hospital administrators. Data was collected through a structured questionnaire. Factors affecting job satisfaction were determined. Technology acceptance by the healthcare professionals was also determined. Interactions between the factors were predicted using a path analysis model. The overall satisfaction rate of the respondents was 51 %. Based on factor analysis method, 10 factors were identified for JS and 9 factors for ITS. Availability and use of information technology was one factor that affected JS. The need for implementing technology influenced ITS through work environment and career growth. Also, the study indicated that nearly 70 % of the respondents had awareness of HIT, but only 40 % used them. The importance of providing training for HIT applications was stressed by many respondents. The results are in agreement with literature studies exploring job satisfaction and retention among healthcare professionals. Our study documented a relatively medium level of job satisfaction among the healthcare professionals in the urban area. Information technology was found to be one among the factors that can plausibly influence their job satisfaction and

  1. Petrophysical Analysis and Geographic Information System for San Juan Basin Tight Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Martha Cather; Robert Lee; Robert Balch; Tom Engler; Roger Ruan; Shaojie Ma

    2008-10-01

The primary goal of this project is to increase the availability and ease of access to critical data on the Mesaverde and Dakota tight gas reservoirs of the San Juan Basin. Secondary goals include tuning well log interpretations through integration of core, water chemistry and production analysis data to help identify bypassed pay zones; increased knowledge of permeability ratios and how they affect well drainage and thus infill drilling plans; improved time-depth correlations through regional mapping of sonic logs; and improved understanding of the variability of formation waters within the basin through spatial analysis of water chemistry data. The project will collect, integrate, and analyze a variety of petrophysical and well data concerning the Mesaverde and Dakota reservoirs of the San Juan Basin, with particular emphasis on data available in the areas defined as tight gas areas for purposes of FERC. A relational, geo-referenced database (a geographic information system, or GIS) will be created to archive this data. The information will be analyzed using neural networks, kriging, and other statistical interpolation/extrapolation techniques to fine-tune regional well log interpretations, improve pay zone recognition from old logs or cased-hole logs, determine permeability ratios, and analyze water chemistries and compatibilities within the study area. This single-phase project will be accomplished through four major tasks: Data Collection, Data Integration, Data Analysis, and User Interface Design. Data will be extracted from existing databases as well as paper records, then cleaned and integrated into a single GIS database. Once the data warehouse is built, several methods of data analysis will be used both to improve pay zone recognition in single wells and to extrapolate a variety of petrophysical properties on a regional basis. A user interface will provide tools to make the data and results of the study accessible and useful.
The final deliverable

  2. Analysis of satisfaction factors at urban transport interchanges: Measuring travelers’ attitudes to information, security and waiting

    Energy Technology Data Exchange (ETDEWEB)

    Lois Garcia, D.; Monzon de Caceres, A.; Hernandez del Olmo, S.

    2016-07-01

Transport interchanges can be considered both as nodes, where people transfer from one mode to another, and as places to stay, with facilities, services and waiting areas. Reducing the disruption of transfers in multimodal trips is a key element in assuring seamless mobility in big cities. Building on previous research (Hernández & Monzón, 2016), this paper explores the capacity of attitudes towards several service factors to predict general satisfaction with a transport interchange. In addition, it analyses how personal and trip characteristics relate to the evaluation of some variables, and examines the influence of waiting time on perceived quality. To that end, a two-step methodology (personal and online interviews) was applied to a representative sample of 740 users (54% female, 55% travelling for work). We performed a path analysis to test the model, which showed a satisfactory statistical fit and good performance in predicting general satisfaction at the Moncloa Transport Interchange (Madrid, Spain). The outputs of the model indicate that the Information and Safety & Security factors predicted 49% of general satisfaction. The results also showed a strong association with the evaluation of Design and Environmental Quality, factors that do not affect general satisfaction directly but do so through the perception of Information and Safety & Security, which act as mediator variables. Time spent queuing inside the interchange had a negative influence on Information and Safety & Security, while the age of participants negatively affected Information, meaning that older users face some cognitive accessibility problems. Moreover, our data show gender differences in safety perception, since women, particularly the youngest, feel less safe inside the interchange. The results point to a number of priority measures for improvement. (Author)

  3. Teen suicide information on the internet: a systematic analysis of quality.

    Science.gov (United States)

    Szumilas, Magdalena; Kutcher, Stan

    2009-09-01

    To synthesize the literature on youth suicide risk factors (RFs) and prevention strategies (PSs); evaluate quality of information regarding youth suicide RFs and PSs found on selected Canadian websites; determine if website source was related to evidence-based rating (EBR); and determine the association of website quality indicators with EBR. Five systematic reviews of youth suicide research were analyzed to assemble the evidence base for RFs and PSs. The top 20 most commonly accessed youth suicide information websites were analyzed for quality indicators and EBR. Univariate logistic regression was conducted to determine if quality indicators predicted statements supported by evidence (SSEs). Multivariate analysis was used to calculate adjusted odds ratios for SSEs and quality indicators. Only 44.2% of statements were SSEs. The 10 most highly ranked websites contained almost 80% of the total statements analyzed, and one-half had a negative EBR. Compared with government websites, nonprofit organization websites were more likely (OR 1.45, 95% CI 0.66 to 3.18), and personal and media websites were less likely (OR 0.62, 95% CI 0.26 to 1.47), to have a positive EBR. Crediting of an author (AOR 2.65, 95% CI 1.34 to 5.28), and recommendation to consult a health professional (AOR 2.08, 95% CI 1.18 to 3.68), increased the odds of SSEs. Fundamental to addressing youth suicide is the availability of high-quality, evidence-based information accessible to the public, health providers, and policy-makers. Many websites, including those sponsored by the federal government and national organizations, need to improve the evidence-based quality of the information provided.
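The univariate odds ratios reported above follow the standard 2x2-table form. A sketch with hypothetical counts, chosen only to land near the reported OR of 2.65 for crediting an author and not taken from the study:

```python
import math

# Hypothetical 2x2 table: statements with/without a credited author vs
# whether the statement was supported by evidence (SSE).
#                  SSE   not SSE
with_author    = (53,      20)
without_author = (50,      50)

def odds_ratio(exposed, unexposed):
    """Cross-product odds ratio (a*d) / (b*c) from two table rows."""
    a, b = exposed
    c, d = unexposed
    return (a * d) / (b * c)

or_author = odds_ratio(with_author, without_author)
# 95% CI on the log-odds scale: log(OR) +/- 1.96 * SE, Woolf's method.
se = math.sqrt(sum(1 / x for x in with_author + without_author))
lo, hi = (math.exp(math.log(or_author) + s * 1.96 * se) for s in (-1, 1))
print(round(or_author, 2))  # → 2.65
```

The study's adjusted odds ratios additionally control for the other quality indicators via multivariate logistic regression, which this single-table calculation does not do.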

  4. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Science.gov (United States)

    Malina, Edward; Yoshida, Yukio; Matsunaga, Tsuneo; Muller, Jan-Peter

    2018-02-01

    Atmospheric methane comprises multiple isotopic molecules, with the most abundant being 12CH4 and 13CH4, making up 98 and 1.1 % of atmospheric methane respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic methane, e.g. marshland, or abiogenic methane, e.g. fracking) via a ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between -10 and -80 ‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. Initially, we suggest that a δ13C difference of 10 ‰ is sufficient to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.
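    The δ13C value discussed above is conventionally computed from the isotopologue ratio against the VPDB standard. A minimal sketch follows; the column amounts are illustrative only, not retrieved values.

```python
# delta-13C from 13CH4 and 12CH4 column amounts, relative to the
# Vienna Pee Dee Belemnite (VPDB) standard ratio.
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB standard

def delta13C(c13, c12):
    """delta-13C in per mil (per thousand) from isotopologue amounts."""
    return ((c13 / c12) / R_VPDB - 1.0) * 1000.0

# Illustrative column-averaged amounts (ppbv): the abstract's 19.14 ppbv
# 13CH4 paired with a hypothetical 12CH4 column, not real retrievals.
ch4_13, ch4_12 = 19.14, 1862.0
print(f"delta13C = {delta13C(ch4_13, ch4_12):.1f} per mil")
```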

  5. Information content analysis: the potential for methane isotopologue retrieval from GOSAT-2

    Directory of Open Access Journals (Sweden)

    E. Malina

    2018-02-01

    Full Text Available Atmospheric methane comprises multiple isotopic molecules, with the most abundant being 12CH4 and 13CH4, making up 98 and 1.1 % of atmospheric methane respectively. It has been shown that it is possible to distinguish between sources of methane (biogenic methane, e.g. marshland, or abiogenic methane, e.g. fracking) via a ratio of these main methane isotopologues, otherwise known as the δ13C value. δ13C values typically range between −10 and −80 ‰, with abiogenic sources closer to zero and biogenic sources showing more negative values. Initially, we suggest that a δ13C difference of 10 ‰ is sufficient to differentiate between methane source types; based on this, we derive that a precision of 0.2 ppbv on 13CH4 retrievals may achieve the target δ13C variance. Using an application of the well-established information content analysis (ICA) technique for assumed clear-sky conditions, this paper shows that using a combination of the shortwave infrared (SWIR) bands on the planned Greenhouse gases Observing SATellite (GOSAT-2) mission, 13CH4 can be measured with sufficient information content to a precision of between 0.7 and 1.2 ppbv from a single sounding (assuming a total column average value of 19.14 ppbv), which can then be reduced to the target precision through spatial and temporal averaging techniques. We therefore suggest that GOSAT-2 can be used to differentiate between methane source types. We find that large unconstrained covariance matrices are required in order to achieve sufficient information content, while the solar zenith angle has limited impact on the information content.

  6. PathNet: a tool for pathway analysis using topological information

    Directory of Open Access Journals (Sweden)

    Dutta Bhaskar

    2012-09-01

    Full Text Available Abstract Background Identification of canonical pathways through enrichment of differentially expressed genes in a given pathway is a widely used method for interpreting gene lists generated from high-throughput experimental studies. However, most algorithms treat pathways as sets of genes, disregarding any inter- and intra-pathway connectivity information, and do not provide insights beyond identifying lists of pathways. Results We developed an algorithm (PathNet that utilizes the connectivity information in canonical pathway descriptions to help identify study-relevant pathways and characterize non-obvious dependencies and connections among pathways using gene expression data. PathNet considers both the differential expression of genes and their pathway neighbors to strengthen the evidence that a pathway is implicated in the biological conditions characterizing the experiment. As an adjunct to this analysis, PathNet uses the connectivity of the differentially expressed genes among all pathways to score pathway contextual associations and statistically identify biological relations among pathways. In this study, we used PathNet to identify biologically relevant results in two Alzheimer’s disease microarray datasets, and compared its performance with existing methods. Importantly, PathNet identified de-regulation of the ubiquitin-mediated proteolysis pathway as an important component in Alzheimer’s disease progression, despite the absence of this pathway in the standard enrichment analyses. Conclusions PathNet is a novel method for identifying enrichment and association between canonical pathways in the context of gene expression data. It takes into account topological information present in pathways to reveal biological information. PathNet is available as an R workspace image from http://www.bhsai.org/downloads/pathnet/.
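    For contrast with PathNet's topology-aware method, the conventional set-based enrichment it improves on can be sketched as a hypergeometric test on gene counts. All counts below are hypothetical, and this is not PathNet's own algorithm.

```python
from math import comb

def hypergeom_sf(k, N, K, n):
    """P(X >= k) when n genes are drawn from N total genes,
    K of which belong to the pathway (hypergeometric tail)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical experiment: 800 differentially expressed (DE) genes out
# of 20,000 measured; 18 of a 150-gene pathway are DE (6 expected).
N, K, n, k = 20000, 150, 800, 18
p_value = hypergeom_sf(k, N, K, n)
print(f"enrichment p-value = {p_value:.3g}")
```

A small p-value flags the pathway as enriched; PathNet's contribution is to additionally weigh each gene's pathway neighbors, which a plain set-based test like this ignores.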

  7. Concept analysis of perspective-taking: meeting informal caregiver needs for communication competence and accurate perception.

    Science.gov (United States)

    Lobchuk, Michelle M

    2006-05-01

    This paper is a report of an analysis of perspective-taking as presented in the nursing and psychological literature between 1972 and 2004. Little is known about the caregiving processes that drive communication competence in patient and informal caregiver relationships. Evidence to date suggests that the empathic perspective-taking process plays a key role in promoting communication competence, perceptual accuracy, and an enhanced ability by caregivers to meet patients' needs arising in illness. Perspective-taking is a concept that has been explored extensively in health and social psychology, but not in the nursing literature. Guided by Morse's typology of attributes and rules of relation, the concept of perspective-taking is explored as it is presented in nursing and social psychology literature, and in accordance with Davis's empathy model. Extant research and theory suggest that perspective-taking is an interpersonal empathic process involving a conscious effort to differentiate one's view from the view of another, which can bring caregivers' viewpoints into closer alignment with patients' viewpoints. Preliminary evidence suggests that observers might achieve perceptual accuracy about patients' illness experiences if they are prompted to imagine how patients perceive the situation and how they feel as a result. Research also needs to analyse the characteristics of the patient, the informal caregiver, and the illness situation in order to comprehend more fully which caregiver dyads need assistance with perspective-taking to optimize their skill in providing sufficient patient care. The current emphasis for empirical research in caregiving is to uncover the underlying caregiving processes at work in pre-existing patient and informal caregiver relationships. Once further evidence corroborates the linkage between the perspective-taking process and perceptual accuracy, and identifies the factors that moderate this linkage, evidence-based interventions can be designed and tested.

  8. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Science.gov (United States)

    Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance community as a tool for optimizing the biosurveillance enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
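    The simplest form of Multi-Attribute Utility Theory scoring is an additive weighted sum of normalized single-attribute utilities. The sketch below assumes hypothetical criteria, weights, and data streams; they are not the paper's actual attributes.

```python
# Additive multi-attribute utility for ranking candidate biosurveillance
# data streams. All criteria, weights, and scores are hypothetical.
weights = {"timeliness": 0.4, "coverage": 0.35, "cost": 0.25}

# Normalized (0-1) single-attribute utilities per data stream.
streams = {
    "clinic_reports": {"timeliness": 0.6, "coverage": 0.9, "cost": 0.7},
    "news_scraping":  {"timeliness": 0.9, "coverage": 0.5, "cost": 0.9},
}

def utility(scores):
    """Weighted sum of attribute utilities (weights sum to 1)."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(streams, key=lambda s: utility(streams[s]), reverse=True)
for s in ranked:
    print(f"{s}: {utility(streams[s]):.3f}")
```

The additive form assumes the attributes are preferentially independent; richer MAUT variants relax that assumption.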

  9. Selecting essential information for biosurveillance--a multi-criteria decision analysis.

    Directory of Open Access Journals (Sweden)

    Nicholas Generous

    Full Text Available The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretic approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global biosurveillance community as a tool for optimizing the biosurveillance enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.

  10. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    1999-01-01

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a $52 million group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary.
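    The general idea of pricing information before collecting it can be illustrated with the expected value of perfect information (EVPI), a simpler relative of the Bayesian option ranking used above. The payoff table below is hypothetical and unrelated to the actual Yucca Flat options.

```python
import numpy as np

# EVPI: how much better off is a decision-maker who learns the true
# state of nature before choosing? Payoffs are hypothetical utilities.
# Rows: decision alternatives; columns: equally likely states of nature.
payoff = np.array([
    [10.0, 2.0],   # act as if the contaminant boundary is small
    [ 4.0, 8.0],   # act as if the contaminant boundary is large
])
p_state = np.array([0.5, 0.5])

best_without_info = max(payoff @ p_state)               # commit before learning the state
best_with_info = (payoff.max(axis=0) * p_state).sum()   # choose after learning the state
evpi = best_with_info - best_without_info
print(f"EVPI = {evpi:.2f}")
```

EVPI is an upper bound: no characterization activity, however expensive, is worth more than this amount under the stated payoffs.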

  11. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    IT Corporation Las Vegas

    1999-11-19

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a $52 million group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary.

  12. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    Science.gov (United States)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. Building on these techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis that identifies distributed loads from measured strain information. The inverse analysis technique introduced here calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct inverse identification of the load distributions. We take two approaches to building the inverse system between loads and strains: the first uses structural models and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
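    The structural-model approach can be sketched as a linear inverse problem: strains are a linear map of distributed nodal loads, and the loads are recovered by regularized least squares. This is a hedged illustration with a random sensitivity matrix; in practice the matrix would come from the finite element model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_strain, n_load = 40, 10  # hypothetical sensor and load-node counts

# Linear model: eps = A @ f. A is random here purely for illustration.
A = rng.normal(size=(n_strain, n_load))          # strain sensitivity matrix
f_true = np.sin(np.linspace(0, np.pi, n_load))   # smooth "aerodynamic" load shape
eps = A @ f_true + rng.normal(scale=0.01, size=n_strain)  # noisy strain readings

# Tikhonov-regularized least squares: (A^T A + lam*I) f = A^T eps
lam = 1e-3
f_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_load), A.T @ eps)

rel_err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print(f"relative error = {rel_err:.3f}")
```

The regularization term stabilizes the inversion when the sensitivity matrix is ill-conditioned, which is typical when strain sensors are sparse relative to the load discretization.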

  13. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    International Nuclear Information System (INIS)

    Michelhaugh, R.D.

    2000-01-01

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. In recent years, advances in the development of geographic information systems (GISs) have resulted in increased demand from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful for calculating routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by, or have misinterpreted, the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route.

  14. Electronic tools for health information exchange: an evidence-based analysis.

    Science.gov (United States)

    2013-01-01

    As patients experience transitions in care, there is a need to share information between care providers in an accurate and timely manner. With the push towards electronic medical records and other electronic tools (eTools) (and away from paper-based health records) for health information exchange, there remains uncertainty around the impact of eTools as a form of communication. To examine the impact of eTools for health information exchange in the context of care coordination for individuals with chronic disease in the community. A literature search was performed on April 26, 2012, using OVID MEDLINE, OVID MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination database, for studies published until April 26, 2012 (no start date limit was applied). A systematic literature search was conducted, and meta-analysis conducted where appropriate. Outcomes of interest fell into 4 categories: health services utilization, disease-specific clinical outcomes, process-of-care indicators, and measures of efficiency. The quality of the evidence was assessed individually for each outcome. Expert panels were assembled for stakeholder engagement and contextualization. Eleven articles were identified (4 randomized controlled trials and 7 observational studies). There was moderate quality evidence of a reduction in hospitalizations, hospital length of stay, and emergency department visits following the implementation of an electronically generated laboratory report with recommendations based on clinical guidelines. The evidence showed no difference in disease-specific outcomes; there was no evidence of a positive impact on process-of-care indicators or measures of efficiency. A limited body of research specifically examined eTools for health information exchange in the population and setting of interest. This evidence included a

  15. ANALYSIS OF TRAIN SHEET IN THE INFORMATION SYSTEM OF JSC «UKRZALIZNYTSIA»: PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    S. M. Ovcharenko

    2016-04-01

    Full Text Available Purpose. The train sheet analysis (TSA) system in the information system of JSC «Ukrzaliznytsia» covers passenger and suburban trains and has considerable potential, so it is necessary to establish the prospects for its development. Methodology. Attribution of train delays to responsible departments and causes should be carried out at every station and span where such delays took place. This requires recording deviations of the infrastructure condition from normal, as well as other adverse factors. In the freight transportation sector, analysis of the train schedule alone is insufficient, since it does not account for deviations from delivery terms; delivery schedules must therefore also be analyzed. The basis for monitoring cargo delivery is the method of control time points (CTP) for technological operations performed with cargo at railway stations. On the basis of the CTP, values for assessing the quality of the transport process can be calculated: the performance level of the cargo delivery schedule and the coefficient of ahead-of-schedule/delayed delivery. Findings. The article proposes to develop the TSA system through on-line input and display of train delay causes by transportation service employees, expansion of the statistical databases, processing of the recorded delay causes in the train sheet analysis of freight trains, and quality assessment of delivery schedule fulfillment. It is also appropriate, before new operator companies appear, to amend instruction TSCHU-TSD-0002 by adding to the list of departments to which delayed trains are attributed the entry «The fault of operator companies» and the corresponding causes of delays. Originality. The scheme of automated TSA in the information system of JSC «Ukrzaliznytsia» was improved. The author proposes to determine the cargo delivery quality on the certain polygon using the

  16. Correlations between MRI and Information Processing Speed in MS: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    S. M. Rao

    2014-01-01

    Full Text Available Objectives. To examine relationships between conventional MRI measures and the paced auditory serial addition test (PASAT) and symbol digit modalities test (SDMT). Methods. A systematic literature review was conducted. Included studies had ≥30 multiple sclerosis (MS) patients, administered the SDMT or PASAT, and measured T2LV or brain atrophy. Meta-analysis of MRI/information processing speed (IPS) correlations, analysis of MRI/IPS significance tests to account for reporting bias, and binomial testing to detect trends when comparing correlation strengths of SDMT versus PASAT and T2LV versus atrophy were conducted. Results. The 39 studies identified frequently reported only significant correlations, suggesting reporting bias. Direct meta-analysis was only feasible for correlations between SDMT and T2LV (r=-0.45, P<0.001) and atrophy in patients with mixed-MS subtypes (r=-0.54, P<0.001). Familywise Holm-Bonferroni testing found that selective reporting was not the source of at least half of significant results reported. Binomial tests (P=0.006) favored SDMT over PASAT in strength of MRI correlations. Conclusions. A moderate-to-strong correlation exists between impaired IPS and MRI in mixed MS populations. Correlations with MRI were stronger for SDMT than for PASAT. Neither heterogeneity among populations nor reporting bias appeared to be responsible for these findings.
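    Pooling correlations across studies, as in the meta-analysis above, is commonly done via Fisher's z transform with inverse-variance (n − 3) weights. The sketch below uses hypothetical study values, not the reviewed studies' data.

```python
import math

# Fixed-effect meta-analysis of correlation coefficients.
# Each study contributes (r, n); weights are n - 3, the inverse
# variance of Fisher's z. Values are hypothetical.
studies = [(-0.45, 60), (-0.50, 45), (-0.38, 80)]

num = sum((n - 3) * math.atanh(r) for r, n in studies)  # weighted sum of z
den = sum(n - 3 for _, n in studies)                    # total weight
z_pooled = num / den
r_pooled = math.tanh(z_pooled)                          # back-transform to r

print(f"pooled r = {r_pooled:.3f}")
```

`atanh` is exactly Fisher's z = ½ ln((1+r)/(1−r)); working on the z scale makes the study estimates approximately normal with known variance, which is what justifies the weighted average.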

  17. Origins of modern data analysis linked to the beginnings and early development of computer science and information engineering

    OpenAIRE

    Murtagh, F.

    2008-01-01

    The history of data analysis addressed here is underpinned by two themes: tabular data analysis, and the analysis of collected heterogeneous data. "Exploratory data analysis" is taken as the heuristic approach that begins with data and information and seeks underlying explanation for what is observed or measured. I also cover some of the evolving context of research and applications, including scholarly publishing, technology transfer and the economic relationship of the u...

  18. How Did the Information Flow in the #AlphaGo Hashtag Network? A Social Network Analysis of the Large-Scale Information Network on Twitter.

    Science.gov (United States)

    Kim, Jinyoung

    2017-12-01

    As it becomes common for Internet users to use hashtags when posting and searching information on social media, it is important to understand who builds a hashtag network and how information is circulated within the network. This article focused on unlocking the potential of the #AlphaGo hashtag network by addressing the following questions. First, the current study examined whether traditional opinion leadership (i.e., the influentials hypothesis) or grassroot participation by the public (i.e., the interpersonal hypothesis) drove dissemination of information in the hashtag network. Second, several unique patterns of information distribution by key users were identified. Finally, the association between attributes of key users who exerted great influence on information distribution (i.e., the number of followers and follows) and their central status in the network was tested. To answer the proffered research questions, a social network analysis was conducted using a large-scale hashtag network data set from Twitter (n = 21,870). The results showed that the leading actors in the network were actively receiving information from their followers rather than serving as intermediaries between the original information sources and the public. Moreover, the leading actors played several roles (i.e., conversation starters, influencers, and active engagers) in the network. Furthermore, the number of their follows and followers were significantly associated with their central status in the hashtag network. Based on the results, the current research explained how the information was exchanged in the hashtag network by proposing the reciprocal model of information flow.
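    The notion of a leading actor who receives information from many followers can be sketched as in-degree counting on a directed edge list. The users and edges below are hypothetical, and this is a toy stand-in for the full social network analysis, not the study's method.

```python
from collections import Counter

# Directed interaction edges (source user -> target user), e.g. a
# mention or reply directed at the target. All names are hypothetical.
edges = [
    ("alice", "hub"), ("bob", "hub"), ("carol", "hub"),
    ("hub", "dave"), ("eve", "bob"),
]

# In-degree centrality proxy: how many interactions each user receives.
in_degree = Counter(dst for _, dst in edges)
leader, received = in_degree.most_common(1)[0]
print(f"most central actor: {leader} (in-degree {received})")
```

In a full analysis one would normalize by network size and also inspect out-degree and betweenness to separate the "conversation starter", "influencer", and "active engager" roles the article describes.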

  19. Carbon Dioxide Information Analysis Center and World Data Center - A for atmospheric trace gases. Fiscal year 1996, annual report

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Boden, T.A.; Jones, S.B. [and others]

    1997-02-01

    Fiscal year 1996 was especially productive for the Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL). This report describes publications and statistical data from the CDIAC.

  20. Visual MRI: merging information visualization and non-parametric clustering techniques for MRI dataset analysis.

    Science.gov (United States)

    Castellani, Umberto; Cristani, Marco; Combi, Carlo; Murino, Vittorio; Sbarbati, Andrea; Marzola, Pasquina

    2008-11-01

    This paper presents Visual MRI, an innovative tool for the analysis of tumoral tissues in magnetic resonance imaging (MRI). The main goal of the analysis is to separate each magnetic resonance image into meaningful clusters, highlighting zones that are most probably related to cancer evolution. Such non-invasive analysis serves to guide novel cancer treatments, resulting in a less destabilizing and more effective type of therapy than chemotherapy-based ones. Visual MRI brings two advancements: first, it integrates effective information visualization (IV) techniques into a clustering framework that separates each MRI image into a set of informative clusters; second, the clustering framework itself derives from a recently rediscovered non-parametric grouping strategy, the mean shift. The proposed methodology merges visualization methods and data mining techniques, providing a computational framework that allows the physician to move effectively from the MRI image to the images displaying the derived parameter space. The novel data mining technique proposed here is an unsupervised non-parametric clustering algorithm, derived from the mean shift paradigm and called MRI-mean shift. The main underlying idea of this approach is that the parameter space is regarded as an empirical probability density function to estimate: the separate modes and their attraction basins represent distinct clusters. The mean shift algorithm requires sensitivity threshold values to be set, and different values can lead to highly different segmentation results. Usually, these values are set by hand. Here, with the MRI-mean shift algorithm, we propose a strategy based on a structured optimality criterion that effectively addresses this issue, resulting in a completely unsupervised clustering framework. A linked brushing visualization technique is then used for representing clusters on the parameter space and on the MRI image.
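    The mean shift idea underlying the MRI-mean shift algorithm can be sketched in a few lines: each point repeatedly moves to the mean of the data within a bandwidth, so points converge to the modes of the empirical density. This is a minimal flat-kernel sketch on synthetic 1-D data standing in for voxel parameter values, not the paper's optimality-criterion variant.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two synthetic "tissue" populations in a 1-D parameter space.
data = np.concatenate([rng.normal(0.0, 0.3, 100), rng.normal(5.0, 0.3, 100)])

def mean_shift_modes(x, bandwidth=1.5, iters=30):
    """Flat-kernel mean shift: return each point's converged position.
    Points sharing a density mode end up at (numerically) the same value."""
    pts = x.copy()
    for _ in range(iters):
        pts = np.array([x[np.abs(x - p) < bandwidth].mean() for p in pts])
    return pts

modes = sorted(set(mean_shift_modes(data).round(3)))
print(f"found {len(modes)} modes")
```

The bandwidth plays the role of the sensitivity threshold the abstract mentions: too small and every point becomes its own mode, too large and distinct clusters merge, which is exactly why the paper replaces hand-tuning with an optimality criterion.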