WorldWideScience

Sample records for analysis geographic information

  1. Transportation Routing Analysis Geographic Information System (WebTRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Michelhaugh, R.D.

    2000-04-20

    In the early 1980s, Oak Ridge National Laboratory (ORNL) developed two transportation routing models: HIGHWAY, which predicts truck transportation routes, and INTERLINE, which predicts rail transportation routes. Both of these models have been used by the U.S. Department of Energy (DOE) community for a variety of routing needs over the years. One of the primary uses of the models has been to determine population-density information, which is used as input for risk assessment with the RADTRAN model, which is available on the TRANSNET computer system. During the recent years, advances in the development of geographic information systems (GISs) have resulted in increased demands from the user community for a GIS version of the ORNL routing models. In April 1994, the DOE Transportation Management Division (EM-261) held a Baseline Requirements Assessment Session with transportation routing experts and users of the HIGHWAY and INTERLINE models. As a result of the session, the development of a new GIS routing model, Transportation Routing Analysis GIS (TRAGIS), was initiated. TRAGIS is a user-friendly, GIS-based transportation and analysis computer model. The older HIGHWAY and INTERLINE models are useful to calculate routes, but they cannot display a graphic of the calculated route. Consequently, many users have experienced difficulty determining the proper node for facilities and have been confused by or have misinterpreted the text-based listing from the older routing models. Some of the primary reasons for the development of TRAGIS are (a) to improve the ease of selecting locations for routing, (b) to graphically display the calculated route, and (c) to provide for additional geographic analysis of the route.
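
    Route prediction of the kind HIGHWAY and INTERLINE perform can be viewed as a shortest-path search over a transportation network. The sketch below is illustrative only: the record does not describe the actual routing algorithm or network data, and the node names and mileages are invented.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over {node: [(neighbor, miles), ...]}."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Walk predecessor links back from the goal to recover the route.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[goal]

# Toy one-way road network; mileages are invented for illustration.
roads = {
    "Oak Ridge": [("Knoxville", 25), ("Nashville", 260)],
    "Knoxville": [("Nashville", 180), ("Chattanooga", 112)],
    "Chattanooga": [("Nashville", 134)],
}
route, miles = shortest_route(roads, "Oak Ridge", "Nashville")
```

    TRAGIS adds a GIS layer on top of such a calculation, so a route like the one returned here can be drawn on a map rather than reported only as a text listing.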

  2. HIRENASD analysis Information Package

    Data.gov (United States)

    National Aeronautics and Space Administration — Updated November 2, 2011. Contains summary information and analysis condition details for the Aeroelastic Prediction Workshop. Information plotted in this package is...

  3. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information-theoretical, entropy-based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in...
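
    The core idea — replacing linear correlation with mutual information as the measure of association — can be illustrated with a minimal sketch. The paper uses a fast kernel density estimator; the histogram-based MI estimate below is a simpler stand-in, and all data are simulated.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    # Binned (histogram) estimate of mutual information in nats;
    # a stand-in for the paper's kernel density estimator.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=20000)
y = x ** 2 + 0.1 * rng.normal(size=20000)   # strong nonlinear dependence
z = rng.normal(size=20000)                  # independent of x

mi_dep = mutual_information(x, y)
mi_ind = mutual_information(x, z)
corr = np.corrcoef(x, y)[0, 1]
```

    Here the linear correlation between x and y is near zero even though y is essentially a function of x, while the MI estimate for the dependent pair is far above that of the independent pair — the motivation for replacing correlation with mutual information.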

  4. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Contents: Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  5. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  6. Information Flow Analysis for VHDL

    DEFF Research Database (Denmark)

    Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis

    2005-01-01

    We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire information flow through the VHDL program. The result of the analysis is presented as a non-transitive directed graph that connects those nodes (representing either variables or signals) where an information flow might occur. We compare our approach to that of Kemmerer and conclude that our approach yields...

  7. The informatively-factor analysis as new information technology

    OpenAIRE

    Ali, Al-Ammori; National Transport University

    2010-01-01

    The informatively-factor analysis of flight processes is offered as a new and promising approach that takes their multifactorial nature into account. The historical origins and the development of factor analysis and information theory are examined, and the value of informatively-factor analysis for ensuring the safety and efficiency of flight is underlined.

  8. Information analysis and international safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Pilat, Joseph F.; Budlong-Sylvester, K. W. (Kory W.); Tape, J. W. (James W.)

    2004-01-01

    After the first Gulf War, it was recognized that one of the key weaknesses of the international safeguards system was that there was no systematic attempt by the International Atomic Energy Agency (IAEA) to analyze all available information about States' nuclear programs to determine whether these programs were consistent with nonproliferation obligations. The IAEA, as part of its effort to redesign the international safeguards system, is looking closely at the issue of information review and evaluation. The application of information analysis (IA) techniques to the international nuclear safeguards system has the potential to revolutionize the form and practice of safeguards. Assessing the possibilities of IA for the IAEA, and in particular those embodied in concepts of information-driven safeguards, requires an understanding of IA, the limits on its effectiveness and the requirements placed on such analyses in a variety of safeguards contexts. The Australian Safeguards and Nonproliferation Office (ASNO) and the United States Department of Energy (DOE) agreed in July 2002 to undertake a joint study of 'information-driven safeguards' under a long-standing cooperative arrangement. It was decided that a broad range of ideas should be considered, and that the study would not be intended to be and would not be an elaboration of either US or Australian governmental positions. This paper reports some findings of Phase 1 of this collaborative effort and offers some initial thinking on the part of the authors on the outstanding issues to be addressed in Phase 2. An effort will be made to explore, through case studies, alternative strategies for utilizing IA by the IAEA that provide the same or increased confidence in safeguards conclusions while allowing safeguards resource allocation to be determined not only by the types and quantities of nuclear material and facilities in a State but also by other objective factors.

  9. Information analysis and international safeguards

    International Nuclear Information System (INIS)

    After the first Gulf War, it was recognized that one of the key weaknesses of the international safeguards system was that there was no systematic attempt by the International Atomic Energy Agency (IAEA) to analyze all available information about States' nuclear programs to determine whether these programs were consistent with nonproliferation obligations. The IAEA, as part of its effort to redesign the international safeguards system, is looking closely at the issue of information review and evaluation. The application of information analysis (IA) techniques to the international nuclear safeguards system has the potential to revolutionize the form and practice of safeguards. Assessing the possibilities of IA for the IAEA, and in particular those embodied in concepts of information-driven safeguards, requires an understanding of IA, the limits on its effectiveness and the requirements placed on such analyses in a variety of safeguards contexts. The Australian Safeguards and Nonproliferation Office (ASNO) and the United States Department of Energy (DOE) agreed in July 2002 to undertake a joint study of 'information-driven safeguards' under a long-standing cooperative arrangement. It was decided that a broad range of ideas should be considered, and that the study would not be intended to be and would not be an elaboration of either US or Australian governmental positions. This paper reports some findings of Phase 1 of this collaborative effort and offers some initial thinking on the part of the authors on the outstanding issues to be addressed in Phase 2. An effort will be made to explore, through case studies, alternative strategies for utilizing IA by the IAEA that provide the same or increased confidence in safeguards conclusions while allowing safeguards resource allocation to be determined not only by the types and quantities of nuclear material and facilities in a State but also by other objective factors.

  10. Marketing Information: A Competitive Analysis

    OpenAIRE

    Miklos Sarvary; Philip M. Parker

    1997-01-01

    Selling information that is later used in decision making constitutes an increasingly important business in modern economies (Jensen [Jensen, Fred O. 1991. Information services. Congram, Friedman, eds. , Chapter 22. AMA-COM, New York, 423–443.]). Information is sold in a large variety of forms: industry reports, consulting services, database access, and/or professional opinions given by medical, engineering, accounting/financial, and legal professionals, among others. This paper is the fir...

  11. Using Information from Operating Experience to Inform Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bruce P. Hallbert; David I. Gertman; Julie Marble; Erasmia Lois; Nathan Siu

    2004-06-01

    This paper reports on efforts being sponsored by the U.S. NRC and performed by INEEL to develop a technical basis and perform work to extract information from sources for use in HRA. The objectives of this work are to: 1) develop a method for conducting risk-informed event analysis of human performance information that stems from operating experience at nuclear power plants and for compiling and documenting the results in a structured manner; 2) provide information from these analyses for use in risk-informed and performance-based regulatory activities; 3) create methods for information extraction and a repository for this information that, likewise, support HRA methods and their applications.

  12. Geographic Information System Data Analysis

    Science.gov (United States)

    Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone; Thorpe, Karina

    1995-01-01

    Data was collected in order to further NASA Langley Research Center's Geographic Information System (GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data was corrected through verification, resulting in more accurate databases. In addition, Global Positioning System (GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.

  13. Empirical Analysis of Informal Institutions

    OpenAIRE

    Park, Sang-Min

    2013-01-01

    The New Institutional Economics has established itself as a widely accepted extension to the standard neoclassical paradigm. Here, institutions are defined as commonly known rules that structure recurring interaction situations and the corresponding sanctioning mechanism. Formal institutions describe rules with a sanction mechanism that is organized by the state. Informal institutions describe rules with a sanction mechanism that ...

  14. Analysis of information risk management methods

    OpenAIRE

    Zudin, Rodion

    2014-01-01

    Zudin, Rodion. Analysis of information risk management methods. Jyväskylä: University of Jyväskylä, 2014, 33 p. Information Systems, Bachelor’s Thesis. Supervisor: Siponen, Mikko. This study gives a brief overview of the information risk management field, introducing the field's shared terminology and methodology through a literature overview in the first chapter. The second chapter examines and compares two information risk management methodologies propo...

  15. Information systems and organizational analysis

    OpenAIRE

    Stafseth, Ådne; Braadland, Trond Reidar

    2004-01-01

    The paper presents different approaches to the field of information systems (IS) research and organization theory. In the field of IS, much of what is written deals with practical frameworks for IS development and use, but there is also some theory building and causal models. In organization theory, there are several well known paradigms like rational organization models, institutional theory and transaction cost theory. In typical organization theory, little is said of the ...

  16. Social Network Analysis and informal trade

    DEFF Research Database (Denmark)

    Walther, Olivier

    The objective of this article is to show how a formal approach to social networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate ... approaches. The paper finally highlights some of the applications of social network analysis and their implications for trade policies.

  17. Generalized Full-Information Item Bifactor Analysis

    Science.gov (United States)

    Cai, Li; Yang, Ji Seung; Hansen, Mark

    2011-01-01

    Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single-group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of…

  18. INFORMATION SYSTEM OF THE FINANCIAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    MIRELA MONEA

    2013-12-01

    Financial analysis provides the information necessary for decision making and helps both external and internal users of this information. The results of financial analysis depend on the quality, accuracy, relevance and effectiveness of the information collected and processed. Essential sources of information for financial analysis are the financial statements, which are considered the raw material of financial analysis. One of the financial statements, the balance sheet, provides information about assets, liabilities, equity, liquidity, solvency, risk and financial flexibility. The profit and loss account is a synthesis accounting document, part of the financial statements, reporting enterprise financial performance during a specified accounting period; it summarizes all revenues earned and expenses incurred in the period and reports the result.
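
    As a small illustration of how balance-sheet items feed the liquidity and solvency indicators mentioned above, consider a toy balance sheet (all figures invented, in thousands of currency units):

```python
# Invented balance-sheet figures for illustration only.
balance_sheet = {
    "current_assets": 420.0,
    "fixed_assets": 980.0,
    "current_liabilities": 300.0,
    "long_term_debt": 500.0,
    "equity": 600.0,
}

total_assets = balance_sheet["current_assets"] + balance_sheet["fixed_assets"]
total_liabilities = (balance_sheet["current_liabilities"]
                     + balance_sheet["long_term_debt"])

# Liquidity: ability to cover short-term obligations.
current_ratio = (balance_sheet["current_assets"]
                 / balance_sheet["current_liabilities"])

# Solvency: reliance on debt relative to owners' equity.
debt_to_equity = total_liabilities / balance_sheet["equity"]
```

    With these figures the current ratio is 1.4 and debt-to-equity is about 1.33, and the accounting identity assets = liabilities + equity holds (1400 = 800 + 600).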

  19. ANALYSIS APPROACHES TO EVALUATION OF INFORMATION PROTECTION

    Directory of Open Access Journals (Sweden)

    Zyuzin A. S.

    2015-03-01

    The article addresses the problem of information systems' security assessment and the importance of obtaining objective, quantitative assessment results. The author proposes creating a complex information security system through a systems approach applied at each stage of the information system's life cycle. On the basis of this approach, the author formulates a general scheme for assessing the information security of an information system, together with principles for choosing the assessment method. Existing methods of quantitative assessment based on object-oriented methods of systems analysis are examined, along with the objectivity of the estimates they produce. This analysis reveals serious shortcomings in the modern techniques used for assessing information systems' security, which motivates the creation of a scientific and methodical apparatus that increases the objectivity and completeness of information security assessment by formalizing expert data. The applicability of this approach to rapidly obtaining a quantitative information security assessment under changing threat dynamics and throughout the functioning and development of an information system is considered. The problem of automated information systems' security assessment is stated, and a general technique for assessing the means of information protection in systems of this type is formulated.

  20. Informed spectral analysis: audio signal parameter estimation using side information

    Science.gov (United States)

    Fourer, Dominique; Marchand, Sylvain

    2013-12-01

    Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach which consists in directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by the estimation from the signal and may require a larger bitrate and a loss of compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling which is a well-known model with practical applications and where theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
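
    The kind of sinusoidal-model parameter estimation that the side information is meant to assist can be sketched, under simplifying assumptions, as an FFT peak search. The signal parameters below are invented, and this is not the authors' estimator.

```python
import numpy as np

# Synthesize one noisy sinusoid (invented parameters).
fs, n = 8000.0, 4096
t = np.arange(n) / fs
f0, a0 = 440.0, 0.8
rng = np.random.default_rng(1)
x = a0 * np.sin(2 * np.pi * f0 * t) + 0.05 * rng.normal(size=n)

# Estimate frequency and amplitude from the windowed FFT peak.
win = np.hanning(n)
spec = np.abs(np.fft.rfft(x * win))
k = int(np.argmax(spec))
f_est = k * fs / n                 # frequency of the peak bin
a_est = 2 * spec[k] / win.sum()    # amplitude, ignoring scalloping loss
```

    The error of f_est is bounded by the bin spacing fs/n; the informed approach described in the record aims to push precision beyond such estimator limits by embedding coded side information in the signal itself.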

  1. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali;

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised methods of data mining to explore the facts regarding the crimes of an area of interest. The analysis is based on well known clustering and association techniques. The results show that the...

  2. Requirements Analysis for Information-Intensive Systems

    Science.gov (United States)

    Callender, E. D.; Hartsough, C.; Morris, R. V.; Yamamoto, Y.

    1986-01-01

    Report discusses role of requirements analysis in development of information-intensive systems. System examined from variety of human viewpoints during design, development, and implementation. Such examination, called requirements analysis, ensures system simultaneously meets number of distinct but interacting needs. Viewpoints defined and integrated to help attain objectives.

  3. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates analysis of variables with different genesis and therefore different statistical distributions and different modalities. As a proof of concept we give a toy example. We also give an example with one (weather radar based) variable in the one set and eight spectral bands of optical satellite data in the...

  4. Information analysis in the strengthened safeguards system

    International Nuclear Information System (INIS)

    Full text: The changing political framework of the early 1990s revealed the limitations of traditional Safeguards and led the Member States to strengthen Safeguards. This was done under a two-pronged approach: while Programme 93+2 developed the model Additional Protocol (INFCIRC/540), which would require an expanded declaration, complementary access, and broader environmental sampling, the Safeguards measures in force under the existing comprehensive Safeguards Agreements were strengthened through voluntary reporting, environmental sampling, and enhanced information analysis. The purpose of both of these initiatives was to broaden the mission of the IAEA to include the detection of undeclared nuclear material and activities. The effect was to increase dramatically the amount of information available to the Agency and the challenge was to develop a method and infrastructure to evaluate all this information. Since 1992 the Agency has developed a state evaluation process for collecting, organizing, and analyzing new types of information (open source information, the results of environmental sampling, commercial satellite imagery) and integrating this information with the familiar state declared and verification information traditionally collected and evaluated by the Agency. This address will describe the components and challenges of information analysis under strengthened Safeguards, the heart of the state evaluation process. (author)

  5. Information analysis in the strengthened safeguards system

    International Nuclear Information System (INIS)

    Information Analysis is the heart of Strengthened Safeguards. The strengthened safeguards system is to detect the diversion of declared nuclear material and the presence of undeclared nuclear material and activities in a State with Comprehensive Safeguards agreements in force. In other words, under Strengthened Safeguards the Department of Safeguards must verify both the correctness and the completeness of State declarations concerning their nuclear inventories, facilities and activities. The changing political framework of the early 1990s, including such watershed events as the dissolution of the former Soviet Union, the information unearthed about the Iraqi weapon program, the declaration by South Africa of its past program, the resistance of the DPRK to verifying the completeness of its declarations, and the indefinite extension of the NPT, caused the IAEA Member States to demand stronger Safeguards and to accept the correspondingly more intrusive inspections and more complete declarations requirements. Two tracks were followed to strengthen safeguards under the 93+2 Programme: additional measures were proposed under the existing authority of comprehensive safeguards agreements and still other measures were proposed which required the additional authority captured in a model Additional Protocol (INFCIRC/540). Both types of additional measures required the Department to collect and process more information. Both types required new methods for integrating and evaluating this information.
The new measures adopted under existing Comprehensive Safeguards Agreements included: voluntary reporting by States of the export of specified nuclear-related equipment and non-nuclear materials; collection and processing of environmental samples collected during inspections; remote monitoring of facilities and the use of unannounced inspections; acquisition of information from other Agency Departments; collection of publicly available, nuclear-related textual information from

  6. Safeguards information analysis: Progress, challenges and solutions

    International Nuclear Information System (INIS)

    While the IAEA's authority to verify the correctness and completeness of a State's declarations under its comprehensive safeguards agreement derives from the agreement itself, it is only with the provisions for broader access to information and locations available under an additional protocol that the IAEA is able to draw the safeguards conclusion regarding the absence of undeclared nuclear material and activities in the State. Under the State level concept, all relevant information about a State's nuclear activities is assessed to obtain as complete a picture as possible of the State's current and planned nuclear programme. The array of sources for information evaluation is both broad and diverse. Mainly, it encompasses information provided by States, information obtained from open sources, commercial satellite imagery, and inspectors' measurements. All the information is checked for internal consistency and consistency with information gathered during inspections and visits in the field. The 'information driven' approach has required an expansion of knowledge, expertise, information and analytical/evaluation skills. As the IAEA's corporate knowledge in exploiting new types of information has increased, so too has its capability for detecting proliferation indicators. In all areas concerned (open source, satellite imagery, consistency and trend analysis, nuclear trade analysis, environmental sampling, as well as State-declared information) remarkable improvements have been made with regard to methodologies, tools, expertise and skills. Although each of these areas has proven invaluable for the detection of certain undeclared activities/material, it is obvious that the strength of these methods is in their integration. Further qualitative and quantitative improvements will require the acquisition of additional specialist expertise, knowledge and technologies. The combination of all (new) technologies will be important for enhancing the IAEA's ability to detect

  7. Information Flow Analysis of Interactome Networks

    OpenAIRE

    Missiuro, Patrycja Vasilyev; Liu, Kesheng; Zou, Lihua; Zhao, Guoyan; Ge, Hui; Ross, Brian C.; Liu, Jun

    2009-01-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins centr...

  8. Risk Analysis of Accounting Information System Infrastructure

    OpenAIRE

    MIHALACHE, Arsenie-Samoil

    2011-01-01

    National economy and security are fully dependent on information technology and infrastructure. At the core of the information infrastructure that society relies on is the Internet, a system designed initially as a scientists’ forum for unclassified research. The use of communication networks and systems may lead to hazardous situations that generate undesirable effects such as communication system breakdowns, loss of data, or wrong decisions. The paper studies the risk analysis of...

  9. Social Network Analysis and informal trade

    OpenAIRE

    Walther, Olivier

    2015-01-01

    The objective of this article is to show how a formal approach to social networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources...

  10. Air Force geographic information and analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Henney, D.A.; Jansing, D.S.; Durfee, R.C.; Margle, S.M.; Till, L.E.

    1987-01-01

    A microcomputer-based geographic information and analysis system (GIAS) was developed to assist Air Force planners with environmental analysis, natural resources management, and facility and land-use planning. The system processes raster image data, topological data structures, and geometric or vector data similar to that produced by computer-aided design and drafting (CADD) systems, integrating the data where appropriate. Data types included Landsat imagery, scanned images of base maps, digitized point and chain features, topographic elevation data, USGS stream course data, highway networks, railroad networks, and land use/land cover information from USGS interpreted aerial photography. The system is also being developed to provide an integrated display and analysis capability with base maps and facility data bases prepared on CADD systems. 3 refs.

  11. Information flow analysis of interactome networks.

    Directory of Open Access Journals (Sweden)

    Patrycja Vasilyev Missiuro

    2009-04-01

    Recent studies of cellular networks have revealed modular organizations of genes and proteins. For example, in interactome networks, a module refers to a group of interacting proteins that form molecular complexes and/or biochemical pathways and together mediate a biological process. However, it is still poorly understood how biological information is transmitted between different modules. We have developed information flow analysis, a new computational approach that identifies proteins central to the transmission of biological information throughout the network. In the information flow analysis, we represent an interactome network as an electrical circuit, where interactions are modeled as resistors and proteins as interconnecting junctions. Construing the propagation of biological signals as flow of electrical current, our method calculates an information flow score for every protein. Unlike previous metrics of network centrality such as degree or betweenness that only consider topological features, our approach incorporates confidence scores of protein-protein interactions and automatically considers all possible paths in a network when evaluating the importance of each protein. We apply our method to the interactome networks of Saccharomyces cerevisiae and Caenorhabditis elegans. We find that the likelihood of observing lethality and pleiotropy when a protein is eliminated is positively correlated with the protein's information flow score. Even among proteins of low degree or low betweenness, high information scores serve as a strong predictor of loss-of-function lethality or pleiotropy. The correlation between information flow scores and phenotypes supports our hypothesis that the proteins of high information flow reside in central positions in interactome networks. We also show that the ranks of information flow scores are more consistent than those of betweenness when a large amount of noisy data is added to an interactome. Finally, we
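
    The circuit analogy can be sketched directly: interactions become resistors whose conductance equals the interaction confidence score, and solving the resulting graph-Laplacian system for node potentials yields a per-protein current flow. The toy network and confidence scores below are invented.

```python
import numpy as np

# Toy interactome: edge -> confidence score, used as conductance.
edges = {("A", "B"): 0.9, ("B", "C"): 0.8, ("A", "C"): 0.2,
         ("C", "D"): 0.9, ("B", "D"): 0.3}
nodes = sorted({n for e in edges for n in e})
idx = {n: i for i, n in enumerate(nodes)}
m = len(nodes)

# Graph Laplacian L = D - W.
L = np.zeros((m, m))
for (u, w), g in edges.items():
    i, j = idx[u], idx[w]
    L[i, j] -= g
    L[j, i] -= g
    L[i, i] += g
    L[j, j] += g

# Inject 1 unit of current at A (source), extract at D (sink),
# then solve L @ phi = b for the node potentials.
b = np.zeros(m)
b[idx["A"]], b[idx["D"]] = 1.0, -1.0
phi = np.linalg.pinv(L) @ b

# Current through a node = half the summed |current| on its edges
# (through-current enters and leaves each node once).
flow = dict.fromkeys(nodes, 0.0)
for (u, w), g in edges.items():
    cur = abs(g * (phi[idx[u]] - phi[idx[w]]))
    flow[u] += cur / 2
    flow[w] += cur / 2
```

    With these numbers both intermediate proteins carry substantial current and B scores higher than C, while the source and sink each carry exactly half the unit current, as Kirchhoff's current law requires.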

  12. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse for most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems

  13. Fisher information analysis in electrical impedance tomography

    International Nuclear Information System (INIS)

    This paper provides a quantitative analysis of the optimal accuracy and resolution in electrical impedance tomography (EIT) based on the Cramér–Rao lower bound. The imaging problem is characterized by the forward operator and its Jacobian. The Fisher information operator is defined for a deterministic parameter in a real Hilbert space and a stochastic measurement in a finite-dimensional complex Hilbert space with a Gaussian measure. The connection between the Fisher information and the singular value decomposition (SVD) based on the maximum likelihood (ML) criterion (the ML-based SVD) is established. It is shown that the eigenspaces of the Fisher information provide a suitable basis to quantify the trade-off between the accuracy and the resolution of the (nonlinear) inverse problem. It is also shown that the truncated ML-based pseudo-inverse is a suitable regularization strategy for a linearized problem, which exploits sufficient statistics for estimation within these subspaces. The statistical-based Cramér–Rao lower bound provides a complement to the deterministic upper bounds and the L-curve techniques that are employed with linearized inversion. To this end, electrical impedance tomography provides an interesting example where the eigenvalues of the SVD usually do not exhibit a very sharp cut-off, and a trade-off between the accuracy and the resolution may be of practical importance. A numerical study of a hypothetical EIT problem is described, including a statistical analysis of the model errors due to the linearization. (paper)
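The accuracy/resolution trade-off described in this record can be illustrated numerically: for a linearized Gaussian model y = Jx + e, the Fisher information eigenvalues are (s/σ)² for each singular value s of J, and the Cramér-Rao bound on each estimable mode is their inverse. The singular values and variance budget below are invented, chosen to mimic the smooth (not sharply cut-off) decay the abstract mentions for EIT.

```python
# Hypothetical singular values of a linearized EIT Jacobian, decaying
# smoothly rather than cutting off sharply (as noted in the abstract).
singular_values = [1.0, 0.4, 0.15, 0.05, 0.015, 0.004]
sigma = 0.01                                     # assumed noise std deviation

# Fisher information eigenvalue per mode, and the Cramer-Rao variance bound.
fisher = [(s / sigma) ** 2 for s in singular_values]
crb = [1.0 / f for f in fisher]                  # variance lower bound per mode

# Truncated ML-based pseudo-inverse as a regularization strategy: retain
# only the modes whose variance bound fits an (assumed) accuracy budget.
tolerance = 1e-2
k = sum(1 for v in crb if v <= tolerance)        # number of resolvable modes
```

Raising `tolerance` admits more modes (finer resolution, worse accuracy); lowering it does the opposite, which is the trade-off the eigenspaces of the Fisher information quantify.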

  14. Sustainability and information in urban system analysis

    International Nuclear Information System (INIS)

    In the present paper, a possible application of information theory for urban system analysis is shown. The ESM method proposed, based on Shannon's entropy analysis, is useful to evaluate different alternative measures of new energy saving technology transfer at different programming stages for consumption reduction and environmental impact control. A case study has been conducted in an urban area of Florence (Italy): the action/factor interaction entropy values can provide a scale of intervention priority and by comparing results obtained evaluating conditional entropy, ambiguity and redundancy, it is possible to identify the highest energy sustainable intervention in terms of higher or lower critical and risky action/factor combinations for the project being carried out. The ESM method proposed, if applied to different urban areas, can provide a rational criterion to compare complex innovative and sustainable technologies for irreversibility reduction and energy efficiency increase
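The entropy quantities named in this record (conditional entropy, ambiguity, redundancy) can be computed from an action/factor interaction table. The frequencies below are invented for illustration, in the spirit of the ESM method rather than reproducing its case study.

```python
import math

# Hypothetical action/factor interaction frequencies for an urban
# energy-saving programme (rows: actions, columns: factors F1..F3).
counts = [[4, 1, 0],
          [2, 3, 1],
          [0, 2, 5]]
total = sum(map(sum, counts))
p = [[c / total for c in row] for row in counts]

def H(probs):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

h_joint = H([q for row in p for q in row])       # H(action, factor)
h_action = H([sum(row) for row in p])            # marginal H(action)
h_factor = H([sum(col) for col in zip(*p)])      # marginal H(factor)
ambiguity = h_joint - h_action                   # conditional H(factor | action)
redundancy = 1 - h_joint / math.log2(9)          # vs. uniform over the 9 cells
```

Low ambiguity and high redundancy flag action/factor combinations whose outcome is predictable, which is the kind of signal the method uses to rank intervention priorities.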

  15. Analysis and countermeasures in information work of current nuclear power

    International Nuclear Information System (INIS)

    This paper analyzes the information requirements of current users of nuclear power information and proposes new ways to explore nuclear information services under the new situation. Nuclear information workers should be courageous in practice and creative, stay close to users and their demands, and give full play to the advantages of the nuclear power information network to carry out and improve nuclear information services. (author)

  16. 78 FR 38096 - Fatality Analysis Reporting System Information Collection

    Science.gov (United States)

    2013-06-25

    ... National Highway Traffic Safety Administration Fatality Analysis Reporting System Information Collection... Reporting System (FARS) is a major system that acquires national fatality information directly from existing...: Request for public comment on proposed collection of information. SUMMARY: Before a Federal agency...

  17. Astrophysical data analysis with information field theory

    Energy Technology Data Exchange (ETDEWEB)

    Enßlin, Torsten, E-mail: ensslin@mpa-garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  18. Astrophysical data analysis with information field theory

    International Nuclear Information System (INIS)

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented

  19. Astrophysical data analysis with information field theory

    CERN Document Server

    Enßlin, Torsten

    2014-01-01

    Non-parametric imaging and data analysis in astrophysics and cosmology can be addressed by information field theory (IFT), a means of Bayesian, data-based inference on spatially distributed signal fields. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. It exploits spatial correlations of the signal fields even for nonlinear and non-Gaussian signal inference problems. The alleviation of a perception threshold for recovering signals of unknown correlation structure by using IFT will be discussed in particular as well as a novel improvement on instrumental self-calibration schemes. IFT can be applied to many areas. Here, applications in cosmology (cosmic microwave background, large-scale structure) and astrophysics (galactic magnetism, radio interferometry) are presented.

  20. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, a lot of effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in the research. In particular, four advances in the research are given: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  1. Formal Concept Analysis for Information Retrieval

    CERN Document Server

    Qadi, Abderrahim El; Ennouary, Yassine

    2010-01-01

    In this paper we describe a mechanism to improve Information Retrieval (IR) on the web. The method is based on Formal Concept Analysis (FCA), which establishes semantic relations during queries and allows reorganizing, in the form of a lattice of concepts, the answers provided by a search engine. For IR we propose an incremental algorithm based on the Galois lattice. This algorithm allows a formal clustering of the data sources, and the results it returns are ranked by relevance. The control of relevance is exploited in clustering; we improved the results by using an ontology in the field of image processing and by reformulating user queries, which makes it possible to return more relevant documents.
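The Galois connection underlying an FCA lattice is two maps, extent and intent, whose round trip is a closure; a formal concept is a fixed point of that closure. The toy document/term context below is invented, purely to show the mechanics the record relies on.

```python
# Toy formal context: documents (objects) x index terms (attributes).
context = {
    "doc1": {"image", "segmentation"},
    "doc2": {"image", "retrieval"},
    "doc3": {"retrieval", "ontology"},
}

def intent(objs):
    """Attributes shared by all objects in objs (one arm of the Galois connection)."""
    sets = [context[o] for o in objs]
    return set.intersection(*sets) if sets else {a for s in context.values() for a in s}

def extent(attrs):
    """Objects possessing all attributes in attrs (the other arm)."""
    return {o for o, s in context.items() if attrs <= s}

# A formal concept is a fixed point: (extent(B), B) with intent(extent(B)) == B.
query = {"image"}
concept = (extent(query), intent(extent(query)))
# → ({'doc1', 'doc2'}, {'image'}): the answers clustered by their shared terms
```

Applied to search-engine answers, each such concept is one node of the lattice: a cluster of documents together with exactly the terms that characterize it.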

  2. Information- Theoretic Analysis for the Difficulty of Extracting Hidden Information

    Institute of Scientific and Technical Information of China (English)

    ZHANG Wei-ming; LI Shi-qu; CAO Jia; LIU Jiu-fen

    2005-01-01

    The difficulty of extracting hidden information, which is essentially a kind of secrecy, is analyzed by an information-theoretic method. The relations between key rate, message rate, hiding capacity and difficulty of extraction are studied in terms of the unicity distance of the stego-key, and the theoretic conclusion is used to analyze the actual extracting attack on Least Significant Bit (LSB) steganographic algorithms.
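A back-of-envelope use of the record's key quantity: following Shannon, the unicity distance is roughly U ≈ H(K)/D, where H(K) is the key entropy and D the redundancy an attacker can exploit per observed sample. Both numbers below are invented for illustration, not taken from the paper.

```python
# Hedged illustration of the unicity distance of a stego-key.
key_entropy_bits = 128            # H(K) for a uniformly random 128-bit key
redundancy_per_sample = 0.05      # assumed exploitable bits per observed LSB sample

unicity_distance = key_entropy_bits / redundancy_per_sample
# → 2560.0 samples: below this many observations, the key (and hence the
# hidden message) remains information-theoretically ambiguous to an extractor.
```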

  3. Multicriteria analysis of ontologically represented information

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.

    2014-11-01

    Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware and the software is ontologically represented. Initially, we have considered the Analytical Hierarchy Process (AHP), which is well suited for the hierarchical data structures (e.g., such that have been formulated in terms of ontologies). However, due to its well-known shortcomings, we have decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
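Of the methods the record compares, TOPSIS is the most compact to sketch: normalize and weight the decision matrix, locate the ideal and anti-ideal points, and rank alternatives by relative closeness to the ideal. The alternatives, criteria values, and weights below are hypothetical, not from the AiG project.

```python
import math

# Hypothetical software-selection matrix: 3 alternatives scored on 3
# benefit-type criteria (e.g., throughput, memory, cores), with weights.
matrix = [[250, 16, 12],    # alternative A
          [200, 16, 8],     # alternative B
          [300, 32, 16]]    # alternative C
weights = [0.5, 0.3, 0.2]

# 1. Vector-normalize each column, then apply the criterion weights.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(3)]
V = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]

# 2. Ideal and anti-ideal points (max/min per column for benefit criteria).
ideal = [max(col) for col in zip(*V)]
anti = [min(col) for col in zip(*V)]

# 3. Relative closeness to the ideal solution.
def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

closeness = [dist(v, anti) / (dist(v, anti) + dist(v, ideal)) for v in V]
best = max(range(3), key=closeness.__getitem__)   # index of preferred alternative
```

Here alternative C dominates on every criterion, so its closeness is exactly 1 and it is selected; with conflicting criteria the closeness scores give the graded ranking the decision support system needs.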

  4. Information security policies : a content analysis

    OpenAIRE

    Lopes, Isabel Maria; Sá-Soares, Filipe de

    2012-01-01

    Completed research paper Among information security controls, the literature gives a central role to information security policies. However, there is a reduced number of empirical studies about the features and components of information security policies. This research aims to contribute to fill this gap. It presents a synthesis of the literature on information security policies content and it characterizes 25 City Councils information security policy documents in terms of features and compo...

  5. INFORMATION SOURCES OF COMPANY'S COMPETITIVE ENVIRONMENT STATISTICAL ANALYSIS

    OpenAIRE

    Khvostenko, O.

    2010-01-01

    The article is dedicated to a problem of the company's competitive environment statistical analysis and its information sources. The main features of information system and its significance in the competitive environment statistical research have been considered.

  6. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

    Full Text Available The article deals with the data analysis in information systems. The author discloses the possibility of using Galois compliance to identify the characteristics of the information system structure. The author reveals the specificity of the application of Galois compliance for the analysis of information system content with the use of invariants of graph theory. Aspects of introduction of mathematical apparatus of Galois compliance for research of interrelations between elements of the adaptive training information system of individual testing are analyzed.

  7. Analysis of information systems for hydropower operations

    Science.gov (United States)

    Sohn, R. L.; Becker, L.; Estes, J.; Simonett, D.; Yeh, W. W. G.

    1976-01-01

    The operations of hydropower systems were analyzed with emphasis on water resource management, to determine how aerospace derived information system technologies can increase energy output. Better utilization of water resources was sought through improved reservoir inflow forecasting based on use of hydrometeorologic information systems with new or improved sensors, satellite data relay systems, and use of advanced scheduling techniques for water release. Specific mechanisms for increased energy output were determined, principally the use of more timely and accurate short term (0-7 days) inflow information to reduce spillage caused by unanticipated dynamic high inflow events. The hydrometeorologic models used in predicting inflows were examined to determine the sensitivity of inflow prediction accuracy to the many variables employed in the models, and the results used to establish information system requirements. Sensor and data handling system capabilities were reviewed and compared to the requirements, and an improved information system concept outlined.

  8. Informational and technological provision of marginal analysis

    OpenAIRE

    Назаренко, Тетяна Петрівна

    2016-01-01

    Scientists' approaches to the methods of carrying out marginal analysis are analyzed, and the main stages and procedures of such analysis are singled out.

  9. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    Directory of Open Access Journals (Sweden)

    Rustu Yilmaz

    2016-05-01

    Full Text Available Universities are the leading institutions that are sources of the educated people who both produce information and develop new products and new services by using information effectively, and who are needed in every area. Therefore, universities are expected to be institutions where information and information management are used efficiently. In the present study, the topics such as infrastructure, operation, application, information, policy and human-based information security at universities were examined within the scope of the information security standards which are highly required and intended to be available at each university today, and then a comparative analysis was conducted specific to Turkey. Within the present study, the Microsoft Security Assessment Tool developed by Microsoft was used as the risk analysis tool. The analyses aim to enable the universities to compare their information systems with the information systems of other universities within the scope of the information security awareness, and to make suggestions in this regard.

  10. Function analysis for waste information systems

    International Nuclear Information System (INIS)

    This study has a two-fold purpose. It seeks to identify the functional requirements of a waste tracking information system and to find feasible alternatives for meeting those requirements on the Oak Ridge Reservation (ORR) and the Portsmouth (PORTS) and Paducah (PGDP) facilities; identify options that offer potential cost savings to the US government and also show opportunities for improved efficiency and effectiveness in managing waste information; and, finally, to recommend a practical course of action that can be immediately initiated. In addition to identifying relevant requirements, it also identifies any existing requirements that are currently not being completely met. Another aim of this study is to carry out preliminary benchmarking by contacting representative companies about their strategic directions in waste information. The information obtained from representatives of these organizations is contained in an appendix to the document; a full benchmarking effort, however, is beyond the intended scope of this study

  11. Discourse Analysis: Part I, Information Management and Cohesion.

    Science.gov (United States)

    Lovejoy, Kim Brian; Lance, Donald M.

    Combining linguistics and composition studies, this paper (part 1 of a two-part article) proposes a model for the analysis of information management and cohesion in written discourse. It defines concepts of discourse analysis--specifically information management, syntax, semantic reference, lexicon, cohesion, and intonation, with examples taken…

  12. Multicriteria Evaluation and Sensitivity Analysis on Information Security

    Science.gov (United States)

    Syamsuddin, Irfan

    2013-05-01

    Information security plays a significant role in the recent information society. The increasing number and impact of cyber attacks on information assets have resulted in increasing awareness among managers that an attack on information is actually an attack on the organization itself. Unfortunately, a particular model of information security evaluation for management levels is still not well defined. In this study, decision analysis based on Ternary Analytic Hierarchy Process (T-AHP) is proposed as a novel model to aid managers who are responsible for making strategic evaluations related to information security issues. In addition, sensitivity analysis is applied to extend the analysis by using several "what-if" scenarios in order to measure the consistency of the final evaluation. Finally, we conclude that the final evaluation made by managers has a significant consistency shown by the sensitivity analysis results.
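The two moving parts of this record, an AHP-style priority derivation and a "what-if" sensitivity check, can be sketched together. The geometric-mean approximation of the principal eigenvector stands in for the full (ternary) AHP machinery, and the pairwise judgments below are invented security criteria, not the paper's data.

```python
import math

def priorities(M):
    """Geometric-mean approximation of the AHP principal eigenvector."""
    gm = [math.prod(row) ** (1 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical Saaty-scale comparisons of three security criteria:
# confidentiality vs. integrity vs. availability.
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
base = priorities(M)

# "What-if" sensitivity scenario: weaken the 1st-vs-2nd judgment from 3 to 2
# and check whether the top-ranked criterion changes.
M2 = [row[:] for row in M]
M2[0][1], M2[1][0] = 2, 1 / 2
perturbed = priorities(M2)
stable = (max(range(3), key=base.__getitem__)
          == max(range(3), key=perturbed.__getitem__))
```

If the ranking survives several such perturbations (`stable` staying true), the final evaluation is consistent in the sense the abstract describes.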

  13. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  14. Value of Information Analysis in Structural Safety

    DEFF Research Database (Denmark)

    Konakli, Katerina; Faber, Michael Havbro

    2014-01-01

    structural systems. In this context, experiments may refer to inspections or techniques of structural health monitoring. The Value of Information concept provides a powerful tool for determining whether the experimental cost is justified by the expected benefit and for identifying the optimal among different...

  15. A Grid Information Infrastructure for Medical Image Analysis

    OpenAIRE

    Rogulin, D.; F. Estrella(UWE); Hauer, T.; McClatchey, R.; Solomonides, T

    2004-01-01

    The storage and manipulation of digital images and the analysis of the information held in those images are essential requirements for next-generation medical information systems. The medical community has been exploring collaborative approaches for managing image data and exchanging knowledge, and Grid technology [1] is a promising approach to enabling distributed analysis across medical institutions and for developing new collaborative and cooperative approaches for image analysis without th...

  16. A Multidisciplinary Analysis of Cyber Information Sharing

    Directory of Open Access Journals (Sweden)

    Aviram Zrahia

    2014-12-01

    Full Text Available The emergence of the cyber threat phenomenon is forcing organizations to change the way they think about security. One of these changes relates to organizations’ policy on sharing cyber information with outside parties. This means shifting away from the view of the organization as an isolated, compartmentalized entity towards a view of the organization as a sharing one. Sharing generates a complex, multifaceted challenge to technology, law, organizational culture and even politics. Establishing a system of sharing serves many parties, including regulatory bodies, governments, legal authorities, intelligence agencies, the manufacturers of solutions and services, as well as the organizations themselves, but it also arouses opposition among elements within the organization, and organizations defending the right for privacy. The purpose of this essay is to present the various challenges posed by cyber information sharing, expose the reader to its conceptual world, and present some insights and forecasts for its future development.

  17. Neurodynamics analysis of brain information transmission

    Institute of Scientific and Technical Information of China (English)

    Ru-bin WANG; Zhi-kang ZHANG; Chi K. Tse

    2009-01-01

    This paper proposes a model of neural networks consisting of populations of perceptive neurons, inter-neurons, and motor neurons according to the theory of stochastic phase resetting dynamics. According to this model, the dynamical characteristics of neural networks are studied in three coupling cases, namely, series and parallel coupling, series coupling, and unilateral coupling. The results show that the identified structure of neural networks enables the basic characteristics of neural information processing to be described in terms of the actions of both the optional motor and the reflected motor. The excitation of local neural networks is caused by the action of the optional motor. In particular, the excitation of the neural population caused by the action of the optional motor in the motor cortex is larger than that caused by the action of the reflected motor. This phenomenon indicates that there are more neurons participating in the neural information processing and the excited synchronization motion under the action of the optional motor.

  18. Information ratio analysis of momentum strategies

    OpenAIRE

    Ferreira, Fernando F.; A. Christian Silva; Ju-Yi Yen

    2014-01-01

    In the past 20 years, momentum or trend following strategies have become an established part of the investor toolbox. We introduce a new way of analyzing momentum strategies by looking at the information ratio (IR, average return divided by standard deviation). We calculate the theoretical IR of a momentum strategy, and show that if momentum is mainly due to the positive autocorrelation in returns, IR as a function of the portfolio formation period (look-back) is very different from momentum ...
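The record's central object, the information ratio of a momentum rule as a function of the look-back period, is simple to compute: average strategy return divided by its standard deviation. The price series and the sign-based momentum rule below are invented for illustration, not the paper's model.

```python
import statistics

# Hypothetical price series; returns are simple period-over-period changes.
prices = [100, 102, 101, 105, 107, 106, 110, 114, 112, 118]
returns = [b / a - 1 for a, b in zip(prices, prices[1:])]

def momentum_ir(returns, look_back):
    """IR (mean/std) of a rule that goes long after a positive trailing
    `look_back` return and short otherwise."""
    strat = []
    for t in range(look_back, len(returns)):
        signal = 1 if sum(returns[t - look_back:t]) > 0 else -1
        strat.append(signal * returns[t])
    return statistics.mean(strat) / statistics.stdev(strat)

# IR as a function of the portfolio formation (look-back) period,
# the curve the paper studies.
ir_by_lookback = {lb: momentum_ir(returns, lb) for lb in (1, 2, 3)}
```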

  19. Relevant market, economic analysis and information

    Czech Academy of Sciences Publication Activity Database

    Zemplinerová, Alena

    Praha: Czech Association for Competition Law, 2008, s. 28-39. ISBN 978-80-254-3276-1 R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : relevant market definition * competition law * economic analysis Subject RIV: AH - Economics

  20. A Bibliometric Analysis of Certain Information Science Literature

    Science.gov (United States)

    Donohue, Joseph C.

    1972-01-01

    Several bibliometric techniques previously applied to separate scientific literatures were used together in the analysis of a single corpus of journal articles relating to information science. Techniques included were: Bradford Analysis, epidemic analysis, identification of research front, and bibliographic coupling. (16 references) (Author)
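Of the techniques listed, Bradford analysis is the most mechanical: rank journals by article yield and split the ranking into three zones of roughly equal article counts; Bradford's law predicts the number of journals per zone grows roughly as 1 : n : n². The per-journal counts below are invented for illustration.

```python
# Hypothetical article counts per journal, already one entry per journal.
articles_per_journal = [120, 60, 45, 30, 22, 18, 12, 9, 7, 6,
                        5, 4, 3, 3, 2, 2, 1, 1]
ranked = sorted(articles_per_journal, reverse=True)
total = sum(ranked)

# Greedily cut the ranked list into three zones of ~equal article counts.
zones, current, acc = [], [], 0
for count in ranked:
    current.append(count)
    acc += count
    if acc >= total * (len(zones) + 1) / 3 and len(zones) < 2:
        zones.append(current)
        current = []
zones.append(current)

zone_sizes = [len(z) for z in zones]   # few core journals, many peripheral ones
```

Here the zones contain 1, 3, and 14 journals: a small productive core followed by rapidly widening peripheral zones, the scatter pattern Bradford analysis looks for.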

  1. Hydrogen Technical Analysis -- Dissemination of Information

    Energy Technology Data Exchange (ETDEWEB)

    George Kervitsky, Jr.

    2006-03-20

    SENTECH is a small energy and environmental consulting firm providing technical, analytical, and communications solutions to technology management issues. The activities proposed by SENTECH focused on gathering and developing communications materials and information, and various dissemination activities to present the benefits of hydrogen energy to a broad audience while at the same time establishing permanent communications channels to enable continued two-way dialog with these audiences in future years. Effective communications and information dissemination are critical to the acceptance of new technology. Hydrogen technologies face the additional challenge of safety preconceptions formed primarily as a result of the crash of the Hindenburg. Effective communications play a key role in all aspects of human interaction, and will help to overcome the perceptual barriers, whether of safety, economics, or benefits. As originally proposed, SENTECH identified three distinct information dissemination activities to address three distinct but important audiences; these formed the basis for the task structure used in phases 1 and 2. The tasks were: (1) Print information--Brochures that target a certain segment of the population and will be distributed via relevant technical conferences and traditional distribution channels. (2) Face-to-face meetings--With industries identified to have a stake in hydrogen energy. The three industry audiences are architect/engineering firms, renewable energy firms, and energy companies that have not made a commitment to hydrogen. (3) Educational Forums--The final audience is students--the future engineers, technicians, and energy consumers. SENTECH will expand on its previous educational work in this area.
The communications activities proposed by SENTECH and completed as a result of this cooperative agreement were designed to complement the research and development work funded by the DOE by presenting the technical achievements and validations

  2. INFORMATION ARCHITECTURE ANALYSIS USING BUSINESS INTELLIGENCE TOOLS BASED ON THE INFORMATION NEEDS OF EXECUTIVES

    Directory of Open Access Journals (Sweden)

    Fabricio Sobrosa Affeldt

    2013-08-01

    Full Text Available Devising an information architecture system that enables an organization to centralize information regarding its operational, managerial and strategic performance is one of the challenges currently facing information technology. The present study aimed to analyze an information architecture system developed using Business Intelligence (BI technology. The analysis was performed based on a questionnaire enquiring as to whether the information needs of executives were met during the process. A theoretical framework was applied consisting of information architecture and BI technology, using a case study methodology. Results indicated that the transaction processing systems studied did not meet the information needs of company executives. Information architecture using data warehousing, online analytical processing (OLAP tools and data mining may provide a more agile means of meeting these needs. However, some items must be included and others modified, in addition to improving the culture of information use by company executives.

  3. Information Leakage by Trace Analysis in QUAIL

    OpenAIRE

    Biondi, Fabrizio; Quilbeuf, Jean; Legay, Axel

    2014-01-01

    Quantitative security techniques have been proven effective to measure the security of systems against various types of attackers. However, such techniques are based on computing exponentially large channel matrices or Markov chains, making them impractical for large programs. We propose a different approach based on abstract trace analysis. By analyzing directly sets of execution traces of the program and computing security measures on the results, we are able to scale down the exponential...

  4. Water Information Management & Analysis System (WIMAS) v 4.0

    Data.gov (United States)

    Kansas Data Access and Support Center — The Water Information Management and Analysis System (WIMAS) is an ArcView based GIS application that allows users to query Kansas water right data maintained by...

  5. The threat nets approach to information system security risk analysis

    OpenAIRE

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security concerns, it was established that the current approaches to information systems risk analysis do not provide logical recipes for quantifying threat impact and determining the cost-effectiveness of risk m...

  6. ANALYSIS BETWEEN ACCOUNTING INFORMATION DISCLOSURE QUALITY AND STAKEHOLDER INTERESTS

    OpenAIRE

    Ioana (HERBEI) MOT; MOLDOVAN Nicoleta-Claudia; Cristina CERNIT

    2015-01-01

    The evolution and globalization of markets, the financial scandals that collapsed American and European economic systems, and growing pressure from investors over economic performance have underlined the fundamental role that economic-financial communication, the corporate governance model, and information transparency play in information disclosure quality. The main objective of this paper is the analysis of the ratio between accounting information disclosure quality and stakeholder interests and the e...

  7. Information delivery manuals to facilitate it supported energy analysis

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    of information exchange and digital workflows is required. This paper presents the preliminary findings of an ongoing study aimed at developing an Information Delivery Manual (IDM) for IT supported energy analysis at concept design phase. The IDM development is based on: (1) a review of current approaches (2...

  8. HISTORICAL AND LEGAL ANALYSIS OF THE FORMATION OF INFORMATION LAW

    OpenAIRE

    Indrisova Z. N.

    2014-01-01

    The article is devoted to carrying out a historical and legal analysis of the formation of information law. Based on this study, a classification of the stages in the formation of information law is proposed, comprising a pre-scientific stage, an elementary stage, a secondary stage, a stage of uncertainty, and the modern stage.

  9. Shuttle Topography Data Inform Solar Power Analysis

    Science.gov (United States)

    2013-01-01

    The next time you flip on a light switch, there's a chance that you could be benefitting from data originally acquired during the Space Shuttle Program. An effort spearheaded by the Jet Propulsion Laboratory (JPL) and the National Geospatial-Intelligence Agency (NGA) in 2000 put together the first near-global elevation map of the Earth ever assembled, which has found use in everything from 3D terrain maps to models that inform solar power production. For the project, called the Shuttle Radar Topography Mission (SRTM), engineers at JPL designed a 60-meter mast that was fitted onto Shuttle Endeavour. Once deployed in space, an antenna attached to the end of the mast worked in combination with another antenna on the shuttle to simultaneously collect data from two perspectives. Just as having two eyes makes depth perception possible, the SRTM data sets could be combined to form an accurate picture of the Earth's surface elevations, the first high-detail, near-global elevation map ever assembled. What made SRTM unique was not just its surface mapping capabilities but the completeness of the data it acquired. Over the course of 11 days, the shuttle orbited the Earth nearly 180 times, covering everything between the 60 deg north and 54 deg south latitudes, or roughly 80 percent of the world's total landmass. Of that targeted land area, 95 percent was mapped at least twice, and 24 percent was mapped at least four times. Following several years of processing, NASA released the data to the public in partnership with NGA. Robert Crippen, a member of the SRTM science team, says that the data have proven useful in a variety of fields. "Satellites have produced vast amounts of remote sensing data, which over the years have been mostly two-dimensional. But the Earth's surface is three-dimensional. Detailed topographic data give us the means to visualize and analyze remote sensing data in their natural three-dimensional structure, facilitating a greater understanding of the features

  10. Key Information Systems Issues: An Analysis of MIS Publications.

    Science.gov (United States)

    Palvia, Prashant C.; And Others

    1996-01-01

    Presents results of a content analysis of journal articles discussing management information systems (MIS) that was conducted to identify, classify, and prioritize the key issues; to perform a trend analysis; and to compare results with previous studies. Twenty-six key issues are ranked according to frequency of occurrence. Contains 52 references.…

  11. A time sequence analysis on the informal information flow mechanism of microblogging

    Institute of Scientific and Technical Information of China (English)

    Yuan HU; Xiaoli LIAO; Andong WU

    2011-01-01

    Microblog is a new Internet featured product, which has seen a rapid development in recent years. Researchers from different countries are making various technical analyses on microblogging applications. In this study, using natural language processing (NLP) and data mining, we analyzed the information content transmitted via a microblog, users' social networks and their interactions, and carried out an empirical analysis on the dissemination process of one particular piece of information via Sina Weibo. Based on the result of these analyses, we attempt to develop a better understanding about the rule and mechanism of the informal information flow in microblogging.

  12. A Strategic Analysis of Information Sharing Among Cyber Attackers

    Directory of Open Access Journals (Sweden)

    Kjell Hausken

    2015-10-01

    Full Text Available We build a game theory model where the market design is such that one firm invests in security to defend against cyber attacks by two hackers. The firm has an asset, which is allocated between the three market participants dependent on their contest success. Each hacker chooses an optimal attack, and they share information with each other about the firm’s vulnerabilities. Each hacker prefers to receive information, but delivering information gives competitive advantage to the other hacker. We find that each hacker’s attack and information sharing are strategic complements while one hacker’s attack and the other hacker’s information sharing are strategic substitutes. As the firm’s unit defense cost increases, the attack is inverse U-shaped and reaches zero, while the firm’s defense and profit decrease, and the hackers’ information sharing and profit increase. The firm’s profit increases in the hackers’ unit cost of attack, while the hackers’ information sharing and profit decrease. Our analysis also reveals the interesting result that the cumulative attack level of the hackers is not affected by the effectiveness of information sharing between them and moreover, is also unaffected by the intensity of joint information sharing. We also find that as the effectiveness of information sharing between hackers increases relative to the investment in attack, the firm’s investment in cyber security defense and profit are constant, the hackers’ investments in attacks decrease, and information sharing levels and hacker profits increase. In contrast, as the intensity of joint information sharing increases, while the firm’s investment in cyber security defense and profit remain constant, the hackers’ investments in attacks increase, and the hackers’ information sharing levels and profits decrease. Increasing the firm’s asset causes all the variables to increase linearly, except information sharing which is constant. We extend

  13. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China's agriculture sector has been transformed from the traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies in relation to agriculture information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT-based information dissemination models in China and to share the knowledge and experience in applying emerging ICTs in disseminating agriculture information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China's agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT-based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China's experience and best practice in their endeavor of applying emerging ICTs in agriculture information dissemination and knowledge transfer.

  14. Informational Analysis for Compressive Sampling in Radar Imaging

    Directory of Open Access Journals (Sweden)

    Jingxiong Zhang

    2015-03-01

    Full Text Available Compressive sampling or compressed sensing (CS) works on the assumption of the sparsity or compressibility of the underlying signal, relies on the trans-informational capability of the measurement matrix employed and the resultant measurements, operates with optimization-based algorithms for signal reconstruction and is thus able to complete data compression, while acquiring data, leading to sub-Nyquist sampling strategies that promote efficiency in data acquisition, while ensuring certain accuracy criteria. Information theory provides a framework complementary to classic CS theory for analyzing information mechanisms and for determining the necessary number of measurements in a CS environment, such as CS-radar, a radar sensor conceptualized or designed with CS principles and techniques. Despite increasing awareness of information-theoretic perspectives on CS-radar, reported research has been rare. This paper seeks to bridge the gap in the interdisciplinary area of CS, radar and information theory by analyzing information flows in CS-radar from sparse scenes to measurements and determining sub-Nyquist sampling rates necessary for scene reconstruction within certain distortion thresholds, given differing scene sparsity and average per-sample signal-to-noise ratios (SNRs). Simulated studies were performed to complement and validate the information-theoretic analysis. The combined strategy proposed in this paper is valuable for information-theoretic oriented CS-radar system analysis and performance evaluation.
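    The core CS premise, that a sparse scene can be reconstructed from far fewer measurements than its length, can be sketched with a toy example (all dimensions and variable names are illustrative and unrelated to the paper's radar models):

```python
import random

random.seed(1)
N, M = 32, 12          # scene length vs. number of measurements (M << N: sub-Nyquist)
k_true, amp = 17, 3.0  # 1-sparse scene: a single reflector at index 17

# Random +/-1 measurement matrix and the resulting measurements y = Phi @ x.
phi = [[random.choice((-1.0, 1.0)) for _ in range(N)] for _ in range(M)]
y = [amp * row[k_true] for row in phi]

def fit_one_atom(k):
    """Least-squares amplitude and residual for a 1-sparse candidate at index k."""
    col = [row[k] for row in phi]
    a = sum(c * v for c, v in zip(col, y)) / sum(c * c for c in col)
    err = sum((v - a * c) ** 2 for c, v in zip(col, y))
    return err, a

# Brute-force l0 recovery over all 1-sparse candidates.
best_k, (err, best_a) = min(((k, fit_one_atom(k)) for k in range(N)),
                            key=lambda t: t[1][0])
```

    With these dimensions the single reflector is typically located and its amplitude recovered exactly from only 12 of 32 samples; practical CS replaces the brute-force search with optimization-based reconstruction.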

  15. Analysis of information and power transfer in wireless communications

    OpenAIRE

    Caspers, Erick; Ho Yeung, Sai; Sarkar, Tapan K; Garcia-Lamperez, Alejandro; Salazar, Magdalena; Lagunas Hernandez, Miguel A.; PÉREZ NEIRA, Ana Isabel

    2013-01-01

    An analysis of wireless information compared to power transfer over the same channel, consisting of a transmitting and receiving antenna system, is discussed. This frequency-selective additive-white-Gaussian-noise channel displays a fundamental tradeoff between the rate at which energy and the rate at which reliable information can be transmitted over the same channel, as in an RFID system, a power-line communication system, or for an energy-harvesting system. The optimal tradeoffs between po...

  16. Noun-Phrase Analysis in Unrestricted Text for Information Retrieval

    OpenAIRE

    Evans, David A.; Zhai, ChengXiang

    1996-01-01

    Information retrieval is an important application area of natural-language processing where one encounters the genuine challenge of processing large quantities of unrestricted natural-language text. This paper reports on the application of a few simple, yet robust and efficient noun-phrase analysis techniques to create better indexing phrases for information retrieval. In particular, we describe a hybrid approach to the extraction of meaningful (continuous or discontinuous) subcompounds from ...

  17. Similarity Measures, Author Cocitation Analysis, and Information Theory

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    The use of Pearson's correlation coefficient in Author Cocitation Analysis was compared with Salton's cosine measure in a number of recent contributions. Unlike the Pearson correlation, the cosine is insensitive to the number of zeros. However, one has the option of applying a logarithmic transformation in correlation analysis. Information calculus is based on both the logarithmic transformation and provides a non-parametric statistics. Using this methodology one can cluster a document set in a precise way and express the differences in terms of bits of information. The algorithm is explained and used on the data set which was made the subject of this discussion.
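    The zero-count behavior contrasted in this abstract is easy to reproduce; below is a minimal sketch with made-up cocitation profiles (not data from the study):

```python
import math

def cosine(x, y):
    """Salton's cosine similarity between two count vectors."""
    num = sum(a * b for a, b in zip(x, y))
    return num / (math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y)))

def pearson(x, y):
    """Pearson correlation coefficient between two count vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)) * math.sqrt(sum((b - my) ** 2 for b in y))
    return num / den

# Two hypothetical author cocitation profiles...
x = [3, 1, 4, 2]
y = [2, 1, 3, 2]
# ...padded with authors cocited with neither (zeros in both profiles).
xz, yz = x + [0] * 6, y + [0] * 6

print(cosine(x, y) == cosine(xz, yz))   # zeros leave the cosine unchanged
print(pearson(x, y), pearson(xz, yz))   # zeros shift the means, so Pearson changes
```

    Matching zeros contribute nothing to the cosine's sums, but they pull both means toward zero and therefore alter the Pearson correlation, which is exactly the sensitivity the paper discusses.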

  18. Similarity Measures, Author Cocitation Analysis, and Information Theory

    OpenAIRE

    Leydesdorff, Loet

    2009-01-01

    The use of Pearson's correlation coefficient in Author Cocitation Analysis was compared with Salton's cosine measure in a number of recent contributions. Unlike the Pearson correlation, the cosine is insensitive to the number of zeros. However, one has the option of applying a logarithmic transformation in correlation analysis. Information calculus is based on both the logarithmic transformation and provides a non-parametric statistics. Using this methodology one can cluster a document set in...

  19. Collaborative for Historical Information and Analysis: Vision and Work Plan

    OpenAIRE

    Vladimir Zadorozhny; Patrick Manning; Daniel J. Bain; Ruth Mostern

    2013-01-01

    This article conveys the vision of a world-historical dataset, constructed in order to provide data on human social affairs at the global level over the past several centuries. The construction of this dataset will allow the routine application of tools developed for analyzing “Big Data” to global, historical analysis. The work is conducted by the Collaborative for Historical Information and Analysis (CHIA). This association of groups at universities and research institutes in the U.S. and Eu...

  20. Practical survival analysis tools for heterogeneous cohorts and informative censoring

    OpenAIRE

    Rowley, M; Garmo, H; Van Hemelrijck, M; Wulaningsih, W.; Grundmark, B.; Zethelius, B.; Hammar, N.; Walldius, G; M. Inoue; Holmberg, L; Coolen, A. C. C.

    2015-01-01

    In heterogeneous cohorts and those where censoring by non-primary risks is informative many conventional survival analysis methods are not applicable; the proportional hazards assumption is usually violated at population level and the observed crude hazard rates are no longer estimators of what they would have been in the absence of other risks. In this paper, we develop a fully Bayesian survival analysis to determine the probabilistically optimal description of a heterogeneous cohort and we ...

  1. HJD-I record and analysis meter for nuclear information

    International Nuclear Information System (INIS)

    HJD-I, a low-cost, small-volume, multi-function, new-model intelligent nuclear electronic meter for recording and analyzing nuclear information, is described. Its hardware and software are detailed, and a 137Cs spectrum acquired with this meter is presented.

  2. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  3. Webometric Analysis of Departments of Librarianship and Information Science.

    Science.gov (United States)

    Thomas, Owen; Willett, Peter

    2000-01-01

    Describes a webometric analysis of linkages to library and information science (LIS) department Web sites in United Kingdom universities. Concludes that situation data are not well suited to evaluation of LIS departments and that departments can boost Web site visibility by hosting a wide range of materials. (Author/LRW)

  4. Shape Analysis for Complex Systems Using Information Geometry Tools.

    OpenAIRE

    Sanctis, Angela De

    2012-01-01

    In this paper we use Information Geometry tools to model statistically patterns arising in complex systems and describe their evolution in time. In particular, we focus on the analysis of images with medical applications and propose an index that can estimate the level of self-organization and predict future problems that may occur in these systems.

  5. ANALYSIS OF INFORMATION FACTORS FOR DESIGNING INTELLECTUAL MECHATRONIC SYSTEM

    Directory of Open Access Journals (Sweden)

    A. V. Gulai

    2016-01-01

    Full Text Available The paper proposes to evaluate the achievement of the main results in the operation of intellectual mechatronic systems with digital control by the information effect obtained. In this respect, common information requirements for intellectual components are considered as a basic information factor which influences the process of mechatronic system design. Accordingly, a number of parameters are highlighted that help to provide a rather complete description of the processes used for obtaining and using systematic information within the intellectual mechatronic system. The degree of conformity between the control vector parameters synthesized by the system and the identification results of its current states has been selected as an information criterion of control efficiency. A set of expected probability values for the location of each parameter of a control object and a mechatronic system within the required tolerances has been used for the formation of possible states. The paper shows that when a complex information description of the system is used, it is expedient to apply an expert assessment of the selection probability for allowable control vectors which ensure a system transfer to favorable states. This approach has made it possible to pinpoint the main information and technical specifications of the intellectual mechatronic system: structural construction (informational and technical compatibility and information matching of its components); control object (uncertainty of its state and information vector, information capacity of the mechatronic system); control actions (their hierarchy and entropic balance of the control process, managerial resource of the mechatronic system); functioning result (informational effect and control efficiency criterion, probabilistic selection of system states). In accordance with this analysis, it is possible to identify the most effective directions for practical use of the proposed informational approach for creation of the

  6. Environmental Quality Information Analysis Center multi-year plan

    International Nuclear Information System (INIS)

    An information analysis center (IAC) is a federal resource that provides technical information for a specific technology field. An IAC links an expert technical staff with an experienced information specialist group, supported by in-house or external data bases to provide technical information and maintain a corporate knowledge in a technical area. An IAC promotes the rapid transfer of technology among its users and provides assistance in adopting new technology and predicting and assessing emerging technology. This document outlines the concept, requirements, and proposed development of an Environmental Quality IAC (EQIAC). An EQIAC network is composed of several nodes, each of which has specific technology capabilities. This document outlines strategic and operational objectives for the phased development of one such node of an EQIAC network

  7. Multiscale Analysis of Information Dynamics for Linear Multivariate Processes

    CERN Document Server

    Faes, Luca; Nollo, Giandomenico; Stramaglia, Sebastiano

    2016-01-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale infor...

  8. Comparative Analysis of Splice Site Regions by Information Content

    Institute of Scientific and Technical Information of China (English)

    T. Shashi Rekha; Chanchal K. Mitra

    2006-01-01

    We have applied concepts from information theory for a comparative analysis of donor (gt) and acceptor (ag) splice site regions in the genes of five different organisms by calculating their mutual information content (relative entropy) over a selected block of nucleotides. A similar pattern, that the information content decreases as the block size increases, was observed for both regions in all the organisms studied. This result suggests that the information required for splicing might be contained in the consensus of ~6-8 nt at both regions. We assume from our study that even though the nucleotides are showing some degrees of conservation in the flanking regions of the splice sites, a certain level of variability is still tolerated, which leads the splicing process to occur normally even if the extent of base pairing is not fully satisfied. We also suggest that this variability can be compensated by recognizing different splice sites with different spliceosomal factors.
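    The per-position relative entropy used in this kind of analysis can be illustrated on toy data (the sequences below are invented, not drawn from the organisms studied):

```python
import math

BASES = "ACGT"

def relative_entropy(column, background=None):
    """Relative entropy (bits) of one alignment column vs. a background
    nucleotide distribution (uniform by default)."""
    background = background or {b: 0.25 for b in BASES}
    n = len(column)
    total = 0.0
    for b in BASES:
        p = column.count(b) / n
        if p > 0:
            total += p * math.log2(p / background[b])
    return total

# Toy aligned donor-site windows (hypothetical), exon|intron with invariant 'GT'.
windows = ["aagGTaag", "cagGTgag", "aagGTaat", "gagGTaag"]
per_position = [relative_entropy([w[i].upper() for w in windows])
                for i in range(len(windows[0]))]
# The invariant 'GT' positions carry the maximum 2 bits; variable flanks carry less.
```

    Fully conserved columns reach log2(4) = 2 bits against a uniform background, while partially conserved flanking positions score lower, reproducing the decay pattern the abstract describes.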

  9. Error Detection Analysis of Aircraft Information-Communication System using Information Entropy

    Directory of Open Access Journals (Sweden)

    Erna Fučić

    2016-01-01

    Full Text Available Contemporary transport aircraft information-communication system is extremely sophisticated. The aim of the current study is to give contribution to the current knowledge of information entropy, and to show how its alteration could indicate possible errors, which may lead to preventing future aircraft calamities. In this study a principle model of such system is described, consisting of two peripheral, sensory units and their central, processing units, upon which a numerical simulation is carried out. Two states of the system are defined – states of regular and irregular dynamics. Data transfer between system elements is defined through information entropy, whose average change and accompanying standard deviation shows the difference between the regular and non-regular state. When introducing an error of the same kind upon each of the sensors, the type of results corresponds to a sufficiently intensive deviation, which may make error detection by information entropy analysis possible.
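    The entropy-based detection idea can be sketched as follows (a hypothetical two-state simulation, not the paper's aircraft model):

```python
import math
import random
from collections import Counter

def shannon_entropy(samples, bins=10, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of readings discretized into equal-width bins."""
    n = len(samples)
    counts = Counter(min(int((s - lo) / (hi - lo) * bins), bins - 1) for s in samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
# Regular dynamics: a sensor hovering near its setpoint (hypothetical data).
regular = [0.5 + random.gauss(0.0, 0.02) for _ in range(1000)]
# Irregular dynamics: an erratic, faulty sensor wandering over its full range.
faulty = [random.uniform(0.0, 1.0) for _ in range(1000)]

h_regular = shannon_entropy(regular)
h_faulty = shannon_entropy(faulty)
# The jump in entropy flags the deviation from the regular state.
```

    A concentrated signal occupies few bins and yields low entropy, while erratic readings spread over all bins and push the entropy toward log2(bins), so a sustained entropy shift can serve as an error indicator.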

  10. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  11. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
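    A minimal sketch of the rank-based, information-theoretic feature selection that tools like IMMAN implement (toy data; the function names are ours, not IMMAN's API):

```python
import math
from collections import Counter, defaultdict

def entropy(labels):
    """Shannon entropy (bits) of a discrete label sequence."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Information gain H(class) - H(class | feature) for a discrete feature."""
    groups = defaultdict(list)
    for value, label in zip(feature, labels):
        groups[value].append(label)
    conditional = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - conditional

# Toy dataset (hypothetical): feature A determines the class, feature B is noise.
labels = [0, 0, 0, 1, 1, 1]
feat_a = ["lo", "lo", "lo", "hi", "hi", "hi"]
feat_b = ["x", "y", "x", "y", "x", "y"]

ranking = sorted({"A": information_gain(feat_a, labels),
                  "B": information_gain(feat_b, labels)}.items(),
                 key=lambda kv: -kv[1])
# The perfectly informative feature A tops the ranking with 1.0 bit of gain.
```

    Gain ratio and symmetrical uncertainty, also mentioned in the abstract, are normalized variants of this same quantity, which is why an equal-interval discretization step precedes them for continuous descriptors.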

  12. Activation Analysis. Proceedings of an Informal Study Group Meeting

    International Nuclear Information System (INIS)

    As part of its programme to promote the exchange of information relating to nuclear science and technology, the International Atomic Energy Agency convened in Bangkok, Thailand, from 6-8 July 1970, an informal meeting to discuss the topic of Activation Analysis. The meeting was attended by participants drawn from the following countries: Australia, Burma, Ceylon, Republic of China, India, Indonesia, France, Japan, Republic of Korea, New Zealand, Philippines, Singapore, Thailand, United States of America and Vietnam. The proceedings consist of the contributions presented at the meeting with minor editorial changes.

  13. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers with respect to innovative strategies for the diffusion of product information, as smart-labels for mobile-phones. The empirical analysis was organised in focus groups followed by a survey on 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  14. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be:
    · scalable: supporting business needs as well as operational and end user environments
    · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics
    · interoperable: integrating with existing environments and easing information sharing across partner agencies
    · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements
    i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  15. Implantation of a safety management system information under the ISO 27001: risk analysis information

    Directory of Open Access Journals (Sweden)

    José Gregorio Arévalo Ascanio

    2015-11-01

    Full Text Available In this article, the business structure of the city of Ocaña is explored with the aim of expanding information and knowledge about the main variables of the municipality's productive activity, its entrepreneurial spirit, technological development, and productive structure. For this, a descriptive study was performed to identify economic activity in its various forms and to promote the implementation of administrative practices consistent with national and international references. The results made it possible to identify business weaknesses, including in the handling of information, which, once identified, are used to design training spaces, skills acquisition, and management practices consistent with the challenges of competitiveness and staying in the market. From the results, information was gathered on the technological component of the companies in the city's productive fabric, for which the application of information systems analysis tools based on ISO 27001:2005 is proposed, using the most appropriate technologies so that organizations protect their most important asset: information.

  16. Cognitive Dimensions Analysis of Interfaces for Information Seeking

    CERN Document Server

    Golovchinsky, Gene

    2009-01-01

    Cognitive Dimensions is a framework for analyzing human-computer interaction. It is used for meta-analysis, that is, for talking about characteristics of systems without getting bogged down in details of a particular implementation. In this paper, I discuss some of the dimensions of this theory and how they can be applied to analyze information seeking interfaces. The goal of this analysis is to introduce a useful vocabulary that practitioners and researchers can use to describe systems, and to guide interface design toward more usable and useful systems

  17. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the current findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage that still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, an HBIM of the damaged section of the building was created; it is presented as two separate workflows in this paper. The first is a model created from historic data, the second a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  18. Information analysis of census data by using statistical models

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, J.; Boček, Pavel; Somol, Petr; Pudil, P.

    Prague : Czech Statistical Office , 2004 - (Krovák, J.), s. 1-7 [Statistics - Investment in the Future . Prague (CZ), 06.09.2004-07.09.2004] R&D Projects: GA ČR GA402/02/1271 EU Projects: European Commission(XE) 507752 - MUSCLE Institutional research plan: CEZ:AV0Z1075907 Keywords : statistical databases * information analysis * statistical models Subject RIV: AO - Sociology, Demography

  19. Formal Concept Analysis and Information Retrieval – A Survey

    OpenAIRE

    Codocedo, Victor; Napoli, Amedeo

    2015-01-01

    One of the first models to be proposed as a document index for retrieval purposes was a lattice structure, decades before the introduction of Formal Concept Analysis. Nevertheless, the main notions that we consider so familiar within the community ("extension", "intension", "closure operators", "order") were already an important part of it. In the '90s, as FCA was starting to settle as an epistemic community, lattice-based Information Retrieval (IR) systems smoothly transitioned to...

  20. Estimation of Boolean Factor Analysis Performance by Informational Gain

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Polyakov, P.Y.

    Berlin : Springer, 2010 - (Snášel, V.; Szczepaniak, P.; Abraham, A.; Kacprzyk, J.), s. 83-94 ISBN 978-3-642-10686-6. - (Advances in Intelligent and Soft Computing. 67). [AWIC 2009. Atlantic Web Intelligence Conference /6./. Prague (CZ), 09.09.2009-11.09.2009] Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean factor analysis * informational gain * Hopfield-like network Subject RIV: IN - Informatics, Computer Science

  1. Economic Efficiency Analysis for Information Technology in Developing Countries

    OpenAIRE

    Ghassan F. Issa; Shaki M. Hussain; Hussein Al-Bahadili

    2009-01-01

    Problem statement: The introduction of Information Technology (IT) to government institutions in developing countries bears a great deal of risk of failure. The lack of qualified personnel, lack of financial support and lack of planning and proper justification are just a few of the causes of project failure. The study presented here focused on the justification of IT projects through the application of Cost Benefit Analysis (CBA) as part of a comprehensive Economic Efficiency A...

  2. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    OpenAIRE

    Christiane Moser; Alina Krischkowsky; Katja Neureiter; Manfred Tscheligi

    2015-01-01

    Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adul...

  3. Crime Mapping and Geographical Information Systems in Crime Analysis

    OpenAIRE

    Murat Dağlar; Uğur Argun

    2016-01-01

    As essential apparatus in crime analysis, crime mapping and Geographical Information Systems (GIS) are being progressively more accepted by police agencies. Development in technology and the accessibility of geographic data sources make it feasible for police departments to use GIS and crime mapping. GIS and crime mapping can be utilized as devices to discover reasons contributing to crime, and hence let law enforcement agencies proactively take action against the crime problems before they b...

  4. A Bibliometric Analysis on the Journal of Information Science

    OpenAIRE

    Ming-Yueh Tsay

    2011-01-01

    The purpose of this study is to explore the journal bibliometric characteristics of the Journal of Information Science (JIS) and its subject relationship with other disciplines by citation analysis. The citation data were drawn from the references of each JIS article published between 1998 and 2008. Ulrich's Periodicals Directory and the Library of Congress Subject Headings, retrieved from the WorldCat and LISA databases, were used to identify the main class, subclass and subject of cited journals and books. Th...

  5. Latent morpho-semantic analysis : multilingual information retrieval with character n-grams and mutual information.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Chew, Peter A.; Abdelali, Ahmed (New Mexico State University)

    2008-08-01

    We describe an entirely statistics-based, unsupervised, and language-independent approach to multilingual information retrieval, which we call Latent Morpho-Semantic Analysis (LMSA). LMSA overcomes some of the shortcomings of related previous approaches such as Latent Semantic Analysis (LSA). LMSA has an important theoretical advantage over LSA: it combines well-known techniques in a novel way to break the terms of LSA down into units which correspond more closely to morphemes. Thus, it has a particular appeal for use with morphologically complex languages such as Arabic. We show through empirical results that the theoretical advantages of LMSA can translate into significant gains in precision in multilingual information retrieval tests. These gains are not matched either when a standard stemmer is used with LSA, or when terms are indiscriminately broken down into n-grams.
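    The core idea described above (sub-word character n-grams fed into an LSA-style truncated SVD) can be sketched in a few lines of NumPy. This is not the authors' LMSA implementation; the toy documents and the trigram tokenizer are invented for illustration.

```python
import numpy as np

docs = [
    "information retrieval",
    "retrieval of information",
    "the cat sat on the mat",
    "a cat and a dog",
]

# Character trigrams are a crude stand-in for the morpheme-like
# sub-word units that LMSA aims to capture.
def trigrams(text):
    padded = f" {text} "
    return [padded[i:i + 3] for i in range(len(padded) - 2)]

vocab = sorted({g for d in docs for g in trigrams(d)})
index = {g: i for i, g in enumerate(vocab)}
X = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for g in trigrams(d):
        X[index[g], j] += 1

# LSA step: rank-k SVD of the term-document matrix yields latent
# document vectors in which sub-word overlap becomes proximity.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # one k-dim vector per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(doc_vecs[0], doc_vecs[1]), cosine(doc_vecs[0], doc_vecs[2]))
```

    In this toy setting, the two "information retrieval" documents share many trigrams and so land close together in the latent space, while the unrelated documents do not.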

  6. Carbon Dioxide Information Analysis Center: FY 1992 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Stoss, F.W. [Tennessee Univ., Knoxville, TN (United States). Energy, Environment and Resources Center

    1993-03-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1991 to September 30, 1992. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, fact sheets, specialty publications, and reprints is provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  7. Carbon Dioxide Information Analysis Center: FY 1991 activities

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Stoss, F.W.

    1992-06-01

    During the course of a fiscal year, Oak Ridge National Laboratory's Carbon Dioxide Information Analysis Center (CDIAC) distributes thousands of specialty publications-numeric data packages (NDPs), computer model packages (CMPs), technical reports, public communication publications, newsletters, article reprints, and reference books-in response to requests for information related to global environmental issues, primarily those pertaining to climate change. CDIAC's staff also provides technical responses to specific inquiries related to carbon dioxide (CO{sub 2}), other trace gases, and climate. Hundreds of referrals to other researchers, policy analysts, information specialists, or organizations are also facilitated by CDIAC's staff. This report provides an account of the activities accomplished by CDIAC during the period October 1, 1990 to September 30, 1991. An organizational overview of CDIAC and its staff is supplemented by a detailed description of inquiries received and CDIAC's response to those inquiries. An analysis and description of the preparation and distribution of numeric data packages, computer model packages, technical reports, newsletters, factsheets, specialty publications, and reprints is provided. Comments and descriptions of CDIAC's information management systems, professional networking, and special bilateral agreements are also described.

  9. Integrated information system for analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Performing complicated engineering analyses of a nuclear power plant requires storage and manipulation of a large amount of information, both data and knowledge. This information is characterized by its multidisciplinary nature, complexity, and diversity. The problems caused by inefficient and lengthy manual operations involving the data flow management within the framework of the safety-related analysis of a power plant can be solved by applying computer-aided engineering principles. These principles are the basis of the design of an integrated information storage system (IRIS). The basic idea is to create a computerized environment which includes both database and functional capabilities. Consideration and analysis of the data types, required data manipulation capabilities, and operational requirements resulted in the choice of an object-oriented database management system (OODBMS) as a development platform for solving the software engineering problems. Several advantages of OODBMSs over conventional relational database systems were found to be of crucial importance, especially the flexibility they provide for different data types and their extensibility potential. A detailed design of a data model is produced for the plant technical data and for the storage of analysis results. The overall system architecture was designed to assure the feasibility of integrating database capabilities with procedures and functions written in conventional algorithmic programming languages.

  10. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Supported by only scarce empirical data, most of the performance influencing factors in human reliability analysis (HRA) have to be assessed on the basis of the analyst's knowledge of human performance in the given tasks and their context. Therefore, the outcome of HRA may only be warranted by a proper application of this knowledge, based on sufficient information about the tasks and situations. However, most HRA methodologies, including newly developed ones, focus on the provision of cognitive models, error mechanisms, error types and analysis methods, while leaving information collection mostly in the hands of the analyst. This paper suggests structured information analysis (SIA), which helps HRA analysts collect and structure such information on tasks and contexts. The SIA consists of three parts: the scenario analysis, the goal-means analysis, and the cognitive function analysis. An expert evaluation showed that this three-part information analysis allowed more expressiveness and hence more confidence in the error prediction than ASEP HRA.

  11. Principal Component Analysis - A Powerful Tool in Computing Marketing Information

    Directory of Open Access Journals (Sweden)

    Constantin C.

    2014-12-01

    Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method which researchers can use to obtain valuable information for decision makers who need to solve the marketing problems a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis and the ability of Principal Component Analysis (PCA) to reduce a number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents, step by step, the process of applying PCA in marketing research when a large number of naturally collinear variables is used.
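    The workflow the abstract describes (collinear marketing variables reduced to a few uncorrelated principal components) can be sketched as follows; the simulated survey data are hypothetical, invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated marketing survey: three satisfaction items that are
# strongly collinear because they measure the same latent attitude.
base = rng.normal(size=(200, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)) for _ in range(3)])

# PCA via eigendecomposition of the correlation matrix.
Xs = (X - X.mean(0)) / X.std(0)          # standardize each variable
corr = np.corrcoef(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending order

scores = Xs @ eigvecs                    # component scores, uncorrelated
explained = eigvals / eigvals.sum()      # variance explained per component
print(explained)
```

    With three nearly identical items, the first component absorbs almost all the variance, which is exactly the multicollinearity reduction the paper discusses.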

  12. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao WANG; Yajuan ZHAO; Jing ZHANG; Ping ZHAO

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, of countries' or regions' technological capabilities, and of the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries' or regions' positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and to associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or a comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a
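    A minimal sketch of how the three legal-status indicators named above (grant rate, valid-patent rate, maintenance period) might be computed per applicant. The record structure and sample data are invented for illustration, not drawn from the paper.

```python
from dataclasses import dataclass

@dataclass
class PatentRecord:          # hypothetical minimal legal-status record
    applicant: str
    granted: bool
    still_valid: bool
    maintenance_years: float

records = [
    PatentRecord("A Corp", True, True, 12.0),
    PatentRecord("A Corp", True, False, 6.5),
    PatentRecord("A Corp", False, False, 0.0),
    PatentRecord("B Ltd", True, True, 9.0),
    PatentRecord("B Ltd", False, False, 0.0),
]

def indicators(recs):
    """Grant rate, valid-patent rate, and mean maintenance period of
    granted patents (assumes at least one granted patent per applicant)."""
    granted = [r for r in recs if r.granted]
    grant_rate = len(granted) / len(recs)
    valid_rate = sum(r.still_valid for r in granted) / len(granted)
    mean_maint = sum(r.maintenance_years for r in granted) / len(granted)
    return grant_rate, valid_rate, mean_maint

by_applicant = {
    name: indicators([r for r in records if r.applicant == name])
    for name in {"A Corp", "B Ltd"}
}
print(by_applicant)
```

    The three numbers per applicant are exactly what the paper plots on its four-quadrant bubble chart (two indicators as axes, the third as bubble size).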

  13. Petroleum labour market information supply demand analysis 2009-2020

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-03-15

    Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) in determining the petroleum industry's medium- and long-term employment needs; labour supply/demand considerations for the industry as a whole and industry-wide cost management; and an analysis of exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to retirements almost certainly would. In the growth scenario, it is likely the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  14. Petroleum labour market information supply demand analysis 2009-2020

    International Nuclear Information System (INIS)

    Since 2006, the petroleum industry has been interested in collaboration to determine labour demand and supply/demand gaps for the upstream petroleum industry. In 2006, the petroleum industry experienced strong employment growth and was having difficulty finding workers. Comprehensive, up-to-date labour market information and analysis are the key foundation for addressing labour supply/demand issues. This document presented labour market information on the petroleum industry in order to inform company retention and recruitment offices; government departments involved in development of labour market policies and programs; education and training institutions; guidance counsellors, employment centres and organizations that work with youth and labour supply pools; and job seekers. Specific topics that were discussed included two industry scenarios (growth and base case) in determining the petroleum industry's medium- and long-term employment needs; labour supply/demand considerations for the industry as a whole and industry-wide cost management; and an analysis of exploration and production, oil sands, services, and pipeline sectors to 2020. It was concluded that while new employment is not expected to lead to labour shortages within the pipeline sector, attrition due to retirements almost certainly would. In the growth scenario, it is likely the pipeline sector will be challenged by competition from the other petroleum industry sectors. tabs., figs., appendices.

  15. Thermal analysis and safety information for metal nanopowders by DSC

    International Nuclear Information System (INIS)

    Highlights: • Metal nanopowders are common and frequently employed in industry. • The experimental To values for nano iron powder were 140–150 °C. • Safety information can benefit the relevant metal powder industries. - Abstract: Metal nanopowders are common and frequently employed in industry. Iron is mostly applied in high-performance magnetic materials and in treating pollutants in groundwater. Zinc is widely used in brass, bronze, die casting metal, alloys, rubber, paints, etc. Nonetheless, some disasters induced by metal powders are due to the lack of related safety information. In this study, we applied differential scanning calorimetry (DSC) and thermal analysis software to evaluate thermal safety information, such as exothermic onset temperature (To), peak temperature (Tp), and heat of reaction (ΔH). For nano iron powder, the experimental To values were 140–150 °C, 148–158 °C, and 141–149 °C for 15 nm, 35 nm, and 65 nm powders, respectively. The ΔH was larger than 3900 J/g, 5000 J/g, and 3900 J/g for 15 nm, 35 nm, and 65 nm powders, respectively. This safety information can benefit the relevant metal powder industries in preventing accidents.

  16. Efficiency of crude oil markets: Evidences from informational entropy analysis

    International Nuclear Information System (INIS)

    The role of crude oil as the main energy source for the global economic activity has motivated the discussion about the dynamics and causes of crude oil price changes. An accurate understanding of the issue should provide important guidelines for the design of optimal policies and government budget planning. Using daily data for WTI over the period January 1986–March 2011, we analyze the evolution of the informational complexity and efficiency for the crude oil market through multiscale entropy analysis. The results indicated that the crude oil market is informationally efficient over the scrutinized period except for two periods that correspond to the early 1990s and late 2000s US recessions. Overall, the results showed that deregulation has improved the operation of the market in the sense of making returns less predictable. On the other hand, there is some evidence that the probability of having a severe US economic recession increases as the informational efficiency decreases, which indicates that returns from crude oil markets are less uncertain during economic downturns. - Highlights: ► Entropy concepts are used to characterize crude oil prices. ► An index of market efficiency is introduced. ► Except for periods of economic recession, the crude oil market is informationally efficient.
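    The multiscale entropy idea used above (coarse-graining a return series and measuring its entropy at each time scale) can be sketched as below. This uses a simple histogram-based Shannon entropy rather than the study's exact estimator, and the simulated returns are only a stand-in for the daily WTI series.

```python
import numpy as np

def shannon_entropy(x, bins=10):
    """Shannon entropy (in nats) of a 1-D signal via histogram discretization."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

def coarse_grain(x, scale):
    """Non-overlapping means at the given scale (the multiscale step)."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.02, size=5000)  # stand-in for daily log-returns

entropies = [shannon_entropy(coarse_grain(returns, s)) for s in (1, 2, 5, 10)]
print(entropies)
```

    Tracking such an entropy profile over rolling windows is one way to operationalize the "informational efficiency" index the abstract refers to: lower entropy indicates more predictable (less efficient) returns.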

  17. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    Directory of Open Access Journals (Sweden)

    Narkhede Sachin G.,

    2014-02-01

    Full Text Available Brain tumor segmentation in magnetic resonance (MR) images faces challenging issues caused by the weak correlation between magnetic resonance imaging (MRI) intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by the potential performance improvement in general automatic brain tumor segmentation systems, which are important for many medical and scientific applications. Brain MRI segmentation is a complex problem in the field of medical imaging despite the various methods presented to date. An MR image of the human brain can be divided into several sub-regions, especially soft tissues such as gray matter, white matter and cerebrospinal fluid. Although edge information is the main clue in image segmentation, it cannot yield good results in analyzing image content without being combined with other information. Our goal is to detect the position and boundary of tumors automatically. Experiments were conducted on real images, and the results show that the algorithm is flexible and convenient.

  18. Environmental Quality Information Analysis Center (EQIAC) operating procedures handbook

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, T.E. [Florida Univ., Gainesville, FL (United States); Das, S. [Oak Ridge National Lab., TN (United States)

    1992-08-01

    The Operating Procedures Handbook of the Environmental Quality Information Analysis Center (EQIAC) is intended to be kept current as EQIAC develops and evolves. Its purpose is to provide a comprehensive guide to the mission, infrastructure, functions, and operational procedures of EQIAC. The handbook is a training tool for new personnel and a reference manual for existing personnel. The handbook will be distributed throughout EQIAC and maintained in binders containing current dated editions of the individual sections. The handbook will be revised at least annually to reflect the current structure and operational procedures of EQIAC. The EQIAC provides information on environmental issues such as compliance, restoration, and environmental monitoring to the Air Force and DOD contractors.

  19. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of 'independent' component analysis (ICA), some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
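    To illustrate the quantity being estimated above, here is a crude histogram-based MI estimator on synthetic data. The MILCA method itself uses a k-nearest-neighbor estimator, so this is only a sketch of the concept, not the paper's algorithm.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y) in nats. (MILCA uses a k-nearest-
    neighbor estimator instead; this binned version is illustrative.)"""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(2)
s = rng.normal(size=10000)
dependent = mutual_information(s, s + 0.3 * rng.normal(size=10000))
independent = mutual_information(s, rng.normal(size=10000))
print(dependent, independent)
```

    Minimizing such pairwise MI over candidate demixing rotations is the core of MI-based source separation: independent pairs give MI near zero, dependent pairs a clearly positive value.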

  20. Analysis of Information Leakage in Quantum Key Agreement

    Institute of Scientific and Technical Information of China (English)

    LIU Sheng-li; ZHENG Dong; CHENG Ke-fei

    2006-01-01

    Quantum key agreement is one of the approaches to unconditional security. Since the 1980s, different protocols for quantum key agreement have been proposed and analyzed. A new quantum key agreement protocol was presented in 2004, and a detailed analysis of the protocol was given. The possible game played between legitimate users and the enemy was described: sitting in the middle, an adversary can play a "man-in-the-middle" attack to cheat the sender and receiver. The information leaked to the adversary is essential to the length of the final quantum secret key. It was shown how to determine the amount of information leaked to the enemy and the amount of uncertainty between the legitimate sender and receiver.

  1. Probabilistic analysis of the human transcriptome with side information

    CERN Document Server

    Lahti, Leo

    2011-01-01

    Understanding functional organization of genetic information is a major challenge in modern biology. Following the initial publication of the human genome sequence in 2001, advances in high-throughput measurement technologies and efficient sharing of research material through community databases have opened up new views to the study of living organisms and the structure of life. In this thesis, novel computational strategies have been developed to investigate a key functional layer of genetic information, the human transcriptome, which regulates the function of living cells through protein synthesis. The key contributions of the thesis are general exploratory tools for high-throughput data analysis that have provided new insights to cell-biological networks, cancer mechanisms and other aspects of genome function. A central challenge in functional genomics is that high-dimensional genomic observations are associated with high levels of complex and largely unknown sources of variation. By combining statistical ...

  2. Use of historical information in extreme storm surges frequency analysis

    Science.gov (United States)

    Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent

    2013-04-01

    The prevention of storm surge flood risks is critical for the protection and design of coastal facilities to very low probabilities of failure. Effective protection requires the use of a statistical analysis approach with a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation used to estimate storm surges corresponding to high return periods is seriously contaminated by sampling and model uncertainty if data are available for only a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data that occurred before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in the statistical modeling of historical information is that a perception threshold exists and that, during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-sea water marks left by extreme surges on coastal areas. It can also be retrieved from archives, old books, early newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula, to compute empirical probabilities based on systematic and historical data, is used in this communication paper. The objective of the present work is to examine the potential gain in estimation accuracy with the
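    As background to the plotting position formula mentioned above: a plotting position assigns an empirical exceedance probability to each ranked observation. The sketch below uses the standard Weibull formula p_i = i/(n+1) on hypothetical surge values; it is not the paper's formula, which additionally accounts for the perception threshold and the historical (nonsystematic) data.

```python
import numpy as np

surges = np.array([1.2, 0.9, 1.6, 1.1, 2.3, 1.4, 1.0, 1.8])  # metres, hypothetical

# Weibull plotting position: the i-th largest surge gets empirical
# exceedance probability p_i = i / (n + 1).
n = len(surges)
ranked = np.sort(surges)[::-1]                 # largest first
p_exceed = np.arange(1, n + 1) / (n + 1)

for surge, p in zip(ranked, p_exceed):
    print(f"surge {surge:.1f} m  ->  exceedance p = {p:.3f}")
```

    Plotting these empirical probabilities against a fitted distribution is how the quantile (design surge) for a given return period T = 1/p is checked visually.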

  3. Minimum Information Loss Cluster Analysis for Categorical Data

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, Jan

    2007-01-01

    Roč. 2007, Č. 4571 (2007), s. 233-247. ISSN 0302-9743. [International Conference on Machine Learning and Data Mining MLDM 2007 /5./. Leipzig, 18.07.2007-20.07.2007] R&D Projects: GA MŠk 1M0572; GA ČR GA102/07/1594 Grant ostatní: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords: Cluster Analysis * Categorical Data * EM algorithm Subject RIV: BD - Theory of Information Impact factor: 0.402, year: 2005

  4. The visual analysis of textual information: Browsing large document sets

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.; Pennock, K.; Fiegel, T.; Wise, J.; Pottier, M.; Schur, A.; Crow, V. [Pacific Northwest Lab., Richland, WA (United States); Lantrip, D. [California Univ., Santa Barbara, CA (United States)

    1995-05-01

    Visualization tools have been invaluable in the process of scientific discovery by providing researchers with insights gained through graphical tools and techniques. At PNL, the Multidimensional Visualization and Advanced Browsing (MVAB) project is extending visualization technology to the problems of intelligence analysis of textual documents by creating spatial representations of textual information. By representing an entire corpus of documents as points in a coordinate space of two or more dimensions, the tools developed by the MVAB team give the analyst the ability to quickly browse the entire document base and determine relationships among documents and publication patterns not readily discernible through traditional lexical means.
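    The spatialization step described above can be sketched in a few lines. This is not the MVAB team's algorithm (which is not described in the record); it is a generic LSA-style projection of term-count vectors onto two singular directions, with an invented toy corpus.

```python
import numpy as np

# Hedged sketch: map each document to a point in 2-D so that documents with
# shared vocabulary land near each other. Corpus and method are illustrative.
docs = ["nuclear waste transport", "waste transport route", "market price analysis"]
vocab = sorted({w for d in docs for w in d.split()})
counts = np.array([[d.split().count(w) for w in vocab] for d in docs], dtype=float)

# project term-count vectors onto the top two singular directions (LSA-style)
u, s, vt = np.linalg.svd(counts, full_matrices=False)
coords = u[:, :2] * s[:2]          # one (x, y) point per document
```

    Documents 0 and 1 share two terms and end up close together, while document 2 lands far away, which is the browsing behavior the abstract describes.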

  5. Analysis of Internet Information on Lateral Lumbar Interbody Fusion.

    Science.gov (United States)

    Belayneh, Rebekah; Mesfin, Addisu

    2016-07-01

    Lateral lumbar interbody fusion (LLIF) is a surgical technique that is being increasingly used. The authors' objective was to examine information on the Internet pertaining to the LLIF technique. An analysis was conducted of publicly accessible websites pertaining to LLIF. The following search engines were used: Google (www.google.com), Bing (www.bing.com), and Yahoo (www.yahoo.com). DuckDuckGo (www.duckduckgo.com) was an additional search engine used due to its emphasis on generating accurate and consistent results while protecting searchers' privacy and reducing advertisements. The top 35 websites providing information on LLIF from the 4 search engines were identified. A total of 140 websites were evaluated. Each website was categorized based on authorship (academic, private, medical industry, insurance company, other) and content of information. Using the search term lateral lumbar interbody fusion, 174,000 Google results, 112,000 Yahoo results, and 112,000 Bing results were obtained. DuckDuckGo does not display the number of results found for a search. From the 140 websites collected across the search engines, 78 unique websites were identified. Websites were authored by a private medical group in 46.2% of the cases, an academic medical group in 26.9% of the cases, and the biomedical industry in 5.1% of the cases. Sixty-eight percent of websites reported indications, and 24.4% reported contraindications. Benefits of LLIF were reported by 69.2% of websites. Thirty-six percent of websites reported complications of LLIF. Overall, the quality of information regarding LLIF on the Internet is poor. Spine surgeons and spine societies can assist in improving the quality of the information on the Internet regarding LLIF. [Orthopedics. 2016; 39(4):e701-e707.]. PMID:27111081

  6. Analysis of Financial Markets' Fluctuation by Textual Information

    Science.gov (United States)

    Izumi, Kiyoshi; Goto, Takashi; Matsui, Tohgoroh

    In this study, we proposed a new text-mining method for long-term market analysis. Using our method, we analyzed monthly price data of three financial markets: the Japanese government bond (JGB) market, the Japanese stock market, and the yen-dollar market. First, we extracted feature vectors from monthly reports of the Bank of Japan. Then, the trend of each market was estimated by regression analysis using the feature vectors. As a result, determination coefficients were over 75%, and market trends were explained well by the information extracted from the textual data. We compared the predictive power of our method across the markets: it estimated the JGB market best, followed by the stock market.
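    The regression step of this approach can be sketched as below. The feature values and trend figures are invented stand-ins for the paper's report-derived feature vectors; only the mechanics (least-squares fit plus a determination coefficient) mirror the description.

```python
import numpy as np

# Hedged sketch: regress a monthly market trend on text-derived feature
# vectors and compute the determination coefficient R^2. Data are illustrative.
X = np.array([[1.0, 0.2], [0.8, 0.5], [0.3, 0.9], [0.1, 1.2]])  # features per month
y = np.array([2.1, 1.9, 1.0, 0.6])                              # monthly trend

X1 = np.hstack([X, np.ones((len(X), 1))])        # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)    # ordinary least squares
pred = X1 @ coef

ss_res = float(((y - pred) ** 2).sum())
ss_tot = float(((y - y.mean()) ** 2).sum())
r2 = 1 - ss_res / ss_tot    # determination coefficient, as reported in the paper
```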

  7. An Examination of Canadian Information Professionals' Involvement in the Provision of Business Information Synthesis and Analysis Services

    Science.gov (United States)

    Patterson, Liane; Martzoukou, Konstantina

    2012-01-01

    The present study investigated the processes information professionals, working in a business environment, follow to meet business clients' information needs and particularly their involvement in information synthesis and analysis practices. A combination of qualitative and quantitative data was collected via a survey of 98 information…

  8. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on a complex user query. We used different theoretical retrieval models: probabilistic, such as InL2 (a Divergence from Randomness model), and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network comprising social links. We call a Directed Graph of Documents (DGD) a network constructed from documents and the social information provided by each of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
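    The link-analysis step named above is standard PageRank over a directed document graph. The sketch below is a textbook power-iteration implementation on an invented three-document DGD, not the authors' code.

```python
# Minimal PageRank sketch over a directed graph of documents (DGD-style).
# Graph and damping factor are illustrative assumptions.
def pagerank(links, d=0.85, iters=50):
    """links: dict mapping each node to the list of nodes it points to."""
    nodes = list(links)
    pr = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            # sum rank mass flowing in from every node that links to n
            incoming = sum(pr[m] / len(links[m]) for m in nodes if n in links[m])
            new[n] = (1 - d) / len(nodes) + d * incoming
        pr = new
    return pr

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(links)    # "C" collects the most link mass in this toy graph
```

    In a reranking setting, such scores would be interpolated with the retrieval-model scores rather than used alone.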

  9. Analysis of informational redundancy in the protein-assembling machinery

    Science.gov (United States)

    Berkovich, Simon

    2004-03-01

    Entropy analysis of the DNA structure does not reveal a significant departure from randomness, indicating a lack of informational redundancy. This signifies the absence of a hidden meaning in the genome text and supports the 'barcode' interpretation of DNA given in [1]. Lack of informational redundancy is a characteristic property of an identification label rather than of a message of instructions. Yet the randomness of DNA has to induce non-random structures in the proteins. Protein synthesis is a two-step process: transcription into RNA with gene splicing and formation of a structure of amino acids. Entropy estimations, performed by A. Djebbari, show typical values of redundancy of the biomolecules along these pathways: DNA gene 4%, proteins 15-40%. In gene expression, the RNA copy carries the same information as the original DNA template. Randomness is essentially eliminated only at the step of protein creation by a degenerate code. According to [1], the significance of the substitution of U for T with subsequent gene splicing is that these transformations result in a different pattern of RNA oscillations, so the vital DNA communications are protected against extraneous noise coming from the protein-making activities. 1. S. Berkovich, "On the 'barcode' functionality of DNA, or the Phenomenon of Life in the Physical Universe", Dorrance Publishing Co., Pittsburgh, 2003
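    The redundancy measure discussed above can be sketched as one minus the ratio of observed Shannon entropy to the maximum entropy of the alphabet. The sequences below are toy strings, not genomic data, and the exact estimator used by Djebbari is an assumption.

```python
from collections import Counter
from math import log2

# Sketch: redundancy R = 1 - H/Hmax, where H is the Shannon entropy of the
# symbol frequencies and Hmax = log2(alphabet size). Sequences are illustrative.
def redundancy(seq):
    counts = Counter(seq)
    n = len(seq)
    h = -sum(c / n * log2(c / n) for c in counts.values())
    hmax = log2(len(counts))        # entropy of a uniform distribution
    return 1 - h / hmax

r_random = redundancy("ACGTACGTTGCAACGT")  # uniform base usage: zero redundancy
r_biased = redundancy("AAAAAAAB")          # skewed usage: substantial redundancy
```

    A near-zero value for the balanced sequence mirrors the abstract's point that DNA shows little departure from randomness, while structured text or proteins score higher.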

  10. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management that is applicable to the IT universe, starting from the classical theory associated with the techniques of project management. It applies the theoretical analysis to the context of information technology in enterprises, as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for the development of field research with three large companies that develop Information Technology projects. The methodology predicted the development of a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature fits the environment of IT project management. The study also showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects; this depends on the learning acquired through a long and continuous process and on the sponsorship of senior management, and ultimately results in the model's merger into the company culture.

  11. Genetic association analysis of complex diseases incorporating intermediate phenotype information.

    Directory of Open Access Journals (Sweden)

    Yafang Li

    Full Text Available Genetic researchers often collect disease related quantitative traits in addition to disease status because they are interested in understanding the pathophysiology of disease processes. In genome-wide association (GWA) studies, these quantitative phenotypes may be relevant to disease development and serve as intermediate phenotypes or they could be behavioral or other risk factors that predict disease risk. Statistical tests combining both disease status and quantitative risk factors should be more powerful than case-control studies, as the former incorporates more information about the disease. In this paper, we proposed a modified inverse-variance weighted meta-analysis method to combine disease status and quantitative intermediate phenotype information. The simulation results showed that when an intermediate phenotype was available, the inverse-variance weighted method had more power than did a case-control study of complex diseases, especially in identifying susceptibility loci having minor effects. We further applied this modified meta-analysis to a study of imputed lung cancer genotypes with smoking data in 1154 cases and 1137 matched controls. The most significant SNPs came from the CHRNA3-CHRNA5-CHRNB4 region on chromosome 15q24-25.1, which has been replicated in many other studies. Our results confirm that this CHRNA region is associated with both lung cancer development and smoking behavior. We also detected three significant SNPs--rs1800469, rs1982072, and rs2241714--in the promoter region of the TGFB1 gene on chromosome 19 (p = 1.46×10⁻⁵, 1.18×10⁻⁵, and 6.57×10⁻⁶, respectively). The SNP rs1800469 is reported to be associated with chronic obstructive pulmonary disease and lung cancer in cigarette smokers. The present study is the first GWA study to replicate this result. Signals in the 3q26 region were also identified in the meta-analysis. We demonstrate the intermediate phenotype can potentially enhance the power of complex
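    The core of standard inverse-variance weighting (before the paper's modification, which is not detailed in the record) can be sketched as pooling per-analysis effect estimates by the reciprocal of their variances. The effect sizes below are invented, not the lung-cancer results.

```python
from math import sqrt

# Hedged sketch of fixed-effect inverse-variance weighting: combine
# (beta, standard error) pairs from separate analyses into a pooled estimate.
def inverse_variance_combine(estimates):
    """estimates: list of (beta, standard_error) tuples."""
    weights = [1.0 / se**2 for _, se in estimates]
    beta = sum(w * b for (b, _), w in zip(estimates, weights)) / sum(weights)
    se = sqrt(1.0 / sum(weights))
    return beta, se

# e.g. one estimate from disease status, one from the intermediate phenotype
pooled_beta, pooled_se = inverse_variance_combine([(0.30, 0.10), (0.20, 0.05)])
```

    The pooled standard error is always smaller than either input, which is the mechanism behind the power gain the abstract reports.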

  12. Analysis of Mining Terrain Deformation Characteristics with Deformation Information System

    Science.gov (United States)

    Blachowski, Jan; Milczarek, Wojciech; Grzempowski, Piotr

    2014-05-01

    Mapping and prediction of mining related deformations of the earth surface is an important measure for minimising threat to surface infrastructure, human population, the environment and safety of the mining operation itself arising from underground extraction of useful minerals. The number of methods and techniques used for monitoring and analysis of mining terrain deformations is wide and increasing with the development of geographical information technologies. These include for example: terrestrial geodetic measurements, global positioning systems, remote sensing, spatial interpolation, finite element method modelling, GIS based modelling, geological modelling, empirical modelling using the Knothe theory, artificial neural networks, fuzzy logic calculations and other. The aim of this paper is to introduce the concept of an integrated Deformation Information System (DIS) developed in geographic information systems environment for analysis and modelling of various spatial data related to mining activity and demonstrate its applications for mapping and visualising, as well as identifying possible mining terrain deformation areas with various spatial modelling methods. The DIS concept is based on connected modules that include: the spatial database - the core of the system, the spatial data collection module formed by: terrestrial, satellite and remote sensing measurements of the ground changes, the spatial data mining module for data discovery and extraction, the geological modelling module, the spatial data modeling module with data processing algorithms for spatio-temporal analysis and mapping of mining deformations and their characteristics (e.g. deformation parameters: tilt, curvature and horizontal strain), the multivariate spatial data classification module and the visualization module allowing two-dimensional interactive and static mapping and three-dimensional visualizations of mining ground characteristics. 
The system's functionality has been presented on

  13. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. PMID:22342971
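    The descriptive half of the Fisher framework can be sketched as below: the mean direction is the normalized vector resultant, and the mean resultant length (here called Rbar) feeds the concentration and dispersion estimates. The vectors are synthetic, not DTI eigenvectors.

```python
import numpy as np

# Sketch of Fisher descriptive statistics for directional data: the mean
# direction and the mean resultant length Rbar. Input vectors are invented.
def fisher_mean_direction(vectors):
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)   # force unit length
    resultant = v.sum(axis=0)
    r = np.linalg.norm(resultant)                      # resultant length R
    return resultant / r, r / len(v)                   # mean direction, Rbar

vecs = [[0.9, 0.1, 0.0], [1.0, 0.0, 0.1], [0.95, -0.05, 0.05]]
mean_dir, rbar = fisher_mean_direction(vecs)   # tightly clustered: Rbar near 1
```

    Rbar near 1 indicates tightly clustered directions (low dispersion); inferential steps such as Watson's F-test build on these quantities.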

  14. 75 FR 58374 - 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System)

    Science.gov (United States)

    2010-09-24

    ... AGENCY 2010 Release of CADDIS (Causal Analysis/Diagnosis Decision Information System) AGENCY... Decision Information System (CADDIS). This Web site was developed to help scientists find, develop, organize, and use environmental information to improve causal assessments of biological impairment....

  15. The bioethics discussion forum--an implementation of an Internet-based bioethics information analysis resource.

    OpenAIRE

    Derse, A. R.; Krogull, S. R.

    1995-01-01

    Ethical analysis is crucial to decision making in biomedicine and health care, necessitating both rapid access to diffusely disseminated sources of information pertinent to bioethics and promotion of analysis in the field of bioethics through a resource for information analysis. We developed the Bioethics Discussion Forum, an Internet-based information analysis resource, in order to supplement the Bioethics Online Service with an interactive information medium to meet the demand for such an i...

  16. Economic Efficiency Analysis for Information Technology in Developing Countries

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2009-01-01

    Full Text Available Problem statement: The introduction of Information Technology (IT) to government institutions in developing countries bears a great deal of risk of failure. The lack of qualified personnel, the lack of financial support and the lack of planning and proper justification are just a few of the causes of project failure. The research presented in this study focused on the justification issue of IT projects through the application of Cost Benefit Analysis (CBA) as part of a comprehensive Economic Efficiency Analysis (EEA) of IT projects, thus providing management with a decision-making tool which highlights existing and future problems and reduces the risk of failure. Approach: Cost-Benefit Analysis (CBA) based on Economic Efficiency Analysis (EEA) was performed on selected IT projects from ministries and key institutions in the government of Jordan using a well-established approach employed by the Federal Government of Germany (the KBSt approach). The approach was then modified and refined to suit the needs of developing countries so that it captured all the relevant elements of cost and benefits both quantitatively and qualitatively and included a set of guidelines for a data collection strategy. Results: When IT projects were evaluated using CBA, most cases yielded negative Net Present Value (NPV), even though some cases showed some reduction in operating cost starting from the third year of project life. However, when the CBA was applied as part of a comprehensive EEA by introducing qualitative aspects and urgency criteria, proper justification for new projects became feasible. Conclusion: The modified EEA represented a systematic approach which was well suited to the government of Jordan as a developing country. This approach was capable of dealing with the justification issue, the evaluation of existing systems and the urgency of replacing legacy systems. This study explored many of the challenges and inherited problems existing in the public sectors of developing
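    The NPV computation at the heart of the CBA step can be sketched in a few lines. The discount rate and cash flows are invented for illustration; they merely reproduce the pattern the abstract describes (large upfront cost, operating savings from year three, negative NPV overall).

```python
# Minimal NPV sketch for the cost-benefit step of an EEA: discount yearly net
# cash flows back to year zero. Project figures are illustrative assumptions.
def npv(rate, cashflows):
    """cashflows[0] is the initial (year-0) flow, typically negative."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical IT project: upfront cost, then operating savings from year 3
flows = [-100_000, -10_000, -5_000, 30_000, 35_000, 40_000]
project_npv = npv(0.08, flows)    # negative here, as in most cases in the study
```

    A comprehensive EEA would then weigh this quantitative result against qualitative aspects and urgency criteria before rejecting the project.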

  17. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way to automate the analysis of monitoring information, providing long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and identifying the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  18. A Study on the Information Analysis and Legal Affairs

    International Nuclear Information System (INIS)

    This report presents the results and contents of a study on nuclear information analysis and legal affairs. Our team worked to secure KAERI's best legal interests in the process of enacting nuclear laws and codes, in international collaborative studies, and in management. Moreover, as an international trend analysis, we studied the Japanese government's position on nuclear energy with respect to mitigating climate change and supplying sustainable energy; the improvement of radiation use in Japan showed the increasing contribution of radiation technology to the public. Results of studies on the nuclear policy of Kazakhstan, on forecasts of global trends in the nuclear area to 2030, and on the new U.S. government's nuclear energy policy are also explained. Lastly, we evaluated electricity-generation sources that reduce carbon dioxide emissions from the standpoint of greenhouse gas emission statistics, and assessed the greenhouse gas reduction capability of Korea's green electricity-generation sources.

  19. Overall analysis of meteorological information in the daeduk nuclear complex

    International Nuclear Information System (INIS)

    Troubleshooting of the tower structure, sensor installation, grounding, and cabling has been carried out, together with integrated field tests, establishment of the data acquisition system, and instrument calibration, since the completion of the main tower construction this year. A procedure guide was also prepared for effective management, covering instrument operation, calibration and repair. Real measurements have been taken during the two months since this October, after full integration of the equipment. The occurrence of a nocturnal inversion layer, fogging, and frequently stable atmospheric conditions were shown by the analysis of the measured data, which well represented the seasonal and regional characteristics of the site. Wireless data transmission to MIPS (Meteorological Information Processing System) is performed after collection in the DAS (data acquisition system), where environmental assessment can be carried out by the developed simulation programs for both normal operation and emergency cases. (Author)

  20. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Full Text Available Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and a survey, we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults’ collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  1. Maximal information component analysis: a novel non-linear network analysis method

    Directory of Open Access Journals (Sweden)

    Christoph Daniel Rau

    2013-03-01

    Full Text Available Background: Network construction and analysis algorithms provide scientists with the ability to sift through high-throughput biological outputs, such as transcription microarrays, for small groups of genes (modules) that are relevant for further research. Most of these algorithms ignore the important role of nonlinear interactions in the data, and the ability of genes to operate in multiple functional groups at once, despite clear evidence for both of these phenomena in observed biological systems. Results: We have created a novel co-expression network analysis algorithm that incorporates both of these principles by combining the information-theoretic association measure of the Maximal Information Coefficient (MIC) with an Interaction Component Model. We evaluate the performance of this approach on two datasets collected from a large panel of mice, one from macrophages and the other from liver, by comparing the two measures based on a measure of module entropy, GO enrichment and scale-free topology fit. Our algorithm outperforms a widely used co-expression analysis method, Weighted Gene Coexpression Network Analysis (WGCNA), in the macrophage data, while returning comparable results in the liver dataset when using these criteria. We demonstrate that the macrophage data has more nonlinear interactions than the liver dataset, which may explain the increased performance of our method, termed Maximal Information Component Analysis (MICA), in that case. Conclusions: In making our network algorithm more accurately reflect known biological principles, we are able to generate modules with improved relevance, particularly in networks with confounding factors such as gene-by-environment interactions.
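    The motivation for an information-theoretic association measure can be illustrated with a small experiment: a quadratic relation has near-zero Pearson correlation yet clearly positive mutual information. The sketch below uses a simple binned mutual-information estimate, not MIC itself, and the data are synthetic.

```python
import numpy as np

# Sketch: nonlinear dependence missed by correlation but caught by a (binned)
# mutual-information estimate. This illustrates the motivation for MIC only.
def binned_mutual_info(x, y, bins=8):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 2000)
y = x**2 + rng.normal(0, 0.05, 2000)        # symmetric, nonlinear relation
corr = float(np.corrcoef(x, y)[0, 1])       # near zero for a parabola
mi = binned_mutual_info(x, y)               # clearly positive
```

    A linear co-expression measure would score this gene pair near zero, while an information-theoretic one does not, which is the gap MICA is designed to close.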

  2. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. The analytic hierarchy process (AHP) has also been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...
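    The AHP step mentioned above can be sketched as deriving priority weights from a pairwise-comparison matrix via its principal eigenvector. The judgment values below are invented examples for three hypothetical risk criteria, not from the paper.

```python
import numpy as np

# Hedged AHP sketch: priority weights as the normalized principal eigenvector
# of a reciprocal pairwise-comparison matrix. Judgments are illustrative.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])   # criterion i vs criterion j importance

w = np.ones(3) / 3
for _ in range(100):              # power iteration converges to the Perron vector
    w = A @ w
    w = w / w.sum()
# w now holds the normalized priority weights for the three criteria
```

    In a fuzzy comprehensive evaluation, these weights would then combine with fuzzy membership grades for each criterion; a consistency ratio check on A is also customary before trusting the weights.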

  3. A Comparative Analysis of University Information Systems within the Scope of the Information Security Risks

    OpenAIRE

    Rustu Yilmaz; Yildiray Yalman

    2016-01-01

    Universities are the leading institutions that are the source of the educated population who both produce information and develop new products and new services by using information effectively, and who are needed in every area. Therefore, universities are expected to be institutions where information and information management are used efficiently. In the present study, topics such as infrastructure, operation, application, information, policy and human-based information secu...

  4. Collaborative for Historical Information and Analysis: Vision and Work Plan

    Directory of Open Access Journals (Sweden)

    Vladimir Zadorozhny

    2013-02-01

    Full Text Available This article conveys the vision of a world-historical dataset, constructed in order to provide data on human social affairs at the global level over the past several centuries. The construction of this dataset will allow the routine application of tools developed for analyzing “Big Data” to global, historical analysis. The work is conducted by the Collaborative for Historical Information and Analysis (CHIA. This association of groups at universities and research institutes in the U.S. and Europe includes five groups funded by the National Science Foundation for work to construct infrastructure for collecting and archiving data on a global level. The article identifies the elements of infrastructure-building, shows how they are connected, and sets the project in the context of previous and current efforts to build large-scale historical datasets. The project is developing a crowd-sourcing application for ingesting and documenting data, a broad and flexible archive, and a “data hoover” process to locate and gather historical datasets for inclusion. In addition, the article identifies four types of data and analytical questions to be explored through this data resource, addressing development, governance, social structure, and the interaction of social and natural variables.

  5. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  6. Applications of Geographic Information System (GIS) analysis of Lake Uluabat.

    Science.gov (United States)

    Hacısalihoğlu, Saadet; Karaer, Feza; Katip, Aslıhan

    2016-06-01

    Lake Uluabat is one of the most important wetlands in Turkey because of its rich biodiversity, lying on a migratory bird route with almost all its shores covered by submerged plants. The lake has been protected by the Ramsar Convention since 1998. However, the lake is threatened by natural and anthropogenic stressors as a consequence of its location. Geographic Information System (GIS) analysis is a tool that has been widely used, especially for water quality management, in recent years. This study aimed to investigate the water quality and to determine the most polluted points using GIS analysis of the lake. Temperature, pH, dissolved oxygen, chemical oxygen demand, Kjeldahl nitrogen, total phosphorus, chlorophyll-a, arsenic, boron, iron, and manganese were monitored monthly from June 2008 to May 2009, with samples taken from 8 points in the lake. The effects of pH and the relations of temperature and Chl-a with the other water quality parameters and metals were found to be statistically significant. Data were mapped using ArcGIS 9.1 software and were assessed according to the Turkish Water Pollution Control Regulations (TWPCR). The research also focused on classifying and mapping the water quality in the lake by using the spatial analysis functions of GIS. As a result, it was determined that Lake Uluabat belonged to the 4th class, i.e., highly polluted water, including any water of lower quality. A remarkable portion of the pollution in the water basin was attributed to domestic wastewater discharges, industrial and agricultural activities, and mining. PMID:27154052

  7. ANALYSIS OF INFORMATIONAL COMPETENCY LEVEL OF FUTURE MANAGERS

    Directory of Open Access Journals (Sweden)

    Aleksandra Karpechenko

    2012-01-01

    Full Text Available This article describes the results of an estimation of future managers' informational competency level, performed with a developed method that includes diagnostics of the maturity of the informational thesaurus and of three main informational competency components. It is proposed to develop informational competency through the implementation of an integrative course.

  8. ANALYSIS ON AGRICULTRUAL INFORMATION DEMAND --Case of Jilin Province

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui-min; JIANG Hui-ming

    2005-01-01

    With the rapid development of agricultural informatization in the world, the demand for agricultural information has become a focus in the international agriculture and information fields. Based on an investigation, this paper presents four characteristics of the demand for agricultural information in China: regionality, seasonality, great potential demand, and variation in kind and level. The factors influencing the demand for agricultural information were analyzed by the Ordinary Least Squares (OLS) method. The result shows that, of all the factors influencing agricultural information demand, the most important one is the economy; the second is the facility of information transmission, followed by the knowledge and education of the user, the credibility of the agricultural information service system, and the production situation. Taking Jilin Province as an example, this article also elaborates the status of agricultural information demand, derives a regression model of agricultural information demand, and verifies it with a survey in rural Jilin.

  9. INFORMATION CULTURE OF EDUCATION MANAGER: LEVEL AND FACTOR ANALYSIS

    OpenAIRE

    Koshevenko, Svetlana; Silchenkova, Svetlana

    2016-01-01

    The article presents the results of a monographic research of the information culture of the education manager. It includes the author's definition of the concept of "information culture of the education manager" and a functional model of this culture consisting of the following components: motivational, normative-valuable, cognitive-operational, communicative, informational, critical, educational and creative. The article includes information about level and factor ...

  10. The system for statistical analysis of logistic information

    Directory of Open Access Journals (Sweden)

    Khayrullin Rustam Zinnatullovich

    2015-05-01

    Full Text Available The current problem for managers in logistic and trading companies is improving operational business performance and developing the logistics support of sales. Developing logistics sales support involves a set of works for the development of existing warehouse facilities, including both a detailed description of the work performed and the timing of its implementation. Logistics engineering of a warehouse complex includes such tasks as: determining the number and types of technological zones, calculating the required number of loading-unloading places, developing storage structures, developing pre-sales preparation zones, developing specifications of storage types, selecting loading-unloading equipment, detailed planning of the warehouse logistics system, creating architectural-planning decisions, selecting information-processing equipment, etc. The currently used ERP and WMS systems do not allow solving the full list of logistics engineering problems. In this regard, the development of specialized software products that take into account the specifics of warehouse logistics, and the subsequent integration of this software with ERP and WMS systems, is a current task. In this paper we propose a system for the statistical analysis of logistics information, designed to meet the challenges of logistics engineering and planning. The proposed specialized software is intended to improve the efficiency of the operating business and the development of the logistics support of sales. The system is based on the methods of statistical data processing, the methods of assessment and prediction of logistics performance, the methods for the determination and calculation of the data required for registration, storage and processing of metal products, as well as the methods for planning the reconstruction and development

  11. Finite-Block-Length Analysis in Classical and Quantum Information Theory

    OpenAIRE

    Hayashi, Masahito

    2016-01-01

    This is a review article of finite-block-length analysis in classical and quantum information theory for non-specialists. Transmitting information is a fundamental technology; however, there are several demands on this transmission. The research area that studies such problems is called information theory. In information transmission, the information is transmitted via a physical medium. Hence, the analysis of this problem might depend on the properties of the physical medium. Indeed, while i...

  12. Situated student learning and spatial informational analysis for environmental problems

    Science.gov (United States)

    Olsen, Timothy Paul

    Ninth and tenth grade high school Biology student research teams used spatial information analysis tools to site a prairie restoration plot on a 55-acre campus during a four-week environmental unit. Students made use of innovative technological practices by applying geographic information systems (GIS) approaches to solving environmental and land use problems. Student learning was facilitated by starting with the students' initial conceptions of computing, the local landscape, and the biological environment, and then guiding them through a problem-based science project process. The project curriculum was framed by the perspective of legitimate peripheral participation (Lave & Wenger, 1991), in which students were provided with learning opportunities designed to allow them to act like GIS practitioners. Sociocultural lenses for learning were employed to create accounts of human mental processes that recognize the essential relationship between these processes and their cultural, historical, and institutional settings (Jacob, 1997; Wertsch, 1991). This research investigated how student groups' meaning-making actions were mediated by GIS tools on the periphery of a scientific community of practice. Research observations focused on supporting interpretations of learners' socially constructed actions and the iterative building of assertions from multiple sources. These included the artifacts students produced, the tools they used, the cultural contexts that constrained their activity, and how people begin to adopt ways of speaking (speech genres) of the referent community to negotiate meanings and roles. Students gathered field observations and interpreted attributes of landscape entities from the GIS data to advocate for an environmental decision. However, even while gaining proficiency with GIS tools, most students did not begin to appropriate roles from the GIS community of practice. Students continued to negotiate their project actions simply as school exercises motivated by

  13. An Analysis of the Information Behaviour of Geography Teachers in a Developing African Country–Lesotho

    Directory of Open Access Journals (Sweden)

    Constance BITSO

    2012-08-01

    Full Text Available Information behaviour studies have the potential to inform the design of effective information services that incorporate the information needs, information-seeking behaviour, and preferred information sources of target users; hence a doctoral study was conducted on the information behaviour of geography teachers in Lesotho with the aim of guiding the design and implementation of an information service model for these teachers. This paper focuses on the analysis of the information behaviour of geography teachers in Lesotho as a contribution of original knowledge on geography teachers' information behaviour. The analysis established the information behaviour of geography teachers using the information behaviour concept that encompasses information needs, information-seeking, and information sources. Data were collected through focus group discussions and analyzed through conceptual content analysis. The analysis reveals that these geography teachers need current and accurate information covering a variety of aspects of teaching and learning, such as content, pedagogy, classroom management, and learners' assessment. Owing to the increasing number of orphans in schools as a result of the HIV and AIDS pandemic, most teachers expressed the need for information on social assistance for orphans and vulnerable children. Recommendations include information literacy training for teachers and access to the Internet in schools, including the use of open access journals on the Internet by the teachers.

  14. An Analysis Of Underlying Competencies And Computer And Information Technology Learning Objectives For Business Analysis

    OpenAIRE

    Quigley, Ryan Thomas

    2013-01-01

    This research examines whether the Computer and Information Technology (CIT) department at Purdue University should develop a business analyst concentration. The differences between system and business analysts, evolution of the business analyst profession, job demand and trends, and applicable model curricula were explored to support this research. Review of relevant literature regarding the topics suggested that a business analyst concentration should be developed. A gap analysis was perfor...

  15. Using Failure Information Analysis to Detect Enterprise Zombies

    Science.gov (United States)

    Zhu, Zhaosheng; Yegneswaran, Vinod; Chen, Yan

    We propose failure information analysis as a novel strategy for uncovering malware activity and other anomalies in enterprise network traffic. A focus of our study is detecting self-propagating malware such as worms and botnets. We begin by conducting an empirical study of transport- and application-layer failure activity using a collection of long-lived malware traces. We dissect the failure activity observed in this traffic in several dimensions, finding that their failure patterns differ significantly from those of real-world applications. Based on these observations, we describe the design of a prototype system called Netfuse to automatically detect and isolate malware-like failure patterns. The system uses an SVM-based classification engine to identify suspicious systems and clustering to aggregate failure activity of related enterprise hosts. Our evaluation using several malware traces demonstrates that the Netfuse system provides an effective means to discover suspicious application failures and infected enterprise hosts. We believe it would be a useful complement to existing defenses.
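
    As a much simpler stand-in for Netfuse's SVM-based engine, the core idea of flagging hosts by anomalous failure activity can be sketched with a plain failure-ratio threshold; the event format, the 0.5 threshold, and the host addresses are assumptions for illustration, not Netfuse's actual design:

```python
# Hedged sketch of failure information analysis: flag enterprise hosts whose
# transport-layer failure ratio is anomalously high. A real system like Netfuse
# would feed richer failure features into an SVM; this is only the intuition.

from collections import defaultdict

def failure_ratio(events):
    """events: list of (host, outcome) pairs, outcome 'ok' or 'fail'."""
    total, fails = defaultdict(int), defaultdict(int)
    for host, outcome in events:
        total[host] += 1
        if outcome == "fail":
            fails[host] += 1
    return {h: fails[h] / total[h] for h in total}

def suspicious_hosts(events, threshold=0.5, min_events=5):
    """Hosts with enough observed events and a failure ratio above threshold."""
    ratios = failure_ratio(events)
    counts = defaultdict(int)
    for host, _ in events:
        counts[host] += 1
    return sorted(h for h, r in ratios.items()
                  if r >= threshold and counts[h] >= min_events)

events = ([("10.0.0.5", "fail")] * 8 + [("10.0.0.5", "ok")] * 2 +
          [("10.0.0.9", "ok")] * 9 + [("10.0.0.9", "fail")])
print(suspicious_hosts(events))  # ['10.0.0.5']
```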

  16. [Mutual information-based correlation analysis of herbs against insomnia].

    Science.gov (United States)

    Tian, Jin; Liu, Ren-quan

    2015-10-01

    This paper aims to analyze Professor Guo Rongjuan's medication experience in insomnia therapy based on the Traditional Chinese Medicine (TCM) Inheritance Support Platform. First, the TCM formulae prescribed by Professor Guo for insomnia therapy were collected from the platform. Next, unsupervised data mining algorithms, including apriori, modified mutual information, and entropy clustering of complex systems, were applied to obtain the frequencies of different herbs and identify the association rules among them. Accordingly, we can gain new insights into Professor Guo's medication experience in insomnia therapy. Based on an analysis of 3,084 formulae, we determined the frequencies of the herbs in the formulae and identified the association rules among these herbs. Finally, 41 core combinations and 7 new formulae were obtained. The identified medication experience conforms with Professor Guo's views on the etiology and pathogenesis of insomnia: "pathogenic fire derived from stagnation of liver-Qi (Gan Yu Hua Huo)" is the core pathogenesis of insomnia, while "liver stagnation and spleen deficiency" and "chronic illness transferred to kidney" are its main features. The TCM Inheritance Support Platform is of great practical value for mining the clinical experience of famous TCM doctors. PMID:26975117
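
    The mutual-information step can be sketched on binary herb presence/absence data: estimate I(A;B) from empirical co-occurrence counts across formulae. The herb names and formulae below are invented examples, not Professor Guo's prescriptions:

```python
# Hedged sketch of pairwise mutual information between herbs, computed from
# empirical presence/absence counts over a set of formulae (invented data).

from math import log2

def mutual_information(formulae, a, b):
    """Empirical mutual information (bits) between presence of herbs a and b."""
    n = len(formulae)
    p = {}
    for xa in (0, 1):
        for xb in (0, 1):
            count = sum(1 for f in formulae
                        if (a in f) == bool(xa) and (b in f) == bool(xb))
            p[(xa, xb)] = count / n
    pa = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}  # marginal of a
    pb = {x: p[(0, x)] + p[(1, x)] for x in (0, 1)}  # marginal of b
    mi = 0.0
    for (xa, xb), pxy in p.items():
        if pxy > 0:
            mi += pxy * log2(pxy / (pa[xa] * pb[xb]))
    return mi

formulae = [{"suanzaoren", "chaihu"}, {"suanzaoren", "chaihu"},
            {"suanzaoren", "chaihu"}, {"fuling"},
            {"fuling", "suanzaoren"}, {"fuling"}]
# chaihu never appears without suanzaoren here, so their MI is high.
print(round(mutual_information(formulae, "suanzaoren", "chaihu"), 3))
```

    Ranking all herb pairs by this score is one simple way to surface candidate core combinations.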

  17. Enhancing multilingual latent semantic analysis with term alignment information.

    Energy Technology Data Exchange (ETDEWEB)

    Chew, Peter A.; Bader, Brett William

    2008-08-01

    Latent Semantic Analysis (LSA) is based on the Singular Value Decomposition (SVD) of a term-by-document matrix, identifying relationships among terms and documents from co-occurrence patterns. Among the multiple ways of computing the SVD of a rectangular matrix X, one approach is to compute the eigenvalue decomposition (EVD) of a square composite matrix of 2 x 2 blocks, with X and X^T in the off-diagonal blocks and zero matrices in the diagonal blocks. We point out that significant value can be added to LSA by filling in some of the values in the diagonal blocks (corresponding to explicit term-to-term or document-to-document associations) and computing a term-by-concept matrix from the EVD. For the case of multilingual LSA, we incorporate information on cross-language term alignments of the same sort used in Statistical Machine Translation (SMT). Since all elements of the proposed EVD-based approach can rely entirely on lexical statistics, hardly any price is paid for the improved empirical results. In particular, the approach, like LSA or SMT, can still be generalized to virtually any language(s); computation of the EVD takes similar resources to that of the SVD, since all the blocks are sparse; and the results of the EVD are just as economical as those of the SVD.
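
    The block-matrix identity underlying this approach (the eigenvalues of B = [[0, X], [X^T, 0]] are plus/minus the singular values of X) can be checked numerically on a toy matrix. The shifted power iteration below is a minimal sketch under that assumption, not the authors' solver:

```python
# Verify on a tiny X that the top eigenvalue of the composite matrix
# B = [[0, X], [X^T, 0]] equals the top singular value of X. The shift makes
# the largest eigenvalue strictly dominant (B itself has +/- sigma pairs,
# on which plain power iteration would oscillate).

from math import sqrt

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

def dominant_eigenvalue(M, iters=500):
    """Largest eigenvalue of a symmetric matrix via shifted power iteration."""
    n = len(M)
    shift = max(sum(abs(x) for x in row) for row in M)  # Gershgorin-style bound
    S = [[M[i][j] + (shift if i == j else 0.0) for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = matvec(S, v)
        norm = sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    rayleigh = sum(vi * wi for vi, wi in zip(v, matvec(S, v)))
    return rayleigh - shift

X = [[3.0, 0.0],
     [4.0, 0.0]]  # largest singular value of X is 5 (first column has norm 5)

# Build B = [[0, X], [X^T, 0]] as described in the abstract.
m, n = len(X), len(X[0])
B = [[0.0] * (m + n) for _ in range(m + n)]
for i in range(m):
    for j in range(n):
        B[i][m + j] = X[i][j]
        B[m + j][i] = X[i][j]

print(round(dominant_eigenvalue(B), 6))  # 5.0
```

    Filling in the diagonal blocks, as the paper proposes, perturbs this spectrum with explicit term-term or document-document associations while keeping B symmetric and sparse.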

  18. FACTORS INFLUENCING INFORMATION TECHNOLOGY ADOPTION: A CROSS-SECTIONAL ANALYSIS

    OpenAIRE

    Stroade, Jeri L.; Schurle, Bryan W.

    2003-01-01

    This project will explore information technology adoption issues. The unique characteristics of information technology will be discussed. Advantages and disadvantages to adoption will also be identified. Finally, a statistical model of Internet adoption will be developed to estimate the impacts of certain variables on the underlying process of information technology adoption.

  19. Integration of IT-Security Aspects into Information Demand Analysis and Patterns

    OpenAIRE

    Sandkuhl, Kurt; Matulevicius, Raimundas; Kirikova, Marite; Ahmed, Naved

    2015-01-01

    Information logistics in general addresses demand-oriented information supply in organizations. IT-security has not received much attention in information logistics research. However, integration of security aspects into information logistics methods could be useful for application contexts with strong security requirements. As a contribution to this aspect, the paper investigates the possibility to extend information demand patterns (IDP) and information demand analysis (IDA) with security e...

  20. Consumer information on fetal heart rate monitoring during labor: a content analysis.

    Science.gov (United States)

    Torres, Jennifer; De Vries, Raymond; Low, Lisa Kane

    2014-01-01

    Electronic fetal monitoring (EFM) is used for the majority of births that occur in the United States. While there are indications for the use of EFM for women with high-risk pregnancies, its use in low-risk pregnancies is less evidence-based. In low-risk women, the use of EFM is associated with an increased risk for cesarean birth compared with the use of intermittent auscultation of the fetal heart rate. The purpose of this investigation was to evaluate the existence of evidence-based information on fetal heart rate monitoring in popular consumer-focused maternity books and Web sites. Content analysis of information in consumer-oriented Web sites and books was completed using the NVivo software (QSR International, Melbourne, Australia). Themes identified included lack of clear terminology when discussing fetal monitoring, use of broad categories such as low risk and high risk, limited presentation of information about intermittent auscultation, and presentation of EFM as the standard of care, particularly upon admission into the labor unit. More than one-third of the sources did not mention auscultation, and conflicting information about monitoring methods was presented. The availability of accurate, publicly accessible information offers consumers the opportunity to translate knowledge into the power to seek evidence-based care practices during their maternity care experience. PMID:24781772

  1. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Full Text Available Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditionally, information security risk analysis relies on quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. The analytic hierarchy process (AHP) has also been widely used in security assessment. A future research direction may be the development and application of soft computing techniques such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks, and hybrid models; hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed. The proposed approach combines AHP with the fuzzy comprehensive method.
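
    The AHP step common to such risk analyses can be sketched with the geometric-mean weight approximation and Saaty's consistency ratio; the 3 x 3 judgment matrix below is illustrative, not from the paper:

```python
# Hedged AHP sketch: derive risk-factor weights from a pairwise comparison
# matrix via the geometric-mean approximation, then check consistency.

from math import prod

def ahp_weights(A):
    """Geometric-mean approximation of AHP priority weights."""
    n = len(A)
    gmeans = [prod(row) ** (1.0 / n) for row in A]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def consistency_ratio(A, w):
    """Saaty consistency ratio; judgments are usually accepted when CR < 0.1."""
    n = len(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n   # estimate of lambda_max
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]             # Saaty's random index
    return ci / ri

# Illustrative pairwise judgments over three risk factors
# (e.g. confidentiality vs. integrity vs. availability impact).
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w])           # [0.637, 0.258, 0.105]
print(round(consistency_ratio(A, w), 3))  # well below the 0.1 threshold
```

    In a full AHP/fuzzy-comprehensive evaluation, these weights would then be combined with fuzzy membership grades per factor to score the overall risk level.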

  2. Analysis of Information Quality in event triggered Smart Grid Control

    DEFF Research Database (Denmark)

    Kristensen, Thomas le Fevre; Olsen, Rasmus Løvenstein; Rasmussen, Jakob Gulddahl

    2015-01-01

    The integration of renewable energy sources into the power grid requires added control intelligence, which imposes new communication requirements onto the future power grid. Since large-scale implementation of new communication infrastructure is infeasible, we consider methods of increasing the dependability of existing networks. We develop models for network delays and information dynamics, and use these to model information quality for three given information access schemes in an event triggered control scenario. We analyse the impact of model parameters, and show how the optimal choice of information access scheme depends on network conditions as well as trade-offs between information quality, network resources and control reactivity.

  3. An analysis of human intelligence losses from a secure information system

    Energy Technology Data Exchange (ETDEWEB)

    Bott, T.F.

    1992-05-01

    A systematic method for evaluating the performance of an information security system is described. The method includes procedures for observable signature determination, analysis of information loss paths, security system faults, and a rudimentary information importance scheme. A simple quantification of the probability of information disclosure is developed based on historical data sources. Some examples of applications are given.

  4. Managing Approach Plate Information Study (MAPLIST): An Information Requirements Analysis of Approach Chart Use

    Science.gov (United States)

    Ricks, Wendell R.; Jonnson, Jon E.; Barry, John S.

    1996-01-01

    Adequately presenting all necessary information on an approach chart represents a challenge for cartographers. Since many tasks associated with using approach charts are cognitive (e.g., planning the approach and monitoring its progress), and since the characteristic of a successful interface is one that conforms to the users' mental models, understanding pilots' underlying models of approach chart information would greatly assist cartographers. To provide such information, a new methodology was developed for this study that enhances traditional information requirements analyses by combining psychometric scaling techniques with a simulation task to provide quantifiable links between pilots' cognitive representations of approach information and their use of approach information. Results of this study should augment previous information requirements analyses by identifying what information is acquired, when it is acquired, and what presentation concepts might facilitate its efficient use by better matching the pilots' cognitive model of the information. The primary finding in this study indicated that pilots mentally organize approach chart information into ten primary categories: communications, geography, validation, obstructions, navigation, missed approach, final items, other runways, visibility requirement, and navigation aids. These similarity categories were found to underlie the pilots' information acquisitions, other mental models, and higher level cognitive processes that are used to accomplish their approach and landing tasks.

  5. Commercial babassu mesocarp: microbiological evaluation and analysis of label information

    Directory of Open Access Journals (Sweden)

    Laisa Lis Fontinele Sá

    2015-10-01

    Full Text Available The babassu mesocarp is easily found in supermarkets and other commercial establishments in Brazil. Despite its widespread use in both the pharmaceutical and food industries, the literature has no scientific studies about microbial contamination of these products, nor about the legal information displayed on their labels. The aim of this study was to evaluate the level of microbiological contamination in babassu mesocarp sold in commercial establishments in Teresina-PI/Brazil, as well as the conformity of label information with the rules of the Brazilian Sanitary Surveillance Agency (ANVISA). Ten samples of babassu mesocarp powder sold in the region were selected for study. Determination of heterotrophic microorganisms was carried out using the seeding technique on Plate Count Agar (CFU/g). Sabouraud Dextrose Agar medium was used for the cultivation of fungi. For the analysis of label information, resolutions (RDC) 259 of September 20, 2002, and 360 of December 23, 2003, as well as law 10,674 of May 16, 2003, were used. The results showed high contamination by heterotrophic bacteria and fungi for all samples. Most of the sample labels complied with the rules. The results therefore suggest more comprehensive monitoring of these microorganisms and the development of more effective methods for the decontamination of these products sold in Brazil. Keywords: Babassu. Label. Contamination. Food. Pharmacy.

  6. MULTIPLE CRITERIA ANALYSIS FOR EVALUATION OF INFORMATION SYSTEM RISK

    OpenAIRE

    Olson, David L.; DESHENG DASH WU

    2011-01-01

    Information technology (IT) involves a wide set of risks. Enterprise information systems are a major developing form of information technology involving their own set of risks, thus creating potential blind spots. This paper describes risk management issues involved in enterprise resource planning (ERP) systems, which have a high impact on organizations due to their high cost and their pervasive impact on organizational operations. Alternative means of acquiring ERP systems, to include outsourci...

  7. Analysis on Uncertain Information and Actions for Preventing Collision

    Institute of Scientific and Technical Information of China (English)

    胡甚平; FANG; Quan-gen

    2007-01-01

    Discusses and analyzes the causes and characteristics of the uncertainties in the information and actions for preventing collision at sea, based on basic knowledge of collision avoidance. Describes the ways and functions of investigating these uncertainties with navigation simulators. Puts forward some suggestions for officers to master the skills of recognizing these uncertainties of information and actions through simulator training during the MET course.

  8. Content of information ethics in the Holy Quran: an analysis

    Directory of Open Access Journals (Sweden)

    Akram Mehrandasht

    2014-06-01

    Full Text Available Background and Objectives: Information ethics, according to Islam, means observing human rights and morals in dealing with information transmission and provision, which must be based on honesty and truthfulness. As Islam highly values society and social issues, it holds that human behaviors and interactions are strongly interconnected in a society. Committing to ethical issues regarding information is a responsibility of all members of a society according to the Quran, and Islam prohibits Believers from unmasking people's private information. Methods: For the purpose of the current study, all the words and verses related to information ethics were identified in the Quran. The data were then evaluated and analyzed from the perspective of scholars and experts of the Quran. Results: From the viewpoint of Islam, the purpose of information ethics is to observe and protect human rights in society and to show the positive effects of the right information on all aspects of human life. Conclusions: To implement information ethics, it is necessary to be aware of the position Islam has given this concept. Following Quran guidelines and the manners of the family members of the Prophet Muhammad (pbuh) results in a society in which information ethics is observed optimally.

  9. Analysis and improvement of vehicle information sharing networks

    Science.gov (United States)

    Gong, Hang; He, Kun; Qu, Yingchun; Wang, Pu

    2016-06-01

    Based on large-scale mobile phone data, mobility demand was estimated and locations of vehicles were inferred in the Boston area. Using the spatial distribution of vehicles, we analyze the vehicle information sharing network generated by the vehicle-to-vehicle (V2V) communications. Although a giant vehicle cluster is observed, the coverage and the efficiency of the information sharing network remain limited. Consequently, we propose a method to extend the information sharing network's coverage by adding long-range connections between targeted vehicle clusters. Furthermore, we employ the optimal design strategy discovered in square lattice to improve the efficiency of the vehicle information sharing network.
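
    The effect of adding long-range connections can be illustrated on a toy graph: model vehicle clusters as nodes on a ring and compare the average shortest-path length before and after one added chord. The ring-of-clusters model is our simplification for illustration, not the paper's Boston data:

```python
# Sketch of the paper's idea that a few long-range V2V links between distant
# vehicle clusters improve the efficiency of the sharing network: adding one
# chord to a ring shortens the average shortest-path length (BFS over an
# unweighted graph).

from collections import deque

def avg_path_length(n, edges):
    """Average shortest-path length over all connected node pairs."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

n = 20
ring = [(i, (i + 1) % n) for i in range(n)]
before = avg_path_length(n, ring)
after = avg_path_length(n, ring + [(0, n // 2)])  # one long-range connection
print(round(before, 3), round(after, 3))  # the chord shortens average paths
```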

  10. INFORMATION ECONOMICS, INSTRUMENT OF ANALYSIS IN NEW MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Maria Zenovia GRIGORE

    2009-10-01

    Full Text Available In the New Microeconomics, the Walrasian postulate of perfect information is replaced by two theorems concerning the production of information: (1) the acquisition and dissemination of information raise production costs; (2) specialisation in information activity is efficient: there are specialists in the production or use of information. Information economics, or the economics of information, studies decisions in transactions where one party has more or better information than the other. Incomplete and asymmetric information can generate two types of risks: adverse selection, which can be reduced through signaling games and screening games, and moral hazard, studied within agency theory through the principal-agent model. The principal-agent model treats the difficulties that arise when a principal hires an agent to pursue the interests of the former. There are mechanisms that align the interests of the agent with those of the principal, such as commissions, promotions, profit sharing, efficiency wages, deferred compensation, fear of firing, and so on.

  11. Stages of information search in the internet for political phenomena analysis

    OpenAIRE

    Канюк, Надія Вікторівна; Пелещишин, Андрій Миколайович

    2013-01-01

    This paper deals with the topical issue of efficient search in the World Wide Web for information about political phenomena that is subject to further analysis (for example, content or event analysis). Three requirements for information intended for analyzing political phenomena are outlined in the paper: it must be up to date, credible, and available in sufficient quantity. Six stages of information search and selection based on these requirements are also proposed: determination of geographic...

  12. Identification of Data Element Categories for Clinical Nursing Information Systems via Information Analysis of Nursing Practice

    OpenAIRE

    Graves, Judith R.; Corcoran, Sheila

    1988-01-01

    In order to empirically identify data elements for content of a Clinical Nursing Information System serving cardiovascular nurses, the question “What supplemental information (or data, or knowledge) do nurses seek in order to make decisions about patient care?” was asked. Data was collected from nurses working all shifts in three different agencies: a community hospital, a large private teaching hospital, and a large public teaching hospital. For each instance of supplemental information-seek...

  13. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct-viewing survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player whose desire is strongest initiates the move, and the overall state transition matrix of the information system can be derived. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, and the obtained results, with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of the system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative survivability analysis, and it has a good application prospect in practice.
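
    The Markov step can be sketched as follows: given a transition matrix over feasible situations, repeated multiplication yields the long-run occupancy probability of each situation. The 3-state matrix below is invented for illustration, not taken from the paper:

```python
# Hedged sketch of the Markov analysis step: iterate a distribution under a
# row-stochastic transition matrix P until it converges to the stationary
# distribution pi = pi P (states and probabilities are made up).

def step(dist, P):
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary(P, iters=200):
    dist = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        dist = step(dist, P)
    return dist

# Rows: current situation; columns: next situation (each row sums to 1).
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
pi = stationary(P)
print([round(p, 4) for p in pi])  # long-run occupancy of each situation
```

    For this matrix the exact stationary distribution is (12/49, 23/49, 14/49); an absorbing-state variant of the same machinery would give the probability of ending in each terminal conflict situation.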

  14. Analysis of information systems for the enterprises marketing activities management

    OpenAIRE

    A.O. Natorina

    2014-01-01

    The article deals with the role of computer information systems in the strategic management of enterprise marketing activities and with enterprise marketing management information systems. The stages of the development and launch of a new product into the market within its life cycle are analyzed, exemplified by the fat and oil industry.

  15. The threat nets approach to information system security risk analysis

    NARCIS (Netherlands)

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security concer

  16. Analysis of the Interdisciplinary Nature of Library and Information Science

    Science.gov (United States)

    Prebor, Gila

    2010-01-01

    Library and information science (LIS) is highly interdisciplinary by nature and is affected by the incessant evolution of technologies. A recent study surveying research trends in the years 2002-6 at various information science departments worldwide has found that a clear trend was identified in Masters theses and doctoral dissertations of social…

  17. Web Information Monitoring: An Analysis of Web Page Updates.

    Science.gov (United States)

    Tan, Bing; Foo, Schubert; Hui, Siu Cheung

    2001-01-01

    Discusses the need for Web information monitoring systems that help users track and monitor Web information based on users' interests. Describes a study that analyzed Web pages, Web site domains, and change frequency for a Web monitoring system developed at Nanyang Technological University (Singapore). (Author/LRW)

  18. Consistent analysis of criterion parameters of signals' informativeness

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2006-01-01

    Full Text Available The goal of this article is the synthesis of a consistent decision-making criterion algorithm based on the concept of adequate information. The method makes decisions sequentially, accounting for the information accumulated over previous observation intervals, which enables its use for a wide range of automated management tasks.

  19. Trading in markets with noisy information: an evolutionary analysis

    Science.gov (United States)

    Bloembergen, Daan; Hennes, Daniel; McBurney, Peter; Tuyls, Karl

    2015-07-01

    We analyse the value of information in a stock market where information can be noisy and costly, using techniques from empirical game theory. Previous work has shown that the value of information follows a J-curve, where averagely informed traders perform below market average, and only insiders prevail. Here we show that both noise and cost can change this picture, in several cases leading to opposite results where insiders perform below market average, and averagely informed traders prevail. Moreover, we investigate the effect of random explorative actions on the market dynamics, showing how these lead to a mix of traders being sustained in equilibrium. These results provide insight into the complexity of real marketplaces, and show under which conditions a broad mix of different trading strategies might be sustainable.

  20. Information Services at the Nuclear Safety Analysis Center.

    Science.gov (United States)

    Simard, Ronald

    This paper describes the operations of the Nuclear Safety Analysis Center. Established soon after an accident at the Three Mile Island nuclear power plant near Harrisburg, Pennsylvania, its efforts were initially directed towards a detailed analysis of the accident. Continuing functions include: (1) the analysis of generic nuclear safety issues,…

  1. BOSTON CONSULTING GROUP (BCG) MATRIX ANALYSIS DURING THE STRATEGIC PLANNING OF GOVERNMENT ADMINISTRATION INFORMATION SYSTEMS (GAIS)

    OpenAIRE

    Maček, Vlatko; Brumec, Josip; Dušak, Vesna

    2000-01-01

    This paper deals with the BCG matrix analysis applied during the Strategic Planning of a selected group of Government Administration Information Systems (GAIS), i.e. the Customs Administration Information System (CAIS), Tax Administration Information System (TAIS), Treasury Information System (TRIS) and Local Government Information System (LGIS). This selection is taken to be a representative corpus of the GAIS. The characteristics of the GA members will also be mentioned. These include its impor...

  2. Information needs of engineers. The methodology developed by the WFEO Committee on Engineering Information and the use of value analysis for improving information services

    International Nuclear Information System (INIS)

    The World Federation of Engineering Organizations - WFEO - through the work of its Committee on Engineering Information, aims at improving the efficiency of engineers and particularly at developing new attitudes and practices concerning the specialized information mastering. One important part of the WFEO/CEI programme of activities during the last years and for the next years was and is devoted to a better understanding of the information needs of engineers. But also, it seems now essential to WFEO/CEI to better evaluate information services in order to correctly adapt them to the identified needs of engineers. The following communication will emphasize these two main and related perspectives: identifying the information needs of engineers; developing Value Analysis approaches for engineering information services. (author). 3 refs

  3. Independent component analysis of edge information for face recognition

    CERN Document Server

    Karande, Kailash Jagannath

    2013-01-01

    The book presents research work on face recognition using edge information as features with ICA algorithms. The independent components are extracted from edge information and used with classifiers to match facial images for recognition purposes. In their study, the authors explore Canny and LOG edge detectors as standard edge detection methods. An Oriented Laplacian of Gaussian (OLOG) method is explored to extract edge information at different orientations of the Laplacian pyramid. A multiscale wavelet model for edge detection is also proposed.

  4. Information Flow in the Launch Vehicle Design/Analysis Process

    Science.gov (United States)

    Humphries, W. R., Sr.; Holland, W.; Bishop, R.

    1999-01-01

    This paper describes the results of a team effort aimed at defining the information flow between disciplines at the Marshall Space Flight Center (MSFC) engaged in the design of space launch vehicles. The information flow is modeled at a first level and is described using three types of templates: an N x N diagram, discipline flow diagrams, and discipline task descriptions. It is intended to provide engineers with an understanding of the connections between what they do and where it fits in the overall design process of the project. It is also intended to provide design managers with a better understanding of information flow in the launch vehicle design cycle.

  5. Information management in NACD regimes: a comparative analysis

    International Nuclear Information System (INIS)

    While all non-proliferation, arms control and disarmament (NACD) regimes must address the issue of information management, this area has remained an under-explored part of the arms control field. This paper compares information management processes across a variety of NACD regimes for the purpose of identifying potential synergies between regimes and suggesting means by which to strengthen future arms control verification efforts. The paper explores the information management systems of the International Atomic Energy Agency (IAEA), the United Nations Special Commission in Iraq (UNSCOM), the Conventional Forces in Europe Agreement (CFE), and the Comprehensive Test Ban Treaty (CTBT). (author)

  6. SYSTEM TECHNICAL ANALYSIS OF INFORMATION AND DOCUMENTATION IN STRUCTURAL DESIGN

    OpenAIRE

    Ignatiev Oleg Vladimirovich; Pavlov Aleksandr Sergeevich; Lavdansky Pavel Aleksandrovich

    2012-01-01

    The authors review the general types of documentation used in construction. Much attention is given to the different kinds of information contained in electronic files as well as data processing and transmission.

  7. Quantitative Analysis of Information Leakage in Probabilistic and Nondeterministic Systems

    OpenAIRE

    Andrés, Miguel

    2011-01-01

    As we dive into the digital era, there is growing concern about the amount of personal digital information that is being gathered about us. Websites often track people's browsing behavior, health care insurers gather medical data, and many smartphones and navigation systems store or transmit information that makes it possible to track the physical location of their users at any time. Hence, anonymity, and privacy in general, are increasingly at stake. Anonymity protocols counter this conc...

  8. Analysis on Recommended System for Web Information Retrieval Using HMM

    OpenAIRE

    Himangni Rathore; Hemant Verma

    2014-01-01

    The Web is a rich domain of data and knowledge, spread over the world in an unstructured manner. A growing number of users continuously access this information over the internet. Web mining is an application of data mining in which web-related data are extracted and manipulated to extract knowledge. Data mining applied to web information is referred to as web mining, which is further divided into three major domains: web usage mining, web content mining and web stru...

  9. Information flow analysis for mobile code in dynamic security environments

    OpenAIRE

    Grabowski, Robert

    2012-01-01

    With the growing amount of data handled by Internet-enabled mobile devices, the task of preventing software from leaking confidential information is becoming increasingly important. At the same time, mobile applications are typically executed on different devices whose users have varying requirements for the privacy of their data. Users should be able to define their personal information security settings, and they should get a reliable assurance that the installed softwa...

  10. Information technology portfolio in supply chain management using factor analysis

    OpenAIRE

    Ahmad Jaafarnejad; Davood Rafierad; Masoumeh Gardeshi

    2013-01-01

    The adoption of information technology (IT) along with supply chain management (SCM) has become increasingly a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve the organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using principal comp...

  11. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    OpenAIRE

    G., Narkhede Sachin; Khairnar, Vaishali; Kadu, Sujata

    2014-01-01

    Image segmentation faces several challenging issues in brain magnetic resonance image tumor segmentation, caused by the weak correlation between magnetic resonance imaging intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by potential performance improvement in the general automatic brain tu...

  12. Brain Tumor Detection Based On Mathematical Analysis and Symmetry Information

    OpenAIRE

    Narkhede Sachin G.; Prof. Vaishali Khairnar

    2014-01-01

    Image segmentation faces several challenging issues in brain magnetic resonance (MR) image tumor segmentation, caused by the weak correlation between magnetic resonance imaging (MRI) intensity and anatomical meaning. With the objective of utilizing more meaningful information to improve brain tumor segmentation, an approach which employs bilateral symmetry information as an additional feature for segmentation is proposed. This is motivated by potential performance improvement in ...

  13. Integrating Information and Communication Technology for Health Information System Strengthening: A Policy Analysis.

    Science.gov (United States)

    Marzuki, Nuraidah; Ismail, Saimy; Al-Sadat, Nabilla; Ehsan, Fauziah Z; Chan, Chee-Khoon; Ng, Chiu-Wan

    2015-11-01

    Despite the high costs involved and the lack of definitive evidence of sustained effectiveness, many low- and middle-income countries had begun to strengthen their health information system using information and communication technology in the past few decades. Following this international trend, the Malaysian Ministry of Health had been incorporating Telehealth (National Telehealth initiatives) into national health policies since the 1990s. Employing qualitative approaches, including key informant interviews and document review, this study examines the agenda-setting processes of the Telehealth policy using Kingdon's framework. The findings suggested that Telehealth policies emerged through actions of policy entrepreneurs within the Ministry of Health, who took advantage of several simultaneously occurring opportunities--official recognition of problems within the existing health information system, availability of information and communication technology to strengthen health information system and political interests surrounding the national Multimedia Super Corridor initiative being developed at the time. The last was achieved by the inclusion of Telehealth as a component of the Multimedia Super Corridor. PMID:26085477

  14. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  15. 75 FR 35457 - Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS)

    Science.gov (United States)

    2010-06-22

    ... AGENCY Draft of the 2010 Causal Analysis/Diagnosis Decision Information System (CADDIS) AGENCY... ``anonymous access'' system, which means that EPA will not know your identity or contact information unless... period for the draft Web site, ``2010 release of the Causal Analysis/Diagnosis Decision...

  16. 77 FR 64390 - Agency Information Collection (Food Service and Nutritional Care Analysis) Activities Under OMB...

    Science.gov (United States)

    2012-10-19

    ... determine whether improvements are needed to enhance patient's nutritional therapy. An agency may not... AFFAIRS Agency Information Collection (Food Service and Nutritional Care Analysis) Activities Under OMB....'' SUPPLEMENTARY INFORMATION: Title: Food Service and Nutritional Care Analysis, VA Form 10-5387. OMB...

  17. An Information Theoretic Analysis of Decision in Computer Chess

    CERN Document Server

    Godescu, Alexandru

    2011-01-01

    The basis of the method proposed in this article is the idea that information is one of the most important factors in strategic decisions, including decisions in computer chess and other strategy games. The model proposed in this article and the algorithm described are based on the idea of an information-theoretic basis of decision in strategy games. The model generalizes and provides a mathematical justification for one of the most popular search algorithms used in leading computer chess programs, the fractional ply scheme. However, despite its success in leading computer chess applications, until now little has been published about this method. The article creates a fundamental basis for this method in the axioms of information theory, then derives the principles used in programming the search and describes mathematically the form of the coefficients. One of the most important parameters of the fractional ply search is derived from fundamental principles. Until now this coefficient has been usually handcrafted...

  18. Usability Analysis of Geographic Information System Software: A Case Study

    Directory of Open Access Journals (Sweden)

    Mehedi Masud

    2009-07-01

    Full Text Available A Geographical Information System (GIS is a computer system capable of creating, capturing and storing, analyzing, managing, and displaying geographically referenced information. A GIS tool offers interactive user interfaces to submit queries, analyze and edit data. The usability criterion of a GIS tool is an important factor for analyzing geographical information. This paper presents a methodology for evaluating the usability of a GIS tool and proposes some guidelines to find out the severity ratings of problems in a GIS tool. The paper also demonstrates how to scrutinize the usability to discover potential problems using a prototype user interface. Based on the study, experience, and observation, this paper also proposes a number of general usability evaluation guidelines for GIS tools.

  19. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
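    The small-sample estimation step described above can be illustrated with a minimal sketch. This is not the authors' exact formulation: it assumes a normal (Gaussian) diffusion kernel and a user-chosen bandwidth h, and the kernel's constant factor is dropped because it cancels under normalization.

    ```python
    from math import exp

    def diffuse(observations, points, h):
        """Spread each crisp observation over a set of monitoring points with a
        normal diffusion kernel, then normalize to an estimated distribution."""
        raw = [sum(exp(-((x - u) ** 2) / (2 * h ** 2)) for x in observations)
               for u in points]
        total = sum(raw)
        return [q / total for q in raw]

    # Hypothetical fire-loss observations spread over a coarse monitoring grid;
    # each crisp value contributes fuzzy membership to several grid points.
    sample = [2.1, 2.4, 3.0, 3.2, 4.5]
    grid = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
    print([round(p, 3) for p in diffuse(sample, grid, h=0.5)])
    ```

    With only five observations, a histogram would leave most bins empty; the diffusion estimate instead yields a smooth distribution usable for risk assessment.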

  20. Shannon Information and Power Law Analysis of the Chromosome Code

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2012-01-01

    Full Text Available This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different number of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. The knowledge of this data allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of Shannon entropy. The results are explored in the perspective of power law relationships allowing a quantitative evaluation of the DNA of the species.
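    The entropy evaluation the abstract describes can be sketched as follows. The word segmentation here (fixed-length, non-overlapping words) is a simplifying assumption for illustration, not the paper's exact character encoding.

    ```python
    from collections import Counter
    from math import log2

    def shannon_entropy(sequence: str, word_length: int = 1) -> float:
        """Shannon entropy in bits per word over non-overlapping fixed-length words."""
        words = [sequence[i:i + word_length]
                 for i in range(0, len(sequence) - word_length + 1, word_length)]
        counts = Counter(words)
        total = sum(counts.values())
        return sum((n / total) * log2(total / n) for n in counts.values())

    # A uniform four-letter alphabet attains the maximum of 2 bits per base;
    # real chromosome sequences score below this bound.
    print(shannon_entropy("ACGT" * 100))  # → 2.0
    ```

    Increasing `word_length` probes longer-range structure, which is what makes power-law comparisons across species possible.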

  1. An Information Geometric Analysis of Entangled Continuous Variable Quantum Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D-H [Institute for the Early Universe, Ewha Womans University, Seoul 120-750 (Korea, Republic of); Ali, S A [Department of Physics, State University of New York at Albany, 1400 Washington Avenue, Albany, NY 12222 (United States); Cafaro, C; Mancini, S [School of Science and Technology, Physics Division, University of Camerino, I-62032 Camerino (Italy)

    2011-07-08

    In this work, using information geometric (IG) techniques, we investigate the effects of micro-correlations on the evolution of maximal probability paths on statistical manifolds induced by systems whose microscopic degrees of freedom are Gaussian distributed. Analytical estimates of the information geometric entropy (IGE) as well as the IG analogue of the Lyapunov exponents are presented. It is shown that the entanglement duration is related to the scattering potential and incident particle energies. Finally, the degree of entanglement generated by an s-wave scattering event between minimum uncertainty wave-packets is computed in terms of the purity of the system.

  2. Informal Housing in Greece: A Multinomial Logistic Regression Analysis at the Regional Level

    OpenAIRE

    Polyzos, Serafeim; MINETOS, Dionysios

    2014-01-01

    This paper deals with the primary causes of informal housing in Greece as well as the observed differentiations in informal housing patterns across space. The spatial level of analysis is the prefectural administrative level. The results of the multinomial logistic regression analysis indicate that Greek prefectures differ in the way they experience the informal housing phenomenon. An explanation for the observed differences may be the separate development paths followed and the ...

  3. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    OpenAIRE

    Ayyagari Sri Nagesh; G.P.Saradhi Varma; Govardhan, A.

    2012-01-01

    In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. Medical image structural objects content and object identific...

  4. Efficiency and credit ratings: a permutation-information-theory analysis

    Science.gov (United States)

    Fernandez Bariviera, Aurelio; Zunino, Luciano; Belén Guercio, M.; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-08-01

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity-entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification.
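    The entropy coordinate of the complexity-entropy causality plane is the normalized Bandt-Pompe permutation entropy, which can be sketched as below; the complexity coordinate (based on Jensen-Shannon divergence) is omitted for brevity.

    ```python
    from collections import Counter
    from math import factorial, log

    def permutation_entropy(series, order=3):
        """Normalized Bandt-Pompe permutation entropy in [0, 1]: count the
        ordinal patterns of consecutive windows, then normalize Shannon
        entropy of the pattern distribution by its maximum log(order!)."""
        patterns = Counter(
            tuple(sorted(range(order), key=lambda k: series[i + k]))
            for i in range(len(series) - order + 1)
        )
        total = sum(patterns.values())
        h = sum((n / total) * log(total / n) for n in patterns.values())
        return h / log(factorial(order))

    # A monotonic price series has a single ordinal pattern, hence entropy 0;
    # informationally efficient (random-like) series score close to 1.
    print(permutation_entropy(list(range(50))))  # → 0.0
    ```

    In the paper's setting, bonds whose return series score closer to 1 on this axis are the more informationally efficient ones.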

  5. Information Technology and Economic Performance: A Global Analysis

    OpenAIRE

    Surfield, Christopher J.

    2008-01-01

    We statistically evaluate the impact that technology has on economic performance. We find that technologies such as the computer increase the productivity of an economy’s labor force and increase the per capita GDP. We find it is developing, not industrialized, economies that most benefit from information technology.

  6. A Critical Analysis of Vector Space Model for Information Retrieval.

    Science.gov (United States)

    Raghavan, Vijay V.; Wong, S. K. M.

    1986-01-01

    Presents notations and definitions necessary to identify the concepts and relationships that are important in modelling information retrieval objects and processes in the context of vector spaces. Earlier work on the use of vector models is evaluated in terms of the concepts introduced and certain problems are identified. (Author/EM)
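    A minimal sketch of the model under evaluation: documents become term-frequency vectors and relevance is scored by the cosine of the angle between them. Raw counts are used here for simplicity; practical systems typically apply tf-idf weighting.

    ```python
    from collections import Counter
    from math import sqrt

    def cosine_similarity(doc_a: str, doc_b: str) -> float:
        """Cosine similarity between raw term-frequency vectors of two texts."""
        a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
        dot = sum(a[t] * b[t] for t in a)  # Counter lookup of a missing term is 0
        na = sqrt(sum(v * v for v in a.values()))
        nb = sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    print(cosine_similarity("information retrieval model",
                            "vector model of information retrieval"))
    ```

    The score is 1 for identical term distributions and 0 for documents sharing no terms, which is the geometric interpretation the article's vector-space critique builds on.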

  7. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  8. Efficiency and credit ratings: a permutation-information-theory analysis

    International Nuclear Information System (INIS)

    The role of credit rating agencies has been under severe scrutiny after the subprime crisis. In this paper we explore the relationship between credit ratings and informational efficiency of a sample of thirty nine corporate bonds of US oil and energy companies from April 2008 to November 2012. For this purpose we use a powerful statistical tool, relatively new in the financial literature: the complexity–entropy causality plane. This representation space allows us to graphically classify the different bonds according to their degree of informational efficiency. We find that this classification agrees with the credit ratings assigned by Moody’s. In particular, we detect the formation of two clusters, which correspond to the global categories of investment and speculative grades. Regarding the latter cluster, two subgroups reflect distinct levels of efficiency. Additionally, we also find an intriguing absence of correlation between informational efficiency and firm characteristics. This allows us to conclude that the proposed permutation-information-theory approach provides an alternative practical way to justify bond classification. (paper)

  9. Information Technology: A Community of Practice. A Workplace Analysis

    Science.gov (United States)

    Guerrero, Tony

    2014-01-01

    Information Technology (IT) encompasses all aspects of computing technology. IT is concerned with issues relating to supporting technology users and meeting their needs within an organizational and societal context through the selection, creation, application, integration, and administration of computing technologies (Lunt et al., 2008). The…

  10. Information sharing for consumption tax purposes : An empirical analysis

    NARCIS (Netherlands)

    Ligthart, Jenny E.

    2007-01-01

    The paper studies the determinants of information sharing between Swedish tax authorities and 14 EU tax authorities for value-added tax (VAT) purposes. It is shown that trade-related variables (such as the partner country's net trade position and population size), reciprocity, and legal arrangements

  11. Chromatic Information and Feature Detection in Fast Visual Analysis

    Science.gov (United States)

    Del Viva, Maria M.; Punzi, Giovanni; Shevell, Steven K.

    2016-01-01

    The visual system is able to recognize a scene based on a sketch made of very simple features. This ability is likely crucial for survival, when fast image recognition is necessary, and it is believed that a primal sketch is extracted very early in the visual processing. Such highly simplified representations can be sufficient for accurate object discrimination, but an open question is the role played by color in this process. Rich color information is available in natural scenes, yet artist's sketches are usually monochromatic; and, black-and-white movies provide compelling representations of real world scenes. Also, the contrast sensitivity of color is low at fine spatial scales. We approach the question from the perspective of optimal information processing by a system endowed with limited computational resources. We show that when such limitations are taken into account, the intrinsic statistical properties of natural scenes imply that the most effective strategy is to ignore fine-scale color features and devote most of the bandwidth to gray-scale information. We find confirmation of these information-based predictions from psychophysics measurements of fast-viewing discrimination of natural scenes. We conclude that the lack of colored features in our visual representation, and our overall low sensitivity to high-frequency color components, are a consequence of an adaptation process, optimizing the size and power consumption of our brain for the visual world we live in. PMID:27478891

  12. Considering Complex Objectives and Scarce Resources in Information Systems' Analysis.

    Science.gov (United States)

    Crowther, Warren

    The low efficacy of many of the library and large-scale information systems that have been implemented in the developing countries has been disappointing, and their appropriateness is often questioned in the governmental and educational institutions of more industrialized countries beset by budget-crunching and a very dynamic transformation of…

  13. Cross-Language Information Retrieval: An Analysis of Errors.

    Science.gov (United States)

    Ruiz, Miguel E.; Srinivasan, Padmini

    1998-01-01

    Investigates an automatic method for Cross Language Information Retrieval (CLIR) that utilizes the multilingual Unified Medical Language System (UMLS) Metathesaurus to translate Spanish natural-language queries into English. Results indicate that for Spanish, the UMLS Metathesaurus-based CLIR method is at least equivalent to if not better than…

  14. Analysis on Cloud Computing Information Security Problems and the Countermeasures

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cloud computing is one of the most popular terms in the present IT industry, as well as one of its most prosperous technologies. This paper introduces the concept, principles and characteristics of cloud computing, analyzes the information security problems resulting from cloud computing, and puts forward corresponding solutions.

  15. A SOCIOECONOMIC ANALYSIS OF MARKETING INFORMATION USAGE AMONG OHIO FRUIT PRODUCERS

    OpenAIRE

    Jones, Eugene; Batte, Marvin T.; Schnitkey, Gary D.

    1990-01-01

    Farm producers attempt to mitigate risk and uncertainty by utilizing accurate and reliable information. This research attempts to identify sources of information used by Ohio fruit producers and then determine which of these sources are best meeting their information needs. Results are based on a logit analysis of Ohio fruit producers and several factors are shown to influence producers' evaluation of the "adequacy" of their marketing information. Among these factors are age, business size, e...

  16. Value of information analysis for corrective action unit No. 98: Frenchman Flat

    International Nuclear Information System (INIS)

    A value of information analysis has been completed as part of the corrective action process for Frenchman Flat, the first Nevada Test Site underground test area to be scheduled for the corrective action process. A value of information analysis is a cost-benefit analysis applied to the acquisition of new information which is needed to reduce the uncertainty in the prediction of a contaminant boundary surrounding underground nuclear tests in Frenchman Flat. The boundary location will be established to protect human health and the environment from the consequences of using contaminated groundwater on the Nevada Test Site. Uncertainties in the boundary predictions are assumed to be the result of data gaps. The value of information analysis in this document compares the cost of acquiring new information with the benefit of acquiring that information during the corrective action investigation at Frenchman Flat. Methodologies incorporated into the value of information analysis include previous geological modeling, groundwater flow modeling, contaminant transport modeling, statistics, sensitivity analysis, uncertainty analysis, and decision analysis

  17. Analysis on Recommended System for Web Information Retrieval Using HMM

    Directory of Open Access Journals (Sweden)

    Himangni Rathore

    2014-11-01

    Full Text Available The Web is a rich domain of data and knowledge, spread over the world in an unstructured manner. A growing number of users continuously access this information over the internet. Web mining is an application of data mining in which web-related data are extracted and manipulated to extract knowledge. Data mining applied to web information is referred to as web mining, which is further divided into three major domains: web usage mining, web content mining and web structure mining. The proposed work is concerned with web usage mining. The concept is to improve user feedback and user navigation pattern discovery for a CRM system. Finally, a Hidden Markov Model (HMM) algorithm is used for finding patterns in the data, a method that promises to provide more accurate recommendations.
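    The abstract does not specify its HMM formulation, so the following is only an illustrative sketch: a standard Viterbi decoder inferring a hidden browsing-intent state from observed page categories. All state names and probabilities here are invented for the example.

    ```python
    def viterbi(obs, states, start_p, trans_p, emit_p):
        """Most likely hidden-state sequence for an observation sequence."""
        trellis = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
        for o in obs[1:]:
            prev = trellis[-1]
            row = {}
            for s in states:
                # Best predecessor for state s at this step, with backpointer.
                p, back = max(((prev[r][0] * trans_p[r][s] * emit_p[s][o], r)
                               for r in states), key=lambda t: t[0])
                row[s] = (p, back)
            trellis.append(row)
        # Backtrack from the most probable final state.
        state = max(trellis[-1], key=lambda s: trellis[-1][s][0])
        path = [state]
        for row in reversed(trellis[1:]):
            state = row[state][1]
            path.append(state)
        return path[::-1]

    # Hypothetical two-state model of user intent over observed page types.
    states = ["casual", "engaged"]
    start = {"casual": 0.6, "engaged": 0.4}
    trans = {"casual": {"casual": 0.7, "engaged": 0.3},
             "engaged": {"casual": 0.4, "engaged": 0.6}}
    emit = {"casual": {"home": 0.5, "product": 0.4, "checkout": 0.1},
            "engaged": {"home": 0.1, "product": 0.4, "checkout": 0.5}}
    print(viterbi(["home", "product", "checkout"], states, start, trans, emit))
    # → ['casual', 'casual', 'engaged']
    ```

    A recommender along these lines would suggest content matched to the decoded intent state rather than to the raw page sequence.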

  18. Analysis of multiple source/multiple informant data in Stata

    OpenAIRE

    Nicholas Horton; Garrett Fitzmaurice

    2005-01-01

    We describe regression-based methods for analyzing multiple-source data arising from complex sample survey designs in Stata. We use the term multiple-source data to encompass all cases where data are simultaneously obtained from multiple informants, or raters (e.g., self-reports, family members, health care providers, administrators) or via different/parallel instruments, indicators or methods (e.g., symptom rating scales, standardized diagnostic interviews, or clinical diagnoses). We review ...

  19. A Modified MCEM Approach to Full Information Binary Factor Analysis

    OpenAIRE

    Xinming An; Bentler, P M

    2011-01-01

    Since binary data are ubiquitous in educational, psychological, and social research, methods for effectively exploring the underlying factor structure of such data are still undergoing development (Schilling and Bock 2005; Maydeu-Olivares and Joe 2005; and Song and Lee 2005). Two distinct types of methods have been developed, those relying on limited information from low-order marginal and joint frequency responses and those relying on the frequencies of all distinct item response vectors. T...

  20. HIV Drug-Resistant Patient Information Management, Analysis, and Interpretation

    OpenAIRE

    Singh, Yashik; Mars, Maurice

    2012-01-01

    Introduction The science of information systems, management, and interpretation plays an important part in the continuity of care of patients. This is becoming more evident in the treatment of human immunodeficiency virus (HIV) and acquired immune deficiency syndrome (AIDS), the leading cause of death in sub-Saharan Africa. The high replication rates, selective pressure, and initial infection by resistant strains of HIV infer that drug resistance will inevitably become an important health car...

  1. Benetton and Zara information systems: a comparative analysis

    OpenAIRE

    Pirone, Chiara

    2010-01-01

    Supply chain management has emerged as one of the major areas for companies to gain a competitive edge. Managing supply chains effectively is a complex and challenging task, due to the current business trends of expanding product variety, short product life cycle, increasing outsourcing, globalization of businesses, and continuous advances in information technology. Because of shorter and shorter product life cycles, the pressure for dynamically adjusting and adapting a company's supply chain...

  2. Analysis of consumer information brochures on osteoporosis prevention and treatment

    Directory of Open Access Journals (Sweden)

    Mühlhauser, Ingrid

    2007-01-01

    Full Text Available Purpose: Evidence-based consumer information is a prerequisite for informed decision making. So far, there are no reports on the quality of consumer information brochures on osteoporosis. In the present study we analysed brochures on osteoporosis available in Germany. Method: All printed brochures from patient and consumer advocacy groups, physician and governmental organisations, health insurances, and pharmaceutical companies were initially collected in 2001 and updated in December 2004. Brochures were analysed by two independent researchers using 37 internationally proposed criteria addressing evidence-based content, risk communication, transparency of the development process, and layout and design. Results: A total of 165 brochures were identified; 59 were included as they specifically targeted osteoporosis prevention and treatment. Most brochures were provided by pharmaceutical companies (n=25), followed by health insurances (n=11) and patient and consumer advocacy groups (n=11). Quality of brochures did not differ between providers. Only 1 brochure presented a lifetime risk estimate; 4 mentioned the natural course of osteoporosis. A balanced report on benefit versus lack of benefit was presented in 2 brochures, and on benefit versus adverse effects in 8 brochures. Four brochures mentioned relative risk reduction; 1 reported the absolute risk reduction through hormone replacement therapy (HRT). Of the 28 brochures accessed in 2004, 10 still recommended HRT without discussing adverse effects. Transparency of the development process was limited: 25 brochures reported the publication date, 26 cited the author, and only 1 cited references. In contrast, readability and design were generally good. Conclusion: The quality of consumer brochures on osteoporosis in Germany is utterly inadequate. They fail to give evidence-based data on diagnosis and treatment options. The material is therefore not useful for enhancing informed consumer choice.

  3. Pakistan Journal of Library and Information Science: A bibliometric analysis

    OpenAIRE

    Warraich, Nosheen Fatima; Ahmad, Sajjad

    2011-01-01

    Pakistan Journal of Library and Information Science (PJLIS) is an HEC-recognized journal published by the Department of Library and Information Science, University of the Punjab, Lahore. A total of 111 publications from 11 issues of PJLIS were published during 1995 to 2010. It has made an outstanding contribution to the dissemination of LIS research at the national and international level, as it publishes in both print and electronic format. A bibl...

  4. FORTES: Forensic Information Flow Analysis of Business Processes

    OpenAIRE

    Accorsi, Rafael; Müller, Günter

    2010-01-01

    Nearly 70% of all business processes in use today rely on automated workflow systems for their execution. Despite the growing expenses in the design of advanced tools for secure and compliant deployment of workflows, an exponential growth of dependability incidents persists. Concepts beyond access control focusing on information flow control offer new paradigms to design security mechanisms for reliable and secure IT-based workflows. This talk presents FORTES, an approach for the forensic...

  5. Sensor data analysis and information extraction for structural health monitoring

    OpenAIRE

    Yan, Linjun

    2006-01-01

    Recently, advances in sensing techniques, internet technologies, and wireless communications are increasingly facilitating and allowing practical deployment of large and dense sensor networks for structural health monitoring. Thus, it is vital to develop efficient techniques to process and analyze the massive amount of sensor data in order to extract essential information on the monitored structures. The efforts in this dissertation are mainly dedicated to this direction, including studies on...

  6. Food labelled Information: An Empirical Analysis of Consumer Preferences

    OpenAIRE

    Alessandro Banterle; Alessia Cavaliere; Elena Claire Ricci

    2012-01-01

    Labelling can support consumers in making choices connected to their preferences in terms of qualitative features. Nevertheless, the space available on packaging is limited and some indications are not used by consumers. This paper aims at analysing which kinds of currently labelled information are of interest and actually used by consumers, and which additional kinds could improve consumer choices. Moreover, we investigate the attitude of consumers with respect to innovative strategies for t...

  7. Consumer Information Search Revisited: Theory and Empirical Analysis.

    OpenAIRE

    Moorthy, Sridhar; Ratchford, Brian T; Talukdar, Debabrata

    1997-01-01

    A comprehensive theoretical framework for understanding consumers' information search behavior is presented. Unlike previous research, our model identifies not only what factors affect consumers' search behavior but also how these factors interact with each other. In particular, the model emphasizes the effect of prior brand perceptions on the search process. We argue that when consumers have brand-specific prior distributions of utility, the existence of relative uncertainty among brands is ...

  8. Commercial babassu mesocarp: microbiological evaluation and analysis of label information

    OpenAIRE

    Laisa Lis Fontinele Sá; Tamyres Andrade Macedo; Sean Teles Pereira; Nouga Cardoso Batista; Lívio Cesar Cunha Nunes; Waleska Ferreira Albuguergue

    2015-01-01

    The babassu mesocarp is easily found in supermarkets and other commercial establishments in Brazil. Despite its widespread use in both the pharmaceutical and food industries, the literature contains no scientific studies on microbial contamination of these products, nor on the legal information expressed on their labels. The aim of this study was to evaluate the level of microbiological contamination in babassu mesocarp sold in commercial establishments in Teresina-PI/Brazil, besides the conformity of ...

  9. Deactivation and Decommissioning Planning and Analysis with Geographic Information Systems

    International Nuclear Information System (INIS)

    From the mid-1950's through the 1980's, the U.S. Department of Energy's Savannah River Site produced nuclear materials for the weapons stockpile, for medical and industrial applications, and for space exploration. Although SRS has a continuing defense-related mission, the overall site mission is now oriented toward environmental restoration and management of legacy chemical and nuclear waste. With the change in mission, SRS no longer has a need for much of the infrastructure developed to support the weapons program. This excess infrastructure, which includes over 1000 facilities, will be decommissioned and demolished over the forthcoming years. Dispositioning facilities for decommissioning and deactivation requires significant resources to determine hazards, structure type, and a rough-order-of-magnitude estimate for the decommissioning and demolition cost. Geographic information systems (GIS) technology was used to help manage the process of dispositioning infrastructure and for reporting the future status of impacted facilities. Several thousand facilities of various ages and conditions are present at SRS. Many of these facilities, built to support previous defense-related missions, now represent a potential hazard and cost for maintenance and surveillance. To reduce costs and the hazards associated with this excess infrastructure, SRS has developed an ambitious plan to decommission and demolish unneeded facilities in a systematic fashion. GIS technology was used to assist development of this plan by: providing locational information for remote facilities, identifying the location of known waste units adjacent to buildings slated for demolition, and providing a powerful visual representation of the impact of the overall plan. Several steps were required for the development of the infrastructure GIS model. The first step involved creating an accurate and current GIS representation of the infrastructure data. This data is maintained in a Computer Aided Design

  10. Search Strategies in Nanotechnology Databases: Log Analysis. Journal of Information Processing and Management

    Directory of Open Access Journals (Sweden)

    Nadjla Hariri

    2014-03-01

    Full Text Available A major concern of users of information systems is retrieving information relevant to their needs; the queries users submit are a manifestation of those needs. The purpose of this research is to analyze nanotechnology databases through query analysis and follow-up of user navigation. The research method is transaction log analysis: the study was performed by analyzing transaction files recording the interaction between users and the databases. Results show that users of nanotechnology databases reach the information they need via three routes: search engines, referral sites, and direct access. With direct access, the bounce rate was lower and more pages were visited. The average query length is 3.36 terms, and simple search strategies are used to retrieve information.
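
Metrics like average query length and per-channel bounce rate drop out of a transaction log with a few lines of code. The log rows below (channel, query, pages viewed per session) are invented for illustration, not the study's data.

```python
# Sketch of a transaction-log analysis: average query length and
# per-channel bounce rate from (channel, query, pages_viewed) rows.

log = [
    ("search_engine", "carbon nanotube synthesis", 1),
    ("referral",      "nano coating",              2),
    ("direct",        "graphene sensor review",    5),
    ("direct",        "nanoparticle toxicity",     3),
    ("search_engine", "quantum dot",               1),
]

avg_query_len = sum(len(q.split()) for _, q, _ in log) / len(log)

def bounce_rate(channel):
    """Share of sessions from a channel that viewed only one page."""
    sessions = [pages for ch, _, pages in log if ch == channel]
    return sum(1 for p in sessions if p == 1) / len(sessions)

print(f"average query length: {avg_query_len:.2f} terms")
print(f"search-engine bounce rate: {bounce_rate('search_engine'):.0%}")
print(f"direct-access bounce rate: {bounce_rate('direct'):.0%}")
```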

  11. 信息分析基础理论研究%Researches on the Elementary Theory of Information Analysis

    Institute of Scientific and Technical Information of China (English)

    高柳宾; 孙云川

    2000-01-01

    The emergence and development of the elementary theory of Information Analysis is studied. From a developmental point of view, the elementary theory of Information Analysis can be divided into three stages: a Bibliometrics-based stage, an Informetrics-based stage, and a Microeconomics-of-Information-based stage, the last being an exploring, studying and creating stage of the elementary theory of Information Analysis.

  12. Cost-Benefit Analysis of Electronic Information: A Case Study.

    Science.gov (United States)

    White, Gary W.; Crawford, Gregory A.

    1998-01-01

    Describes a study at Pennsylvania State University Harrisburg in which cost-benefit analysis (CBA) was used to examine the cost effectiveness of an electronic database. Concludes that librarians can use the results of CBA studies to justify budgets and acquisitions and to provide insight into the true costs of providing library services. (PEN)
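
The arithmetic behind such a cost-benefit case can be sketched as cost per use versus the mediated alternatives the database replaces. All figures below are invented for illustration; they are not from the Penn State Harrisburg study.

```python
# Toy cost-benefit comparison: cost per use of an electronic database
# versus the mediated searches it replaces. Figures are invented.

subscription_cost = 12_000.0        # annual licence fee
searches_per_year = 4_800           # logged database sessions
cost_per_mediated_search = 6.50     # staff time + document delivery

cost_per_use = subscription_cost / searches_per_year
benefit = searches_per_year * cost_per_mediated_search
net_benefit = benefit - subscription_cost

print(f"cost per use: ${cost_per_use:.2f}")
print(f"net annual benefit: ${net_benefit:,.2f}")
```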

  13. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ... requirements in 10 CFR Chapter I, including: (1) An analysis and evaluation of the design and performance of...; technical information in final safety analysis report. The application must contain a final safety analysis... NRC: (a) The principal design criteria for the reactor to be manufactured. Appendix A of 10 CFR...

  14. Analysis and optimalization of information and material flow in choosen company

    OpenAIRE

    SVOJŠE, Zdeněk

    2012-01-01

    This thesis focuses on the analysis of the information and material flow of a production company, a supplier to the automotive industry. The analysis shows that neither flow works separately from the other. The results of the analysis are 3 improvements. Through this optimization we can reduce costs, and the company becomes more competitive.

  15. Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.

    Science.gov (United States)

    Al-Salam, Nabeel; And Others

    The third of three volumes evaluating the first-year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained, and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects is detailed. The…

  16. Computer programs: Information retrieval and data analysis, a compilation

    Science.gov (United States)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  17. Sentiment Analysis of Feedback Information in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2015-11-01

    Full Text Available Sentiment analysis is the study of a person's opinions and emotions towards events or entities, which makes it possible to rate that event or entity for decision making by prospective buyers/users. In this research paper I have tried to demonstrate the use of automatic opinion mining/sentiment analysis to rate a hotel and its services based on guest feedback data. We have used a semantic resource for the feature vector and a Naïve Bayes classifier for review classification, after reducing the feature sets for better accuracy and efficiency. An improvement in the accuracy of the classification has also been observed after the use of bi-gram and tri-gram language models.
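
The classifier described can be sketched as multinomial Naive Bayes over unigram+bigram features with Laplace smoothing. The training reviews below are invented, and this is not the paper's dataset, semantic resource, or feature-selection step; it only illustrates the n-gram Naive Bayes idea.

```python
# Hedged sketch: multinomial Naive Bayes review classifier over
# unigram + bigram features, with Laplace smoothing. Toy data.
import math
from collections import Counter

def ngrams(text, n_max=2):
    """Unigram and bigram features of a whitespace-tokenized text."""
    toks = text.lower().split()
    feats = list(toks)
    for n in range(2, n_max + 1):
        feats += [" ".join(toks[i:i + n]) for i in range(len(toks) - n + 1)]
    return feats

train = [
    ("the room was clean and the staff very friendly", "pos"),
    ("great location and excellent breakfast", "pos"),
    ("dirty room and rude staff", "neg"),
    ("terrible service and noisy at night", "neg"),
]

counts = {"pos": Counter(), "neg": Counter()}
doc_counts = Counter()
for text, label in train:
    counts[label].update(ngrams(text))
    doc_counts[label] += 1
vocab = set().union(*counts.values())

def classify(text):
    scores = {}
    for label, cnt in counts.items():
        total = sum(cnt.values())
        score = math.log(doc_counts[label] / len(train))  # log prior
        for f in ngrams(text):
            # Laplace smoothing over the joint unigram+bigram vocabulary
            score += math.log((cnt[f] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("friendly staff and clean room"))  # → pos
print(classify("rude staff and dirty room"))      # → neg
```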

  18. Temporal abstraction for the analysis of intensive care information

    Energy Technology Data Exchange (ETDEWEB)

    Hadad, Alejandro J [Artificial Intelligence Group, Fac. de Ingenieria, University Nac. de Entre Rios, FI-UNER (Argentina); Evin, Diego A [Artificial Intelligence Group, Fac. de Ingenieria, University Nac. de Entre Rios, FI-UNER (Argentina); Drozdowicz, Bartolome [Artificial Intelligence Group, Fac. de Ingenieria, University Nac. de Entre Rios, FI-UNER (Argentina); Chiotti, Omar [Instituto de Desarrollo y Diseno, INGAR-CONICET (Argentina)

    2007-11-15

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. The scheme is oriented to obtaining a description of the patient's state evolution in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset, a trends-based Temporal Abstraction mechanism is proposed by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme allows obtaining descriptions of the intermediate states through which the patient is passing, which could be used to anticipate alert situations.

  19. Value flow mapping: Using networks to inform stakeholder analysis

    Science.gov (United States)

    Cameron, Bruce G.; Crawley, Edward F.; Loureiro, Geilson; Rebentisch, Eric S.

    2008-02-01

    Stakeholder theory has garnered significant interest from the corporate community, but has proved difficult to apply to large government programs. A detailed value flow exercise was conducted to identify the value delivery mechanisms among stakeholders for the current Vision for Space Exploration. We propose a method for capturing stakeholder needs that explicitly recognizes the outcomes required of the value creating organization. The captured stakeholder needs are then translated into input-output models for each stakeholder, which are then aggregated into a network model. Analysis of this network suggests that benefits are infrequently linked to the root provider of value. Furthermore, it is noted that requirements should not only be written to influence the organization's outputs, but also to influence the propagation of benefit further along the value chain. A number of future applications of this model to systems architecture and requirement analysis are discussed.

  20. Temporal abstraction for the analysis of intensive care information

    International Nuclear Information System (INIS)

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. The scheme is oriented to obtaining a description of the patient's state evolution in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset, a trends-based Temporal Abstraction mechanism is proposed by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme allows obtaining descriptions of the intermediate states through which the patient is passing, which could be used to anticipate alert situations.

  1. Sentiment Analysis of Feedback Information in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2014-06-01

    Full Text Available Sentiment analysis is the study of a person's opinions and emotions towards events or entities, which makes it possible to rate that event or entity for decision making by prospective buyers/users. In this research paper I have tried to demonstrate the use of automatic opinion mining/sentiment analysis to rate a hotel and its services based on guest feedback data. We have used a semantic resource for the feature vector and a Naïve Bayes classifier for review classification, after reducing the feature sets for better accuracy and efficiency. An improvement in the accuracy of the classification has also been observed after the use of bi-gram and tri-gram language models.

  2. Real-Time Environmental Information Network and Analysis System (REINAS)

    OpenAIRE

    Nuss, Wendell A.

    1998-01-01

    The long term goals of the NPS portion of this project, which is joint with UCSC, are to develop a mesoscale coastal analysis system for use in diagnosing and predicting coastal circulations in a topographically complex coastal region and to provide guidance to UCSC for the development of data collection, data management, and visualization tools for mesoscale meteorological problems. Funding Document Number: N0001498WR30144

  3. EMAIL ANALYSIS AND INFORMATION EXTRACTION FOR ENTERPRISE BENEFIT

    OpenAIRE

    Michal Laclavík; Štefan Dlugolinský; Martin Šeleng; Marcel Kvassay; Emil Gatial; Zoltán Balogh; Ladislav Hluchý

    2011-01-01

    In spite of rapid advances in multimedia and interactive technologies, enterprise users prefer to battle with email spam and overload rather than lose the benefits of communicating, collaborating and solving business tasks over email. Many aspects of email have significantly improved over time, but its overall integration with the enterprise environment remained practically the same. In this paper we describe and evaluate a light-weight approach to enterprise email communication analysis and ...

  4. Information-Theoretic Analysis of an Energy Harvesting Communication System

    CERN Document Server

    Ozel, Omur

    2010-01-01

    In energy harvesting communication systems, an exogenous recharge process supplies energy for the data transmission and arriving energy can be buffered in a battery before consumption. Transmission is interrupted if there is not sufficient energy. We address communication with such random energy arrivals in an information-theoretic setting. Based on the classical additive white Gaussian noise (AWGN) channel model, we study the coding problem with random energy arrivals at the transmitter. We show that the capacity of the AWGN channel with stochastic energy arrivals is equal to the capacity with an average power constraint equal to the average recharge rate. We provide two different capacity achieving schemes: save-and-transmit and best-effort-transmit. Next, we consider the case where energy arrivals have time-varying average in a larger time scale. We derive the optimal offline power allocation for maximum average throughput and provide an algorithm that finds the optimal power allocation.
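
The stated result can be illustrated numerically: with stochastic energy arrivals, capacity equals the AWGN capacity under an average power constraint at the average recharge rate, C = (1/2) log2(1 + P_avg/N). The arrival distribution and parameter values below are invented for illustration.

```python
# Numerical illustration: AWGN capacity at the average recharge rate.
import math
import random

random.seed(0)
noise_power = 1.0
# Simulated exponential energy arrivals with mean ~2.0 (illustrative).
arrivals = [random.expovariate(1 / 2.0) for _ in range(100_000)]
p_avg = sum(arrivals) / len(arrivals)

capacity = 0.5 * math.log2(1 + p_avg / noise_power)  # bits per channel use
print(f"average recharge rate: {p_avg:.3f}")
print(f"capacity: {capacity:.3f} bits/use")
```

A save-and-transmit scheme first banks energy, then transmits at p_avg; the banked margin absorbs the randomness of the arrivals.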

  5. Overall analysis of meteorological information in the Daeduk nuclear complex

    International Nuclear Information System (INIS)

    Inspection and repair of the tower structure and lift, and instrument calibration, have been carried out. Wireless data transmission to MIPS (Meteorological Information Processing System) has been performed after collection in the DAS, where environmental assessment can be done by the developed simulation programs in both normal operation and emergency cases. Wind direction, wind speed, temperature, and humidity at 67 m, 27 m, and 10 m height, and temperature, humidity, atmospheric pressure, solar radiation, precipitation, and visibility at the surface have been measured and analyzed with statistical methods. At the site, the prevailing wind directions were SW in spring and summer and NNW in the winter season. Calm conditions accounted for 28.6% at 67 m, 20.5% at 27 m, and 39.2% at 10 m height. 9 tabs., 4 figs., 6 refs. (Author)

  6. Value of Information Analysis Project Gnome Site, New Mexico

    International Nuclear Information System (INIS)

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.

  7. Value of Information Analysis Project Gnome Site, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Greg Pohll; Jenny Chapman

    2010-01-01

    The Project Gnome site in southeastern New Mexico was the location of an underground nuclear detonation in 1961 and a hydrologic tracer test using radionuclides in 1963. The tracer test is recognized as having greater radionuclide migration potential than the nuclear test because the tracer test radionuclides (tritium, 90Sr, 131I, and 137Cs) are in direct contact with the Culebra Dolomite aquifer, whereas the nuclear test is within a bedded salt formation. The tracer test is the topic here. Recognizing previous analyses of the fate of the Gnome tracer test contaminants (Pohll and Pohlmann, 1996; Pohlmann and Andricevic, 1994), and the existence of a large body of relevant investigations and analyses associated with the nearby Waste Isolation Pilot Plant (WIPP) site (summarized in US DOE, 2009), the Gnome Site Characterization Work Plan (U.S. DOE, 2002) called for a Data Decision Analysis to determine whether or not additional characterization data are needed prior to evaluating existing subsurface intrusion restrictions and determining long-term monitoring for the tracer test. Specifically, the Work Plan called for the analysis to weigh the potential reduction in uncertainty from additional data collection against the cost of such field efforts.

  8. 2004/2008 labour market information comparative analysis report

    International Nuclear Information System (INIS)

    The electricity sector has entered into a phase of both challenges and opportunities. Challenges include workforce retirement, labour shortages, and increased competition from other employers to attract and retain the skilled people required to deliver on the increasing demand for electricity in Canada. The electricity sector in Canada is also moving into a new phase, whereby much of the existing infrastructure is either due for significant upgrades, or complete replacement. The increasing demand for electricity means that increased investment and capital expenditure will need to be put toward building new infrastructure altogether. The opportunities for the electricity industry will lie in its ability to effectively and efficiently react to these challenges. The purpose of this report was to provide employers and stakeholders in the sector with relevant and current trend data to help them make appropriate policy and human resource decisions. The report presented a comparative analysis of a 2004 Canadian Electricity Association employer survey with a 2008 Electricity Sector Council employer survey. The comparative analysis highlighted trends and changes that emerged between the 2004 and 2008 studies. Specific topics that were addressed included overall employment trends; employment diversity in the sector; age of non-support staff; recruitment; and retirements and pension eligibility. Recommendations were also offered. It was concluded that the electricity sector could benefit greatly from implementing on-going recruitment campaigns. refs., tabs., figs

  9. Using visual information analysis to explore complex patterns in the activity of designers

    DEFF Research Database (Denmark)

    Cash, Philip; Stanković, Tino; Štorga, Mario

    2014-01-01

    The analysis of complex interlinked datasets poses a significant problem for design researchers. This is addressed by proposing an information visualisation method for analysing patterns of design activity, qualitatively and quantitatively, with respect to time. This method visualises the tempora...

  10. Transportation routing analysis geographic information system - tragis, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental United States. This paper outlines some of the features available in this model. (authors)

  11. Transportation Routing Analysis Geographic Information System -- TRAGIS, progress on improving a routing tool

    International Nuclear Information System (INIS)

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model provides a useful tool to calculate and analyze transportation routes for radioactive materials within the continental US. This paper outlines some of the features available in this model

  12. Drought Analysis of Aksu Irrigation Area in Antalya by Aydeniz Method and Geographic Information Systems

    OpenAIRE

    ARSLAN, Onur; ÖNDER, Hasan Hüseyin; ÖZDEMİR, Gültekin

    2014-01-01

    In this study, a drought analysis has been carried out for the Aksu-Antalya Irrigation Area by using the Aydeniz Method and Geographic Information Systems. Meteorological data of Antalya, Isparta, Korkuteli and Manavgat I

  13. URBAN TRAFFIC ACCIDENT ANALYSIS BY USING GEOGRAPHIC INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Meltem SAPLIOĞLU

    2006-03-01

    Full Text Available In recent years, traffic accidents, which cause greater social and economic losses than natural disasters, have become a national problem in Turkey. To solve this problem and reduce casualties, road safety programs are being developed. It is necessary to develop the most effective measures at low investment cost, due to the limited budgets allocated to such road safety programs. The most important step is to determine the dangerous locations of traffic accidents and to improve these sections from the road safety point of view. New technologies are driving a cycle of continuous improvement that causes rapid changes in traffic engineering and the engineering services within it, and these developments have the potential to give forward-thinking engineering studies a more influential role. In this study, a Geographic Information System (GIS) was used to identify the hazardous locations of traffic accidents in Isparta. The Isparta city map was digitized using ArcInfo 7.21. Traffic accident reports from 1998-2002 were obtained from the Directorate of the Isparta Traffic Region and used to form the database. Topology was set up using crash diagrams and geographic position reference systems. Tables were formed according to the obtained results and interpreted.

  14. Commodity predictability analysis with a permutation information theory approach

    Science.gov (United States)

    Zunino, Luciano; Tabak, Benjamin M.; Serinaldi, Francesco; Zanin, Massimiliano; Pérez, Darío G.; Rosso, Osvaldo A.

    2011-03-01

    It is widely known that commodity markets are not totally efficient. Long-range dependence is present, and thus the celebrated Brownian motion of prices can be considered only as a first approximation. In this work we analyzed the predictability in commodity markets by using a novel approach derived from Information Theory. The complexity-entropy causality plane has been recently shown to be a useful statistical tool to distinguish the stage of stock market development because differences between emergent and developed stock markets can be easily discriminated and visualized with this representation space [L. Zunino, M. Zanin, B.M. Tabak, D.G. Pérez, O.A. Rosso, Complexity-entropy causality plane: a useful approach to quantify the stock market inefficiency, Physica A 389 (2010) 1891-1901]. By estimating the permutation entropy and permutation statistical complexity of twenty basic commodity future markets over a period of around 20 years (1991.01.02-2009.09.01), we can define an associated ranking of efficiency. This ranking is quantifying the presence of patterns and hidden structures in these prime markets. Moreover, the temporal evolution of the commodities in the complexity-entropy causality plane allows us to identify periods of time where the underlying dynamics is more or less predictable.
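    The efficiency ranking described above rests on the Bandt-Pompe permutation entropy, which measures how evenly the ordinal patterns of short windows are distributed over a series. A minimal sketch of the measure (not the authors' code; the `order` parameter and the normalization are standard conventions):

```python
import math

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the argsort of the window values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = sum(-c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))  # scale to [0, 1]

# A monotonic series has a single ordinal pattern, so its entropy is 0;
# a fully unpredictable series approaches 1.
print(permutation_entropy(list(range(100))))  # 0.0
```

    A value near 1 signals an efficient (pattern-free) market, a lower value signals exploitable structure, which is exactly the ordering used to rank the commodity markets.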

  15. Process analysis of the Centre of Information and Library Services of the University of Economics, Prague

    OpenAIRE

    DAŇKOVÁ, Lucie

    2009-01-01

    The theme of this diploma thesis is the analysis of the processes inside the library of the University of Economics. The first part is dedicated to defining the goals of the thesis and to the theory of company processes. The text explains the terms information literacy and information education and links them to process analysis. The thesis introduces the library and the processes identified for optimization. In the second part, the identified processes are analyzed ...

  16. A Novel Complex Event Processing Engine for Intelligent Data Analysis in Integrated Information Systems

    OpenAIRE

    Dong Wang; Mingquan Zhou; Sajid Ali; Pengbo Zhou; Yusong Liu; Xuesong Wang

    2016-01-01

    Novel and effective engines for data analysis in integrated information systems are urgently required by diverse applications, in which massive business data can be analyzed to capture various situations in real time. Existing engines have limited data-processing capacity in distributed computing. Although Complex Event Processing (CEP) has enhanced the capacity of data analysis in information systems, it remains a challenging task since events are ra...

  17. The Effect of Corporate Break-ups on Information Asymmetry: A Market Microstructure Analysis

    OpenAIRE

    Bardong, Florian; Bartram, Söhnke M.; Yadav, Pradeep K.

    2006-01-01

    This paper investigates the information environment during and after a corporate break-up utilizing direct measures of information asymmetry developed in the market microstructure literature. The analysis is based on all corporate break-ups in the United States in the period 1995-2005. The results document that information asymmetry declines significantly as a result of a break-up. However, this reduction takes place not at the time of its announcement or its completion, but after it has been...

  18. Research of Classical and Intelligent Information System Solutions for Criminal Intelligence Analysis

    OpenAIRE

    Šimović, Vladimir

    2001-01-01

    The objective of this study is to present research on classical and intelligent information system solutions used in criminal intelligence analysis in Croatian security system theory. The study analyses objective and classical methods of information science, including artificial intelligence and other scientific methods. The intelligence and classical software solutions researched, proposed, and presented in this study were used in developing the integrated information system for the Croatian...

  19. Climate networks constructed by using information-theoretic measures and ordinal time-series analysis

    OpenAIRE

    Deza, Juan Ignacio

    2015-01-01

    This Thesis is devoted to the construction of global climate networks (CNs) built from time series -surface air temperature anomalies (SAT)- using nonlinear analysis. Several information theory measures have been used including mutual information (MI) and conditional mutual information (CMI). The ultimate goal of the study is to improve the present understanding of climatic variability by means of networks, focusing on the different spatial and time-scales of climate phenomena. An intro...

  20. Information technologies of the analysis of the state of agricultural enterprises of Krasnodar region

    Directory of Open Access Journals (Sweden)

    Elena Malceva

    2014-04-01

    Full Text Available The article is devoted to the application of information technologies at the enterprises of the agro-industrial complex of Krasnodar region for the analysis of financial and economic activity. The author also analyses the development of the agricultural sector, the market of information technologies, and the application of software products in the activities of agricultural enterprises.

  1. 76 FR 37371 - Agency Information Collection: Comment Request for National Gap Analysis Program Evaluation

    Science.gov (United States)

    2011-06-27

    ... documents, (2) measure user satisfaction, and (3) understand user needs. Additionally, this survey can....S. Geological Survey Agency Information Collection: Comment Request for National Gap Analysis... that we have submitted to the Office of Management and Budget (OMB) an information collection...

  2. A probabilistic framework for image information fusion with an application to mammographic analysis.

    NARCIS (Netherlands)

    Velikova, M.; Lucas, P.J.; Samulski, M.; Karssemeijer, N.

    2012-01-01

    The recent increased interest in information fusion methods for solving complex problems, such as in image analysis, is motivated by the wish to better exploit the multitude of information available from different sources to enhance decision-making. In this paper, we propose a novel method that ad

  3. A New Method for Supporting Information Management in Engineering-consulting Companies: Organizational Network Analysis

    OpenAIRE

    Mehdi Jafari Rizi; Ali Shaemi Barzaki; Mohammad Hossein Yarmohammadian

    2014-01-01

    Organizational performance depends on specialized information that is transferred throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. In this case study, we describe a method that has the potential to provide systematic support for information management in engineering-consulting companies. We applied organizational network analysis, a method for studying commun...

  4. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  5. A New Classification Analysis of Customer Requirement Information Based on Quantitative Standardization for Product Configuration

    OpenAIRE

    Zheng Xiao; Zude Zhou; Buyun Sheng

    2016-01-01

    Traditional methods used for the classification of customer requirement information are typically based on specific indicators, hierarchical structures, and data formats and involve a qualitative analysis in terms of stationary patterns. Because these methods neither consider the scalability of classification results nor do they regard subsequent application to product configuration, their classification becomes an isolated operation. However, the transformation of customer requirement inform...

  6. Sheltering Children from the Whole Truth: A Critical Analysis of an Informational Picture Book.

    Science.gov (United States)

    Lamme, Linda; Fu, Danling

    2001-01-01

    Uses Orbis Pictus Award Committee criteria (accuracy, organization, design, and style) to examine an informational book, "Rice Is Life," by Rita Golden Gelman. Subjects the book to a deeper critical analysis. Suggests that it is important to help students become critical thinkers about everything they read, including informational books. (SG)

  7. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
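    The three measures named in the abstract obey simple identities (I(X;Y) = H(X) + H(Y) - H(X,Y), H(Y|X) = H(X,Y) - H(X)) that a few lines can verify. A toy sketch with a hypothetical two-variable joint distribution (the numbers are ours, not from the paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution of two binary variables X (rows), Y (cols).
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]                   # marginal of X
py = [sum(col) for col in zip(*joint)]             # marginal of Y
H_xy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
H_x, H_y = entropy(px), entropy(py)
MI = H_x + H_y - H_xy        # mutual information I(X;Y)
H_y_given_x = H_xy - H_x     # conditional entropy H(Y|X)
print(round(MI, 4), round(H_y_given_x, 4))  # 0.2781 0.7219
```

    In the paper's setting, a positive MI between two subregions means that observing one of them would reduce the uncertainty of the other by exactly that many bits.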

  8. Transit light curves with finite integration time: Fisher information analysis

    International Nuclear Information System (INIS)

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
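    The variances in the abstract come from the Fisher-information (Cramér-Rao) machinery. As a much simpler illustration of the same idea (our toy setup, not the paper's transit model), the bound for estimating a Gaussian mean can be checked by simulation: the Fisher information from n i.i.d. samples is n/sigma^2, and the sample mean attains the resulting variance floor:

```python
import random
import statistics

# Cramer-Rao check by simulation: for n i.i.d. draws from N(mu, sigma^2),
# the Fisher information about mu is n / sigma^2, so an efficient estimator
# (the sample mean) has variance sigma^2 / n.
random.seed(0)
n, sigma, trials = 25, 2.0, 4000
estimates = [statistics.fmean(random.gauss(0.0, sigma) for _ in range(n))
             for _ in range(trials)]
crlb = sigma ** 2 / n                 # 1 / Fisher information = 0.16
emp = statistics.pvariance(estimates)
print(abs(emp - crlb) / crlb < 0.15)  # empirical variance sits on the bound
```

    The paper's contribution is the multi-parameter version of this computation, where the matrix inverse of the Fisher information also yields the covariances among transit parameters.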

  9. 76 FR 65317 - 60-Day Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM)

    Science.gov (United States)

    2011-10-20

    ... Notice of Proposed Information Collection: DS-4184, Risk Management and Analysis (RAM) ACTION: Notice of... of technology. Abstract of proposed collection: The information collected from individuals and...: Risk Analysis and Management. OMB Control Number: None. Type of Request: New. Originating...

  10. Information extraction from topographic map using colour and shape analysis

    Indian Academy of Sciences (India)

    Nikam Gitanjali Ganpatrao; Jayanta Kumar Ghosh

    2014-10-01

    The work presented in this paper is related to symbol and toponym understanding with application to scanned Indian topographic maps. The proposed algorithm deals with colour layer separation of the enhanced topographic map using k-means colour segmentation, followed by outline detection and chaining, respectively. Outline detection is performed through linear filtering using the Canny edge detector. The outline is then encoded as a Freeman chain code; the x-y offsets have been used to obtain a complex representation of outlines. Final matching of shapes is done by computing Fourier descriptors from the chain codes; comparison of descriptors having the same colour index is embedded in a normalized scalar product of descriptors. As this matching process is not rotation invariant (starting point selection), an interrelation function has been proposed to make the method shift invariant. The recognition rates of symbols, letters and numbers are 84.68, 91.73 and 92.19%, respectively. The core contribution is dedicated to a shape analysis method based on contouring and Fourier descriptors. To improve the recognition rate, obtaining the most suitable segmentation solution for complex topographic maps will be the future scope of work.
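    The descriptor step can be sketched in a few lines: treat the outline as a complex sequence, take its DFT, and normalize so the signature ignores translation, scale, rotation and starting point. This is a generic toy version (the contour and function name are ours, not the paper's implementation):

```python
import cmath

def fourier_descriptors(points, n_desc=6):
    """Shape signature from a closed outline given as (x, y) pairs
    (e.g. decoded from a Freeman chain code)."""
    z = [complex(x, y) for x, y in points]
    N = len(z)
    def F(k):  # k-th DFT coefficient of the complex outline
        return sum(z[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
    scale = abs(F(1))
    # Skip F(0) (translation), divide by |F(1)| (scale); taking magnitudes
    # discards phase, i.e. rotation and starting-point differences.
    return [abs(F(k)) / scale for k in range(1, n_desc + 1)]

# The same square traced from two different starting points matches exactly.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
shifted = square[3:] + square[:3]
d1, d2 = fourier_descriptors(square), fourier_descriptors(shifted)
print(all(abs(a - b) < 1e-9 for a, b in zip(d1, d2)))  # True
```

    Keeping only the magnitudes is the simplest way to get start-point invariance; the paper's interrelation function is a phase-aware alternative for the same problem.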

  11. Efficiency and Effectiveness in the Collection and Analysis of S&T Open Source Information

    International Nuclear Information System (INIS)

    While looking for information in scientific databases, we are overwhelmed by the amount of information we encounter. Within this mass of data, extracting information with added value could be strategic for nuclear verification. In our study, we have examined ''best practices'' in collecting, processing and analyzing open source scientific and technical information. First, we insisted on working with information authenticated by referees, such as scientific publications (structured information). Analysis of this structured data is made with bibliometric tools. Several steps are carried out: collecting data related to the paradigm, creating a database to store the data generated by the bibliographic research, and analyzing the data with selected tools. From the analysis of bibliographic data alone, we are able to obtain: · a panoramic view of the countries that publish in the paradigm, · co-publication networks, · the organizations that contribute to scientific publications, · the countries with which a country collaborates, · the areas of interest of a country, . . . In this way we are able to identify a target. In a second phase, we can focus on a target (countries, for example). Working with non-structured data (i.e., press releases, social networks, full-text analysis of publications) is in progress and needs other tools to be added to the process, as we will discuss in this paper. In information analysis, methodology and expert analysis are important; software analysis is just a tool to achieve our goal. This presentation deals with concrete measures that improve the efficiency and effectiveness of the use of open source S&T information and of the management of that information over time. Examples are shown. (author)

  12. A Real-Time and Dynamic Biological Information Retrieval and Analysis System (BIRAS)

    Institute of Scientific and Technical Information of China (English)

    Qi Zhou; Hong Zhang; Meiying Geng; Chenggang Zhang

    2003-01-01

    The aim of this study is to design a biological information retrieval and analysis system (BIRAS) based on the Internet. Using a specific network protocol, the BIRAS system can send information to and receive information from the Entrez search and retrieval system maintained by the National Center for Biotechnology Information (NCBI) in the USA. Literature, nucleotide sequences, protein sequences, and other resources matching a user-defined term can then be retrieved and delivered to the user automatically by pop-up message or by e-mail. All the information retrieval and analysis processes are done in real time. As a robust system for intelligently and dynamically retrieving and analyzing user-defined information, it is believed that BIRAS will be extensively used to retrieve specific information from the large number of biological databases available today. The program is available on request from the corresponding author.

  14. REAL-TIME ENERGY INFORMATION AND CONSUMER BEHAVIOR: A META-ANALYSIS AND FORECAST

    Science.gov (United States)

    The meta-analysis of literature and program results will shed light on potential causes of study-to-study variation in information feedback programs and trials. Outputs from the meta-analysis, such as price elasticity, will be used in NEMS to estimate the impact of a nation...

  15. 30 CFR 250.227 - What environmental impact analysis (EIA) information must accompany the EP?

    Science.gov (United States)

    2010-07-01

    ... harvest practices, recreation, recreational and commercial fishing (including typical fishing seasons... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What environmental impact analysis (EIA... and Information Contents of Exploration Plans (ep) § 250.227 What environmental impact analysis...

  16. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; Jong, de Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  17. 77 FR 45717 - Proposed Information Collection (Food Service and Nutritional Care Analysis) Activity; Comment...

    Science.gov (United States)

    2012-08-01

    ... are needed to enhance patient's nutritional therapy. Affected Public: Individuals and Households... AFFAIRS Proposed Information Collection (Food Service and Nutritional Care Analysis) Activity; Comment.... Title: Food Service and Nutritional Care Analysis, VA Form 10-5387. OMB Control Number: 2900-0227....

  18. Citation analysis in journal rankings: medical informatics in the library and information science literature.

    Science.gov (United States)

    Vishwanatham, R

    1998-10-01

    Medical informatics is an interdisciplinary field. Medical informatics articles will be found in the literature of various disciplines including library and information science publications. The purpose of this study was to provide an objectively ranked list of journals that publish medical informatics articles relevant to library and information science. Library Literature, Library and Information Science Abstracts, and Social Science Citation Index were used to identify articles published on the topic of medical informatics and to identify a ranked list of journals. This study also used citation analysis to identify the most frequently cited journals relevant to library and information science. PMID:9803294

  19. Change detection in bi-temporal data by canonical information analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multivariate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. Where CCA is ideal for Gaussian data, CIA facilitates analysis of variables with different genesis and therefore different statistical distributions. As a proof of concept we give a toy example. We also give an example with DLR 3K camera data from two time points covering a motor way.
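    The advantage of MI over linear correlation that motivates CIA is easy to demonstrate: a purely quadratic dependence has zero covariance but substantial mutual information. A toy plug-in estimate on discrete data (our sketch, not the authors' implementation):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in MI estimate (in nats) for two discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum(c / n * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# y = x^2 on symmetric x: the covariance vanishes, but MI is large --
# exactly the kind of dependence MI can see and linear correlation cannot.
xs = [-2, -1, 0, 1, 2] * 20
ys = [x * x for x in xs]
cov = sum(x * y for x, y in zip(xs, ys)) / len(xs)  # mean of x is 0
print(round(cov, 9), round(mutual_information(xs, ys), 3))  # 0.0 1.055
```

    For continuous variables, as in the camera-data example, MI has to be estimated via binning or density estimation rather than exact counts, which is where most of the practical difficulty of CIA lies.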

  20. The US Support Program Assistance to the IAEA Safeguards Information Technology, Collection, and Analysis 2008

    Energy Technology Data Exchange (ETDEWEB)

    Tackentien, J.

    2008-06-12

    One of the United States Support Program's (USSP) priorities for 2008 is to support the International Atomic Energy Agency's (IAEA) development of an integrated and efficient safeguards information infrastructure, including reliable and maintainable information systems, and effective tools and resources to collect and analyze safeguards-relevant information. The USSP has provided funding in support of this priority for the ISIS Re-engineering Project (IRP), and for human resources support to the design and definition of the enhanced information analysis architecture project (nVision). Assistance for several other information technology efforts is provided. This paper will report on the various ongoing support measures undertaken by the USSP to support the IAEA's information technology enhancements and will provide some insights into activities that the USSP may support in the future.

  1. A comparative study of information diffusion in weblogs and microblogs based on social network analysis

    Institute of Scientific and Technical Information of China (English)

    Yang ZHANG; Wanyang LING

    2012-01-01

    Purpose: This paper intends to explore a quantitative method for investigating the characteristics of information diffusion through social media like weblogs and microblogs. By using social network analysis methods, we attempt to analyze the different characteristics of information diffusion in weblogs and microblogs as well as the possible reasons for these differences. Design/methodology/approach: Using social network analysis methods, this paper carries out an empirical study by taking the Chinese weblogs and microblogs in the field of Library and Information Science (LIS) as the research sample and employing measures such as network density, core/peripheral structure and centrality. Findings: Firstly, both bloggers and microbloggers maintain weak ties, and both of their social networks display a small-world effect. Secondly, compared with weblog users, microblog users are more interconnected, more equal and more capable of developing relationships with people outside their own social networks. Thirdly, the microblogging social network is more conducive to information diffusion than the blogging network, because of their differences in functions and the information flow mechanism. Finally, the communication mode that emerged with microblogging, with the characteristics of micro-content, multi-channel information dissemination, a dense and decentralized social network and content aggregation, will be one of the trends in the development of information exchange platforms in the future. Research limitations: The sample size needs to be increased so that samples are more representative. Errors may exist during the data collection. Moreover, the individual-level characteristics of the samples as well as the types of information exchanged need to be further studied. Practical implications: This preliminary study explores the characteristics of information diffusion in the network environment and verifies the feasibility of conducting a quantitative analysis of information diffusion through social
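    The network measures the study relies on (density, centrality) reduce to simple counts on the link graph. A toy sketch with made-up actors (not the study's data):

```python
from collections import defaultdict

# Toy undirected communication network among four actors (hypothetical names).
edges = [("ana", "bo"), ("bo", "cy"), ("cy", "ana"), ("cy", "dee")]
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

n = len(adj)
density = 2 * len(edges) / (n * (n - 1))  # realized fraction of possible ties
degree_centrality = {v: len(nb) / (n - 1) for v, nb in adj.items()}
print(round(density, 3), degree_centrality["cy"])  # 0.667 1.0
```

    In the study's terms, a denser microblog network and higher centrality scores for hub users are what make the microblogging network the faster diffusion channel.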

  2. A generalized rough set-based information filling technique for failure analysis of thruster experimental data

    Institute of Scientific and Technical Information of China (English)

    Han Shan; Zhu Qiang; Li Jianxun; Chen Lin

    2013-01-01

    Interval-valued data and incomplete data are two key problems for failure analysis of thruster experimental data and have been largely solved by the methods proposed in this paper. Firstly, information data acquired from the simulation and evaluation system, formed as an interval-valued information system (IIS), is classified by the interval similarity relation. Then, as an improvement on the classical rough set, a new kind of generalized information entropy called "H0-information entropy" is suggested for the measurement of uncertainty and the classification ability of IIS. An innovative information filling technique uses the properties of H0-information entropy to replace missing data with smaller estimation intervals. Finally, an improved method of failure analysis synthesized from the above achievements is presented to classify the thruster experimental data, complete the information, and extract the failure rules. The feasibility and advantage of this method are testified by an actual application of failure analysis, whose performance is evaluated by the quantification of E-condition entropy.

  3. Financial Ratio Analysis: the Development of a Dedicated Management Information System

    Directory of Open Access Journals (Sweden)

    Voicu-Dan Dragomir

    2007-01-01

    Full Text Available This paper disseminates the results of the development process for a financial analysis information system. The system has been subject to conceptual design using the Unified Modeling Language (UML) and has been implemented in an object-oriented manner using the Visual Basic .NET 2003 programming language. The classic financial analysis literature is focused on the chain-substitution method of computing the prior-year to current-year variation of linked financial ratios. We have applied this technique to the DuPont system of analysis concerning the Return on Equity ratio, by designing several structural UML diagrams depicting the breakdown and analysis of each financial ratio involved. The resulting computer application offers a flexible approach to the analytical tools: the user is required to introduce the raw data, and the system provides both table-style and charted information on the output of the computation. User-friendliness is also a key feature of this particular financial analysis application.
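    The chain-substitution method mentioned above attributes the year-over-year change in ROE to each DuPont factor by swapping in the current-year factors one at a time; the per-step deltas sum exactly to the total change. A minimal sketch with hypothetical figures (the numbers are illustrative, not from the paper):

```python
def roe(margin, turnover, leverage):
    """DuPont identity: ROE = net margin * asset turnover * equity multiplier."""
    return margin * turnover * leverage

# Hypothetical prior-year and current-year factor values.
prior = (0.08, 1.2, 2.0)   # margin, turnover, leverage
curr = (0.10, 1.1, 2.2)

# Chain substitution: replace one factor at a time; each step's delta is
# that factor's contribution to the overall ROE change.
effects = []
factors = list(prior)
for i in range(3):
    before = roe(*factors)
    factors[i] = curr[i]
    effects.append(roe(*factors) - before)

total = roe(*curr) - roe(*prior)
print(round(total, 3), [round(e, 3) for e in effects])  # 0.05 [0.048, -0.02, 0.022]
```

    Note that the attribution depends on the substitution order, which is why the classic literature fixes a conventional ordering of the factors.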

  4. A critical care monitoring system for depth of anaesthesia analysis based on entropy analysis and physiological information database.

    Science.gov (United States)

    Wei, Qin; Li, Yang; Fan, Shou-Zen; Liu, Quan; Abbod, Maysam F; Lu, Cheng-Wei; Lin, Tzu-Yu; Jen, Kuo-Kuang; Wu, Shang-Ju; Shieh, Jiann-Shing

    2014-09-01

    Diagnosis of depth of anaesthesia (DoA) plays an important role in treatment and drug usage in the operating theatre and intensive care unit. With the flourishing development of analysis methods and monitoring devices for DoA, a small amount of physiological data has been stored and shared for further research. In this paper, a critical care monitoring (CCM) system for DoA monitoring and analysis was designed and developed, which includes two main components: a physiologic information database (PID) and a DoA analysis subsystem. The PID, including biologic data and clinical information, was constructed on a browser-server model so as to provide a safe and open platform for storage, sharing and further study of clinical anaesthesia information. For the analysis of DoA, building on our previous studies of approximate entropy, sample entropy (SampEn) and multi-scale entropy (MSE), SampEn and MSE were integrated into the subsystem because of their stability, to indicate in real time the state of patients undergoing surgery. Therefore, this CCM system not only supplies the original biological data and information collected from the operating room, but also shares our studies for improvement and innovation in the research of DoA. PMID:24981134
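    Of the entropies named in the abstract, sample entropy is the simplest to state: the negative log of the ratio of (m+1)-point to m-point template matches within a tolerance r. A minimal O(N^2) sketch of the standard Richman-Moorman definition (not the CCM system's code):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -log of the ratio of (m+1)- to m-length template
    matches within Chebyshev tolerance r (Richman & Moorman)."""
    N = len(x)
    def matches(length):
        # Use the same N-m templates for both lengths so the ratio is fair.
        t = [x[i:i + length] for i in range(N - m)]
        return sum(
            max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
            for i in range(len(t) - 1) for j in range(i + 1, len(t))
        )
    B, A = matches(m), matches(m + 1)
    return math.log(B / A) if A and B else float("inf")

# A strictly periodic series is perfectly predictable: SampEn = 0.
print(sample_entropy([1, 2] * 30, m=2, r=0.1))  # 0.0
```

    In DoA monitoring, r is usually set as a fraction of the signal's standard deviation, and a falling SampEn indicates the EEG becoming more regular as anaesthesia deepens.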

  5. The E-net model for the Risk Analysis and Assessment System for the Information Security of Communication and Information Systems ("Defining" Subsystem)

    OpenAIRE

    Stoianov, Nikolai; Aleksandrova, Veselina

    2010-01-01

    This paper presents an approach that draws on the authors' experience in the development and implementation of systems for information security in the Automated Information Systems of the Bulgarian Armed Forces. The architecture of the risk analysis and assessment system for the communication and information system's information security (CIS IS) is presented. An E-net model of the "Defining" Subsystem is proposed as a tool for examining the subsystems. Such an approach can be applie...

  6. Static versus Dynamic Data Information Fusion Analysis using DDDAS for Cyber Security Trust

    OpenAIRE

    Erik Blasch; Youssif Al-Nashif; Salim Hariri

    2015-01-01

    Information fusion includes signal-, feature-, and decision-level analysis over various types of data including imagery, text, and cyber security detection. With the maturity of data processing, the explosion of big data, and the need for user acceptance, the Dynamic Data-Driven Application System (DDDAS) philosophy fosters insights into the usability of information systems solutions. In this paper, we explore a notion of an adaptive adjustment of secure communication...

  7. Development of a new service-oriented modelling method for information systems analysis and design

    OpenAIRE

    Gustiené, Prima

    2010-01-01

    This thesis presents a new modelling method for information systems analysis and design, where the concept of service and the principles of service orientation are used for integrated modelling and reasoning about information systems architectures across organisational and technical systems boundaries. The concept of service enables cohesion of the intersubjective and objective modelling traditions by using a single type of diagram that facilitates detection of semantic inconsistency, incompl...

  8. Mapping recent information behavior research: an analysis of co-authorship and cocitation networks

    OpenAIRE

    González Teruel, Aurora; González Alcaide, Gregorio; Barrios, Maite; Abad García, María Francisca

    2015-01-01

There has been an increase in research published on information behavior in recent years, and this has been accompanied by an increase in its diversity and interaction with other fields, particularly information retrieval (IR). The aims of this study are to determine which researchers have contributed to producing the current body of knowledge on this subject, and to describe its intellectual basis. A bibliometric and network analysis was applied to authorship and co-authorship as well as cit...

  9. Description of a method to support public health information management: organizational network analysis

    OpenAIRE

    Merrill, Jacqueline; Bakken, Suzanne; Rockoff, Maxine; Gebbie, Kristine; Carley, Kathleen

    2006-01-01

    In this case study we describe a method that has potential to provide systematic support for public health information management. Public health agencies depend on specialized information that travels throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. We applied organizational network analysis, a method for studying communication networks, to assess the method’s utility to supp...
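
Organizational network analysis of the kind described starts from a who-communicates-with-whom network among employees; a toy degree-centrality sketch with hypothetical names (plain Python, not the authors' method or data):

```python
from collections import defaultdict

# Hypothetical communication ties observed in an employee survey:
# each pair means "these two employees exchange information".
ties = [("ana", "bo"), ("ana", "cy"), ("bo", "cy"), ("cy", "dee")]

# Build an undirected adjacency structure.
graph = defaultdict(set)
for a, b in ties:
    graph[a].add(b)
    graph[b].add(a)

# Normalized degree centrality: the share of the other n-1 members
# each person communicates with directly.
n = len(graph)
centrality = {p: len(nbrs) / (n - 1) for p, nbrs in graph.items()}
print(centrality)  # "cy" is the best-connected broker in this toy network
```

Measures like this make otherwise unmanaged communication networks visible to information managers.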

  10. Industrial benchmarking through information visualization and data envelopment analysis: a new framework

    OpenAIRE

    Ertek, Gürdal; Ertek, Gurdal; Sevinç, Mete; Sevinc, Mete; Ulus, Firdevs; Köse, Özlem; Kose, Ozlem; Şahin, Güvenç; Sahin, Guvenc

    2013-01-01

    We present a benchmarking study on the companies in the Turkish food industry based on their financial data. Our aim is to develop a comprehensive benchmarking framework using Data Envelopment Analysis (DEA) and information visualization. Besides DEA, a traditional tool for financial benchmarking based on financial ratios is also incorporated. The consistency/inconsistency between the two methodologies is investigated using information visualization tools. In addition, k-means clustering, a f...
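
DEA of the kind used in the benchmarking framework above scores each decision-making unit (firm) by solving a small linear program; a minimal input-oriented CCR sketch with illustrative single-input/single-output data (not the study's Turkish food-industry figures), assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical firms: one input (assets) and one output (revenue).
X = np.array([[20.0], [40.0], [40.0]])   # inputs, one row per firm
Y = np.array([[10.0], [20.0], [10.0]])   # outputs

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of firm o (multiplier form).

    max  u . Y[o]   s.t.  v . X[o] = 1,
                          u . Y[j] - v . X[j] <= 0 for every firm j,
                          u, v >= 0
    Decision vector is [u, v]; linprog minimizes, so negate the objective.
    """
    m, s = X.shape[1], Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])
    A_ub = np.hstack([Y, -X])             # u.Y_j - v.X_j <= 0
    b_ub = np.zeros(len(X))
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None), method="highs")
    return -res.fun

for o in range(len(X)):
    print(f"firm {o}: efficiency = {ccr_efficiency(o):.2f}")
```

Efficient firms score 1.0; a score of 0.5 means the firm could, in principle, produce the same output with half the input.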

  11. Task Analysis in Action: The Role of Information Systems in Communicable Disease Reporting

    OpenAIRE

    Pina, Jamie; Turner, Anne; Kwan-Gett, Tao; Duchin, Jeff

    2009-01-01

    In order to improve the design of information systems for notifiable conditions reporting, it is essential to understand the role of such systems in public health practice. Using qualitative techniques, we performed a task analysis of the activities associated with notifiable conditions reporting at a large urban health department. We identified seventeen primary tasks associated with the use of the department’s information system. The results of this investigation suggest that communicable d...

  12. Analysis Of Factors Affecting The Success Of The Application Of Accounting Information System

    OpenAIRE

    Deni Iskandar

    2015-01-01

Abstract The purpose of this study was to find solutions to problems related to the quality of accounting information systems and the quality of accounting information when connected with management commitment, user competency and organizational culture. This research was conducted through deductive analysis supported by the phenomenon, then sought evidence through empirical facts, especially about the effect of management commitment, user competency and organizational culture on the quality of accounting...

  13. User satisfaction-based quality evaluation model and survey analysis of network information service

    Institute of Scientific and Technical Information of China (English)

LEI, Xue; JIAO, Yuying

    2009-01-01

On the basis of user satisfaction, the authors made research hypotheses by learning from relevant e-service quality evaluation models. A questionnaire survey was then conducted on some content-based websites in terms of their convenience, information quality, personalization and site aesthetics, which may affect the overall satisfaction of users. Statistical analysis was also made to build a user satisfaction-based quality evaluation system of network information service.

  14. Open source information acquisition, analysis and integration in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

Acquisition and analysis of open source information plays an increasingly important role in the IAEA strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new centre of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is proactively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and are likely to continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multi-disciplinary analytic integration to strengthen confidence in safeguards conclusions - especially regarding the absence of undeclared nuclear materials and activities. (Author)

  15. Open Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA's strengthened safeguards system. The Agency's focal point for open source information collection and analysis is the Division of Safeguards Information Management (SGIM) within the IAEA's Department of Safeguards. In parallel with the approval of the Model Additional Protocol in 1997, a new center of information acquisition and analysis expertise was created within SGIM. By acquiring software, developing databases, retraining existing staff and hiring new staff with diverse analytical skills, SGIM is pro-actively contributing to the future implementation of information-driven safeguards in collaboration with other Divisions within the Department of Safeguards. Open source information support is now fully integrated with core safeguards processes and activities, and has become an effective tool in the work of the Department of Safeguards. This paper provides an overview of progress realized through the acquisition and use of open source information in several thematic areas: evaluation of additional protocol declarations; support to the State Evaluation process; in-depth investigation of safeguards issues, including assisting inspections and complementary access; research on illicit nuclear procurement networks and trafficking; and monitoring nuclear developments. Demands for open source information have steadily grown and will likely continue to grow in the future. Coupled with the enormous growth and accessibility in the volume and sources of information, new challenges are presented, both technical and analytical. This paper discusses actions taken and future plans for multi-source and multidisciplinary analytic integration to strengthen confidence in safeguards conclusions especially regarding the absence of undeclared nuclear materials and activities. (authors)

  16. RACLOUDS - Model for Clouds Risk Analysis in the Information Assets Context

    Directory of Open Access Journals (Sweden)

    SILVA, P. F.

    2016-06-01

Cloud computing offers benefits in terms of availability and cost, but transfers the responsibility for information security management to the cloud service provider. Thus the consumer loses control over the security of their information and services. This factor has prevented the migration to cloud computing in many businesses. This paper proposes a model in which the cloud consumer can perform risk analysis on providers before and after contracting the service. The proposed model establishes the responsibilities of three actors: Consumer, Provider and Security Labs. The inclusion of the Security Labs actor lends more credibility to the risk analysis, making the results more consistent for the consumer.

  17. APPLICATION OF INFORMATION THEORY AND A.S.C. ANALYSIS FOR EXPERIMENTAL RESEARCH IN NUMBER THEORY

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2014-03-01

Is it possible to automate the study of the properties of numbers and their relationships so that the results of this study can be formulated in the form of statements indicating the specific quantity of information stored in them? To answer this question, it is proposed to apply the same method that is widely tested and proven in studies of real objects and their relations in various fields to the study of the properties of numbers in number theory, namely the automated system-cognitive analysis (A.S.C. analysis), based on information theory.

  18. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    Science.gov (United States)

    Xin, Wang

Based on an analysis of the inadequacies of existing relevant systems, and after detailed investigation and research, the urban planning and management information system is designed as a three-tier structure using C/S mode architecture over a LAN. The system's functions are designed in accordance with the requirements of the architecture, along with the functional relationships between the modules. The relevant interfaces are analyzed and designed, and data storage solutions are proposed. The design provides a viable building program for small and medium urban planning information systems.

  19. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. This paper adopts snowball sampling interview to collect data and uses the logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how the informal lending differs from the commercial loan. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit scoring Index. Finally, our concluding remarks draw some inferences from empirical analysis and speculate as to what this may imply for the role of formal and informal financial sectors.

  20. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

The World Wide Web has become an enormously big repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern Web 2.0 pages, where the HTML tree is often dynamically modified by various JavaScript code, new data are added by asynchronous requests to the web server, and elements are positioned with the help of cascading style sheets, is a difficult task. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian.

  1. SETUP OF RESOLUTIVE CRITERION FOR SEDIMENT-RELATED DISASTER WARNING INFORMATION USING LOGISTIC REGRESSION ANALYSIS

    Science.gov (United States)

    Sugihara, Shigemitsu; Shinozaki, Tsuguhiro; Ohishi, Hiroyuki; Araki, Yoshinori; Furukawa, Kohei

It is difficult to deregulate sediment-related disaster warning information, because it is difficult to quantify the risk of disaster after heavy rain. If we can quantify the risk according to the rain situation, it will be an indication for deregulation. In this study, using logistic regression analysis, we quantified the risk according to the rain situation as the probability of disaster occurrence, and we analyzed the setup of a resolutive criterion for sediment-related disaster warning information. As a result, we can improve the convenience of the evaluation method for the probability of disaster occurrence, which is useful for providing information about imminent situations.
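
A logistic regression of the kind described maps a rain situation to a probability of disaster occurrence; a stdlib-only sketch with invented coefficients and rainfall variables (not the study's fitted model):

```python
import math

# Hypothetical coefficients of an already-fitted logistic model:
# P(disaster) = 1 / (1 + exp(-(b0 + b1*rain_1h + b2*rain_cum))),
# where rain_1h is hourly rainfall (mm/h) and rain_cum is cumulative rainfall (mm).
b0, b1, b2 = -8.0, 0.08, 0.02   # illustrative values, not from the study

def disaster_probability(rain_1h, rain_cum):
    """Probability of sediment-related disaster occurrence for a rain situation."""
    z = b0 + b1 * rain_1h + b2 * rain_cum
    return 1.0 / (1.0 + math.exp(-z))

# A warning criterion could compare this probability against a threshold:
print(disaster_probability(10.0, 50.0))   # light rain: low risk
print(disaster_probability(60.0, 250.0))  # heavy rain: high risk
```

The output is a probability in (0, 1), so a single cut-off value can serve as the resolutive criterion for lifting a warning.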

  2. Functional analysis of ultra high information rates conveyed by rat vibrissal primary afferents

    Directory of Open Access Journals (Sweden)

    André Maia Chagas

    2013-12-01

Sensory receptors determine the type and the quantity of information available for perception. Here, we quantified and characterized the information transferred by primary afferents in the rat whisker system using neural system identification. Quantification of 'how much' information is conveyed by primary afferents, using the direct method, a classical information theoretic tool, revealed that primary afferents transfer huge amounts of information (up to 529 bits/s). Information theoretic analysis of instantaneous spike-triggered kinematic stimulus features was used to gain functional insight on 'what' is coded by primary afferents. Amongst the kinematic variables tested - position, velocity, and acceleration - primary afferent spikes encoded velocity best. The other two variables contribute to information transfer, but only if combined with velocity. We further revealed three additional characteristics that play a role in information transfer by primary afferents. Firstly, primary afferent spikes show preference for well separated multiple stimuli (i.e. well separated sets of combinations of the three instantaneous kinematic variables). Secondly, spikes are sensitive to short strips of the stimulus trajectory (up to 10 ms of pre-spike time), and thirdly, they show spike patterns (precise doublet and triplet spiking). In order to deal with these complexities, we used a flexible probabilistic neuron model fitting mixtures of Gaussians to the spike-triggered stimulus distributions, which quantitatively captured the contribution of the mentioned features and allowed us to achieve a full functional analysis of the total information rate indicated by the direct method. We found that instantaneous position, velocity, and acceleration explained about 50% of the total information rate. Adding a 10 ms pre-spike interval of stimulus trajectory achieved 80-90%. The final 10-20% were found to be due to non-linear coding by spike bursts.
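
The direct method mentioned above estimates information rate from the entropy of binary spike 'words' (total entropy minus noise entropy across stimulus repeats); this sketch computes only the total-entropy half, on a made-up spike train (stdlib only, not the paper's data or full pipeline):

```python
import math
from collections import Counter

def word_entropy_rate(spikes, word_len, dt):
    """Entropy rate (bits/s) of non-overlapping binary spike words.

    spikes: list of 0/1 (1 = spike in a dt-second bin);
    word_len: number of bins per word.
    """
    words = [tuple(spikes[i:i + word_len])
             for i in range(0, len(spikes) - word_len + 1, word_len)]
    counts = Counter(words)
    total = sum(counts.values())
    entropy_bits = -sum((c / total) * math.log2(c / total)
                        for c in counts.values())
    return entropy_bits / (word_len * dt)   # bits per second

# Hypothetical spike train: 1 ms bins, a regular pattern plus a rare burst.
train = [1, 0, 0, 0] * 50 + [1, 1, 1, 0] * 2
print(word_entropy_rate(train, word_len=4, dt=0.001))
```

In the full method the noise entropy, computed from repeated presentations of the same stimulus, is subtracted from this total to give the information rate.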

  3. Analysis of Nuclear Relevant Information on International Procurement and Industrial Activities for Safeguards Purposes

    International Nuclear Information System (INIS)

    Through the use of information on trade and industry, analysts in the Department of Safeguards create an understanding of relevant technological capabilities available to States with safeguards agreements in force and the nuclear related equipment and materials they can make use of either through indigenous manufacture or import. This information gives a valuable independent input into the consistency analysis of States' declarations and may identify inconsistencies or provide indicators of possible undeclared activities. Information on procurement attempts of potential safeguards relevance is made available to the Department through the voluntary support of several Member States. These provide complete and original primary details on enquiries that reach expert suppliers of nuclear relevant goods in the respective Member States, enquiries that may not adequately declare the intended end use of the goods. Information on export/import activities (EXIM) is collected from a variety of publicly available statistical trade databases. These provide details on trade flows of commodities between States. The information is categorized according to the World Customs Organization's universal product nomenclature: the Harmonized System (HS). Querying relevant HS codes allows analysis of EXIM information for indicators of safeguards relevance, providing insight into potential safeguards relevant capabilities, resources or activities. Surveys of nuclear relevant manufacturing capabilities of States are performed by collecting information from publicly available business directories. Such information is then further refined by identifying the actual activities of the individual manufacturers and suppliers of interest. This survey provides valuable knowledge on the technical capabilities of States. This paper will discuss the most important types of information used, clarify why they are relevant, describe the methodologies now routinely used in the Department of

  4. Information

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

There are unstructured abstracts (no more than 256 words) and structured abstracts (no more than 480 words). The specific requirements for structured abstracts are as follows: An informative, structured abstract of no more than 480 words should accompany each manuscript. Abstracts for original contributions should be structured into the following sections. AIM (no more than 20 words): Only the purpose should be included. Please write the aim in the form "To investigate/study/..."; MATERIALS AND METHODS (no more than 140 words); RESULTS (no more than 294 words): You should present P values where appropriate and must provide relevant data to illustrate how they were obtained, e.g. 6.92 ± 3.86 vs 3.61 ± 1.67, P < 0.001; CONCLUSION (no more than 26 words).

  5. 40 CFR 1400.9 - Access to off-site consequence analysis information by State and local government officials.

    Science.gov (United States)

    2010-07-01

    ... local government official in electronic form, unless the official specifically requests the information... analysis information by State and local government officials. 1400.9 Section 1400.9 Protection of... Consequence Analysis Information by Government Officials. § 1400.9 Access to off-site consequence...

  6. Near-Real-Time Analysis of Publicly Communicated Disaster Response Information

    Science.gov (United States)

    Girard, Trevor

    2015-04-01

During a disaster situation the public needs to take critical actions regarding what to do, where to go, how to get there, and so on. The better informed the public is, the better the actions it is able to take, resulting in reduced disaster impacts. The criteria for what information to provide the public need to change depending on the specific needs of the disaster-affected population. The method of dissemination also needs to match the communication channels that the public typically uses in disaster situations. This research project investigates the dynamic information needs of disaster-affected populations and how information leads to actions. The purpose of the research project is to identify key indicators for measuring how well informed the public is during disasters. The indicators are limited to those which can be observed as communication is happening (i.e., in near-real-time). By doing so, the indicators can be analyzed as disaster situations unfold, deficiencies can be identified, and recommendations can be made to potentially improve communication while the response is still underway. The end goal of the research is to improve the ability of communicators to inform disaster-affected communities. A classification scheme has been developed to categorize the information provided to the public during disasters. Under each category is a set of typical questions that the information should answer. These questions are the result of a best-observed-practice review of the information available during 11 disasters. For example, under the category 'Life Saving Response', the questions which should be answered are who is doing what (Evacuation, SAR), where and when, and the amount of the affected communities' needs being covered by these actions. Review of what questions remain unanswered acts as the first indicator, referred to as an 'Information Gap Analysis'. Comparative analysis of the information within categories, between categories, and between similar

  7. A risk-informed perspective on deterministic safety analysis of nuclear power plants

    International Nuclear Information System (INIS)

In this work, the deterministic safety analysis (DSA) approach to nuclear safety is examined from a risk-informed perspective. One objective of safety analysis of a nuclear power plant is to demonstrate via analysis that the risks to the public from events or accidents that are within the design basis of the power plant are within acceptable levels with a high degree of assurance. This nuclear safety analysis objective can be translated into two requirements on the risk estimates of design basis events or accidents: the nominal risk estimate to the public must be shown to be within acceptable levels, and the uncertainty in the risk estimates must be shown to be small on an absolute or relative basis. The DSA approach combined with the defense-in-depth (DID) principle is a simplified safety analysis approach that attempts to achieve the above safety analysis objective in the face of potentially large uncertainties in the risk estimates of a nuclear power plant by treating the various uncertainty contributors using a stylized conservative binary (yes-no) approach, and applying multiple overlapping physical barriers and defense levels to protect against the release of radioactivity from the reactor. It is shown that by focusing on the consequence aspect of risk, the previous two nuclear safety analysis requirements on risk can be satisfied with the DSA-DID approach to nuclear safety. It is also shown that the use of multiple overlapping physical barriers and defense levels in the traditional DSA-DID approach to nuclear safety is risk-informed in the sense that it provides a consistently high level of confidence in the validity of the safety analysis results for various design basis events or accidents with a wide range of frequency of occurrence. It is hoped that by providing a linkage between the consequence analysis approach in DSA and a risk-informed perspective, greater understanding of the limitation and capability of the DSA approach is obtained. (author)

  8. 10 CFR 52.79 - Contents of applications; technical information in final safety analysis report.

    Science.gov (United States)

    2010-01-01

    ...; technical information in final safety analysis report. (a) The application must contain a final safety... CFR part 50, appendix A, GDC 3, and § 50.48 of this chapter; (7) A description of protection provided... important to safety and the list of electric equipment important to safety that is required by 10 CFR...

  9. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  10. Design and Implementation of Marine Information System, and Analysis of Learners' Intention toward

    Science.gov (United States)

    Pan, Yu-Jen; Kao, Jui-Chung; Yu, Te-Cheng

    2016-01-01

    The goal of this study is to conduct further research and discussion on applying the internet on marine education, utilizing existing technologies such as cloud service, social network, data collection analysis, etc. to construct a marine environment education information system. The content to be explored includes marine education information…

  11. The Technical Report: An Analysis of Information Design and Packaging for an Inelastic Market.

    Science.gov (United States)

    Pinelli, Thomas E.; And Others

    As part of an evaluation of its scientific and technical information program, the National Aeronautics and Space Administration (NASA) conducted a review and analysis of structural, language, and presentation components of its technical report form. The investigation involved comparing and contrasting NASA's publications standards for technical…

  12. Analysis of patent activity in the field of quantum information processing

    CERN Document Server

    Winiarczyk, Ryszard; Miszczak, Jarosław Adam; Pawela, Łukasz; Puchała, Zbigniew

    2013-01-01

This paper provides an analysis of patent activity in the field of quantum information processing. Data from the PatentScope database from the years 1993-2011 were used. In order to predict future trends in the number of filed patents, time series models were used.
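
A time-series forecast of patent counts, in its simplest form, is a linear trend fit; the yearly counts below are invented for illustration, not PatentScope data:

```python
# Hypothetical yearly counts of filed patents.
years = list(range(2005, 2012))
counts = [12, 15, 14, 19, 22, 24, 27]

# Fit a linear trend y = a + b*t by ordinary least squares (closed form),
# the simplest member of the time-series-model family.
n = len(years)
t_mean = sum(years) / n
y_mean = sum(counts) / n
b = sum((t - t_mean) * (y - y_mean) for t, y in zip(years, counts)) \
    / sum((t - t_mean) ** 2 for t in years)
a = y_mean - b * t_mean

# Extrapolate the trend one year past the data.
forecast_2012 = a + b * 2012
print(round(forecast_2012, 1))
```

More elaborate models (ARIMA, exponential smoothing) follow the same pattern: fit on the observed counts, then extrapolate.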

  13. Forty Years of the "Journal of Librarianship and Information Science": A Quantitative Analysis, Part I

    Science.gov (United States)

    Furner, Jonathan

    2009-01-01

    This paper reports on the first part of a two-part quantitative analysis of volume 1-40 (1969-2008) of the "Journal of Librarianship and Information Science" (formerly the "Journal of Librarianship"). It provides an overview of the current state of LIS research journal publishing in the UK; a review of the publication and printing history of…

  14. A Comprehensive Analysis of the Quality of Online Health-Related Information regarding Schizophrenia

    Science.gov (United States)

    Guada, Joseph; Venable, Victoria

    2011-01-01

    Social workers are major mental health providers and, thus, can be key players in guiding consumers and their families to accurate information regarding schizophrenia. The present study, using the WebMedQual scale, is a comprehensive analysis across a one-year period at two different time points of the top for-profit and nonprofit sites that…

  15. Developing Information Skills Test for Malaysian Youth Students Using Rasch Analysis

    Science.gov (United States)

    Karim, Aidah Abdul; Shah, Parilah M.; Din, Rosseni; Ahmad, Mazalah; Lubis, Maimun Aqhsa

    2014-01-01

    This study explored the psychometric properties of a locally developed information skills test for youth students in Malaysia using Rasch analysis. The test was a combination of 24 structured and multiple choice items with a 4-point grading scale. The test was administered to 72 technical college students and 139 secondary school students. The…

  16. Coex-Rank: An approach incorporating co-expression information for combined analysis of microarray data

    OpenAIRE

    Cai, Jinlu; Keen, Henry L.; Sigmund, Curt D.; Casavant, Thomas L.

    2012-01-01

    Microarrays have been widely used to study differential gene expression at the genomic level. They can also provide genome-wide co-expression information. Biologically related datasets from independent studies are publicly available, which requires robust combined approaches for integration and validation. Previously, meta-analysis has been adopted to solve this problem.

  17. Information system subsystems execution and development order algorithm implementation and analysis

    OpenAIRE

    Robert Kudelic; Alen Lovrencic; Mladen Konecki

    2012-01-01

In our previous research we constructed theoretical foundations for an automated approach that can determine the execution and development order of information system subsystems according to data class interactions. In this paper we will, from those theoretical foundations, develop a C# algorithm through which we can observe its real-time behavior and calculate its complexity. Finally, after analyzing the algorithm, we will conclude with our plans for further work.
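
A development order derived from inter-subsystem data dependencies is, at its core, a topological ordering of a dependency graph; the paper's implementation is in C#, but the idea can be sketched in Python with hypothetical subsystem names:

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each subsystem maps to the subsystems whose
# data classes it consumes, so those must be developed first.
depends_on = {
    "billing":   {"customers", "products"},
    "reporting": {"billing"},
    "customers": set(),
    "products":  set(),
}

# static_order() yields every subsystem with its prerequisites first.
order = list(TopologicalSorter(depends_on).static_order())
print(order)
```

A cycle among data classes would raise `graphlib.CycleError`, signalling that no valid development order exists without breaking a dependency.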

  18. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2013-11-20

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA regulations in part 120 (21 CFR part 120) mandate the application of HACCP principles to the processing of fruit and vegetable juices. HACCP is a preventive system of hazard...

  19. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Science.gov (United States)

    2010-07-14

    ... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... 0910-0466)--Extension FDA's regulations in part 120 (21 CFR part 120) mandate the application of HACCP procedures to fruit and vegetable juice processing. HACCP is a preventative system of hazard control that...

  20. Handheld tools that 'Informate' Assessment of Student Learning in Science: A Requirements Analysis

    Science.gov (United States)

    Roschelle, Jeremy; Penuel, William R.; Yarnall, Louise; Shechtman, Nicole; Tatar, Deborah

    2005-01-01

    An important challenge faced by many teachers as they involve students in science investigations is measuring (assessing) students' progress. Our detailed requirements analysis in a particular school district led to the idea that what teachers need most are ways to increase the quality of the information they have about what students know and can…

  1. Sensitivity Analysis of Multiple Informant Models When Data Are Not Missing at Random

    Science.gov (United States)

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae M.; Scaramella, Laura V.; Leve, Leslie D.; Reiss, David

    2013-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups can be retained for analysis even if only 1 member of a group contributes…

  2. Translingual Alteration of Conceptual Information in Medical Translation: A Crosslanguage Analysis between English and Chinese.

    Science.gov (United States)

    He, Shaoyi

    2000-01-01

    This study conducted a crosslanguage analysis of conceptual alteration in medical translation between English and Chinese, focusing on the original and translated concepts in article titles from two English medical journals and two Chinese medical journals. Findings provide insights into future studies on crosslanguage information retrieval via…

  3. Three dimensional visualization breakthrough in analysis and communication of technical information for nuclear waste management

    International Nuclear Information System (INIS)

    Computer graphics systems that provide interactive display and manipulation of three-dimensional data are powerful tools for the analysis and communication of technical information required for characterization and design of a geologic repository for nuclear waste. Greater understanding of site performance and repository design information is possible when performance-assessment modeling results can be visually analyzed in relation to site geologic and hydrologic information and engineering data for surface and subsurface facilities. In turn, this enhanced visualization capability provides better communication between technical staff and program management with respect to analysis of available information and prioritization of program planning. A commercially-available computer system was used to demonstrate some of the current technology for three-dimensional visualization within the architecture of systems for nuclear waste management. This computer system was used to interactively visualize and analyze the information for two examples: (1) site-characterization and engineering data for a potential geologic repository at Yucca Mountain, Nevada; and (2) three-dimensional simulations of a hypothetical release and transport of contaminants from a source of radionuclides to the vadose zone. Users may assess the three-dimensional distribution of data and modeling results by interactive zooming, rotating, slicing, and peeling operations. For those parts of the database where information is sparse or not available, the software incorporates models for the interpolation and extrapolation of data over the three-dimensional space of interest. 12 refs., 4 figs

  4. MANAGEMENT INFORMATION SYSTEMS ISSUES: CO-CITATION ANALYSIS OF JOURNAL ARTICLES

    Directory of Open Access Journals (Sweden)

    Wen-Lung Shiau

    2015-06-01

    This study aimed to analyze and identify key issues being studied in leading Management Information Systems (MIS) journals collected in an ISI database. With the help of co-citation analysis and factor analysis, thirteen core issues were identified: (1) Technology Acceptance; (2) Information Technology (IT), Organization Performance, and Competitive Advantage; (3) IT and Organizational Structure; (4) Case Study and Methodology Issues; (5) Trust Issues in IT; (6) Knowledge Management; (7) Measurement Issues in MIS Study; (8) Diffusion of Innovation; (9) Success Factors of IT; (10) Research Modeling and Approach; (11) Theory, Research and Practice; (12) MIS as an Academic Discipline; and (13) Enterprise Information Systems. These results can help MIS researchers and practitioners gain a better awareness of the core and significant issues being studied in the field.

  5. The dynamic of information-driven coordination phenomena: a transfer entropy analysis

    CERN Document Server

    Borge-Holthoefer, Javier; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2015-01-01

    Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time-scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social sub-units. In the absence of ...
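The symbolic transfer entropy at the core of this methodology can be illustrated with a minimal plug-in estimator over already-symbolized series (a generic sketch; the paper's ordinal symbolization, geolocalized sub-units, and network construction are omitted):

```python
import math
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of T(Y->X) = I(x_{t+1}; y_t | x_t)
    for two equal-length symbol sequences."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))           # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                        # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]           # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x_{t+1} | x_t)
        te += p_joint * math.log(p_cond_xy / p_cond_x, base)
    return te
```

When x simply copies y with a one-step lag, T(Y→X) approaches 1 bit while T(X→Y) stays near zero; this kind of directional asymmetry is what the networks of influence are built from.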

  6. The Correspondence Analysis Platform for Uncovering Deep Structure in Data and Information

    CERN Document Server

    Murtagh, Fionn

    2008-01-01

    We study two aspects of information semantics: (i) the collection of all relationships, (ii) tracking and spotting anomaly and change. The first is implemented by endowing all relevant information spaces with a Euclidean metric in a common projected space. The second is modelled by an induced ultrametric. A very general way to achieve a Euclidean embedding of different information spaces based on cross-tabulation counts (and from other input data formats) is provided by Correspondence Analysis. From there, the induced ultrametric that we are particularly interested in takes a sequential - e.g. temporal - ordering of the data into account. We employ such a perspective to look at narrative, "the flow of thought and the flow of language" (Chafe). In application to policy decision making, we show how we can focus analysis in a small number of dimensions.

  7. Management and analysis of water-use data using a geographic information system

    Science.gov (United States)

    Juracek, K.E.; Kenny, J.F.

    1993-01-01

    As part of its mission, the U.S. Geological Survey conducts water-resources research. Site-specific and aggregate water-use data are used in the Survey's National Water-Use Information Program and in various hydrologic investigations. Both types of activities have specific requirements in terms of water-use data access, analysis, and display. In Kansas, the Survey obtains water-use information from several sources. Typically, this information is in a format that is not readily usable by the Survey. Geographic information system (GIS) technology is being used to restructure the available water-use data into a format that allows users to readily access and summarize site-specific water-use data by source (i.e., surface or ground water), type of use, and user-defined area.
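The kind of restructuring described, summarizing site-specific records by source and type of use within a user-defined area, can be sketched as a plain aggregation (the field names here are illustrative, not the Survey's actual schema):

```python
from collections import defaultdict

def summarize_water_use(records, area=None):
    """Total withdrawals keyed by (source, use type), optionally
    restricted to one user-defined area. Field names are hypothetical."""
    totals = defaultdict(float)
    for rec in records:
        if area is not None and rec["area"] != area:
            continue
        totals[(rec["source"], rec["use"])] += rec["amount"]
    return dict(totals)
```

In the GIS setting the same group-by runs over spatially selected features rather than a flat list, but the aggregation logic is the same.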

  8. An economic analysis of five selected LANDSAT assisted information systems in Oregon

    Science.gov (United States)

    Solomon, S.; Maher, K. M.

    1979-01-01

    A comparative cost analysis was performed on five LANDSAT-based information systems. In all cases, the LANDSAT system was found to have cost advantages over its alternative. The information sets generated by LANDSAT and the alternative method are not identical but are comparable in terms of satisfying the needs of the sponsor. The information obtained from the LANDSAT system is in some cases said to lack precision and detail. On the other hand, it was found to be superior in providing information on areas that are inaccessible and unobtainable through conventional means. There is therefore a trade-off between precision and detail, and considerations of cost. The projects examined were concerned with locating irrigation circles in Morrow County; monitoring tansy ragwort infestation; inventorying old-growth Douglas fir near spotted owl habitats; inventorying vegetation and resources in all state-owned lands; and determining land use for Columbia River water policies.

  9. LANDSAT-4 MSS and Thematic Mapper data quality and information content analysis

    Science.gov (United States)

    Anuta, P.; Bartolucci, L.; Dean, E.; Lozano, F.; Malaret, E.; Mcgillem, C. D.; Valdes, J.; Valenzuela, C.

    1984-01-01

    LANDSAT-4 thematic mapper (TM) and multispectral scanner (MSS) data were analyzed to obtain information on data quality and information content. Geometric evaluations were performed to test band-to-band registration accuracy. Thematic mapper overall system resolution was evaluated using scene objects which demonstrated sharp high contrast edge responses. Radiometric evaluation included detector relative calibration, effects of resampling, and coherent noise effects. Information content evaluation was carried out using clustering, principal components, transformed divergence separability measure, and supervised classifiers on test data. A detailed spectral class analysis (multispectral classification) was carried out to compare the information content of the MSS and TM for a large number of scene classes. A temperature-mapping experiment was carried out for a cooling pond to test the quality of thermal-band calibration. Overall TM data quality is very good. The MSS data are noisier than previous LANDSAT results.

  10. The development of an information criterion for Change-Point Analysis

    CERN Document Server

    Wiggins, Paul A

    2015-01-01

    Change-point analysis is a flexible and computationally tractable tool for the analysis of time series data from systems that transition between discrete states and whose observables are corrupted by noise. The change-point algorithm is used to identify the time indices (change points) at which the system transitions between these discrete states. We present a unified information-based approach to testing for the existence of change points. This new approach reconciles two previously disparate approaches to change-point analysis (frequentist and information-based) for testing transitions between states. The resulting method is statistically principled, parameter- and prior-free, and applicable to a wide range of change-point problems.
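The flavor of an information-criterion test for a change point can be sketched for the simplest case, a single mean shift in Gaussian noise, using a generic BIC-style penalty (a textbook construction, not the specific criterion derived in the paper):

```python
import math

def best_change_point(data):
    """Return (index, delta_BIC) for the best single mean-shift change point.
    Positive delta favors the two-segment model over no change."""
    n = len(data)

    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)

    sse0 = sse(data)                       # one-segment (no change) fit
    best_k, best_sse = None, float("inf")
    for k in range(2, n - 1):              # keep at least 2 points per segment
        s = sse(data[:k]) + sse(data[k:])
        if s < best_sse:
            best_k, best_sse = k, s
    # BIC comparison: the change-point model spends 2 extra parameters
    # (second segment mean + change location)
    delta = n * math.log(sse0 / best_sse) - 2 * math.log(n)
    return best_k, delta
```

A sharp mean shift yields a large positive delta at the true index; for noise-only data the penalty term dominates and delta goes negative.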

  11. A qualitative analysis of Māori and Pacific smokers' views on informed choice and smoking

    Science.gov (United States)

    Gifford, Heather; Tautolo, El-Shadan; Erick, Stephanie; Hoek, Janet; Gray, Rebecca; Edwards, Richard

    2016-01-01

    Objectives Tobacco companies frame smoking as an informed choice, a strategy that holds individuals responsible for harms they incur. Few studies have tested this argument, and even fewer have examined how informed indigenous smokers or those from minority ethnicities are when they start smoking. We explored how young adult Māori and Pacific smokers interpreted ‘informed choice’ in relation to smoking. Participants Using recruitment via advertising, existing networks and word of mouth, we recruited and undertook qualitative in-depth interviews with 20 Māori and Pacific young adults aged 18–26 years who smoked. Analyses Data were analysed using an informed-choice framework developed by Chapman and Liberman. We used a thematic analysis approach to identify themes that extended this framework. Results Few participants considered themselves well informed and none met more than the framework's initial two criteria. Most reflected on their unthinking uptake and subsequent addiction, and identified environmental factors that had facilitated uptake. Nonetheless, despite this context, most agreed that they had made an informed choice to smoke. Conclusions The discrepancy between participants' reported knowledge and understanding of smoking's risks, and their assessment of smoking as an informed choice, reflects their view of smoking as a symbol of adulthood. Policies that make tobacco more difficult to use in social settings could help change social norms around smoking and the ease with which initiation and addiction currently occur. PMID:27188813

  12. Analysis of biological time-lapse microscopic experiment from the point of view of the information theory.

    Science.gov (United States)

    Štys, Dalibor; Urban, Jan; Vaněk, Jan; Císař, Petr

    2011-06-01

    We report objective analysis of information in the microscopic image of the cell monolayer. The process of transfer of information about the cell by the microscope is analyzed in terms of the classical Shannon information transfer scheme. The information source is the biological object, the information transfer channel is the whole microscope including the camera chip. The destination is the model of biological system. The information contribution is analyzed as information carried by a point to overall information in the image. Subsequently we obtain information reflection of the biological object. This is transformed in the biological model which, in information terminology, is the destination. This, we propose, should be constructed as state transitions in individual cells modulated by information bonds between the cells. We show examples of detected cell states in multidimensional state space. This space is reflected as colour channel intensity phenomenological state space. We have also observed information bonds and show examples of them. PMID:25478628
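The "information carried by a point to overall information in the image" can be illustrated with plain Shannon self-information of each pixel value relative to the image's intensity histogram (a simplification; the paper models the full microscope transfer channel):

```python
import math
from collections import Counter

def point_information(image):
    """Self-information -log2 p(v) of each pixel value relative to the
    image's own intensity histogram; rare values carry more information."""
    counts = Counter(image)
    n = len(image)
    return [-math.log2(counts[v] / n) for v in image]
```

Rare intensities (e.g. a cell edge against a uniform background) score high, which is one simple way to rank which points contribute most to the image's information content.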

  13. ANALYSIS OF A WEB INFORMATION SYSTEM APPLIED TO THE MANAGEMENT OF A SCHOOL OF COMPUTING

    Directory of Open Access Journals (Sweden)

    ROGER CRISTHIAN GOMES

    2010-01-01

    One of an entrepreneur's tasks is choosing a computerized information system for the management of the business, regardless of its size and field. Deciding whether the information system should be modeled for local use (also known as standalone) or developed for the web is increasingly common, as the Internet, with its characteristics, greatly facilitates the manager's work. However, one cannot simply follow technological and market trends to resolve an issue that affects how the business is operated, administered, and managed. To choose between the two types of system, it is necessary to examine the advantages and disadvantages of each model in relation to the business in question. This study aimed to list the main features intrinsic to web and standalone applications. The study of these two types of applications was based on the analysis of an information system applied to a company providing computer-training services. For the analysis of the information system, a survey of the main requirements was carried out and a prototype was modeled. It was proposed to develop the system in a web environment, using the Java platform with the MySQL database manager, because these tools are complete, well documented, and free, with features that help ensure the functionality and quality of the web information system.

  14. Evaluation of a gene information summarization system by users during the analysis process of microarray datasets

    Directory of Open Access Journals (Sweden)

    Cohen Aaron

    2009-02-01

    Background: Summarization of gene information in the literature has the potential to help genomics researchers translate basic research into clinical benefits. Gene expression microarrays have been used to study biomarkers for disease and discover novel types of therapeutics, and the task of finding information in journal articles on sets of genes is common for translational researchers working with microarray data. However, manually searching and scanning the literature references returned from PubMed is a time-consuming task for scientists. We built and evaluated an automatic summarizer of information on genes studied in microarray experiments. The Gene Information Clustering and Summarization System (GICSS) is a system that integrates two related steps of the microarray data analysis process: functional gene clustering and gene information gathering. The system evaluation was conducted while genomic researchers analyzed their own experimental microarray datasets. Results: The clusters generated by GICSS were validated by scientists during their microarray analysis process. In addition, presenting sentences from the abstract provided significantly more important information to the users than just showing the title in the default PubMed format. Conclusion: The evaluation results suggest that GICSS can be useful for researchers in the genomics area. In addition, the hybrid evaluation method, partway between intrinsic and extrinsic system evaluation, may enable researchers to gauge the true usefulness of the tool for scientists in their natural analysis workflow and also elicit suggestions for future enhancements. Availability: GICSS can be accessed online at: http://ir.ohsu.edu/jianji/index.html

  15. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

    In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images are critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. Medical-image structural-object content and object identification play a significant role in image content analysis and information retrieval. There are three fundamental concepts in content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features: colour, texture, and shape. Colour and texture are both important visual features used in content-based image retrieval to improve results. In this paper, we present an effective image retrieval system, called CBIAIR, that uses texture, shape, and colour features. Firstly, we developed a new texture-pattern feature for pixel-based features in the CBIAIR system. Subsequently, we used a semantic colour feature for colour-based features, while shape-based feature selection uses an existing technique. For retrieval, these features are extracted from the query image and matched against the feature library using a feature-weighted distance. All feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose match distance is below a predefined threshold are retrieved from the image database using a K-NN classifier.
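The retrieval pipeline described (feature extraction, distance matching, thresholding) can be sketched with a quantized colour histogram as the only feature; the texture and shape features, the feature weighting, and the K-NN step are omitted for brevity:

```python
from math import sqrt

def color_histogram(pixels, bins=4):
    """Normalized, quantized RGB histogram feature (assumes 8-bit channels)."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        hist[(r // step) * bins * bins + (g // step) * bins + (b // step)] += 1
    total = len(pixels)
    return [h / total for h in hist]

def retrieve(query, library, threshold=0.5):
    """Return (name, distance) pairs under the distance threshold, nearest first."""
    def dist(u, v):
        return sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    hits = [(name, dist(query, feat)) for name, feat in library.items()]
    return sorted((h for h in hits if h[1] <= threshold), key=lambda h: h[1])
```

A full CBIAIR-style system would concatenate texture and shape vectors onto the histogram and weight each block in the distance, but the extract-match-threshold shape stays the same.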

  16. Maximal information component analysis: a novel non-linear network analysis method

    OpenAIRE

    Christoph Daniel Rau; Nicholas Wisniewski; Luz D. Orozco; Brian Bennett; James Nathaniel Weiss; Aldons Jake Lusis

    2013-01-01

    Background: Network construction and analysis algorithms provide scientists with the ability to sift through high-throughput biological outputs, such as transcription microarrays, for small groups of genes (modules) that are relevant for further research. Most of these algorithms ignore the important role of nonlinear interactions in the data, and the ability for genes to operate in multiple functional groups at once, despite clear evidence for both of these phenomena in observed biological...

  18. APPLICATION OF OLAP SYSTEM IN INFORMATION SUB-SYSTEM OF QMS INCONSISTENCY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Alempije Veljovic

    2008-03-01

    Records of inconsistencies arise as a result of non-compliance with certain requirements during the execution of processes required for quality management system (QMS) functioning. In this study, the established connection between the QMS and the projected information sub-system for inconsistency management is presented. The information model of inconsistency management makes it possible to analyse inconsistencies from the perspective of interactive analytical data processing (OLAP) systems, on the basis of multi-dimensional tables (OLAP cubes) created in MS SQL Server Analysis Services.

  19. Technology and Research Requirements for Combating Human Trafficking: Enhancing Communication, Analysis, Reporting, and Information Sharing

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Sean J.; West, Curtis L.; Olson, Jarrod

    2011-03-17

    DHS’ Science & Technology Directorate directed PNNL to conduct an exploratory study on the domain of human trafficking in the Pacific Northwest in order to examine and identify technology and research requirements for enhancing communication, analysis, reporting, and information sharing – activities that directly support efforts to track, identify, deter, and prosecute human trafficking – including identification of potential national threats from smuggling and trafficking networks. This effort was conducted under the Knowledge Management Technologies Portfolio as part of the Integrated Federal, State, and Local/Regional Information Sharing (RISC) and Collaboration Program.

  20. Book Review: Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions

    OpenAIRE

    Gary Kessler

    2009-01-01

    Knapp, K.J. (Ed.) (2009). Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions. Hershey, NY: Information Science Reference. 434 + xxii pages, ISBN: 978-1-60566-326-5, US$195. Reviewed by Gary C. Kessler. I freely admit that this book was sent to me by the publisher for the expressed purpose of my writing a review and that I know several of the chapter authors. With that disclosure out of the way, let me say that the book is well worth the ...

  1. Fusion Energy: Contextual Analysis of the Information Panels Developed by the Scientific Community versus Citizen Discourse

    International Nuclear Information System (INIS)

    The report presents an exploratory study on the impact of scientific dissemination, in particular a comparative analysis of two discourses on fusion energy as an alternative energy future. The report introduces a comparative analysis of the institutional discourse, as portrayed by the scientific jargon used in a European travelling exhibition on nuclear fusion, the Fusion Expo, and the social discourse, as illustrated by a citizen deliberation on this very same exhibition. Through textual analysis, the scientific discourse deployed in the informative panels at the Fusion Expo is compared with the citizen discourse as developed in the discussions within the citizen groups. The ConText software was applied for this analysis. The purpose is to analyze how visitors assimilate, capture and understand highly technical information. Results suggest that, despite points of convergence, the two discourses present certain differences, showing diverse levels of communication. The scientific discourse shows a great profusion of formalisms and the technicalities of scientific jargon. The citizen discourse shows an abundance of words associated with daily life and more practical aspects (economy, efficiency), concerning institutional and evaluative references. In sum, the study shows that although there are a few common communicative spaces, there are still very few turning points. These data indicate that although exhibitions can be a good tool for disseminating advances in fusion energy in informal learning contexts, public feedback is a powerful tool for improving the quality of social dialogue. (Author)

  2. Collection and Analysis of Open Source News for Information Awareness and Early Warning in Nuclear Safeguards

    International Nuclear Information System (INIS)

    Acquisition and analysis of open source information plays an increasingly important role in the IAEA's move towards safeguards implementation based on all safeguards-relevant information known about a State. The growing volume of open source information requires the development of technology and tools capable of effectively collecting relevant information, filtering out “noise”, organizing valuable information in a clear and accessible manner, and assessing its relevance. In this context, the IAEA's Division of Information Management (SGIM) and the EC's Joint Research Centre (JRC) are currently implementing a joint project to advance the effectiveness and efficiency of the IAEA's workflow for open source information collection and analysis. The objective is to provide tools to support SGIM in the production of the SGIM Open Source Highlights, a daily news brief consisting of the most pertinent news stories relevant to safeguards and non-proliferation. The process involves the review and selection of hundreds of articles from a wide array of specifically selected sources. The joint activity exploits the JRC's Europe Media Monitor (EMM) and NewsDesk applications: EMM automatically collects and analyses news articles from a pre-defined list of web sites, and NewsDesk allows an analyst to manually select the most relevant articles from the EMM stream for further processing. The paper discusses the IAEA's workflow for the production of SGIM Open Source Highlights and describes the capabilities of EMM and NewsDesk. It then provides an overview of the joint activities since the project started in 2011, which focused (i) on setting up a separate EMM installation dedicated to the nuclear safeguards and security domain (the Nuclear Security Media Monitor, NSMM) and (ii) on evaluating the NSMM/NewsDesk for meeting the IAEA's needs. Finally, it presents the current use of NSMM/NewsDesk at the IAEA and proposes options for further integration with the

  3. Applying a sociolinguistic model to the analysis of informed consent documents.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions. PMID:19889919

  4. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
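Of the five methods, the method of moments is the simplest to sketch for a failure probability: fit a Beta(a, b) prior to historical failure rates, then update it with newly observed data via beta-binomial conjugacy (a generic illustration, not the paper's specific datasets):

```python
def fit_beta_prior(rates):
    """Method-of-moments Beta(a, b) fit to a list of historical failure rates."""
    n = len(rates)
    m = sum(rates) / n                                 # sample mean
    v = sum((r - m) ** 2 for r in rates) / (n - 1)     # sample variance
    common = m * (1 - m) / v - 1                       # requires v < m(1-m)
    return m * common, (1 - m) * common

def posterior_mean(a, b, failures, trials):
    """Posterior mean of the failure probability after observing
    `failures` in `trials` (Beta prior is conjugate to the binomial)."""
    return (a + failures) / (a + b + trials)
```

By construction the fitted prior mean a/(a+b) equals the mean of the historical rates; the effective sample size a+b then controls how strongly new observations pull the posterior away from it.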

  5. SWOT analysis on National Common Geospatial Information Service Platform of China

    Science.gov (United States)

    Zheng, Xinyan; He, Biao

    2010-11-01

    Currently, the trend in international surveying and mapping is shifting from map production to integrated geospatial information services, such as the GOS of the U.S. Under this circumstance, surveying and mapping in China is inevitably shifting from 4D product services to services centered on NCGISPC (the National Common Geospatial Information Service Platform of China). Although the State Bureau of Surveying and Mapping of China has already provided a great quantity of geospatial information services to various lines of business, such as emergency and disaster management, transportation, water resources, and agriculture, the shortcomings of the traditional service mode are increasingly obvious, given the growing requirements of e-government construction, the remarkable development of IT, and the emerging online geospatial service demands of various lines of business. NCGISPC, which aims to provide authoritative online one-stop geospatial information services, and APIs for further development, to government, business, and the public, is now the strategic core of the SBSM (State Bureau of Surveying and Mapping of China). This paper focuses on the paradigm shift that NCGISPC brings about, using a SWOT (Strength, Weakness, Opportunity and Threat) analysis, compared to the service mode based on 4D products. Though NCGISPC is still at an early stage, it represents the future service mode for geospatial information in China, and will surely have a great impact not only on the construction of digital China, but also on the way everyone uses geospatial information services.

  6. Celebrity Health Announcements and Online Health Information Seeking: An Analysis of Angelina Jolie's Preventative Health Decision.

    Science.gov (United States)

    Dean, Marleah

    2016-01-01

    On May 14, 2013, Angelina Jolie disclosed she carries BRCA1, which means she has an 87% risk of developing breast cancer during her lifetime. Jolie decided to undergo a preventative bilateral mastectomy (PBM), reducing her risk to 5%. The purpose of this study was to analyze the type of information individuals are exposed to when using the Internet to search health information regarding Jolie's decision. Qualitative content analysis revealed four main themes--information about genetics, information about a PBM, information about health care, and information about Jolie's gender identity. Broadly, the identified websites mention Jolie's high risk for developing cancer due to the genetic mutation BRCA1, describe a PBM occasionally noting reasons why she had this surgery and providing alternatives to the surgery, discuss issues related to health care services, costs, and insurances about Jolie's health decision, and portray Jolie as a sexual icon, a partner to Brad Pitt, a mother of six children, and an inspirational humanitarian. The websites also depict Jolie's health decision in positive, negative, and/or both ways. Discussion centers on how this actress' health decision impacts the public. PMID:26574936

  7. Wavelet q-Fisher Information for Scaling Signal Analysis

    Directory of Open Access Journals (Sweden)

    Joel Trejo-Sanchez

    2012-08-01

    Full Text Available This article first introduces the concept of wavelet q-Fisher information and then derives a closed-form expression of this quantifier for scaling signals of parameter α. It is shown that this information measure appropriately describes the complexities of scaling signals and provides further analysis flexibility with the parameter q. In the limit of q → 1, wavelet q-Fisher information reduces to the standard wavelet Fisher information, and for q > 2 it reverses its behavior. Experimental results on synthesized fGn signals validate the level-shift detection capabilities of wavelet q-Fisher information. A comparative study also shows that wavelet q-Fisher information locates structural changes in correlated and anti-correlated fGn signals in a way comparable with standard breakpoint location techniques but at a fraction of the time. Finally, the application of this quantifier to H.263-encoded video signals is presented.
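    The quantifier described in this record can be illustrated with a toy computation. The sketch below is an assumption-laden illustration, not the article's exact definition: it performs a Haar-style wavelet decomposition with plain NumPy, forms the relative wavelet energy p_j per scale, and evaluates a discrete Fisher-like measure sum_j (p_{j+1} - p_j)^2 / p_j (the q → 1 case); the q-deformation studied in the article is not reproduced here.

    ```python
    import numpy as np

    def haar_detail_energies(x, levels):
        """Relative wavelet energy per scale from a Haar-like pyramid."""
        approx = np.asarray(x, dtype=float)
        energies = []
        for _ in range(levels):
            approx = approx[: len(approx) // 2 * 2]            # ensure even length
            detail = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)
            approx = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)
            energies.append(np.sum(detail ** 2))
        p = np.array(energies)
        return p / p.sum()                                     # normalized energies

    def wavelet_fisher_information(p):
        """Discrete Fisher-like measure over scale energies (q -> 1 limit)."""
        return float(np.sum(np.diff(p) ** 2 / p[:-1]))

    rng = np.random.default_rng(0)
    signal = rng.standard_normal(1024)         # white-noise stand-in for an fGn signal
    p = haar_detail_energies(signal, levels=6)
    print(p.sum(), wavelet_fisher_information(p) >= 0.0)
    ```

    For white noise the energies are roughly flat across scales, so the measure stays small; a level shift concentrates energy at coarse scales and inflates it, which is the detection idea the abstract describes.
    
    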

  8. NASA Informal Education: Final Report. A Descriptive Analysis of NASA's Informal Education Portfolio: Preliminary Case Studies

    Science.gov (United States)

    Rulf Fountain, Alyssa; Levy, Abigail Jurist

    2010-01-01

    This report was requested by the National Aeronautics and Space Administration's (NASA), Office of Education in July 2009 to evaluate the Informal Education Program. The goals of the evaluation were twofold: (1) to gain insight into its investment in informal education; and (2) to clarify existing distinctions between its informal education…

  9. Information Communication Technology and Politics: A Synthesized Analysis of the Impacts of Information Technology on Voter Participation in Kenya

    Science.gov (United States)

    Tsuma, Clive Katiba

    2011-01-01

    The availability of political information throughout society made possible by the evolution of contemporary information communication technology has precipitated conflicting debate regarding the effects of technology use on real life political participation. Proponents of technology argue that the use of new information technology stimulates…

  10. Empowering Students to Make Sense of an Information-Saturated World: The Evolution of "Information Searching and Analysis"

    Science.gov (United States)

    Wittebols, James H.

    2016-01-01

    How well students conduct research online is an increasing concern for educators at all levels, especially higher education. This paper describes the evolution of a course that examines confirmation bias, information searching, and the political economy of information as keys to becoming more information and media literate. After a key assignment…

  11. The 2006 Analysis of Information Remaining on Disks Offered for Sale on the Second Hand Market

    Directory of Open Access Journals (Sweden)

    Andy Jones

    2006-09-01

    Full Text Available All organisations, whether in the public or private sector, use computers for the storage and processing of information relating to their business or services, their employees and their customers. A large proportion of families and individuals in their homes now also use personal computers and, both intentionally and inadvertently, often store personal information on those computers. It is clear that most organisations and individuals remain unaware of the information that may be stored on the hard disks their computers contain, and have not considered what may happen to that information after the disposal of the equipment. In 2005, joint research was carried out by the University of Glamorgan in Wales and Edith Cowan University in Australia to determine whether second-hand computer disks purchased from a number of sources still contained any information or whether the information had been effectively erased. The research revealed that, for the majority of the disks examined, the information had not been effectively removed and, as a result, both organisations and individuals were exposed to a range of potential crimes. It is worth noting that, in disposing of this equipment, the organisations involved had failed to meet their statutory, regulatory and legal obligations. This paper describes a second research project, carried out in 2006, which repeated the research of the previous year and also extended its scope to include additional countries. The methodology was the same as in the previous year, and the disks used for the research were again supplied blind by a third party. The research involved the forensic imaging of the disks, followed by an analysis to determine what information remained and whether it could be easily recovered using publicly available tools and techniques.

  12. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    Science.gov (United States)

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns translate into technical problems, e.g., sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. First, we use an abstract function to represent the entire wireless localization system model. The unknown vector of the CRLB then consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method, and NLOS identification and mitigation methods. Thus, the theoretical results more closely approach the real case. In addition, our method is more adaptable than other CRLBs when more unknown but important factors are considered. We use the proposed method to analyze a wireless sensor network-based indoor localization system. The influence of hybrid LOS/NLOS channels, building layout information and the relative height differences between the target and anchors is analyzed. It is demonstrated that our method exploits all of the available information for
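    As a hedged illustration of the kind of bound this record discusses (a textbook special case, not the authors' fused formulation): for range-only localization with Gaussian measurement noise, each anchor contributes (1/σ²)·u·uᵀ to the Fisher information matrix, where u is the unit vector from the target toward that anchor, and the position CRLB is the trace of the inverse FIM.

    ```python
    import numpy as np

    def range_only_crlb(target, anchors, sigma):
        """CRLB on 2-D position MSE for range measurements with N(0, sigma^2) noise."""
        target = np.asarray(target, dtype=float)
        fim = np.zeros((2, 2))
        for a in np.asarray(anchors, dtype=float):
            diff = target - a
            u = diff / np.linalg.norm(diff)        # unit direction target -> anchor
            fim += np.outer(u, u) / sigma**2       # per-anchor Fisher contribution
        return float(np.trace(np.linalg.inv(fim)))  # lower bound on position MSE

    anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
    bound = range_only_crlb((4.0, 6.0), anchors, sigma=0.5)
    print(bound > 0.0)
    ```

    Moving the anchors toward a common line makes the FIM ill-conditioned and the bound blow up, which is exactly the target-anchor geometry effect the abstract mentions.
    
    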

  13. The importance of open source information and satellite imagery analysis in safeguards implementation

    International Nuclear Information System (INIS)

    Full text: Since 1996, the analysis and evaluation of open source information, including satellite imagery, has become key to safeguards implementation. The Department of Safeguards' dedicated, open source database now contains over 4 million records. The satellite imagery database has approximately 7000 images. An important strategy is to collect open source information from as many diverse sources as possible. Information is collected from many geographical regions and in different languages. The types of information collected range from general news reports to highly specialized technical articles. Source diversity contributes towards the important objective of assessing the credibility of information obtained. Open sources provide a solid basis for assessing whether a State has the economic and industrial capabilities to support the development of nuclear weapons. Scientific literature is also a valuable indicator of nuclear capabilities and of dual-use technologies used in non-nuclear applications that could also be applied to nuclear ones. Information relating to company and business activities is important for improving knowledge of imports and exports of safeguards relevance and for assessing technological capabilities in the States concerned. Commercial satellite imagery is a powerful tool used in conjunction with open sources and with State-declared information which gives details of specific geographical locations. Routinely, satellite imagery is used to assess the functions and capabilities of research and nuclear fuel cycle facilities. It can also be helpful in detecting any changes in the features and characteristics of locations of safeguards relevance and in the identification of nuclear-related activities. For the future, it will be important to utilize software applications of greater sophistication in order to extract the knowledge from the large and rapidly growing databases of open source and satellite imagery information. Keyword search and

  14. Construction Process Simulation and Safety Analysis Based on Building Information Model and 4D Technology

    Institute of Scientific and Technical Information of China (English)

    HU Zhenzhong; ZHANG Jianping; DENG Ziyin

    2008-01-01

    Time-dependent structure analysis theory has been proved to be more accurate and reliable compared to commonly used methods during construction. However, so far applications are limited to a partial period and part of the structure because of immeasurable artificial intervention. Based on the building information model (BIM) and four-dimensional (4D) technology, this paper proposes an improved structure analysis method, which can generate structural geometry, resistance model, and loading conditions automatically through a close interlink of the schedule information, architectural model, and material properties. The method was applied to a safety analysis during a continuous and dynamic simulation of the entire construction process. The results show that the organic combination of the BIM, 4D technology, construction simulation, and safety analysis of time-dependent structures is feasible and practical. This research also lays a foundation for further research on building lifecycle management by combining architectural design, structure analysis, and construction management.

  15. Information Gap Analysis: near real-time evaluation of disaster response

    Science.gov (United States)

    Girard, Trevor

    2014-05-01

    Disasters, such as major storm events or earthquakes, trigger an immediate response by the disaster management system of the nation in question. The quality of this response is a large factor in its ability to limit the impacts on the local population. Improving the quality of disaster response therefore reduces disaster impacts. Studying past disasters is a valuable exercise to understand what went wrong, identify measures which could have mitigated these issues, and make recommendations to improve future disaster planning and response. While such ex post evaluations can lead to improvements in the disaster management system, there are limitations. The main limitation that has influenced this research is that ex post evaluations cannot inform the disaster response being assessed, for the obvious reason that they are carried out long after the response phase is over. The result is that lessons learned can only be applied to future disasters. In the field of humanitarian relief, this limitation has led to the development of real-time evaluations. The key aspect of real-time humanitarian evaluations is that they are completed while the operation is still underway. This results in findings being delivered at a time when they can still make a difference to the humanitarian response. Applying such an approach to the immediate disaster response phase requires an even shorter time frame, as well as a shift in focus from international actors to the government of the nation in question. As such, a pilot study was started and a methodology developed to analyze disaster response in near real-time. The analysis uses the information provided by the disaster management system within the first 0 - 5 days of the response. The data is collected from publicly available sources such as ReliefWeb and sorted under various categories which represent each aspect of disaster response. This process was carried out for 12 disasters. The quantity and timeliness of information

  16. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM) such as regression splines or decision tree induction can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA), and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore

  17. Information Security Management: ANP Based Approach for Risk Analysis and Decision Making

    Directory of Open Access Journals (Sweden)

    H. Brožová

    2016-03-01

    Full Text Available In information systems security, the objectives of the risk analysis process are to help identify new threats and vulnerabilities, to estimate their business impact and to provide a dynamic set of tools to control the security level of the information system. The identification of risk factors, as well as the estimation of their business impact, requires tools for the assessment of risk on multi-value scales according to different stakeholders' points of view. Therefore, the purpose of this paper is to model the risk analysis decision-making problem using a semantic network to develop the decision network and the Analytical Network Process (ANP), which allows solving complex problems taking into consideration quantitative and qualitative data. As a decision support technique, ANP also measures the dependency among risk factors related to the elicitation of individual judgement. An empirical study involving the Forestry Company is used to illustrate the relevance of ANP.
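    As a generic, hedged illustration of how ANP-style priorities are derived (the paper's actual network structure and expert judgments are not reproduced here): the priority vector of a reciprocal pairwise-comparison matrix is its principal eigenvector, which power iteration recovers. The comparison values below are hypothetical.

    ```python
    import numpy as np

    def priority_vector(comparison, iters=100):
        """Principal eigenvector of a reciprocal pairwise-comparison matrix."""
        a = np.asarray(comparison, dtype=float)
        w = np.ones(a.shape[0]) / a.shape[0]
        for _ in range(iters):
            w = a @ w
            w /= w.sum()                  # keep priorities normalized to sum to 1
        return w

    # Hypothetical judgments: risk factor A is 3x as important as B, 5x as important as C.
    A = [[1.0, 3.0, 5.0],
         [1/3, 1.0, 2.0],
         [1/5, 1/2, 1.0]]
    w = priority_vector(A)
    print(w.argmax() == 0)   # True: A receives the largest priority weight
    ```

    In a full ANP model these local priority vectors are assembled into a supermatrix whose limit captures the dependencies among risk factors that the abstract emphasizes.
    
    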

  18. Analysis of public consciousness structure and consideration of information supply against the nuclear power generation

    International Nuclear Information System (INIS)

    The Energy Engineering Research Institute carried out six questionnaire surveys analyzing the structure of public consciousness regarding nuclear power generation over fiscal years 1986 to 1999, obtaining a large amount of information on public attitudes toward nuclear power generation. Because the JCO criticality accident of September 1999, the first accident in Japan to claim victims, occurred after the fiscal year 1998 survey and was expected to change public consciousness of nuclear power generation, the same questionnaire as in the previous fiscal year was administered to the same respondents after the accident, to analyze how their evaluation of nuclear power generation, the factors determining their behavior, and so forth were changed by the accident. In this paper, with reference to the results of the past questionnaires, the questionnaire results and analyses carried out before and after the JCO criticality accident are presented, and the information supply they suggest is considered. (G.K.)

  19. An analysis method of the press information related with the nuclear activity in Argentina

    International Nuclear Information System (INIS)

    The articles published by newspapers during 1987 were analyzed and classified according to their contents. An attribute (positive, negative or neutral) was assigned to each article according to its connotation regarding nuclear activity in Argentina. An ISIS-based database system was developed using these data. The purpose of this analysis was to evaluate the influence of the press on public opinion. The relations between the different variables show the importance and the approach (environmental, technical-scientific or political) given by the press to the different subjects. The results show a general lack of knowledge about nuclear activities and a concern among readers about environmental risks, which calls for the development of an information program for the community. The fundamentals of this program should be to improve the organization so that information reaches external demands, to promote educational programs, and to continuously provide information to the press. (S.M.)

  20. Local mine production safety supervision game analysis based on incomplete information

    Institute of Scientific and Technical Information of China (English)

    LI Xing-dong; LI Ying; REN Da-wei; LIU Zhao-xia

    2007-01-01

    Using the fundamental theory and analysis methods of repeated games with incomplete information, this paper introduces incomplete information into repeated games and establishes a two-stage dynamic game model of the local authority and the coal mine owner. The analysis indicates that as long as the country establishes a corresponding rewards-and-punishments incentive mechanism for the local authority departments responsible for the work, safety accidents in coal mines will be reported on time. The conclusion about whether the local government displays cooperative or non-cooperative behavior changes with the introduction of incomplete information. Only if the local authority fulfills its responsibility can unsafe accidents be controlled effectively. Once this kind of cooperation by local government appears, the country's costs and difficulty of safety supervision can decrease greatly.
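    A minimal numerical sketch of the incentive argument follows. All payoff values are hypothetical, chosen only to illustrate the mechanism, not taken from the paper: when the expected penalty for concealing an accident outweighs the gain from concealment, truthful reporting becomes the local authority's best response.

    ```python
    # Hypothetical incentive check: the local authority decides whether to report
    # a mine accident; the central government audits with probability audit_prob.
    def authority_payoff(report, reward, penalty, bribe, audit_prob):
        """Expected payoff to the local authority (illustrative numbers only)."""
        if report:
            return reward                        # rewarded for truthful reporting
        # Concealment keeps the bribe but risks the penalty if audited.
        return bribe - audit_prob * penalty

    reward, bribe, audit_prob = 2.0, 5.0, 0.4
    weak_penalty, strong_penalty = 5.0, 20.0

    # With a weak penalty, concealment pays; a strong penalty flips the incentive.
    print(authority_payoff(False, reward, weak_penalty, bribe, audit_prob) >
          authority_payoff(True, reward, weak_penalty, bribe, audit_prob))      # True
    print(authority_payoff(True, reward, strong_penalty, bribe, audit_prob) >
          authority_payoff(False, reward, strong_penalty, bribe, audit_prob))   # True
    ```

    The paper's repeated-game setting adds reputation effects across stages, but the one-shot comparison above already shows why the rewards-and-punishments mechanism matters.
    
    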

  1. Wireless Information-Theoretic Security in an Outdoor Topology with Obstacles: Theoretical Analysis and Experimental Measurements

    Directory of Open Access Journals (Sweden)

    Dagiuklas Tasos

    2011-01-01

    Full Text Available This paper presents a Wireless Information-Theoretic Security (WITS) scheme, which has recently been introduced as a robust physical layer-based security solution, especially for infrastructureless networks. An autonomic network of moving users was implemented via 802.11n nodes of an ad hoc network for an outdoor topology with obstacles. Obstructed-Line-of-Sight (OLOS) and Non-Line-of-Sight (NLOS) propagation scenarios were examined. Low-speed user movement was considered, so that Doppler spread could be discarded. A transmitter and a legitimate receiver exchanged information in the presence of a moving eavesdropper. Average Signal-to-Noise Ratio (SNR) values were acquired for both the main and the wiretap channel, and the Probability of Nonzero Secrecy Capacity was calculated based on the theoretical formula. Experimental results validate the theoretical findings, stressing the importance of user location and mobility schemes on the robustness of Wireless Information-Theoretic Security, and call for further theoretical analysis.
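    For orientation, a closed form commonly used in the WITS literature for quasi-static Rayleigh fading gives the Probability of Nonzero Secrecy Capacity as the ratio of the main channel's average SNR to the sum of the main and wiretap average SNRs. Whether this paper uses exactly this expression or a variant is an assumption here; the sketch only illustrates the quantity's behavior.

    ```python
    def prob_nonzero_secrecy_capacity(snr_main, snr_wiretap):
        """P(Cs > 0) under quasi-static Rayleigh fading: ratio of average SNRs."""
        return snr_main / (snr_main + snr_wiretap)

    # Main channel 10x stronger than the eavesdropper's: secrecy is very likely.
    print(round(prob_nonzero_secrecy_capacity(10.0, 1.0), 3))   # 0.909
    # Equal average SNRs: a coin flip, which is why eavesdropper location matters.
    print(prob_nonzero_secrecy_capacity(1.0, 1.0))              # 0.5
    ```

    This makes the abstract's point concrete: the legitimate receiver's and eavesdropper's relative positions (through their average SNRs) directly set the security level.
    
    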

  2. Webometric analysis of departments of librarianship and information science: a follow-up study

    OpenAIRE

    Arakaki, M.; Willett, P.

    2009-01-01

    This paper reports an analysis of the websites of UK departments of library and information science. Inlink counts of these websites revealed no statistically significant correlation with the quality of the research carried out by these departments, as quantified using departmental grades in the 2001 Research Assessment Exercise and citations in Google Scholar to publications submitted for that Exercise. Reasons for this lack of correlation include: difficulties in disambiguating department...

  3. Open source tools for the information theoretic analysis of neural data

    OpenAIRE

    Alberto Mazzoni; Petersen, Rasmus S.

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus...

  4. Transportation Routing Analysis Geographic Information System (TRAGIS) User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, PE

    2003-09-18

    The Transportation Routing Analysis Geographic Information System (TRAGIS) model is used to calculate highway, rail, or waterway routes within the United States. TRAGIS is a client-server application with the user interface and map data files residing on the user's personal computer and the routing engine and network data files on a network server. The user's manual provides documentation on installation and the use of the many features of the model.

  5. Enterprise Architecture for Information System Analysis : Modeling and assessing data accuracy, availability, performance and application usage

    OpenAIRE

    Per, Närman

    2012-01-01

    Decisions concerning IT systems are often made without adequate decision-support. This has led to unnecessary IT costs and failures to realize business benefits. The present thesis presents a framework for analysis of four information systems properties relevant to IT decision-making. The work is founded on enterprise architecture, a model-based IT and business management discipline. Based on the existing ArchiMate framework, a new enterprise architecture framework has been developed and impl...

  6. An Illumination Invariant Face Detection Based on Human Shape Analysis and Skin Color Information

    OpenAIRE

    Dibakar Chakraborty

    2012-01-01

    This paper provides a novel approach towards face area localization through analyzing the shape characteristics of the human body. The face region is extracted by determining the sharp increase in body pixels from the neck region to the shoulder area. To confirm the face area, skin color information is also analyzed. The experimental analysis shows that the proposed algorithm detects the face area effectively and its performance is found to be quite satisfactory.

  7. An Illumination Invariant Face Detection Based on Human Shape Analysis and Skin Color Information

    Directory of Open Access Journals (Sweden)

    Dibakar Chakraborty

    2012-07-01

    Full Text Available This paper provides a novel approach towards face area localization through analyzing the shape characteristics of the human body. The face region is extracted by determining the sharp increase in body pixels from the neck region to the shoulder area. To confirm the face area, skin color information is also analyzed. The experimental analysis shows that the proposed algorithm detects the face area effectively and its performance is found to be quite satisfactory.

  8. Study on the Application of Information Technologies on Suitability Evaluation Analysis in Agriculture

    OpenAIRE

    Yu, Ying; Shi, Leigang; Huai, Heju; Li, Cunjun

    2013-01-01

    Suitability evaluation research in agriculture is expounded in three aspects in this paper: land suitability, climatic suitability and crop ecological suitability. The suitability evaluation methods used in agriculture are summarized systematically, including traditional mathematical model methods (fuzzy comprehensive assessment analysis, AHP, etc.) and modern information technologies (GIS, RS and ES, etc.). The future development trends of suitability evaluation in agriculture are poin...

  9. Analysis Framework for the Interaction Between Lean Construction and Building Information Modelling

    OpenAIRE

    Sacks, Rafael; Dave, Bhargav; Koskela, Lauri; Owen, Robert

    2009-01-01

    Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this persp...

  10. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    OpenAIRE

    David Nicholas; David Clark; Hamid R. Jamali; Anthony Watkinson

    2014-01-01

    The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers as producers and consumers of scholarly information resources with respect to how they determine authority and trustworthiness. The research questions for the study arose out of CIBER's studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publish...

  11. A Panel Analysis of the Strategic Association Between Information and Communication Technology and Public Health Delivery

    OpenAIRE

    Wu, Sarah Jinhui; Raghupathi, Wullianallur

    2012-01-01

    Background: In this exploratory research, we use panel data analysis to examine the correlation between information and communication technology (ICT) and public health delivery at the country level. Objective: The goal of this exploratory research is to examine the strategic association over time between ICTs and country-level public health. Methods: Using data from the World Development Indicators, we construct a panel data set of countries of five different income levels and look closely at ...

  12. Comparison of Seven Methods for Boolean Factor Analysis and Their Evaluation by Information Gain

    Czech Academy of Sciences Publication Activity Database

    Frolov, A.; Húsek, Dušan; Polyakov, P.Y.

    2016-01-01

    Vol. 27, No. 3 (2016), pp. 538-550. ISSN 2162-237X R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:67985807 Keywords : associative memory * bars problem (BP) * Boolean factor analysis (BFA) * data mining * dimension reduction * Hebbian learning rule * information gain * likelihood maximization (LM) * neural network application * recurrent neural network * statistics Subject RIV: IN - Informatics, Computer Science Impact factor: 4.291, year: 2014

  13. An Illumination Invariant Face Detection Based on Human Shape Analysis and Skin Color Information

    Directory of Open Access Journals (Sweden)

    Dibakar Chakraborty

    2012-06-01

    Full Text Available This paper provides a novel approach towards face area localization through analyzing the shape characteristics of the human body. The face region is extracted by determining the sharp increase in body pixels from the neck region to the shoulder area. To confirm the face area, skin color information is also analyzed. The experimental analysis shows that the proposed algorithm detects the face area effectively and its performance is found to be quite satisfactory.

  14. The dynamic of information-driven coordination phenomena: a transfer entropy analysis

    OpenAIRE

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2015-01-01

    Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social syste...

  15. The dynamics of information-driven coordination phenomena: A transfer entropy analysis

    OpenAIRE

    Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro

    2016-01-01

    Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This met...
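    For intuition about the estimator these two records describe, here is a minimal plugin transfer-entropy computation on symbolized (binary) time series. It is a generic sketch, not the authors' pipeline: the symbolization used by the symbolic method is reduced to binary states, and history length is fixed at one step.

    ```python
    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y):
        """Plugin estimate of TE_{X->Y} in bits for binary sequences x, y."""
        triples = list(zip(y[1:], y[:-1], x[:-1]))    # (y_next, y_prev, x_prev)
        n = len(triples)
        p_xyz = Counter(triples)
        p_yz = Counter((yn, yp) for yn, yp, _ in triples)
        p_z = Counter((yp, xp) for _, yp, xp in triples)
        p_y = Counter(yp for _, yp, _ in triples)
        te = 0.0
        for (yn, yp, xp), c in p_xyz.items():
            cond_full = c / p_z[(yp, xp)]             # p(y_next | y_prev, x_prev)
            cond_self = p_yz[(yn, yp)] / p_y[yp]      # p(y_next | y_prev)
            te += (c / n) * np.log2(cond_full / cond_self)
        return te

    rng = np.random.default_rng(1)
    x = (rng.random(5000) > 0.5).astype(int)
    y = np.roll(x, 1)                 # y copies x with a one-step lag
    y[0] = 0
    print(transfer_entropy(x, y) > transfer_entropy(y, x))   # True: influence is x -> y
    ```

    Applied pairwise to geolocalized activity series, this asymmetry is what lets the analysis extract a directed network of influence among sub-units, as the abstracts describe.
    
    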

  16. Geographic Information Systems, Remote Sensing, and Spatial Analysis Activities in Texas, 2008-09

    Science.gov (United States)

    U.S. Geological Survey

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  17. A Time Series Analysis of Cancer-Related Information Seeking: Hints From the Health Information National Trends Survey (HINTS) 2003-2014.

    Science.gov (United States)

    Huerta, Timothy R; Walker, Daniel M; Johnson, Tyler; Ford, Eric W

    2016-09-01

    Recent technological changes, such as the growth of the Internet, have made cancer information widely available. However, it remains unknown whether changes in access have resulted in concomitant changes in information seeking behavior. Previous work explored the cancer information seeking behaviors of the general population using the 2003 Health Information National Trends Survey (HINTS). This article aims to reproduce, replicate, and extend that existing analysis using the original dataset and five additional iterations of HINTS (2007, 2011, 2012, 2013, 2014). This approach builds on the earlier work by quantifying the magnitude of change in information seeking behaviors. Bivariate comparison of the 2003 and 2014 data revealed very similar results; however, the multivariate model including all years of data indicated differences between the original and extended models: individuals age 65 and older were no longer less likely to seek cancer information than the 18-35 reference population, and Hispanics were also no longer less likely to be cancer information seekers. The results of our analysis indicate an overall shift in cancer information seeking behaviors and also illuminate the impact of increased Internet usage over the past decade, suggesting specific demographic groups that may benefit from cancer information seeking encouragement. PMID:27565190

  18. Geographic Information Systems, Remote Sensing, and Spatial Analysis Activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  19. Trial sequential analysis reveals insufficient information size and potentially false positive results in many meta-analyses

    DEFF Research Database (Denmark)

    Brok, Jesper; Thorlund, Kristian; Gluud, Christian;

    2008-01-01

    To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous ...
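
The required information size that TSA computes can be illustrated with the conventional two-arm sample-size formula for a continuous outcome (a minimal sketch of the standard calculation, not the authors' software; the effect size and variance below are illustrative):

```python
from math import ceil
from statistics import NormalDist

def required_information_size(delta, sigma, alpha=0.05, beta=0.20):
    """Conventional required information size (total participants) for a
    two-arm comparison with a continuous outcome:
    IS = 4 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided type I error
    z_b = z.inv_cdf(1 - beta)        # power = 1 - beta
    return ceil(4 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

# Detecting a 0.5-SD effect at 5% alpha with 80% power:
n_required = required_information_size(delta=0.5, sigma=1.0)  # -> 126
```

A meta-analysis whose accumulated participants fall short of this number is, in TSA terms, underpowered, and its monitoring boundaries are adjusted accordingly.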

  20. Development program of autopsychological competence of future specialists in information technology: content and analysis of the effectiveness

    OpenAIRE

    Проскурка, Наталія Миколаївна

    2016-01-01

    The article deals with the peculiarities of developing autopsychological competence in future information technology specialists. The content of the program for developing this competence is described, and an analysis of the effectiveness of the program's implementation is presented.

  1. Information dimension analysis of bacterial essential and nonessential genes based on chaos game representation

    International Nuclear Information System (INIS)

    Essential genes are indispensable for the survival of an organism, and selecting features associated with gene essentiality is fundamental to predicting and identifying essential genes with computational techniques. We use fractal theory to make a comparative analysis of essential and nonessential genes in bacteria. The information dimensions of the essential and nonessential genes available in the DEG database for 27 bacteria are calculated from their gene chaos game representations (CGRs). We find a weak positive linear correlation between information dimension and gene length. Moreover, for genes of similar length, the average information dimension of essential genes is larger than that of nonessential genes, indicating that essential genes show less regularity and higher complexity than nonessential genes. Our results show that for a bacterium with similar numbers of essential and nonessential genes, the CGR information dimension helps separate the two classes. The gene CGR information dimension is therefore very probably a useful feature for genetic algorithms that predict essential genes.
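
The information-dimension estimate described in this record can be sketched in a few lines (a simplified illustration, not the study's code: it maps a DNA string onto the unit square via the chaos game and takes the box entropy at a single grid scale, rather than the limit over scales used for a true D1 estimate):

```python
from collections import Counter
from math import log2

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    """Chaos game representation: each base pulls the point halfway
    toward that base's corner of the unit square."""
    x, y = 0.5, 0.5
    pts = []
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        pts.append((x, y))
    return pts

def information_dimension(points, k=4):
    """Single-scale estimate of D1: box entropy on a 2^k x 2^k grid
    divided by log2 of the boxes per side, i.e. H(boxes) / k."""
    n = 2 ** k
    boxes = Counter((min(int(x * n), n - 1), min(int(y * n), n - 1))
                    for x, y in points)
    total = sum(boxes.values())
    entropy = -sum((c / total) * log2(c / total) for c in boxes.values())
    return entropy / k

d1 = information_dimension(cgr_points("ATGCGATTACAGGCTTAACGT" * 20))
```

A highly repetitive sequence concentrates its CGR points in few boxes and yields a lower value than a mixed sequence, which is the regularity contrast the abstract describes.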

  2. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    Science.gov (United States)

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

    Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, vocal expression is a potentially important biometric index of information processing, not only across individuals but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual-attention experimental task in which participants produced natural speech while simultaneously engaged in a baseline, medium, or high nonverbal processing-load task. Objective, automated, computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall, and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and they inform the development of an automated, inexpensive, and noninvasive biometric measure of information processing. PMID:25656172

  3. An Analysis of Information Technology on Data Processing by using Cobit Framework

    Directory of Open Access Journals (Sweden)

    Surni Erniwati

    2015-09-01

    Information technology and its processes are interconnected, directing and controlling the company in achieving corporate goals through added value while balancing the risks and benefits of information technology. This study aims to analyze the maturity level of the data management process and to produce information technology recommendations regarding the management of IT, so that IT support for academic services can be improved. The maturity level was calculated by analyzing questionnaires on the state of information technology. The results show that the governance of information technology in data processing at ASM Mataram is currently quite good. The current maturity value for the data management process is 2.69, meaning that the organization already has a repeatable pattern for managing activities related to data management processes. Based on the analysis of the gap between current and expected conditions, corrective actions can be taken to improve IT governance in the data management process at ASM Mataram gradually.

  4. Quantifying information transfer by protein domains: Analysis of the Fyn SH2 domain structure

    Directory of Open Access Journals (Sweden)

    Serrano Luis

    2008-10-01

    Abstract Background Efficient communication between distant sites within a protein is essential for cooperative biological response. Although often associated with large allosteric movements, more subtle changes in protein dynamics can also induce long-range correlations. However, an appropriate formalism that directly relates protein structural dynamics to information exchange between functional sites is still lacking. Results Here we introduce a method to analyze protein dynamics within the framework of information theory and show that signal transduction within proteins can be considered a particular instance of communication over a noisy channel. In particular, we analyze the conformational correlations between protein residues and apply the concept of mutual information to quantify information exchange. Mapping changes of mutual information onto the protein structure then allows visualizing how distal communication is achieved. We illustrate the approach by analyzing information transfer by the SH2 domain of Fyn tyrosine kinase, obtained from Monte Carlo dynamics simulations. Our analysis reveals that the Fyn SH2 domain forms a noisy communication channel that couples residues located in the phosphopeptide and specificity binding sites to a number of residues on the other side of the domain, near the linkers that connect the SH2 domain to the SH3 and kinase domains. We find that for this particular domain, communication is carried by a series of contiguous residues that connect distal sites by crossing the core of the SH2 domain. Conclusion Our method provides a means to map the exchange of biological information directly onto the structure of protein domains, making it clear how binding triggers conformational changes in the protein structure. As such it provides a structural route, complementing existing sequence-level approaches, to predict long-range interactions within protein structures.
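
The core quantity in this analysis, mutual information between two discretized conformational variables, can be computed directly from empirical counts (a generic sketch, not the authors' pipeline; the binary state sequences below are illustrative):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2[ p(x,y) / (p(x)p(y)) ],
    estimated from paired observations."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y) rewritten with counts to avoid tiny divisions
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Perfectly coupled binary conformational states carry 1 bit;
# independent states carry ~0 bits.
a = [0, 1, 0, 1, 0, 1, 0, 1]
mi_coupled = mutual_information(a, a)                      # -> 1.0
mi_indep = mutual_information(a, [0, 0, 1, 1, 0, 0, 1, 1])  # -> 0.0
```

Mapping such pairwise values onto residue positions is what produces the "communication channel" picture the abstract describes.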

  5. Legal analysis of the instructions for use information in the K-Files packages

    Directory of Open Access Journals (Sweden)

    Rhonan Ferreira da SILVA

    2010-06-01

    Introduction: Anvisa classifies endodontic files as medical products; therefore, all commercial trademarks sold in Brazil must have an adequate registration. To achieve this registration, several pieces of information on the product should be available to consumers (dentists) in order to allow its proper use and to avoid possible accidents. Objective: To examine whether the information set forth in endodontic K-File packages, labels, and instructions for use is in accordance with current legislation, especially the requirements established by Anvisa and the Consumer's Defense Code (CDC). Material and methods: 29 retail dental centers were visited and 11 samples of different commercial trademarks of K-Files, first series (15-40), were obtained; the information available on them was submitted to an analysis based on legal orders. Results: In all trademarks, there was no information available on how to use the product or on the means of storing the files before/after use. Only the SybronEndo trademark warned about the risks of using the files and reported criteria for number of uses and disposal. Only the Mani trademark adequately informed on how to sterilize. Conclusion: Certain rules established by Anvisa and the CDC are being disregarded concerning the display of necessary and required information that should be included on labels, instructions for use, or K-File commercial packages. Considering the large amount of information that must be available for the proper use of endodontic files, it is important that it be displayed, preferably by means of instructions for use, in the commercial package to be acquired by dentists.

  6. Information content analysis and noise characterization in remote sensing image interpretation

    Science.gov (United States)

    Corner, Brian R.

    The maximum information obtainable from an image is limited primarily by the quality of the data. The information content must be characterized to properly select the data needed and the related costs to satisfy the requirements of a given application. This dissertation examines variables such as spatial resolution, spectral resolution, and noise and how they relate to image information content. Algorithms are developed which estimate the noise standard deviation in addition to modeling and quantifying the amount of image information content. As noise is a common degradation which affects image interpretability, a novel method of noise standard deviation estimation was developed in this dissertation using data masking. The technique uses Laplacian and gradient convolution filters and compares favorably to existing methods of noise estimation. By examining the effects of noise on principal component analysis (PCA), it is observed that PCA is more effective in reducing uncorrelated normally distributed additive and multiplicative noise as compared to correlated forms of noise. This is demonstrated by noting the change in the PCA standard deviation as a function of the amount of noise. Quantification of the information content was established using an interpretability-based approach with a utility index model. The model compares the results of degraded and non-degraded imagery using a k-means unsupervised classifier. The information content is defined to be the relative classification accuracy of the imagery. The model is shown to be robust for widely varying data types and scenes. Application of the information content model to urban growth, lake surface area change, and fire burn severity verifies the effectiveness and the wide applicability of the model.
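
A fast noise-standard-deviation estimator of the convolution-based kind discussed here can be sketched as follows (this follows the well-known difference-of-Laplacians approach; it is not the dissertation's data-masking method, and the ramp image is a synthetic example):

```python
from math import pi, sqrt

# Difference of two Laplacian kernels: cancels constant and linear
# image structure, so its response on clean smooth regions is ~0.
MASK = [[1, -2, 1],
        [-2, 4, -2],
        [1, -2, 1]]

def estimate_noise_sigma(img):
    """sigma ~ sqrt(pi/2) / (6 (W-2)(H-2)) * sum |img conv MASK|,
    for img given as a 2D list of gray levels."""
    h, w = len(img), len(img[0])
    acc = 0.0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            r = sum(MASK[a][b] * img[i - 1 + a][j - 1 + b]
                    for a in range(3) for b in range(3))
            acc += abs(r)
    return sqrt(pi / 2.0) * acc / (6.0 * (w - 2) * (h - 2))

ramp = [[i + 2 * j for j in range(8)] for i in range(8)]  # noise-free gradient
sigma_clean = estimate_noise_sigma(ramp)                  # -> 0.0
```

Because the mask's coefficients and first moments sum to zero, a noise-free gradient yields exactly zero, while additive noise raises the estimate in proportion to its standard deviation.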

  7. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  8. What Are Your Patients Reading Online About Soft-tissue Fillers? An Analysis of Internet Information

    Science.gov (United States)

    Al Youha, Sarah A.; Bull, Courtney E.; Butler, Michael B.; Williams, Jason G.

    2016-01-01

    Background: Soft-tissue fillers are increasingly being used for noninvasive facial rejuvenation. They generally offer minimal downtime and reliable results. However, significant complications are reported, and patients need to be aware of these as part of informed consent. The Internet serves as a vital resource to inform patients of the risks and benefits of this procedure. Methods: Three independent reviewers performed a structured analysis of 65 Websites providing information on soft-tissue fillers. Validated instruments were used to analyze each site across multiple domains, including readability, accessibility, reliability, usability, quality, and accuracy. Associations between the endpoints and Website characteristics were assessed using linear regression and proportional odds modeling. Results: The majority of Websites were physician private practice sites (36.9%), authored by board-certified plastic surgeons or dermatologists (35.4%) or by nonphysicians (27.7%). Sites had a mean Flesch-Kincaid grade level of 11.9 ± 2.6, well above the recommended sixth- to seventh-grade reading level. Physician private practice sites had the lowest scores across all domains, with a notable lack of information on complications. Conversely, Websites of professional societies focused on plastic surgery and dermatology, as well as those of academic centers, scored highest overall. Conclusions: As the use of soft-tissue fillers rises, patients should be guided toward appropriate sources of information, such as Websites sponsored by professional societies. Medical professionals should be aware that patients may be accessing poor information online and strive to improve the overall quality of information available on soft-tissue fillers.
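
The readability metric cited, the Flesch-Kincaid grade level, is straightforward to reproduce (a rough sketch using a naive vowel-group syllable counter; production readability tools use dictionary-based syllabification, and the sample sentences are illustrative):

```python
import re

def count_syllables(word):
    """Crude heuristic: count groups of consecutive vowels, minimum 1."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """FK grade = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

simple = flesch_kincaid_grade("The cat sat. The dog ran. We all had fun.")
dense = flesch_kincaid_grade(
    "Periorbital erythema necessitates immediate dermatological evaluation.")
```

Short sentences of monosyllables score near or below grade zero, while polysyllabic medical prose of the kind found on these Websites scores far above the sixth- to seventh-grade target.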

  9. An analysis of water data systems to inform the Open Water Data Initiative

    Science.gov (United States)

    Blodgett, David L.; Read, Emily Kara; Lucido, Jessica M.; Slawecki, Tad; Young, Dwane

    2016-01-01

    Improving access to data and fostering open exchange of water information is foundational to solving water resources issues. In this vein, the Department of the Interior's Assistant Secretary for Water and Science put forward the charge to undertake an Open Water Data Initiative (OWDI) that would prioritize and accelerate work toward better water data infrastructure. The goal of the OWDI is to build out the Open Water Web (OWW). We therefore considered the OWW in terms of four conceptual functions: water data cataloging, water data as a service, enriching water data, and community for water data. To describe the current state of the OWW and identify areas needing improvement, we conducted an analysis of existing systems using a standard model for describing distributed systems and their business requirements. Our analysis considered three OWDI-focused use cases—flooding, drought, and contaminant transport—and then examined the landscape of other existing applications that support the Open Water Web. The analysis, which includes a discussion of observed successful practices of cataloging, serving, enriching, and building community around water resources data, demonstrates that we have made significant progress toward the needed infrastructure, although challenges remain. The further development of the OWW can be greatly informed by the interpretation and findings of our analysis.

  10. Structure analysis of the Polish academic information society using MDS method

    Science.gov (United States)

    Kaliczynska, Malgorzata

    2006-03-01

    The article presents the methodology of webometrics research and analysis aiming at determining similar features of objects belonging to the Polish information society, which uses the Internet and its www resources for communication purposes. In particular, the analysis applies to the selected Polish technical universities. The research was carried out in several phases - on different data groups - with regards to the Internet space and time changes. The results have been presented in a form of two and three-dimensional topography maps. For the purposes of this analysis, the computer methods of multidimensional scaling were used. The research will be further continued for a selected group of objects over a longer time frame. Its next stage will be the research on more diversified objects, also in a multinational aspect.

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
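
Task 3 above, finding possible paths from hazard sources to vulnerable entities, amounts to a path search over a directed model graph. A minimal sketch (the component names below are hypothetical, not taken from the NASA models):

```python
from collections import deque

def hazard_paths(graph, sources, targets):
    """Enumerate simple (cycle-free) propagation paths from hazard
    sources to vulnerable entities in a directed adjacency-list graph."""
    paths = []
    for src in sources:
        queue = deque([[src]])
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node in targets:
                paths.append(path)
                continue
            for nxt in graph.get(node, []):
                if nxt not in path:  # keep paths simple
                    queue.append(path + [nxt])
    return paths

# Hypothetical system-software model: can a sensor fault reach an actuator?
model = {
    "sensor_fault": ["flight_software"],
    "flight_software": ["abort_logic", "telemetry"],
    "abort_logic": ["actuator"],
    "telemetry": [],
}
found = hazard_paths(model, ["sensor_fault"], {"actuator"})
```

Each returned path is a candidate scenario for the simulation and integration-testing steps that follow.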

  12. FACTORS OF INFLUENCE ON THE ENTREPRENEURIAL INTEREST: AN ANALYSIS WITH STUDENTS OF INFORMATION TECHNOLOGY RELATED COURSES

    Directory of Open Access Journals (Sweden)

    Diego Guilherme Bonfim

    2009-10-01

    The purpose of the research was to analyze the entrepreneurial interest of students in information technology related courses. A literature review was performed, from which four hypotheses were announced, affirming that student interest in entrepreneurial activity is influenced by (1) the perceived vocation of the area, (2) the ownership of a company, (3) the perceived social support from friends and family, and (4) entrepreneurial skills mastery. A field study was developed, with data collected from 171 students of higher education institutions in Fortaleza. The data were analyzed using the statistical techniques of descriptive analysis, analysis of variance, and multiple regression analysis. It was found that (1) students, in general, have a moderate predisposition to engage in entrepreneurial activities; and (2) entrepreneurial interest is influenced by the perceived entrepreneurial vocation of the area, social support, and perceived strategic entrepreneurial skills mastery.

  13. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way and facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. The technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS and complementing business process diagramming techniques with the specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows a specification of the IS memory (e.g. a UML class diagram) to be derived systematically, and (ii) it supports reasoning about the user interface design using abstract patterns. Thi...

  14. The Search and Matching Equilibrium in an Economy with an Informal Sector: A Positive Analysis of Labor Market Policies

    OpenAIRE

    Luz Adriana Flórez

    2014-01-01

    This paper contributes to the theoretical analysis of the informal sector through the search and matching framework. Building upon the work of Albrecht, Navarro and Vroman (2009), where the informal sector consists of unregulated self-employment, I describe the search and matching equilibrium in an economy with an informal sector where workers are risk neutral and the government is able to see whether a worker is in the formal sector or the informal one. In this case, I solve the matching e...

  15. Analysis and design of a local area network information support system for the Marine Corps Air Station, Yuma, Arizona

    OpenAIRE

    Jordan, Samuel L.

    1988-01-01

    This thesis provides an analysis of the organizational information system of the Marine Corps Air Station, Yuma, Arizona. A discussion of academic theory concerning structured systems analysis and design, local area network communications standards, and the characteristics of a local area network provides the theoretical foundation for the analysis and design of a local area network information support system for the Air Station. A survey was conducted to identify the problems and the function...

  16. Comparative Analysis of Postmodern Design for Information Technology in Education in Relation to Modernism

    Directory of Open Access Journals (Sweden)

    Saeid Zarghami Hamrah

    2012-01-01

    Problem statement: The purpose of the present study is a comparative analysis of the philosophical bases of postmodernism in relation to modernism, suggesting the implications of each base for the design of information technology in education. Approach: The research method for the present study was comparative analysis. Results: The first base was the rejection of an objective view of the universe and acceptance of the "pre-objective universe". In this regard, it was suggested that information technology should be considered in relation to, and as a component of, life. The second base was doing away with totality. The implication of this base was the rejection of universal approaches and designing for specific situations. The third base was uncertainty. Regarding this base, it was suggested that educational software provide a text in which the learner confronts subjects for questioning and interpreting. The fourth base was a focus on the complexities of phenomena. On this ground, it was especially necessary for the design to be integrational. Conclusion/Recommendations: It seems that the postmodern view has been able to provide the possibility of recreating information technology in education by going beyond the basic assumptions of modernism. Finally, to escape a metanarrative view of postmodern ideas themselves, we cannot regard the solutions recommended by postmodernists as definite, final, and general solutions for the educational issues of present and past times, but we can look to them for further illumination of the state of technological education at the present time.

  17. Analysis of the Health Information and Communication System and Cloud Computing

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2015-05-01

    This paper describes a SWOT analysis and shows its use in analysing strengths, weaknesses, opportunities, and threats (risks) within the health care system. The aim is furthermore to show strengths, weaknesses, opportunities, and threats when using cloud computing in the health care system. Cloud computing in medicine is an integral part of telemedicine. Based on the information presented in this paper, employees may identify the advantages and disadvantages of using cloud computing. When introducing new information technologies into the health care business, the implementers will encounter numerous problems, such as the complexity of the existing and the new information system, the costs of maintaining and updating the software, the cost of implementing new modules, and ways of protecting the existing data in the database as well as the data that will be collected during diagnosis. Using the SWOT analysis, this paper evaluates the feasibility and possibility of adopting cloud computing in the health sector to improve health services, based on examples from abroad. The intent of cloud computing in medicine is to send the patient's data to the doctor instead of the patient delivering it himself/herself.

  18. Application of evidence theory in information fusion of multiple sources in bayesian analysis

    Institute of Scientific and Technical Information of China (English)

    周忠宝; 蒋平; 武小悦

    2004-01-01

    How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information comes from different sources; the form of the prior distribution may be easy to establish, while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions derived from the information of multiple sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method; from the combined mass function, representative points of the prior distribution are obtained. These points are then fitted to the given distribution form to determine the parameters of the prior distribution, yielding the fused prior distribution on which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to get the representative points of the fused prior distribution are the central questions we address in this paper. A simulation example shows that the proposed method is effective.
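
The Dempster-Shafer combination step at the heart of this method can be sketched as follows (a generic implementation of Dempster's rule; the frame of discernment and mass assignments below are illustrative, not taken from the paper):

```python
def combine_dempster(m1, m2):
    """Dempster's rule: m(A) = sum over B∩C=A of m1(B)m2(C) / (1 - K),
    where K is the total mass falling on empty intersections (conflict).
    Focal elements are frozensets over the frame of discernment."""
    combined = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two sources of prior information over a frame {low, high}:
A, B = frozenset(["low"]), frozenset(["high"])
BOTH = A | B
m1 = {A: 0.6, BOTH: 0.4}            # source 1: leans "low", some ignorance
m2 = {A: 0.5, B: 0.3, BOTH: 0.2}    # source 2: mixed evidence
fused = combine_dempster(m1, m2)
```

The fused masses renormalize after discarding the conflicting mass (here K = 0.18), so they again sum to one, and agreement between the sources reinforces the "low" hypothesis.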

  19. Pattern Classification of Decomposed Wavelet Information using ART2 Networks for echoes Analysis

    Directory of Open Access Journals (Sweden)

    Solís M.

    2008-04-01

    The ultrasonic pulse-echo technique has been successfully used in non-destructive testing of materials. To perform ultrasonic non-destructive evaluation (NDE), an ultrasonic pulsed wave is transmitted into the material using a transmitting/receiving transducer, or arrays of transducers, producing an image of ultrasonic reflectivity. The information inherent in ultrasonic signals or images consists of the echoes coming from flaws, grains, and boundaries of the tested material. The main goal of the evaluation is to determine the existence of a defect, its size, and its position; to that end, an innovative methodology is proposed, based on pattern recognition and wavelet analysis, for flaw detection and localization. The pattern recognition technique used in this work is the neural network named ART2 (Adaptive Resonance Theory), trained on the time-scale information of the signals obtained via the wavelet transform. A thorough analysis of the relation between the neural network training and the type of wavelet used for the training has been developed, showing that the Symlet 6 wavelet is the optimum for our problem.
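
The wavelet front end of such a pipeline can be illustrated with a single-level Haar transform (a simpler stand-in for the Symlet 6 filters the authors found optimal; the echo signal below is synthetic):

```python
from math import sqrt

def haar_dwt(signal):
    """One level of the Haar DWT: pairwise averages (approximation band)
    and pairwise differences (detail band), scaled by 1/sqrt(2) so that
    total energy is preserved."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / sqrt(2.0))
        detail.append((a - b) / sqrt(2.0))
    return approx, detail

# An abrupt oscillation (a crude stand-in for a flaw echo) stands out
# in the detail band while the smooth background stays near zero.
echo = [0.0, 0.0, 0.0, 0.0, 5.0, -5.0, 0.0, 0.0]
approx, detail = haar_dwt(echo)
flaw_index = max(range(len(detail)), key=lambda i: abs(detail[i]))  # -> 2
```

Feature vectors built from such detail coefficients are what a classifier like ART2 would then be trained on.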

  20. Information-theoretical analysis of the statistical dependencies among three variables: Applications to written language

    Science.gov (United States)

    Hernández, Damián G.; Zanette, Damián H.; Samengo, Inés

    2015-08-01

    We develop the information-theoretical concepts required to study the statistical dependencies among three variables. Some of these dependencies are pure triple interactions, in the sense that they cannot be explained in terms of a combination of pairwise correlations. We derive bounds for triple dependencies and characterize the shape of the joint probability distribution of three binary variables with high triple interaction. The analysis also allows us to quantify the amount of redundancy in the mutual information between pairs of variables and to assess whether the information between two variables is or is not mediated by a third variable. These concepts are applied to the analysis of written texts. We find that the probability that a given word is found in a particular location within a text is modulated not only by the presence or absence of other nearby words, but also by the presence or absence of nearby pairs of words. We identify the words enclosing the key semantic concepts of the text, the triplets of words with high pairwise and triple interactions, and the words that mediate the pairwise interactions between other words.
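
The pure triple interactions described here can be quantified with the co-information, which is negative for synergistic dependencies such as XOR, where no pair of variables shows any correlation (a generic sketch of the standard quantity, not the paper's estimator):

```python
from collections import Counter
from math import log2

def entropy(symbols):
    """Empirical Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

def co_information(xs, ys, zs):
    """I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z).
    Negative values indicate synergy: a triple dependency that cannot
    be reduced to pairwise correlations."""
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys)))
            - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs)))
            + entropy(list(zip(xs, ys, zs))))

# XOR: any two variables look independent, yet the triple is deterministic.
x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
z = [a ^ b for a, b in zip(x, y)]
synergy = co_information(x, y, z)   # -> -1.0
```

In the text-analysis setting, a strongly negative value for a word triplet signals exactly the pair-of-words modulation the abstract reports.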

  1. Information-theoretical analysis of the statistical dependencies among three variables: Applications to written language

    CERN Document Server

    Hernández, Damián G; Samengo, Inés

    2015-01-01

    We develop the information-theoretical concepts required to study the statistical dependencies among three variables. Some of these dependencies are pure triple interactions, in the sense that they cannot be explained in terms of a combination of pairwise correlations. We derive bounds for triple dependencies, and characterize the shape of the joint probability distribution of three binary variables with high triple interaction. The analysis also allows us to quantify the amount of redundancy in the mutual information between pairs of variables, and to assess whether the information between two variables is or is not mediated by a third variable. These concepts are applied to the analysis of written texts. We find that the probability that a given word is found in a particular location within the text is modulated not only by the presence or absence of other nearby words, but also by the presence or absence of nearby pairs of words. We identify the words enclosing the key semantic concepts of the text, the tr...

  2. MEDLINE demand profiles: an analysis of requests for clinical and research information.

    Science.gov (United States)

    Greenberg, B; Breedlove, R; Berger, W

    1977-01-01

    When a medical library serves both research scientists and practicing physicians, it may be predicted from the results of previous studies that computerized bibliographic search services will show more research and less clinical activity. The present paper reports the results of a statistical analysis of professional use of the National Library of Medicine's bibliographic retrieval system, MEDLINE (Medical Literature Analysis and Retrieval System on-Line), at a large medical school library. Results indicate that (1) demand for MEDLINE service is primarily research oriented; (2) frequency of use bears a relationship to rank and departmental affiliation; (3) broad and comprehensive searches are requested more frequently than searches for specific information; (4) usage shows an interesting curvilinear relationship with age and status of the user; and (5) grant funds and support correlate with the number of searches requested. The implication of these findings is that, since clinicians' use of MEDLINE was found to be minimal, information services should be reevaluated in order to assist in meeting their information needs more effectively. PMID:831884

  3. Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics

    Science.gov (United States)

    Pandini, Alessandro; Fornili, Arianna; Fraternali, Franca; Kleinjung, Jens

    2012-01-01

    Allostery offers a highly specific way to modulate protein function. Therefore, understanding this mechanism is of increasing interest for protein science and drug discovery. However, allosteric signal transmission is difficult to detect experimentally and to model because it is often mediated by local structural changes propagating along multiple pathways. To address this, we developed a method to identify communication pathways by an information-theoretical analysis of molecular dynamics simulations. Signal propagation was described as information exchange through a network of correlated local motions, modeled as transitions between canonical states of protein fragments. The method was used to describe allostery in two-component regulatory systems. In particular, the transmission from the allosteric site to the signaling surface of the receiver domain NtrC was shown to be mediated by a layer of hub residues. The location of hubs preferentially connected to the allosteric site was found in close agreement with key residues experimentally identified as involved in the signal transmission. The comparison with the networks of the homologues CheY and FixJ highlighted similarities in their dynamics. In particular, we showed that a preorganized network of fragment connections between the allosteric and functional sites exists already in the inactive state of all three proteins.—Pandini, A., Fornili, A., Fraternali, F., Kleinjung, J. Detection of allosteric signal transmission by information-theoretic analysis of protein dynamics. PMID:22071506

  4. Combining Global and Local Information for Knowledge-Assisted Image Analysis and Classification

    Directory of Open Access Journals (Sweden)

    Mezaris V

    2007-01-01

    Full Text Available A learning approach to knowledge-assisted image analysis and classification is proposed that combines global and local information with explicitly defined knowledge in the form of an ontology. The ontology specifies the domain of interest, its subdomains, the concepts related to each subdomain as well as contextual information. Support vector machines (SVMs) are employed in order to provide image classification to the ontology subdomains based on global image descriptions. In parallel, a segmentation algorithm is applied to segment the image into regions and SVMs are again employed, this time for performing an initial mapping between region low-level visual features and the concepts in the ontology. Then, a decision function, that receives as input the computed region-concept associations together with contextual information in the form of concept frequency of appearance, realizes image classification based on local information. A fusion mechanism subsequently combines the intermediate classification results, provided by the local- and global-level information processing, to decide on the final image classification. Once the image subdomain is selected, final region-concept association is performed using again SVMs and a genetic algorithm (GA) for optimizing the mapping between the image regions and the selected subdomain concepts taking into account contextual information in the form of spatial relations. Application of the proposed approach to images of the selected domain results in their classification (i.e., their assignment to one of the defined subdomains) and the generation of a fine granularity semantic representation of them (i.e., a segmentation map with semantic concepts attached to each segment). Experiments with images from the personal collection domain, as well as comparative evaluation with other approaches of the literature, demonstrate the performance of the proposed approach.

  5. AN ECONOMIC ANALYSIS OF THE DETERMINANTS OF ENTREPRENEURSHIP: THE CASE OF MASVINGO INFORMAL BUSINESSES

    Directory of Open Access Journals (Sweden)

    Clainos Chidoko

    2013-03-01

    Full Text Available In the past decade, Zimbabwe was hit by its worst economic performance since its independence in 1980. Capacity utilization shrank to ten percent and the unemployment rate was above eighty percent by 2008 as the private and public sectors witnessed massive retrenchments. As a result, many people are finding themselves engaging in informal businesses to make ends meet. However, not all people have joined the informal sector, as witnessed by the number of people who left the country in droves for neighbouring countries. It is against this background that this research conducted an economic analysis of the determinants of entrepreneurship in Masvingo urban with an emphasis on informal businesses. The research targeted a sample of 100 informal businesses (30 from Rujeko Light industrial area, 40 from Mucheke Light industrial area and 30 from Masvingo Central Business District). The businesses included, among others, flea market operators, furniture manufacturers, suppliers and producers of agricultural products, and food vendors. The research found that level of education, gender, age, marital status, number of dependants, type of subjects studied at secondary school and vocational training are the main determinants that influence the type of business an entrepreneur ventures into. The study recommends formal training for the participants so that the businesses continue in existence, since they fill the gap left vacant by most formal enterprises.

  6. Bringing trauma-informed practice to domestic violence programs: A qualitative analysis of current approaches.

    Science.gov (United States)

    Wilson, Joshua M; Fauci, Jenny E; Goodman, Lisa A

    2015-11-01

    Three out of 10 women and 1 out of 10 men in the United States experience violence at the hands of an intimate partner, often with devastating costs. In response, hundreds of residential and community-based organizations have sprung up to support survivors. Over the last decade, many of these organizations have joined other human service systems in adopting trauma-informed care (TIC), an approach to working with survivors that responds directly to the effects of trauma. Although there have been various efforts to describe TIC in domestic violence (DV) programs, there is a need to further synthesize this discourse on trauma-informed approaches to better understand specific applications and practices for DV programs. This study aimed to address this gap. The authors of this study systematically identified key documents that describe trauma-informed approaches in DV services and then conducted a qualitative content analysis to identify core themes. Results yielded 6 principles (Establishing emotional safety, Restoring choice and control, Facilitating connection, Supporting coping, Responding to identity and context, and Building strengths), each of which comprised a set of concrete practices. Despite the common themes articulated across descriptions of DV-specific trauma-informed practices (TIP), we also found critical differences, with some publications focusing narrowly on individual healing and others emphasizing the broader community and social contexts of violence and oppression. Implications for future research and evaluation are discussed. (PsycINFO Database Record) PMID:26594925

  7. CITATION ANALYSIS OF JOURNAL OF LIBRARY AND INFORMATION SCIENCE (2004-2009)

    Directory of Open Access Journals (Sweden)

    Ahmed Olakunle Simisaye

    2010-01-01

    Full Text Available A citation analysis of all articles published in the Journal of Library and Information Science (JOLIS) from 2004-2009 was carried out. 72 articles were published in the journal during the five (5) years covered; the highest number of articles (14) was published in 2007 and 2008. A total of 998 references was generated by the journal, an average of 13.7 citations per article. The results show that journals were the most cited materials, accounting for 37.14% of the total citations, followed by books with 33.14%. The individual article with the highest citation count had 44 references and was published in 2008. The findings further show that the 62 library and information science (LIS) journals cited produced 172 citations. African Journal of Library, Archives and Information Science led the ten (10) most cited LIS journals with 40 citations in the journal. 15 (24.19%) of the LIS journals were published in Nigeria, 45 (72.58%) were from outside the African continent, while only 2 (3.22%) other journals were from elsewhere in Africa. The majority (38.2%) of materials cited were published in 1995 and beyond; the authorship pattern shows that 79.85% of the materials cited were written by single authors, while only 8.8% of the total citations were Internet resources.

  8. European consumers’ interest toward nutritional information on wine labeling: A cross-country analysis

    Directory of Open Access Journals (Sweden)

    Annunziata Azzurra

    2015-01-01

    Full Text Available This paper explores, through an empirical analysis, European consumers’ interest toward nutritional information on wine labels, examining interest in, knowledge of and understanding of this information. In this regard, results from a direct survey of a cross-country sample of 500 wine consumers (i.e., respondents drinking wine at least once a month) living in Italy, France and Spain are presented and discussed. Preliminary results reveal that consumers are quite confused about the nutritional aspects of wine and tend to be interested in receiving nutritional information on wine labels. However, the interest expressed towards this kind of information differs from country to country and is influenced by other socio-demographic variables. Findings from the current research should be valuable in contributing to the debate on updating international and national standards on wine labelling concerning nutrition. At the same time, the research provides a number of useful indications for policy makers in defining the future development of wine nutritional labelling programs and in implementing strategies focused on enhancing the efficacy and readability of labels.

  9. Fuzzy - Expert System for Cost Benefit Analysis of Enterprise Information Systems: A Framework

    Directory of Open Access Journals (Sweden)

    Faith-Michael E. Uzoka

    2009-11-01

    Full Text Available Enterprise Information Systems (EIS) are collections of hardware, software, data, people and procedures that work together to manage organizational information resources, ultimately enhancing decision making and strategic advantage. One of the key issues in the acquisition and utilization of EIS is the determination of the value of investment in such systems. Traditional capital budgeting models such as NPV, IRR, payback period, and profitability index focus mainly on quantifiable variables. However, there are many intangible variables that make the use of entirely quantitative measures incomplete and less inclusive. The high level of impact of information systems (IS) on the entire organizational strategy and the information intensity of IS make the use of such traditional methods less practicable. Attempts have been made to overcome these shortcomings by utilizing other techniques such as the real options model, goal programming model, knowledge value model and intelligent techniques. This paper proposes the adoption of a hybrid intelligent technique (a fuzzy-expert system) in carrying out a cost benefit analysis of EIS investment. The study takes high cognizance of intangible variables and of the vagueness/imprecision in human group decision making that requires a good level of consensus.
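
    The traditional NPV baseline that the abstract contrasts with fuzzy-expert approaches is simple to state; a quick sketch with hypothetical cashflow figures (not from the paper):

```python
def npv(rate, cashflows):
    """Net present value: cashflows[0] occurs now, cashflows[t] after t periods."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# A hypothetical EIS investment: pay 1000 now, recover 400/year for 3 years.
# At a 10% discount rate the purely quantitative verdict is (slightly) negative,
print(round(npv(0.10, [-1000, 400, 400, 400]), 2))  # -5.26
# which is exactly the situation where intangible benefits tip the decision.
```
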

  10. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view on the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    Science.gov (United States)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  12. Research on analysis method for temperature control information of high arch dam construction

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Temperature control, which is directly responsible for project quality and progress, plays an important role in high arch dam construction. How to discover rules in the large amount of temperature control information collected, in order to guide the adjustment of temperature control measures and prevent cracks on site, is the key scientific problem. In this paper, a mathematical logical model was first built by means of a coupling analysis of temperature control system decomposition and coordination for a high arch dam. Then, an analysis method for temperature control information was presented based on data mining technology. Furthermore, the data warehouse of temperature control was designed, and an artificial neural network forecasting model for the highest temperature of concrete was also developed. Finally, these methods were applied to a practical project. The result showed that the efficiency and precision of temperature control were improved, and the rationality and scientific basis of management and decision-making were strengthened. These researches provide an advanced analysis method for temperature control in the high arch dam construction process.

  13. Parametric Sensitivity Analysis for Stochastic Molecular Systems using Information Theoretic Metrics

    CERN Document Server

    Tsourtis, Anastasios; Katsoulakis, Markos A; Harmandaris, Vagelis

    2014-01-01

    In this paper we extend the parametric sensitivity analysis (SA) methodology proposed in Ref. [Y. Pantazis and M. A. Katsoulakis, J. Chem. Phys. 138, 054115 (2013)] to continuous time and continuous space Markov processes represented by stochastic differential equations and, particularly, stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field therefore they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method to two different molecular stochastic systems, a standard Lennard-Jones fluid and an al...

  14. Analysis of workers' dose records from the Greek Dose Registry Information System

    International Nuclear Information System (INIS)

    The object of this work is the study of the individual film badge annual dose information of classified workers in Greece, monitored and assessed by the central dosimetry service of the Greek Atomic Energy Commission. Dose summaries were recorded and processed by the Dose Registry Information System. The statistical analysis refers to the years 1989-93 and deals with the distribution of individuals in the occupational groups, the mean annual dose, the collective dose, the distribution of the dose over the different specialties and the number of workers that have exceeded any of the established dose limits. Results concerning the annual dose summaries, demonstrate a year-by-year reduction in the mean individual dose to workers in the health sector. Conversely, exposures in the industrial sector did not show any decreasing tendency during the period under consideration. (Author)

  15. Information Flow Through Stages of Complex Engineering Design Projects: A Dynamic Network Analysis Approach

    DEFF Research Database (Denmark)

    Parraguez, Pedro; Eppinger, Steven D.; Maier, Anja

    2015-01-01

    as those activities are implemented through the network of people executing the project. To address this gap, we develop a dynamic modeling method that integrates both the network of people and the network of activities in the project. We then employ a large dataset collected from an industrial setting…, consisting of project-related e-mails and activity records from the design and development of a renewable energy plant over the course of more than three years. Using network metrics for centrality and clustering, we make three important contributions: 1) We demonstrate a novel method for analyzing… information flows between activities in complex engineering design projects; 2) we show how the network of information flows in a large-scale engineering project evolved over time and how network analysis yields several managerial insights; and 3) we provide a useful new representation of the engineering…
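
    The centrality and clustering metrics mentioned above can be computed from an information-flow edge list in a few lines; the edges below are hypothetical, not the project's data:

```python
from collections import defaultdict

# Hypothetical information-flow edges: (sender activity, receiver activity).
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

adj = defaultdict(set)
for u, v in edges:          # treat flows as an undirected exchange network
    adj[u].add(v)
    adj[v].add(u)

def degree_centrality(node):
    """Fraction of other activities a given activity exchanges information with."""
    return len(adj[node]) / (len(adj) - 1)

def clustering(node):
    """Fraction of a node's neighbour pairs that also exchange information."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

print(degree_centrality("C"), clustering("A"))  # 1.0 1.0
```

    Tracking such metrics per project phase is one way the evolution of the flow network over time could be summarized.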

  16. Methods and apparatuses for information analysis on shared and distributed computing systems

    Energy Technology Data Exchange (ETDEWEB)

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
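
    The local-then-global term-statistics pattern described in the abstract can be sketched as follows; the merge is shown sequentially here, whereas the patented apparatus runs each partition as a separate process in parallel (all names are illustrative):

```python
from collections import Counter

def local_term_stats(docs):
    """Local statistics: term counts over one distinct set of documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.lower().split())
    return counts

def global_term_stats(partitions):
    """Contribute each partition's local set of term statistics to the
    global set (each partition would run on its own process)."""
    global_counts = Counter()
    for local in map(local_term_stats, partitions):
        global_counts += local
    return global_counts

def major_terms(global_counts, k=2):
    """A crude 'major term set': the k most frequent terms overall."""
    return [t for t, _ in global_counts.most_common(k)]

parts = [["the cat sat", "the dog ran"], ["the cat ran fast"]]
stats = global_term_stats(parts)
print(stats["the"], major_terms(stats, 1))  # 3 ['the']
```
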

  17. The value of information as applied to the Landsat Follow-on benefit-cost analysis

    Science.gov (United States)

    Wood, D. B.

    1978-01-01

    An econometric model was run to compare the current forecasting system with a hypothetical (Landsat Follow-on) space-based system. The baseline current system was a hybrid of USDA SRS domestic forecasts and the best known foreign data. The space-based system improved upon the present Landsat by the higher spatial resolution capability of the thematic mapper. This satellite system is a major improvement for foreign forecasts but no better than SRS for domestic forecasts. The benefit analysis was concentrated on the use of Landsat Follow-on to forecast world wheat production. Results showed that it was possible to quantify the value of satellite information and that there are significant benefits in more timely and accurate crop condition information.

  18. Database Semantic Interoperability based on Information Flow Theory and Formal Concept Analysis

    Directory of Open Access Journals (Sweden)

    Guanghui Yang

    2012-07-01

    Full Text Available As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration has been a long-standing challenge for the database community and has now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA, for short) and Information Flow (IF, for short) theories. For our purposes, we first discover knowledge from different databases by using FCA, and then align what is discovered by using IF and FCA. The development of FCA has led to software systems such as TOSCANA and TUPLEWARE, which can be used as tools for discovering knowledge in databases. A prototype based on IF and FCA has been developed. Our method is tested and verified by using this prototype and TUPLEWARE.
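
    The FCA step of discovering knowledge from a database can be illustrated by enumerating the formal concepts of a tiny object-attribute context by brute force (the context below is invented for illustration; real FCA tools such as TUPLEWARE use far more efficient algorithms):

```python
from itertools import combinations

# Toy formal context: objects -> attributes they have.
context = {
    "doc1": {"paper", "white"},
    "doc2": {"paper", "recycled"},
    "doc3": {"plastic", "white"},
}
ALL_ATTRS = set().union(*context.values())

def extent(attrs):
    """Objects possessing every attribute in attrs (derivation operator)."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by every object in objs (derivation operator)."""
    if not objs:
        return ALL_ATTRS
    return {t for t in ALL_ATTRS if all(t in context[o] for o in objs)}

def concepts():
    """Formal concepts: pairs (A, B) with extent(B) == A and intent(A) == B."""
    found = set()
    for r in range(len(context) + 1):
        for combo in combinations(context, r):
            b = intent(set(combo))    # close the object set via its intent
            a = extent(b)
            found.add((frozenset(a), frozenset(b)))
    return found

for a, b in sorted(concepts(), key=lambda c: len(c[0])):
    print(sorted(a), sorted(b))
```

    The resulting concept lattices from two databases are what the IF-based alignment would then relate.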

  19. Benefit-Cost Analysis of Security Systems for Multiple Protected Assets Based on Information Entropy

    Directory of Open Access Journals (Sweden)

    Qing Cai

    2012-03-01

    Full Text Available This article proposes a quantitative risk assessment for security systems which have multiple protected assets, and a risk-based benefit-cost analysis for decision makers. The proposed methodology consists of five phases: identification of assets, security units and intrusion paths; security unit effectiveness estimation; intrusion path effectiveness estimation; security system risk assessment; and benefit-cost estimation. Key innovations in this methodology include its use of effectiveness entropy to measure the degree of uncertainty of a security system in completing a protection task, and the fact that it measures risk the way information theory measures the amount of information. A notional example is provided to demonstrate an application of the proposed methodology.
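
    The abstract does not give the formula for effectiveness entropy, but the underlying idea of measuring a system's uncertainty information-theoretically reduces to Shannon entropy over the possible outcomes of an intrusion attempt; a sketch with hypothetical probabilities:

```python
import math

def shannon_entropy(probs):
    """Entropy (bits) of a discrete distribution; 0 means no uncertainty."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical intrusion path: probability the intruder is stopped at each
# of three security units, plus the residual probability of full success.
p_stop = [0.5, 0.3, 0.1]
p_success = 1 - sum(p_stop)            # intruder defeats every unit
outcome_dist = p_stop + [p_success]

print(round(shannon_entropy(outcome_dist), 3))  # about 1.685 bits
```

    A lower entropy means the system's protection outcome is more predictable; a perfectly effective unit (stop probability 1) would drive it to zero.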

  20. A media information analysis for implementing effective countermeasure against harmful rumor

    Science.gov (United States)

    Nagao, Mitsuyoshi; Suto, Kazuhiro; Ohuchi, Azuma

    2010-04-01

    When a large scale earthquake occurs, the phrase "harmful rumor" comes to be frequently heard. A harmful rumor is an economic damage caused when people regard actually safe foods or areas as dangerous and then stop consumption or sightseeing. In the case of harmful rumor caused by an earthquake, the tourism industry in particular receives massive economic damage. Harmful rumors that cause substantial economic damage have become a serious social issue which must be solved. In this paper, we propose a countermeasure method for harmful rumor on the basis of media trends in order to implement speedy recovery from harmful rumor. Here, we investigate the amount and content of information which is transmitted to the general public by the media when an earthquake occurs. In addition, the media information in three earthquakes is treated as instances. Finally, we discuss an effective countermeasure method for dispelling harmful rumor based on these analysis results.

  1. Rock Slopes Failure Susceptibility Analysis: From Remote Sensing Measurements to Geographic Information System Raster Modules

    Directory of Open Access Journals (Sweden)

    Filipello Andrea

    2010-01-01

    Full Text Available Problem statement: Two important steps can be recognized in rockfall analysis: potential failure detection and run-out simulation. In analyzing the stability of rock slopes, the most important kinematisms are planar or wedge sliding and toppling. The aim of this study was to couple a deterministic approach for landslide initiation (potential rockfall source areas) with a run-out analysis by developing new GRASS GIS raster modules. A case study in the Ossola Valley, at the border between Italy and Switzerland, is discussed. Approach: New GIS raster modules for rockfall analysis were developed. Slope stability modules were based on rock mass classification indexes and on a limit equilibrium model, while the prediction of rockfall travel distance was based on the shadow angle approach. Results: The study highlighted the importance of GIS tools for the analysis of landslide susceptibility. The spatial forecasts provided by the new GIS modules were validated and supplemented by traditional analysis. Conclusion: This study proved that there is a good correspondence between the prediction of high susceptibility to instability calculated by the modules and the location of past events. The new modules provide an opportunity to assess, in an objective and repeatable way, the susceptibility to failure, and also quantitative information about the area of invasion for falling rock.
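
    In its simplest form, the shadow angle approach bounds the run-out by a line inclined at the shadow angle from the source: horizontal reach = fall height / tan(angle). A minimal sketch, with hypothetical numbers (the GRASS modules apply this cell by cell over a raster DEM):

```python
import math

def shadow_angle_reach(fall_height_m, shadow_angle_deg):
    """Maximum horizontal run-out distance implied by the shadow angle
    method (simplified planar geometry): reach = H / tan(alpha)."""
    return fall_height_m / math.tan(math.radians(shadow_angle_deg))

# A 100 m fall height with a 27.5 degree shadow angle:
print(round(shadow_angle_reach(100, 27.5), 1))  # about 192.1 m
```

    Smaller shadow angles are more conservative, sweeping a larger area of invasion into the susceptibility map.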

  2. Bird Activity Analysis Using Avian Radar Information in Naval Air Station airport, WA

    Science.gov (United States)

    Wang, J.; Herricks, E.

    2010-12-01

    The number of bird strikes on aircraft has increased sharply over recent years, and airport bird hazard management has gained increasing attention in wildlife management and control. Evaluation of bird activity near airports is critical to analyzing the hazard of bird strikes. Traditional methods for bird activity analysis using visual counting provide a direct approach to bird hazard assessment; however, this approach is limited to daylight and good visual conditions. Radar has been proven to be a useful and effective tool for bird detection and movement analysis. Radar eliminates observation bias and supports consistent data collection for bird activity analysis and hazard management. In this study, bird activity data from Naval Air Station Whidbey Island was collected by the Accipiter Avian Radar System. Radar data was pre-processed by filtering out non-bird noise, including vehicle traffic, aircraft, insects, wind, rainfall, and ocean waves. Filtered data was then statistically analyzed using MATLAB programs. The results indicated bird movement dynamics in target areas near the airport: (1) daily activity varied at dawn and dusk; (2) bird activity varied by target area due to habitat differences; and (3) both temporal and spatial movement patterns varied by bird species. This bird activity analysis supports bird hazard evaluation and related analysis and modeling to provide very useful information for airport bird hazard management planning.

  3. Informativeness of minisatellite and microsatellite markers for genetic analysis in papaya.

    Science.gov (United States)

    Oliveira, G A F; Dantas, J L L; Oliveira, E J

    2015-10-01

    The objective of this study was to evaluate the informativeness of minisatellite and microsatellite markers in papaya (Carica papaya L.). Forty minisatellites and 91 microsatellites were used for genotyping 24 papaya accessions. Estimates of genetic diversity and genetic linkage, and analyses of population structure, were compared. A lower average number of alleles per locus was observed in minisatellites (3.10) compared with microsatellites (3.57), although the minisatellites showed more rare alleles (18.54%) than microsatellites (13.85%). Greater expected (He = 0.52) and observed (Ho = 0.16) heterozygosity was observed in the microsatellites compared with minisatellites (He = 0.42 and Ho = 0.11), possibly due to the high number of hermaphroditic accessions, resulting in high rates of self-fertilization. The polymorphic information content and Shannon-Wiener diversity were also higher for microsatellites (0.47 and 1.10, respectively) compared with minisatellites (0.38 and 0.85, respectively). The probability of paternity exclusion was high for both markers (>0.999), and the combined probability of identity ranged from 1.65 × 10⁻¹³ to 4.33 × 10⁻³⁸ for mini- and microsatellites, respectively, which indicates that both types of markers are suitable for genetic analysis. The Bayesian analysis indicated the formation of two groups (K = 2) for both markers, although the minisatellites indicated a substructure (K = 4). A greater number of accessions with a low probability of assignment to specific groups was observed for microsatellites. Collectively, the results indicated higher informativeness of microsatellites. However, the lower informative power of minisatellites may be offset by the use of a larger number of loci. Furthermore, minisatellites are subject to less error in genotyping because larger motifs provide greater detection power in genotyping systems. PMID:26280323
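
    The polymorphic information content (PIC) values reported above follow the standard Botstein-style formula, PIC = 1 − Σpᵢ² − Σᵢ<ⱼ 2pᵢ²pⱼ²; a quick sketch with a hypothetical allele-frequency vector (not the paper's data):

```python
def pic(freqs):
    """Polymorphic information content of one locus from allele frequencies:
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    homo = sum(p * p for p in freqs)
    cross = sum(2 * (freqs[i] ** 2) * (freqs[j] ** 2)
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1 - homo - cross

print(pic([0.5, 0.5]))  # 0.375, a biallelic locus with equal frequencies
```

    Loci with more, evenly distributed alleles score higher, which is why the multi-allelic microsatellites tend to be the more informative marker class.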

  4. Information Analysis Methodology for Border Security Deployment Prioritization and Post Deployment Evaluation

    International Nuclear Information System (INIS)

Due to international commerce, cross-border conflicts, and corruption, a holistic, information-driven approach to border security is required to best understand how resources should be applied to effect sustainable improvements in border security. The ability to transport goods and people by land, sea, and air across international borders with relative ease for legitimate commercial purposes creates a challenging environment for detecting the illicit smuggling activities that destabilize national-level border security. Smuggling operated for profit, or driven by cross-border conflicts in which militant or terrorist organizations facilitate the transport of materials and/or extremists to advance a cause, adds complexity to interdiction efforts. Border security efforts are further hampered when corruption thwarts interdiction or reduces the effectiveness of technology deployed to enhance border security. These issues necessitate a holistic approach to border security that leverages all available data. Large amounts of information found in hundreds of thousands of documents can be compiled to assess national or regional borders and to identify variables that influence border security. Location data associated with border topics of interest may be extracted and plotted to better characterize the current border security environment for a given country or region. This baseline assessment enables further analysis and also documents the initial state of border security, against which progress can be evaluated after improvements are made. Border security threats are then prioritized via a systems analysis approach, and mitigation factors to address risks can be developed and evaluated against inhibiting factors such as corruption. This holistic approach helps address the dynamic smuggling interdiction environment, in which illicit activities divert to new locations that offer less resistance.

  5. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility, and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized, and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  6. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
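The procedure described above can be sketched in a few lines. This is an illustrative approximation of the abstract's description, not the patented implementation; the function name, tokenization, and threshold are assumptions:

```python
import re
from collections import Counter

def generate_stop_words(documents, keywords, min_ratio=1.0, max_size=100):
    """Sketch: terms that occur adjacent to known keywords at least as
    often (relative to occurring inside them) are stop-word candidates."""
    keyword_terms = {t for kw in keywords for t in kw.lower().split()}
    adjacency = Counter()   # keyword adjacency frequency
    frequency = Counter()   # keyword frequency (term occurs inside a keyword)
    term_counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z0-9']+", doc.lower())
        term_counts.update(tokens)
        for i, tok in enumerate(tokens):
            if tok in keyword_terms:
                frequency[tok] += 1
                # terms immediately before/after a keyword term are "adjacent"
                for j in (i - 1, i + 1):
                    if 0 <= j < len(tokens):
                        adjacency[tokens[j]] += 1
    candidates = []
    for term in term_counts:
        # exclude terms whose adjacency/frequency ratio falls below the threshold
        if frequency[term] and adjacency[term] / frequency[term] < min_ratio:
            continue
        if adjacency[term]:
            candidates.append((adjacency[term], term))
    # truncate: keep the terms most often found adjacent to keywords
    return [t for _, t in sorted(candidates, reverse=True)[:max_size]]
```

On a toy corpus, function words such as "the" surface as stop words while keyword terms such as "data" are excluded, which mirrors the intent of the method.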

  7. Transportation Big Data: Unbiased Analysis and Tools to Inform Sustainable Transportation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-01

    Today, transportation operation and energy systems data are generated at an unprecedented scale. The U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is the go-to source for expertise in providing data and analysis to inform industry and government transportation decision making. The lab's teams of data experts and engineers are mining and analyzing large sets of complex data -- or 'big data' -- to develop solutions that support the research, development, and deployment of market-ready technologies that reduce fuel consumption and greenhouse gas emissions.

  8. Clustering analysis of questionnaire for Ph.D. studies in Electrical and Information Engineering in Europe

    DEFF Research Database (Denmark)

    Papadourakis, George M.; Christinaki, Eirini; Hatzi, Panagiota; Thiriet, Jean-Marc; Yahoui, Hamed; Bonnaud, Olivier; Friesel, Anna; Sidibe, Dro-Desire; Tsirigotis, Georgios

The ELLEIEC ERASMUS thematic network ran from October 2008 to September 2011. The project dealt with several aspects of Lifelong Learning in Electrical and Information Engineering in Europe. Within this project, Task 4 was dedicated to “Implementation issues across EIE”. The idea of...... were carried out. The survey on doctoral studies was statistically analyzed. Furthermore, the doctoral studies survey was investigated using clustering analysis, more precisely hierarchical clustering, in order to compare the answers in 21 European countries and to find similarities among them....

  9. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2012-12-01

Full Text Available The dependency of today’s construction professionals on singular commercial applications for design creates the risk of being dictated to by the language-tools they use. Unwittingly conforming to the constraints of a particular computer application’s style reduces one’s association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability depicts the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, ‘cloud computing’, required further analysis. The Delphi questionnaires described here analysed the information-exchange opportunities of Building Information Modelling (BIM) as a possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage; (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint; (iii) examine the need for standards to address information exchange between design teams; and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  10. Analysis on health information extracted from an urban professional population in Beijing

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tie-mei; ZHANG Yan; LIU Bin; JIA Hong-bo; LIU Yun-jie; ZHU Ling; LUO Sen-lin; HAN Yi-wen; ZHANG Yan; YANG Shu-wen; LIU An-nan; MA Lan-jun; ZHAO Yan-yan

    2011-01-01

Background: The assembled data from a population can provide information on health trends within that population. The aim of this research was to extract basic health information from an urban professional population in Beijing. Methods: Data analysis was carried out in a population who underwent a routine medical check-up and were aged >20 years, comprising 30 058 individuals. General information, data from physical examinations, and blood samples were collected using the same method. Health status was separated into three groups by criteria generated in this study: people with common chronic diseases, people in a sub-clinical situation, and healthy people. The proportion of common diseases suffered and the health-risk distribution of different age groups were also analyzed. Results: The proportions of people with common chronic diseases, in the sub-clinical group, and in the healthy group were 28.6%, 67.8%, and 3.6%, respectively. There were significant differences in health status among the age groups. Hypertension was at the top of the list of self-reported diseases. The proportion of chronic diseases increased significantly in people after 35 years of age, while the proportion of sub-clinical conditions decreased at the same rate. The complex risk factors to health in this population were metabolic disturbances (61.3%), risk for tumor (2.7%), abnormal results of morphological examination (8.2%), and abnormal results of laboratory tests of serum (27.8%). Conclusions: Health information can be extracted from the complex data set of general-population health check-ups. This information should be applied to support the prevention and control of chronic diseases, as well as to direct intervention for patients with risk factors for disease.

  11. Log Usage Analysis: What it Discloses about Use, Information Seeking and Trustworthiness

    Directory of Open Access Journals (Sweden)

    David Nicholas

    2014-06-01

Full Text Available The Trust and Authority in Scholarly Communications in the Light of the Digital Transition research project was a study which investigated the behaviours and attitudes of academic researchers, as producers and consumers of scholarly information resources, in respect of how they determine authority and trustworthiness. The research questions for the study arose out of CIBER’s studies of the virtual scholar. This paper focuses on elements of this study, mainly an analysis of a scholarly publisher’s usage logs, which was undertaken at the start of the project in order to build an evidence base to help calibrate the main methodological tools used by the project: interviews and a questionnaire. The specific purpose of the log study was to identify and assess the digital usage behaviours that potentially raise trustworthiness and authority questions. Results from the self-report part of the study were additionally used to explain the logs. The main findings were that: (1) logs provide a good indicator of use and information-seeking behaviour, albeit in respect of just a part of the information-seeking journey; (2) the ‘lite’ form of information-seeking behaviour observed in the logs is a sign of users trying to make up their minds, in the face of a tsunami of information, as to what is relevant and to be trusted; (3) Google and Google Scholar are the discovery platforms of choice for academic researchers, which partly points to the fact that they are influenced in what they use and read by ease of access; and (4) usage is not a suitable proxy for quality. The paper also provides contextual data from CIBER’s previous studies.

  13. Organizational culture, creative behavior, and information and communication technology (ICT) usage: a facet analysis.

    Science.gov (United States)

    Carmeli, Abraham; Sternberg, Akiva; Elizur, D

    2008-04-01

Despite the prominence of organizational culture (OC), this concept is controversial and its structure has yet to be systematically analyzed. This study develops a three-pronged formal definitional framework on the basis of facet theory (FT) and explores behavior modality, referent, and object. This facet analysis (FA) of OC accounts successfully for variation in both creative behavior at work and the usage of information and communication technologies (ICTs). An analysis of data collected from 230 employees in the financial industry indicates that a radex structure was obtained for work and ICT. The behavior modality facet ordered the space from center to periphery, and the referents facet related to the direction of angles away from the origin. PMID:18422410

  14. Towards a Structurational Theory of Information Systems: a substantive case analysis

    DEFF Research Database (Denmark)

    Rose, Jeremy; Hackney, R. H

    2003-01-01

    This paper employs the analysis of an interpretive case study within a Regional Train Operating Company (RTOC) to arrive at theoretical understandings of Information Systems (IS). Giddens’ ‘structuration theory’ is developed which offers an account of structure and agency; social practices...... developing and changing over time and space. The most common application of structuration theory to the IS domain is the analysis of empirical situations using the ‘dimensions of the duality of structure’ model. The best-known attempts to theorize IS concerns using this approach have come from Orlikowski...... the company (RTOC) from engineers and train drivers to the board of directors. Participant observation was also undertaken with the authors attending twenty-one meetings, workshops and presentations. The resulting theoretical model describes IS embedded in social practices, which evolve to display...

  15. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services into their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations, by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge about their territory.

  16. Analysis of health impact inputs to the US Department of Energy's risk information system

    International Nuclear Information System (INIS)

The US Department of Energy (DOE) is in the process of completing a survey of environmental problems, referred to as the Environmental Survey, at its facilities across the country. The DOE Risk Information System (RIS) is being used to prioritize the environmental problems identified in the Environmental Survey's findings. This report contains a discussion of site-specific public health risk parameters and the rationale for their inclusion in the RIS. These parameters are based on computed potential impacts obtained with the Multimedia Environmental Pollutant Assessment System (MEPAS). MEPAS is a computer-based methodology for evaluating the potential exposures resulting from multimedia environmental transport of hazardous materials. This report has three related objectives: document the role of MEPAS in the RIS framework, report the results of the analysis of alternative risk parameters that led to the current RIS risk parameters, and describe the analysis of uncertainties in the risk-related parameters. 20 refs., 17 figs., 10 tabs

  17. An index of information content for genotype probabilities derived from segregation analysis.

    Science.gov (United States)

    Kinghorn, B P

    1997-02-01

    A genotype probability index (GPI) is proposed to indicate the information content of genotype probabilities derived from a segregation analysis. Typically, some individuals are genotyped at a marker locus or a quantitative trait locus, and segregation analysis is used to make genotype inferences about ungenotyped relatives. Genotype probabilities for a two-allele autosomal locus are plotted on a triangular surface. The GPI has a value of zero at the point corresponding to Hardy-Weinberg frequencies, and a value of 100% at the vertices of the triangle. Trigonometric functions are used to help calculate intermediate index values. It is proposed that such an index can be useful to help identify which ungenotyped individuals or loci should be genotyped to maximize the benefit/cost of genotyping operations. PMID:9071600

  18. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    International Nuclear Information System (INIS)

To demonstrate that nuclear power plants are designed to respond safely to numerous postulated accidents, computer codes are used. The models in these computer codes are an approximation of the real physical behaviour occurring during an accident. Moreover, the data used to run these codes are known with only limited accuracy. Therefore the code predictions are not exact but uncertain. To deal with these uncertainties, 'best estimate' codes with 'best estimate' input data are used to obtain a best-estimate calculation, and it is necessary to derive the uncertainty associated with their estimations. For this reason, regulatory authorities demand that technical safety organizations, such as the French Institut de Radioprotection et de Surete Nucleaire (IRSN), provide results that take all uncertainty sources into account, in order to assess whether safety quantities are below critical values. Uncertainty analysis can be seen as a problem of information treatment, and a special effort on four methodological key issues has to be made. The first is related to information modelling. In safety studies, one can distinguish two kinds of uncertainty. The first type, called aleatory uncertainty, is due to the natural variability of an observed phenomenon and cannot be reduced by the arrival of new information. The second type, called epistemic uncertainty, can arise from imprecision. Contrary to the previous one, this uncertainty can be reduced by increasing the state of knowledge. Performing relevant information modelling therefore requires working with a mathematical formalism flexible enough to faithfully treat both types of uncertainty. The second issue deals with information propagation through a computer code. It requires running the codes several times, usually achieved through coupling to statistical software. The complexity of the propagation is strongly connected to the mathematical framework used for the information modelling. The more general the
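The propagation step described in this record is, in its simplest form, a Monte Carlo loop: sample the uncertain inputs, run the best-estimate code, and collect statistics on the outputs. A minimal sketch follows; the model function and input distributions are invented stand-ins for a real thermal-hydraulic code, chosen purely for illustration:

```python
import random
import statistics

def peak_clad_temperature(heat_flux, gap_conductance):
    """Hypothetical stand-in for one best-estimate code run (not a real model)."""
    return 600.0 + 0.8 * heat_flux + 2000.0 / gap_conductance

random.seed(42)
runs = []
for _ in range(1000):
    # sample uncertain inputs from assumed distributions
    q = random.gauss(500.0, 25.0)        # heat flux (illustrative units)
    h = random.uniform(4000.0, 6000.0)   # gap conductance (illustrative units)
    runs.append(peak_clad_temperature(q, h))

mean = statistics.fmean(runs)
p95 = sorted(runs)[int(0.95 * len(runs))]  # simple 95th-percentile estimate
```

In practice the percentile of interest (here the 95th) would be compared against the critical safety value, and the number of runs chosen to give the required statistical confidence.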

  19. Confirmatory Factor Analysis of IT-based Competency Questionnaire in Information Science & Knowledge Studies, Based on Job Market Analysis

    Directory of Open Access Journals (Sweden)

    Rahim Shahbazi

    2016-03-01

Full Text Available The main purpose of the present research is to evaluate the validity of an IT-based competency questionnaire in Information Science & Knowledge Studies. The survey method was used, with a researcher-made questionnaire as the data collection tool. The statistical sample of 315 people was chosen purposively from among Iranian faculty members, Ph.D. students, and information center employees. After eliminating 17 items from the questionnaire, confirmatory factor analysis of the remaining items with Varimax rotation revealed 8 factors. The resulting components, and the items that loaded highly on them, were considerably consistent with the classifications in the questionnaire and partly consistent with the findings of other researchers. 76 competency indicators (knowledge, skills, and attitudes) were validated and grouped under 8 main categories: 1. “Computer Basics”; 2. “Database Operating, Collection Development of Digital Resources, & Digital Library Management”; 3. “Basics of Computer Networking”; 4. “Basics of Programming & Database Designing”; 5. “Web Designing & Web Content Analysis”; 6. “Library Software & Computerized Organizing”; 7. “Archive of Digital Resources”; and 8. “Attitudes”.
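The Varimax rotation used in studies like this one is a standard algorithm. A minimal NumPy sketch of the textbook procedure (not the authors' software) is:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a factor-loading matrix toward varimax simple structure."""
    L = np.asarray(loadings, dtype=float)
    n, k = L.shape
    R = np.eye(k)           # accumulated orthogonal rotation
    d = 0.0
    for _ in range(max_iter):
        d_old = d
        B = L @ R
        # gradient of the varimax criterion, solved via SVD
        u, s, vt = np.linalg.svd(
            L.T @ (B**3 - (gamma / n) * B @ np.diag((B**2).sum(axis=0)))
        )
        R = u @ vt
        d = s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return L @ R
```

Because the rotation matrix is orthogonal, the communalities (and hence the Frobenius norm of the loading matrix) are preserved; only the distribution of variance across factors changes, which is what makes the rotated factors easier to interpret.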

  20. Understanding and Managing Our Earth through Integrated Use and Analysis of Geo-Information

    Directory of Open Access Journals (Sweden)

    Wolfgang Kainz

    2011-09-01

Full Text Available All things in our world are related to some location in space and time, and according to Tobler’s first law of geography, “everything is related to everything else, but near things are more related than distant things” [1]. For as long as humans have existed, they have contemplated space and time and have tried to depict and manage the geographic space they live in. We know of graphic representations of the land from various regions of the world dating back several thousand years. The processing and analysis of spatial data has a long history in the disciplines that deal with spatial data, such as geography, surveying engineering, cartography, photogrammetry, and remote sensing. Until recently, all these activities were analog in nature; only since the invention of the computer in the second half of the 20th century, and the use of computers for the acquisition, storage, analysis, and display of spatial data starting in the 1960s, do we speak of geo-information and geo-information systems. [...]

  1. Managing Returnable Containers Logistics - A Case Study Part I - Physical and Information Flow Analysis

    Directory of Open Access Journals (Sweden)

    Reza A. Maleki

    2011-05-01

Full Text Available This case study paper is the result of a project conducted on behalf of a company, hereafter referred to as Midwest Assembly and Manufacturing, or MAAN. The company's operations include component manufacturing, painting, and assembling products. The company also purchases a relatively large percentage of the components and major assemblies needed to support final assembly operations. MAAN uses its own returnable containers to transport purchased parts from suppliers. Due to poor tracking of the containers, the company has been experiencing lost containers and occasional production disruptions at its facility as well as at supplier sites. The objective of this project was to develop a proposal to enable MAAN to more effectively track and manage its returnable containers. The research activities in support of this project included the analysis and documentation of both the physical flow and the information flow associated with the containers, as well as some of the technologies that can help with automatic identification and tracking of containers. The focal point of this paper is a macro-level approach to the analysis of container and information flow within the logistics chain. A companion paper deals with several of the automatic identification technologies that have the potential to improve the management of MAAN's returnable containers.

  2. Analysis Level Of Utilization Information And Communication Technology With The Competency Level Of Extension Workers

    Directory of Open Access Journals (Sweden)

    Veronice Veronice

    2015-01-01

Full Text Available Extension places people as the subject of development and as human capital to be developed toward independence and empowerment (dignity) in adapting to the environment, thus becoming able to improve the quality of life for themselves, their families, and their communities. Clear professional competency standards for extension workers, and effective controls on the practice of the extension profession, are therefore necessary, supported by mastery of Information and Communication Technology (ICT). This research aimed to analyze the relationship between the level of competency and the level of ICT use by extension workers. The study was designed as descriptive, correlational survey research, carried out with a quantitative analysis approach supported by descriptive and inferential statistical analysis. The study was conducted in Bogor Regency, West Java Province. Based on this research, it can be concluded that the level of ICT utilization in resource-related aspects is very significantly related to extension workers' competence in understanding the potential of the region, entrepreneurial ability, and the ability to guide network systems, while variation in counseling materials and related information is very significantly related to all levels of extension competence.

  3. Information and Emotion in Advertising: A Content Analysis on the Internet in Brazil

    Directory of Open Access Journals (Sweden)

    Melby Karina Zuniga Huertas

    2012-02-01

Full Text Available The significant increase in the number of users and transactions over the Internet has highlighted the importance of this channel, bringing new opportunities and challenges for advertising. However, the way advertisers use information and emotional appeals in their messages is still little studied. Hence, the objectives of this paper are to: (a) analyze some general features of Internet advertising; and (b) explore the information content and emotional appeal of Internet advertising. We conducted an exploratory study through content analysis of a sample of 156 ads on sites used by a convenience sample of Brazilian Internet users. The results showed that the advertising on the analyzed sites was for an assortment of products, but the products were concentrated in a few categories. The types of informational content most commonly used are "availability" and "components or content" of products. The advertisements also use only positive emotional appeals, which brings opportunities to create campaigns. This study contributes knowledge about the use of the Internet in marketing communications and suggests directions for future research.

  4. Training needs for toxicity testing in the 21st century: a survey-informed analysis.

    Science.gov (United States)

    Lapenna, Silvia; Gabbert, Silke; Worth, Andrew

    2012-12-01

    Current training needs on the use of alternative methods in predictive toxicology, including new approaches based on mode-of-action (MoA) and adverse outcome pathway (AOP) concepts, are expected to evolve rapidly. In order to gain insight into stakeholder preferences for training, the European Commission's Joint Research Centre (JRC) conducted a single-question survey with twelve experts in regulatory agencies, industry, national research organisations, NGOs and consultancies. Stakeholder responses were evaluated by means of theory-based qualitative data analysis. Overall, a set of training topics were identified that relate both to general background information and to guidance for applying alternative testing methods. In particular, for the use of in silico methods, stakeholders emphasised the need for training on data integration and evaluation, in order to increase confidence in applying these methods for regulatory purposes. Although the survey does not claim to offer an exhaustive overview of the training requirements, its findings support the conclusion that the development of well-targeted and tailor-made training opportunities that inform about the usefulness of alternative methods, in particular those that offer practical experience in the application of in silico methods, deserves more attention. This should be complemented by transparent information and guidance on the interpretation of the results generated by these methods and software tools. PMID:23398336

  5. Model parameter analysis using remotely sensed pattern information in a multi-constraint framework

    Science.gov (United States)

    Stisen, Simon; McCabe, Matthew F.; Refsgaard, Jens C.; Lerer, Sara; Butts, Michael B.

    2011-10-01

The development of sophisticated distributed hydrological models that characterize groundwater, surface water and atmospheric interactions inevitably increases the amount of observational data required to force the model and evaluate simulations. The limitations of using traditional evaluation data and the benefits of including independent and spatially distributed observational constraints are illustrated here through a sensitivity analysis of a coupled surface-water/groundwater/atmosphere model simulated at the catchment scale. In the analyses, model performance based on objective functions using conventional stream flow and groundwater head observations are compared against objective functions that utilize spatially distributed satellite based surface temperature retrievals as the calibration variable. The advantage of incorporating remote sensing based observations into the model evaluation process is their spatially distributed information content, enabling an assessment of the capacity of the model to reproduce observed spatial patterns. Results indicate that employing spatially distributed model parameterizations has limited impact on improving model performance when evaluated against traditional model objectives of stream flow and groundwater head. Indeed, a spatially uniform parameterization produced almost identical model performance. In contrast, objective functions that incorporate remote sensing based surface temperatures highlighted the comparatively poor reproduction of spatial patterns when using a spatially uniform parameterization. Although lumped observations such as stream discharge contain valuable information regarding the total catchment water budget, such traditional observations should be merged with independent data sets that incorporate spatial pattern information to strengthen the robustness of model performance and model parameter constraint.
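One common way to score the "reproduction of observed spatial patterns" discussed in this record is a pattern correlation between simulated and observed fields (for example, surface-temperature grids). The sketch below is illustrative; the paper's exact objective function may differ:

```python
def pattern_correlation(sim, obs):
    """Pearson correlation between two spatial fields given as 2-D grids.
    A high value means the model reproduces the observed spatial pattern,
    independently of any overall bias or scaling."""
    s = [v for row in sim for v in row]
    o = [v for row in obs for v in row]
    ms, mo = sum(s) / len(s), sum(o) / len(o)
    cov = sum((a - ms) * (b - mo) for a, b in zip(s, o))
    sd_s = sum((a - ms) ** 2 for a in s) ** 0.5
    sd_o = sum((b - mo) ** 2 for b in o) ** 0.5
    return cov / (sd_s * sd_o)
```

A lumped metric such as total stream discharge error cannot distinguish a model that puts the warm and cool patches in the right places from one that merely matches the catchment average; a pattern metric like this one can.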

  6. Drive-Response Analysis of Global Ice Volume, CO2, and Insolation using Information Transfer

    Science.gov (United States)

    Brendryen, J.; Hannisdal, B.

    2014-12-01

    The processes and interactions that drive global ice volume variability and deglaciations are a topic of considerable debate. Here we analyze the drive-response relationships between data sets representing global ice volume, CO2 and insolation over the past 800 000 years using an information theoretic approach. Specifically, we use a non-parametric measure of directional information transfer (IT) based on the construct of transfer entropy to detect the relative strength and directionality of interactions in the potentially chaotic and non-linear glacial-interglacial climate system. Analyses of unfiltered data suggest a tight coupling between CO2 and ice volume, detected as strong, symmetric information flow consistent with a two-way interaction. In contrast, IT from Northern Hemisphere (NH) summer insolation to CO2 is highly asymmetric, suggesting that insolation is an important driver of CO2. Conditional analysis further suggests that CO2 is a dominant influence on ice volume, with the effect of insolation also being significant but limited to smaller-scale variability. However, the strong correlation between CO2 and ice volume renders them information redundant with respect to insolation, confounding further drive-response attribution. We expect this information redundancy to be partly explained by the shared glacial-interglacial "sawtooth" pattern and its overwhelming influence on the transition probability distributions over the target interval. To test this, we filtered out the abrupt glacial terminations from the ice volume and CO2 records to focus on the residual variability. Preliminary results from this analysis confirm insolation as a driver of CO2 and two-way interactions between CO2 and ice volume. However, insolation is reduced to a weak influence on ice volume. Conditional analyses support CO2 as a dominant driver of ice volume, while ice volume and insolation both have a strong influence on CO2. These findings suggest that the effect of orbital
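The transfer-entropy quantity underlying such directional information-transfer analyses can be sketched with a plug-in histogram estimator (a generic illustration, not the authors' non-parametric implementation; the binary toy series is hypothetical):

```python
import numpy as np

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of T_{X->Y} = I(y_{t+1}; x_t | y_t), in bits."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # discretize each series into equal-width bins
    xd = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    yf, yp, xp = yd[1:], yd[:-1], xd[:-1]      # y future, y past, x past

    def H(*cols):
        # joint Shannon entropy (bits) of the given discrete columns
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # I(y_{t+1}; x_t | y_t) = H(yf,yp) + H(yp,xp) - H(yp) - H(yf,yp,xp)
    return H(yf, yp) + H(yp, xp) - H(yp) - H(yf, yp, xp)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000).astype(float)   # driver series
y = np.roll(x, 1)                            # y copies x with one step of lag
te_xy = transfer_entropy(x, y)               # ~1 bit: strong flow x -> y
te_yx = transfer_entropy(y, x)               # ~0 bits: no flow back
```

The asymmetry te_xy >> te_yx is what identifies x as the driver, analogous to the insolation-to-CO2 asymmetry reported above.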

  7. Beyond usage: understanding the use of electronic journals on the basis of information activity analysis. Keywords: Electronic journals, Use studies, Information activity, Scientific communication

    Directory of Open Access Journals (Sweden)

    Annaïg Mahé

    2004-01-01

    Full Text Available In this article, which reports the second part of a two-part study of the use of electronic journals by researchers in two French research institutions, we attempt to explain the integration of the use of electronic journals in the scientists' information habits, going beyond usage analysis. First, we describe how the development of electronic journals use follows a three-phase innovation process - research-development, first uses, and technical acculturation. Then, we attempt to find more significant explanatory factors, and emphasis is placed on the wider context of information activity. Three main information activity types are outlined - marginal, parallel, and integrated. Each of these types corresponds to a particular attitude towards scientific information and to different levels of electronic journal use.

  8. Impact of Corporate Governance on Social and Environmental Information Disclosure of Malaysian Listed Banks: Panel Data Analysis

    OpenAIRE

    Mohamad Akhyar Adnan; Hafiz Majdi Ab. Rashid; Sheila Nu Nu Htay; Ahamed Kameel Mydin Meera

    2012-01-01

    This study investigates the impact of corporate governance on social and environmental information disclosure of Malaysian listed banks by using a panel data analysis. The proxies for good corporate governance are board leadership structure, board composition, board size, director ownership, institutional ownership and block ownership. Social and environmental information disclosure index is developed and content analysis is conducted by cross checking between the social and environm...

  9. A framework for systematic analysis of Open Access journals and its application in software engineering and information systems

    OpenAIRE

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2013-01-01

    This article is a contribution towards an understanding of Open Access (OA) publishing. It proposes an analysis framework of 18 core attributes, divided into the areas of Bibliographic information, Activity metrics, Economics, Accessibility, and Predatory issues of OA journals. The framework has been employed in a systematic analysis of 30 OA journals in software engineering (SE) and information systems (IS), which were selected among 386 OA journals in Computer Science from the Directory of ...

  10. A comparative analysis of the explanatory power of accounting and patent information for the market values of German firms

    OpenAIRE

    Reitzig, Markus; Ramb, Fred

    2004-01-01

    We present a theoretical and empirical analysis of the fitness of national German (German Commercial Code – Handelsgesetzbuch (HGB)) and international (IAS and US-GAAP) accounting information, as well as European patent data to explain the market values of German manufacturing firms. For the chosen volatile period from 1997 to 2002, cautious national accounting information does not correlate with the firms’ residual market values (RMV). International accounting information make...

  11. Information Analysis on ERP Investment Bulletins

    Institute of Scientific and Technical Information of China (English)

    徐扬; 程媛媛

    2015-01-01

    Investment bulletins are a valuable information resource open to the public, and extracting the tacit information they contain is a key issue in information analysis. Enterprise Resource Planning (ERP) is widely applied by enterprises because of its standardization, process-oriented management methods and high integration. Research on the value of ERP investments has attracted attention from academic scholars and enterprise managers, and supports enterprise decision making. This paper analyzes information gathered from ERP investment bulletins and attempts to measure ERP value through the stock market's reaction, so as to make explicit the tacit information that may influence ERP value. Building on earlier studies, it collects the ERP investment bulletins of enterprises publicly traded in the United States from 1997 to 2013 and analyzes how bulletin information influences enterprise value, providing support for investment decisions. (The Chinese original adds that nine variables were extracted, coded and computed, and that the analysis draws on theories of organizational integration of information systems and option value.)
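Measuring a bulletin's effect through the stock market is typically operationalized as an event study; a minimal market-model sketch on synthetic data (all numbers hypothetical, not the study's data or exact method) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
T_est, T_evt = 120, 5                       # estimation window and event window lengths
market = 0.01 * rng.standard_normal(T_est + T_evt)
stock = 0.001 + 1.2 * market + 0.005 * rng.standard_normal(T_est + T_evt)
stock[T_est:] += 0.01                       # planted announcement effect in the event window

# market model (stock = alpha + beta * market) fitted on the estimation window only
beta, alpha = np.polyfit(market[:T_est], stock[:T_est], 1)
expected = alpha + beta * market[T_est:]
car = np.sum(stock[T_est:] - expected)      # cumulative abnormal return over the event window
```

A positive cumulative abnormal return around the bulletin date is the usual evidence that the announcement carried value-relevant information.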

  12. HIV/AIDS Information Needs of Sexually Transmitted Infection Clinic Patients: Content Analysis of Questions Asked during Prevention Counseling

    Science.gov (United States)

    Kalichman, Seth C.; Cain, Demetria; Knecht, Joanna; Hill, Justin

    2008-01-01

    Basic factual information about disease is the cornerstone of health promotion and disease prevention interventions. Previous studies have shown that content analysis of the questions asked of service providers can elucidate the information needs of service consumers. Questions asked by individuals at known high risk for HIV infection have not…

  13. Carbon Dioxide Information Analysis Center and World Data Center-A for atmospheric trace gases: Fiscal year 1995 annual report

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.; Cushman, R.M.; Boden, T.A.; Jones, S.B.; Nelson, T.; Stoss, F.W.

    1996-01-01

    Fiscal year 1995 was both a very productive year for the Carbon Dioxide Information Analysis Center and a year of significant change. This document presents information about the most notable accomplishments made during the year. Topics include: highlights; statistics; future plans; publications, presentations, and awards; and change in organization and staff.

  14. THE USAGE OF HRU SEGMENT MATRIX ACCESS IN THE ANALYSIS OF INFORMATION SECURITY SYSTEMS WHICH MAKE MANDATORY ACCESS CONTROL

    Directory of Open Access Journals (Sweden)

    Korolev I. D.

    2014-09-01

    Full Text Available In this article we consider the use of a system for changing the HRU access matrix, which allows an information security system that enforces mandatory access control to be analyzed by means of automatic classification of formalized documents in an electronic document management system.
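The HRU model referred to above maintains a matrix of rights per (subject, object) pair, modified by primitive commands; a minimal sketch with a mandatory "no read up" rule layered on top (toy labels and levels, not the authors' system):

```python
# HRU-style access matrix: (subject, object) -> set of rights
rights = {}

def enter_right(r, s, o):
    """HRU primitive: enter right r into cell (s, o)."""
    rights.setdefault((s, o), set()).add(r)

def delete_right(r, s, o):
    """HRU primitive: delete right r from cell (s, o)."""
    rights.get((s, o), set()).discard(r)

def check(r, s, o):
    return r in rights.get((s, o), set())

# mandatory access control layered on the matrix: a subject may only be granted
# "read" on documents at or below its clearance level (hypothetical labels)
LEVEL = {"alice": 2, "bob": 1, "doc_secret": 2, "doc_public": 0}

def grant_read(s, o):
    if LEVEL[s] >= LEVEL[o]:          # "no read up" rule
        enter_right("read", s, o)

grant_read("bob", "doc_secret")       # denied: clearance too low
grant_read("bob", "doc_public")       # granted
```

Automatic document classification, as in the abstract, would assign the object levels in LEVEL rather than hard-coding them.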

  15. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a 52-million dollar group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary

  16. Value of information analysis for Corrective Action Unit 97: Yucca Flat, Nevada Test Site, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    IT Corporation Las Vegas

    1999-11-19

    The value-of-information analysis evaluated data collection options for characterizing groundwater transport of contamination associated with the Yucca Flat and Climax Mine Corrective Action Units. Experts provided inputs for the evaluation of 48 characterization options, which included 27 component activities, 12 combinations of activities (subgroups), and 9 combinations of subgroups (groups). The options range from an individual study using existing data and intended to address a relatively narrow uncertainty to a 52-million dollar group of activities designed to collect and analyze new information to broadly address multiple uncertainties. A modified version of the contaminant transport component of the regional model was used to simulate contaminant transport and to estimate the maximum extent of the contaminant boundary, defined as that distance beyond which the committed effective dose equivalent from the residual radionuclides in groundwater will not exceed 4 millirem per year within 1,000 years. These simulations identified the model parameters most responsible for uncertainty over the contaminant boundary and determined weights indicating the relative importance of these parameters. Key inputs were identified through sensitivity analysis; the five selected parameters were flux for flow into Yucca Flat from the north, hydrologic source term, effective porosity and diffusion parameter for the Lower Carbonate Aquifer, and path length from the Volcanic Confining Unit to the Lower Carbonate Aquifer. Four measures were used to quantify uncertainty reduction. Using Bayesian analysis, the options were compared and ranked based on their costs and estimates of their effectiveness at reducing the key uncertainties relevant to predicting the maximum contaminant boundary.

  17. A Citation Analysis of Australian Information Systems Researchers: Towards a New ERA?

    Directory of Open Access Journals (Sweden)

    Roger Clarke

    2008-05-01

    Full Text Available Citation analysis is a potentially valuable means of assessing the contributions of researchers, in Information Systems (IS) as in other disciplines. In particular, a combination of raw counts and deeper analysis of citation data can deliver insights into the impact of a researcher's publications on other researchers. Despite this potential, the limited literature in the IS discipline has paid very little attention to the use of citation analysis for this purpose. Meanwhile, the federal department responsible for education funding has convinced successive federal governments to develop research quality measures that can be used as a basis for differential funding. The Howard Government's proposed Research Quality Framework (RQF) has been abandoned, but a number of aspects of it survive within the Rudd Government's Excellence in Research for Australia (ERA) initiative. The ERA also appears likely to involve a highly formalised process whereby 'research groupings' within individual universities will be evaluated, with (as yet unclear) impacts on the distribution of research funding. Funding agencies have an interest in score-keeping, whether or not their enthusiasm is shared by Australian researchers. It is therefore highly advisable that Australian disciplines, and especially less well-established and powerful disciplines like Information Systems, achieve a clear understanding of their performance as indicated by the available measurement techniques applied to the available data. This paper reports on citation analysis using data from both the longstanding Thomson/ISI collection and the more recently developed Google Scholar service. Few Australian IS researchers have achieved scores of any great significance in the Thomson/ISI collection, whereas the greater depth available in Google Scholar provides a more realistic picture. Quality assessment of the Thomson/ISI collection shows it to be seriously inappropriate for relatively new disciplines
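Raw citation counts of the kind compared here are commonly summarized with the h-index; a minimal sketch (toy counts, not data from the study):

```python
def h_index(citations):
    """Largest h such that h of the papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # the rank-th paper still has >= rank citations
        else:
            break
    return h

example = h_index([10, 8, 5, 4, 3])   # -> 4
```

Because the h-index depends directly on which citations the index covers, the Thomson/ISI versus Google Scholar gap discussed above translates straight into different scores for the same researcher.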

  18. Species-specific analysis of protein sequence motifs using mutual information

    Directory of Open Access Journals (Sweden)

    Weckwerth Wolfram

    2005-06-01

    Full Text Available Abstract Background Protein sequence motifs are by definition short fragments of conserved amino acids, often associated with a specific function. Accordingly, protein sequence profiles derived from multiple sequence alignments provide an alternative description of functional motifs characterizing families of related sequences. Such profiles conveniently reflect functional necessities by pointing out proximity at conserved sequence positions as well as depicting distances at variable positions. Discovering significant conservation characteristics within the variable positions of profiles mirrors group-specific and, in particular, evolutionary features of the underlying sequences. Results We describe the tool PROfile analysis based on Mutual Information (PROMI) that enables comparative analysis of user-classified protein sequences. PROMI is implemented as a web service using Perl and R as well as other publicly available packages and tools on the server side. On the client side, platform independence is achieved by generally applied internet delivery standards. As one possible application, analysis of the zinc finger C2H2-type protein domain is introduced to illustrate the functionality of the tool. Conclusion The web service PROMI should assist researchers to detect evolutionary correlations in protein profiles of defined biological sequences. It is available at http://promi.mpimp-golm.mpg.de where additional documentation can be found.
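The core quantity behind such profile analyses, mutual information between two alignment columns, can be sketched with a plug-in estimator (a generic illustration with toy columns, not PROMI's implementation):

```python
import numpy as np
from collections import Counter

def column_mi(col_a, col_b):
    """Mutual information (bits) between two alignment columns of equal length."""
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    # sum over observed residue pairs: p(a,b) * log2( p(a,b) / (p(a) p(b)) )
    return sum((c / n) * np.log2(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

# toy alignment columns: col1 and col2 co-vary perfectly; col3 is independent of col1
col1 = "AAAACCCC"
col2 = "DDDDEEEE"
col3 = "ADADADAD"
mi_covarying = column_mi(col1, col2)    # 1.0 bit
mi_independent = column_mi(col1, col3)  # 0.0 bits
```

High mutual information between variable positions is exactly the group-specific, co-evolving signal the abstract describes.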

  19. The Wind ENergy Data and Information (WENDI) Gateway: New Information and Analysis Tools for Wind Energy Stakeholders

    Science.gov (United States)

    Kaiser, D.; Palanisamy, G.; Santhana Vannan, S.; Wei, Y.; Smith, T.; Starke, M.; Wibking, M.; Pan, Y.; Devarakonda, Ranjeet; Wilson, B. E.; Wind Energy Data and Information (WENDI) Gateway Team

    2010-12-01

    In support of the U.S. Department of Energy’s (DOE) Energy Efficiency and Renewable Energy (EERE) Office, DOE's Oak Ridge National Laboratory (ORNL) has launched the Wind ENergy Data & Information (WENDI) Gateway. The WENDI Gateway is intended to serve a broad range of wind-energy stakeholders by providing easy access to a large amount of wind energy-related data and information through its two main interfaces: the Wind Energy Metadata Clearinghouse and the Wind Energy Geographic Information System (WindGIS). The Metadata Clearinghouse is a powerful, customized search tool for discovering, accessing, and sharing wind energy-related data and information. Its database of metadata records points users to a diverse array of wind energy-related resources: from technical and scientific journal articles to mass media news stories; from annual government and industry reports to downloadable datasets, and much more. Through the WindGIS, users can simultaneously visualize a wide spectrum of United States wind energy-related spatial data, including wind energy power plant locations; wind resource maps; state-level installed wind capacity, generation, and renewable portfolio standards; electric transmission lines; transportation infrastructure; interconnection standards; land ownership, designation, and usage; and various ecological data layers. In addition, WindGIS allows users to download much of the data behind the layers. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Wilson, Bruce E., et al. "Mercury Toolset for Spatiotemporal Metadata." (2010).

  20. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    Science.gov (United States)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  1. Web-based Weather and Climate Information Service of Forensic Disaster Analysis

    Science.gov (United States)

    Mühr, Bernhard; Kunz, Michael; Köbele, Daniel

    2014-05-01

    , Europe, and the other continents. In 2007, 'Wettergefahren-Frühwarnung' became part of CEDIM and contributed to the activity of near-real time Forensic Disaster Analysis ahead, during and after a major event. Information is provided as text, own weather charts or data.

  2. Library and Information Science Research Areas: A Content Analysis of Articles from the Top 10 Journals 2007-8

    Science.gov (United States)

    Aharony, Noa

    2012-01-01

    The current study seeks to describe and analyze journal research publications in the top 10 Library and Information Science journals from 2007-8. The paper presents a statistical descriptive analysis of authorship patterns (geographical distribution and affiliation) and keywords. Furthermore, it displays a thorough content analysis of keywords and…

  3. On the predictive information criteria for model determination in seismic hazard analysis

    Science.gov (United States)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    estimate, but it is hardly applicable to data which are not independent given parameters (Watanabe, J. Mach. Learn. Res., 2010). A solution is given by Ando and Tsay criterion where the joint density may be decomposed into the product of the conditional densities (Ando and Tsay, Int. J. Forecast., 2010). The above mentioned criteria are global summary measures of model performance, but more detailed analysis could be required to discover the reasons for poor global performance. In this latter case, a retrospective predictive analysis is performed on each individual observation. In this study we performed the Bayesian analysis of Italian data sets by four versions of a long-term hazard model known as the stress release model (Vere-Jones, J. Physics Earth, 1978; Bebbington and Harte, Geophys. J. Int., 2003; Varini and Rotondi, Environ. Ecol. Stat., 2015). Then we illustrate the results on their performance evaluated by Bayes Factor, predictive information criteria and retrospective predictive analysis.
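Of the predictive criteria compared above, WAIC (Watanabe, 2010) is straightforward to compute once pointwise log-likelihoods are available for each posterior draw; a minimal sketch (hypothetical toy input, not the stress release model output):

```python
import numpy as np

def waic(loglik):
    """WAIC on the deviance scale from an (S draws, n observations) matrix
    of pointwise log-likelihoods."""
    # log pointwise predictive density: log of the posterior-mean likelihood per point
    lppd = np.sum(np.log(np.mean(np.exp(loglik), axis=0)))
    # effective number of parameters: posterior variance of the log-likelihoods
    p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# degenerate check: if every draw gives the same likelihood, the penalty vanishes
ll = np.full((100, 5), -1.0)
w = waic(ll)   # -2 * (-5 - 0) = 10.0
```

As the abstract notes, such global scores can then be complemented by retrospective predictive checks on individual observations when a model scores poorly.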

  4. Information Technology Project Portfolio and Strategy Alignment Assessment Based on Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Marisa Analía Sánchez

    2012-11-01

    Full Text Available Recent research has shown that companies face considerable difficulties in assessing the strategy value contribution of Information Technology (IT) investments. One of the major obstacles to achieving strategy alignment is that organizations find it extremely difficult to link and quantify the benefits of IT investments with strategic goals. The aim of this paper is to define an approach to assess portfolio-strategy alignment. To this end, a formal specification of the Kaplan and Norton Strategy Map is developed utilizing the Unified Modeling Language (UML). The approach uses the Strategy Map as a framework for defining the portfolio value contribution, and Data Envelopment Analysis (DEA) is used as the methodology for measuring the efficiency of project portfolios. DOI: 10.5585/gep.v3i2.66
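The DEA efficiency score used for comparing portfolios can be sketched with the standard input-oriented CCR model solved as a linear program (a generic formulation with hypothetical data, not the paper's model; uses SciPy's linprog):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]            # minimize theta; variables are [theta, lambdas]
    A_in = np.c_[-X[o], X.T]               # sum_j lam_j x_ji <= theta * x_oi
    A_out = np.c_[np.zeros(s), -Y.T]       # sum_j lam_j y_jr >= y_or
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# toy project portfolios: one input (cost), one output (strategic benefit score)
X = np.array([[1.0], [2.0], [4.0]])
Y = np.array([[1.0], [2.0], [2.0]])
eff = [dea_ccr_efficiency(X, Y, o) for o in range(len(X))]
```

Here the third portfolio delivers the same benefit as the second at twice the cost, so its efficiency comes out at 0.5 while the first two are efficient (1.0).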

  5. Analysis of information provided by natural or artificial tracers used to study aquifer systems in hydrogeology

    International Nuclear Information System (INIS)

    After referring to the use of the transfer function method in the analysis of multivariable linear systems, the authors discuss - on the basis of standard examples - the use of residence time distributions, which are a fundamental kind of information provided by tracers. Measurement of flow rates by Allen's classical method is the first example and illustrates, for confined flows, the method of analysing multivariable systems. The authors examine the conditions for employing this method and introduce the property of pulse response identity, the practical significance of which is considerable. This property, which is established by representing the flow by a Markov chain, is extended to underground flows. In another example, the authors analyse the problem of water transfer between two boreholes. Lastly, they deal with two important sampling problems involving residence time distributions: the dating of water and the study of the kinetics of interactions between pollutants and the medium during their transfer (self-purification). (author)

  6. Geographical information system (GIS) suitability analysis of radioactive waste repository site in Pahang, Malaysia

    International Nuclear Information System (INIS)

    The aim of this project is to identify a suitable site for a radioactive waste repository in Pahang using remote sensing and geographical information system (GIS) technologies. Ten parameters were considered in the analysis, divided into Selection Criteria and Exclusion Criteria. The Selection Criteria parameters consist of land use, rainfall, lineament, slope, groundwater potential and elevation, while the Exclusion Criteria parameters consist of urban areas, protected land and islands. All parameters were integrated, weighted and ranked for site selection evaluation in a GIS environment. Initially, about twelve sites were identified as suitable for a radioactive waste repository throughout the study area. These sites were further analysed by ground checking of the physical setting, including geology, drainage and population density, in order to finalise the three most suitable sites for the radioactive waste repository. (author)
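The weight-and-rank overlay of Selection and Exclusion Criteria described above can be sketched in a few lines (toy rasters, weights and mask, not the study's actual layers):

```python
import numpy as np

# toy 3x3 criterion rasters, each rescaled to a 0-1 suitability score
slope    = np.array([[0.9, 0.4, 0.1], [0.8, 0.5, 0.2], [0.7, 0.6, 0.3]])
land_use = np.array([[0.5, 0.9, 0.1], [0.4, 0.8, 0.2], [0.3, 0.7, 0.1]])
rainfall = np.array([[0.6, 0.6, 0.6], [0.5, 0.5, 0.5], [0.9, 0.4, 0.4]])

# weighted overlay of the Selection Criteria (weights sum to 1)
score = 0.5 * slope + 0.3 * land_use + 0.2 * rainfall

# Exclusion Criteria applied as a mask (True = excluded, e.g. protected land)
excluded = np.array([[False, False, True],
                     [False, False, False],
                     [True,  False, False]])
score[excluded] = 0.0

best = np.unravel_index(score.argmax(), score.shape)   # most suitable remaining cell
```

Candidate cells would then be ranked by score before the ground-checking stage the abstract describes.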

  7. Hidden Markov model analysis of force/torque information in telemanipulation

    Science.gov (United States)

    Hannaford, Blake; Lee, Paul

    1991-01-01

    A model for the prediction and analysis of sensor information recorded during robotic performance of telemanipulation tasks is presented. The model uses the hidden Markov model to describe the task structure, the operator's or intelligent controller's goal structure, and the sensor signals. A methodology for constructing the model parameters based on engineering knowledge of the task is described. It is concluded that the model and its optimal state estimation algorithm, the Viterbi algorithm, are very successful at the task of segmenting the data record into phases corresponding to subgoals of the task. The model provides a rich modeling structure within a statistical framework, which enables it to represent complex systems and be robust to real-world sensory signals.
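The Viterbi segmentation relied on above can be sketched for a two-phase force-sensing task, with quantized force readings as the observation symbols (toy probabilities, not the paper's model parameters):

```python
import numpy as np

def viterbi(obs, log_A, log_B, log_pi):
    """Most likely state path for a discrete-output HMM (inputs are log-probabilities)."""
    T, N = len(obs), log_A.shape[0]
    delta = np.zeros((T, N))                 # best log-score ending in each state
    psi = np.zeros((T, N), dtype=int)        # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # scores[i, j]: best path i -> j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):           # backtrack
        path[t] = psi[t + 1][path[t + 1]]
    return path

# states: 0 = "approach" (mostly low force), 1 = "contact" (mostly high force)
A = np.array([[0.9, 0.1], [0.1, 0.9]])       # sticky phase transitions
B = np.array([[0.9, 0.1], [0.2, 0.8]])       # emission of symbols {low=0, high=1}
pi = np.array([0.99, 0.01])
obs = [0, 0, 0, 1, 1, 1, 1, 0, 1, 1]         # one noisy low reading during contact
path = viterbi(obs, np.log(A), np.log(B), np.log(pi))
```

The sticky transition matrix makes the decoder ride through the single noisy reading at index 7 instead of briefly switching phase, which is the segmentation robustness the abstract reports.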

  8. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger;

    2014-01-01

    realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial....... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on WiFi...... traces collected in the hospital’s WiFi infrastructure over two weeks observing around 18000 different devices recording more than a billion individual WiFi measurements. For the presented analysis methods we present quantitative performance results, e.g., demonstrating over 95% accuracy for correct...
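One basic aggregate extracted from such traces, distinct devices per access point per time slot as a proxy for occupancy, can be sketched as follows (hypothetical toy records, not the hospital data):

```python
from collections import defaultdict

# synthetic WiFi association records: (hour, access_point, device_id)
records = [(9, "AP-lobby", "d1"), (9, "AP-lobby", "d2"), (9, "AP-ward", "d3"),
           (10, "AP-lobby", "d1"), (10, "AP-ward", "d3"), (10, "AP-ward", "d4"),
           (10, "AP-ward", "d3")]            # repeated sightings of the same device collapse

seen = defaultdict(set)
for hour, ap, dev in records:
    seen[(hour, ap)].add(dev)                # sets deduplicate per (hour, AP)
occupancy = {k: len(v) for k, v in seen.items()}
```

Time series of such counts per location are the kind of spatio-temporal summary a planner could inspect through the visualization tools described above.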

  9. Analysis Of Educational Services Distribution-Based Geographic Information System GIS

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    Full Text Available Abstract This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS), in order to test the efficiency of GIS technology for redistributing the existing kindergartens, choosing the best future locations, and applying standard criteria for selecting suitable kindergarten locations. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in YEMEN, which contributed to building a geographic database for the study area. The kindergarten spatial patterns were then analyzed in terms of proximity to each other and to other land uses in the surrounding area, such as streets, highways and factories, and measured for concentration, dispersion, clustering and distribution direction. The study showed the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage of kindergartens, and the distribution pattern of the existing kindergartens is dominated by spatial dispersion.
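One standard measure behind such clustering/dispersion statements is the average nearest-neighbour ratio; a minimal sketch (toy coordinates, not the Mukalla data):

```python
import numpy as np

def nearest_neighbor_index(points, area):
    """Average nearest-neighbour ratio R: R<1 clustered, R~1 random, R>1 dispersed."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)              # ignore self-distances
    observed = d.min(axis=1).mean()          # mean distance to nearest neighbour
    expected = 0.5 * np.sqrt(area / n)       # expectation under complete spatial randomness
    return observed / expected

# a tight cluster of facilities in a 100 x 100 study area -> strongly clustered
cluster = [(10, 10), (11, 10), (10, 11), (11, 11)]
r = nearest_neighbor_index(cluster, area=100 * 100)   # << 1
```

A ratio well below 1, as here, indicates clustering; values above 1 would match the spatial dispersion reported for the existing kindergartens.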

  10. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504

  11. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting-edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai and, as such, is of interest to both researchers and financial analysts.
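The effect of "partial" dependency measures, removing links that are explained by a common driver, can be sketched with partial correlation, the linear analogue of the paper's partial mutual information (the planted three-asset example is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000
x0 = rng.standard_normal(T)                 # common driver (e.g. a market factor)
x1 = x0 + 0.5 * rng.standard_normal(T)
x2 = x0 + 0.5 * rng.standard_normal(T)      # x1 and x2 are linked only through x0
X = np.column_stack([x0, x1, x2])

C = np.corrcoef(X, rowvar=False)            # Pearson network: strong spurious 1-2 edge
P = np.linalg.inv(C)
# partial correlation of i and j given all other variables
pcorr = -P / np.sqrt(np.outer(np.diag(P), np.diag(P)))
```

Pearson correlation reports a strong x1-x2 link, while the partial measure correctly drives it toward zero once x0 is conditioned out; this pruning of indirect edges is why the filtered networks differ from the plain correlation networks.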

  12. Building Information Modelling and Standardised Construction Contracts: a Content Analysis of the GC21 Contract

    Directory of Open Access Journals (Sweden)

    Aaron Manderson

    2015-08-01

    Full Text Available Building Information Modelling (BIM) is seen as a panacea for many of the ills confronting the Architectural, Engineering and Construction (AEC) sector. In spite of its well-documented benefits, the widespread integration of BIM into the project lifecycle is yet to occur. One commonly identified barrier to BIM adoption is the perceived legal risk associated with its integration, coupled with the need for implementation in a collaborative environment. Many existing standardised contracts used in the Australian AEC industry were drafted before the emergence of BIM. As BIM continues to become ingrained in the delivery process, the shortcomings of these existing contracts have become apparent. This paper reports on a study that reviewed and consolidated the contractual and legal concerns associated with BIM implementation. The findings of the review were used to conduct a qualitative content analysis of the GC21 2nd edition, an Australian standardised construction contract, to identify possible changes to facilitate the implementation of BIM in a collaborative environment. The findings identified a number of changes, including the need to adopt a collaborative contract structure with equitable risk and reward mechanisms, recognition of the model as a contract document, and the need for standardisation of communication/information exchange.

  13. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    Energy Technology Data Exchange (ETDEWEB)

    Tsourtis, Anastasios, E-mail: tsourtis@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, Crete (Greece); Pantazis, Yannis, E-mail: pantazis@math.umass.edu; Katsoulakis, Markos A., E-mail: markos@math.umass.edu [Department of Mathematics and Statistics, University of Massachusetts, Amherst, Massachusetts 01003 (United States); Harmandaris, Vagelis, E-mail: harman@uoc.gr [Department of Mathematics and Applied Mathematics, University of Crete, and Institute of Applied and Computational Mathematics (IACM), Foundation for Research and Technology Hellas (FORTH), GR-70013 Heraklion, Crete (Greece)

    2015-07-07

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.
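The key point of the abstract, that the pathwise FIM is computable as an ergodic average over a single trajectory, can be checked on the simplest possible case. The sketch below is not the authors' implementation: it assumes the standard Girsanov-based expression for the pathwise Fisher information of an overdamped Langevin equation, I(θ) = E[∂θb ∂θbᵀ]/σ², and tests it on a 1-D Ornstein-Uhlenbeck process where the stationary answer is 1/(2k).

```python
import numpy as np

# Overdamped Langevin (Ornstein-Uhlenbeck) toy model: dX = -k X dt + sigma dW.
# The drift's parameter derivative is d_k b(x) = -x, so the assumed pathwise
# Fisher information for k is I(k) = E[x^2] / sigma^2 = 1/(2k) at stationarity,
# computable as an ergodic average over one trajectory.
k, sigma, dt, n = 1.0, 1.0, 1e-2, 400_000
rng = np.random.default_rng(1)

x = np.empty(n)
x[0] = 0.0
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
for i in range(n - 1):  # Euler-Maruyama integration
    x[i + 1] = x[i] - k * x[i] * dt + noise[i]

burn = n // 10  # discard the transient before averaging
fim = np.mean(x[burn:] ** 2) / sigma**2
print(fim)  # ergodic estimate, close to the exact value 1/(2k) = 0.5
```

Note the estimate needs no perturbed simulations at all, which is the practical advantage the abstract emphasises.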

  14. Environmental factor analysis of cholera in China using remote sensing and geographical information systems.

    Science.gov (United States)

    Xu, M; Cao, C X; Wang, D C; Kan, B; Xu, Y F; Ni, X L; Zhu, Z C

    2016-04-01

    Cholera is one of a number of infectious diseases that appears to be influenced by climate, geography and other natural environments. This study analysed the environmental factors of the spatial distribution of cholera in China. It shows that temperature, precipitation, elevation, and distance to the coastline have significant impact on the distribution of cholera. It also reveals the oceanic environmental factors associated with cholera in Zhejiang, which is a coastal province of China, using both remote sensing (RS) and geographical information systems (GIS). The analysis has validated the correlation between indirect satellite measurements of sea surface temperature (SST), sea surface height (SSH) and ocean chlorophyll concentration (OCC) and the local number of cholera cases based on 8-year monthly data from 2001 to 2008. The results show the number of cholera cases has been strongly affected by the variables of SST, SSH and OCC. Utilizing this information, a cholera prediction model has been established based on the oceanic and climatic environmental factors. The model indicates that RS and GIS have great potential for designing an early warning system for cholera. PMID:26464184

  15. Parametric sensitivity analysis for stochastic molecular systems using information theoretic metrics

    International Nuclear Information System (INIS)

    In this paper, we present a parametric sensitivity analysis (SA) methodology for continuous time and continuous space Markov processes represented by stochastic differential equations. Particularly, we focus on stochastic molecular dynamics as described by the Langevin equation. The utilized SA method is based on the computation of the information-theoretic (and thermodynamic) quantity of relative entropy rate (RER) and the associated Fisher information matrix (FIM) between path distributions, and it is an extension of the work proposed by Y. Pantazis and M. A. Katsoulakis [J. Chem. Phys. 138, 054115 (2013)]. A major advantage of the pathwise SA method is that both RER and pathwise FIM depend only on averages of the force field; therefore, they are tractable and computable as ergodic averages from a single run of the molecular dynamics simulation both in equilibrium and in non-equilibrium steady state regimes. We validate the performance of the extended SA method on two different molecular stochastic systems, a standard Lennard-Jones fluid and an all-atom methane liquid, and compare the obtained parameter sensitivities with parameter sensitivities on three popular and well-studied observable functions, namely, the radial distribution function, the mean squared displacement, and the pressure. Results show that the RER-based sensitivities are highly correlated with the observable-based sensitivities.

  16. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-12-01

    Full Text Available In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. We then explore the relationship between localization precision and access point (AP) placement, which can provide valuable suggestions for the design of highly precise localization systems. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations so as to approach the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve WLAN fingerprint-based localization precision, but also reduce the time overhead.
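The simulated-annealing AP-placement step lends itself to a compact sketch. Everything below is an invented stand-in: the room geometry, the mean-distance-to-nearest-AP objective used as a crude precision proxy, and the cooling schedule are illustrative only, not the paper's FIM-derived criterion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Evaluation grid over a 10 m x 10 m area (toy stand-in for the fingerprint map).
gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
points = np.column_stack([gx.ravel(), gy.ravel()])

def cost(aps):
    """Mean distance from each grid point to its nearest AP (lower is better)."""
    d = np.linalg.norm(points[:, None, :] - aps[None, :, :], axis=2)
    return d.min(axis=1).mean()

aps = rng.uniform(0, 10, size=(3, 2))  # three APs, random initial placement
cur = cost(aps)
best, best_cost = aps.copy(), cur
temp = 1.0
for _ in range(3000):
    cand = np.clip(aps + rng.normal(0, 0.5, aps.shape), 0, 10)
    cand_cost = cost(cand)
    # Metropolis acceptance: always take improvements, sometimes accept
    # uphill moves, with decreasing probability as the temperature drops.
    if cand_cost < cur or rng.random() < np.exp(-(cand_cost - cur) / temp):
        aps, cur = cand, cand_cost
        if cur < best_cost:
            best, best_cost = aps.copy(), cur
    temp *= 0.999
print(best, best_cost)
```

In the paper's setting, `cost` would be replaced by a precision bound derived from the Fisher information matrix rather than this geometric proxy.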

  17. A Bibliometric Analysis on the Literature of Information Organization in Taiwan

    Directory of Open Access Journals (Sweden)

    Chiao-Min Lin

    2009-12-01

    Full Text Available The purpose of this study is to explore the characteristics of the literature on information organization in Taiwan. A total of 610 articles from 1882 to 2008 and 113 theses and dissertations from 1971 to 2008 were identified and analyzed. The growth of the literature, research subjects, author productivity, and the distribution of journals and organizations are addressed. The results of this study reveal that journal-article output in Taiwan grew steadily until a decline after 2003, while the theses and dissertations have grown stably. The major research subjects of the journal articles are classification theory and descriptive cataloging, whereas those of the theses and dissertations are knowledge organization. The zone analysis from Bradford’s law is not applicable to journal productivity. Most articles are single-authored, with a trend toward co-authorship. The Journal of Educational Media and Library Sciences is the major journal publishing information organization articles. National Taiwan University is the most productive school for theses and dissertations, but the various schools have their own characteristic research subjects. [Article content in Chinese]

  18. Sub pixel analysis and processing of sensor data for mobile target intelligence information and verification

    Science.gov (United States)

    Williams, Theresa Allen

    This dissertation introduces a novel process to study and analyze sensor data in order to obtain information pertaining to mobile targets at the sub-pixel level. The process design is modular in nature and utilizes a set of algorithmic tools for change detection, target extraction and analysis, super-pixel processing and target refinement. The scope of this investigation is confined to a staring sensor that records data of sub-pixel vehicles traveling horizontally across the ground. Statistical models of the targets and background are developed with noise and jitter effects. Threshold Change Detection, Duration Change Detection and Fast Adaptive Power Iteration (FAPI) Detection techniques are the three methods used for target detection. The PolyFit and FermiFit are two tools developed and employed for target analysis, which allows for flexible processing. Tunable parameters in the detection methods, along with filters for false alarms, show the adaptability of the procedures. Super-pixel processing tools are designed, and Refinement Through Tracking (RTT) techniques are investigated as post-processing refinement options. The process is tested on simulated datasets, and validated with sensor datasets obtained from RP Flight Systems, Inc.

  19. Information Loss Determination on Digital Image Compression and Reconstruction Using Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Zhengmao Ye

    2011-12-01

    Full Text Available To effectively utilize storage capacity, digital image compression has been applied to numerous science and engineering problems. There are two fundamental image compression techniques, lossless and lossy. The former employs probabilistic models for lossless storage on the basis of the statistical redundancy occurring in digital images; however, it is limited in compression ratio and bits per pixel. Hence, the latter has also been widely implemented to further improve storage capacity, covering various fundamental digital image processing approaches. It has been well documented that most lossy compression schemes provide near-perfect visual perception at exceptional compression ratios, among which the discrete wavelet transform, the discrete Fourier transform and some statistical optimization compression schemes (e.g., principal component analysis and independent component analysis) are the dominant approaches. It is necessary to evaluate these compression and reconstruction schemes objectively, in addition to their visual appeal. Using a well-defined set of quantitative metrics from Information Theory, a comparative study on several typical digital image compression and reconstruction schemes will be conducted in this research.
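Two of the quantitative metrics such a comparative study might use, Shannon entropy per pixel and peak signal-to-noise ratio, can be sketched as follows. The uniform-noise test image and the 16-level quantiser standing in for a lossy codec are purely illustrative, not the paper's schemes.

```python
import numpy as np

def entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit image."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def psnr(orig, recon):
    """Peak signal-to-noise ratio (dB) between two 8-bit images."""
    mse = np.mean((orig.astype(float) - recon.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0**2 / mse)

rng = np.random.default_rng(3)
orig = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
recon = (orig // 16) * 16 + 8  # crude "lossy codec": quantize to 16 grey levels

print(entropy(orig))     # near 8 bits/pixel for uniform noise
print(entropy(recon))    # near 4 bits/pixel after quantization
print(psnr(orig, recon)) # finite, since the reconstruction is lossy
```

Entropy measures how much information the (reconstructed) image still carries, while PSNR measures fidelity to the original; a study like the one above would tabulate such metrics across codecs.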

  20. Stakeholder Analysis as a Medium to Aid Change in Information System Reengineering Projects

    Directory of Open Access Journals (Sweden)

    Jean Davison

    2004-04-01

    Full Text Available The importance of involving stakeholders within a change process is well recognised, and successfully managed change is equally important. Information systems development and redesign is a form of change activity involving people and social issues, and therefore resistance to change may occur. A stakeholder identification and analysis (SIA) technique has been developed as an enhancement to PISO® (Process Improvement for Strategic Objectives), a method that engages the users of a system in the problem solving and reengineering of their own work-based problem areas. The SIA technique aids the identification and analysis of system stakeholders, and helps view the projected outcome of system changes and their effect on relevant stakeholders, with attention being given to change resistance to ensure smooth negotiation and achieve consensus. A case study is presented here describing the successful implementation of a direct appointment booking system for patients within the National Health Service in the UK, utilising the SIA technique, which resulted in a feeling of empowerment and ownership of the change among those involved.

  1. Ontology-based time information representation of vaccine adverse events in VAERS for temporal analysis

    Directory of Open Access Journals (Sweden)

    Tao Cui

    2012-12-01

    Full Text Available Abstract Background The U.S. FDA/CDC Vaccine Adverse Event Reporting System (VAERS) provides a valuable data source for post-vaccination adverse event analyses. The structured data in the system have been widely used, but the information in the write-up narratives is rarely included in these kinds of analyses. In fact, the unstructured nature of the narratives makes the data embedded in them difficult to use for any further studies. Results We developed an ontology-based approach to represent the data in the narratives in a “machine-understandable” way, so that they can be easily queried and further analyzed. Our focus is the time aspect of the data, for time-trending analysis. The Time Event Ontology (TEO), Ontology of Adverse Events (OAE), and Vaccine Ontology (VO) are leveraged for the semantic representation of this purpose. A VAERS case report is presented as a use case for the ontological representations. The advantages of using our ontology-based Semantic Web representation and data analysis are emphasized. Conclusions We believe that representing both the structured data and the data from write-up narratives in an integrated, unified, and “machine-understandable” way can improve research for vaccine safety analyses, causality assessments, and retrospective studies.

  2. Application of information theory for the analysis of cogeneration-system performance

    International Nuclear Information System (INIS)

    Successful cogeneration system performance depends critically upon the correct estimation of load variation and the accuracy of demand prediction. We need not only aggregated annual heat and electricity demands, but also hourly and monthly patterns, in order to evaluate a cogeneration system's performance by computer simulation. These data are usually obtained from actual measurements of energy demand in existing buildings. However, it is extremely expensive to collect actual energy demand data and store them over a long period for many buildings, so we face the question of whether it is really necessary to survey hourly demands. This paper provides a sensitivity analysis of the influence of demand-prediction error upon the efficiency of a cogeneration system, so as to evaluate the relative importance of various demand components. These components are annual energy demand, annual heat-to-electricity ratio, daily load factor and so forth. Our approach employs concepts from information theory to construct a mathematical model. The analysis provides an indication of the relative importance of the demand indices, and identifies what may become a good measure for assessing the efficiency of a cogeneration system for planning purposes. (Author)
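The abstract does not give its model, but the flavour of an information-theoretic treatment of demand patterns can be suggested with a toy calculation. Treating a normalised hourly load profile as a probability distribution over hours, a flat profile (high daily load factor) maximises Shannon entropy, while a peaky one concentrates it. The profile shapes below are invented for illustration.

```python
import numpy as np

def profile_entropy(load):
    """Shannon entropy (bits) of an hourly demand profile, treating the
    normalised profile as a probability distribution over the 24 hours."""
    p = np.asarray(load, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

hours = np.arange(24)
flat = np.ones(24)                                         # load factor 1.0
peaky = 1 + 9 * np.exp(-0.5 * ((hours - 18) / 2.0) ** 2)   # evening peak

print(profile_entropy(flat))   # log2(24) ≈ 4.585 bits, the maximum
print(profile_entropy(peaky))  # lower: demand is concentrated in a few hours
```

A summary statistic of this kind quantifies how much of the hourly pattern a coarse index such as the daily load factor fails to capture, which is the sort of comparison the paper's sensitivity analysis is concerned with.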

  3. Informational database methodology for urban risk analysis.Case study: the historic centre of Bucharest

    Science.gov (United States)

    Armas, I.; Dumitrascu, S.

    2009-04-01

    , but is also a very populated area, factors that favour a high susceptibility level. In addition, the majority of the buildings fall into the first and second categories of seismic risk, having been built between 1875 and 1940, the age of the buildings establishing an increased vulnerability to natural hazards. The methodology was developed through the contribution of three partner universities from Bucharest: the University of Bucharest, the Academy for Economic Studies and the Technical University of Constructions. The suggested method was based on the analysis and processing of digital and statistical spatial information resulting from 1:500 topographical plans, satellite pictures, archives and historical maps used for the identification of the age of the buildings. An important stage was also represented by the field investigations, which yielded the data used in the assessment of the buildings: year of construction, location and vicinity, height, number of floors, state and function of the building, equipment and construction type. The information collected from the field, together with the data resulting from the digitization of the orthophotoplans, was inserted into ArcGIS in order to compile the database. Furthermore, the team from the Cybernetics Faculty developed a special software package in Visual Studio and SQL Server in order to insert the sheets into the GIS so that they could be statistically processed. The final product of the study is a program whose main functions are editing, analysis based on selected factors (individual or group) and the viewing of building information in the form of maps or 3D visualizations. The strengths of the resulting informational system are its extended range of applicability, short processing time, accessibility, and capacity to support a large amount of information; it thus stands out as an adequate instrument to fit the needs of a susceptible population.

  4. A parametric Bayesian combination of local and regional information in flood frequency analysis

    Science.gov (United States)

    Seidou, O.; Ouarda, T. B. M. J.; Barbet, M.; Bruneau, P.; Bobée, B.

    2006-11-01

    Because of their impact on hydraulic structure design as well as on floodplain management, flood quantiles must be estimated with the highest precision given available information. If the site of interest has been monitored for a sufficiently long period (more than 30-40 years), at-site frequency analysis can be used to estimate flood quantiles with a fair precision. Otherwise, regional estimation may be used to mitigate the lack of data, but local information is then ignored. A commonly used approach to combine at-site and regional information is the linear empirical Bayes estimation: Under the assumption that both local and regional flood quantile estimators have a normal distribution, the empirical Bayesian estimator of the true quantile is the weighted average of both estimations. The weighting factor for each estimator is inversely proportional to its variance. We propose in this paper an alternative Bayesian method for combining local and regional information which provides the full probability density of quantiles and parameters. The application of the method is made with the generalized extreme values (GEV) distribution, but it can be extended to other types of extreme value distributions. In this method the prior distributions are obtained using a regional log linear regression model, and then local observations are used within a Markov chain Monte Carlo algorithm to infer the posterior distributions of parameters and quantiles. Unlike the empirical Bayesian approach the proposed method works even with a single local observation. It also relaxes the hypothesis of normality of the local quantiles probability distribution. The performance of the proposed methodology is compared to that of local, regional, and empirical Bayes estimators on three generated regional data sets with different statistical characteristics.
The results show that (1) when the regional log linear model is unbiased, the proposed method gives better estimations of the GEV quantiles and
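A heavily simplified sketch of the regional-prior-plus-MCMC scheme described in the abstract above follows. It is not the authors' implementation: the GEV shape parameter is held fixed at its true value, the priors are invented stand-ins for the regional log-linear regression, and a plain random-walk Metropolis sampler is used over location and log-scale.

```python
import numpy as np
from scipy.stats import genextreme, norm

rng = np.random.default_rng(4)

# Short local record (e.g. 12 annual flood peaks) drawn from a known GEV
c_true, loc_true, scale_true = -0.1, 100.0, 25.0
local = genextreme.rvs(c_true, loc=loc_true, scale=scale_true,
                       size=12, random_state=rng)

# "Regional" priors: invented stand-ins for the log-linear regression model
prior_loc = norm(105.0, 15.0)
prior_log_scale = norm(np.log(25.0), 0.4)

def log_post(theta):
    loc, log_s = theta
    ll = genextreme.logpdf(local, c_true, loc=loc, scale=np.exp(log_s)).sum()
    return ll + prior_loc.logpdf(loc) + prior_log_scale.logpdf(log_s)

# Random-walk Metropolis over (location, log scale); shape held fixed
theta = np.array([105.0, np.log(25.0)])
lp = log_post(theta)
samples = []
for i in range(12000):
    cand = theta + rng.normal(0, [2.0, 0.05])
    lp_c = log_post(cand)
    if np.log(rng.random()) < lp_c - lp:
        theta, lp = cand, lp_c
    if i >= 3000:  # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# Full posterior distribution of the 100-year flood quantile
q100 = genextreme.ppf(0.99, c_true, loc=samples[:, 0],
                      scale=np.exp(samples[:, 1]))
print(np.percentile(q100, [5, 50, 95]))
```

The output is a posterior credible interval for the quantile rather than the single weighted-average point estimate of the linear empirical Bayes approach, which is the practical difference the abstract emphasises.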

  5. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    Science.gov (United States)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and to develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder-driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks.
Through the case study, we will demonstrate how a stakeholder-driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically

  6. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    Science.gov (United States)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing it to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been working on developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous and bottlenecks in the system architecture can occur when the databases try to run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how to best store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and

  7. Information Crisis

    CERN Document Server

    Losavio, Michael

    2012-01-01

    Information Crisis discusses the scope and types of information available online and teaches readers how to critically assess it and analyze potentially dangerous information, especially when teachers, editors, or other information gatekeepers are not available to assess the information for them. Chapters and topics include: the Internet as an information tool; critical analysis; legal issues, traps, and tricks; protecting personal safety and identity; and types of online information.

  8. Analysis of Information Publicity System%信息公开制度探析

    Institute of Scientific and Technical Information of China (English)

    朱庆华; 颜祥林

    2001-01-01

    Information publicity systems provide a legal guarantee for the exploitation of government information resources. This paper discusses the reasons for establishing such a system and, based on the Japanese Freedom of Information Act, discusses the main content of an information publicity system.

  9. The Necessity and Functionality of Information Management Safety: A Case Analysis in Turkey Perspective

    OpenAIRE

    Hacer Tugba Eroglu; Hayriye Sagir

    2014-01-01

    Today, the use of information is gradually increasing in both the private and public sectors. As the importance of information grows, keeping and storing it safely has also emerged as an important need, and transferring it from one place to another has become inevitable. This dependency on information has made the need to protect it a current issue. Attacks on information, its destruction, its erasure, and damage to its confidentiality cause informati...

  10. Extraction of Benthic Cover Information from Video Tows and Photographs Using Object-Based Image Analysis

    Science.gov (United States)

    Estomata, M. T. L.; Blanco, A. C.; Nadaoka, K.; Tomoling, E. C. M.

    2012-07-01

    Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. Majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as: Neural Networks, and rapid classification via down sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES) was attempted to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, accuracy of the said depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 to 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
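The area rules quoted in the abstract (less than or equal to 700 pixels for fish, 700 to 10,000 pixels for rubble, with a standard-deviation test for texture) can be written down directly. Note that the texture threshold value and the object-feature format below are invented for illustration; only the area cut-offs come from the abstract.

```python
# Toy re-creation of the OBIA rule set described above. Objects are assumed
# to arrive as dicts of features from a prior segmentation step; the texture
# threshold of 20 is a hypothetical stand-in for the paper's std-dev values.
def classify(obj):
    area, std = obj["area_px"], obj["std_intensity"]
    if area <= 700 and std > 20:
        return "fish"           # small, textured objects
    if 700 < area <= 10_000 and std > 20:
        return "rubble"         # mid-size, textured objects
    return "sand" if std <= 20 else "coral"

objects = [
    {"area_px": 350, "std_intensity": 35},   # small, textured
    {"area_px": 4200, "std_intensity": 28},  # mid-size, textured
    {"area_px": 9000, "std_intensity": 6},   # large, smooth
]
print([classify(o) for o in objects])  # ['fish', 'rubble', 'sand']
```

This object-level view (area, texture, shape, context per segment) is what distinguishes OBIA from per-pixel classifiers, and is why it can separate rubble and fish from coral and sand.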

  11. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. Majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as: Neural Networks, and rapid classification via down sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES) was attempted to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, accuracy of the said depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 to 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).

  12. Measuring child maltreatment using multi-informant survey data: a higher-order confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    Giovanni A. Salum

    2016-03-01

    Full Text Available Objective To investigate the validity and reliability of a multi-informant approach to measuring child maltreatment (CM) comprising seven questions assessing CM administered to children and their parents in a large community sample. Methods Our sample comprised 2,512 children aged 6 to 12 years and their parents. CM was assessed with three questions answered by the children and four answered by their parents, covering physical abuse, physical neglect, emotional abuse and sexual abuse. Confirmatory factor analysis was used to compare the fit indices of different models. Convergent and divergent validity were tested using parent-report and teacher-report scores on the Strengths and Difficulties Questionnaire. Discriminant validity was investigated using the Development and Well-Being Assessment to divide subjects into five diagnostic groups: typically developing controls (n = 1,880), fear disorders (n = 108), distress disorders (n = 76), attention deficit hyperactivity disorder (n = 143) and oppositional defiant disorder/conduct disorder (n = 56). Results A higher-order model with one higher-order factor (child maltreatment) encompassing two lower-order factors (child report and parent report) exhibited the best fit to the data, and this model's reliability results were acceptable. As expected, child maltreatment was positively associated with measures of psychopathology and negatively associated with prosocial measures. All diagnostic category groups had higher levels of overall child maltreatment than typically developing children. Conclusions We found evidence for the validity and reliability of this brief measure of child maltreatment using data from a large survey combining information from parents and their children.

  13. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    Full Text Available In this paper we chose to present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents, and the various indicators calculated from them, allow us to formulate assessments of the performance and profitability of the various functions and levels of a company's activity. This paper aims to support the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss account or the cash flow statement that better reflects the performance of a business? Based on the specialty literature studied, we take a conceptual, analytical and practical approach to the term performance, reviewing some of its terminological acceptations as well as the main indicators of performance analysis based on the profit and loss account and the cash flow statement: aggregated indicators (also known as intermediary balances of administration), economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and the rate of generating operating cash out of gross operating result. We also compare the profit and loss account and the cash flow statement, outlining the main advantages and disadvantages of these documents. In order to demonstrate the above theoretical assessments, we analyze these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  14. Biological data analysis as an information theory problem: multivariable dependence measures and the shadows algorithm.

    Science.gov (United States)

    Sakhanenko, Nikita A; Galas, David J

    2015-11-01

    Information theory is valuable in multiple-variable analysis because it is model-free and nonparametric and has only modest sensitivity to undersampling. We previously introduced a general approach to finding multiple dependencies that provides accurate measures of the level of dependency for subsets of variables in a data set; the measure is significantly nonzero only if the subset of variables is collectively dependent. This is useful, however, only if we can avoid a combinatorial explosion of calculations as the number of variables grows. The proposed dependence measure for a subset of variables τ, the differential interaction information Δ(τ), has the property that some of the factors of Δ(τ) for subsets of τ are significantly nonzero when the full dependence includes more variables. We use this property to suppress the combinatorial explosion by following the "shadows" of multivariable dependency on smaller subsets. Rather than calculating the marginal entropies of all subsets at each degree level, we need to consider only calculations for subsets of variables with appropriate "shadows." The number of calculations for n variables at degree level d therefore grows at a much smaller rate than the binomial coefficient (n, d), though it depends on the parameters of the "shadows" calculation. By avoiding a combinatorial explosion, this approach enables the use of our multivariable measures on very large data sets. We demonstrate the method on simulated data sets and characterize the effects of noise and sample size. In addition, we analyze a data set of a few thousand mutant yeast strains interacting with a few thousand chemical compounds. PMID:26335709
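    As a simplified illustration of the entropy-based dependence measures the abstract describes (not the paper's Δ(τ) itself), the classical McGill interaction information for a variable subset can be computed by inclusion-exclusion over joint entropies, I(τ) = -Σ_{T⊆τ} (-1)^{|T|} H(T):

    ```python
    from itertools import combinations
    from collections import Counter
    from math import log2

    def entropy(rows, idx):
        """Shannon entropy (bits) of the joint distribution of columns idx."""
        counts = Counter(tuple(r[i] for i in idx) for r in rows)
        n = len(rows)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    def interaction_information(rows, subset):
        """McGill interaction information via inclusion-exclusion:
        I(subset) = -sum over nonempty T in subset of (-1)^|T| * H(T)."""
        total = 0.0
        for k in range(1, len(subset) + 1):
            for t in combinations(subset, k):
                total += (-1) ** k * entropy(rows, t)
        return -total

    # XOR triplet: every pair is independent, yet the three variables are
    # collectively dependent, exactly the situation the measure targets.
    rows = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
    print(interaction_information(rows, (0, 2)))     # pairwise: 0.0 bits
    print(interaction_information(rows, (0, 1, 2)))  # three-way: -1.0 bits
    ```

    Note that this brute-force version enumerates all 2^d subsets; the paper's "shadows" strategy exists precisely to avoid that enumeration at scale.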

  15. Measuring child maltreatment using multi-informant survey data: a higher-order confirmatory factor analysis.

    Science.gov (United States)

    Salum, Giovanni A; DeSousa, Diogo Araújo; Manfro, Gisele Gus; Pan, Pedro Mario; Gadelha, Ary; Brietzke, Elisa; Miguel, Eurípedes Constantino; Mari, Jair J; Rosário, Maria Conceição do; Grassi-Oliveira, Rodrigo

    2016-03-01

    Objective To investigate the validity and reliability of a multi-informant approach to measuring child maltreatment (CM) comprising seven questions assessing CM administered to children and their parents in a large community sample. Methods Our sample comprised 2,512 children aged 6 to 12 years and their parents. Child maltreatment (CM) was assessed with three questions answered by the children and four answered by their parents, covering physical abuse, physical neglect, emotional abuse and sexual abuse. Confirmatory factor analysis was used to compare the fit indices of different models. Convergent and divergent validity were tested using parent-report and teacher-report scores on the Strengths and Difficulties Questionnaire. Discriminant validity was investigated using the Development and Well-Being Assessment to divide subjects into five diagnostic groups: typically developing controls (n = 1,880), fear disorders (n = 108), distress disorders (n = 76), attention deficit hyperactivity disorder (n = 143) and oppositional defiant disorder/conduct disorder (n = 56). Results A higher-order model with one higher-order factor (child maltreatment) encompassing two lower-order factors (child report and parent report) exhibited the best fit to the data and this model's reliability results were acceptable. As expected, child maltreatment was positively associated with measures of psychopathology and negatively associated with prosocial measures. All diagnostic category groups had higher levels of overall child maltreatment than typically developing children. Conclusions We found evidence for the validity and reliability of this brief measure of child maltreatment using data from a large survey combining information from parents and their children. PMID:27007940

  16. Book Review: Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2009-09-01

    Full Text Available Knapp, K.J. (Ed.). (2009). Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions. Hershey, NY: Information Science Reference. 434 + xxii pages, ISBN: 978-1-60566-326-5, US$195. Reviewed by Gary C. Kessler (gck@garykessler.net). I freely admit that this book was sent to me by the publisher for the express purpose of my writing a review and that I know several of the chapter authors. With that disclosure out of the way, let me say that the book is well worth the review (and I get to keep my review copy). The preface to the book cites the 2003 publication of The National Strategy to Secure Cyberspace by the White House, and the acknowledgement by the U.S. government that our economy and national security were fully dependent upon computers, networks, and the telecommunications infrastructure. This may have come as news to the general population, but it was a long overdue public statement to those of us in the industry. The FBI's InfraGard program and the formation of the National Infrastructure Protection Center (NIPC) pre-dated this report by at least a half-dozen years, so the report was hardly earth-shattering. And the fact that the bulk of the telecom infrastructure is owned by the private sector is a less advertised one. Nonetheless, reminding the community of these facts is always a Good Thing and provides the raison d'être of this book. (See PDF for full review.)

  17. Climate Risk Informed Decision Analysis (CRIDA): A novel practical guidance for Climate Resilient Investments and Planning

    Science.gov (United States)

    Jeuken, Ad; Mendoza, Guillermo; Matthews, John; Ray, Patrick; Haasnoot, Marjolijn; Gilroy, Kristin; Olsen, Rolf; Kucharski, John; Stakhiv, Gene; Cushing, Janet; Brown, Casey

    2016-04-01

    over time. They are part of the Dutch adaptive planning approach, Adaptive Delta Management, executed and developed by the Dutch Delta program. Both decision scaling and adaptation pathways have been piloted in studies worldwide. The objective of CRIDA is to mainstream effective climate adaptation for professional water managers. The CRIDA publication, due in April 2016, follows the generic water planning design cycle. At each step, CRIDA gives stepwise guidance for incorporating climate robustness: problem definition, stress test, formulation and recommendation of alternatives, and evaluation and selection. In the presentation, the origin, goal, steps and the practical tools available at each step of CRIDA will be explained. Two other abstracts ("Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region" by Gilroy et al. and "The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia" by Kucharski et al.) describe the application of CRIDA to case studies.

  18. Multi-resolution, object-oriented fuzzy analysis of remote sensing data for GIS-ready information

    Science.gov (United States)

    Benz, Ursula C.; Hofmann, Peter; Willhauck, Gregor; Lingenfelder, Iris; Heynen, Markus

    Remote sensing from airborne and spaceborne platforms provides valuable data for mapping, environmental monitoring, disaster management and civil and military intelligence. However, to exploit the full value of these data, the appropriate information has to be extracted and presented in a standard format so that it can be imported into geo-information systems and thus support efficient decision processes. The object-oriented approach can contribute to powerful automatic and semi-automatic analysis for most remote sensing applications. Synergetic use with pixel-based or statistical signal-processing methods exploits the rich information content. Here, we explain the principal strategies of object-oriented analysis, discuss how the combination with fuzzy methods allows expert knowledge to be implemented, and describe a representative example of the proposed workflow from remote sensing imagery to GIS. The strategies are demonstrated using the first object-oriented image analysis software on the market, eCognition, which provides an appropriate link between remote sensing imagery and GIS.
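    The fuzzy-rule idea described above can be sketched minimally: each class is defined by membership functions over object features, and an image object is assigned to the class with the highest combined membership. This is a hypothetical illustration, not eCognition's actual API; the feature name and thresholds are invented.

    ```python
    def fuzzy_membership(value, lo, lo_full, hi_full, hi):
        """Trapezoidal fuzzy membership: 0 outside [lo, hi], 1 on
        [lo_full, hi_full], linear ramps in between."""
        if value <= lo or value >= hi:
            return 0.0
        if lo_full <= value <= hi_full:
            return 1.0
        if value < lo_full:
            return (value - lo) / (lo_full - lo)
        return (hi - value) / (hi - hi_full)

    def classify(obj, rules):
        """Assign the class whose rule yields the highest membership.
        A rule maps each feature to trapezoid parameters; conditions within a
        rule are combined with fuzzy AND (minimum)."""
        scores = {cls: min(fuzzy_membership(obj[feat], *trap)
                           for feat, trap in conds.items())
                  for cls, conds in rules.items()}
        return max(scores, key=scores.get), scores

    # Hypothetical expert knowledge encoded as fuzzy rules over mean NDVI
    rules = {
        "forest": {"ndvi": (0.3, 0.5, 1.0, 1.1)},
        "water":  {"ndvi": (-1.1, -1.0, 0.0, 0.1)},
    }
    label, memberships = classify({"ndvi": 0.7}, rules)  # label: "forest"
    ```

    The soft memberships, rather than hard class labels, are what let expert knowledge and classification uncertainty be carried forward into the GIS.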

  19. Geographic information system for fusion and analysis of high-resolution remote sensing and ground data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1993-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements in the context of a Geographical Information System (GIS) integrated with specialized image processing software. We will use this integrated system to analyze the data from two case studies, one at a boreal forest site and the other at a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990s, unprecedented amounts of high-resolution images of the Earth's surface from space will become available to the applications scientist from the LANDSAT/TM series, the European and Japanese ERS-1 satellites, RADARSAT and the SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study a site of interest, may face massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing the results and validating a model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the case studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case 1), calibrated DC-8 SAR (Synthetic Aperture Radar) data and extensive ground truth measurements are already at our disposal. The data set shows documented evidence of temporal change. The Belize Forest Experiment (Case 2) will produce calibrated DC-8 SAR

  20. Model-Driven Integration and Analysis of Access-control Policies in Multi-layer Information Systems

    OpenAIRE

    Martínez, Salvador; Garcia-Alfaro, Joaquin; Cuppens, Frédéric; Cuppens-Boulahia, Nora; Cabot, Jordi

    2015-01-01

    Security is a critical concern for any information system. Security properties such as confidentiality, integrity and availability need to be enforced in order to make systems safe. In complex environments, where information systems are composed of a number of heterogeneous subsystems, each subsystem must participate in achieving these properties. Therefore, security integration mechanisms are needed in order to 1) achieve the global security goal and 2) facilitate the analysis of the security status of the who...
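    The kind of cross-subsystem analysis the abstract alludes to can be sketched in miniature: once each layer's access-control rules are lifted into a common representation, contradictory decisions become easy to detect. The flat rule tuples below are an invented simplification, not the authors' model-driven framework.

    ```python
    from collections import defaultdict

    def find_conflicts(rules):
        """Flag (subject, action, resource) triples that receive both 'permit'
        and 'deny' decisions from different subsystems.

        Each rule is a tuple: (subsystem, subject, action, resource, effect).
        """
        decisions = defaultdict(set)
        for subsystem, subject, action, resource, effect in rules:
            decisions[(subject, action, resource)].add(effect)
        return sorted(t for t, effects in decisions.items() if len(effects) > 1)

    # Hypothetical rules extracted from two layers of a multi-layer system
    rules = [
        ("firewall", "alice", "read", "db1", "permit"),
        ("dbms",     "alice", "read", "db1", "deny"),
        ("dbms",     "bob",   "read", "db1", "permit"),
    ]
    conflicts = find_conflicts(rules)  # alice's read access is contradictory
    ```

    A real model-driven approach would extract such rules automatically from each subsystem's native policy language before analyzing the integrated model.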