WorldWideScience

Sample records for modeling information assurance

  1. Models for Information Assurance Education and Outreach: A Report on Year 1 Implementation

    Science.gov (United States)

    Wang, Jianjun

    2013-01-01

    On September 22, 2012, NSF announced its decision to fund a three-year project, "Models for Information Assurance Education and Outreach" (MIAEO). In the first year of grant operation, MIAEO has invited 18 high school students, two K-12 teachers, and two CSUB student assistants to conduct research explorations in the fields of…

  2. Models for Information Assurance Education and Outreach: A Report on Year 2 Implementation

    Science.gov (United States)

    Wang, Jianjun

    2014-01-01

    "Models for Information Assurance Education and Outreach" (MIAEO) is an NSF-funded, three-year project to support hands-on explorations in "network security" and "cryptography" through Research Experience Vitalizing Science-University Program (REVS-UP) at California State University, Bakersfield. In addition, the…

  3. Information security assurance lifecycle research

    Institute of Scientific and Technical Information of China (English)

    XIE Cheng-shan; XUJIA Gu-yue; WANG Li

    2007-01-01

    This article proposes that problems of information security are caused mainly by the ineffective integration of people, operations, and technology, not merely by the poor use of technology. Based on the information lifecycle, a model of the information security assurance lifecycle is presented. The crucial parts of the model, including the information risk value and protection level, are discussed further, and the solution at each step of the lifecycle is presented with an assured information risk level, in terms of the integration of people, operations, and technology.

  4. Software Assurance Using Structured Assurance Case Models.

    Science.gov (United States)

    Rhodes, Thomas; Boland, Frederick; Fong, Elizabeth; Kass, Michael

    2010-01-01

    Software assurance is an important part of the software development process for reducing risks and ensuring that the software is dependable and trustworthy. Software defects and weaknesses can often lead to software errors and failures and to exploitation by malicious users. Testing, certification and accreditation have traditionally been used in the software assurance process to attempt to improve software trustworthiness. In this paper, we examine a methodology known as a structured assurance case model, which has been widely used for assuring system safety, for its potential application to software assurance. We describe the structured assurance case model and examine its application and use for software assurance. We identify strengths and weaknesses of this approach and suggest areas for further investigation and testing.
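
    The claim-argument-evidence structure that such assurance cases use can be illustrated with a small sketch (the claim texts and evidence labels below are hypothetical, not taken from the paper):

```python
from dataclasses import dataclass, field

# Minimal sketch of a structured assurance case: a top-level claim is
# decomposed into sub-claims, each ultimately supported by evidence.
@dataclass
class Claim:
    statement: str
    children: list = field(default_factory=list)   # sub-claims
    evidence: list = field(default_factory=list)   # supporting artifacts

    def supported(self) -> bool:
        """A claim holds if it has direct evidence, or all sub-claims hold."""
        if self.evidence:
            return True
        return bool(self.children) and all(c.supported() for c in self.children)

case = Claim("Software is acceptably dependable", children=[
    Claim("Known weakness classes are mitigated",
          evidence=["static-analysis report"]),
    Claim("Critical functions are tested",
          evidence=["coverage report"]),
])
print(case.supported())  # → True, since both sub-claims carry evidence
```

    The tree form makes the argument auditable: removing a single piece of evidence propagates up and invalidates the top-level claim.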

  5. Information Assurance Study

    Science.gov (United States)

    1998-01-01

    …participating companies and organizations: Booz Allen Hamilton, General Dynamics, Georgia Tech Research Institute, C3I, Delfin Systems, ITT Industries, MITRE, Information Management Group, SAIC, GRC International, CSC, GTE, Harris and TRW. Named participants include Raczynski (GD), Julie Ryan (JRI), Mike Shank (Delfin & FGM), Mary Shupack (TRW), Angelo Spandaro (GRCI), Bob Thompson (BAH), Neil Wagoner (MITRE), Rusty Wall (CSC) and Chris Wilson.

  6. Information Assurance Under Fire

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2000-01-01

    Information and Communication Technology (ICT) has an immense impact on the Military Mode of Operation. Modern Armed Forces are increasingly using commercial-off-the-shelf (COTS) hardware, software and ICT-services. Defence and government decision-making units and its supporting critical industries

  7. Assured information flow capping architecture

    Science.gov (United States)

    Black, M. D.; Carvin, N. A.

    1985-05-01

    The Tactical Air Control System (TACS) is that set of Tactical Air Force assets used to assess the air and ground situation, and to plan, allocate, commit, and control assigned resources. Previous studies noted that the TACS elements should be more highly distributed to improve survivability on the battlefield of the future. This document reports the results of the Assured Information Flow Capping Architecture study, which developed governing concepts for communications architectures that can support the information flow requirements of a future, distributed TACS. Architectures comprising existing and planned communications equipment were postulated and compared with a set of goals to identify deficiencies. Architectures using new equipment that resolve many of the deficiencies were then postulated, and areas needing further investigation were identified.

  8. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...for specific projects. L5: Analyze assurance technologies and contribute to the development of new ones. Assured Software Development L1

  9. Building a global information assurance program

    CERN Document Server

    Curts, Raymond J

    2002-01-01

    INTRODUCTION TO INFORMATION ASSURANCE (IA): Authentication; Confidentiality; Non-repudiation. BASIC CONCEPTS: Attributes; Information Attributes; Pure Information Attributes; Attributes Influenced by the System; System Attributes; Security Attributes; Information System Support Planning Principles; The Bottom Line, Revisited; Information Assurance (IA); Commercial Capabilities; Security; Network Views; Risk Management; Cognitive Hierarchy; Types of Logic; Summary. RISK, THREAT AND VULNERABILITY. OVERVIEW OF SYSTEMS ENGINEERING: A Systems Engineering Case Study; Case Study Background; The Mission; The Goal; An Approach Toward a Solution; Case Tools…

  10. Development of Mathematical Models of Immune Networks Intended for Information Security Assurance

    Science.gov (United States)

    2006-02-01

    implementation of FIN has been proposed based on a digital signal processor of the super Harvard architecture (DSP SHARC). Keywords (about 10 words)…hardware emulation of the developed models using a digital signal processor (DSP) of the advanced super Harvard architecture (SHARC) has also been…the 2nd International Conference on Artificial Immune Systems (ICARIS 2003), Napier University, Edinburgh, UK, August 31 – September 4, 2003

  11. Information Assurance and Forensic Readiness

    Science.gov (United States)

    Pangalos, Georgios; Katos, Vasilios

    Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations and policies are constantly being added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics be applied to all investigatory, monitoring and auditing activities. This would result in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

  12. Information Assurance Security in the Information Environment

    CERN Document Server

    Blyth, Andrew

    2006-01-01

    Intended for IT managers and assets protection professionals, this work aims to bridge the gap between information security, information systems security and information warfare. It covers topics such as the role of the corporate security officer; Corporate cybercrime; Electronic commerce and the global marketplace; Cryptography; and, more.

  13. INFORMATION ASSURANCE - INTELLIGENCE - INFORMATION SUPERIORITY RELATIONSHIP WITHIN NATO OPERATIONS

    Directory of Open Access Journals (Sweden)

    Gheorghe BOARU, Ioan-Mihai ILIEŞ

    2011-01-01

    Full Text Available There is a tight relationship between information assurance, the intelligence cycle and information superiority within NATO operations. The intelligence cycle has a discrete architecture and provides on-time and relevant intelligence products to the joint force commanders and to other authorized users in a specific joint area of operations. The intelligence cycle must follow the evolution of the operation. A permanent intelligence estimate will be performed during the military decision-making process and operations execution. Information superiority is one of the most powerful intelligence cycle achievements and decisively influences the success of NATO joint operations. Information superiority must be preserved and enhanced through information assurance. Information assurance is an information operation that must be planned by the military in charge of operation security or by non-military experts, executed by all personnel during the entire intelligence cycle lifetime and employed during the planning and execution of NATO joint operations.

  14. Information Assurance and the Information Society

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    1998-01-01

    Society is on the verge of a new era: the information age. Economic changes, a new way of looking at services and new types of conflict are forecasted. Some glimpses of these changes were noticed during the Persian Gulf War. Government decision units, organisations, society and critical industries

  16. Information Assurance and the Information Society

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    1999-01-01

    Society is on the verge of a new era: the information age. Economic changes, a new way of looking at services and new types of conflict are forecasted. Some glimpses of these changes were noticed during the Persian Gulf War. Government decision units, organisations, society and critical industries

  17. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  18. Causal Models for Safety Assurance Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fulfillment of NASA's System-Wide Safety and Assurance Technology (SSAT) project at NASA requires leveraging vast amounts of data into actionable knowledge. Models...

  19. 基于GB/T 20274的信息系统的安全技术保障度量与评估模型%GB/T 20274-based security assurance metrics and evaluation model for information systems

    Institute of Scientific and Technical Information of China (English)

    江常青; 安伟; 林家骏; 张雪芹; 袁文浩

    2012-01-01

    A security assurance measurement and evaluation model was developed to reduce information system security risks. The Chinese standard GB/T 20274 defines security assurance for information systems and specifies a security technical assurance metric with six capability maturity levels. This paper classifies the information assurance elements into three categories according to the relationships among system components: composite independent security assurance, composite complementary security assurance, and composite correlated security assurance. An access path and the dependence and correlation relationships among components are used in an evaluation model for information system security assurance.

  20. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    Science.gov (United States)

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…
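
    The Analytic Hierarchy Process at the heart of the study can be sketched briefly; the three defense-in-depth measures and the pairwise judgments below are invented for illustration, and the geometric-mean method is one common way to approximate the priority vector:

```python
import math

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
# illustrative measures: firewalls, IDS, encryption.
# comparisons[i][j] = how much more important measure i is than measure j.
comparisons = [
    [1.0, 3.0, 5.0],   # firewalls
    [1/3, 1.0, 2.0],   # IDS
    [1/5, 1/2, 1.0],   # encryption
]

def ahp_priorities(matrix):
    """Approximate the AHP priority vector with the geometric-mean method."""
    gms = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]  # normalized weights, sum to 1

weights = ahp_priorities(comparisons)
for name, w in zip(["firewalls", "IDS", "encryption"], weights):
    print(f"{name}: {w:.3f}")
```

    The resulting weights rank the measures for prioritization; a full AHP application would also check the consistency ratio of the judgment matrix.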

  1. 77 FR 67366 - Federal Acquisition Regulation; Information Collection; Quality Assurance Requirements

    Science.gov (United States)

    2012-11-09

    ... Regulation; Information Collection; Quality Assurance Requirements AGENCY: Department of Defense (DOD... information collection requirement concerning quality assurance requirements. Public comments are particularly... Information Collection 9000- 0077, Quality Assurance Requirements, by any of the following methods...

  2. Mission Assurance Modeling and Simulation: A Cyber Security Roadmap

    Science.gov (United States)

    Gendron, Gerald; Roberts, David; Poole, Donold; Aquino, Anna

    2012-01-01

    This paper proposes a cyber security modeling and simulation roadmap to enhance mission assurance governance and establish risk reduction processes within constrained budgets. The term mission assurance stems from risk management work by Carnegie Mellon's Software Engineering Institute in the late 1990s. By 2010, the Defense Information Systems Agency revised its cyber strategy and established the Program Executive Officer-Mission Assurance. This highlights a shift from simply protecting data to balancing risk and begins a necessary dialogue to establish a cyber security roadmap. The Military Operations Research Society has recommended a cyber community of practice, recognizing there are too few professionals having both cyber and analytic experience. The authors characterize the limited body of knowledge in this symbiotic relationship. This paper identifies operational and research requirements for mission assurance M&S supporting defense and homeland security. M&S techniques are needed for enterprise oversight of cyber investments, test and evaluation, policy, training, and analysis.

  3. Coalmine Safety Assurance Information System Based on GIS

    Institute of Scientific and Technical Information of China (English)

    LIU Qiao-xi; MAO Shan-jun; MA Ai-nai; MAO Yun-de; BAO Qing-guo

    2003-01-01

    Mine ventilation and safety are among the most important factors influencing coal production. More attention is being paid to managing safety information in a scientific, efficient, and real-time way. It is therefore important to develop a practical coalmine safety assurance information system (CSAIS). Based on an analysis of the actual management mode for ventilation and safety in mines, the paper studies in detail the structure and function of a GIS-based mine safety assurance information system. Moreover, it also suggests some applications and solutions. Combining these with the practical situation, the paper realizes the full functionality of the present system.

  4. 75 FR 9142 - Information Assurance Scholarship Program (IASP)

    Science.gov (United States)

    2010-03-01

    ... other forms of information technology. Title: Information Assurance Scholarship Program (IASP). Type of... providing secondary education, or the recognized equivalent of such a certificate; (2) Is legally authorized... recipients to detail the results from their grant implementation. (2) Provide representation to the DoD...

  5. Information Assurance Science and Engineering Project

    Science.gov (United States)

    2004-03-01

    describe. However, they model only attacks. Since we have a generic state machine model, we can simultaneously model not just attacks, but also…States of the Finite State Machine Model. The Network: We model the network as a set of facts, each represented as a relational predicate. The state…represent communication between the state machines. 3.2 Step 2: Inject Faults. Both links and nodes may be faulty. With our state machine model of the…
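
    The snippet's idea of representing the network state as a set of relational facts, with faults injected as transitions on that set, can be sketched roughly as follows (the predicate names and the one-hop reachability check are simplifying assumptions, not the report's actual formalism):

```python
# Network state as a set of relational facts (predicate, arg1, arg2).
state = {
    ("link", "A", "B"),       # node A is connected to node B
    ("service", "B", "web"),  # node B runs a web service
}

def inject_fault(facts, node_a, node_b):
    """A link fault is a transition that removes a connectivity fact."""
    return facts - {("link", node_a, node_b)}

def reachable_service(facts, node, service):
    """Is `service` reachable from `node` over one hop? (simplified)"""
    hosts = {f[2] for f in facts if f[0] == "link"}
    return any(("link", node, h) in facts and ("service", h, service) in facts
               for h in hosts)

print(reachable_service(state, "A", "web"))                          # True
print(reachable_service(inject_fault(state, "A", "B"), "A", "web"))  # False
```

    Because attacks and faults are both just transitions on the fact set, the same machinery analyzes either, which is the advantage the snippet alludes to.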

  6. INFORMATION FLOW ASSURED BY ITC CONTINUITY PLANNING

    Directory of Open Access Journals (Sweden)

    Gabriel Cozgarea

    2009-05-01

    Full Text Available Given the frequent use of complex processes and the large volume of information, it is imperative to manage the automatic circuit of the document flow in a company's activity. The main advantage of such a system consists in documents waiting to be proces…

  7. Strategic approach to information security and assurance in health research

    OpenAIRE

    Akazawa, Shunichi; Igarashi, Manabu; Sawa, Hirofumi; Tamashiro, Hiko

    2005-01-01

    Information security and assurance are an increasingly critical issue in health research. Whether health research be in genetics, new drugs, disease outbreaks, biochemistry, or effects of radiation, it deals with information that is highly sensitive and which could be targeted by rogue individuals or groups, corporations, national intelligence agencies, or terrorists looking for financial, social, or political gains. The advent of the Internet and advances in recent information technologies…

  8. Voice Biometrics for Information Assurance Applications

    Science.gov (United States)

    2007-11-02

    verification capability). Approach #2: The individual speaker selects a test phrase (our approach) — In the NRL voice biometrics system, the…templates must be issued to all users. Approach #2: Unprocessed speech waveforms (our approach) — If the fingerprint-matching biometrics method stores…is that the amount of data to be stored is larger compared to the previous approach. The minimum amount of information we need to perform voice biomet…

  9. Information Assurance Alignment: A Study of Performance Impacts

    Science.gov (United States)

    Ghezal, Said

    2011-01-01

    The positive effect on performance of the alignment between a business strategy and its different functional strategies has a wide support in the literature. As an emerging functional area, information assurance has come to play a strategic role by providing all departments and functions across an organization with a reliable, safe, and efficient…

  10. Integrated Reporting and Assurance of Sustainability Information: An Experimental Study on Professional Investors’ Information Processing

    NARCIS (Netherlands)

    Reimsbach, D.; Hahn, R.; Gürtürk, A.

    2017-01-01

    Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information acqu

  11. Evaluating Outsourcing Information Technology and Assurance Expertise by Small Non-Profit Organizations

    Science.gov (United States)

    Guinn, Fillmore

    2013-01-01

    Small non-profit organizations outsource at least one information technology or information assurance process. Outsourcing information technology and information assurance processes has increased every year. The study was to determine the key reasons behind the choice to outsource information technology and information assurance processes. Using…

  13. A New Approach to Understanding Information Assurance

    Science.gov (United States)

    Blyth, Andrew; Williams, Colin; Bryant, Ian; Mattinson, Harvey

    The growth of technologies such as ubiquitous and mobile computing has resulted in the need to rethink the security paradigm. Over the past forty years technology has advanced rapidly, yet most organisations still view security in terms of Confidentiality, Integrity and Availability (CIA). This model of security has expanded to include Non-Repudiation and Authentication. However, this thinking fails to address the social, ethical and business requirements that the modern use of computing has generated. Today computing devices are integrated into every facet of business, with the result that security technologies have struggled to keep pace with the rate of change. In this paper we argue that the view of security that most organisations and stakeholders currently hold is out of date, or in some cases wrong, and that the new view of security needs to be rooted in business impact and business function.

  14. Planning Considerations for Defensive Information Warfare. Information Assurance

    Science.gov (United States)

    2007-11-02

    Systems Agency (DISA), Joint Interoperability and Engineering Organization (JIEO), Center for Information Systems Security (CISS), Contract No. DCA 100-90-C…military operations. Service assurance features are designed into these systems at every level, [43] and yet they still fail to meet even the challenge…be developed. • Infrastructure design is different than systems design and should be treated as such. • Existing technical and human vulnerabilities

  15. Strategic approach to information security and assurance in health research.

    Science.gov (United States)

    Akazawa, Shunichi; Igarashi, Manabu; Sawa, Hirofumi; Tamashiro, Hiko

    2005-09-01

    Information security and assurance are an increasingly critical issue in health research. Whether health research be in genetics, new drugs, disease outbreaks, biochemistry, or effects of radiation, it deals with information that is highly sensitive and which could be targeted by rogue individuals or groups, corporations, national intelligence agencies, or terrorists looking for financial, social, or political gains. The advent of the Internet and advances in recent information technologies have also dramatically increased opportunities for attackers to exploit sensitive and valuable information. Government agencies have deployed legislative measures to protect the privacy of health information and developed information security guidelines for epidemiological studies. However, risks are grossly underestimated and little effort has been made by institutions, governments and international communities to strategically and comprehensively protect health research information. There is a need to enforce a set of proactive measures to protect health research information locally and globally. Such measures should be deployed at all levels but will be successful only if research communities collaborate actively, governments enforce appropriate legislative measures at the national level, and the international community develops quality standards, concluding treaties if necessary, at the global level. Proactive measures for the best information security and assurance would be achieved through a rigorous management process with a cycle of "plan, do, check, and act". Each health research entity, such as hospitals, universities, institutions, or laboratories, should implement this cycle and establish an authoritative security and assurance organization, program and plan coordinated by a designated Chief Security Officer who will ensure implementation of the above process, putting appropriate security controls in place, with key focus areas such as policies and best practices, enforcement…

  16. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  17. A Rotational Blended Learning Model: Enhancement and Quality Assurance

    Science.gov (United States)

    Ghoul, Said

    2013-01-01

    Research on blended learning theory and practice is growing nowadays, with a focus on the development, evaluation, and quality assurance of case studies. However, the enhancement of existing blended learning models, the specification of their online parts, and the quality assurance related specifically to them have not received enough attention.…

  18. [Role of medical information processing for quality assurance in obstetrics].

    Science.gov (United States)

    Selbmann, H K

    1983-06-01

    The paradigm of problem-orientated assurance of the professional quality of medical care is a kind of "control loop system" consisting of the following five steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing and statistics, can make substantial contributions, especially to the steps of observation, identification of the problem, and follow-up control. Perinatal data collection, which has already been introduced in six Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical quality assurance with external aid". The clinics that participated in this programme were given the necessary aid for self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is left entirely to the responsibility of the clinics themselves -- voluntary cooperation and guaranteed anonymity being a matter of course -- to draw their own consequences from the collected data and to translate these into everyday clinical practice.

  19. SECURE MATHEMATICALLY- ASSURED COMPOSITION OF CONTROL MODELS

    Science.gov (United States)

    2017-09-27

    that is provably secure against many classes of cyber-attack. The goal of the project is to provide verifiable security; that is, system designs which…architecture of the secure SMACCMcopter, illustrating the attack…Failed cyber-attack…approach for building secure software. DARPA initiated the High Assurance Cyber Military Systems (HACMS) program to develop the technologies needed to…

  20. Netcentric Information Orchestration: Assuring Information and System Quality in Public Safety Networks

    NARCIS (Netherlands)

    Bharosa, N.

    2011-01-01

    During daily operations, relief agencies such as police, fire brigade and medical services manage information in accordance with their respective processes and organization structure. When disaster strikes, the ad-hoc combinations of such hierarchy-based information systems fail to assure high…

  2. QAM: PROPOSED MODEL FOR QUALITY ASSURANCE IN CBSS

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Component-based software engineering (CBSE) / component-based development (CBD) lays emphasis on decomposing engineered systems into functional or logical components with well-defined interfaces used for communication across the components. The component-based approach develops software systems by selecting appropriate off-the-shelf components and then assembling them within a well-defined software architecture. Because this paradigm differs substantially from the traditional approach, quality assurance for component-based software development is a new topic in the software engineering research community. Because component-based software systems are developed through a process different from that of traditional software, their quality assurance model should address both the process of the components and the process of the overall system. Quality assurance applied across the life cycle analyzes the components in order to achieve high-quality component-based software systems. Although some quality assurance techniques and the component-based approach to software engineering have been studied, there are still no clear and well-defined standards or guidelines for component-based software systems. Therefore, identification of quality assurance characteristics, models, tools and metrics is urgently needed. As a major contribution of this paper, I propose QAM: a Quality Assurance Model for component-based software development, which covers component requirement analysis, component development, component certification, component architecture design, integration, testing, and maintenance.

  3. Health information management for research and quality assurance: the Comprehensive Renal Transplant Research Information System.

    Science.gov (United States)

    Famure, Olusegun; Phan, Nicholas Anh-Tuan; Kim, Sang Joseph

    2014-01-01

    The Kidney Transplant Program at the Toronto General Hospital uses numerous electronic health record platforms housing patient health information that is often not coded in a systematic manner to facilitate quality assurance and research. To address this, the comprehensive renal transplant research information system was conceived by a multidisciplinary healthcare team. Data analysis from comprehensive renal transplant research information system presented at programmatic retreats, scientific meetings, and peer-reviewed manuscripts contributes to quality improvement and knowledge in kidney transplantation.

  4. Quality Assurance Model for Digital Adult Education Materials

    Science.gov (United States)

    Dimou, Helen; Kameas, Achilles

    2016-01-01

    Purpose: This paper aims to present a model for the quality assurance of digital educational material that is appropriate for adult education. The proposed model adopts the software quality standard ISO/IEC 9126 and takes into account adult learning theories, Bloom's taxonomy of learning objectives and two instructional design models: Kolb's model…

  5. Quality assurance of weather data for agricultural system model input

    Science.gov (United States)

    It is well known that crop production and hydrologic variation on watersheds is weather related. Rarely, however, is meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...
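
    The kind of weather-input screening described above can be sketched as simple range and internal-consistency checks on each daily record. The thresholds, missing-value sentinel, and function below are illustrative assumptions, not the procedures of the cited study.

```python
# Hypothetical sketch of basic quality assurance checks for daily weather
# input to an agricultural system model. Thresholds are illustrative.
MISSING = -99.0  # assumed missing-value sentinel

def qa_daily_record(tmax_c, tmin_c, precip_mm, srad_mj_m2):
    """Return a list of QA flags for one day of weather input."""
    flags = []
    if MISSING in (tmax_c, tmin_c, precip_mm, srad_mj_m2):
        flags.append("missing value")
        return flags
    if not -60.0 <= tmin_c <= tmax_c <= 60.0:   # physical range + tmin <= tmax
        flags.append("temperature range/order")
    if not 0.0 <= precip_mm <= 500.0:           # non-negative, plausible daily total
        flags.append("precipitation range")
    if not 0.0 <= srad_mj_m2 <= 45.0:           # plausible daily solar radiation
        flags.append("solar radiation range")
    return flags

# Example: a record with tmin above tmax is flagged for review
print(qa_daily_record(tmax_c=12.0, tmin_c=18.0, precip_mm=3.2, srad_mj_m2=14.0))
```

    Flagged records would then be inspected or gap-filled before being passed to the crop or hydrology model.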

  6. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    Science.gov (United States)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
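
    The event-tree/fault-tree quantification that CSRM builds on can be illustrated with a minimal gate calculation over independent basic events. The probabilities and the example top event below are hypothetical illustrations, not values or structure from the MiniAERCam application.

```python
# Minimal fault-tree sketch: AND/OR gates over independent basic-event
# probabilities. This is a generic PRA illustration, not the CSRM/DFM
# tooling described in the report.
def or_gate(*p):
    """P(at least one event occurs), assuming independence."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """P(all events occur), assuming independence."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical top event: software fault AND (sensor fault OR watchdog failure)
p_top = and_gate(1e-3, or_gate(1e-2, 5e-3))
print(f"{p_top:.3e}")  # → 1.495e-05
```

    Real PRA models nest such gates many levels deep and relax the independence assumption with common-cause terms; the arithmetic per gate, however, stays this simple.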

  7. Statistical Modeling for Radiation Hardness Assurance: Toward Bigger Data

    Science.gov (United States)

    Ladbury, R.; Campola, M. J.

    2015-01-01

    New approaches to statistical modeling in radiation hardness assurance are discussed. These approaches yield quantitative bounds on flight-part radiation performance even in the absence of conventional data sources. This allows the analyst to bound radiation risk at all stages and for all decisions in the RHA process. It also allows optimization of RHA procedures for the project's risk tolerance.

  8. Quality Assurance Based on Descriptive and Parsimonious Appearance Models

    DEFF Research Database (Denmark)

    Nielsen, Jannik Boll; Eiríksson, Eyþór Rúnar; Kristensen, Rasmus Lyngby

    2015-01-01

    In this positional paper, we discuss the potential benefits of using appearance models in additive manufacturing, metal casting, wind turbine blade production, and 3D content acquisition. Current state of the art in acquisition and rendering of appearance cannot easily be used for quality assurance...

  9. Statistical Modeling for Radiation Hardness Assurance

    Science.gov (United States)

    Ladbury, Raymond L.

    2014-01-01

    We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: What models are used, what errors exist in real test data, and what the model allows us to say about the DUT will be discussed. In addition, how to use other sources of data such as historical, heritage, and similar part and how to apply experience, physics, and expert opinion to the analysis will be covered. Also included will be concepts of Bayesian statistics, data fitting, and bounding rates.
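
    One of the "bounding rates" ideas mentioned above can be sketched with generic Poisson statistics: a classical upper confidence bound on a single-event-effect rate from a null test result. The fluence value is an assumed example, and this is standard textbook machinery, not the specific method of the presentation.

```python
import math

# Sketch: 95% upper confidence bound on an event rate when k events are
# observed. For k = 0 this reduces to the familiar "rule of three"
# (mu95 = -ln(0.05) ≈ 3). Fluence below is an assumed test condition.
def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k + 1))

def upper_limit_events(k, conf=0.95, hi=1000.0):
    """Smallest mean count mu with P(X <= k; mu) <= 1 - conf, by bisection."""
    lo_, hi_ = 0.0, hi
    for _ in range(200):
        mid = 0.5 * (lo_ + hi_)
        if poisson_cdf(k, mid) > 1.0 - conf:
            lo_ = mid
        else:
            hi_ = mid
    return 0.5 * (lo_ + hi_)

fluence = 1e11   # ions/cm^2 delivered in the test (assumed)
events = 0       # no upsets observed
mu95 = upper_limit_events(events)       # ≈ 2.996 for zero observed events
sigma95 = mu95 / fluence                # 95% upper bound on cross-section, cm^2
print(f"mu95 = {mu95:.3f}, sigma95 <= {sigma95:.2e} cm^2")
```

    A Bayesian treatment, as discussed in the presentation, would instead place a prior on the rate and report a credible interval, which allows heritage and similarity data to enter through the prior.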

  10. A Framework for Managing the Assured Information Sharing Lifecycle

    Science.gov (United States)

    2013-11-06

    hoc networks. They developed an implementation of the g-SIS group-centric access control model and demonstrated its usefulness to use cases in...effectively used to devise better privacy control mechanisms to control information flow between users in such dynamic mobile systems. Mobile Ad-hoc Networks (MANETs) are extremely vulnerable to a variety of misbehaviors because of their basic features, including lack of communication infrastructure

  11. Quantifying and Assuring Information Transfer in Dynamic Heterogeneous Wireless Networks

    Science.gov (United States)

    2012-07-31

    Kumar, Estimating the state of a Markov chain over a noisy communication channel: A bound and an encoder. To appear in Proceedings of 49th IEEE...Transactions on Information Theory. 4. I-Hong Hou and P. R. Kumar, Queueing Systems with Hard Delay Constraints: A Framework and Solutions for Real-Time...J. Garcia-Haro, Z.J. Haas, A stochastic model for chain collisions of vehicles equipped with vehicular communications, accepted for publication in

  12. Incorporating Global Information Security and Assurance in I.S. Education

    Science.gov (United States)

    White, Garry L.; Hewitt, Barbara; Kruck, S. E.

    2013-01-01

    Over the years, the news media has reported numerous information security incidents. Because of identity theft, terrorism, and other criminal activities, President Obama has made information security a national priority. Not only is information security and assurance an American priority, it is also a global issue. This paper discusses the…

  13. Assuring the USAF Core Missions in the Information Age

    Science.gov (United States)

    2016-01-01

    volatile, uncertain, complex, and ambiguous (VUCA) environment. Hence, a realistic battlefield that accurately represents the future environments...support personnel across the globe with a portfolio valued at $17 billion. He has overall responsibility for the Air Force’s information technology...portfolio as the senior authority for information technology investment strategy, networks, and network-centric policies, communications, information

  14. How to prepare for the next waves of Information Assurance issues?

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2006-01-01

    L'histoire se répète. In general, each development wave of new technology shows a lack of security. The same lack of security can be found in the area of information and communications technology, resulting in a lack of Critical Information Infrastructure (CII) Assurance. By looking back, we can predi

  15. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information is accurate. 1101.32 Section 1101.32 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS INFORMATION DISCLOSURE UNDER SECTION 6(b) OF THE CONSUMER PRODUCT SAFETY ACT Reasonable Steps Commission Will Take...

  16. 16 CFR 1101.33 - Reasonable steps to assure information release is fair in the circumstances.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information release is fair in the circumstances. 1101.33 Section 1101.33 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS INFORMATION DISCLOSURE UNDER SECTION 6(b) OF THE CONSUMER PRODUCT SAFETY ACT Reasonable...

  17. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a set level of usability. The methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the method of convolution, and scientific methods of analysis and analogy are used in the research. The result of the executed work is a model for automated evaluation and assurance of software usability that allows one not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.
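
    The convolution step the abstract refers to can be sketched as an additive weighted convolution of normalized usability criteria. The criteria names and weights below are illustrative assumptions, not the metrics of the cited model.

```python
# Hedged sketch of an additive convolution: fold several usability criteria,
# each normalized to [0, 1], into one scalar score using assumed weights.
def usability_score(metrics, weights):
    """Weighted additive convolution of normalized criteria."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * metrics[k] for k in weights)

# Illustrative per-iteration measurements and stakeholder weights
metrics = {"effectiveness": 0.90, "efficiency": 0.75, "satisfaction": 0.60}
weights = {"effectiveness": 0.5, "efficiency": 0.3, "satisfaction": 0.2}
print(usability_score(metrics, weights))  # 0.5*0.9 + 0.3*0.75 + 0.2*0.6 = 0.795
```

    Tracking this score per agile iteration gives the kind of management signal the abstract describes: a drop below the target level flags the iteration for usability rework.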

  18. Self-Protecting Security for Assured Information Sharing

    Science.gov (United States)

    2015-08-29

    Security. We considered Android security a critical part of the self-protecting security framework, especially for mobile and cloud computing. A summer...for a secure interoperable cloud-based Personal Health Record service, 2012 IEEE 4th International Conference on Cloud Computing Technology and...protecting security could be applied to protect sensitive information in cloud computing and mobile device environments. Therefore, we viewed this area as

  19. Developing a Framework for Evaluating Organizational Information Assurance Metrics Programs

    Science.gov (United States)

    2007-03-01

    security effectiveness is speculative at best.” How to Measure. Leedy and Ormrod (2005) mention several properties that measurement must have in order...Nominal data provides the least information, while ratio data provides the most (McClave and Benson, 2003; Leedy and Ormrod, 2005). ...attempts to draw a conclusion, which those instances would support (Leedy and Ormrod, 2005). Soft Issues. Even though there are many things that can be

  20. Applying Business Process Reengineering to the Marine Corps Information Assurance Certification and Accreditation Process

    Science.gov (United States)

    2009-09-01

    Database Management System DATO: Denial of Authority To Operate DIACAP: DoD Information Assurance Certification and Accreditation Program DII...level of risk, based on the implementation of an approved set of technical, managerial, and procedural safeguards. (CNSSI, 2006, p. 2) IA...concerned with risk elimination but rather risk minimization. The need for IA C&A in USMC Information Technology (IT) systems is based on the need to

  1. Model Based Mission Assurance in a Model Based Systems Engineering (MBSE) Framework: State-of-the-Art Assessment

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.

    2016-01-01

    This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.

  2. Guidance for implementing an environmental, safety, and health-assurance program. Volume 15. A model plan for line organization environmental, safety, and health-assurance programs

    Energy Technology Data Exchange (ETDEWEB)

    Ellingson, A.C.; Trauth, C.A. Jr.

    1982-01-01

    This is 1 of 15 documents designed to illustrate how an Environmental, Safety and Health (ES and H) Assurance Program may be implemented. The generic definition of ES and H Assurance Programs is given in a companion document entitled An Environmental, Safety and Health Assurance Program Standard. This particular document presents a model operational-level ES and H Assurance Program that may be used as a guide by an operational-level organization in developing its own plan. The model presented here reflects the guidance given in the total series of 15 documents.

  3. Information Assurance and the Defense in Depth: A Study of Infosec Warriors and Infosec Cowboys

    Science.gov (United States)

    2003-01-01

    him unique insights into the challenges of information assurance. Survey Structure. Steiner Kvale described two metaphors of interviewers in his...knowledge (Kvale 1996, 3-5). For the purpose of this study, the researcher attempted to be both miner and traveler. Each interview was structured around a...questions according to one of the five responses contained in Table 3. These basic responses, mined by the researcher (Kvale 1996, 4), were entered into

  4. A Concept for Continuous Monitoring that Reduces Redundancy in Information Assurance Processes

    Science.gov (United States)

    2011-09-01

    Defense DAA Designated Accrediting Authority DATO Denial of Authorization to Operate DIACAP DoD Information Assurance Certification and...SFS) Program. This material is based on work supported by the National Science Foundation under Grant DUE-0414102. To Professor Cynthia Irvine, the...organization-specific registration tasks are performed. The baseline IA controls are generated from the DoDI 8500.2 based on the type and category

  5. Performance metrics and life-cycle information management for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Hitchcock, R.J.; Piette, M.A.; Selkowitz, S.E.

    1998-06-01

    Commercial buildings account for over $85 billion per year in energy costs, which is far more energy than technically necessary. One of the primary reasons buildings do not perform as well as intended is that critical information is lost, through ineffective documentation and communication, leading to building systems that are often improperly installed and operated. A life-cycle perspective on the management of building information provides a framework for improving commercial building energy performance. This paper describes a project to develop strategies and techniques to provide decision-makers with information needed to assure the desired building performance across the complete life cycle of a building project. A key element in this effort is the development of explicit performance metrics that quantitatively represent performance objectives of interest to various building stakeholders. The paper begins with a discussion of key problems identified in current building industry practice, and ongoing work to address these problems. The paper then focuses on the concept of performance metrics and their use in improving building performance during design, commissioning, and on-going operations. The design of a Building Life-cycle Information System (BLISS) is presented. BLISS is intended to provide an information infrastructure capable of integrating a variety of building information technologies that support performance assurance. The use of performance metrics in case study building projects is explored to illustrate current best practice. The application of integrated information technology for improving current practice is discussed.

  6. Prospects for Evidence-Based Software Assurance: Models and Analysis

    Science.gov (United States)

    2015-09-01

    would not only facilitate technology transition, but also the management of complex supply chains. A third challenge for R&D managers is tracing the...The project addresses the challenge of software assurance in the presence of rich supply chains. As a consequence of the focus on supply chains, the...software assurance, evidence-based software, software supply chain

  7. Gulf of Mexico dissolved oxygen model (GoMDOM) research and quality assurance project plan

    Science.gov (United States)

    An integrated high resolution mathematical modeling framework is being developed that will link hydrodynamic, atmospheric, and water quality models for the northern Gulf of Mexico. This Research and Quality Assurance Project Plan primarily focuses on the deterministic Gulf of Me...

  8. Image quality assurance in X-ray diagnostics - information on DIN-standard 6868

    Energy Technology Data Exchange (ETDEWEB)

    Becker-Gaab, C.; Borcke, E.; Bunde, E.; Hagemann, G.; Stender, H.S.; Kuetterer, G.; Lang, G.R.; Schoefer, H.; Stieve, F.E.; Widenmann, L.

    1985-11-01

    The Working Group for Standardisation justifies and comments on the establishment of a standard series for image quality assurance in X-ray diagnostic services (DIN 6868). In order to promote compliance with these standards, the users are given some background information on the structure of the standard series and on the recommended procedures to be followed. The definitions for the various hierarchically arranged tests, such as constancy test, status test, acceptance test, as well as a definition for the term "base-line image quality", are explained.

  9. [Image quality assurance in x-ray diagnostic units. Information on DIN 6868 standard series].

    Science.gov (United States)

    Becker-Gaab, C; Borcke, E; Bunde, E; Hagemann, G; Kütterer, G; Lang, G R; Schöfer, H; Stender, H S; Stieve, F E; von Volkmann, T

    1985-11-01

    The Working Group for Standardisation justifies and comments on the establishment of a standard series for image quality assurance in X-ray diagnostic services (DIN 6868). In order to promote compliance with these standards, the users are given some background information on the structure of the standard series and on the recommended procedures to be followed. The definitions for the various hierarchically arranged tests, such as constancy test, status test, acceptance test as well as a definition for the term "base-line image quality" are explained.

  10. Using RUFDATA to guide a logic model for a quality assurance process in an undergraduate university program.

    Science.gov (United States)

    Sherman, Paul David

    2016-04-01

    This article presents a framework to identify key mechanisms for developing a logic model blueprint that can be used for an impending comprehensive evaluation of an undergraduate degree program in a Canadian university. The evaluation is a requirement of a comprehensive quality assurance process mandated by the university. A modified RUFDATA (Saunders, 2000) evaluation model is applied as an initiating framework to assist in decision making to provide a guide for conceptualizing a logic model for the quality assurance process. This article will show how an educational evaluation is strengthened by employing a RUFDATA reflective process in exploring key elements of the evaluation process, and then translating this information into a logic model format that could serve to offer a more focussed pathway for the quality assurance activities. Using preliminary program evaluation data from two key stakeholders of the undergraduate program as well as an audit of the curriculum's course syllabi, a case is made for (1) the importance of inclusivity of key stakeholders' participation in the design of the evaluation process to enrich the authenticity and accuracy of program participants' feedback, and (2) the diversification of data collection methods to ensure that stakeholders' narrative feedback is given ample exposure. It is suggested that the modified RUFDATA/logic model framework be applied to all academic programs at the university undergoing the quality assurance process at the same time so that economies of scale may be realized.

  11. Quality assurance of metabolomics.

    Science.gov (United States)

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas

    2015-01-01

    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, metabolite identification, to bioinformatics data-mining - is urgently needed to assure both quality of metabolomics data and reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demand quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools.

  12. The evolving story of information assurance at the DoD.

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip LaRoche

    2007-01-01

    This document is a review of five documents on information assurance from the Department of Defense (DoD), namely 5200.40, 8510.1-M, 8500.1, 8500.2, and an "interim" document on DIACAP [9]. The five documents divide into three sets: (1) 5200.40 & 8510.1-M, (2) 8500.1 & 8500.2, and (3) the interim DIACAP document. The first two sets describe the certification and accreditation process known as "DITSCAP"; the last two sets describe the certification and accreditation process known as "DIACAP" (the second set applies to both processes). Each set of documents describes (1) a process, (2) a systems classification, and (3) a measurement standard. Appendices in this report (a) list the Phases, Activities, and Tasks of DITSCAP, (b) note the discrepancies between 5200.40 and 8510.1-M concerning DITSCAP Tasks and the System Security Authorization Agreement (SSAA), (c) analyze the DIACAP constraints on role fusion and on reporting, (d) map terms shared across the documents, and (e) review three additional documents on information assurance, namely DCID 6/3, NIST 800-37, and COBIT®.

  13. Information Assurance for Enterprise Resource Planning Systems: Risk Considerations in Public Sector Organizations

    Directory of Open Access Journals (Sweden)

    SHAHZAD NAEEM

    2016-10-01

    ERP (Enterprise Resource Planning) systems reveal and pose non-typical risks because of their dependence on interlinked business operations and process reengineering. Understanding such risks is significant when planning and conducting assurance engagements on the reliability of these complicated computer systems, especially in a distributed environment where data reside at multiple sites and the risks are of a unique nature. Until now, there have been only brief pragmatic grounds on this public-sector ERP issue. To analyze this subject, a partially structured consultation study was carried out with 15 skilled information systems auditors who are specialists in evaluating ERP system risks. This methodology permitted more elaborate information to be obtained about stakeholders' opinions and customer experiences. In addition, interviewees mentioned numerous basic execution troubles (e.g., inadequately skilled human resources and insufficient process reengineering attempts) that lead to enhanced hazards. Interviewees also reported that risks currently vary across vendors and across applications. Eventually, in offering assurance on ERP systems, participants overwhelmingly stressed examining the process instead of the system end product.

  14. Evaluation of a mandatory quality assurance data capture in anesthesia: a secure electronic system to capture quality assurance information linked to an automated anesthesia record.

    Science.gov (United States)

    Peterfreund, Robert A; Driscoll, William D; Walsh, John L; Subramanian, Aparna; Anupama, Shaji; Weaver, Melissa; Morris, Theresa; Arnholz, Sarah; Zheng, Hui; Pierce, Eric T; Spring, Stephen F

    2011-05-01

    Efforts to assure high-quality, safe, clinical care depend upon capturing information about near-miss and adverse outcome events. Inconsistent or unreliable information capture, especially for infrequent events, compromises attempts to analyze events in quantitative terms, understand their implications, and assess corrective efforts. To enhance reporting, we developed a secure, electronic, mandatory system for reporting quality assurance data linked to our electronic anesthesia record. We used the capabilities of our anesthesia information management system (AIMS) in conjunction with internally developed, secure, intranet-based, Web application software. The application is implemented with a backend allowing robust data storage, retrieval, data analysis, and reporting capabilities. We customized a feature within the AIMS software to create a hard stop in the documentation workflow before the end of anesthesia care time stamp for every case. The software forces the anesthesia provider to access the separate quality assurance data collection program, which provides a checklist for targeted clinical events and a free text option. After completing the event collection program, the software automatically returns the clinician to the AIMS to finalize the anesthesia record. The number of events captured by the departmental quality assurance office increased by 92% (95% confidence interval [CI] 60.4%-130%) after system implementation. The major contributor to this increase was the new electronic system. This increase has been sustained over the initial 12 full months after implementation. Under our reporting criteria, the overall rate of clinical events reported by any method was 471 events out of 55,382 cases or 0.85% (95% CI 0.78% to 0.93%). The new system collected 67% of these events (95% confidence interval 63%-71%). We demonstrate the implementation in an academic anesthesia department of a secure clinical event reporting system linked to an AIMS. 
The system enforces
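
    The reported event rate of 471 events in 55,382 cases (0.85%) can be checked with a standard normal-approximation confidence interval. The paper does not state which interval method it used, so the arithmetic below is an assumed reconstruction; its lower bound (0.77%) differs slightly from the reported 0.78%, presumably because the authors used an exact or Wilson interval.

```python
import math

# Reproduce the event-rate arithmetic from the reported counts with a
# normal-approximation (Wald) 95% confidence interval.
events, cases = 471, 55382
p = events / cases                       # point estimate of the event rate
se = math.sqrt(p * (1 - p) / cases)      # standard error of the proportion
lo_, hi_ = p - 1.96 * se, p + 1.96 * se  # 95% CI bounds
print(f"rate = {p:.2%}, 95% CI ({lo_:.2%}, {hi_:.2%})")
# → rate = 0.85%, 95% CI (0.77%, 0.93%)
```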

  15. 16 CFR 1101.34 - Reasonable steps to assure information release is “reasonably related to effectuating the...

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information release is âreasonably related to effectuating the purposes of the Actsâ the Commission administers. 1101.34 Section 1101.34 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS INFORMATION DISCLOSURE...

  16. Engineering Information Security The Application of Systems Engineering Concepts to Achieve Information Assurance

    CERN Document Server

    Jacobs, Stuart

    2011-01-01

    Information security is the act of protecting information from unauthorized access, use, disclosure, disruption, modification, or destruction. This book discusses why information security is needed and how security problems can have widespread impacts. It covers the complete security lifecycle of products and services, starting with requirements and policy development and progressing through development, deployment, and operations, and concluding with decommissioning. Professionals in the sciences, engineering, and communications fields will turn to this resource to understand the many legal,

  17. Developing and theoretically justifying innovative organizational practices in health information assurance

    Science.gov (United States)

    Collmann, Jeff R.

    2003-05-01

    This paper justifies and explains current efforts in the Military Health System (MHS) to enhance information assurance in light of the sociological debate between "Normal Accident" (NAT) and "High Reliability" (HRT) theorists. NAT argues that complex systems such as enterprise health information systems display multiple, interdependent interactions among diverse parts that potentially manifest unfamiliar, unplanned, or unexpected sequences that operators may not perceive or immediately understand, especially during emergencies. If the system functions rapidly with few breaks in time, space or process development, the effects of single failures ramify before operators understand or gain control of the incident thus producing catastrophic accidents. HRT counters that organizations with strong leadership support, continuous training, redundant safety features and "cultures of high reliability" contain the effects of component failures even in complex, tightly coupled systems. Building highly integrated, enterprise-wide computerized health information management systems risks creating the conditions for catastrophic breaches of data security as argued by NAT. The data security regulations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) implicitly depend on the premises of High Reliability Theorists. Limitations in HRT thus have implications for both safe program design and compliance efforts. MHS and other health care organizations should consider both NAT and HRT when designing and deploying enterprise-wide computerized health information systems.

  18. Evaluation framework for information system security assurance based on a CAE evidence reasoning evaluation model

    Institute of Scientific and Technical Information of China (English)

    王雨; 江常青; 林家骏; 袁文浩

    2011-01-01

    A CAE Evidence Reasoning Model is given to analyze security assurance evaluations based on the "Evaluation Framework for Information Systems Security Assurance". The model structure for this framework is given, with a mapping between the framework and the "Baseline for Classified Protection of Information System Security". Finally, this paper introduces an evaluation method that uses the CAE Model as a unified description framework and the evaluation framework as the evaluation standard.

  19. THE LAKE MICHIGAN MASS BALANCE PROJECT: QUALITY ASSURANCE PLAN FOR MATHEMATICAL MODELLING

    Science.gov (United States)

    This report documents the quality assurance process for the development and application of the Lake Michigan Mass Balance Models. The scope includes the overall modeling framework as well as the specific submodels that are linked to form a comprehensive synthesis of physical, che...

  20. Quality assurance in model based water management - review of existing practice and outline of new approaches

    NARCIS (Netherlands)

    Refsgaard, J.C.; Henriksen, H.; Harrar, B.; Scholten, H.; Kassahun, A.

    2005-01-01

    Quality assurance (QA) is defined as protocols and guidelines to support the proper application of models. In the water management context we classify QA guidelines according to how much focus is put on the dialogue between the modeller and the water manager as: (Type 1) Internal technical guideline

  1. An Exploration of Professional Culture Differentials and Their Potential Impact on the Information Assurance Component of Optical Transmission Networks Design

    Science.gov (United States)

    Cuthrell, Michael Gerard

    2011-01-01

    Optical transmission networks are an integral component of the critical infrastructures for many nations. Many people believe that optical transmission networks are impenetrable. In actuality, these networks possess weaknesses that can be exploited to bring about harm. An emerging Information Assurance (IA) industry has as its goals: to…

  2. Assurance specification documentation standard and Data Item Descriptions (DID). Volume of the information system life-cycle and documentation standards, volume 4

    Science.gov (United States)

    Callender, E. David; Steinbacher, Jody

    1989-01-01

    This is the fourth of five volumes on Information System Life-Cycle and Documentation Standards. This volume provides a well organized, easily used standard for assurance documentation for information systems and software, hardware, and operational procedures components, and related processes. The specifications are developed in conjunction with the corresponding management plans specifying the assurance activities to be performed.

  3. Quality Assurance in E-Learning: PDPP Evaluation Model and Its Application

    Science.gov (United States)

    Zhang, Weiyuan; Cheng, Y. L.

    2012-01-01

    E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of e-learning, however, is essential for the quality assurance of e-learning courses. This paper constructs a four-phase evaluation model for e-learning courses, which includes planning, development,…

  4. Development Design Model of Academic Quality Assurance at Private Islamic University Jakarta Indonesia

    Science.gov (United States)

    Suprihatin, Krebet; Bin Mohamad Yusof, Hj. Abdul Raheem

    2015-01-01

    This study aims to evaluate the practice of academic quality assurance in design model based on seven aspects of quality are: curriculum design, teaching and learning, student assessment, student selection, support services, learning resources, and continuous improvement. The design study was conducted in two stages. The first stage is to obtain…

  5. Book Review: Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2009-09-01

    Knapp, K.J. (Ed.) (2009). Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions. Hershey, NY: Information Science Reference. 434 + xxii pages, ISBN: 978-1-60566-326-5, US$195. Reviewed by Gary C. Kessler (gck@garykessler.net). I freely admit that this book was sent to me by the publisher for the expressed purpose of my writing a review and that I know several of the chapter authors. With that disclosure out of the way, let me say that the book is well worth the review (and I get to keep my review copy). The preface to the book cites the 2003 publication of The National Strategy to Secure Cyberspace by the White House, and the acknowledgement by the U.S. government that our economy and national security were fully dependent upon computers, networks, and the telecommunications infrastructure. This may have come as news to the general population, but it was a long overdue public statement to those of us in the industry. The FBI's InfraGard program and the formation of the National Infrastructure Protection Center (NIPC) pre-dated this report by at least a half-dozen years, so the report was hardly earthshattering. And the fact that the bulk of the telecom infrastructure is owned by the private sector is a less advertised fact. Nonetheless, reminding the community of these facts is always a Good Thing and provides the raison d'être of this book. (See PDF for full review.)

  6. Information Model for Product Modeling

    Institute of Scientific and Technical Information of China (English)

    焦国方; 刘慎权

    1992-01-01

    The key problems in product modeling for integrated CAD/CAM systems are the information structures and representations of products. They are taking on more and more important roles in engineering applications. Based on an investigation of engineering product information and from the viewpoint of the industrial process, this paper proposes information models and gives definitions of the framework of product information. The integration and consistency of product information are then discussed by introducing the entity and its instance. In summary, the information structures described in this paper have many advantages and properties helpful in engineering design.

  7. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    Directory of Open Access Journals (Sweden)

    M.Sangeetha

    2010-09-01

    Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems has become a major concern. As the software in today's systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software quality divides into two pieces: internal and external quality characteristics. External quality characteristics are those parts of a product that face its users, whereas internal quality characteristics are those that do not. Quality is conformance to product requirements and should be free. This research concerns the role of software quality. Software reliability is an important facet of software quality: it is the probability of failure-free operation of a computer program in a specified environment for a specified time. In software reliability modeling, the parameters of the model are typically estimated from the test data of the corresponding component. However, the widely used point estimators are subject to random variations in the data, resulting in uncertainties in these estimated parameters. This research describes a new approach to the problem of software testing. The approach is based on Bayesian graphical models and presents formal mechanisms for the logical structuring of the software testing problem, the probabilistic and statistical treatment of the uncertainties to be addressed, the test design and analysis process, and the incorporation and implication of test results. Once constructed, the models produced are dynamic representations of the software testing problem. It explains why the common test-and-fix software quality strategy is no longer adequate, and characterizes the properties of the quality strategy.

  8. Multinational Quality Assurance

    Science.gov (United States)

    Kinser, Kevin

    2011-01-01

    Multinational colleges and universities pose numerous challenges to the traditional models of quality assurance that are designed to validate domestic higher education. When institutions cross international borders, at least two quality assurance protocols are involved. To guard against fraud and abuse, quality assurance in the host country is…

  9. ASPECTS REGARDING THE ROLE OF INFORMATION TECHNOLOGIES IN THE ASSURANCE OF SUPPLY CHAIN MANAGEMENT PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Ilies Radu Ovidiu

    2013-07-01

    information technology such as Internet and ERP systems. Internet offers important opportunities to all partners from the supply chain to get information about consumption tendencies and changes in consumption request, virtual information about a product and the clients’ requests regarding the logistic services. As for ERP systems, it can be said that they mostly influence the designing of business processes, in order to assure coherence between them and the effective integration of different firm components. Even though the internal integration is an important aspect, an approach to management at the supply chain level, in an efficient and effective way, cannot be done without external integration with suppliers and clients. That is why we consider that companies belonging to the business field must focus on structuring key processes, to collaborate with their clients and suppliers and to integrate their internal systems, with the aim to support business operations.

  10. TU-G-BRD-02: Automated Systematic Quality Assurance Program for Radiation Oncology Information System Upgrades

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, B; Yi, B; Eley, J; Mutaf, Y; Rahman, S; D’Souza, W [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To: (1) describe an independent, automated, systematic software-based protocol for verifying clinical data accuracy/integrity for mitigation of data corruption/loss risks following radiation oncology information system (ROIS) upgrades; and (2) report on application of this approach in an academic/community practice environment. Methods: We propose a robust approach to perform quality assurance on the ROIS after an upgrade, targeting four data sources: (1) ROIS relational database; (2) ROIS DICOM interface; (3) ROIS treatment machine data configuration; and (4) ROIS-generated clinical reports. We investigated the database schema for differences between pre-/post-upgrade states. Paired DICOM data streams for the same object (such as RT-Plan/Treatment Record) were compared between pre-/post-upgrade states for data corruption. We examined machine configuration and related commissioning data files for changes and corruption. ROIS-generated treatment appointment and treatment parameter reports were compared to ensure patient encounter and treatment plan accuracy. This protocol was supplemented by an end-to-end clinical workflow test to verify essential ROI functionality and integrity of components interfaced during patient care chain of activities. We describe the implementation of this protocol during a Varian ARIA system upgrade at our clinic. Results: We verified 1,638 data tables with 2.4 billion data records. For 222 under-treatment patients, 605 DICOM RT plans and 13,480 DICOM treatment records retrieved from the ROIS DICOM interface were compared, with no differences in fractions, doses delivered, or treatment parameters. We identified 82 new data tables and 78 amended/deleted tables consistent with the upgrade. Reports for 5,073 patient encounters over a 2-week horizon were compared and were identical to those before the upgrade. Content in 12,237 xml machine files was compared, with no differences identified. Conclusion: An independent QA
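    The pre-/post-upgrade comparisons described above reduce to diffing two snapshots of the same records. A minimal sketch of that idea (the record IDs and field names below are invented for illustration, not taken from the Varian ARIA workflow):

    ```python
    # Snapshot key treatment parameters before the upgrade, re-extract
    # them after, and flag any record whose fields differ or vanished.
    def diff_snapshots(pre, post):
        """Return {record_id: [changed fields]} between two snapshots."""
        problems = {}
        for rec_id, fields in pre.items():
            after = post.get(rec_id)
            if after is None:
                problems[rec_id] = ["MISSING AFTER UPGRADE"]
                continue
            changed = [k for k in fields if fields[k] != after.get(k)]
            if changed:
                problems[rec_id] = changed
        return problems

    pre  = {"plan-001": {"fractions": 30, "dose_cgy": 6000},
            "plan-002": {"fractions": 5,  "dose_cgy": 2500}}
    post = {"plan-001": {"fractions": 30, "dose_cgy": 6000},
            "plan-002": {"fractions": 5,  "dose_cgy": 2000}}

    print(diff_snapshots(pre, post))  # → {'plan-002': ['dose_cgy']}
    ```

    An empty result is the desired outcome: every compared field survived the upgrade unchanged.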

  11. GB/T 20274-Based Information System Security Technical Assurance Evaluation and Computer Realization

    Institute of Scientific and Technical Information of China (English)

    安伟; 江常青; 林家骏; 张雪芹; 袁文浩

    2012-01-01

    The national standard GB/T 20274 defines the set of security technical assurance elements for evaluating information system security technical assurance, and provides security technical assurance metrics expressed as capability maturity model levels. This paper first quantifies the security technical assurance metric levels, then restates information system security technical assurance using mathematical concepts such as vectors and the vector infinity norm, and finally develops an effective algorithm for evaluating the capability maturity levels of information systems in security technical assurance. Simulation results show that the proposed algorithm can effectively evaluate the security technical assurance of information systems.
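    The infinity-norm view described above can be illustrated with a small sketch. Both vectors below are invented for illustration; GB/T 20274 defines the actual assurance element set and maturity levels:

    ```python
    # Hypothetical maturity scores (levels 1 to 5) for one component's
    # security technical assurance elements.
    actual   = [3, 4, 2, 5, 3]
    required = [4, 4, 3, 5, 4]

    # Per-element shortfall (0 where the required level is met).
    gap = [max(r - a, 0) for a, r in zip(actual, required)]

    # The vector infinity-norm of the gap is the largest single
    # shortfall: the component is only as mature as its weakest
    # assurance element.
    worst_gap = max(abs(g) for g in gap)
    print(worst_gap)  # → 1
    ```

    Using the infinity norm rather than, say, an average means a single badly deficient element cannot be masked by strong scores elsewhere.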

  12. Market Analysis & Strategies for the Launch of New Product in the Landscape of Information Security & Assurance by Nexor

    OpenAIRE

    Gupta, Debraj

    2011-01-01

    Nexor has been an active player in the information assurance market for over two decades. They identified a gap in the information security horizon and subsequently undertook a project, Carmen, to capitalize on the opportunity by bridging that gap. As an SME they have spent considerable resources, effort and time on this. Now the product is ready to be taken to market, but prior to that it is necessary to evaluate the market scenario and the possibilities of Carmen's success...

  13. IFP technologies for flow assurance. Modeling, thermal insulation, deposit prevention, additives, testing facilities

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Flow assurance has become one of the central topics covering the choice of a given field architecture and the specification of its production process. The relevant analysis includes the evaluation of risks and uncertainties associated with operational procedures, and contributes to a better estimate of the economics of a specific hydrocarbon production. This brochure presents an overview of innovative technologies, either available through IFP licensees or still under development by IFP and its industrial partners. The purpose of these technologies, related to Flow Assurance, is to secure the production operations, minimizing the down times, and reducing the production costs, particularly in the field of thermal insulation, deposit prevention and remediation. All these technologies benefit from the input of highly skilled teams from the Applied Mechanics, Applied Chemistry and Physical Chemistry Divisions of IFP, and rely on the design and use of sophisticated experimental laboratory and pilot equipment as well as advanced simulations and predictive modeling.

  14. El Aseguramiento de los Informes de Sostenibilidad: Diferencias Sustanciales con la Auditoría de Cuentas (Sustainability Reports Assurance: Substantial Differences with Financial Auditing)

    National Research Council Canada - National Science Library

    Amaia Zubiaurre

    2015-01-01

    Besides the growing interest of companies to communicate their commitment to sustainability, assurance of the information disclosed has increased, due to the interest of the stakeholders to know their reliability...

  15. A Generic Quality Assurance Model (GQAM) for successful e-health implementation in rural hospitals in South Africa.

    Science.gov (United States)

    Ruxwana, Nkqubela; Herselman, Marlien; Pottas, Dalenca

    2014-01-01

    Although e-health can potentially facilitate the management of scarce resources and improve the quality of healthcare services, implementation of e-health programs continues to fail or not fulfil expectations. A key contributor to the failure of e-health implementation in rural hospitals is poor quality management of projects. Based on a survey of 35 participants from five rural hospitals in the Eastern Cape Province of South Africa, and using a qualitative case study research methodology, this article attempted to answer the question: does the adoption of quality assurance (QA) models add value and help to ensure the success of information technology projects, especially in rural health settings? The study identified several weaknesses in the application of QA in these hospitals; however, findings also showed that the QA methods used, in spite of not being formally applied in a standardised manner, did nonetheless contribute to the success of some projects. The authors outline a generic quality assurance model (GQAM), developed to enhance the potential for successful acquisition of e-health solutions in rural hospitals, in order to improve the quality of care and service delivery in these hospitals.

  16. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

    The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM will never cover the entire field, because it is a particularly complex field that is only recently beginning to receive...

  17. Quality Assurance in E-Learning: PDPP Evaluation Model and its Application

    Directory of Open Access Journals (Sweden)

    Weiyuan Zhang

    2012-06-01

    E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of e-learning, however, is essential for the quality assurance of e-learning courses. This paper constructs a four-phase evaluation model for e-learning courses, which includes planning, development, process, and product evaluation, called the PDPP evaluation model. Planning evaluation includes market demand, feasibility, target student group, course objectives, and finance. Development evaluation includes instructional design, course material design, course Web site design, flexibility, student-student interaction, teacher/tutor support, technical support, and assessment. Process evaluation includes technical support, Web site utilization, learning interaction, learning evaluation, learning support, and flexibility. Product evaluation includes student satisfaction, teaching effectiveness, learning effectiveness, and sustainability. Using the PDPP model as a research framework, a purely e-learning course on Research Methods in Distance Education, developed by the School of Professional and Continuing Education at the University of Hong Kong (HKU SPACE) and jointly offered with the School of Distance Learning for Medical Education of Peking University (SDLME, PKU), was used as a case study. Sixty students from mainland China, Hong Kong, Macau, and Malaysia were recruited for this course. According to summative evaluation through a student e-learning experience survey, the majority of students were very satisfied/satisfied on all e-learning dimensions of this course. The majority of students thought that the learning effectiveness of this course was equivalent to, or even better than, face-to-face learning because of cross-border collaborative learning, student-centred learning, sufficient learning support, and learning flexibility. This study shows that a high quality of teaching and learning might be assured by

  18. Tank waste information network system II (TWINS2) year 2000 compliance assurance plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, M.R.

    1998-04-16

    The scope of this plan includes the Tank Waste Information Network System II (TWINS2), which contains the following major components: Tank Characterization Database (TCD), Tank Vapor Database (TVD), Data Source Access (DSA), automated Tank Characterization Report, Best-Basis Inventory Model (BBIM), and Tracker (corrective action tracking) function. The automated Tank Characterization Report application currently in development will also reside on the TWINS system, as will the BBIM. Critical inputs to TWINS come from the following databases: Labcore and SACS. No output flows from TWINS to these two databases.

  19. The role of reliability graph models in assuring dependable operation of complex hardware/software systems

    Science.gov (United States)

    Patterson-Hine, F. A.; Davis, Gloria J.; Pedar, A.

    1991-01-01

    The complexity of computer systems currently being designed for critical applications in the scientific, commercial, and military arenas requires the development of new techniques for utilizing models of system behavior in order to assure 'ultra-dependability'. The complexity of these systems, such as Space Station Freedom and the Air Traffic Control System, stems from their highly integrated designs containing both hardware and software as critical components. Reliability graph models, such as fault trees and digraphs, are used frequently to model hardware systems. Their applicability for software systems has also been demonstrated for software safety analysis and the analysis of software fault tolerance. This paper discusses further uses of graph models in the design and implementation of fault management systems for safety critical applications.
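    As a concrete illustration of the simplest case such graph models cover, here is a toy fault-tree evaluation under the usual assumption of independent basic events. The events and probabilities are invented for this sketch; the digraph models discussed in the paper go well beyond it:

    ```python
    # An OR gate fails if any input fails; an AND gate only if all do.
    def p_or(*ps):
        prod = 1.0
        for p in ps:
            prod *= (1.0 - p)
        return 1.0 - prod

    def p_and(*ps):
        prod = 1.0
        for p in ps:
            prod *= p
        return prod

    # Basic-event failure probabilities (hypothetical).
    p_cpu, p_cpu_backup, p_software, p_sensor = 0.01, 0.01, 0.02, 0.05

    # Top event: the system fails if the software fails, the sensor
    # fails, or BOTH redundant CPUs fail.
    p_top = p_or(p_software, p_sensor, p_and(p_cpu, p_cpu_backup))
    print(round(p_top, 6))  # → 0.069093
    ```

    Note how the AND gate models the hardware redundancy: the CPU pair contributes only 0.0001 to the top-event probability, far less than either single-point software or sensor failure.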

  20. Model of information diffusion

    CERN Document Server

    Lande, D V

    2008-01-01

    A system of cellular automata, which expresses the process of dissemination and publication of news among separate information resources, is described. A bell-shaped dependence of news diffusion across internet sources (web sites) coheres well with the real behavior of thematic data flows and, at local time spans, with known models, e.g., exponential and logistic ones.
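    A minimal mean-field sketch of such a diffusion automaton (our illustration with invented parameters, not the paper's exact model over linked web sites): each unpublished resource picks up the story with probability proportional to the fraction of resources that have already published it, which yields a logistic-like cumulative coverage and a bell-shaped count of new publications per step.

    ```python
    import random

    random.seed(7)

    # Each cell is an information resource; True means it has published
    # the story. N, P and STEPS are illustrative parameters.
    N, P, STEPS = 500, 0.5, 200
    published = [False] * N
    published[0] = True  # the originating source

    new_per_step = []
    for _ in range(STEPS):
        frac = sum(published) / N  # current coverage
        fresh = 0
        for i in range(N):
            if not published[i] and random.random() < P * frac:
                published[i] = True
                fresh += 1
        new_per_step.append(fresh)

    # The per-step count rises and then falls as the pool of
    # not-yet-publishing resources is exhausted -- the bell shape
    # noted in the abstract.
    print(sum(published), max(new_per_step))
    ```

    Replacing the global `frac` with a neighbourhood count turns this mean-field version into a true local cellular automaton, at the cost of slower, frontier-limited spread.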

  1. Information model of economy

    Directory of Open Access Journals (Sweden)

    N.S.Gonchar

    2006-01-01

    A new stochastic model of the economy is developed that takes into account that consumers' choices are dependent random fields. Axioms of such a model are formulated. The existence of random fields of consumer choice and of decision making by firms is proved. New notions of conditionally independent random fields and random fields of evaluation of information by consumers are introduced. Using the above-mentioned random fields, the random fields of consumer choice and decision making by firms are constructed. The theory of economic equilibrium is developed.

  2. Geo-Information Logistical Modeling

    Directory of Open Access Journals (Sweden)

    Nikolaj I. Kovalenko

    2014-11-01

    This paper examines geo-information logistical modeling. The author illustrates the similarities between geo-informatics and logistics in the area of spatial objectives; shows that applying geo-data expands the potential of logistics; presents geo-information modeling as the basis of logistical modeling; describes the types of geo-information logistical modeling; and describes situational geo-information modeling as a variety of geo-information logistical modeling.

  3. Quality assurance of high education

    Directory of Open Access Journals (Sweden)

    A. M. Aleksankov

    2016-01-01

    European and Russian approaches in quality assurance will not appear. It means that ways of harmonizing European and Russian requirements for study programmes' quality assurance can be found, and a logical part of implementing international quality management schemes will be the accreditation of Russian study programmes by international organizations and networks. To ensure the effectiveness of such tasks, it is necessary to develop appropriate tools that help to formalize and systematize the procedures of study programmes' quality assurance while taking into account the requirements of European standards. The experience of St. Petersburg Polytechnic University (SPbPU) in quality assurance of study programmes is discussed, in particular the development and appraisal of a technique for monitoring study programmes and of a model for on-line quality assurance of study programmes that reflects the requirements of European standards, both created within the TEMPUS EQUASP project («On-line (Electronic) Quality Assurance of Study Programmes») with the participation of SPbPU. Implementation of the proposed tools ensures the integrity and authenticity of information on all aspects of the educational process, fulfillment of all-European requirements on study programme accreditation, and harmonization of the Russian and European higher education systems, and thus forms the basis for accreditation of study programmes by international organizations and networks. The model for on-line quality assurance of study programmes is a powerful tool that allows the process of quality assurance of study programmes to be brought into accord with European standards and guidelines, improves the quality of programmes, and increases their transparency and comparability.

  4. The Development of Evaluation Model for Internal Quality Assurance System of Dramatic Arts College of Bunditpattanasilpa Institute

    Science.gov (United States)

    Sinthukhot, Kittisak; Srihamongkol, Yannapat; Luanganggoon, Nuchwana; Suwannoi, Paisan

    2013-01-01

    The research purpose was to develop an evaluation model for the internal quality assurance system of the dramatic arts College of Bunditpattanasilpa Institute. The Research and Development method was used as research methodology which was divided into three phases; "developing the model and its guideline", "trying out the actual…

  5. Federal Plan for Cyber Security and Information Assurance Research and Development

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — Powerful personal computers, high-bandwidth and wireless networking technologies, and the widespread use of the Internet have transformed stand-alone computing...

  6. Information Assurance as a System of Systems in the Submarine Force

    Science.gov (United States)

    2013-09-01

    security, viruses, equipment, or personnel support, and so on. As long as the submarine could take advantage of increased productivity and information...files on servers creating a digital landfill would be considered to be as problematic as physical trash building up in the operating spaces (COMSUBFOR)...outlines how an organization collects, uses, and protects the data stored within the digital landfill for command and control information. An independent

  7. DoD Information Assurance Certification and Accreditation Process (DIACAP) Survey and Decision Tree

    Science.gov (United States)

    2011-07-01

    CVC Compliance and Validation Certification; DAA designated accrediting authority; DATO denial of authorization to operate; DIACAP DoD Information...standard based on implementation of the best practices listed in paragraph 2.3. c. Direct the DSG to rename the Data Protection Committee to the...Information Grid (GIG)-based environment. Figure A-1. DoD IA program management. 1.1.1 DIACAP Background. a. Interim DIACAP signed 6 July 2006

  8. Assurance Cases

    Science.gov (United States)

    2015-01-26

    Assurance Cases. Charles B. Weinstock, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213, 26 January 2015. © 2015 Carnegie Mellon University.

  9. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    Directory of Open Access Journals (Sweden)

    Kent Peter

    2011-07-01

    Background: Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and transforming that information into quantitative data. However, this process is frequently required in research and quality assurance contexts. The purpose of this study was to examine inter-rater reproducibility (agreement and reliability) among an inexperienced group of clinicians in extracting spinal pathoanatomic information from radiologist-generated MRI narrative reports. Methods: Twenty MRI narrative reports were randomly extracted from an institutional database. A group of three physiotherapy students independently reviewed the reports and coded the presence of 14 common pathoanatomic findings using a categorical electronic coding matrix. Decision rules were developed after initial coding in an effort to resolve ambiguities in narrative reports. This process was repeated a further three times using separate samples of 20 MRI reports until no further ambiguities were identified (total n = 80). Reproducibility between trainee clinicians and two highly trained raters was examined in an arbitrary coding round, with agreement measured using percentage agreement and reliability measured using unweighted kappa (k). Reproducibility was then examined in another group of three trainee clinicians who had not participated in the production of the decision rules, using another sample of 20 MRI reports. Results: The mean percentage agreement for paired comparisons between the initial trainee clinicians improved over the four coding rounds (97.9-99.4%), although the greatest improvement was observed after the first introduction of coding rules. High inter-rater reproducibility was observed between trainee clinicians across 14 pathoanatomic categories over the
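    The two reproducibility measures the study reports are straightforward to compute. A minimal sketch for one rater pair coding one binary pathoanatomic finding (the ratings below are invented for illustration):

    ```python
    from collections import Counter

    # 1 = finding coded present, 0 = coded absent, one entry per report.
    rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

    n = len(rater_a)
    # Percentage agreement: the fraction of reports coded identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: product of each rater's marginal proportions,
    # summed over categories.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[k] / n) * (cb[k] / n) for k in set(ca) | set(cb))

    # Unweighted Cohen's kappa corrects agreement for chance.
    kappa = (observed - expected) / (1 - expected)
    print(observed, round(kappa, 3))  # → 0.8 0.583
    ```

    The gap between the two numbers shows why the study reports both: 80% raw agreement shrinks to a moderate kappa once chance agreement on this two-category coding is removed.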

  10. El Aseguramiento de los Informes de Sostenibilidad: Diferencias Sustanciales con la Auditoría de Cuentas (Sustainability Reports Assurance: Substantial Differences with Financial Auditing

    Directory of Open Access Journals (Sweden)

    Amaia Zubiaurre

    2015-12-01

    Besides the growing interest of companies in communicating their commitment to sustainability, assurance of the disclosed information has increased, due to the interest of stakeholders in knowing its reliability. Initially, we explain the concept and benefits of sustainability reporting assurance. Subsequently, we focus on the differences between financial auditing and the assurance of sustainability reports, and describe the main international assurance standards. Finally, we present the main criticisms of assurance and some proposals for improvement. DOWNLOAD THIS PAPER FROM SSRN: http://ssrn.com/abstract=2690158

  11. Information systems for administration, clinical documentation and quality assurance in an Austrian disease management programme.

    Science.gov (United States)

    Beck, Peter; Truskaller, Thomas; Rakovac, Ivo; Bruner, Fritz; Zanettin, Dominik; Pieber, Thomas R

    2009-01-01

    5.9% of the Austrian population is affected by diabetes mellitus. Disease management is a structured treatment approach that is well suited to diabetes mellitus and is often supported by information technology. This article describes the information systems developed and implemented in the Austrian disease management programme for type 2 diabetes. Several workflows for administration as well as for clinical documentation have been implemented using the Austrian e-Health infrastructure. De-identified clinical data are available for creating feedback reports for providers and for programme evaluation.

  12. In vitro-in vivo Pharmacokinetic correlation model for quality assurance of antiretroviral drugs

    Directory of Open Access Journals (Sweden)

    Ricardo Rojas Gómez

    2015-10-01

    Full Text Available Introduction: In vitro-in vivo pharmacokinetic correlation models (IVIVC) are a fundamental part of the drug discovery and development process. The ability to accurately predict the in vivo pharmacokinetic profile of a drug based on in vitro observations can have several applications during a successful development process. Objective: To develop a comprehensive model to predict the in vivo absorption of antiretroviral drugs based on permeability studies and in vitro and in vivo solubility, and to demonstrate its correlation with the pharmacokinetic profile in humans. Methods: Analytical tools to test the biopharmaceutical properties of stavudine, lamivudine and zidovudine were developed. The kinetics of dissolution, permeability in Caco-2 cells and pharmacokinetics of absorption in rabbits and healthy volunteers were evaluated. Results: The cumulative areas under the curve (AUC) obtained in the permeability study with Caco-2 cells, the dissolution study and the pharmacokinetics in rabbits correlated with the cumulative AUC values in humans. These results demonstrated a direct relation between in vitro data and absorption, both in humans and in the in vivo model. Conclusions: The analytical methods and procedures applied to the development of an IVIVC model showed a strong correlation among themselves. These IVIVC models are proposed as alternative, cost-effective methods to evaluate the biopharmaceutical properties that determine the bioavailability of a drug, and their applications include the development process, quality assurance, bioequivalence studies and pharmacovigilance.
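    The correlation step described above, relating cumulative in vitro AUCs to in vivo AUCs, can be sketched as follows. The concentration-time points and AUC values are illustrative assumptions, not data from the study.

```python
def auc_trapezoid(times, conc):
    """Cumulative area under a concentration-time curve (trapezoidal rule)."""
    return sum((conc[i] + conc[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def pearson_r(x, y):
    """Pearson correlation between paired in vitro and in vivo AUC values."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical cumulative AUCs for three drugs: in vitro dissolution vs. human
auc_in_vitro = [42.0, 55.0, 61.0]
auc_in_vivo = [40.5, 53.0, 60.2]
r = pearson_r(auc_in_vitro, auc_in_vivo)   # close to 1 for a strong level-A IVIVC
```

    A correlation near 1 between the two AUC series is the kind of evidence the authors use to argue that the in vitro system predicts in vivo absorption.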

  13. An Informational Analysis and Communications Squadron Survey of Cyberspace Mission Assurance

    Science.gov (United States)

    2010-06-01

    However, given those limitations, many resources were available. Among the most relevant were COSO, ITGI, ITIL, and ISO. COBIT may provide a mechanism to illustrate the benefits to those who control the budgets. ITIL is the Information Technology Infrastructure Library.

  14. ASPECTS REGARDING THE ROLE OF INFORMATION TECHNOLOGIES IN THE ASSURANCE OF SUPPLY CHAIN MANAGEMENT PERFORMANCE

    OpenAIRE

    Ilies Radu Ovidiu; Salagean Horatiu Catalin; Balc Bogdan; Gherman Mihai

    2013-01-01

    This paper is intended to outline the importance of e-logistics programs, based on new information technologies and successful e-business applications, for Romanian companies that operate in the production and services fields, namely producers, suppliers or distributors. The redesign of the logistic system and the reconfiguration of supply chain management (SCM) challenge firms, especially small ones, to explore new e-business applications, on the basis of feasibi...

  15. STEPP: A Grounded Model to Assure the Quality of Instructional Activities in e-Learning Environments

    Directory of Open Access Journals (Sweden)

    Hamdy AHMED ABDELAZIZ

    2013-07-01

    Full Text Available The present theoretical paper aims to develop a grounded model for designing instructional activities appropriate to e-learning and online learning environments. The suggested model is guided by the learning principles of cognitivism, constructivism, and connectivism to help online learners construct meaningful experiences and move from knowledge acquisition to knowledge creation. The proposed model consists of five dynamic and grounded domains that assure the quality of designing and using e-learning activities: the social domain; the technological domain; the epistemological domain; the psychological domain; and the pedagogical domain. Each of these domains needs four types of presence to reflect the design and application process of e-learning activities: cognitive presence, human presence, psychological presence and mental presence. Applying the proposed model (STEPP) throughout all online and adaptive e-learning environments may improve the process of designing and developing e-learning activities to be used as mindtools for current and future learners.

  16. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in operating information equipment and can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, intercept and reproduction based on electromagnetic radiation, and ana...

  17. Editorial: Special issue on resources for the computer security and information assurance curriculum: Issue 1Curriculum Editorial Comments, Volume 1 and Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Frincke, Deb; Ouderkirk, Steven J.; Popovsky, Barbara

    2006-12-28

    This is a pair of articles to be used as the cover editorials for a special edition of the Journal of Educational Resources in Computing (JERIC) Special Edition on Resources for the Computer Security and Information Assurance Curriculum, volumes 1 and 2.

  18. Least Information Modeling for Information Retrieval

    CERN Document Server

    Ke, Weimao

    2012-01-01

    We proposed a Least Information Theory (LIT) to quantify the meaning of information in probability distribution changes, from which a new information retrieval model was developed. We observed several important characteristics of the proposed theory and derived two quantities in the IR context for document representation. Given probability distributions in a collection as prior knowledge, LI Binary (LIB) quantifies the least information due to the binary occurrence of a term in a document, whereas LI Frequency (LIF) measures the least information based on the probability of drawing a term from a bag of words. Three fusion methods were also developed to combine the LIB and LIF quantities for term weighting and document ranking. Experiments on four benchmark TREC collections for ad hoc retrieval showed that the LIT-based methods demonstrated very strong performance compared to classic TF*IDF and BM25, especially for verbose queries and hard search topics. The least information theory offers a new approach to measuring semantic qua...
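    The abstract does not give the LIB/LIF formulas, but the classic Okapi BM25 baseline it compares against can be sketched directly. The tiny corpus and query below are hypothetical.

```python
import math

def bm25(query, doc, docs, k1=1.2, b=0.75):
    """Okapi BM25 score of `doc` for `query`; docs are lists of tokens."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N          # average document length
    score = 0.0
    for term in set(query):
        df = sum(term in d for d in docs)          # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)
        tf = doc.count(term)                       # term frequency in this doc
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [["least", "information", "theory"],
          ["retrieval", "model", "evaluation"],
          ["information", "retrieval", "model"]]
q = ["information", "retrieval"]
scores = [bm25(q, d, corpus) for d in corpus]      # third doc matches both terms
```

    The document containing both query terms scores highest, which is the ranking behaviour any LIT-based method would be benchmarked against.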

  19. Information behaviour: models and concepts

    Directory of Open Access Journals (Sweden)

    Polona Vilar

    2005-01-01

    Full Text Available The article presents an overview of the research area of information behaviour. Information behaviour is defined as the behaviour of individuals in relation to information sources and channels, which results from their information needs, and encompasses passive and active searching for information, and its use. Theoretical foundations are presented, as well as some fundamental conceptual models of information behaviour and related concepts: information seeking behaviour, which occurs in active, purposeful searching for information, regardless of the information source used; and information searching behaviour, which represents a micro-level of information seeking behaviour, and is expressed by those individuals who interact with information retrieval systems.

  20. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  1. Gender and information and communication technologies (ICT) anxiety: male self-assurance and female hesitation.

    Science.gov (United States)

    Broos, Agnetha

    2005-02-01

    This article presents the results of a quantitative study (n = 1,058) of the gender divide in ICT attitudes. In general, females had more negative attitudes towards computers and the Internet than did males. Results indicate a positive relationship between ICT experience and ICT attitudes. This experience is measured by length of time using a computer and by self-perceived computer and Internet experience. Further analyses of the impact of gender on this correlation of ICT experience and ICT attitudes were conducted by means of a multivariate model. General Linear Model (GLM) analysis revealed significant effects of gender, computer use, and self-perceived computer experience on computer anxiety, as well as several significant interaction effects. Males were found to have less computer anxiety than females; respondents who have used computers for a longer period of time and respondents with a higher self-perception of experience also show less computer anxiety. However, the GLM plot shows that the influence of computer experience works in different ways for males and females. Computer experience has a positive impact on decreasing computer anxiety for men, but a similar effect was not found for women. The model was also tested for computer-liking and Internet-liking factors.
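    The gender-by-experience interaction the GLM plot revealed can be illustrated with a minimal 2 x 2 cell-means sketch. The anxiety means below are invented for illustration, not the study's estimates.

```python
# Hypothetical mean computer-anxiety scores (higher = more anxious)
# in a 2 (gender) x 2 (self-perceived experience) design.
means = {
    ("male", "low"): 3.4, ("male", "high"): 2.1,
    ("female", "low"): 3.6, ("female", "high"): 3.5,
}

def simple_effect(gender):
    """Change in anxiety from low to high experience within one gender."""
    return means[(gender, "high")] - means[(gender, "low")]

# A non-zero difference between the simple effects is the gender x experience
# interaction: experience reduces anxiety for men but barely for women.
interaction = simple_effect("male") - simple_effect("female")
```

    In a full GLM the same quantity appears as the coefficient of the gender-by-experience product term; the sketch only shows why unequal simple effects imply an interaction.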

  2. Objective information about energy models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, D.R. (Energy Information Administration, Washington, DC (United States))

    1993-01-01

    This article describes the Energy Information Administration's program to develop objective information about its modeling systems without hindering model development and applications, and within budget and human resource constraints. 16 refs., 1 fig., 2 tabs.

  3. Information Retrieval Models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Göker, Ayse; Davies, John

    2009-01-01

    Many applications that handle information on the internet would be completely inadequate without the support of information retrieval technology. How would we find information on the world wide web if there were no web search engines? How would we manage our email without spam filtering? Much of the

  5. Model for Electromagnetic Information Leakage

    Directory of Open Access Journals (Sweden)

    Mao Jian

    2013-09-01

    Full Text Available Electromagnetic leakage occurs in operating information equipment and can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combines electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, interception and reproduction based on electromagnetic radiation, and analyzes the amount of leaked information with formulas.
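    The paper's own leakage formulas are not reproduced in the abstract. As a stand-in, one standard information-theoretic bound on any radiated channel is the Shannon capacity, sketched here with hypothetical interceptor numbers.

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A hypothetical interceptor: 1 kHz of usable bandwidth at a linear SNR of 3.
c = channel_capacity(1_000, 3)   # 2000.0 bits/s of leaked information, at most
```

    The capacity bounds how much information an eavesdropper could in principle recover from the radiated signal, whatever the equipment is actually transmitting.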

  6. An Analysis of Department of Defense Instruction 8500.2 'Information Assurance (IA) Implementation.'

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip LaRoche

    2012-01-01

    The Department of Defense (DoD) provides its standard for information assurance in its Instruction 8500.2, dated February 6, 2003. This Instruction lists 157 'IA Controls' for nine 'baseline IA levels.' Aside from distinguishing IA Controls that call for elevated levels of 'robustness' and grouping the IA Controls into eight 'subject areas,' 8500.2 does not examine the nature of this set of controls, determining, for example, which controls do not vary in robustness, how this set of controls compares with other such sets, or even which controls are required for all nine baseline IA levels. This report analyzes (1) the IA Controls, (2) the subject areas, and (3) the baseline IA levels. For example, it notes that there are only 109 core IA Controls (referred to here as 'ICGs'), that 43 of these core IA Controls apply without variation to all nine baseline IA levels, and that an additional 31 apply with variations. This report maps the IA Controls of 8500.2 to the controls in NIST 800-53 and ITGI's CoBIT. The result of this analysis and mapping serves as a companion to 8500.2. (An electronic spreadsheet accompanies this report.)

  7. A decision model for financial assurance instruments in the upstream petroleum sector

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, D. [State University of Campinas (UNICAMP) (Brazil). Dept. of Geology and Natural Resources; Suslick, S. [State University of Campinas (UNICAMP) (Brazil). Dept. of Geology and Natural Resources; Center for Petroleum Studies, Campinas (Brazil); Farley, J.; Costanza, R.; Krivov, S. [Maryland Univ., Solomons, MD (United States). Inst. for Ecological Economics

    2004-07-01

    The main objective of this paper is to deepen the discussion regarding the application of financial assurance instruments (bonds) in the upstream oil sector. The paper also attempts to explain the current choice of instruments within the sector. The concepts of environmental damage and the internalization of environmental and regulatory costs are briefly explored. Bonding mechanisms are presently being adopted by several governments with the objective of guaranteeing the availability of funds for end-of-lease operations. Regulators are mainly concerned with the prospect of inheriting liabilities from lessees. Several forms of bonding instruments currently available were identified and a new instrument classification was proposed. Ten commonly used instruments were selected and analyzed from the perspective of both regulators and industry (surety bonds, paid-in and periodic-payment collateral accounts, letters of credit, self-guarantees, investment-grade securities, real estate collateral, insurance policies, pools, and special funds). A multiattribute value function model was then proposed to examine current instrument preferences. Preliminary simulations confirm the current scenario, in which regulators are likely to require surety bonds, letters of credit, and periodic-payment collateral accounts. (author)
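    A multiattribute value function of the kind the paper proposes can be sketched as a weighted additive score over instrument attributes. The attributes, weights and instrument scores below are illustrative assumptions, not the paper's calibration.

```python
# Illustrative attribute weights from one stakeholder's perspective (sum to 1).
weights = {"cost": 0.3, "liquidity": 0.2,
           "security_for_regulator": 0.35, "ease_of_release": 0.15}

# Hypothetical 0-1 scores for three of the ten instruments the paper analyzes.
instruments = {
    "surety_bond": {"cost": 0.7, "liquidity": 0.6,
                    "security_for_regulator": 0.9, "ease_of_release": 0.8},
    "letter_of_credit": {"cost": 0.5, "liquidity": 0.8,
                         "security_for_regulator": 0.8, "ease_of_release": 0.7},
    "self_guarantee": {"cost": 0.9, "liquidity": 0.4,
                       "security_for_regulator": 0.3, "ease_of_release": 0.9},
}

def value(scores):
    """Additive multiattribute value: weighted sum of attribute scores."""
    return sum(weights[a] * scores[a] for a in weights)

# Rank instruments from most to least preferred under these weights.
ranking = sorted(instruments, key=lambda i: value(instruments[i]), reverse=True)
```

    With weights that emphasize security for the regulator, the surety bond comes out on top, consistent with the preference pattern the simulations report; different stakeholders would supply different weights.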

  8. A model of quality assurance and quality improvement for post-graduate medical education in Europe.

    Science.gov (United States)

    Da Dalt, Liviana; Callegaro, Silvia; Mazzi, Anna; Scipioni, Antonio; Lago, Paola; Chiozza, Maria L; Zacchello, Franco; Perilongo, Giorgio

    2010-01-01

    The issue of quality assurance (QA) and quality improvement (QI) is becoming of paramount importance worldwide, since the quality of medical education is intimately related to the quality of health care. To describe a model for implementing a system of internal QA and QI within a post-graduate paediatric training programme based on the ISO 9001:2000 standard. Following the ISO 9001:2000 standard, the curriculum was managed as a series of interrelated processes, and their level of function was monitored by objective indicators elaborated ad hoc. The training programme was divided into 19 interlinked processes, 15 related procedures and 24 working instructions. All these materials, along with the quality policy, the mission, the strategies and the values, were made publicly available. Based on the measurable indicators developed to monitor some of the processes, areas of weakness of the system were objectively identified and QI actions implemented accordingly. The appropriateness of all this allowed the programme to achieve an official ISO 9001:2000 certification. The application of the ISO 9001:2000 standard served to develop an internal QA and QI system and to meet most of the standards developed for QA in higher and medical education.

  9. UST Financial Assurance Information

    Data.gov (United States)

    U.S. Environmental Protection Agency — Subtitle I of the Resource Conservation and Recovery Act, as amended by the Hazardous Waste Disposal Act of 1984, brought underground storage tanks (USTs) under...

  10. Advanced Information Assurance Handbook

    Science.gov (United States)

    2004-03-01

    to another host and to the console:
    kern.* /var/adm/kernel
    kern.crit @finlandia
    The first statement directs all kernel messages to /var/adm/kernel. The second statement directs all kernel messages of priority crit and higher to the remote host finlandia. This rule redirects messages to a remote host called finlandia, which is useful especially in a cluster of machines where all

  11. Modeling spatiotemporal information generation

    NARCIS (Netherlands)

    Scheider, Simon; Gräler, Benedikt; Stasch, Christoph; Pebesma, Edzer

    2016-01-01

    Maintaining knowledge about the provenance of datasets, that is, about how they were obtained, is crucial for their further use. Contrary to what the overused metaphors of ‘data mining’ and ‘big data’ are implying, it is hardly possible to use data in a meaningful way if information about sources an

  12. A Systematic Comprehensive Computational Model for Stake Estimation in Mission Assurance: Applying Cyber Security Econometrics System (CSES) to Mission Assurance Analysis Protocol (MAAP)

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K [ORNL; Sheldon, Frederick T [ORNL; Grimaila, Michael R [ORNL

    2010-01-01

    In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field-support deficiencies threatening mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to the Carnegie Mellon University Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
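    One published formulation of this style of stake estimation computes each stakeholder's mean failure cost as a chain of matrix products: stakes per requirement, requirement dependency on components, component impact of threats, and threat probabilities. The matrices below are invented toy numbers, and the exact factorization is a hedged reading of the CSES literature, not a reproduction of this paper.

```python
def matmul(A, B):
    """Plain nested-list matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Toy example (hypothetical): 2 stakeholders, 2 requirements, 2 components, 2 threats
ST = [[100.0, 50.0],   # loss ($k) per stakeholder if each requirement is violated
      [80.0, 20.0]]
DP = [[0.6, 0.4],      # P(requirement violated | component fails)
      [0.3, 0.7]]
IM = [[0.5, 0.2],      # P(component fails | threat materializes)
      [0.1, 0.9]]
PT = [[0.01],          # P(threat materializes) per unit time
      [0.05]]

# Mean failure cost per stakeholder: column vector of expected losses.
MFC = matmul(matmul(matmul(ST, DP), IM), PT)
```

    The resulting vector ranks stakeholders by exposure, which is the input a mission assurance analysis would use to prioritize mitigations.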

  13. Internet Network Resource Information Model

    Institute of Scientific and Technical Information of China (English)

    陈传峰; 李增智; 唐亚哲; 刘康平

    2002-01-01

    The foundation of any network management system is a database that contains information about the network resources relevant to the management tasks. A network information model is an abstraction of network resources, including both managed resources and managing resources. In the SNMP-based management framework, management information is defined almost exclusively from a "device" viewpoint, namely, managing a network is equivalent to managing a collection of individual nodes. Aiming at making use of recent advances in distributed computing and in object-oriented analysis and design, the Internet management architecture can also be based on the Open Distributed Processing Reference Model (RM-ODP). The purpose of this article is to provide an Internet network resource information model. First, a layered management information architecture is discussed. Then the Internet network resource information model is presented. The information model is specified using Object-Z.

  14. Model for deployment of a Quality Assurance System in the nuclear fuel cycle facilities using Project Management techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lage, Ricardo F.; Ribeiro, Saulo F.Q., E-mail: rflage@gmail.com, E-mail: quintao.saulo@gmail.com [Industrias Nucleares do Brasil (INB), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear safety is the main goal in any nuclear facility. In this sense, the norm CNEN-NN-1.16 classifies quality assurance as a management system to be deployed and implemented by the organization to achieve safety goals. Quality assurance is a set of systematic and planned actions necessary to provide adequate confidence that a structure, system, component or installation will work satisfactorily in service. Hence, the Quality Assurance System (QAS) is a complete and comprehensive methodology, going far beyond a quality management plan from the perspective of project management. The fundamental QAS requirements cover all activities that influence quality, involving organization, human resources, procurement, nuclear safety, projects, procedures and communication. Coordination of all these elements requires a great effort by the team responsible, because it usually involves different areas and different levels of hierarchy within the organization. The objectives and desired benefits should be well defined, so that everyone understands what is to be achieved and how to achieve it. The support of senior management is critical at this stage, providing the guidelines and resources necessary to get the job done clearly and efficiently, on time, on cost and within scope. The methodology of project management processes can be applied to facilitate and expedite the implementation of this system. Many of the principles of the QAS are correlated with knowledge areas of project management. The proposed model for implementation of a QAS in nuclear fuel cycle facilities considered the best project management practices according to the Project Management Body of Knowledge (PMBOK - 5th edition) of the Project Management Institute (PMI). This knowledge is considered very good practice around the world. Once the model was defined, the deployment process became more practical and efficient, providing reduction in deployment time and better management of human

  15. Quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  16. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.
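    As a minimal instance of the probability models such a survey covers for text classification, here is a multinomial Naive Bayes sketch with add-one smoothing; the two-document training set is invented.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Multinomial Naive Bayes; docs is a list of (tokens, label) pairs."""
    class_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return class_counts, word_counts, vocab, len(docs)

def predict(model, tokens):
    """Return the class with the highest smoothed log-posterior."""
    class_counts, word_counts, vocab, n = model
    best, best_lp = None, -math.inf
    for c in class_counts:
        lp = math.log(class_counts[c] / n)            # log prior
        total = sum(word_counts[c].values())
        for t in tokens:                              # add-one smoothed likelihoods
            lp += math.log((word_counts[c][t] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

train = [(["great", "retrieval", "results"], "pos"),
         (["poor", "noisy", "results"], "neg")]
model = train_nb(train)
```

    Despite its independence assumption, this generative model is a standard baseline for the text classification task listed above.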

  17. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  18. Information Theory: a Multifaceted Model of Information

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2003-06-01

    Full Text Available A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new approach, which is called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. This theory is built as a system of two classes of principles (ontological and sociological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are given by Shannon's quantity of information, the algorithmic quantity of information, or the volume of information. It is demonstrated that all other known directions of information theory may be treated inside the general theory of information as its particular cases.
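    Shannon's quantity of information, cited above as one example measure, can be computed directly; a minimal sketch:

```python
import math

def self_information(p):
    """Shannon information content of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Shannon's quantity of information (entropy) of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

h_fair_coin = entropy([0.5, 0.5])   # 1.0 bit: maximal uncertainty
h_biased = entropy([0.9, 0.1])      # less than 1 bit: less uncertainty
```

    The algorithmic quantity of information (Kolmogorov complexity) mentioned alongside it is uncomputable in general, which is why entropy-style measures dominate in practice.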

  19. Printed Circuit Board Quality Assurance

    Science.gov (United States)

    Sood, Bhanu

    2016-01-01

    PCB Assurance Summary: PCB assurance activities are informed by risk in the context of the Project. Lessons are being applied across Projects for continuous improvement. Newer component technologies and smaller, higher-pitch devices lead to tighter and more demanding PCB designs, identifying new research areas. New materials, designs, structures and test methods.

  20. Construct the Assurance System of Network Information Transmission%网络信息传播保障体系构建

    Institute of Scientific and Technical Information of China (English)

    孟庆兰

    2012-01-01

    On the basis of analyzing the problems existing in network information transmission today, this paper proposes, from a multi-disciplinary perspective, an assurance system for network information transmission that combines technology, morality, and law.

  1. A new quality assurance package for hospital palliative care teams: the Trent Hospice Audit Group model.

    Science.gov (United States)

    Hunt, J; Keeley, V L; Cobb, M; Ahmedzai, S H

    2004-07-19

    Cancer patients in hospitals are increasingly cared for jointly by palliative care teams, as well as oncologists and surgeons. There has been a considerable growth in the number and range of hospital palliative care teams (HPCTs) in the United Kingdom. HPCTs can include specialist doctors and nurses, social workers, chaplains, allied health professionals and pharmacists. Some teams work closely with existing cancer multidisciplinary teams (MDTs) while others are less well integrated. Quality assurance and clinical governance requirements have an impact on the monitoring of such teams, but so far there is no standardised way of measuring the amount and quality of HPCTs' workload. Trent Hospice Audit Group (THAG) is a multiprofessional research group, which has been developing standards and audit tools for palliative care since the 1990s. These follow a format of structure-process-outcome for standards and measures. We describe a collaborative programme of work with HPCTs that has led to a new set of standards and audit tools. Nine HPCTs participated in three rounds of consultation, piloting and modification of standard statements and tools. The final pack of HPCT quality assurance tools covers: policies and documentation; medical notes review; questionnaires for ward-based staff. The tools measure the HPCT workload and casemix; the views of ward-based staff on the supportive role of the HPCT and the effectiveness of HPCT education programmes, particularly in changing practice. The THAG HPCT quality assurance pack is now available for use in cancer peer review.

  2. INFORMATION MODEL OF SOCIAL TRANSFORMATIONS

    Directory of Open Access Journals (Sweden)

    Мария Васильевна Комова

    2013-09-01

    Full Text Available Social transformation is considered as a process of qualitative change in society, creating a new level of organization in all areas of life, in different social formations and societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on understanding them as the consequence of information exchange processes in society. After defining the conceptual model of the study, the author uses the following methods: description, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of social transformation as well. The information model of social transformations is based on the definition of social transformation as change in the information that functions in society's information space. The study of social transformations is the study of the information flows circulating in society, which are characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, the nature, course and consequences of which are affected by factors representing the whole complex of material objects. The integrated information model of social transformations foresees the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations the author uses the notions of "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold makes it possible to simulate the course of social development, to predict the

  3. Possibilities for Using TAM and Technology Frames Models to Assess the Acceptance of New Technologies in the Chilean Higher Education Quality Assurance

    Directory of Open Access Journals (Sweden)

    Luis González-Bravo

    2015-05-01

    Full Text Available This essay reviews the importance of assessing the degree of acceptance of new technologies in Chilean higher education institutions as an input for managing quality assurance. The Technology Acceptance and Technology Frames models are described, emphasizing their benefits in this field. Understanding and facilitating the process of acceptance of new technologies in organizations, by identifying those elements which hinder it, allows improving the implementation of quality assurance mechanisms in order to make the educational process more efficient and effective.

  4. Benchmarking Software Assurance Implementation

    Science.gov (United States)

    2011-05-18

    The chicken (a.k.a. Process Focused Assessment) – Management Systems (ISO 9001, ISO 27001, ISO 2000) – Capability Maturity Models (CMMI…Assurance PRM, RMM, Assurance for CMMI) – Lifecycle Processes (ISO/IEEE 15288, ISO/IEEE 12207) – COBIT, ITIL, MS SDL, OSAMM, BSIMM…The egg (a.k.a. Product Focused Assessments) – SCAP, NIST-SCAP – ISO/OMG W3C – KDM, BPMN, RIF, XMI, RDF – OWASP Top 10 – SANS Top 25 – Secure Code Check Lists

  5. Vega flow assurance system

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Marit; Munaweera, Sampath

    2010-07-01

    Vega is a gas condensate field located off the west coast of Norway and developed as a tie-in to the Gjoea platform. The operator is Statoil; production startup is estimated for the end of 2010. Flow assurance challenges are high reservoir pressure and temperature, hydrate and wax control, liquid accumulation, and monitoring the well/template production rates. The Vega Flow Assurance System (FAS) is a software system that supports monitoring and operation of the field. The FAS is based on FlowManagerTM, designed for real time systems. This is a flexible tool with its own steady state multiphase and flow assurance models. Due to the long flowlines and the dynamic behavior, the multiphase flow simulator OLGA is also integrated in the system. Vega FAS will be used as: - An online monitoring tool - An offline what-if simulation and validation tool - An advisory control system for well production allocation. (Author)

  6. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of the strategic level of management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. But in order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the business model of the organization. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.

  7. Solar consumer assurance network briefing book

    Energy Technology Data Exchange (ETDEWEB)

    Connor, Lynda

    1980-06-01

    Background information is provided on the rationale and purpose of the Solar Consumer Assurance Network (SOLCAN) program. Mechanisms being instituted by states to meet solar consumer assurance needs are identified. Mechanisms being developed with Federal government support to encourage solar consumer assurance activities are described. The operation of the FY 80 SOLCAN effort is described. (MHR)

  8. Building Assured Systems Framework

    Science.gov (United States)

    2010-09-01

    information. We would like to thank John Goodenough and Carol Woody for their thoughtful review of this report. They made many valuable comments and…tested this hypothesis by assigning "maturity levels" to each area of the MSwA2010 BoK. BoK areas include assurance across life cycles, risk…studies are typically available. To test this hypothesis further, we mapped existing CERT research work, described in the 2009 CERT Research Annual

  9. Secure Digital Cashless Transactions with Sequence Diagrams and Spatial Circuits to Enhance the Information Assurance and Security Education

    Directory of Open Access Journals (Sweden)

    Dr. Ajantha Herath

    2012-04-01

    Full Text Available Often students have difficulties mastering cryptographic algorithms. For some time we have been developing methods for introducing important security concepts to both undergraduate and graduate students in Information Systems, Computer Science and Engineering. To achieve this goal, sequence diagrams and spatial circuit derivation from equations are introduced to students. Sequence diagrams represent the progression of events with time. Students learn system security concepts more effectively if they know how to transform equations and high-level programming language constructs into spatial circuits or special purpose hardware. This paper describes an active learning module developed to help students understand secure protocols, algorithms and modeling web applications to prevent attacks, and both software and hardware implementations related to encryption. These course materials can also be used in computer organization and architecture classes to help students understand and develop special purpose circuitry for cryptographic algorithms.

  10. PROCESS AND PRODUCT QUALITY ASSURANCE MEASURES IN CMMI

    Directory of Open Access Journals (Sweden)

    Mahmoud Khraiwesh

    2014-10-01

    Full Text Available Process and product quality assurance are very important aspects of software development. Process and product quality assurance monitor the software engineering processes and methods used to ensure quality. It is the process of confirming and verifying whether services and products meet customer expectations. This research identifies general measures for the specific goals and specific practices of the Process and Product Quality Assurance Process Area in Capability Maturity Model Integration (CMMI). CMMI was developed by the Software Engineering Institute (SEI) at Carnegie Mellon University in the USA. CMMI is a framework for assessment and improvement of computer information systems. The procedure we used to determine the measures is to apply the Goal Question Metric (GQM) approach to the two specific goals and four specific practices of the Process and Product Quality Assurance Process Area in CMMI.
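The GQM derivation described in the abstract above can be sketched as a simple goal → questions → metrics mapping. A minimal Python sketch follows; the goal, question and metric names are illustrative assumptions, not the measures actually derived in the paper or defined in CMMI.

```python
# Sketch of the Goal-Question-Metric (GQM) derivation applied to one
# CMMI PPQA-style quality goal. All goal, question and metric names
# below are illustrative assumptions, not the paper's actual measures.

def derive_metrics(gqm_tree):
    """Flatten a goal -> questions -> metrics tree into (g, q, m) rows."""
    rows = []
    for goal, questions in gqm_tree.items():
        for question, metric_names in questions.items():
            for metric in metric_names:
                rows.append((goal, question, metric))
    return rows

gqm_tree = {
    "Objectively evaluate processes and work products": {
        "How many work products were evaluated?": [
            "number of work products evaluated",
            "percentage of work products evaluated per release",
        ],
        "How many non-compliances were found?": [
            "non-compliance count per evaluation",
        ],
    },
}

for goal, question, metric in derive_metrics(gqm_tree):
    print(f"{question} -> {metric}")
```

Each leaf of the tree becomes one candidate measure, which is exactly how GQM ties measurement back to a stated goal.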

  11. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    Full Text Available The article provides a comprehensive review of the recently accelerated development of information technology within the project market, including industrial, engineering, procurement and construction. The author's aim is to cover the last decades of growth of information and communication technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only kept its urgency, but has also become one of the major conditions of intensive technology development. All of this has created a great impulse toward shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  12. Beyond Information Seeking: Towards a General Model of Information Behaviour

    Science.gov (United States)

    Godbold, Natalya

    2006-01-01

    Introduction: The aim of the paper is to propose new models of information behaviour that extend the concept beyond simply information seeking to consider other modes of behaviour. The models chiefly explored are those of Wilson and Dervin. Argument: A shortcoming of some models of information behaviour is that they present a sequence of stages…

  13. WHO informal consultation on the application of molecular methods to assure the quality, safety and efficacy of vaccines, Geneva, Switzerland, 7-8 April 2005.

    Science.gov (United States)

    Shin, Jinho; Wood, David; Robertson, James; Minor, Philip; Peden, Keith

    2007-03-01

    In April 2005, the World Health Organization convened an informal consultation on molecular methods to assure the quality, safety and efficacy of vaccines. The consultation was attended by experts from national regulatory authorities, vaccine industry and academia. Crosscutting issues on the application of molecular methods for a number of vaccines that are currently in use or under development were presented, and specific methods for further collaborative studies were discussed and identified. The main points of recommendation from meeting participants were fourfold: (i) that molecular methods should be encouraged; (ii) that collaborative studies are needed for many methods/applications; (iii) that basic science should be promoted; and (iv) that investment for training, equipment and facilities should be encouraged.

  14. Quality Assurance in the Presence of Variability

    Science.gov (United States)

    Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus

    Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of them, the so-called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
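The comprehensive strategy summarized above checks every possible product of the product line. A minimal sketch of that idea, assuming a toy feature set and a simple invariant in place of the chapter's temporal-logic properties:

```python
from itertools import product as cartesian

# Minimal sketch of the "comprehensive" QA strategy: enumerate every
# product of a (tiny) product line and check a property on each variant.
# The features and the checked property are illustrative assumptions;
# the chapter itself uses model checking against temporal-logic
# properties rather than a simple invariant.

OPTIONAL_FEATURES = ["encryption", "logging", "compression"]

def derive_products(features):
    """All feature combinations = all products of the product line."""
    return [
        {f for f, chosen in zip(features, choice) if chosen}
        for choice in cartesian([False, True], repeat=len(features))
    ]

def satisfies_property(prod):
    # Example domain constraint: logging is only valid with encryption.
    return "logging" not in prod or "encryption" in prod

products = derive_products(OPTIONAL_FEATURES)
defective = [p for p in products if not satisfies_property(p)]
print(f"{len(products)} products, {len(defective)} violate the property")
```

The exponential growth of `products` with the number of features is precisely the complexity challenge the chapter's model-checking technique addresses.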

  15. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The IT operational standards view on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss standards approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach, a 3D visual presentation and development of the Information Security Model, will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.

  16. Sustainment and Net-ready Key Performance Parameters (KPP) in an Enterprise Information System (EIS) Value Assurance Framework (VAF)

    Science.gov (United States)

    2014-04-01

    this enterprise often requires that a trusted broker “sneaker net” sanitized information from one proprietary communications circuit to another…

  17. The Safety Journey: Using a Safety Maturity Model for Safety Planning and Assurance in the UK Coal Mining Industry

    Directory of Open Access Journals (Sweden)

    Patrick Foster

    2013-02-01

    Full Text Available A Safety Maturity Model was developed for use in UK coal mining operations in order to assess the level of compliance with, and effectiveness of, a recently introduced standards-based safety management system. The developed model allowed a "self-assessment" of maturity to be undertaken by teams from the individual sites. Assessments were undertaken at all sites (surface and underground) and in some cases within each site (e.g., underground operations, surface coal preparation plant). Once the level of maturity was established, improvement plans were developed to improve the maturity of individual standards that were weaker than the average and/or improve the maturity as a whole. The model was likened to a journey, as there was a strong focus on continual improvement and effectiveness of the standards rather than pure compliance. The model has been found to be a practical and useful tool by sites as a means of identifying strengths and weaknesses within their systems, and as a means of assurance with the safety management system standards.

  18. Quality Assurance - Construction

    DEFF Research Database (Denmark)

    Gaarslev, Axel

    1996-01-01

    Contains three main chapters: 1. Quality Assurance initiated by external demands; 2. Quality Assurance initiated by internal company goals; 3. Innovation strategies.

  19. Comment assurer une information financière de qualité sous le système comptable OHADA ?

    OpenAIRE

    Bampoky, Boniface

    2013-01-01

    Quality financial information is useful for forecasting, monitoring and developing performance within a company, for effective and optimal investment choices, for risk management, and for economic policy choices. After more than a decade of implementation by businesses, the OHADA Accounting System has not changed significantly, which does not mean an absence of difficulties in its practical application. This article reviews the status of these difficulties.

  20. Assured Service Concepts and Models. Volume 3. Availability in Distributed MLS Systems

    Science.gov (United States)

    1992-01-01

    a close similarity between the "data structures" used in a state machine model and those used in the actual system. For example, it seems reasonable…the value of each of the four Boolean variables identified above. Then, the mapping between the state machine model and the actual system is: * PARKED…state machine model is small, then it is often possible to automate the analysis. For example, it is often desirable to determine whether certain
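The snippet above alludes to a state machine model over four Boolean variables, a state space small enough to analyze automatically by enumeration. A hedged sketch of such exhaustive analysis; the variable names and the checked invariant are invented for illustration and are not taken from the report:

```python
from itertools import product

# Sketch of exhaustive analysis of a small Boolean state machine: with
# four Boolean variables the state space has only 2^4 = 16 states, so a
# property can be checked by brute-force enumeration. Variable names and
# the safety invariant are illustrative assumptions, not the report's.

VARS = ("parked", "armed", "faulted", "online")

def invariant(state):
    # Example safety property: a faulted system must never be online.
    return not (state["faulted"] and state["online"])

states = [dict(zip(VARS, bits)) for bits in product([False, True], repeat=4)]
violations = [s for s in states if not invariant(s)]
print(f"checked {len(states)} states, {len(violations)} violations")
```

For state spaces this small, enumeration is a complete analysis; the report's point is that automation becomes practical exactly when the model stays this small.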

  1. An information theoretic approach for combining neural network process models.

    Science.gov (United States)

    Sridhar, D V.; Bartlett, E B.; Seagrave, R C.

    1999-07-01

    Typically, neural network modelers in chemical engineering focus on identifying and using a single, hopefully optimal, neural network model. Using a single optimal model implicitly assumes that one neural network model can extract all the information available in a given data set and that the other candidate models are redundant. In general, there is no assurance that any individual model has extracted all relevant information from the data set. Recently, Wolpert (Neural Networks, 5(2), 241 (1992)) proposed the idea of stacked generalization to combine multiple models. Sridhar, Seagrave and Bartlett (AIChE J., 42, 2529 (1996)) implemented stacked generalization for neural network models by integrating multiple neural networks into an architecture known as stacked neural networks (SNNs). SNNs consist of a combination of the candidate neural networks and were shown to provide improved modeling of chemical processes. However, in Sridhar's work SNNs were limited to using a linear combination of artificial neural networks. While a linear combination is simple and easy to use, it can utilize only those model outputs that have a high linear correlation to the output. Models that are useful in a nonlinear sense are wasted if a linear combination is used. In this work we propose an information theoretic stacking (ITS) algorithm for combining neural network models. The ITS algorithm identifies and combines useful models regardless of the nature of their relationship to the actual output. The power of the ITS algorithm is demonstrated through three examples including application to a dynamic process modeling problem. The results obtained demonstrate that SNNs developed using the ITS algorithm can achieve highly improved performance as compared to selecting and using a single, hopefully optimal, network or using SNNs based on a linear combination of neural networks.
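The ITS idea described above, selecting candidate models by their information content rather than by linear correlation, can be illustrated with a simplified sketch. The binned mutual-information estimator, bin count, selection threshold and toy data below are all assumptions for illustration, not the paper's actual algorithm:

```python
import math
from collections import Counter

# Simplified sketch of an information-theoretic stacking step: score
# each candidate model's predictions by (binned) mutual information with
# the target, and keep only informative models, regardless of whether
# their relationship to the target is linear. Bin count, threshold and
# the toy model outputs are illustrative assumptions.

def mutual_information(xs, ys, bins=4):
    def binned(vs):
        lo, hi = min(vs), max(vs)
        width = (hi - lo) / bins or 1.0  # guard against constant series
        return [min(int((v - lo) / width), bins - 1) for v in vs]
    bx, by = binned(xs), binned(ys)
    n = len(xs)
    pxy = Counter(zip(bx, by))
    px, py = Counter(bx), Counter(by)
    return sum(
        (c / n) * math.log((c / n) / ((px[i] / n) * (py[j] / n)))
        for (i, j), c in pxy.items()
    )

target = [0.0, 0.1, 0.4, 0.5, 0.9, 1.0, 1.4, 1.5]
candidates = {
    "model_a": [0.05, 0.12, 0.38, 0.52, 0.88, 1.02, 1.39, 1.52],  # tracks target
    "model_b": [0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7, 0.7],          # uninformative
}
scores = {name: mutual_information(p, target) for name, p in candidates.items()}
selected = [name for name, s in scores.items() if s > 0.1]
print("selected:", selected)
```

A linear-correlation criterion would also reject `model_b`, but the information-theoretic score additionally retains models that are related to the target only nonlinearly, which is the motivation the abstract gives for ITS.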

  2. Harmonised Principles for Public Participation in Quality Assurance of Integrated Water Resources Modelling

    NARCIS (Netherlands)

    Henriksen, H.J.; Refsgaard, J.C.; Højberg, A.L.; Ferrand, N.; Gijsbers, P.; Scholten, H.

    2009-01-01

    The main purpose of public participation in integrated water resources modelling is to improve decision-making by ensuring that decisions are soundly based on shared knowledge, experience and scientific evidence. The present paper describes stakeholder involvement in the modelling process. The point

  3. United States Army Land Mobile Radio Communication System: Impacts of Information Assurance on Commercial Off-the-Shelf Systems

    Science.gov (United States)

    2010-06-01

    Manager; Program Manager; PSTN Public Switched Telephone Network; RF Radio Frequency; SME Subject Matter Expert; SBU Sensitive But Unclassified; TDMA…Non-tactical LMRs likely to be used for communicating Sensitive But Unclassified (SBU) information shall meet medium or basic robustness…

  4. Modeling and research of thermal aspects of precision holes drilling quality assurance

    Science.gov (United States)

    Kovalnogov, Vladislav N.; Nikiforov, Aleksandr A.; Fedorov, Ruslan V.

    2016-06-01

    The work suggests a mathematical model of the thermal interaction of a workpiece and a cutting tool during drilling. The model is based on a simultaneous solution of the heat conduction equations of the interacting bodies with a general boundary condition in the contact zone. The paper gives the results of modeling and research of the processing errors connected with thermal deformations, which, when drilling a group of holes, allow rationalizing the drilling sequence. The efficiency of an ultrasound technology for supplying cooling liquid to the cutting zone is shown.

  5. Towards microbiological quality assurance in radiation sterilization processing: a limiting case model

    Energy Technology Data Exchange (ETDEWEB)

    Doolan, P.T. (Becton Dickinson and Co., Parasmus, NJ (USA)); Dwyer, J.; Fitch, F.R. (Manchester Univ. (UK). Inst. of Science and Technology); Dwyer, V.M. (York Univ. (UK). Dept. of Physics); Halls, N.A. (Becton Dickinson and Co., Dun Laoghaire (Ireland)); Tallentire, A. (Manchester Univ. (UK). Dept. of Pharmacy)

    1985-03-01

    A Limiting Case Model has been developed which describes the dependence on radiation dose of the proportion of items, in a population of items subjected to irradiation, which are contaminated by one or more organisms. This model is independent of the initial distribution of numbers of micro-organisms on items and represents a conservative approach to estimation of the proportions of non-sterile items in an irradiated population of items.
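For context, the quantity the Limiting Case Model bounds, the proportion of irradiated items still carrying one or more organisms, can be illustrated under the common textbook assumptions of Poisson-distributed initial bioburden and D10-based exponential inactivation. This is a simplification for orientation only, not the paper's distribution-independent model; the bioburden and D10 values are assumptions:

```python
import math

# Illustrative computation of the proportion of irradiated items with at
# least one surviving organism, assuming Poisson-distributed initial
# bioburden and exponential (D10-based) inactivation. This is a standard
# simplification used to motivate models like the Limiting Case Model;
# the mean bioburden and D10 values below are assumptions.

def fraction_nonsterile(dose_kgy, mean_bioburden, d10_kgy):
    """P(item has >= 1 survivor) = 1 - exp(-lambda * 10**(-D/D10))."""
    survival_prob = 10 ** (-dose_kgy / d10_kgy)
    return 1.0 - math.exp(-mean_bioburden * survival_prob)

# Dose-response: the non-sterile fraction falls steeply with dose.
for dose in (5, 15, 25, 35):
    print(f"{dose} kGy -> {fraction_nonsterile(dose, 100.0, 3.0):.2e}")
```

The paper's contribution is precisely that its estimate does not depend on assuming a particular initial distribution such as the Poisson used here, making it a conservative bound.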

  6. Evaluation System Research on Information Assurance of E-government%电子政务信息安全的评价体系探究

    Institute of Scientific and Technical Information of China (English)

    唐一冰

    2013-01-01

    With the development of internet technology, all countries have attached importance to the development of e-government. At the same time, as e-government applications become more and more widespread, they also pose certain threats to government information security. This paper draws on other countries' research on e-government information security to study an evaluation system for e-government information assurance. An evaluation system structure is proposed for each of four aspects: e-government information security, application data security, network security, and effective management, in order to further improve the reliability of e-government information security management.

  7. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit;

    2011-01-01

    Background: Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports…pathoanatomic information from radiologist-generated MRI narrative reports. Methods: Twenty MRI narrative reports were randomly extracted from an institutional database. A group of three physiotherapy students independently reviewed the reports and coded the presence of 14 common pathoanatomic findings using a categorical electronic coding matrix. Decision rules were developed after initial coding in an effort to resolve ambiguities in narrative reports. This process was repeated a further three times using separate samples of 20 MRI reports until no further ambiguities were identified (total n=80). Reproducibility…

  8. SWOT Analysis of King Abdullah II School for Information Technology at University of Jordan According to Quality Assurance Procedures

    Directory of Open Access Journals (Sweden)

    Lubna Naser Eddeen

    2013-02-01

    Full Text Available Many books and research papers have defined and referred to the term SWOT Analysis. SWOT Analysis can be defined as a "strategic planning method used to evaluate the Strengths, Weaknesses, Opportunities, and Threats involved in a project or in a business venture". It is used to assess the internal and external environmental factors which affect the business. This paper analyzes the main SWOT factors at King Abdullah II School for Information Technology.

  9. Construction of a business model to assure financial sustainability of biobanks.

    Science.gov (United States)

    Warth, Rainer; Perren, Aurel

    2014-12-01

    Biobank-suisse (BBS) is a collaborative network of biobanks in Switzerland. Since 2005, the network has worked with biobank managers towards a Swiss biobanking platform that harmonizes structures and procedures. The work with biobank managers has shown that long-term, sustainable financing is difficult to obtain. In this report, three typical biobank business models are identified and their characteristics analyzed. A five forces analysis was used to understand the competitive environment of biobanks. Data provided by the OECD was used for financial estimations. The model was constructed using the business model canvas tool. The business models identified feature financing influenced by the economic situation and the research budgets in a given country. Overall, the competitive environment for biobanks is positive. The bargaining power with the buyer is negative, since price setting and demand prediction are difficult. In Switzerland, the healthcare industry collects approximately 5600 U.S. dollars per person and year. If each Swiss citizen paid 0.1% (or 5 U.S. dollars) of this amount to Swiss biobanks, 45 million U.S. dollars could be collected. This compares to the approximately 10 million U.S. dollars made available for cohort studies, longitudinal studies, and pathology biobanks through science funding. With the same approach, Germany, the United States, Canada, France, and the United Kingdom could collect 361, 2634, 154, 264, and 221 million U.S. dollars, respectively. In Switzerland and in other countries, an annual fee of less than 5 U.S. dollars per person is sufficient to provide biobanks with sustainable financing. This inspired us to construct a business model that not only includes the academic and industrial research sectors as customer segments, but also includes the population. The revenues would be collected as fees by the healthcare system. In Italy and Germany, a small share of healthcare spending is already used to finance selected clinical trials. The legal…
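The revenue estimates in the abstract follow from a two-line calculation: an annual fee of 0.1% of per-capita healthcare spending, multiplied by population. A sketch using the abstract's Swiss spending figure (5600 U.S. dollars per person per year); the population figure is an assumption for illustration:

```python
# Reproduces the style of revenue estimate in the abstract: an annual
# biobank fee equal to 0.1% of per-capita healthcare spending, times
# population. The 5600 USD figure follows the abstract; the Swiss
# population figure is an illustrative assumption.

def annual_biobank_revenue(population, health_spend_per_capita, share=0.001):
    fee = share * health_spend_per_capita  # ~5 USD/person for Switzerland
    return fee, population * fee

fee, revenue = annual_biobank_revenue(8_000_000, 5600)
print(f"fee = {fee:.2f} USD/person, revenue = {revenue / 1e6:.1f} million USD")
# ~45 million USD/year, matching the abstract's Swiss estimate
```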

  10. Force Displacement Model of Compliant Mechanisms Using Assur Sub-Chains

    DEFF Research Database (Denmark)

    Durango, Sebastian; Correa, Jorge; Aristizabal, Mauricio;

    2011-01-01

    This article develops a modular procedure to perform force-displacement modeling of planar flexure-based Compliant Mechanisms (CMs). The procedure is mostly suitable for planar lumped CMs. To achieve the position analysis of CMs requires: (i) to implement the kinematic analysis as in the case of o…mechanism is used as a case study. Results are compared with a Finite Element Analysis (FEA).

  11. Refreshing Information Literacy: Learning from Recent British Information Literacy Models

    Science.gov (United States)

    Martin, Justine

    2013-01-01

    Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…

  12. The Infopriv model for information privacy

    OpenAIRE

    2012-01-01

    D.Phil. (Computer Science) The privacy of personal information is crucial in today's information systems. Traditional security models are mainly concerned with the protection of information inside a computer system. These models assume that the users of a computer system are trustworthy and will not disclose information to unauthorised parties. However, this assumption does not always apply to information privacy since people are the major cause of privacy violations. Alternative models ar...

  13. Parsimonious Language Models for Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo

    2004-01-01

    We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such,

  14. Auditors’ Perceptions of Reasonable Assurance the Effectiveness of the Audit Risk Model. Case from Iran

    OpenAIRE

    Hashem Valipour; Javad Moradi; Hajar Moazaminezhad

    2012-01-01

    Despite the definition of the concept of logical confidence in auditing standards, the results of some studies indicate a meaningful difference in perceptions of this basic concept among different auditors (Law, 2008, 180). The results of some research also indicate that auditors' perceptions of the effectiveness of the audit risk model (which is based on general auditing principles on the basis of risk) vary (Arense, 2006, 148). In so doing, aiming at studying the proof fo…

  15. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    Science.gov (United States)

    Melo, E Correa

    2003-08-01

    The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  16. Quality Assurance Tracking System - R7 (QATS-R7)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This is metadata documentation for the Quality Assurance Tracking System - R7, an EPA Region 7 resource that tracks information on quality assurance reviews. Also...

  17. Evaluation of plan quality assurance models for prostate cancer patients based on fully automatically generated Pareto-optimal treatment plans

    Science.gov (United States)

    Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F.

    2016-06-01

    IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created using the conventional trial-and-error treatment planning process. Consequently, it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. Therefore, we created a golden standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients. Next, the dataset was used to assess the performance of a treatment planning QA model that uses the overlap volume histogram (OVH). 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients. Next it was applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden standard plans for the rectum D mean, V 65, and V 75, and the D mean of the anus and the bladder. For the rectum, the prediction errors (predicted − achieved) were only −0.2 ± 0.9 Gy (mean ± 1 SD) for D mean, −1.0 ± 1.6% for V 65, and −0.4 ± 1.1% for V 75. For the D mean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients only led to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated. This dataset can be used to train new, and validate and compare existing, treatment planning QA models, and has been made publicly available. The OVH model was highly accurate.
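The validation statistics quoted above are the mean ± 1 SD of the prediction error (predicted − achieved) over the validation patients. A sketch of that computation; the DVH values below are invented for illustration, not data from the study:

```python
import math

# Sketch of how the plan-QA evaluation statistics in the abstract are
# formed: prediction error = predicted - achieved, summarised as
# mean +/- 1 SD (sample SD) over validation patients. The DVH values
# below are invented for illustration, not data from the study.

def error_stats(predicted, achieved):
    errors = [p - a for p, a in zip(predicted, achieved)]
    n = len(errors)
    mean = sum(errors) / n
    sd = math.sqrt(sum((e - mean) ** 2 for e in errors) / (n - 1))
    return mean, sd

predicted_dmean = [34.1, 29.8, 41.0, 36.5, 31.2]  # Gy, hypothetical
achieved_dmean = [34.5, 29.5, 41.8, 36.0, 31.6]   # Gy, hypothetical
mean, sd = error_stats(predicted_dmean, achieved_dmean)
print(f"rectum D mean prediction error: {mean:+.1f} +/- {sd:.1f} Gy")
```

A mean near zero with a small SD, as reported for the rectum metrics, indicates the QA model's DVH predictions track the Pareto-optimal achieved values closely.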

  18. Optimal Disturbance Accommodation with Limited Model Information

    CERN Document Server

    Farokhi, F; Johansson, K H

    2011-01-01

    The design of optimal dynamic disturbance-accommodation controllers with limited model information is considered. We adapt the family of limited model information control design strategies, defined earlier by the authors, to handle dynamic controllers. This family of limited model information design strategies constructs subcontrollers distributedly by accessing only local plant model information. The closed-loop performance of the dynamic controllers that they can produce is studied using a performance metric called the competitive ratio, which is the worst-case ratio of the cost of a control design strategy to the cost of the optimal control design with full model information.
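The competitive ratio defined above can be sketched as a worst-case maximization over a set of admissible plants; the finite plant set and scalar cost functions below are hypothetical placeholders, not the paper's actual closed-loop costs:

```python
def competitive_ratio(strategy_cost, optimal_cost, plants):
    """Worst-case (over admissible plants) ratio of the closed-loop cost of a
    limited-model-information design strategy to the full-information optimum."""
    return max(strategy_cost(p) / optimal_cost(p) for p in plants)

# Toy example: a strategy that always pays twice the optimal cost
# has competitive ratio 2 over any plant set.
ratio = competitive_ratio(lambda p: 2.0 * p, lambda p: p, [1.0, 2.0, 3.0])
```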

  19. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  20. Quality assurance manual: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Oijala, J.E.

    1988-06-01

    This paper contains quality assurance information on departments of the Stanford Linear Accelerator Center. Particular quality assurance policies and standards discussed are on: Mechanical Systems; Klystron and Microwave Department; Electronics Department; Plant Engineering; Accelerator Department; Purchasing; and Experimental Facilities Department. (LSP)

  1. Data Quality Assurance Governance

    OpenAIRE

    Montserrat Gonzalez; Stephanie Suhr

    2016-01-01

    This deliverable describes the ELIXIR-EXCELERATE Quality Management Strategy, addressing EXCELERATE Ethics requirement no. 5 on Data Quality Assurance Governance. The strategy describes the essential procedures and practices within ELIXIR-EXCELERATE concerning planning of quality management, performing quality assurance and controlling quality. It also depicts the overall organisation of ELIXIR with emphasis on authority and specific responsibilities related to quality assurance.

  2. Models for Information Assurance Education and Outreach: Year 3 and Summative Report

    Science.gov (United States)

    Wang, Jianjun

    2015-01-01

    Over the past three years, California State University, Bakersfield received NSF funding to support hands-on explorations in "network security" and "cryptography" through Research Experience Vitalizing Science-University Program (REVS-UP). In addition to the summer bridge component, the grant included development of…

  3. Directory of Energy Information Administration Models 1994

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994.

  4. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever-increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modelling, design and specification of information systems, and multimedia information modelling.

  5. Information technology and innovative drainage management practices for selenium load reduction from irrigated agriculture to provide stakeholder assurances and meet contaminant mass loading policy objectives

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, N.W.T.

    2009-10-15

    Many perceive the implementation of environmental regulatory policy, especially concerning non-point source pollution from irrigated agriculture, as being less efficient in the United States than in many other countries. This is partly a result of the stakeholder involvement process but is also a reflection of the inability to make effective use of Environmental Decision Support Systems (EDSS) to facilitate technical information exchange with stakeholders and to provide a forum for innovative ideas for controlling non-point source pollutant loading. This paper describes one of the success stories where a standardized Environmental Protection Agency (EPA) methodology was modified to better suit regulation of a trace element in agricultural subsurface drainage and information technology was developed to help guide stakeholders, provide assurances to the public and encourage innovation while improving compliance with State water quality objectives. The geographic focus of the paper is the western San Joaquin Valley where, in 1985, evapoconcentration of selenium in agricultural subsurface drainage water, diverted into large ponds within a federal wildlife refuge, caused teratogenecity in waterfowl embryos and in other sensitive wildlife species. The fallout from this environmental disaster was a concerted attempt by State and Federal water agencies to regulate non-point source loads of the trace element selenium. The complexity of selenium hydrogeochemistry, the difficulty and expense of selenium concentration monitoring and political discord between agricultural and environmental interests created challenges to the regulation process. Innovative policy and institutional constructs, supported by environmental monitoring and the web-based data management and dissemination systems, provided essential decision support, created opportunities for adaptive management and ultimately contributed to project success. 
The paper provides a retrospective on the contentious planning

  6. PKI Integrated Information Security Assurance Framework

    Institute of Scientific and Technical Information of China (English)

    张全伟

    2012-01-01

    Traditional network-security protection technology can no longer meet the security requirements of the information assurance (IA) era. At the same time, the network security protection system itself faces various security risks and attacks, and likewise needs strengthened identity, authentication, confidentiality, integrity and non-repudiation safeguards. This paper strengthens the core and foundational security support functions of the PKI infrastructure and integrates the PKI infrastructure with network security technology, to provide stronger identity, authentication, confidentiality, integrity and non-repudiation security services for information systems and the network security protection system.
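As an illustration of the integrity and authentication services discussed above, here is a stdlib-only sketch using a symmetric HMAC tag. A real PKI deployment would instead use asymmetric certificates and digital signatures; this symmetric stand-in only demonstrates the tag-then-verify pattern:

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> bytes:
    """Produce an authentication/integrity tag for the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Constant-time check: detects tampered messages and wrong keys."""
    return hmac.compare_digest(sign(key, message), tag)
```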

  7. Complementarity of Historic Building Information Modelling and Geographic Information Systems

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.

    2016-06-01

    In this paper, we discuss the potential of integrating semantically rich models from both Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time, and the non-architectural information that is necessary for the construction and management of buildings. GIS has potential in handling and managing spatial data, especially in exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model from the complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling are discussed in this paper: how to deal with the complex elements composing historic buildings in a BIM and GIS environment, how to build the enriched historic model, and why to construct different levels of detail? By solving these problems, conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  8. Defense Healthcare Information Assurance Program

    Science.gov (United States)

    2001-06-01

    Conference presentations included the Computerized Patient Record Institute (CPRI) annual conference in Washington, D.C., and the American Telemedicine Association (ATA) annual conference. [The remainder of the abstract is residue from a program milestone table, listing events (e.g., the ATA Conference, Phoenix, AZ, May 2000; IPR #1, Pittsburgh, PA, July 2000) and participating organizations (TATRC, ATI, SEI, ADL, LMES, HOST, KRM).]

  9. PRISM: a planned risk information seeking model.

    Science.gov (United States)

    Kahlor, LeeAnn

    2010-06-01

    Recent attention on health-related information seeking has focused primarily on information seeking within specific health and health risk contexts. This study attempts to shift some of that focus to individual-level variables that may impact health risk information seeking across contexts. To locate these variables, the researcher posits an integrated model, the Planned Risk Information Seeking Model (PRISM). The model, which treats risk information seeking as a deliberate (planned) behavior, maps variables found in the Theory of Planned Behavior (TPB; Ajzen, 1991) and the Risk Information Seeking and Processing Model (RISP; Griffin, Dunwoody, & Neuwirth, 1999), and posits linkages among those variables. This effort is further informed by Kahlor's (2007) Augmented RISP, the Theory of Motivated Information Management (Afifi & Weiner, 2004), the Comprehensive Model of Information Seeking (Johnson & Meischke, 1993), the Health Information Acquisition Model (Freimuth, Stein, & Kean, 1989), and the Extended Parallel Processing Model (Witte, 1998). The resulting integrated model accounted for 59% of the variance in health risk information-seeking intent and performed better than the TPB or the RISP alone.

  10. A proposed general model of information behaviour.

    Directory of Open Access Journals (Sweden)

    2003-01-01

    Full Text Available Presents a critical description of Wilson's (1996) global model of information behaviour and proposes major modifications on the basis of research into the information behaviour of managers, conducted in Poland. The theoretical analysis and research results suggest that Wilson's model has certain imperfections, both in its conceptual content and in its graphical presentation. The model, for example, cannot be used to describe managers' information behaviour, since managers basically are not the end users of external or computerized information services, and they acquire information mainly through various intermediaries. Therefore, the model cannot be considered a general model, applicable to every category of information users. The proposed new model encompasses the main concepts of Wilson's model, such as: person-in-context, three categories of intervening variables (individual, social and environmental), activating mechanisms, the cyclic character of information behaviour, and the adoption of a multidisciplinary approach to explain it. However, the new model introduces several changes. They include: 1. identification of 'context' with the intervening variables; 2. immersion of the chain of information behaviour in the 'context', to indicate that the context variables influence behaviour at all stages of the process (identification of needs, looking for information, processing and using it); 3. stress put on the fact that the activating mechanisms can also occur at all stages of the information acquisition process; 4. introduction of two basic strategies of looking for information: personally and/or using various intermediaries.

  11. Quality assurance of a solar housing project on the basis of an information and consulting campaign. Final report; Qualitaetssicherung mit Informations- und Beratungskampagne bei der Realisierung einer Solarsiedlung. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Otto, J.; Nasarek, P.; Toelle, L.

    2002-07-01

    A solar housing project of 68 single-family dwellings is constructed on the southern slope of a hill in the town of Emmerthal. High thermal insulation, active and passive solar energy use, and a two-stage heat pump process will reduce carbon dioxide emissions by 50 percent compared to the current standard. All buildings are designed to consume 30 percent less heating energy than required by the WSVO '95. 60 percent of the hot water consumed will be heated by solar energy. Consulting and information for the builder-owners, as well as quality assurance measures during construction, will serve to ensure that these goals are met and will also provide a documentation of the energy balance of the housing project.

  12. [Quality assurance in human genetic testing].

    Science.gov (United States)

    Stuhrmann-Spangenberg, Manfred

    2015-02-01

    Advances in the technical development of genetic diagnostics for more than 50 years, as well as the fact that human genetic testing is usually performed only once in a lifetime, with additional impact for blood relatives, determine the extraordinary importance of quality assurance in human genetic testing. Abidance by laws, directives, and guidelines plays a major role. This article aims to present the major laws, directives, and guidelines with respect to quality assurance of human genetic testing, paying careful attention to internal and external quality assurance. The information on quality assurance of human genetic testing was obtained through a web-based search of the web pages that are referred to in this article. Further information was retrieved from publications of the German Society of Human Genetics and through a PubMed search using the terms quality + assurance + genetic + diagnostics. The most important laws, directives, and guidelines for quality assurance of human genetic testing are the gene diagnostics law (GenDG), the directive of the Federal Medical Council for quality control of clinical laboratory analysis (RiliBÄK), and the S2K guideline for human genetic diagnostics and counselling. In addition, voluntary accreditation under DIN EN ISO 15189:2013 offers a highly recommended contribution towards quality assurance of human genetic testing. The legal requirements for quality assurance of human genetic testing mentioned in § 5 GenDG are fulfilled once RiliBÄK requirements are followed.

  13. Topic modelling in the information warfare domain

    CSIR Research Space (South Africa)

    De Waal, A

    2013-11-01

    Full Text Available In this paper the authors provide context to topic modelling as an Information Warfare technique. Topic modelling is a technique that discovers latent topics in an unstructured and unlabelled collection of documents. The topic structure can be searched...
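Discovering latent topics in unlabelled documents is commonly done with Latent Dirichlet Allocation (LDA); the abstract does not name the authors' specific model, so the following is an illustrative toy collapsed Gibbs sampler for LDA on a tiny corpus, not the CSIR implementation:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, n_iter=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA (illustrative, unoptimized).
    docs: list of tokenized documents. Returns topic-word count tables."""
    rng = random.Random(seed)
    vocab_size = len({w for d in docs for w in d})
    # Random initial topic assignment per token, plus count tables.
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]                # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                                 # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            k = z[di][wi]
            ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(n_iter):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                # Remove the token, resample its topic, reinsert it.
                k = z[di][wi]
                ndk[di][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                weights = [(ndk[di][t] + alpha) * (nkw[t][w] + beta)
                           / (nk[t] + vocab_size * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights)[0]
                z[di][wi] = k
                ndk[di][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw  # top words per topic summarize each latent topic
```

The top-weighted words in each returned count table are the discovered latent topics.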

  14. Information Retrieval Interaction: an Analysis of Models

    Directory of Open Access Journals (Sweden)

    Farahnaz Sadoughi

    2012-03-01

    Full Text Available The information searching process is an interactive one: users have control over the searching process and can manage the results of the search. In this process, the user's question becomes more refined according to the retrieved results. In addition, on the side of the information retrieval system, there are some processes that cannot be realized except by the user. In practice, this issue is most evident in 'interaction', i.e. the process of the user's connection to the other system elements, and in 'relevance judgment'. This paper first looks at the existence of 'interaction' in information retrieval. Then the traditional model of information retrieval and its strong and weak points are reviewed. Finally, the current models of interactive information retrieval are elucidated, including Belkin's episodic model, Ingwersen's cognitive model, Saracevic's stratified model, and Spink's interactive feedback model.

  15. A Policy Model for Secure Information Flow

    Science.gov (United States)

    Adetoye, Adedayo O.; Badii, Atta

    When a computer program requires legitimate access to confidential data, the question arises whether such a program may illegally reveal sensitive information. This paper proposes a policy model to specify what information flow is permitted in a computational system. The security definition, which is based on a general notion of information lattices, allows various representations of information to be used in the enforcement of secure information flow in deterministic or nondeterministic systems. A flexible semantics-based analysis technique is presented, which uses the input-output relational model induced by an attacker’s observational power, to compute the information released by the computational system. An illustrative attacker model demonstrates the use of the technique to develop a termination-sensitive analysis. The technique allows the development of various information flow analyses, parametrised by the attacker’s observational power, which can be used to enforce what declassification policies.
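The lattice-based view of permitted information flow can be illustrated with the smallest possible security lattice, a two-point Low/High order; this toy sketch is far simpler than the paper's general information lattices and semantics-based analysis, and is only meant to show the flow-check idea:

```python
# Two-point security lattice: Low below High.
LEVEL = {"Low": 0, "High": 1}

def flow_allowed(source: str, sink: str) -> bool:
    """A policy permits information to flow only upward in the lattice,
    so confidential (High) data never reaches a public (Low) sink."""
    return LEVEL[source] <= LEVEL[sink]

def join(a: str, b: str) -> str:
    """Least upper bound: the level of a value computed from both inputs."""
    return a if LEVEL[a] >= LEVEL[b] else b
```

A declassification policy, in this picture, is a controlled exception to `flow_allowed` for specific values.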

  16. Performance of Information Criteria for Spatial Models.

    Science.gov (United States)

    Lee, Hyeyoung; Ghosh, Sujit K

    2009-01-01

    Model choice is one of the most crucial aspects of any statistical data analysis. It is well known that most models are just approximations to the true data-generating process, but among such model approximations it is our goal to select the "best" one. Researchers typically consider a finite number of plausible models in statistical applications, and the related statistical inference depends on the chosen model. Hence model comparison is required to identify the "best" model among several such candidate models. This article considers the problem of model selection for spatial data. The issue of model selection for spatial models has been addressed in the literature by the use of traditional information-criteria-based methods, even though such criteria have been developed based on the assumption of independent observations. We evaluate the performance of some of the popular model selection criteria via Monte Carlo simulation experiments using small to moderate samples. In particular, we compare the performance of some of the most popular information criteria, such as the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and corrected AIC (AICc), in selecting the true model. The ability of these criteria to select the correct model is evaluated under several scenarios. This comparison is made using various spatial covariance models ranging from stationary isotropic to nonstationary models.
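The three criteria compared in the study have closed forms in terms of the maximized log-likelihood, the number of parameters k, and the sample size n. A small sketch of the standard textbook definitions (the generic formulas, not the paper's specific spatial setup):

```python
import math

def aic(loglik: float, k: int) -> float:
    """Akaike Information Criterion: 2k - 2*log-likelihood."""
    return 2 * k - 2 * loglik

def bic(loglik: float, k: int, n: int) -> float:
    """Bayesian Information Criterion: k*ln(n) - 2*log-likelihood."""
    return k * math.log(n) - 2 * loglik

def aicc(loglik: float, k: int, n: int) -> float:
    """Corrected AIC: adds a small-sample penalty that vanishes as n grows."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)
```

In each case the candidate model with the smallest criterion value is selected.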

  17. Research of Home Information Technology Adoption Model

    Institute of Scientific and Technical Information of China (English)

    Ao Shan; Ren Weiyin; Lin Peishan; Tang Shoulian

    2008-01-01

    Information technology at home has caught the attention of various industries such as IT, home appliances, communication, and real estate. Based on information technology acceptance theories and family consumption behavior theories, this study summarized and analyzed four key belief variables, i.e. Perceived Value, Perceived Risk, Perceived Cost and Perceived Ease of Use, which influence the acceptance of home information technology. The study also summarizes three groups of external variables: social, industrial, and family influence factors. The social influence factors include Subjective Norm; the industry factors include the Unification of Home Information Technological Standards, the Perfection of the Home Information Industry Value Chain, and the Competitiveness of the Home Information Industry; and the family factors include Family Income, Family Life Cycle and Family Educational Level. The study discusses the relationships among these external variables and the cognitive variables. The study provides a Home Information Technology Acceptance Model based on the Technology Acceptance Model and the characteristics of home information technology consumption.

  18. Probabilistic Modeling in Dynamic Information Retrieval

    OpenAIRE

    Sloan, M. C.

    2016-01-01

    Dynamic modeling is used to design systems that are adaptive to their changing environment and is currently poorly understood in information retrieval systems. Common elements in the information retrieval methodology, such as documents, relevance, users and tasks, are dynamic entities that may evolve over the course of several interactions, which is increasingly captured in search log datasets. Conventional frameworks and models in information retrieval treat these elements as static, or only...

  19. FINANCIAL MARKET MODEL WITH INFLUENTIAL INFORMED INVESTORS

    OpenAIRE

    AXEL GRORUD; MONIQUE PONTIER

    2005-01-01

    We develop a financial model with an "influential informed" investor who has additional information and influences asset prices by means of his strategy. The price dynamics are supposed to be driven by a Brownian motion; the informed investor's strategies affect the risky asset trends and the interest rate. Our paper could be seen as an extension of Cuoco and Cvitanic's work [4] since, like these authors, we solve the informed influential investor's optimization problem. But our main result...

  20. Directory of Energy Information Administration models 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This directory revises and updates the Directory of Energy Information Administration Models 1995, DOE/EIA-0293(95), Energy Information Administration (EIA), U.S. Department of Energy, July 1995. Four models have been deleted in this directory as they are no longer being used: (1) Market Penetration Model for Ground-Water Heat Pump Systems (MPGWHP); (2) Market Penetration Model for Residential Rooftop PV Systems (MPRESPV-PC); (3) Market Penetration Model for Active and Passive Solar Technologies (MPSOLARPC); and (4) Revenue Requirements Modeling System (RRMS).

  1. World Wide Telescope (WWT) teaching design based on the ASSURE model

    Institute of Scientific and Technical Information of China (English)

    王宗月; 李璐

    2012-01-01

    The ASSURE model is an instructional design model based on media technology. This paper takes the ASSURE model as a guide, the World Wide Telescope (WWT) as the teaching medium, and students taking the public elective course in fundamental astronomy as the implementation subjects. The whole instructional design process is carried out according to the major steps of the ASSURE model. This integrates the teaching medium with the astronomy course effectively, gives full play to the advantages of the media and learning resources, improves the teaching effect, and achieves the expected teaching goals. It also raises the students' academic performance and interest, and cultivates the students' ability to innovate and their spirit of cooperation and exploration.

  2. Information criteria for astrophysical model selection

    CERN Document Server

    Liddle, A R

    2007-01-01

    Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools: those based on information theory, such as the Akaike Information Criterion (AIC), and those based on Bayesian inference, such as the Bayesian evidence and Bayesian Information Criterion (BIC). The Deviance Information Criterion combines ideas from both heritages; it is readily computed from Monte Carlo posterior samples and, unlike the AIC and BIC, allows for parameter degeneracy. I describe the properties of the information criteria, and as an example compute them from WMAP3 data for several cosmological models. I find that at present the information theory and Bayesian approaches give significantly different conclusions from that data.
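As the abstract notes, the Deviance Information Criterion is readily computed from Monte Carlo posterior samples. A minimal sketch using the standard DIC definition (mean deviance plus the effective number of parameters p_D); the deviance values used in the example are illustrative, not from WMAP3:

```python
import statistics

def dic(deviances, deviance_at_posterior_mean):
    """DIC = mean deviance + p_D, where
    p_D = mean deviance - deviance at the posterior mean (effective parameters).
    `deviances` is the deviance -2*ln(likelihood) evaluated at each posterior sample."""
    d_bar = statistics.mean(deviances)
    p_d = d_bar - deviance_at_posterior_mean
    return d_bar + p_d
```

Because p_D is measured from the posterior itself, degenerate parameters that the data do not constrain contribute little to the penalty, unlike the fixed parameter count in AIC and BIC.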

  3. Managing Event Information Modeling, Retrieval, and Applications

    CERN Document Server

    Gupta, Amarnath

    2011-01-01

    With the proliferation of citizen reporting, smart mobile devices, and social media, an increasing number of people are beginning to generate information about events they observe and participate in. A significant fraction of this information contains multimedia data to share the experience with their audience. A systematic information modeling and management framework is necessary to capture this widely heterogeneous, schemaless, potentially humongous information produced by many different people. This book is an attempt to examine the modeling, storage, querying, and applications of such an

  4. High assurance services computing

    CERN Document Server

    2009-01-01

    Covers service-oriented technologies in different domains, including high assurance systems. Assists software engineers from industry and government laboratories who develop mission-critical software, and simultaneously provides academia with a practitioner's outlook on the problems of high-assurance software development.

  5. Modeling the Dynamics of an Information System

    Directory of Open Access Journals (Sweden)

    Jacek Unold

    2003-11-01

    Full Text Available The article concentrates on the nature of the social subsystem of an information system. It analyzes the nature of the information processes of collectivity within an IS and introduces a model of IS dynamics. The model is based on the assumption that the social subsystem of an information system works as a nonlinear dynamic system. The model of IS dynamics is verified against stock market indexes. This follows from the basic assumption of technical analysis of the markets, namely that the index chart reflects the play of demand and supply, which in turn represents the crowd sentiment on the market.
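As a generic illustration of the nonlinear-dynamic-system assumption (not the author's actual model), the logistic map shows how even a one-parameter nonlinear update rule can produce either stable or erratic, index-like trajectories depending on its parameter:

```python
def logistic_map(x0: float, r: float, n: int) -> list:
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t), a classic
    toy nonlinear dynamic system, for n steps starting from x0 in [0, 1]."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

calm = logistic_map(0.5, 2.0, 50)      # settles to a fixed point
erratic = logistic_map(0.2, 4.0, 50)   # chaotic, sentiment-swing-like regime
```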

  6. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

    To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948, Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox' (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar

  7. Directory of Energy Information Administration Models 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-07-06

    This directory contains descriptions of each model, including the title, acronym, and purpose, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. Included in this directory are 35 EIA models active as of May 1, 1993. Models that run on personal computers are identified by "PC" as part of the acronym. EIA is developing new models, a National Energy Modeling System (NEMS), and is making changes to existing models to include new technologies, environmental issues, conservation, and renewables, as well as to extend the forecast horizon. Other parts of the Department are involved in this modeling effort. A fully operational model is planned which will integrate completed segments of NEMS for its first official application: preparation of EIA's Annual Energy Outlook 1994. Abstracts for the new models will be included in next year's version of this directory.

  8. Directory of energy information administration models 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-13

    This updated directory has been published annually; after this issue, it will be published only biennially. The Disruption Impact Simulator Model in use by EIA is included. Model descriptions have been updated according to revised documentation approved during the past year. This directory contains descriptions of each model, including title, acronym, and purpose, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. Included are 37 EIA models active as of February 1, 1995. The first group is the National Energy Modeling System (NEMS) models. The second group is all other EIA models that are not part of NEMS. Appendix A identifies major EIA modeling systems and the models within these systems. Appendix B is a summary of the "Annual Energy Outlook" Forecasting System.

  9. Thermodynamic Model of Noise Information Transfer

    Science.gov (United States)

    Hejna, Bohdan

    2008-10-01

In this paper we apply a unifying physical description to the results of Information Theory. Assuming that heat entropy is a thermodynamic realization of information entropy [2], we construct a cyclical, thermodynamic, average-value model of an information transfer chain [3] as a general heat engine, in particular a Carnot engine, reversible or irreversible. The working medium of the cycle (a thermodynamic system transforming input heat energy) can be viewed as a thermodynamic, average-value model and, as such, as a realization of an information transfer channel. We show that in a model realized in this way the extended Second Principle of Thermodynamics is valid [2], and we formulate its information form.
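As background for the abstract's use of the Carnot cycle, the classical entropy bookkeeping of a reversible engine can be sketched in a few lines. The temperatures and heat values below are illustrative only, not taken from the paper:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of a reversible heat engine between two reservoirs (kelvin)."""
    return 1.0 - t_cold / t_hot

def entropy_flow(q, t):
    """Entropy carried by heat q exchanged with a reservoir at temperature t: S = Q / T."""
    return q / t

# Illustrative values: 1000 J drawn from a 500 K reservoir, rejected at 300 K.
q_in, t_hot, t_cold = 1000.0, 500.0, 300.0
work = carnot_efficiency(t_hot, t_cold) * q_in
q_out = q_in - work
# A reversible cycle dumps exactly the entropy it draws: Q_out/T_cold == Q_in/T_hot.
print(work, entropy_flow(q_in, t_hot), entropy_flow(q_out, t_cold))
```

The equality of the two entropy flows is the reversible limit; an irreversible engine rejects strictly more entropy than it draws.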

  10. A Model for an Electronic Information Marketplace

    Directory of Open Access Journals (Sweden)

    Wei Ge

    2005-11-01

As the information content on the Internet increases, the task of locating desired information and assessing its quality becomes increasingly difficult. This development causes users to be more willing to pay for information that is focused on specific issues, verifiable, and available upon request. Thus, the nature of the Internet opens up the opportunity for information trading. In this context, the Internet can be used not only to close the transaction, but also to deliver the product, the desired information, to the user. Early attempts to implement such business models have fallen short of expectations. In this paper, we discuss the limitations of such practices and present a modified business model for information trading, which uses a reverse auction approach together with a multiple-buyer price discovery process.

  11. Optimization of assured result in dynamical model of management of innovation process in the enterprise of agricultural production complex

    Directory of Open Access Journals (Sweden)

    Andrey Fedorovich Shorikov

    2014-03-01

Research on and solution of the problem of managing the innovation process at an enterprise (UIPP) demand a dynamic economic-mathematical model that accounts for control actions, uncontrolled parameters (risks, modeling errors, etc.), and a deficit of information. Existing approaches to similar problems are generally based on static models and use stochastic modeling, which requires knowledge of the probabilistic characteristics of the key model parameters and special conditions on the realization of the considered process. Notably, the strict conditions necessary for the use of stochastic modeling are rarely met in practice. The article proposes a deterministic approach, formulating the initial problem as a dynamic problem of program minimax control (optimization of the guaranteed result) of the innovation process over a fixed time horizon, taking risks into account. Risks in the UIPP system are understood as factors that affect the results of the considered processes negatively or catastrophically. To solve the problem of minimax program control of the innovation process under risk, a method is proposed that reduces it to solving a finite number of problems of linear and convex mathematical programming and a problem of discrete optimization. The proposed method makes it possible to develop effective numerical procedures for computer modeling of the dynamics of the considered problem, to construct a minimax program control of the innovation process, and to obtain the optimal guaranteed result. The results presented in the article are based on the research [2, 3, 7-10] and can be used for economic-mathematical modeling and the solution of other problems of forecasting and process optimization under a deficit of information and under risk, and also for the development of the corresponding software and

  12. A Model for Teaching Information Design

    Science.gov (United States)

    Pettersson, Rune

    2011-01-01

    The author presents his views on the teaching of information design. The starting point includes some general aspects of teaching and learning. The multidisciplinary structure and content of information design as well as the combined practical and theoretical components influence studies of the discipline. Experiences from working with a model for…

  13. Measurements and Information in Spin Foam Models

    CERN Document Server

    Garcia-Islas, J Manuel

    2012-01-01

We present a problem relating measurements and information theory in spin foam models. In the three-dimensional case of quantum gravity we can compute probabilities of spin network graphs and study the behaviour of the Shannon entropy associated with the corresponding information. We present a general definition, compute the Shannon entropy of some examples, and find some interesting inequalities.
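The Shannon entropy the abstract attaches to spin-network probabilities is the standard information-theoretic one; a minimal illustration (the probability distributions below are made up for demonstration, not the paper's spin-foam amplitudes):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over 4 outcomes carries the maximum 2 bits.
uniform = [0.25] * 4
# A sharply peaked distribution carries far less information.
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(peaked))
```

The inequalities mentioned in the abstract are bounds on exactly this quantity as the underlying graph probabilities vary.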

  15. The Information Service Evaluation (ISE) Model

    Directory of Open Access Journals (Sweden)

    Laura Schumann

    2014-06-01

Information services are an inherent part of our everyday life. Especially since ubiquitous cities are being developed all over the world, their number is increasing ever faster. They aim at facilitating the production of and access to needed information, and are supposed to make life easier. Many different evaluation models (among others, TAM, TAM 2, TAM 3, UTAUT, and MATH) have been developed to measure the quality and acceptance of these services. Still, they only consider subareas of the whole concept that an information service represents. As a holistic and comprehensive approach, the ISE Model studies five dimensions that influence adoption, use, impact, and diffusion of an information service: information service quality, information user, information acceptance, information environment, and time. All these aspects have a great impact on the final grading and on the success (or failure) of the service. Our model combines approaches that study subjective impressions of users (e.g., the perceived service quality) with user-independent, more objective approaches (e.g., the degree of gamification of a system). Furthermore, we adopt results of network economics, especially the "Success breeds success" principle.

  16. SU-E-T-532: Validation and Implementation of Model-Based Patient Specific Quality Assurance Using Mobius3D and MobiusFX in a Clinical Setting

    Energy Technology Data Exchange (ETDEWEB)

    Galavis, P; Osterman, K; Jozsef, G; Becker, S; Dewyngaert, K [NYU Medical Center, NY, NY (United States)

    2014-06-01

Purpose: This work carries out the commissioning and validation of the Mobius3D and MobiusFX software tools, which can replace time-consuming measurement-based patient-specific quality assurance (PSQA). Methods: The beam model supplied by Mobius3D was validated against measured beam data from a 21EX linac. Complex patient VMAT plans created in the Eclipse treatment planning system (TPS) were used to test the consistency between Mobius3D (which calculates dose using the patient image and field data) and MobiusFX (which calculates dose using treatment dynalog files). Dose difference and gamma analysis (3%/3 mm) between Mobius3D and MobiusFX were used to assess treatment plan and treatment delivery consistency. An end-to-end test was performed to validate Mobius3D and MobiusFX against ion chamber measurements. The effect of the dosimetric leaf gap (DLG) on Mobius3D dose calculation was additionally investigated. Results: Mobius3D beam model parameters matched our measured beam data within 1%-3%. A comparison of Mobius3D and MobiusFX dose matrices for VMAT prostate plans showed a (0.33±0.07)% mean dose difference, with gamma pass rates above 95%. The end-to-end test showed dose differences of 1% between Mobius3D and MobiusFX. The dependence of Mobius3D dose calculation on the DLG was explored by introducing a ±0.5 mm change in the default DLG value; this change resulted in agreement differences above 2%. Conclusion: Use of reference beam data would appear to speed up the commissioning process for the clinical implementation of Mobius3D. However, careful consideration is required when interpreting the information provided by the software, since large dose variations can be seen when the proper parameters are not optimized. The planned and delivered doses were in good agreement; hence MobiusFX has the potential to significantly speed up the PSQA process while also verifying treatment parameters that cannot be checked with measurement-based PSQA.
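The 3%/3 mm gamma analysis used above to compare dose matrices combines a dose-difference criterion with a distance-to-agreement criterion. A one-dimensional sketch conveys the idea (the profiles and grid spacing are invented; clinical tools such as Mobius3D operate on full 3-D dose grids):

```python
import math

def gamma_pass_rate(ref, evl, spacing_mm, dd_percent=3.0, dta_mm=3.0):
    """1-D global gamma analysis: percent of reference points with gamma <= 1."""
    dd = dd_percent / 100.0 * max(ref)  # global dose-difference criterion
    passed = 0
    for i, dose_r in enumerate(ref):
        # gamma^2 = min over evaluated points of (distance/DTA)^2 + (dose diff/DD)^2
        gamma_sq = min(
            ((i - j) * spacing_mm / dta_mm) ** 2 + ((dose_e - dose_r) / dd) ** 2
            for j, dose_e in enumerate(evl)
        )
        if math.sqrt(gamma_sq) <= 1.0:
            passed += 1
    return 100.0 * passed / len(ref)

reference = [0.0, 0.5, 1.0, 2.0, 1.0, 0.5, 0.0]
evaluated = [0.0, 0.5, 1.02, 2.01, 1.0, 0.49, 0.0]
print(gamma_pass_rate(reference, evaluated, spacing_mm=1.0))  # 100.0
```

A point fails only when no nearby evaluated point agrees in dose, which is why gamma is more forgiving than a pure dose-difference map in steep gradients.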

  17. Information Literacy for Health Professionals: Teaching Essential Information Skills with the Big6 Information Literacy Model

    Science.gov (United States)

    Santana Arroyo, Sonia

    2013-01-01

    Health professionals frequently do not possess the necessary information-seeking abilities to conduct an effective search in databases and Internet sources. Reference librarians may teach health professionals these information and technology skills through the Big6 information literacy model (Big6). This article aims to address this issue. It also…

  18. BIM. Building Information Model. Special issue; BIM. Building Information Model. Themanummer

    Energy Technology Data Exchange (ETDEWEB)

    Van Gelder, A.L.A. [Arta and Consultancy, Lage Zwaluwe (Netherlands); Van den Eijnden, P.A.A. [Stichting Marktwerking Installatietechniek, Zoetermeer (Netherlands); Veerman, J.; Mackaij, J.; Borst, E. [Royal Haskoning DHV, Nijmegen (Netherlands); Kruijsse, P.M.D. [Wolter en Dros, Amersfoort (Netherlands); Buma, W. [Merlijn Media, Waddinxveen (Netherlands); Bomhof, F.; Willems, P.H.; Boehms, M. [TNO, Delft (Netherlands); Hofman, M.; Verkerk, M. [ISSO, Rotterdam (Netherlands); Bodeving, M. [VIAC Installatie Adviseurs, Houten (Netherlands); Van Ravenswaaij, J.; Van Hoven, H. [BAM Techniek, Bunnik (Netherlands); Boeije, I.; Schalk, E. [Stabiplan, Bodegraven (Netherlands)

    2012-11-15

A series of 14 articles illustrates the various aspects of the Building Information Model (BIM). The essence of BIM is to capture information about the building process and the building product.

  19. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution-submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and the heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for the FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.

  20. Models of Financial Market Information Ecology

    Science.gov (United States)

    Challet, Damien

I discuss a new simple framework that allows a more realistic modelling of speculation. The resulting model features explicit position holding and contagion between predictability patterns, allows for an explicit measure of market inefficiency, and substantiates the use of the minority game to study information ecology in financial markets.
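The minority game referred to in the abstract can be simulated in a few lines. In this toy version (all parameters are arbitrary choices, not the author's), each agent holds two random strategy tables mapping the recent history of winning sides to an action and plays whichever table has scored better so far:

```python
import random

def minority_game(n_agents=101, memory=3, n_strategies=2, rounds=200, seed=42):
    """Toy minority game: agents score their strategy tables and play the best one."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # Each strategy is a lookup table: history index -> action (0 or 1).
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0
    attendance = []  # number of agents choosing side 1 each round
    for _ in range(rounds):
        actions = [
            strategies[a][max(range(n_strategies), key=lambda s: scores[a][s])][history]
            for a in range(n_agents)
        ]
        ones = sum(actions)
        minority = 1 if ones < n_agents / 2 else 0
        # Reward every strategy that would have picked the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        attendance.append(ones)
        history = (history * 2 + minority) % n_hist  # keep last `memory` outcomes
    return attendance

att = minority_game()
print(min(att), max(att))  # attendance fluctuates around n_agents / 2
```

The fluctuations of attendance around N/2 are the standard observable; information ecology studies how predictable patterns in the history are exploited and destroyed by the agents.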

  1. Multi-dimensional indoor location information model

    NARCIS (Netherlands)

    Xiong, Q.; Zhu, Q.; Zlatanova, S.; Huang, L.; Zhou, Y.; Du, Z.

    2013-01-01

    Aiming at the increasing requirements of seamless indoor and outdoor navigation and location service, a Chinese standard of Multidimensional Indoor Location Information Model is being developed, which defines ontology of indoor location. The model is complementary to 3D concepts like CityGML and

  2. Millennial Students' Mental Models of Information Retrieval

    Science.gov (United States)

    Holman, Lucy

    2009-01-01

    This qualitative study examines first-year college students' online search habits in order to identify patterns in millennials' mental models of information retrieval. The study employed a combination of modified contextual inquiry and concept mapping methodologies to elicit students' mental models. The researcher confirmed previously observed…

  3. Adaptable Information Models in the Global Change Information System

    Science.gov (United States)

    Duggan, B.; Buddenberg, A.; Aulenbach, S.; Wolfe, R.; Goldstein, J.

    2014-12-01

The US Global Change Research Program has sponsored the creation of the Global Change Information System (GCIS) to provide a web-based source of accessible, usable, and timely information about climate and global change for use by scientists, decision makers, and the public. The GCIS played multiple roles during the assembly and release of the Third National Climate Assessment. It provided human and programmable interfaces, relational and semantic representations of information, and discrete identifiers for various types of resources, which could then be manipulated by a distributed team with a wide range of specialties. The GCIS also served as a scalable backend for the web-based version of the report. In this talk, we discuss the infrastructure decisions made during the design and deployment of the GCIS, as well as ongoing work to adapt to new types of information. Both a constrained relational database and an open-ended triple store are used to ensure data integrity while maintaining fluidity. Using natural primary keys allows identifiers to propagate through both models. Changing identifiers are accommodated through fine-grained auditing and explicit mappings to external lexicons. A practical RESTful API is used whose endpoints are also URIs in an ontology. Both the relational schema and the ontology are malleable, and stability is ensured through test-driven development and continuous integration testing using modern open source techniques. Content is also validated through continuous testing techniques. A high degree of scalability is achieved through caching.

  4. A Study on Constructing National Policy System of Security Assurance of Digital Academic Information Resources

    Institute of Scientific and Technical Information of China (English)

    刘万国; 周秀霞; 杨雨师

    2016-01-01

As digital academic information resources have become the main form of knowledge and culture communication, utilization, and preservation worldwide, national policy on the security assurance of digital academic information resources has become an important research topic. This article discusses the security risks facing digital academic information resources, uses comparative analysis to examine domestic and foreign policies on the security assurance of digital academic information resources, and constructs a national policy system framework for China.

  5. An information criterion for marginal structural models.

    Science.gov (United States)

    Platt, Robert W; Brookhart, M Alan; Cole, Stephen R; Westreich, Daniel; Schisterman, Enrique F

    2013-04-15

Marginal structural models were developed as a semiparametric alternative to the G-computation formula to estimate causal effects of exposures. In practice, these models are often specified using parametric regression models. As such, the usual conventions regarding regression model specification apply. This paper outlines strategies for marginal structural model specification and considerations for the functional form of the exposure metric in the final structural model. We propose a quasi-likelihood information criterion adapted from use in generalized estimating equations. We evaluate the properties of our proposed information criterion using a limited simulation study. We illustrate our approach using two empirical examples. In the first example, we use data from a randomized breastfeeding promotion trial to estimate the effect of breastfeeding duration on infant weight at 1 year. In the second example, we use data from two prospective cohort studies to estimate the effect of highly active antiretroviral therapy on CD4 count in an observational cohort of HIV-infected men and women. The marginal structural model specified should reflect the scientific question being addressed but can also assist in exploration of other plausible and closely related questions. In marginal structural models, as in any regression setting, correct inference depends on correct model specification. Our proposed information criterion provides a formal method for comparing model fit for different specifications.
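The weighting step that precedes fitting a marginal structural model can be sketched on a toy example with a binary exposure A and a single binary confounder L (the frequencies are invented; real analyses estimate the probabilities with regression models rather than raw frequencies):

```python
from collections import Counter

def stabilized_weights(data):
    """Stabilized IPT weights P(A=a) / P(A=a | L=l), from empirical frequencies."""
    n = len(data)
    count_a = Counter(a for a, l in data)        # marginal exposure counts
    count_al = Counter(data)                     # joint (exposure, confounder) counts
    count_l = Counter(l for a, l in data)        # confounder stratum sizes
    return [(count_a[a] / n) / (count_al[(a, l)] / count_l[l]) for a, l in data]

# (exposure A, confounder L) pairs for a toy cohort where L predicts A.
data = [(1, 1)] * 30 + [(0, 1)] * 10 + [(1, 0)] * 10 + [(0, 0)] * 30
w = stabilized_weights(data)
# In the weighted pseudo-population the A-L association is removed;
# stabilized weights sum to approximately the sample size.
print(round(sum(w), 1))  # 80.0
```

A weighted regression of the outcome on the exposure in this pseudo-population is then the marginal structural model, and the paper's information criterion compares different functional forms for that exposure term.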

  6. Optimal Control Design with Limited Model Information

    CERN Document Server

    Farokhi, F; Johansson, K H

    2011-01-01

We introduce the family of limited model information control design methods, which construct controllers by accessing the plant's model in a constrained way, according to a given design graph. We investigate the achievable closed-loop performance of discrete-time linear time-invariant plants under a separable quadratic cost performance measure with structured static state-feedback controllers. We find the optimal control design strategy (in terms of the competitive ratio and domination metrics) when the control designer has access to the local model information and the global interconnection structure of the plant-to-be-controlled. Finally, we study the trade-off between the amount of model information exploited by a control design method and the best closed-loop performance (in terms of the competitive ratio) of controllers it can produce.

  7. Advancing an Information Model for Environmental Observations

    Science.gov (United States)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Hooper, R. P.; Lehnert, K. A.; Schreuders, K.; Tarboton, D. G.; Valentine, D. W.; Zaslavsky, I.

    2011-12-01

    Observational data are fundamental to hydrology and water resources, and the way they are organized, described, and shared either enables or inhibits the analyses that can be performed using the data. The CUAHSI Hydrologic Information System (HIS) project is developing cyberinfrastructure to support hydrologic science by enabling better access to hydrologic data. HIS is composed of three major components. HydroServer is a software stack for publishing time series of hydrologic observations on the Internet as well as geospatial data using standards-based web feature, map, and coverage services. HydroCatalog is a centralized facility that catalogs the data contents of individual HydroServers and enables search across them. HydroDesktop is a client application that interacts with both HydroServer and HydroCatalog to discover, download, visualize, and analyze hydrologic observations published on one or more HydroServers. All three components of HIS are founded upon an information model for hydrologic observations at stationary points that specifies the entities, relationships, constraints, rules, and semantics of the observational data and that supports its data services. Within this information model, observations are described with ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and used, and to provide traceable heritage from raw measurements to useable information. Physical implementations of this information model include the Observations Data Model (ODM) for storing hydrologic observations, Water Markup Language (WaterML) for encoding observations for transmittal over the Internet, the HydroCatalog metadata catalog database, and the HydroDesktop data cache database. The CUAHSI HIS and this information model have now been in use for several years, and have been deployed across many different academic institutions as well as across several national agency data repositories. Additionally, components of the HIS

  8. Information retrieval models foundations and relationships

    CERN Document Server

    Roelleke, Thomas

    2013-01-01

    Information Retrieval (IR) models are a core component of IR research and IR systems. The past decade brought a consolidation of the family of IR models, which by 2000 consisted of relatively isolated views on TF-IDF (Term-Frequency times Inverse-Document-Frequency) as the weighting scheme in the vector-space model (VSM), the probabilistic relevance framework (PRF), the binary independence retrieval (BIR) model, BM25 (Best-Match Version 25, the main instantiation of the PRF/BIR), and language modelling (LM). Also, the early 2000s saw the arrival of divergence from randomness (DFR).Regarding in
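The TF-IDF weighting that the book takes as a starting point can be computed directly; a minimal sketch on a toy corpus, using raw term frequency and an unsmoothed inverse document frequency (real systems vary these conventions):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Per-document TF-IDF weights: tf(t, d) * log(N / df(t))."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d.split()))  # document frequency
    weights = []
    for d in docs:
        tf = Counter(d.split())
        weights.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return weights

docs = ["information retrieval model",
        "probabilistic retrieval framework",
        "language model for retrieval"]
w = tf_idf(docs)
# "retrieval" appears in every document, so its IDF (and hence weight) is zero.
print(w[0]["retrieval"], round(w[0]["information"], 3))
```

BM25 and language-modelling scores replace this weight with saturated and probabilistically smoothed variants, which is exactly the family of relationships the book consolidates.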

  9. TUNS/TCIS information model/process model

    Science.gov (United States)

    Wilson, James

    1992-01-01

An Information Model comprises graphical and textual notation suitable for describing and defining the problem domain - in our case, TUNS or TCIS. The model focuses on the real world under study. It identifies what is in the problem and organizes the data into a formal structure for documentation and communication purposes. The Information Model is composed of an Entity Relationship Diagram (ERD) and a Data Dictionary component. The combination of these components provides an easy-to-understand methodology for expressing the entities in the problem space, the relationships between entities, and the characteristics (attributes) of the entities. This approach is the first step in information system development. The Information Model identifies the complete set of data elements processed by TUNS. This representation provides a conceptual view of TUNS from the perspective of entities, data, and relationships. The Information Model reflects the business practices and real-world entities that users must deal with.

  10. RAVEN Quality Assurance Activities

    Energy Technology Data Exchange (ETDEWEB)

    Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  11. Performance assurance program plan

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, B.H.

    1997-11-06

B and W Protec, Inc. (BWP) is responsible for implementing the Performance Assurance Program for the Project Hanford Management Contract (PHMC) in accordance with DOE Order 470.1, Safeguards and Security Program (DOE 1995a). The Performance Assurance Program applies to safeguards and security (SAS) systems and their essential components (equipment, hardware, administrative procedures, Protective Force personnel, and other personnel) in direct support of Category I and II special nuclear material (SNM) protection. Performance assurance includes several Hanford Site activities that conduct performance, acceptance, operability, effectiveness, and validation tests. These activities encompass areas of training, exercises, quality assurance, conduct of operations, total quality management, self assessment, classified matter protection and control, emergency preparedness, and corrective actions tracking and trending. The objective of the Performance Assurance Program is to capture the critical data of the tests, training, etc., in a cost-effective, manageable program that reflects the overall effectiveness of the program while minimizing operational impacts. To aid in achieving this objective, BWP will coordinate the Performance Assurance Program for Fluor Daniel Hanford, Inc. (FDH) and serve as the central point for data collection.

  12. A linguistic model of informed consent.

    Science.gov (United States)

    Marta, J

    1996-02-01

    The current disclosure model of informed consent ignores the linguistic complexity of any act of communication, and the increased risk of difficulties in the special circumstances of informed consent. This article explores, through linguistic analysis, the specificity of informed consent as a speech act, a communication act, and a form of dialogue, following on the theories of J.L. Austin, Roman Jakobson, and Mikhail Bakhtin, respectively. In the proposed model, informed consent is a performative speech act resulting from a series of communication acts which together constitute a dialogic, polyphonic, heteroglossial discourse. It is an act of speech that results in action being taken after a conversation has happened where distinct individuals, multiple voices, and multiple perspectives have been respected, and convention observed and recognized. It is more meaningful and more ethical for both patient and physician, in all their human facets including their interconnectedness.

  13. Optimal information diffusion in stochastic block models

    CERN Document Server

    Curato, Gianbiagio

    2016-01-01

    We use the linear threshold model to study the diffusion of information on a network generated by the stochastic block model. We focus our analysis on a two community structure where the initial set of informed nodes lies only in one of the two communities and we look for optimal network structures, i.e. those maximizing the asymptotic extent of the diffusion. We find that, constraining the mean degree and the fraction of initially informed nodes, the optimal structure can be assortative (modular), core-periphery, or even disassortative. We then look for minimal cost structures, i.e. those such that a minimal fraction of initially informed nodes is needed to trigger a global cascade. We find that the optimal networks are assortative but with a structure very close to a core-periphery graph, i.e. a very dense community linked to a much more sparsely connected periphery.
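The setting of the abstract, linear threshold diffusion seeded in one community of a stochastic block model, can be simulated directly. The sketch below uses invented parameters and a fixed fractional threshold for every node:

```python
import random

def sbm(sizes, p_in, p_out, rng):
    """Undirected stochastic block model graph as adjacency sets."""
    n = sum(sizes)
    block = [b for b, s in enumerate(sizes) for _ in range(s)]
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if block[i] == block[j] else p_out
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def linear_threshold(adj, seeds, theta):
    """A node activates once a fraction >= theta of its neighbours is active."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v in range(len(adj)):
            if v not in active and adj[v]:
                if len(adj[v] & active) / len(adj[v]) >= theta:
                    active.add(v)
                    changed = True
    return active

rng = random.Random(7)
adj = sbm([50, 50], p_in=0.2, p_out=0.02, rng=rng)
seeds = range(10)  # initially informed nodes, all in the first community
final = linear_threshold(adj, seeds, theta=0.2)
print(len(final))  # asymptotic extent of the diffusion
```

Varying `p_in` versus `p_out` moves the graph between assortative and disassortative regimes, which is the knob the paper optimizes for maximal cascade extent.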

  14. Information modeling system for blast furnace control

    Science.gov (United States)

    Spirin, N. A.; Gileva, L. Y.; Lavrov, V. V.

    2016-09-01

Modern iron and steel works are, as a rule, equipped with powerful distributed control systems (DCS) and databases. Implementation of a DCS solves the problems of storage, control, protection, entry, editing, and retrieval of information, as well as generation of required reporting data. The most advanced and promising approach is to use decision support information technologies based on a complex of mathematical models. A model decision support system for control of blast furnace smelting has been designed and is in operation. The basis of the model system is a complex of mathematical models created using the principle of natural mathematical modeling. This principle provides for the construction of mathematical models on two levels. The first-level model is a basic state model, which makes it possible to assess the vector of system parameters using field data and blast furnace operation results. It is also used to calculate the adjustment (adaptation) coefficients of the predictive block of the system. The second-level model is a predictive model designed to assess the design parameters of the blast furnace process when melting conditions change relative to the current state. The tasks for which software has been developed are described. Characteristics of the main subsystems of the blast furnace process as an object of modeling and control, namely the thermal state of the furnace and the blast, gas-dynamic, and slag conditions of blast furnace smelting, are presented.

  15. Automatic Building Information Model Query Generation

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yufei; Yu, Nan; Ming, Jiang; Lee, Sanghoon; DeGraw, Jason; Yen, John; Messner, John I.; Wu, Dinghao

    2015-12-01

Energy-efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges of data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease the management of building design information. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. Through a case study using multi-zone airflow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.

  16. Research on Information Security Assurance System under the Environment of Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    刘晔

    2015-01-01

Against the background of cloud computing technology being widely applied in all walks of life, how to build an information security assurance system for the cloud computing environment, ensuring the reliability, accuracy, and security of data and the confidentiality of user information, has become a problem urgently in need of a solution. Based on an analysis of the characteristics of cloud computing, this article argues that an information security assurance system for the cloud computing environment should comprise three levels: the cloud service provider level, the user level, and the level of state laws and regulations.

  17. Informing mechanistic toxicology with computational molecular models.

    Science.gov (United States)

    Goldsmith, Michael R; Peterson, Shane D; Chang, Daniel T; Transue, Thomas R; Tornero-Velez, Rogelio; Tan, Yu-Mei; Dary, Curtis C

    2012-01-01

    Computational molecular models of chemicals interacting with biomolecular targets provides toxicologists a valuable, affordable, and sustainable source of in silico molecular level information that augments, enriches, and complements in vitro and in vivo efforts. From a molecular biophysical ansatz, we describe how 3D molecular modeling methods used to numerically evaluate the classical pair-wise potential at the chemical/biological interface can inform mechanism of action and the dose-response paradigm of modern toxicology. With an emphasis on molecular docking, 3D-QSAR and pharmacophore/toxicophore approaches, we demonstrate how these methods can be integrated with chemoinformatic and toxicogenomic efforts into a tiered computational toxicology workflow. We describe generalized protocols in which 3D computational molecular modeling is used to enhance our ability to predict and model the most relevant toxicokinetic, metabolic, and molecular toxicological endpoints, thereby accelerating the computational toxicology-driven basis of modern risk assessment while providing a starting point for rational sustainable molecular design.

  18. Geomagnetic Information Model for the Year 2013

    Directory of Open Access Journals (Sweden)

    Mario Brkić

    2012-12-01

    Full Text Available The finalization of the survey of the Basic Geomagnetic Network of the Republic of Croatia (BGNRC) and the completion of geomagnetic information models for the Institute for Research and Development of Defence Systems of the Ministry of Defence and the State Geodetic Administration (e.g. Brkić M., E. Jungwirth, D. Matika and Ž. Bačić, 2012, Geomagnetic Information and Safety, 3rd Conference of Croatian National Platform for Disaster Risk Reduction, National Protection and Rescue Directorate, Zagreb) were followed in 2012 by confirmation of the validity of the GI2012 predictive model through geomagnetic observations under quiet conditions. The differences between the measured and modelled declination were found to be within the expected errors of the model. It should be pointed out that this was the first successful implementation of night surveying (especially suitable for geomagnetic surveys of airports) in the Republic of Croatia.

  19. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishment of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  20. Quality Assurance in Higher Education in Zimbabwe

    Science.gov (United States)

    Garwe, Evelyn Chiyevo

    2014-01-01

    The purpose of this paper is to furnish local and global stakeholders with detailed information regarding the development and current status of quality assurance in the Zimbabwean higher education sector. The study used document analysis, observation and interviews with key informants as sources of data. This paper addresses the dearth of…

  1. Information filtering via collaborative user clustering modeling

    Science.gov (United States)

    Zhang, Chu-Xu; Zhang, Zi-Ke; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2014-02-01

    The past few years have witnessed the great success of recommender systems, which can significantly help users find personalized items in the information era. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize standard MF by integrating a user clustering regularization term. Our model considers not only the user-item rating information but also the user information. In addition, we compare the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM) and standard MF. Experimental results on two real-world datasets, MovieLens 1M and MovieLens 100k, show that our method performs better than the other three methods in the accuracy of recommendation.
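The clustering-regularized MF objective described above can be sketched in a few lines of NumPy: the standard squared-error loss plus an L2 term, plus a term that pulls each user's latent factors toward the centroid of that user's cluster. The rating matrix, cluster assignment, and hyperparameters below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 6-user x 5-item rating matrix; 0 marks an unobserved cell.
R = np.array([
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 1],
    [5, 0, 1, 1, 2],
    [1, 0, 5, 4, 5],
    [0, 1, 4, 5, 4],
    [1, 2, 5, 0, 4],
], dtype=float)
mask = R > 0

k, lam, beta, lr = 2, 0.05, 0.1, 0.02   # factors, L2 weight, cluster weight, step size
P = 0.1 * rng.standard_normal((6, k))   # user latent factors
Q = 0.1 * rng.standard_normal((5, k))   # item latent factors
clusters = np.array([0, 0, 0, 1, 1, 1]) # hypothetical user-cluster assignment

for _ in range(2000):
    E = mask * (R - P @ Q.T)            # prediction error on observed ratings only
    # Centroid of each user cluster: the regularization target for its members
    C = np.vstack([P[clusters == c].mean(axis=0) for c in (0, 1)])
    P += lr * (E @ Q - lam * P - beta * (P - C[clusters]))
    Q += lr * (E.T @ P - lam * Q)

rmse = np.sqrt(((mask * (R - P @ Q.T)) ** 2).sum() / mask.sum())
print(round(float(rmse), 3))
```

With beta set to 0 this reduces to standard MF; the clustering term trades a little training fit for factors that are smoothed within each user group.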

  2. The Consumer Health Information System Adoption Model.

    Science.gov (United States)

    Monkman, Helen; Kushniruk, Andre W

    2015-01-01

    Derived from overlapping concepts in consumer health, a consumer health information system refers to any of the broad range of applications, tools, and educational resources developed to empower consumers with knowledge, techniques, and strategies, to manage their own health. As consumer health information systems become increasingly popular, it is important to explore the factors that impact their adoption and success. Accumulating evidence indicates a relationship between usability and consumers' eHealth Literacy skills and the demands consumer HISs place on their skills. Here, we present a new model called the Consumer Health Information System Adoption Model, which depicts both consumer eHealth literacy skills and system demands on eHealth literacy as moderators with the potential to affect the strength of relationship between usefulness and usability (predictors of usage) and adoption, value, and successful use (actual usage outcomes). Strategies for aligning these two moderating factors are described.

  3. Asset Condition, Information Systems and Decision Models

    CERN Document Server

    Willett, Roger; Brown, Kerry; Mathew, Joseph

    2012-01-01

    Asset Condition, Information Systems and Decision Models, is the second volume of the Engineering Asset Management Review Series. The manuscripts provide examples of implementations of asset information systems as well as some practical applications of condition data for diagnostics and prognostics. The increasing trend is towards prognostics rather than diagnostics, hence the need for assessment and decision models that promote the conversion of condition data into prognostic information to improve life-cycle planning for engineered assets. The research papers included here serve to support the on-going development of Condition Monitoring standards. This volume comprises selected papers from the 1st, 2nd, and 3rd World Congresses on Engineering Asset Management, which were convened under the auspices of ISEAM in collaboration with a number of organisations, including CIEAM Australia, Asset Management Council Australia, BINDT UK, and Chinese Academy of Sciences, Beijing University of Chemical Technology, Chin...

  4. Engaging Theories and Models to Inform Practice

    Science.gov (United States)

    Kraus, Amanda

    2012-01-01

    Helping students prepare for the complex transition to life after graduation is an important responsibility shared by those in student affairs and others in higher education. This chapter explores theories and models that can inform student affairs practitioners and faculty in preparing students for life after college. The focus is on roles,…

  5. Higher-dimensional modelling of geographic information

    NARCIS (Netherlands)

    Arroyo Ohori, G.A.K.

    2016-01-01

    Our world is three-dimensional and complex, continuously changing over time and appearing different at different scales. Yet, when we model it in a computer using Geographic Information Systems (GIS), we mostly use 2D representations, which essentially consist of linked points, lines and polygons. T

  6. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives...

  7. A focused information criterion for graphical models

    NARCIS (Netherlands)

    Pircalabelu, E.; Claeskens, G.; Waldorp, L.

    2015-01-01

    A new method for model selection for Gaussian Bayesian networks and Markov networks, with extensions towards ancestral graphs, is constructed to have good mean squared error properties. The method is based on the focused information criterion, and offers the possibility of fitting individual-tailore

  8. Data Model Management for Space Information Systems

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. 
This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool

  10. Modeling decisions information fusion and aggregation operators

    CERN Document Server

    Torra, Vicenc

    2007-01-01

    Information fusion techniques and aggregation operators produce the most comprehensive, specific datum about an entity using data supplied from different sources, thus enabling us to reduce noise, increase accuracy, summarize and extract information, and make decisions. These techniques are applied in fields such as economics, biology and education, while in computer science they are particularly used in fields such as knowledge-based systems, robotics, and data mining. This book covers the underlying science and application issues related to aggregation operators, focusing on tools used in practical applications that involve numerical information. Starting with detailed introductions to information fusion and integration, measurement and probability theory, fuzzy sets, and functional equations, the authors then cover the following topics in detail: synthesis of judgements, fuzzy measures, weighted means and fuzzy integrals, indices and evaluation methods, model selection, and parameter extraction. The method...

  11. Pakistan Economy DSGE Model with Informality

    OpenAIRE

    2012-01-01

    In this paper we develop a closed economy DSGE model of Pakistan with informality both in the labor and product markets. We try to remain consistent with the micro-foundations of Pakistan’s economy for the purpose of estimation of the model parameters. However a couple of them have been calibrated to match the long-run features of the Pakistan economy. We introduce exogenous shocks of technology, fiscal spending and nominal interest rate in our model. Despite having to rely on annual data our...

  12. Information Service Model with Mobile Agent Supported

    Institute of Scientific and Technical Information of China (English)

    邹涛; 王继成; 张福炎

    2000-01-01

    Mobile Agent is a kind of novel agent technology characterized by mobile, intelligent, parallel and asynchronous computing. In this paper, a new information service model that adopts mobile agent technology is introduced first, and then an experimental system, the DOLTRIA system, implemented on the basis of the model is described. The DOLTRIA system, implemented within a WWW framework and in Java, can search for relevant HTML documents on a set of Web servers. The results of experiments show that performance improvement can be achieved by this model, and that both elapsed time and network traffic are reduced significantly.

  13. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling, and if we one day have a certified tools list, any tool that does not meet essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement to enable the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography and spatial analysis. In the past two decades, much effort has been made to research uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are set out: (a) from determinedness- to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  15. Just in Time Assurance

    Science.gov (United States)

    2010-04-01

    Just in Time Assurance. Jim Alves-Foss, PhD, University of Idaho, Director, Center for Secure and Dependable Computing; W. Mark Vanfleet. This presentation discusses how practical and affordable recertification can become the norm instead of the rare exception.

  16. Information as a Measure of Model Skill

    Science.gov (United States)

    Roulston, M. S.; Smith, L. A.

    2002-12-01

    Physicist Paul Davies has suggested that rather than the quest for laws that approximate ever more closely to "truth", science should be regarded as the quest for compressibility. The goodness of a model can be judged by the degree to which it allows us to compress data describing the real world. The "logarithmic scoring rule" is a method for evaluating probabilistic predictions of reality that turns this philosophical position into a practical means of model evaluation. This scoring rule measures the information deficit or "ignorance" of someone in possession of the prediction. A more applied viewpoint is that the goodness of a model is determined by its value to a user who must make decisions based upon its predictions. Any form of decision making under uncertainty can be reduced to a gambling scenario. Kelly showed that the value of a probabilistic prediction to a gambler pursuing the maximum return on their bets depends on their "ignorance", as determined from the logarithmic scoring rule, thus demonstrating a one-to-one correspondence between data compression and gambling returns. Thus information theory provides a way to think about model evaluation, that is both philosophically satisfying and practically oriented. P.C.W. Davies, in "Complexity, Entropy and the Physics of Information", Proceedings of the Santa Fe Institute, Addison-Wesley 1990 J. Kelly, Bell Sys. Tech. Journal, 35, 916-926, 1956.
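The logarithmic scoring rule summarized above is simple to compute: the "ignorance" of a probabilistic forecast is minus the log (base 2) of the probability it assigned to what actually happened, averaged over cases. A minimal sketch, with the forecasts and outcomes invented for illustration:

```python
import numpy as np

def ignorance(forecasts, outcomes):
    """Mean logarithmic score in bits: -log2 of the probability each
    forecast assigned to the outcome that actually occurred."""
    p = np.array([f[o] for f, o in zip(forecasts, outcomes)])
    return float(-np.log2(p).mean())

# Hypothetical binary event (1 = event occurred) over five cases
outcomes = [1, 1, 1, 1, 0]
sharp = [{0: 0.2, 1: 0.8}] * 5   # confident forecaster, right 4 times out of 5
vague = [{0: 0.5, 1: 0.5}] * 5   # uninformative coin-flip forecast

print(ignorance(sharp, outcomes))  # ~0.72 bits per forecast
print(ignorance(vague, outcomes))  # exactly 1 bit per forecast
```

The difference in ignorance between two forecasters is, per Kelly, the expected difference in log-growth rate of a gambler's bankroll when betting on their respective forecasts.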

  17. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    The number of credits per course has often aligned with the number of hours of lecture per week, but online and other non-traditional formats are becoming increasingly common... A development organization hires people to be developers and they are then selected to be SPOCs (single points of contact); ergo, assurance is a collateral duty, not their main job.

  18. FROM PHYSICAL BENCHMARKS TO MENTAL BENCHMARKS: A Four Dimensions Dynamic Model to Assure the Quality of Instructional Activities in Electronic and Virtual Learning Environments

    Directory of Open Access Journals (Sweden)

    Hamdy AHMED ABDELAZIZ

    2013-04-01

    Full Text Available The objective of this paper was to develop a four dimensions dynamic model for designing instructional activities appropriate to electronic and virtual learning environments. The suggested model is guided by learning principles of the cognitivist, constructivist, and connectivist learning theories in order to help online learners build and acquire meaningful knowledge and experiences. The proposed model consists of four dynamic dimensions: cognitive presence activities, psychological presence activities, social presence activities, and mental presence activities. Cognitive presence activities refer to the learner's ability to form a cognitive vision regarding the content of learning; this vision is the starting point for constructing meaningful understanding. Psychological presence activities refer to the learner's ability to construct self-awareness and trustworthiness, which acts as a psychological schema that decreases the load of learning at a distance. Social presence activities refer to the learner's ability to share knowledge with others so as to construct a community of practice and assure a global understanding of learning. Finally, mental presence activities refer to the learner's ability to construct mental models that represent knowledge creation, helping learners make learning outcomes and experiences transferable. Applying the proposed model will improve the process of developing e-based activities through a set of adaptive and dynamic frameworks and guidelines that meet the online learner's cognitive, psychological, social and mental presence.

  19. Building Information Modelling in Denmark and Iceland

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Jóhannesson, Elvar Ingi

    2013-01-01

    Purpose – The purpose of this paper is to explore the implementation of building information modelling (BIM) in the Nordic countries of Europe, with particular focus on the Danish building industry and with the aim of making use of its experience for the Icelandic building industry. Design/methodology/approach – The research is based on two separate analyses. In the first part, the deployment of information and communication technology (ICT) in the Icelandic building industry is investigated and compared with the other Nordic countries. In the second part, the experience in Denmark from implementing and working … for making standards and guidelines related to BIM. Public building clients are also encouraged to consider initiating projects based on making simple building models of existing buildings in order to introduce the BIM technology to the industry. Icelandic companies are recommended to start implementing BIM...

  20. Information Filtering via Collaborative User Clustering Modeling

    CERN Document Server

    Zhang, Chu-Xu; Yu, Lu; Liu, Chuang; Liu, Hao; Yan, Xiao-Yong

    2013-01-01

    The past few years have witnessed the great success of recommender systems, which can significantly help users find personalized items in the information era. One of the most widely applied recommendation methods is Matrix Factorization (MF). However, most research on this topic has focused on mining the direct relationships between users and items. In this paper, we optimize standard MF by integrating a user clustering regularization term. Our model considers not only the user-item rating information but also takes the user interest into account. We compare the proposed model with three other typical methods: User-Mean (UM), Item-Mean (IM) and standard MF. Experimental results on a real-world dataset, MovieLens, show that our method performs much better than the other three methods in the accuracy of recommendation.

  1. Understanding requirements via natural language information modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.; Becker, S.D.

    1993-07-01

    Information system requirements that are expressed as simple English sentences provide a clear understanding of what is needed between system specifiers, administrators, users, and developers of information systems. The approach used to develop the requirements is the Natural-language Information Analysis Methodology (NIAM). NIAM allows the processes, events, and business rules to be modeled using natural language. The natural language presentation enables the people who deal with the business issues that are to be supported by the information system to describe exactly the system requirements that designers and developers will implement. Computer prattle is completely eliminated from the requirements discussion. An example is presented that is based upon a section of a DOE Order involving nuclear materials management. Where possible, the section is analyzed to specify the process(es) to be done, the event(s) that start the process, and the business rules that are to be followed during the process. Examples, including constraints, are developed. The presentation steps through the modeling process and shows where the section of the DOE Order needs clarification, extensions or interpretations that could provide a more complete and accurate specification.

  2. Mutual information in the Tangled Nature Model

    DEFF Research Database (Denmark)

    Jones, Dominic; Jeldtoft Jensen, Henrik; Sibani, Paolo

    2009-01-01

    We consider the concept of mutual information in ecological networks, and use this idea to analyse the Tangled Nature model of co-evolution. We show that this measure of correlation has two distinct behaviours depending on how we define the network in question: if we consider only the network of viable species this measure increases, whereas for the whole system it decreases. It is suggested that these are complementary behaviours that show how ecosystems can become both more stable and better adapted.

  3. Bayesian information criterion for censored survival models.

    Science.gov (United States)

    Volinsky, C T; Raftery, A E

    2000-03-01

    We investigate the Bayesian Information Criterion (BIC) for variable selection in models for censored survival data. Kass and Wasserman (1995, Journal of the American Statistical Association 90, 928-934) showed that BIC provides a close approximation to the Bayes factor when a unit-information prior on the parameter space is used. We propose a revision of the penalty term in BIC so that it is defined in terms of the number of uncensored events instead of the number of observations. For a simple censored data model, this revision results in a better approximation to the exact Bayes factor based on a conjugate unit-information prior. In the Cox proportional hazards regression model, we propose defining BIC in terms of the maximized partial likelihood. Using the number of deaths rather than the number of individuals in the BIC penalty term corresponds to a more realistic prior on the parameter space and is shown to improve predictive performance for assessing stroke risk in the Cardiovascular Health Study.
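The penalty revision described above is a one-line change to the usual BIC formula: the log term counts uncensored events (deaths) rather than subjects. A minimal sketch, where the likelihood values and the `bic_survival` helper are hypothetical illustrations rather than code from the paper:

```python
import math

def bic_survival(log_partial_lik, n_params, n_events):
    """BIC for a Cox proportional hazards model, with the penalty based
    on the number of uncensored events rather than the number of
    subjects, as proposed by Volinsky and Raftery."""
    return -2.0 * log_partial_lik + n_params * math.log(n_events)

# Hypothetical fits on a study with 200 subjects but only 40 events
m1 = bic_survival(log_partial_lik=-150.2, n_params=3, n_events=40)
m2 = bic_survival(log_partial_lik=-149.0, n_params=5, n_events=40)
print(m1 < m2)  # the event-based penalty for m2's 2 extra parameters,
                # 2 * log(40) ~ 7.4, outweighs its 2.4-point gain in fit
```

Using n = 200 instead would inflate both penalties; in heavily censored data the event count is the effective sample size, which is the motivation for the revision.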

  4. Mission Operations Assurance

    Science.gov (United States)

    Faris, Grant

    2012-01-01

    Integrate the mission operations assurance function into the flight team providing: (1) value added support in identifying, mitigating, and communicating the project's risks and, (2) being an essential member of the team during the test activities, training exercises and critical flight operations.

  5. Quality Assurance Program Description

    Energy Technology Data Exchange (ETDEWEB)

    Halford, Vaughn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ryder, Ann Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Effective May 1, 2017, led by a new executive leadership team, Sandia began operating within a new organizational structure. National Technology and Engineering Solutions of Sandia (Sandia’s) Quality Assurance Program (QAP) was established to assign responsibilities and authorities, define workflow policies and requirements, and provide for the performance and assessment of work.

  6. The Effects of a Computer-Assisted Teaching Material, Designed According to the ASSURE Instructional Design and the ARCS Model of Motivation, on Students' Achievement Levels in a Mathematics Lesson and Their Resulting Attitudes

    Science.gov (United States)

    Karakis, Hilal; Karamete, Aysen; Okçu, Aydin

    2016-01-01

    This study examined the effects that computer-assisted instruction had on students' attitudes toward a mathematics lesson and toward learning mathematics with computer-assisted instruction. The computer software we used was based on the ASSURE Instructional Systems Design and the ARCS Model of Motivation, and the software was designed to teach…

  7. INFORMATIONAL MODEL OF MENTAL ROTATION OF FIGURES

    Directory of Open Access Journals (Sweden)

    V. A. Lyakhovetskiy

    2016-01-01

    Full Text Available Subject of Study. The subject of research is the information structure of objects' internal representations and the operations over them, used by man to solve the problem of mental rotation of figures. To analyze this informational structure we considered not only the classical dependencies of the correct answers on the angle of rotation, but also other dependencies obtained recently in cognitive psychology. Method. The language of technical computing Matlab R2010b was used for developing the information model of the mental rotation of figures. Such model parameters as the number of bits in the internal representation, the error probability in a single bit, the discrete rotation angle, the comparison threshold, and the degree of difference during rotation can be changed. Main Results. The model qualitatively reproduces such psychological dependencies as the linear increase of the time of correct answers and of the number of errors with the angle of rotation for identical figures, and the "flat" dependence of the time of correct answers and the number of errors on the angle of rotation for mirror-like figures. The simulation results suggest that mental rotation is an iterative process of finding a match between the two figures, each step of which can lead to a significant distortion of the internal representation of the stored objects. Matching is carried out within internal representations that have no high invariance to rotation angle. Practical Significance. The results may be useful for understanding the role of learning (including learning with a teacher) in the development of effective information representations and operations on them in artificial intelligence systems.

  8. Blackhole evaporation model without information loss

    CERN Document Server

    Villegas, Kristian Hauser A

    2016-01-01

    A simple model of black hole evaporation without information loss is given. In this model, the black hole is \textit{not} in a specific mass eigenstate as it evaporates; rather, it is in a superposition of various mass eigenstates and is entangled with the radiation. For an astrophysical black hole, the mass distribution is sharply peaked about its average value with a vanishingly small standard deviation, which is consistent with our intuition of a classical object. It is then shown that as the black hole evaporates, the evolution of the closed black hole-radiation system is unitary. This is done by showing that the full density matrix satisfies Tr$\rho^2=1$ at all times. Finally, it is shown that the entanglement entropy, after an initial increase, decreases and approaches zero. These results show that this model of black hole evaporation has no information loss.

  9. BYU Food Quality Assurance Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Quality Assurance Lab is located in the Eyring Science Center in the department of Nutrition, Dietetics, and Food Science. The Quality Assurance Lab has about 10...

  10. Hybrid Information Retrieval Model For Web Images

    CERN Document Server

    Bassil, Youssef

    2012-01-01

    The Big Bang of the Internet in the early 90s dramatically increased the number of images distributed and shared over the web. As a result, image information retrieval systems were developed to index and retrieve image files spread over the Internet. Most of these systems are keyword-based, searching for images by their textual metadata; they are therefore imprecise, since describing an image in a human language is inherently vague. There also exist content-based image retrieval systems, which search for images by their visual information. However, content-based systems are still immature and not very effective, as they suffer from low retrieval recall/precision rates. This paper proposes a new hybrid image information retrieval model for indexing and retrieving web images published in HTML documents. The distinguishing mark of the proposed model is that it is based on both graphical content and textual metadata. The graphical content is denoted by color features and color histogram of ...
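The hybrid scoring idea, combining a color-histogram similarity with a textual metadata match, can be sketched as follows (an illustrative stand-in, not the paper's actual model; the quantization, the Jaccard text measure, and the equal weighting `alpha=0.5` are all our assumptions):

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into bins^3 buckets and normalize."""
    h = Counter(tuple(c * bins // 256 for c in px) for px in pixels)
    total = sum(h.values())
    return {k: v / total for k, v in h.items()}

def histogram_similarity(h1, h2):
    # Histogram intersection, in [0, 1] for normalized histograms
    return sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in set(h1) | set(h2))

def text_similarity(query_terms, metadata_terms):
    # Jaccard overlap between query terms and the image's textual metadata
    q, m = set(query_terms), set(metadata_terms)
    return len(q & m) / len(q | m) if q | m else 0.0

def hybrid_score(query_hist, query_terms, image_hist, image_terms, alpha=0.5):
    """Weighted mix of visual and textual evidence."""
    return (alpha * histogram_similarity(query_hist, image_hist)
            + (1 - alpha) * text_similarity(query_terms, image_terms))
```

Ranking candidate images by `hybrid_score` lets textual evidence compensate for visually ambiguous queries and vice versa, which is the motivation the abstract gives for combining the two.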

  11. An XML-based information model for archaeological pottery

    Institute of Scientific and Technical Information of China (English)

    LIU De-zhi; RAZDAN Anshuman; SIMON Arleyn; BAE Myungsoo

    2005-01-01

    An information model is defined to support sharing scientific information on the Web for archaeological pottery. Apart from non-shape information, such as age, material, etc., the model also comprises shape information and shape feature information. Shape information is collected by laser scanner and geometric modelling techniques. Feature information is generated from shape information via feature extraction techniques. The model is used in an integrated storage, archival, and sketch-based query and retrieval system for 3D objects, Native American ceramic vessels. A novel aspect of the information model is that it is implemented entirely in XML and is designed for Web-based visual query and storage applications.

  12. Building Information Modelling for Smart Built Environments

    Directory of Open Access Journals (Sweden)

    Jianchao Zhang

    2015-01-01

    Full Text Available Building information modelling (BIM) provides architectural 3D visualization and a standardized way to share and exchange building information. Recently, there has been increasing interest in using BIM not only for design and construction, but also for the post-construction management of the built facility. With the emergence of smart built environment (SBE) technology, which embeds most spaces with smart objects to enhance the building's efficiency, security and the comfort of its occupants, there is a need to understand and address the challenges BIM faces in the design, construction and management of future smart buildings. In this paper, we investigate how BIM can contribute to the development of SBE. Since BIM is designed to host information about the building throughout its life cycle, our investigation covers phases from architectural design to facility management. Firstly, we extend BIM for the design phase to provide material/device profiling and the information exchange interface for various smart objects. Next, we propose a three-layer verification framework to assist BIM users in identifying possible defects in their SBE design. For the post-construction phase, we have designed a facility management tool to provide advanced energy management of smart grid-connected SBEs, where smart objects as well as distributed energy resources (DERs) are deployed.

  13. Building Information Modeling in engineering teaching

    DEFF Research Database (Denmark)

    Andersson, Niclas; Andersson, Pernille Hammar

    2010-01-01

    The application of Information and Communication Technology (ICT) in construction supports business as well as project processes by providing integrated systems for communication, administration, quantity takeoff, time scheduling, cost estimating, progress control among other things. The rapid...... technological development of ICT systems and the increased application of ICT in industry significantly influence the management and organisation of construction projects, and consequently, ICT has implications for the education of engineers and the preparation of students for their future professional careers...... in this case is represented by adopting Building Information Modelling, BIM, for construction management purposes. Course evaluations, a questionnaire and discussions with students confirm a genuinely positive attitude towards the role-play simulation and interaction with industry professionals. The students...

  14. An information theory-based approach to modeling the information processing of NPP operators

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute, Taejon (Korea, Republic of)

    2002-08-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model of the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous information-theoretic approaches, i.e. the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory.
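The information-theoretic quantification the abstract refers to can be illustrated with a standard mutual-information computation over one stimulus-response channel (a generic single-channel example; the paper's multi-stage structure and Conant's decomposition are not reproduced here):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def transmitted_information(joint):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint
    distribution given as {(x, y): p}. This is the classic measure of how
    much task information a single processing channel conveys."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())
```

A noiseless binary channel transmits exactly 1 bit per symbol, while a channel whose output is independent of its input transmits 0 bits; multi-stage models chain such quantities along the stages of the operator's processing.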

  15. Study on Integrated Quality Assurance System in CIMS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Integrated Quality Assurance System (IQAS) is an important part of CIMS. This paper introduces the architecture of IQAS and elaborates the philosophy of quality assurance and quality control in CIMS. A type of function model is proposed, and details of the model are described.

  16. Quality assurance and evidence in career guidance in Europe

    DEFF Research Database (Denmark)

    Plant, Peter

    2011-01-01

    Quality assurance and evidence in career guidance in Europe are based on a particular, positivistic model. Other approaches are largely neglected.

  17. Creating Quality Assurance and International Transparency for Quality Assurance Agencies

    DEFF Research Database (Denmark)

    Kristoffersen, Dorte; Lindeberg, Tobias

    2004-01-01

    The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries, and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the need, on the one hand, to advance the internationalisation of quality assurance of higher education and, on the other hand, to allow for differences in national approaches to quality assurance. The paper focuses on two issues: first, the strengths and weaknesses of the method employed and of the use of the ENQA...

  18. 38 CFR 18.405 - Assurances required.

    Science.gov (United States)

    2010-07-01

    ...) Extent of application to institution or facility. An assurance shall apply to the entire institution or... real property as security for financing construction of new, or improvement of existing, facilities on... information required to ascertain whether the recipient has complied or is complying with the law....

  19. A focus on building information modelling.

    Science.gov (United States)

    Ryan, Alison

    2014-03-01

    The Government Construction Strategy requires a strengthening of the public sector's capability to implement Building Information Modelling (BIM) protocols, the goal being that all central government departments will adopt, as a minimum, collaborative Level 2 BIM by 2016. Against this background, Alison Ryan, of consulting engineers DSSR, explains the principles behind BIM, its history and evolution, and some of the considerable benefits it can offer. These include lowering capital project costs through enhanced co-ordination, cutting carbon emissions, and the ability to manage facilities more efficiently.

  20. Le soluzioni Building Information Modeling di Bentley

    Directory of Open Access Journals (Sweden)

    Fulvio Bernardini

    2007-04-01

    Full Text Available The question of data interoperability has been continuously debated by professionals across various sectors in recent years. Building construction and its life cycle have been no exception: since the concept of Building Information Modeling (BIM) made its entry into the world of architecture, engineering and construction (AEC), the phases of the building process are no longer considered separately. Bentley Systems, long active in the infrastructure sector, offers a wide range of solutions designed precisely to meet this need.

  1. Live sequence charts to model medical information

    Directory of Open Access Journals (Sweden)

    Aslakson Eric

    2012-06-01

    Full Text Available Abstract Background Medical records accumulate data concerning patient health and the natural history of disease progression. However, methods to mine information systematically in a form other than an electronic health record are not yet available. The purpose of this study was to develop an object modeling technique as a first step towards a formal database of medical records. Method Live Sequence Charts (LSC) were used to formalize the narrative text obtained during a patient interview. LSCs utilize a visual scenario-based programming language to build object models. LSC extends the classical language of UML message sequence charts (MSC), predominantly through the addition of modalities and by providing executable semantics. Inter-object scenarios were defined to specify natural history event interactions and different scenarios in the narrative text. Result A simulated medical record was specified into LSC formalism by translating the text into an object model comprising a set of entities and events. The entities described the participating components (i.e., doctor, patient and record) and the events described the interactions between elements. A conceptual model is presented to illustrate the approach. An object model was generated from data extracted from an actual new-patient interview, where the individual was eventually diagnosed as suffering from Chronic Fatigue Syndrome (CFS). This yielded a preliminary formal designated vocabulary for CFS development that provides a basis for future formalism of these records. Conclusions Translation of medical records into object models created the basis for a formal database of the patient narrative that temporally depicts the events preceding disease, the diagnosis and the treatment approach. The LSC object model of the medical narrative provided an intuitive, visual representation of the natural history of the patient's disease.

  2. Building information modelling (BIM: now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders visualize what is to be built in a simulated environment and identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM, focusing on its core concepts, applications in the project life cycle, and benefits for project stakeholders, with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  3. Building information modelling (BIM: now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2015-10-01

    Full Text Available Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders visualize what is to be built in a simulated environment and identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM, focusing on its core concepts, applications in the project life cycle, and benefits for project stakeholders, with the help of case studies. The paper also elaborates risks and barriers to BIM implementation and future trends.

  4. [Medical quality assurance today].

    Science.gov (United States)

    Schäfer, Robert D

    2008-01-01

    Both the quality and performance of health systems are strongly influenced by the number and the qualification of the professional staff. Quality assurance programs help to analyse causalities which are responsible for medical malpractice. On the basis of the experiences gained by the performance of established Quality Assurance Programs (QAP) in the North Rhine area since 1982 various aspects of the efficiency of these programs will be discussed. The implementation of legal regulations making these programs mandatory is criticised not only for its bureaucratic effect but also for the attempt to exclude professional experts from the interpretation of results. It is recommended to liberalize these regulations in order to facilitate improvement of methods and participation of the medical profession.

  5. Power transformers quality assurance

    CERN Document Server

    Dasgupta, Indrajit

    2009-01-01

    About the Book: With the view to attaining higher reliability in power system operation, quality assurance in the field of distribution and power transformers has claimed growing attention. Besides new developments in the material technology and manufacturing processes of transformers, regular diagnostic testing and maintenance of any engineering product may be ascertained by ensuring: right selection of materials and components and their quality checks; application of correct manufacturing processes and systems engineering; and the user's awareness towards preventive maintenance.

  6. The PARTI Architecture Assurance

    Science.gov (United States)

    2008-10-01

    example safety critical system, 2, Issues Guidance Papers (IGPs) that further explain key concepts or requirements of the STANDARD, The guidance...Organisation (2009) IGP-001: Guidance Notes for Project Offices. Published as part of [20]. 4. Defence Material Organisation (2009) IGP-002: Methods for Safety...Architecture Analysis. Published as part of [20]. 5. Defence Material Organisation (2009) IGP-003: Methods for Design Assurance. Published as part of [20].

  7. Multiscale information modelling for heart morphogenesis

    Science.gov (United States)

    Abdulla, T.; Imms, R.; Schleich, J. M.; Summers, R.

    2010-07-01

    Science is made feasible by the adoption of common systems of units. As research has become more data intensive, especially in the biomedical domain, it requires the adoption of a common system of information models, to make explicit the relationship between one set of data and another, regardless of format. This is being realised through the OBO Foundry to develop a suite of reference ontologies, and NCBO Bioportal to provide services to integrate biomedical resources and functionality to visualise and create mappings between ontology terms. Biomedical experts tend to be focused at one level of spatial scale, be it biochemistry, cell biology, or anatomy. Likewise, the ontologies they use tend to be focused at a particular level of scale. There is increasing interest in a multiscale systems approach, which attempts to integrate between different levels of scale to gain understanding of emergent effects. This is a return to physiological medicine with a computational emphasis, exemplified by the worldwide Physiome initiative, and the European Union funded Network of Excellence in the Virtual Physiological Human. However, little work has been done on how information modelling itself may be tailored to a multiscale systems approach. We demonstrate how this can be done for the complex process of heart morphogenesis, which requires multiscale understanding in both time and spatial domains. Such an effort enables the integration of multiscale metrology.

  8. Parsimonious modeling with information filtering networks

    Science.gov (United States)

    Barfuss, Wolfram; Massara, Guido Previde; Di Matteo, T.; Aste, Tomaso

    2016-12-01

    We introduce a methodology to construct parsimonious probabilistic models. This method makes use of information filtering networks to produce a robust estimate of the global sparse inverse covariance from a simple sum of local inverse covariances computed on small subparts of the network. Being based on local and low-dimensional inversions, this method is computationally very efficient and statistically robust, even for the estimation of inverse covariance of high-dimensional, noisy, and short time series. Applied to financial data our method results are computationally more efficient than state-of-the-art methodologies such as Glasso producing, in a fraction of the computation time, models that can have equivalent or better performances but with a sparser inference structure. We also discuss performances with sparse factor models where we notice that relative performances decrease with the number of factors. The local nature of this approach allows us to perform computations in parallel and provides a tool for dynamical adaptation by partial updating when the properties of some variables change without the need of recomputing the whole model. This makes this approach particularly suitable to handle big data sets with large numbers of variables. Examples of practical application for forecasting, stress testing, and risk allocation in financial systems are also provided.
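The core construction, a global sparse inverse-covariance estimate assembled from inverses computed on small subparts of a network, can be sketched as follows (our simplification: the clique decomposition is assumed given, and the separator-correction terms of the full method are omitted):

```python
import numpy as np

def local_inverse_sum(data, cliques):
    """Sketch of a locally assembled sparse precision-matrix estimate:
    invert the sample covariance only on each small clique of the
    information filtering network and accumulate the results. Entries
    outside every clique remain exactly zero, giving sparsity for free."""
    n = data.shape[1]
    J = np.zeros((n, n))          # global sparse inverse-covariance estimate
    S = np.cov(data, rowvar=False)
    for c in cliques:
        idx = np.ix_(c, c)
        J[idx] += np.linalg.inv(S[idx])  # low-dimensional local inversion
    return J
```

Because each inversion is low-dimensional, the cost stays modest even for many variables, the loop parallelizes trivially, and a change in one variable only requires re-inverting the cliques that contain it, which is the partial-updating property the abstract highlights.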

  9. Concept Tree Based Information Retrieval Model

    Directory of Open Access Journals (Sweden)

    Chunyan Yuan

    2014-05-01

    Full Text Available This paper proposes a novel concept-based query expansion technique named the Markov concept tree model (MCTM), which discovers term relationships through a concept tree deduced from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion search terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than selecting terms that are similar to a single query term. Utilizing a Markov network constructed from the co-occurrence information of the terms in the collection, the method generates a concept tree for each original query term, removes redundant and irrelevant nodes from the concept tree, and then adjusts the weight of the original query and of each expansion term based on a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Compared with the baseline model, experiments on a standard dataset reveal that this method achieves better query quality.
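The co-occurrence statistics underlying the concept-tree construction can be illustrated with a bare-bones expansion ranking (our simplification; the Markov network, tree generation, pruning, and reweighting of MCTM are omitted):

```python
from collections import Counter

def cooccurrence_expansion(docs, query_term, k=2):
    """Rank candidate expansion terms by document co-occurrence with the
    query term: a plain baseline, not the paper's full MCTM pipeline."""
    co = Counter()
    for doc in docs:
        terms = set(doc)
        if query_term in terms:
            for t in terms - {query_term}:
                co[t] += 1   # count documents where both terms appear
    return [t for t, _ in co.most_common(k)]
```

MCTM improves on exactly this kind of per-term ranking by expanding toward the concept of the whole query rather than toward each term in isolation.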

  10. Utilization of building information modeling in infrastructure’s design and construction

    Science.gov (United States)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction and maintenance of buildings in the Czech Republic in recent years. This paper describes the usage, applications and potential benefits and disadvantages connected with implementing BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, together with a review of several virtual design and construction practices there. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experiences with new technologies gained from the application of BIM-related workflows. The focus is on BIM model utilization for machine control systems on site, quality assurance, quality management and construction management.

  11. Validity of information security policy models

    Directory of Open Access Journals (Sweden)

    Joshua Onome Imoniana

    Full Text Available Validity is concerned with establishing evidence for the use of a method with a particular population. Thus, when we address the issue of applying security policy models, we are concerned with the implementation of a certain policy, taking into consideration the standards required, through the attribution of scores to every item in the research instrument. In today's globalized economic scenarios, the implementation of an information security policy in an information technology environment is a condition sine qua non for the strategic management process of any organization. Regarding this topic, various studies present evidence that the responsibility for maintaining a policy rests primarily with the Chief Security Officer, who strives to keep technologies up to date in order to meet all-inclusive business continuity planning policies. For such a policy to be effective, it has to be fully embraced by the Chief Executive Officer. This study was developed with the purpose of validating specific theoretical models, whose designs were based on a literature review, by sampling 10 of the automobile industries located in the ABC region of Metropolitan São Paulo. The sampling was based on the representativeness of these industries, particularly with regard to each one's implementation of information technology in the region. The study concludes by presenting evidence of the discriminant validity of four key dimensions of security policy: Physical Security, Logical Access Security, Administrative Security, and Legal & Environmental Security. Analysis of the Cronbach's alpha structure of these security items attests not only that the capacity of these industries to implement security policies is indisputable, but also that the items involved correlate homogeneously with each other.

  12. Towards Run-time Assurance of Advanced Propulsion Algorithms

    Science.gov (United States)

    Wong, Edmond; Schierman, John D.; Schlapkohl, Thomas; Chicatelli, Amy

    2014-01-01

    This paper covers the motivation and rationale for investigating the application of run-time assurance methods as a potential means of providing safety assurance for advanced propulsion control systems. Certification is becoming increasingly infeasible for such systems using current verification practices. Run-time assurance systems hold the promise of certifying these advanced systems by continuously monitoring the state of the feedback system during operation and reverting to a simpler, certified system if anomalous behavior is detected. The discussion will also cover initial efforts underway to apply a run-time assurance framework to NASA's model-based engine control approach. Preliminary experimental results are presented and discussed.
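The monitor-and-revert scheme described above can be sketched in a few lines (a minimal simplex-style illustration, not NASA's framework; the envelope check is our placeholder for a real safety monitor):

```python
def rta_select(advanced_cmd, baseline_cmd, limit):
    """Minimal run-time assurance switch: pass the advanced controller's
    command through only while it stays inside the certified envelope;
    otherwise revert to the simpler, certified baseline command."""
    # NaN-safe envelope check: NaN != NaN, so anomalous output fails it
    in_envelope = advanced_cmd == advanced_cmd and abs(advanced_cmd) <= limit
    return advanced_cmd if in_envelope else baseline_cmd
```

The certification argument then only has to cover the monitor and the baseline controller, not the advanced algorithm itself, which is the point of the run-time assurance approach.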

  13. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definitions of COTS components, the status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and a large amount of special equipment and mechanisms is involved as a fundamental part of the facility. The detailed design definition is typically implemented by different design teams in specialized design software packages. To allow the coordinated work of different engineering teams, the overall model and its associated engineering database are progressively integrated using coordination and roaming software, which can be used before the construction phase starts for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows efficient planning of the construction sequence (4D). It is a powerful tool for studying and analyzing alternative construction sequences in detail and for coordinating the work of different construction teams. In addition, the engineering, construction and operational database can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST enclosures as application examples.

  14. Assimilating host model information into a limited area model

    Directory of Open Access Journals (Sweden)

    Nils Gustafsson

    2012-01-01

    Full Text Available We propose to add an extra source of information to the data assimilation of the regional HIgh Resolution Limited Area Model (HIRLAM), constraining the larger scales to the host model providing the lateral boundary conditions. An extra term, Jk, measuring the distance to the large-scale vorticity of the host model, is added to the cost function of the variational data assimilation. Vorticity is chosen because it is a good representative of the large-scale flow and because it is a basic control variable of the HIRLAM variational data assimilation. Furthermore, by choosing only vorticity, the remaining model variables (divergence, temperature, surface pressure and specific humidity) are allowed to adapt to the modified vorticity field in accordance with the internal balance constraints of the regional model. The error characteristics of the Jk term are described by the horizontal spectral densities and the vertical eigenmodes (eigenvectors and eigenvalues) of the host model vorticity forecast error fields, expressed in the regional model geometry. The vorticity field provided by the European Centre for Medium-range Weather Forecasts (ECMWF) operational model was assimilated into the HIRLAM model during an experiment period of 33 days in winter, with positive impact on forecast verification statistics for upper-air variables and mean sea level pressure. The review process was handled by Editor-in-Chief Harald Lejenäs.
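Schematically, the augmented variational cost function takes the form below (our notation, not the paper's: zeta is the regional vorticity control variable, zeta_host the host-model large-scale vorticity, and B_k the host-vorticity error covariance implied by the spectral densities and eigenmodes described above):

```latex
J(\mathbf{x}) = J_b + J_o + J_k, \qquad
J_k = \tfrac{1}{2}\,\bigl(\zeta - \zeta_{\mathrm{host}}\bigr)^{\mathrm{T}}
      \mathbf{B}_k^{-1}\,\bigl(\zeta - \zeta_{\mathrm{host}}\bigr)
```

Minimizing J then balances the usual background term $J_b$ and observation term $J_o$ against agreement with the host model's large scales.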

  15. Information analysis for modeling and representation of meaning

    OpenAIRE

    Uda, Norihiko

    1994-01-01

    In this dissertation, information analysis, and an information model based on it called the Semantic Structure Model, are explained for semantic processing. Methods for the self-organization of information are also described. In addition, Information-Base Systems for thinking support of research and development in nonlinear optical materials are explained. As a result of information analysis, general properties of information and structural properties of concepts become clear. Ge...

  16. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    Science.gov (United States)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.

  17. SECURITY ASSURANCE FRAMEWORK OF INFORMATION SYSTEM DEVELOPMENT LIFECYCLE FOR SMALL AND MEDIUM-SIZED BANKS%中小银行信息系统开发生命周期安全保障框架

    Institute of Scientific and Technical Information of China (English)

    陆向阳; 蒋树立; 孙亮; 熊延忠

    2013-01-01

    In this paper we propose a full-lifecycle security assurance framework for information system development suitable for local small and medium-sized banks, developed by studying Microsoft's SDL (Security Development Lifecycle) and GB 20274 (Information System Security Assurance Evaluation Framework) and drawing on the information system construction practices at Jiangnan Rural Commercial Bank. The framework integrates software security protection into the five phases of the information system lifecycle, with detailed descriptions of the security control measures for each phase, ensuring that system development is safe and reliable.

  18. Quality assurance in diagnostic ultrasound

    Energy Technology Data Exchange (ETDEWEB)

    Sipilae, Outi, E-mail: outi.sipila@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 340, 00029 HUS (Finland); Mannila, Vilma, E-mail: vilma.mannila@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 340, 00029 HUS (Finland); Department of Physics, University of Helsinki, P.O. Box 64, 00014 Helsinki University (Finland); Vartiainen, Eija, E-mail: eija.vartiainen@hus.fi [HUS Helsinki Medical Imaging Center, Helsinki University Central Hospital, P.O. Box 750, 00029 HUS (Finland)

    2011-11-15

    Objective: To set up a practical ultrasound quality assurance protocol in a large radiological center, results from transducer tests, phantom measurements and visual checks for physical faults were compared. Materials and methods: Altogether 151 transducers from 54 ultrasound scanners, from seven different manufacturers, were tested with a Sonora FirstCall aPerio™ system (Sonora Medical Systems, Inc., Longmont, CO, USA) to detect non-functional elements. Phantom measurements using a CIRS General Purpose Phantom Model 040 (CIRS Tissue Simulation and Phantom Technology, VA, USA) were available for 135 transducers. The transducers and scanners were also checked visually for physical faults. The percentages of defective findings in these tests were computed. Results: Defective results in the FirstCall tests were found in 17% of the 151 transducers, and in 16% of the 135 transducers. Defective image quality was found in 15% of the transducers, and 25% of the transducers had a physical flaw. In 16% of the scanners, a physical fault elsewhere than in the transducer was found. Seven percent of the transducers had a concurrent defective result both in the FirstCall test and in the phantom measurements, 8% in the FirstCall test and in the visual check, 4% in the phantom measurements and in the visual check, and 2% in all three tests. Conclusion: The tested methods produced partly complementary results and all seemed to be necessary. A quality assurance protocol is thus forced to be rather laborious, and therefore the benefits and costs must be closely followed.

  19. Information sharing systems and teamwork between sub-teams: a mathematical modeling perspective

    Science.gov (United States)

    Tohidi, Hamid; Namdari, Alireza; Keyser, Thomas K.; Drzymalski, Julie

    2017-04-01

    Teamwork contributes to a considerable improvement in the quality and quantity of the ultimate outcome. Collaboration and alliance between team members bring substantial progress for any business. However, it is imperative to assemble an appropriate team, since many factors must be considered in this regard. Team size may represent the effectiveness of a team, and it is of paramount importance to determine what the ideal team size should be. In addition, information technology increasingly plays a differentiating role in productivity, and adopting appropriate information sharing systems may contribute to improvement in efficiency, especially in competitive markets where numerous producers compete with each other. The significance of transmitting information to individuals is inevitable to assure an improvement in team performance. In this paper, a model of teamwork and its organizational structure are presented. Furthermore, a mathematical model is proposed in order to characterize a group of sub-teams according to two criteria: team size and information technology. The effect of information technology on the performance of teams and sub-teams, as well as the optimum size of those teams and sub-teams from a productivity perspective, are studied. Moreover, a quantitative sensitivity analysis is presented in order to analyze the interaction between these two factors through a sharing system.

  20. Assuring quality in high-consequence engineering

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Marcey L.; Kolb, Rachel R.

    2014-03-01

    In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.

  1. Quality assurance and organizational effectiveness in hospitals.

    Science.gov (United States)

    Hetherington, R W

    1982-01-01

    The purpose of this paper is to explore some aspects of a general theoretical model within which research on the organizational impacts of quality assurance programs in hospitals may be examined. Quality assurance is conceptualized as an organizational control mechanism, operating primarily through increased formalization of structures and specification of procedures. Organizational effectiveness is discussed from the perspective of the problem-solving theory of organizations, wherein effective organizations are those which maintain at least average performance in all four system problem areas simultaneously (goal-attainment, integration, adaptation and pattern-maintenance). It is proposed that through the realization of mutual benefits for both professionals and the bureaucracy, quality assurance programs can maximize such effective performance in hospitals.

  2. Enhancements to commissioning techniques and quality assurance of brachytherapy treatment planning systems that use model-based dose calculation algorithms.

    Science.gov (United States)

    Rivard, Mark J; Beaulieu, Luc; Mourtada, Firas

    2010-06-01

    The current standard for brachytherapy dose calculations is based on the AAPM TG-43 formalism. Simplifications used in the TG-43 formalism have been challenged by many publications over the past decade. With the continuous increase in computing power, approaches based on fundamental physics processes or physics models such as the linear Boltzmann transport equation are now applicable in a clinical setting. Thus, model-based dose calculation algorithms (MBDCAs) have been introduced to address TG-43 limitations for brachytherapy. The MBDCA approach results in a paradigm shift, which will require a concerted effort to integrate them properly into the radiation therapy community. MBDCAs will improve treatment planning relative to the implementation of the traditional TG-43 formalism by accounting for individualized, patient-specific radiation scatter conditions, and the radiological effect of material heterogeneities differing from water. A snapshot of the current status of MBDCAs and AAPM Task Group reports related to the subject of QA recommendations for brachytherapy treatment planning is presented. Some simplified Monte Carlo simulation results are also presented to delineate the effects MBDCAs are called to account for and to facilitate the discussion on suggestions for (i) new QA standards to augment current societal recommendations, (ii) consideration of dose specification such as dose to medium in medium, collisional kerma to medium in medium, or collisional kerma to water in medium, and (iii) infrastructure needed to uniformly introduce these new algorithms. Suggestions in this Vision 20/20 article may serve as a basis for developing future standards to be recommended by professional societies such as the AAPM, ESTRO, and ABS toward providing consistent clinical implementation throughout the brachytherapy community and rigorous quality management of MBDCA-based treatment planning systems.
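For context, the TG-43 formalism that MBDCAs extend factorizes the dose rate around a sealed source into a handful of tabulated quantities. The standard 2D (line-source) form, quoted here in general AAPM notation rather than from this record, is:

```latex
\dot{D}(r,\theta) = S_K \,\Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)
```

where $S_K$ is the air-kerma strength, $\Lambda$ the dose-rate constant, $G_L$ the geometry function, $g_L$ the radial dose function, and $F(r,\theta)$ the 2D anisotropy function, all referenced to water. The patient-specific scatter conditions and material heterogeneities discussed above are precisely what this water-based factorization cannot capture.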

  3. 78 FR 25445 - Federal Acquisition Regulation; Submission for OMB Review; Quality Assurance Requirements

    Science.gov (United States)

    2013-05-01

    ... Regulation; Submission for OMB Review; Quality Assurance Requirements AGENCY: Department of Defense (DOD... information collection requirement concerning quality assurance requirements. A notice was published in the..., Quality Assurance Requirements, by any of the following methods: Regulations.gov : http://www.regulations...

  4. Image quality assurance in X-ray diagnosis - information on DIN 6868, part 2, film processing: Constancy control of visual optical density

    Energy Technology Data Exchange (ETDEWEB)

    Becker-Gaab, C.; Borcke, E.; Bunde, E.; Hagemann, G.; Kuetterer, G.; Lang, G.R.; Schoefer, H.; Stender, H.S.; Stieve, F.E.; Volkmann, T. v.

    1987-02-01

    Good image quality in X-ray diagnosis is the prerequisite for providing the best information possible for making a diagnosis. Quality control of X-ray films is indispensable in order to achieve the objective mentioned above and to reduce both radiation dose and cost.

  5. Instructional Quality Assurance at Lansing Community College.

    Science.gov (United States)

    Herder, Dale M.; And Others

    Drawing from the experiences of Lansing Community College (LCC), this paper offers a rationale and model for measuring instructional quality. Section I offers background on LCC's efforts to assess the quality of its courses and curricula, and to introduce such quality assurance procedures as computer-based course syllabi, department and program…

  6. Quality Assurance in University Guidance Services

    Science.gov (United States)

    Simon, Alexandra

    2014-01-01

    In Europe there is no common quality assurance framework for the delivery of guidance in higher education. Using a case study approach in four university career guidance services in England, France and Spain, this article aims to study how quality is implemented in university career guidance services in terms of strategy, standards and models,…

  7. Quality Assurance in Distance and Open Learning

    Science.gov (United States)

    Mahafzah, Mohammed Hasan

    2012-01-01

    E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of E-learning, however, is essential for the quality assurance of E-learning courses. This paper constructs a three-phase evaluation model for E-learning courses, which includes development, process, and…

  8. MCNP™ Software Quality Assurance plan

    Energy Technology Data Exchange (ETDEWEB)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 9000.

  10. CRISP. Information Security Models and Their Economics

    Energy Technology Data Exchange (ETDEWEB)

    Gustavsson, R.; Mellstrand, P.; Tornqvist, B. [Blekinge Institute of Technology BTH, Karlskrona (Sweden)

    2005-03-15

    The deliverable D1.6 includes background material and specifications of a CRISP Framework on protection of information assets related to power net management and management of business operations related to energy services. During the project it was discovered by the CRISP consortium that the original description of WP 1.6 was not adequate for the project as such. The main insight was that the original emphasis on cost-benefit analysis of security protection measures was too early to address in the project. This issue is of course crucial in itself but requires new models of consequence analysis that still remain to be developed, especially for the new business models we are investigating in the CRISP project. The updated and approved version of the WP1.6 description, together with the also updated WP2.4 focus on Dependable ICT support of Power Grid Operations, constitutes an integrated approach towards dependable and secure future utilities and their business processes. This document (D1.6) is a background to deliverable D2.4. Together they provide a dependability and security framework to the three CRISP experiments in WP3.

  11. Molecular model with quantum mechanical bonding information.

    Science.gov (United States)

    Bohórquez, Hugo J; Boyd, Russell J; Matta, Chérif F

    2011-11-17

    The molecular structure can be defined quantum mechanically thanks to the theory of atoms in molecules. Here, we report a new molecular model that reflects quantum mechanical properties of the chemical bonds. This graphical representation of molecules is based on the topology of the electron density at the critical points. The eigenvalues of the Hessian are used for depicting the critical points three-dimensionally. The bond path linking two atoms has a thickness that is proportional to the electron density at the bond critical point. The nuclei are represented according to the experimentally determined atomic radii. The resulting molecular structures are similar to the traditional ball and stick ones, with the difference that in this model each object included in the plot provides topological information about the atoms and bonding interactions. As a result, the character and intensity of any given interatomic interaction can be identified by visual inspection, including the noncovalent ones. Because similar bonding interactions have similar plots, this tool permits the visualization of chemical bond transferability, revealing the presence of functional groups in large molecules.

  12. Concept of information models in GGOS

    Science.gov (United States)

    Pachelski, Wojciech

    2010-05-01

    GGOS divides geodesy as a discipline into three parts, so-called "pillars". The first pillar consists of methods, techniques and theories that are used to determine the Earth's shape (its surface: solid Earth, ice and oceans) as a global function of space and time (kinematics). The second pillar regards the determination and monitoring of the Earth's gravitational field; it also describes mass distributions and the shape of the geoid. The third pillar concerns planetary rotation and forces related to interactions between the Earth and other celestial bodies, especially the Moon and the Sun. These three pillars constitute the province of modern geodesy. Different parts of the overall system are cross-linked through observations and inter-dependent. All these techniques are affected by and measure the "output" of the same unique Earth system, that is, the various geodetic fingerprints induced by mass redistribution and changes in the system's dynamics. Consistency of data processing, modeling, and conventions across the techniques and across the "three pillars" is mandatory for maximum exploitation of the full potential of the system. The main purpose of this paper is to give an introduction to the full description of connections between all GGOS components and to describe the GGOS information structure: a great number of mutually related objects, phenomena and theories. Understanding of the relations and dependences within GGOS is necessary for conscious usage of its products. The authors' intention is to show and explain examples of such relations in the parts of GGOS described as "Geokinematics", "Gravity field", "Earth rotation" and "Reference systems". The next step is to present those dependences using the Unified Modeling Language (UML), a formal language used to model and describe reality in object-oriented analysis and programming. The packages "Geokinematics", "Gravity field", "Earth rotation" and "Reference systems" and classes for each package are defined. To show connections between some

  13. Meta-Model of Resilient information System

    OpenAIRE

    Ahmed, Adnan; Hussain, Syed Shahram

    2007-01-01

    The role of information systems has become very important in today's world. It is not only business organizations that use information systems; governments also possess very critical information systems. The need is to make information systems available at all times under any situation. Information systems must have the capabilities to resist dangers to their services, performance and existence, and to recover to their normal working state with the available resources in catas...

  14. DEFICIENT INFORMATION MODELING OF MECHANICAL PRODUCTS FOR CONCEPTUAL SHAPE DESIGN

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In view of the deficient nature of product information in conceptual design, a framework of deficient information modeling for conceptual shape design is put forward, which includes qualitative shape modeling (a qualitative solid model), uncertain shape modeling (an uncertain relation model) and imprecise shape modeling (an imprecise region model). In the framework, the qualitative solid model is the core, which represents qualitatively (using symbols) the conceptual shapes of mechanical products. The uncertain relation model, regarding domain relations as objects, and the imprecise region model, regarding domains as objects, are used to deal with the uncertain and imprecise issues respectively, which arise from qualitative shape modeling or exist in product information itself.

  15. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  16. The Significance of Quality Assurance within Model Intercomparison Projects at the World Data Centre for Climate (WDCC)

    Science.gov (United States)

    Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.

    2014-12-01

    The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for the data repositories, as they manage data through machine agents. Checks for homogeneity and consistency in early parts of the workflow thus become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5, the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, and the associated quality checks are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with focus on the time necessary to comply with the project's requirements for formal data citations and on the demand for the availability of such citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality-level system for all data at WDCC is derived, which consists of five maturity quality levels.

  17. A Critical Information Literacy Model: Library Leadership within the Curriculum

    Science.gov (United States)

    Swanson, Troy

    2011-01-01

    It is a time for a new model for teaching students to find, evaluate, and use information by drawing on critical pedagogy theory in the education literature. This critical information literacy model views the information world as a dynamic place where authors create knowledge for many reasons; it seeks to understand students as information users,…

  18. Brief Introduction of the USA Network Security and Information Assurance Research Plan

    Institute of Scientific and Technical Information of China (English)

    王宇

    2015-01-01

    The USA networking and information technology research and development program is briefly introduced, and three themes inducing change in the research fields, together with the corresponding research projects, are reviewed. It is concluded that building an information assurance architecture of active and cooperative defense, grounded in basic theories and system frameworks and strengthening research on trusted networks, high-assurance platforms, moving target defense and related technologies, is an effective method to counter new network threats.

  19. Conceptual Model of Multidimensional Marketing Information System

    Science.gov (United States)

    Kriksciuniene, Dalia; Urbanskiene, Ruta

    This article aims to analyse why the information systems at an enterprise do not always satisfy the expectations of marketing management specialists. Computerized systems more and more successfully serve the information needs of those areas of enterprise management where they can create an information equivalent of real management processes. Yet their inability to effectively fulfil marketing needs indicates gaps not only in the ability to structure marketing processes, but in the conceptual development of marketing information systems (MkIS) as well.

  20. A descriptive model of information problem solving while using internet

    NARCIS (Netherlands)

    Brand-Gruwel, Saskia; Wopereis, Iwan; Walraven, Amber

    2009-01-01

    This paper presents the IPS-I-model: a model that describes the process of information problem solving (IPS) in which the Internet (I) is used to search for information. The IPS-I-model is based on three studies, in which students in secondary and (post) higher education were asked to solve information

  1. The Nature of Information Science: Changing Models

    Science.gov (United States)

    Robinson, Lyn; Karamuftuoglu, Murat

    2010-01-01

    Introduction: This paper considers the nature of information science as a discipline and profession. Method: It is based on conceptual analysis of the information science literature, and consideration of philosophical perspectives, particularly those of Kuhn and Peirce. Results: It is argued that information science may be understood as a field of…

  2. Application and Discussion of the ASSURE Model in the Course of Computer Human Animation

    Institute of Scientific and Technical Information of China (English)

    刘华俊

    2014-01-01

    The study is guided by the classical instructional design theory of the ASSURE model, using the students of two undergraduate classes majoring in computer science at Wuhan University as experimental subjects to analyze the teaching effect. In our experiment, the ASSURE model is applied to one class, following its six steps: analyze learners; state objectives; select methods, media, and materials; utilize media and materials; require learner participation; and evaluate and revise. Traditional multimedia teaching is applied to the other class. Comparing the teaching effects, we demonstrate the effectiveness of the ASSURE model for the course of computer human animation.

  3. A MATHEMATICAL MODELING OF CAMPUS INFORMATION SYSTEM

    Directory of Open Access Journals (Sweden)

    S. STALIN KUMAR

    2016-07-01

    Full Text Available An H-magic labeling in an H-decomposable graph G is a bijection f : V(G) ∪ E(G) → {1, 2, ..., p + q} such that for every copy H in the decomposition, Σ_{v∈V(H)} f(v) + Σ_{e∈E(H)} f(e) is constant. f is said to be H-V-super magic if f(V(G)) = {1, 2, ..., p}. Suppose that V(G) = U(G) ∪ W(G) with |U(G)| = m and |W(G)| = n. Then f is said to be an H-V-super-strong magic labeling if f(U(G)) = {1, 2, ..., m} and f(W(G)) = {m + 1, m + 2, ..., (m + n) = p}. A graph that admits an H-V-super-strong magic labeling is called an H-V-super-strong magic decomposable graph. In this paper, we pay our attention to provide a mathematical modeling of campus information system.
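The labeling condition above can be checked mechanically. The sketch below (hypothetical helper names, not from the paper) verifies the constant-sum and V-super properties on a worked example: a 4-cycle decomposed into two copies of the path P3.

```python
# Illustrative check of the H-V-super magic condition from the abstract.
# Example: G = C4 (vertices 1-2-3-4) decomposed into two copies of P3.
# Labels must be a bijection onto {1..p+q}, with vertex labels {1..p}.

def is_h_v_super_magic(f_v, f_e, copies):
    labels = list(f_v.values()) + list(f_e.values())
    p = len(f_v)
    # f is a bijection onto {1, ..., p+q}
    if sorted(labels) != list(range(1, len(labels) + 1)):
        return False
    # V-super: vertex labels occupy {1, ..., p}
    if sorted(f_v.values()) != list(range(1, p + 1)):
        return False
    # the magic sum must be constant over every copy H
    sums = {sum(f_v[v] for v in vs) + sum(f_e[e] for e in es)
            for vs, es in copies}
    return len(sums) == 1

f_v = {1: 2, 2: 1, 3: 4, 4: 3}
f_e = {(1, 2): 6, (2, 3): 8, (3, 4): 5, (4, 1): 7}
copies = [([1, 2, 3], [(1, 2), (2, 3)]),   # P3: 1-2-3
          ([3, 4, 1], [(3, 4), (4, 1)])]   # P3: 3-4-1
print(is_h_v_super_magic(f_v, f_e, copies))  # → True
```

Both copies sum to 21 here (2+1+4+6+8 and 4+3+2+5+7), so the labeling is H-V-super magic for this decomposition.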

  4. The case of sustainability assurance: constructing a new assurance service

    NARCIS (Netherlands)

    O'Dwyer, B.

    2011-01-01

    This paper presents an in-depth longitudinal case study examining the processes through which practitioners in two Big 4 professional services firms have attempted to construct sustainability assurance (independent assurance on sustainability reports). Power’s (1996, 1997, 1999, 2003) theorization o

  5. Study on Product Lifecycle Dynamic Information Modeling and Its Application

    Institute of Scientific and Technical Information of China (English)

    ZHAO Liang-cai; WANG Li-hui; ZHANG Yong

    2003-01-01

    The PLDIM (Product Lifecycle Dynamic Information Model) is the most important part of the PLDM (Product Lifecycle Dynamic Model), and it is the basis for creating the information system and implementing PLM (Product Lifecycle Management). The information classification, the relationships among all information items, the PLDIM mathematical expression, information coding and the 3D synthetic description of the PLDIM are presented. The information flow and information system structure based on the two information centers and Internet/Intranet are proposed, and how to implement this system for ship diesel engines is also introduced according to the PLDIM and PLM solutions.

  6. Construction quality assurance report

    Energy Technology Data Exchange (ETDEWEB)

    Roscha, V.

    1994-09-08

    This report provides a summary of the construction quality assurance (CQA) observation and test results, including: the results of the geosynthetic and soil materials conformance testing; the observation and testing results associated with the installation of the soil liners, the HDPE geomembrane liner systems, the leachate collection and removal systems, and the working surfaces; the observation and testing results associated with the in-plant manufacturing process; a summary of submittal reviews by Golder Construction Services, Inc.; the submittal and certification of the piping material specifications; the observation and verification of the Acceptance Test Procedure results for the operational equipment functions; and a summary of the ECNs which are incorporated into the project.

  7. FESA Quality Assurance

    CERN Document Server

    CERN. Geneva

    2015-01-01

    FESA is a framework used by 100+ developers at CERN to design and implement the real-time software used to control the accelerators. Each new version must be tested and qualified to ensure that no backward compatibility issues have been introduced and that there is no major bug which might prevent accelerator operations. Our quality assurance approach is based on code review and a two-level testing process. The first level is made of unit tests (Python unittest and Google Test for C++). The second level consists of integration tests running on an isolated test environment. We also use a continuous integration service (Bamboo) to ensure the tests are executed periodically and bugs are caught early. In the presentation, we will explain the reasons why we took this approach, the results, and some thoughts on the pros and cons.
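A minimal sketch of the first testing level described above, using Python's unittest on a toy stand-in function (the function and test names are illustrative, not actual FESA code):

```python
import unittest

def scale_raw_reading(raw, gain=2.0, offset=1.0):
    """Toy signal-conditioning function standing in for real-time code."""
    return raw * gain + offset

class ScaleRawReadingTest(unittest.TestCase):
    def test_nominal(self):
        self.assertAlmostEqual(scale_raw_reading(10.0), 21.0)

    def test_zero(self):
        self.assertAlmostEqual(scale_raw_reading(0.0), 1.0)

# Run the suite programmatically (equivalent to `python -m unittest`);
# a CI service such as Bamboo would invoke the same suite on every build.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ScaleRawReadingTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```

The second, integration-level tests would exercise whole control-system components on an isolated environment rather than single functions.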

  8. Decision Making Models Using Weather Forecast Information

    OpenAIRE

    Hiramatsu, Akio; Huynh, Van-Nam; Nakamori, Yoshiteru

    2007-01-01

    The quality of weather forecasts has gradually improved, but weather information such as precipitation forecasts is still uncertain. Meteorologists have studied the use and economic value of weather information, and users have to translate weather information into their most desirable action. To maximize the economic value for users, the decision maker should select the optimum course of action for his company or project, based on an appropriate decision strategy under uncertain situations. In...

  9. Recent Trends in Quality Assurance

    Science.gov (United States)

    Amaral, Alberto; Rosa, Maria Joao

    2010-01-01

    In this paper we present a brief description of the evolution of quality assurance in Europe, paying particular attention to its relationship to the rising loss of trust in higher education institutions. We finalise by analysing the role of the European Commission in the setting up of new quality assurance mechanisms that tend to promote…

  10. Scanning Health Information Sources: Applying and Extending the Comprehensive Model of Information Seeking.

    Science.gov (United States)

    Ruppel, Erin K

    2016-01-01

    Information scanning, or attention to information via incidental or routine exposure or browsing, is relatively less understood than information seeking. To (a) provide a more theoretical understanding of information scanning and (b) extend existing information seeking theory to information scanning, the current study used data from the National Cancer Institute's Health Information National Trends Survey to examine cancer information scanning using the comprehensive model of information seeking (CMIS). Consistent with the CMIS, health-related factors were associated with the information-carrier factor of trust, and health-related factors and trust were associated with attention to information sources. Some of these associations differed between entertainment-oriented sources, information-oriented sources, and the Internet. The current findings provide a clearer picture of information scanning and suggest future avenues of research and practice using the CMIS.

  11. Modeling and Analysis of Information Product Maps

    Science.gov (United States)

    Heien, Christopher Harris

    2012-01-01

    Information Product Maps are visual diagrams used to represent the inputs, processing, and outputs of data within an Information Manufacturing System. A data unit, drawn as an edge, symbolizes a grouping of raw data as it travels through this system. Processes, drawn as vertices, transform each data unit input into various forms prior to delivery…

  12. Solutions to Integration Model of Rural Information Resources

    Institute of Scientific and Technical Information of China (English)

    Xirong GAO; Bo TAO

    2014-01-01

    The integration of rural information resources is a key factor restricting rural informationization and the effective operation of rural information services. To solve the problems of separate rural information resources and departments acting willfully regardless of overall interest, this paper analyzed the characteristics and distribution of rural information resources, built a basic framework for the integration of rural information resources and a mathematical model of integration, and finally came up with specific solutions to the integration of rural information resources.

  13. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurance (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; the report intentionally does not focus specifically on energy issues. The report describes work in progress, is intended as a resource, and can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  14. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  15. AP233: An Information Model for Systems Engineering

    Science.gov (United States)

    Siebes, Georg

    2009-01-01

    In today's world, information is abundant. We have no problems generating it. But we are challenged to find, organize, and exchange information. A standardized model of information can help. Such a model has nearly completed its development for Systems Engineering. It is referred to as AP233 (AP = Application Protocol).

  16. Assurance Technology Challenges of Advanced Space Systems

    Science.gov (United States)

    Chern, E. James

    2004-01-01

    The initiative to explore space and extend a human presence across our solar system to revisit the moon and Mars poses enormous technological challenges to the nation's space agency and aerospace industry. Key areas of technology development needed to enable the endeavor include advanced materials, structures and mechanisms; micro/nano sensors and detectors; power generation, storage and management; advanced thermal and cryogenic control; guidance, navigation and control; command and data handling; advanced propulsion; advanced communication; on-board processing; advanced information technology systems; modular and reconfigurable systems; precision formation flying; solar sails; distributed observing systems; space robotics; etc. Quality assurance concerns such as functional performance, structural integrity, radiation tolerance, health monitoring, diagnosis, maintenance, calibration, and initialization can affect the performance of systems and subsystems. It is thus imperative to employ innovative nondestructive evaluation methodologies to ensure quality and integrity of advanced space systems. Advancements in integrated multi-functional sensor systems, autonomous inspection approaches, distributed embedded sensors, roaming inspectors, and shape adaptive sensors are sought. Concepts in computational models for signal processing and data interpretation to establish quantitative characterization and event determination are also of interest. Prospective evaluation technologies include ultrasonics, laser ultrasonics, optics and fiber optics, shearography, video optics and metrology, thermography, electromagnetics, acoustic emission, x-ray, data management, biomimetics, and nano-scale sensing approaches for structural health monitoring.

  17. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world however poses new challenges for conceptual inform

  18. Information cascade, Kirman's ant colony model, and kinetic Ising model

    CERN Document Server

    Hisakado, Masato

    2014-01-01

    In this paper, we discuss a voting model in which voters can obtain information from a finite number of previous voters. There exist three groups of voters: (i) digital herders and independent voters, (ii) analog herders and independent voters, and (iii) tanh-type herders. In our previous paper, we used the mean field approximation for case (i). In that study, if the reference number r is above three, phase transition occurs and the solution converges to one of the equilibria. In contrast, in the current study, the solution oscillates between the two equilibria, that is, good and bad equilibria. In this paper, we show that there is no phase transition when r is finite. If the annealing schedule is adequately slow from finite r to infinite r, the voting rate converges only to the good equilibrium. In case (ii), the state of reference votes is equivalent to that of Kirman's ant colony model, and it follows a beta-binomial distribution. In case (iii), we show that the model is equivalent to the finite-size kinetic...
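
    The connection to Kirman's ant colony model in case (ii) can be illustrated with a small simulation: each agent either copies a randomly met agent (herding) or, with probability epsilon, switches state spontaneously. The parameter values below are illustrative assumptions, not taken from the paper:

```python
import random

def kirman_step(k, n, epsilon, rng):
    """One update of Kirman's ant colony model.

    k: number of agents currently in state 1 (0 <= k <= n).
    A randomly chosen agent flips spontaneously with probability epsilon;
    otherwise it meets a random other agent and copies its state (herding).
    """
    agent_in_state_1 = rng.random() < k / n
    if rng.random() < epsilon:
        flip = True  # spontaneous switch
    else:
        # state of a randomly met other agent
        other_in_state_1 = rng.random() < (k - agent_in_state_1) / (n - 1)
        flip = other_in_state_1 != agent_in_state_1
    if flip:
        return k - 1 if agent_in_state_1 else k + 1
    return k

def simulate(n=100, epsilon=0.01, steps=20000, seed=42):
    rng = random.Random(seed)
    k = n // 2
    history = []
    for _ in range(steps):
        k = kirman_step(k, n, epsilon, rng)
        history.append(k)
    return history

if __name__ == "__main__":
    hist = simulate()
    # With small epsilon the process spends long spells near 0 or n,
    # reflecting the bimodal (beta-binomial) stationary distribution.
    print(min(hist), max(hist))
```

    The stationary distribution of k in this process is beta-binomial, which is the equivalence the abstract refers to for the state of reference votes.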

  19. A Dynamic Model of Information and Entropy

    Directory of Open Access Journals (Sweden)

    Stuart D. Walker

    2010-01-01

    We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electro-magnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system.

  20. A new model of information behaviour based on the Search Situation Transition schema

    Directory of Open Access Journals (Sweden)

    Nils Pharo

    2004-01-01

    This paper presents a conceptual model of information behaviour. The model is part of the Search Situation Transition method schema. The method schema is developed to discover and analyse the interplay between phenomena traditionally analysed as factors influencing either information retrieval or information seeking. In this paper the focus is on the model's five main categories: the work task, the searcher, the social/organisational environment, the search task, and the search process. In particular, the search process and its sub-categories, search situation and transition, and the relationship between these are discussed. To justify the method schema, an empirical study was designed according to the schema's specifications. In the paper a subset of the study is presented, analysing the effects of work tasks on Web information searching. Findings from this small-scale study indicate a strong relationship between the work task goal and the level of relevance used for judging resources during search processes.

  1. Language Models With Meta-information

    NARCIS (Netherlands)

    Shi, Y.

    2014-01-01

    Language modeling plays a critical role in natural language processing and understanding. Starting from a general structure, language models are able to learn natural language patterns from rich input data. However, the state-of-the-art language models only take advantage of words themselves, which

  2. Strategies for Achieving Quality Assurance in Science Education in ...

    African Journals Online (AJOL)

    Nekky Umera

    all the staff in the functions of planning, execution, monitoring and evaluation using set .... designed questionnaire tagged Strategies for Achieving Quality. Assurance in .... Governmental Workshop on Regional Accreditation Modelling and. Accrediting the ... Enugu: Fourth Dimension Publishing Company Ltd. Walkin, L.

  3. An Integrated Model of Information Literacy, Based upon Domain Learning

    Science.gov (United States)

    Thompson, Gary B.; Lathey, Johnathan W.

    2013-01-01

    Introduction. Grounded in Alexander's model of domain learning, this study presents an integrated micro-model of information literacy. It is predicated upon the central importance of domain learning for the development of the requisite research skills by students. Method. The authors reviewed previous models of information literacy and…

  4. NASA's Approach to Software Assurance

    Science.gov (United States)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  5. A process Approach to Information Services: Information Search Process (ISP Model

    Directory of Open Access Journals (Sweden)

    Hamid Keshavarz

    2010-12-01

    Information seeking is a behavior emerging from the interaction between the information seeker and the information system, and should be regarded as an episodic process, so as to meet the information needs of users, with different roles taken at its different stages. The present article introduces a process approach to information services in libraries using Carol Collier Kuhlthau's model. In this model, information seeking is regarded as a process consisting of six stages, in each of which users have different thoughts, feelings and actions, and librarians correspondingly take different roles at each stage. These six stages are derived from constructivist learning theory based on the uncertainty principle. Despite some shortcomings, this model may be regarded as a new solution for rendering modern information services in libraries, especially in relation to new information environments and media.

  6. Information, complexity and efficiency: The automobile model

    Energy Technology Data Exchange (ETDEWEB)

    Allenby, B. [Lucent Technologies (United States)]|[Lawrence Livermore National Lab., CA (United States)

    1996-08-08

    The new, rapidly evolving field of industrial ecology - the objective, multidisciplinary study of industrial and economic systems and their linkages with fundamental natural systems - provides strong ground for believing that a more environmentally and economically efficient economy will be more information intensive and complex. Information and intellectual capital will be substituted for the more traditional inputs of materials and energy in producing a desirable, yet sustainable, quality of life. While at this point this remains a strong hypothesis, the evolution of the automobile industry can be used to illustrate how such substitution may, in fact, already be occurring in an environmentally and economically critical sector.

  7. Information systems validation using formal models

    Directory of Open Access Journals (Sweden)

    Azadeh Sarram

    2014-03-01

    During the past few years, there has been growing interest in using the unified modeling language (UML) to capture functional requirements. However, the lack of a tool to check the accuracy and logic of diagrams in this language makes a formal model indispensable. In this study, the primary UML model of a system is converted to a colored Petri net in order to examine the precision of the model. For this purpose, first, definitions of priority and implementation tags for the UML activity diagram are provided; then the diagram is turned into a colored Petri net. Second, the proposed model translates the tags into net transitions, and some monitors are used to control the system characteristics. Finally, an executable model of the UML activity diagram is provided so that the designer can simulate the model, using the simulation results to detect and refine problems in the model. By checking the results, we find that the proposed method enhances the authenticity and accuracy of early models, and the ratio of system validation increases compared with previous methods.
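
    The idea of executing a model as a Petri net can be illustrated with a minimal (uncolored) sketch; the class, the workflow places and the transition names below are illustrative assumptions, not the paper's implementation:

```python
# Minimal Petri net: places hold token counts; transitions consume tokens
# from input places and produce tokens in output places.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)
        self.transitions = {}

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Example: a tiny approval workflow, as might be derived from an activity diagram.
net = PetriNet({"submitted": 1})
net.add_transition("review", {"submitted": 1}, {"reviewed": 1})
net.add_transition("approve", {"reviewed": 1}, {"approved": 1})
net.fire("review")
net.fire("approve")
# → marking: {'submitted': 0, 'reviewed': 0, 'approved': 1}
```

    Simulating the net in this way makes ordering errors visible: a transition whose input place never receives a token points to a flaw in the original activity diagram.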

  8. Information Modeling for Direct Control of Distributed Energy Resources

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    a desired accumulated response. In this paper, we design such an information model based on the markets that the aggregator participates in and based on the flexibility characteristics of the remote controlled DERs. The information model is constructed in a modular manner making the interface suitable...... for a whole range of different DERs. The devised information model can serve as input to the international standardization efforts on distributed energy resources....

  9. Academic Support through Information System : Srinivas Integrated Model

    OpenAIRE

    Aithal, Sreeramana; Kumar, Suresh

    2016-01-01

    As part of imparting quality higher education for undergraduate and post graduate students, Srinivas Institute of Management Studies (SIMS) developed an education service model for integrated academic support known as Srinivas Integrated Model. Backed by the presumption that knowledge is power and information is fundamental to knowledge building and knowledge sharing, this model is aimed to provide information support to students for improved academic performance. Information on the college a...

  10. Model driven geo-information systems development

    NARCIS (Netherlands)

    Morales Guarin, J.M.; Ferreira Pires, Luis; van Sinderen, Marten J.; Williams, A.D.

    Continuous change of user requirements has become a constant for geo-information systems. Designing systems that can adapt to such changes requires an appropriate design methodology that supports abstraction, modularity and other mechanisms to capture the essence of the system and help controlling

  11. Geographical information modelling for land resource survey

    NARCIS (Netherlands)

    Bruin, de S.

    2000-01-01

    The increasing popularity of geographical information systems (GIS) has at least three major implications for land resources survey. Firstly, GIS allows alternative and richer representation of spatial phenomena than is possible with the traditional paper map. Secondly, digital technology has improv

  12. Empirical modeling of information communication technology usage ...

    African Journals Online (AJOL)

    Hennie

    2015-11-01

    Nov 1, 2015 ... Information Communication Technology (ICT) usage behavior), based on ... highly integrated ICT schemes with competent personnel, using ICTs in .... skilled business teachers, office administrators and ... management of various aspects of the learning ..... Library, 31(6):792-807. doi: 10.1108/EL-04-2012-.

  13. Challenges in Information Retrieval and Language Modeling

    NARCIS (Netherlands)

    Allen, J.; Aslam, J.; Belkin, N.; Buckley, C.; Callan, J.; Croft, W.B.; Dumais, S.; Fuhr, N.; Harman, D.; Harper, D.J.; Hiemstra, D.; Hofmann, T.; Hovey, E.; Kraaij, W.; Lafferty, J.; Lavrenko, V.; Lewis, D.; Liddy, L.; Manmatha, R.; McCallum, A.; Ponte, J.; Prager, J.; Radev, D.; Resnik, P.; Robertson, S.E.; Rosenfeld, R.; Roukos, S.; Sanderson, M.; Schwartz, R.; Singhal, A.; Smeaton, A.; Turtle, H.; Voorhees, E.M.; Weischedel, R.; Xu, J.; Zhai, B.C.

    2003-01-01

    Information retrieval (IR) research has reached a point where it is appropriate to assess progress and to define a research agenda for the next five to ten years. This report summarizes a discussion of IR research challenges that took place at a recent workshop. The attendees of the workshop conside

  14. Quality assurance, information tracking, and consumer labeling.

    Science.gov (United States)

    Caswell, Julie A

    2006-01-01

    Reducing marine-based public health risk requires strict control of several attributes of seafood products, often including location and conditions of catch or aquaculture, processing, and handling throughout the supply chain. Buyers likely will also be interested in other attributes of these products such as eco-friendliness or taste. Development of markets for improved safety, as well as for other quality attributes, requires an effective certification and tracking of these attributes as well as their communication to buyers. Several challenges must be met if labeling, particularly consumer labeling, is to support the development of markets for improved seafood safety.

  15. AFRL/Cornell Information Assurance Institute

    Science.gov (United States)

    2007-11-02

    Election Protocol for Large Groups. DISC 2000 (Toledo, Spain, October 2000). (61) Z. Haas, J. Y. Halpern and L. Li. Gossip-based ad hoc routing. Pro...Joseph Y. Halpern, Robert Harper, Neil Immerman, Phokion G. Kolaitis, Moshe Y. Vardi, and Victor Vianu. The Unusual Effectiveness of Logic in Computer

  16. AFRL/Cornell Information Assurance Institute

    Science.gov (United States)

    2007-03-01

    placement problems, using a complete system implementation on 46 Tmote Sky motes, demonstrating significant advantages over existing methods...Program Construction (MPC 2004) (Stirling, Scotland, July 2004), vol. 3125 of Lecture Notes in Computer Science, Springer-Verlag, 400 pages. (202

  17. Modeling behavioral considerations related to information security.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Moyano, I. J.; Conrad, S. H.; Andersen, D. F. (Decision and Information Sciences); (SNL); (Univ. at Albany)

    2011-01-01

    The authors present experimental and simulation results of an outcome-based learning model for the identification of threats to security systems. This model integrates judgment, decision-making, and learning theories to provide a unified framework for the behavioral study of upcoming threats.

  18. Teacher Modeling Using Complex Informational Texts

    Science.gov (United States)

    Fisher, Douglas; Frey, Nancy

    2015-01-01

    Modeling in complex texts requires that teachers analyze the text for factors of qualitative complexity and then design lessons that introduce students to that complexity. In addition, teachers can model the disciplinary nature of content area texts as well as word solving and comprehension strategies. Included is a planning guide for think aloud.

  19. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. From this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the base for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  20. Construction Cost Prediction by Using Building Information Modeling

    National Research Council Canada - National Science Library

    Remon F. Aziz

    2015-01-01

    The increased interest in using Building Information Modeling (BIM) in detailed construction cost estimates calls for methodologies to evaluate the effectiveness of BIM-Assisted Detailed Estimating (BADE...

  1. Quality Assurance Decisions with Air Models: A Case Study of ImputatIon of Missing Input Data Using EPA's Multi-Layer Model

    Science.gov (United States)

    Environmental models are frequently used within regulatory and policy frameworks to estimate environmental metrics that are difficult or impossible to physically measure. As important decision tools, the uncertainty associated with the model outputs should impact their ...

  2. [Quality assurance in acupuncture therapy].

    Science.gov (United States)

    Kubiena, G

    1996-04-01

    Quality assurance for acupuncture therapy requires good basic and ongoing training both in conventional western medicine and in the theory and practice of acupuncture, the ability to synthesize the patient's objective findings and subjective feelings, and honesty with the patient and towards oneself. Thus, based on continuous critical evaluation of the objective and subjective parameters, the question of whether acupuncture is the optimal form of therapy for the specific case is honestly answered, and one has the courage to admit failures. With regard to theory, surveys of the acupuncture literature show that a considerable improvement in quality and honesty is necessary. There is a lack of standardised experimental methods (e.g., 28 different placebos in 28 different studies!). German acupuncture journals in particular have a disturbed relationship to failure. To hide or deny failures is of no benefit to acupuncture, to science, or to the relationship between physician and patient, since the practitioner must be able to rely on the information in the literature. Furthermore, one should be open-minded towards alternative methods, even if this means referring a patient to a colleague.

  3. Spinal Cord Injury Model System Information Network

    Science.gov (United States)


  4. Building Information Modeling for Managing Design and Construction

    DEFF Research Database (Denmark)

    Berard, Ole Bengt

    outcome of construction work. Even though contractors regularly encounter design information problems, these issues are accepted as a condition of doing business, and better design information has yet to be defined. Building information modeling has the inherent promise of improving the quality of design...... information by suggesting technologies and methods that are supposed to improve design information. However, building information modeling provides no means to assess these improvements of design information. This research introduces design information quality as an equivalent to information quality...... of five points, ranging from traditional to most innovative practice. However, since technology and practice change rapidly, the definition of each score has to be adjusted regularly. Finally, the framework is applied to a construction project in order to evaluate its practical application. The framework...

  5. [Use of clinical databanks with special reference to gerontologic-geriatric quality assurance].

    Science.gov (United States)

    Thiesemann, R; Kruse, W H; Meier-Baumgartner, H P

    1997-01-01

    Geriatric institutions enforce clinical documentation in order to assure quality of care. Different means include basic information of the clinical course, data gathering by administration and extractions from research projects. The use of electronic data bases and common data processing shall provide a base for the development of a cooperation network, academic progress, quality assurance programs and models of utilization review. In this article, clinical data bases are defined and described with reference to their organization. Data elements collected depend on the focus and function of a data base which must be considered in developing a quality assurance program. Usually there is a focus on 1) therapy, device or procedures or 2) diseases or populations. However, the measurement of variables concerning health aspects of older patients crosses more than one dimension. Geriatric teams may have an advantage in developing a successful data base because of the fact that this requires a multidisciplinary team. It is necessary to follow principles of quality assurance and well-defined data base design in order to succeed in enforcing the objectives mentioned above.

  6. Vector space model for document representation in information retrieval

    Directory of Open Access Journals (Sweden)

    Dan MUNTEANU

    2007-12-01

    This paper presents the basics of information retrieval: the vector space model for document representation, with Boolean and term-weighted models, ranking methods based on the cosine factor, and evaluation measures: recall, precision and the combined measure.
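
    The cosine-factor ranking summarized above can be sketched in a few lines of Python; the toy documents and query are illustrative, and raw term frequencies are used in place of a full term-weighting scheme such as tf-idf:

```python
import math
from collections import Counter

def tf_vector(text):
    """Term-frequency vector for a lowercased, whitespace-tokenized text."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

docs = [
    "information retrieval with the vector space model",
    "cooking recipes for pasta",
    "ranking documents by cosine similarity in information retrieval",
]
query = tf_vector("vector space information retrieval")

# Rank document indices by decreasing cosine similarity to the query.
ranked = sorted(range(len(docs)),
                key=lambda i: cosine(query, tf_vector(docs[i])),
                reverse=True)
# → [0, 2, 1]: the unrelated cooking document ranks last.
```

    Recall and precision would then be computed by comparing a cutoff of this ranking against a set of documents judged relevant.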

  7. Quality Assurance Training Tracking (QATTS)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This is metadata documentation for the Quality Assurance Training Tracking System (QATTS), which tracks Quality Assurance training given by R7 QA staff to in-house...

  8. BUILDING "BRIDGES" WITH QUALITY ASSURANCE

    Science.gov (United States)

    The paper describes how, rather than building "bridges" across centuries, quality assurance (QA) personnel have the opportunity to build bridges across technical disciplines, between public and private organizations, and between different QA groups. As reviewers and auditors of a...

  9. Building information modeling based on intelligent parametric technology

    Institute of Scientific and Technical Information of China (English)

    ZENG Xudong; TAN Jie

    2007-01-01

    In order to advance the informatization of the building industry, promote sustainable architectural design, and enhance the competitiveness of China's building industry, the author studies building information modeling (BIM) based on intelligent parametric modeling technology. Building information modeling is a new technology in the field of computer-aided architectural design, which contains not only geometric data but also a great amount of engineering data covering the whole lifecycle of a building. The author also compares BIM technology with two-dimensional CAD technology, and demonstrates the advantages and characteristics of intelligent parametric modeling technology. Building information modeling, which is based on intelligent parametric modeling technology, will certainly replace traditional computer-aided architectural design and become the new driving force pushing forward China's building industry in this information age.

  10. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  11. Model-driven design of geo-information services

    NARCIS (Netherlands)

    Morales Guarin, J.M.; Morales Guarin, Javier Marcelino

    2004-01-01

    This thesis presents a method for the development of distributed geo-information systems. The method is organised around the design principles of modularity, reuse and replaceability. The method enables the modelling of both behavioural and informational aspects of geo-information systems in an inte

  12. A Model for Information Integration Using Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    C. Punitha Devi

    2014-06-01

    Business agility remains the keyword that drives business in different directions, enabling a 360-degree shift in the business process. To achieve agility, an organization should work on real-time information and data. The need for instant access to information appears to be a perennial requirement of all organizations and enterprises. Access to information does not come directly from a single query but from a complex process termed information integration. Information integration has existed for the past two decades and has been progressing ever since, yet challenges and issues persist as the information integration problem evolves. This paper addresses the issues in the approaches, techniques and models pertaining to information integration and identifies the need for a complete model. As SOA is the architectural style that is changing business patterns today, this paper proposes a service-oriented model for information integration. The model mainly focuses on giving a complete structure for information integration that is adaptable to any environment and open in nature. Here information is converted into a service, and the information services are then integrated through service-oriented integration to provide the integrated information, itself delivered as a service.

  13. The Implementing Model of Empowering Eight for Information Literacy

    Science.gov (United States)

    Boeriswati, Endry

    2012-01-01

    Information literacy is the awareness and skills to identify, locate, evaluate, organize, create, use and communicate information to solve or resolve problems. This article is the result of the research on the efforts to improve students' problem-solving skills in the "Research Methods" course through "Empowering Eight: Information Literacy Model"…

  14. Model-driven design of geo-information services

    NARCIS (Netherlands)

    Morales Guarin, Javier Marcelino

    2004-01-01

    This thesis presents a method for the development of distributed geo-information systems. The method is organised around the design principles of modularity, reuse and replaceability. The method enables the modelling of both behavioural and informational aspects of geo-information systems in an inte

  15. Revisiting the JDL Model for Information Exploitation

    Science.gov (United States)

    2013-07-01


  16. Cyber-assurance for the Internet of Things

    CERN Document Server

    2017-01-01

    This book discusses the cyber-assurance needs of the IoT environment, highlighting key information assurance (IA) IoT issues and identifying the associated security implications. Through contributions from cyber-assurance, IA, information security and IoT industry practitioners and experts, the text covers fundamental and advanced concepts necessary to grasp current IA issues, challenges, and solutions for the IoT. The future trends in IoT infrastructures, architectures and applications are also examined. Other topics discussed include the IA protection of IoT systems and information being stored, processed or transmitted from unauthorized access or modification of machine-2-machine (M2M) devices, radio-frequency identification (RFID) networks, wireless sensor networks, smart grids, and supervisory control and data acquisition (SCADA) systems. The book also discusses IA measures necessary to detect, protect, and defend IoT information and networks/systems to ensure their availability, integrity, authentication...

  17. Information model construction of MES oriented to mechanical blanking workshop

    Science.gov (United States)

    Wang, Jin-bo; Wang, Jin-ye; Yue, Yan-fang; Yao, Xue-min

    2016-11-01

    Manufacturing Execution System (MES) is one of the crucial technologies for implementing informatization management in manufacturing enterprises, and the construction of its information model is the basis of MES database development. Based on an analysis of the manufacturing process information in a mechanical blanking workshop and the information requirements of each MES function module, the IDEF1X method was adopted to construct an information model of MES oriented to the mechanical blanking workshop. A detailed description of the data structures included in each MES function module, and of their logical relationships, is given from the point of view of information relationships, which lays the foundation for the design of the MES database.

  18. Norms, standards, models and recommendations for information security management

    Directory of Open Access Journals (Sweden)

    Karol Kreft

    2010-12-01

    Information is a factor which can decide the potential and market value of a company. An increase in the value of the intellectual capital of an information-driven company requires the development of an effective security management system. More and more often, companies develop information security management systems (ISMS) based on already verified models. In the article, the main problems with the management of information security are discussed. Security models are described, as well as risk analysis in information security management.

  19. Sustainability Product Properties in Building Information Models

    Science.gov (United States)

    2012-09-01


  20. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions; these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
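The AIC described in the blurb is simple to compute from a model's maximized log-likelihood, AIC = 2k − 2 ln L. The sketch below also includes the small-sample correction AICc; the log-likelihood values for the three hypothetical models are invented for illustration:

```python
import math

def aic(log_lik, k):
    # Akaike's Information Criterion: 2k - 2 * maximized log-likelihood.
    return 2 * k - 2 * log_lik

def aicc(log_lik, k, n):
    # Small-sample bias correction (valid when n - k - 1 > 0).
    return aic(log_lik, k) + 2 * k * (k + 1) / (n - k - 1)

# Hypothetical maximized log-likelihoods (value, parameter count).
models = {"3-param": (-52.1, 3), "5-param": (-50.8, 5), "8-param": (-50.2, 8)}
n = 40  # sample size

scores = {name: aicc(ll, k, n) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 2))  # → 3-param 110.87
```

Here the extra parameters of the larger models do not improve the fit enough to pay their AICc penalty, so the simplest model is selected.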

  1. Modelling and Retrieving Audiovisual Information - A Soccer Video Retrieval System

    NARCIS (Netherlands)

    Woudstra, A.; Velthausz, D.D.; Poot, de H.J.G.; Moelaart El-Hadidy, F.; Jonker, W.; Houtsma, M.A.W.; Heller, R.G.; Heemskerk, J.N.H.

    1998-01-01

    This paper describes the results of an ongoing collaborative project between KPN Research and the Telematics Institute on multimedia information handling. The focus of the paper is the modelling and retrieval of audiovisual information. The paper presents a general framework for modeling multimedia

  2. Conceptual Modeling of Events as Information Objects and Change Agents

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    as a totality of an information object and a change agent. When an event is modeled as an information object it is comparable to an entity that exists only at a specific point in time. It has attributes and can be used for querying and specification of constraints. When an event is modeled as a change agent...

  3. An equilibrium search model of the informal sector

    OpenAIRE

    2006-01-01

    We use an equilibrium search framework to model a formal-informal sector labour market where the informal sector arises endogenously. In our model, large firms will be in the formal sector and pay a wage premium, while small firms are characterised by low wages and tend to be in the informal sector. Using data from the South African labour force survey, we illustrate that the data are consistent with these predictions.

  4. Building Information Modeling in engineering teaching

    DEFF Research Database (Denmark)

    Andersson, Niclas; Andersson, Pernille Hammar

    2010-01-01

    The application of Information and Communication Technology (ICT) in construction supports business as well as project processes by providing integrated systems for communication, administration, quantity takeoff, time scheduling, cost estimating and progress control, among other things. The rapid technological development of ICT systems and the increased application of ICT in industry significantly influence the management and organisation of construction projects; consequently, ICT has implications for the education of engineers and the preparation of students for their future professional careers. In engineering education there is an obvious aim to provide students with sufficient disciplinary knowledge in science and engineering principles. The implementation of ICT in engineering education requires, however, that valuable time and teaching efforts are spent on the adequate software training needed...

  5. Display of the information model accounting system

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2011-12-01

    This paper presents the accounting information system in public companies, business technology matrix and data flow diagram. The paper describes the purpose and goals of the accounting process, matrix sub-process and data class. Data flow in the accounting process and the so-called general ledger module are described in detail. Activities of the financial statements and determining the financial statements of the companies are mentioned as well. It is stated how the general ledger module should function and what characteristics it must have. Line graphs will depict indicators of the company’s business success, indebtedness and company’s efficiency coefficients based on financial balance reports, and profit and loss report.

  6. A generalized model via random walks for information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhuo-Ming, E-mail: zhuomingren@gmail.com [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Kong, Yixiu [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Shang, Ming-Sheng, E-mail: msshang@cigit.ac.cn [Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Zhang, Yi-Cheng [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland)

    2016-08-06

    There could exist a simple general mechanism lurking beneath collaborative filtering and the interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of a random walk on bipartite networks. By taking degree information into account, the proposed generalized model can recover collaborative filtering, the interdisciplinary physics approaches, and even broad extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy that uses hybrid degree information for objects of different popularity to improve the precision of the recommendation. - Highlights: • We propose a generalized recommendation model employing random-walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with hybrid degree information improves the precision of recommendation.
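As a rough illustration of random-walk dynamics on a bipartite user-object network, here is a minimal sketch of the classic two-step probabilistic-spreading (mass-diffusion) score; this is a generic textbook variant, not the authors' generalized model, and the toy network is invented:

```python
from collections import defaultdict

# Toy user-object bipartite network: user -> set of collected objects.
links = {"u1": {"a", "b"}, "u2": {"b", "c"}, "u3": {"a", "c", "d"}}
obj_users = defaultdict(set)
for u, objs in links.items():
    for o in objs:
        obj_users[o].add(u)

def probs_scores(target):
    # Step 1: each object collected by the target sends its unit resource
    # to the users holding it, divided by the object's degree.
    user_res = defaultdict(float)
    for o in links[target]:
        for u in obj_users[o]:
            user_res[u] += 1.0 / len(obj_users[o])
    # Step 2: each user redistributes the received resource to the
    # objects it collected, divided by the user's degree.
    obj_res = defaultdict(float)
    for u, r in user_res.items():
        for o in links[u]:
            obj_res[o] += r / len(links[u])
    # Recommend only objects the target has not collected yet.
    return {o: s for o, s in obj_res.items() if o not in links[target]}

print(probs_scores("u1"))  # 'c' scores higher than 'd' for user u1
```

The degree divisions in the two steps are exactly where degree information enters, which is the knob the generalized model tunes.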

  7. Decentralized Disturbance Accommodation with Limited Plant Model Information

    CERN Document Server

    Farokhi, F; Johansson, K H

    2011-01-01

    The design of optimal disturbance accommodation and servomechanism controllers with limited plant model information is considered in this paper. Their closed-loop performance is compared using a performance metric called the competitive ratio, which is the worst-case ratio of the cost of a given control design strategy to the cost of the optimal control design with full model information. It was recently shown that, when it comes to designing optimal centralized or partially structured decentralized state-feedback controllers with limited model information, the best control design strategy in terms of the competitive ratio is a static one. This is true even though the optimal structured decentralized state-feedback controller with full model information is dynamic. In this paper, we show that, in contrast, the best limited model information control design strategy for the disturbance accommodation problem gives a dynamic controller. We find an explicit minimizer of the competitive ratio and we show that it is undomina...

  8. Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Call interception additional network feature

    CERN Document Server

    International Organization for Standardization. Geneva

    2003-01-01

    Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Call interception additional network feature

  9. Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Call priority interruption and call priority interruption protection supplementary services

    CERN Document Server

    International Organization for Standardization. Geneva

    2003-01-01

    Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Call priority interruption and call priority interruption protection supplementary services

  10. Information technology - Telecommunications and information exchange between systems - Private Integrated Services Network - Specification, functional model and information flows - Call transfer supplementary service

    CERN Document Server

    International Organization for Standardization. Geneva

    2003-01-01

    Information technology - Telecommunications and information exchange between systems - Private Integrated Services Network - Specification, functional model and information flows - Call transfer supplementary service

  11. Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Recall supplementary service

    CERN Document Server

    International Organization for Standardization. Geneva

    2003-01-01

    Information technology - Telecommunications and information exchange between systems - Private integrated services network - Specification, functional model and information flows - Recall supplementary service

  12. Thoughts on Internal and External Quality Assurance

    Science.gov (United States)

    Zhang, Jianxin

    2012-01-01

    Quality assurance of higher education is made up of two parts: internal quality assurance (IQA) and external quality assurance (EQA). Both belong to a union of the coexistence and balance of yin and yang. But in reality there exists a paradox of "confusion of quality assurance (QA) subject consciousness, singularity of social QA and lack of QA…

  13. 10 CFR 71.37 - Quality assurance.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Quality assurance. 71.37 Section 71.37 Energy NUCLEAR... Package Approval § 71.37 Quality assurance. (a) The applicant shall describe the quality assurance program... quality assurance program that are applicable to the particular package design under...

  14. 40 CFR 51.363 - Quality assurance.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Quality assurance. 51.363 Section 51... Requirements § 51.363 Quality assurance. An ongoing quality assurance program shall be implemented to discover... impede program performance. The quality assurance and quality control procedures shall be...

  15. Thoughts on Internal and External Quality Assurance

    Science.gov (United States)

    Zhang, Jianxin

    2012-01-01

    Quality assurance of higher education is made up of two parts: internal quality assurance (IQA) and external quality assurance (EQA). Both belong to a union of the coexistence and balance of yin and yang. But in reality there exists a paradox of "confusion of quality assurance (QA) subject consciousness, singularity of social QA and lack of QA…

  16. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Software-defined networks (SDN) are a novel networking paradigm that has become an enabler technology for many modern applications such as network virtualization, policy-based access control and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. Hence there is an increasing need for means of assuring its correctness and security, and abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant are leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model in which reasoning is possible about confidentiality, and we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors’ wording.
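As an illustration of the kind of confidentiality check such a model enables, here is a minimal sketch that flags any flow from a higher security level to a strictly lower one over a two-point lattice. The node labels, levels, and flows are invented, and this is far simpler than the paper's semantic model:

```python
# Two-point security lattice: public < confidential.
LEVELS = {"public": 0, "confidential": 1}

# Hypothetical labelling of network endpoints and the flows between them.
node_level = {"tenantA_vm": "confidential",
              "tenantA_db": "confidential",
              "shared_switch": "public"}
flows = [("tenantA_vm", "tenantA_db"),
         ("tenantA_vm", "shared_switch")]

def violates_confidentiality(src, dst):
    # A flow leaks if data moves from a higher level to a lower one.
    return LEVELS[node_level[src]] > LEVELS[node_level[dst]]

leaks = [f for f in flows if violates_confidentiality(*f)]
print(leaks)  # → [('tenantA_vm', 'shared_switch')]
```

A real SDN verifier would apply this kind of lattice check transitively along every forwarding path, not just per link.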

  17. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Over the last three decades, China’s agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies relating to agricultural information services have been conducted in China, but few of them have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT-based information dissemination models in China and to share the knowledge and experience of applying emerging ICTs in disseminating agricultural information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China’s agricultural information dissemination systems and the different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed, and success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT-based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China’s experience and best practice in their endeavors to apply emerging ICTs in agricultural information dissemination and knowledge transfer.

  18. The Columbia River Protection Supplemental Technologies Quality Assurance Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fix, N. J.

    2008-03-12

    Pacific Northwest National Laboratory researchers are working on the Columbia River Protection Supplemental Technologies Project. This project is a U. S. Department of Energy, Office of Environmental Management-funded initiative designed to develop new methods, strategies, and technologies for characterizing, modeling, remediating, and monitoring soils and groundwater contaminated with metals, radionuclides, and chlorinated organics. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Technologies Project staff.

  19. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using the LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data in building energy simulation without an import/export process.

  20. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  1. Complexity vs. simplicity: groundwater model ranking using information criteria.

    Science.gov (United States)

    Engelhardt, I; De Aguinaga, J G; Mikat, H; Schüth, C; Liedl, R

    2014-01-01

    A groundwater model characterized by a lack of field data on hydraulic model parameters and boundary conditions, but with many observation data sets available for calibration, was investigated with respect to model uncertainty. Seven different conceptual models with a stepwise increase from 0 to 30 adjustable parameters were calibrated using PEST. Residuals, sensitivities, the Akaike information criterion (AIC and AICc), the Bayesian information criterion (BIC), and Kashyap's information criterion (KIC) were calculated for the set of seven inversely calibrated models of increasing complexity, and the likelihood of each model was then computed. Comparing only the residuals of the different conceptual models leads to overparameterization and a loss of certainty in the conceptual model approach. The model employing only uncalibrated hydraulic parameters, estimated from sedimentological information, obtained the worst AIC, BIC, and KIC values; using only sedimentological data to derive hydraulic parameters introduces a systematic error into the simulation results and cannot be recommended for generating a valuable model. For numerical investigations with large numbers of calibration data, the BIC and KIC select a simpler optimal model than the AIC. The model with 15 adjusted parameters was evaluated by the AIC as the best option and obtained a likelihood of 98%. Because the AIC disregards the potential model structure error, selection by the KIC is more appropriate. Sensitivities to piezometric heads were highest for the model with only five adjustable parameters, and the sensitivity coefficients were directly influenced by changes in the extracted groundwater volumes.
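A model "likelihood" such as the 98% reported above is typically obtained from Akaike weights, w_i ∝ exp(−Δ_i/2) with Δ_i = AIC_i − min AIC. The sketch below computes these weights for seven models of increasing complexity; the AIC values are invented for illustration, not taken from the study:

```python
import math

# Hypothetical AIC scores for seven calibrated models (lower is better),
# keyed by the number of adjustable parameters.
aic = {"p0": 250.0, "p5": 241.0, "p10": 236.5, "p15": 228.0,
       "p20": 236.0, "p25": 237.5, "p30": 239.0}

# Akaike weights: w_i proportional to exp(-delta_i / 2).
best_score = min(aic.values())
raw = {m: math.exp(-(v - best_score) / 2.0) for m, v in aic.items()}
total = sum(raw.values())
weights = {m: r / total for m, r in raw.items()}

best = max(weights, key=weights.get)
print(best, round(weights[best], 3))  # → p15 0.955
```

The weight of the best model is interpretable as its probability of being the best approximating model among the candidate set, which is how a likelihood near 98% for the 15-parameter model would arise.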

  2. Research on Modeling of Genetic Networks Based on Information Measurement

    Institute of Scientific and Technical Information of China (English)

    ZHANG Guo-wei; SHAO Shi-huang; ZHANG Ying; LI Hai-ying

    2006-01-01

    As the basis of the networks in biological organisms, genetic networks have attracted the attention of many researchers. Current methods for modeling genetic networks, especially Boolean network modeling, are analyzed. For modeling the genetic network, information theory is proposed for mining the relations between elements in the network. By calculating the values of information entropy and mutual entropy in a case study, the effectiveness of the method is verified.
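The entropy and mutual-entropy (mutual-information) calculation the abstract refers to can be sketched for binarized expression profiles as follows; the three gene profiles are invented for the example:

```python
import math
from collections import Counter

def entropy(xs):
    # Shannon entropy H(X) of a discrete sequence, in bits.
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), computed from the joint sequence.
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical binarized expression profiles over 8 samples.
g1 = [0, 0, 1, 1, 0, 1, 0, 1]
g2 = [0, 0, 1, 1, 0, 1, 1, 1]  # closely follows g1
g3 = [1, 0, 1, 0, 0, 1, 0, 1]  # unrelated pattern

print(mutual_information(g1, g2) > mutual_information(g1, g3))  # → True
```

A high mutual information between two genes' profiles is then taken as evidence of a regulatory relation when inferring the network structure.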

  3. Information for seasonal models of carbon fluxes in agroecosystems

    Energy Technology Data Exchange (ETDEWEB)

    King, A.W.; DeAngelis, D.L.

    1987-04-01

    This report is a compilation of information useful for constructing regionally differentiated models of seasonal carbon fluxes in the terrestrial biosphere. Two classes of information are presented. First, extant agroecosystem models that simulate the flux of carbon in a stand or whole field are reviewed. Second, empirical data on seasonal carbon fluxes are compiled. These reviews and compilations are extensive, but not exhaustive. No attempt is made to evaluate the usefulness of seasonal models and data.

  4. MAGIC: Model and Graphic Information Converter

    Science.gov (United States)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  5. Information security evaluation a holistic approach

    CERN Document Server

    Tashi, Igli

    2011-01-01

    Contents: What is Information Security?; Risk Management versus Security Management; Information Security Assurance: an Assessment Model; Evaluating the Organizational Dimension; Evaluating the Functional Dimension; Evaluating the Human Dimension; Evaluating the Compliance Dimension; Concluding Remarks; Bibliography; Index of Keywords and Concepts.

  6. A multi information dissemination model considering the interference of derivative information

    Science.gov (United States)

    Sun, Ling; Liu, Yun; Bartolacci, Michael R.; Ting, I.-Hsien

    2016-06-01

    With the tremendous growth of social network research, many information diffusion models have been proposed from multiple perspectives with the intent of identifying key factors. However, most models focus only on individual behavior patterns or the usage habits of social applications; the potential interrelationships between information items have not been explored. From this point of view, we propose an information interference model that takes the interrelationships between information items in a social network into account. The interference effect and the anti-interference ability of information during diffusion are analyzed in highly clustered regular networks as well as in random networks. We find that information diffusion in regular networks is more easily affected by interference information, whereas in random networks the negative consequence is a reduction of the information diffusion range. We also find that the individuals who know about the information are the main spreaders of interference. With respect to interference, random networks impose a higher timeliness requirement on it. Furthermore, simulation results indicate that increasing the initial forwarding probability of information is much more effective at reducing interference than increasing its influence.
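
A much simplified sketch of the kind of interference dynamics described above, assuming an Erdős–Rényi contact network and an SI-style update rule with a blocking "derivative information" state; the network, parameters and update rule are illustrative assumptions, not the paper's model.

```python
import random

random.seed(1)

def erdos_renyi(n, p_edge):
    """Random contact network as an adjacency dict (illustrative stand-in)."""
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p_edge:
                nbrs[i].add(j)
                nbrs[j].add(i)
    return nbrs

def spread(nbrs, p_fwd, p_noise, steps=30):
    """S: unaware, I: holds the original item, J: holds interference.

    Informed nodes forward the original with probability p_fwd; otherwise,
    with probability p_noise, they pass on derivative (interfering)
    information, which stops the receiver from ever adopting the original.
    """
    state = {v: "S" for v in nbrs}
    state[0] = "I"
    for _ in range(steps):
        updates = {}
        for v, s in state.items():
            if s != "I":
                continue
            for u in nbrs[v]:
                if state[u] == "S" and u not in updates:
                    if random.random() < p_fwd:
                        updates[u] = "I"
                    elif random.random() < p_noise:
                        updates[u] = "J"
        state.update(updates)
    return sum(s == "I" for s in state.values())

def avg_reach(nbrs, p_fwd, p_noise, trials=20):
    return sum(spread(nbrs, p_fwd, p_noise) for _ in range(trials)) / trials

g = erdos_renyi(200, 0.03)
low, high = avg_reach(g, 0.10, 0.3), avg_reach(g, 0.30, 0.3)
print(low, high)  # a higher initial forwarding probability widens the reach
```

Even in this toy version, raising the initial forwarding probability lets the original item outrun the blocking interference, echoing the simulation finding quoted above.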

  7. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, a meta-memory is defined to express the neuron or brain cortices based on biology and graph theory, and an intra-modular network is developed with a modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view.

  8. Developing a model of forecasting information systems performance

    Directory of Open Access Journals (Sweden)

    G. N. Isaev

    2017-01-01

    Full Text Available Research aim: to develop a model to forecast the performance of information systems as a mechanism for preliminary assessment of information system effectiveness before financing of the information system project begins. Materials and methods: the starting material was the results of studying the parameters of the statistical structure of information system data processing defects. Methods of cluster analysis and regression analysis were applied. Results: in order to reduce financial risks, information system customers try to make decisions on the basis of preliminary calculations of the effectiveness of future information systems. However, the assumptions of the techno-economic justification of the project can only be obtained once funding for design work is already open. The evaluation can instead be done before starting project development using a model that forecasts information system performance. The model is developed using regression analysis in the form of a multiple linear regression. The value of information system performance is the predicted variable in the regression equation; the values of data processing defects in the classes of accuracy, completeness and timeliness are the forecast variables. Measurement and evaluation of the parameters of the statistical structure of defects were done with cluster analysis and regression analysis programs. Calculations determining the actual and forecast values of information system performance were conducted. Conclusion: to implement the model, a study of information systems was carried out and a forecasting model of information system performance was developed. The experimental work showed the adequacy of the model. The model is implemented in the complex task of designing information systems in education and industry.
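
The regression mechanism described above can be sketched with ordinary least squares over hypothetical defect counts; the data, variable values and the helper `fit_linear` are invented for illustration and do not reproduce the study's model.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting.
    Each row of X already carries a leading 1 for the intercept."""
    m, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(m)) for j in range(k)]
         + [sum(X[r][i] * y[r] for r in range(m))] for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Hypothetical rows: [1, accuracy defects, completeness defects, timeliness defects]
X = [[1, 12, 7, 3], [1, 8, 5, 2], [1, 20, 11, 6], [1, 5, 3, 1], [1, 15, 9, 4]]
y = [0.71, 0.80, 0.52, 0.88, 0.63]  # observed performance per system

coeffs = fit_linear(X, y)
predicted = sum(c * v for c, v in zip(coeffs, [1, 10, 6, 3]))  # forecast for a new project
print(round(predicted, 3))
```

The forecast variables are the defect counts per quality class; performance is the predicted variable, exactly as in the regression equation the abstract describes.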

  9. Tracers and traceability: implementing the cirrus parameterisation from LACM in the TOMCAT/SLIMCAT chemistry transport model as an example of the application of quality assurance to legacy models

    Directory of Open Access Journals (Sweden)

    A. M. Horseman

    2010-03-01

    A new modelling tool for the investigation of the large-scale behaviour of cirrus clouds has been developed. It combines two existing models: the TOMCAT/SLIMCAT chemistry transport model (nupdate library version 0.80, script mpc346_l) and the cirrus parameterisation of Ren and MacKenzie (LACM; implementation not versioned). The development process employed a subset of best-practice software engineering and quality assurance processes, selected to be viable for small-scale projects whilst maintaining the same traceability objectives. The application of the software engineering and quality control processes during the development has been shown to be not a great overhead, and their use has been of benefit to the developers as well as to the end users of the results. We provide a step-by-step guide to the implementation of traceability tailored to the production of geo-scientific research software, as distinct from commercial and operational software. Our recommendations include: maintaining a living "requirements list"; explicit consideration of unit, integration and acceptance testing; and automated revision/configuration control, including control of analysis tool scripts and programs.

    Initial testing of the resulting model against satellite and in-situ measurements has been promising. The model produces representative results for both spatial distribution of the frequency of occurrence of cirrus ice, and the drying of air as it moves across the tropical tropopause. The model is now ready for more rigorous quantitative testing, but will require the addition of a vertical wind velocity downscaling scheme to better represent extra-tropical continental cirrus.

  10. Tracers and traceability: implementing the cirrus parameterisation from LACM in the TOMCAT/SLIMCAT chemistry transport model as an example of the application of quality assurance to legacy models

    Directory of Open Access Journals (Sweden)

    A. M. Horseman

    2009-11-01

    A new modelling tool for the investigation of the large-scale behaviour of cirrus clouds has been developed. It combines two existing models: the TOMCAT/SLIMCAT chemistry transport model (nupdate library version 0.80, script mpc346_l) and the cirrus parameterisation of Ren and MacKenzie (LACM; implementation not versioned). The development process employed a subset of best-practice software engineering and quality assurance processes, selected to be viable for small-scale projects whilst maintaining the same traceability objectives. The application of the software engineering and quality control processes during the development has been shown to be not a great overhead, and their use has been of benefit to the developers as well as to the end users of the results. We provide a step-by-step guide to the implementation of traceability tailored to the production of geo-scientific research software, as distinct from commercial and operational software. Our recommendations include: maintaining a living "requirements list"; explicit consideration of unit, integration and acceptance testing; and automated revision/configuration control, including control of analysis tool scripts and programs.

    Initial testing of the resulting model against satellite and in-situ measurements has been promising. The model produces representative results for both spatial distribution of the frequency of occurrence of cirrus ice, and the drying of air as it moves across the tropical tropopause. The model is now ready for more rigorous quantitative testing, but will require the addition of a vertical wind velocity downscaling scheme to better represent extra-tropical continental cirrus.

  11. Probing models of information spreading in social networks

    CERN Document Server

    Zoller, J

    2014-01-01

    We apply signal processing analysis to information spreading in scale-free networks. To reproduce typical behaviors obtained from the analysis of information spreading on the world wide web, we use a modified SIS model in which synergy effects and influential nodes are taken into account. This model depends on a single free parameter that characterizes the memory time of the spreading process. We show that, by means of fractal analysis, it is possible to gain information on the memory time of the underlying mechanism driving the information spreading process from aggregated, easily accessible data.
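
One common fractal-analysis route from an aggregated time series to a memory signature is rescaled-range (R/S) estimation of the Hurst exponent. The sketch below, run on synthetic white noise, is an assumption about the kind of analysis meant, not the authors' exact procedure.

```python
import math
import random

def rescaled_range(window):
    """R/S statistic: range of the mean-adjusted cumulative sum divided by
    the standard deviation of the window."""
    n = len(window)
    mean = sum(window) / n
    cum, csum = 0.0, []
    for x in window:
        cum += x - mean
        csum.append(cum)
    s = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return (max(csum) - min(csum)) / s

def hurst(series, sizes=(16, 32, 64, 128, 256)):
    """Slope of log(average R/S) against log(window size)."""
    xs, ys = [], []
    for w in sizes:
        chunks = [series[i:i + w] for i in range(0, len(series) - w + 1, w)]
        rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(w))
        ys.append(math.log(rs))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

random.seed(0)
white = [random.gauss(0, 1) for _ in range(4096)]
h = hurst(white)
print(round(h, 2))  # memoryless noise gives H near 0.5 (small-sample bias pushes it up a little)
```

A Hurst exponent well above 0.5 in aggregated spreading data would indicate long-range memory in the underlying process, which is the kind of inference the abstract describes.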

  12. Information Technology Model for Product Lifecycle Engineering

    Directory of Open Access Journals (Sweden)

    Bhanumathi KS

    2013-02-01

    An aircraft is a complex, multi-disciplinary, system-engineered product that requires real-time global technical collaboration throughout its life-cycle. The engineering data and processes which form the backbone of the aircraft should be under strict Configuration Control (CC). They should be model-based and allow for 3D visualization and manipulation. This requires accurate, real-time collaboration and concurrent-engineering-based business processes operating in an Integrated Digital Environment (IDE). The IDE uses a lightweight, neutral Computer Aided Design (CAD) Digital Mock-Up (DMU). The DMU deals with complex structural assemblies and systems of more than a hundred thousand parts created by engineers across the globe, each using diverse CAD, Computer Aided Engineering (CAE), Computer Aided Manufacturing (CAM), Computer Integrated Manufacturing (CIM), Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Customer Relationship Management (CRM) and Computer Aided Maintenance Management System (CAMMS) systems. In this paper, a comprehensive approach to making such an environment a reality is presented.

  13. Bayesian Case-deletion Model Complexity and Information Criterion.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Chen, Qingxia

    2014-10-01

    We establish a connection between Bayesian case influence measures for assessing the influence of individual observations and Bayesian predictive methods for evaluating the predictive performance of a model and comparing different models fitted to the same dataset. Based on such a connection, we formally propose a new set of Bayesian case-deletion model complexity (BCMC) measures for quantifying the effective number of parameters in a given statistical model. Its properties in linear models are explored. Adding some functions of BCMC to a conditional deviance function leads to a Bayesian case-deletion information criterion (BCIC) for comparing models. We systematically investigate some properties of BCIC and its connection with other information criteria, such as the Deviance Information Criterion (DIC). We illustrate the proposed methodology on linear mixed models with simulations and a real data example.
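
The deviance-based quantities that BCIC is compared against can be illustrated with the standard DIC construction, here for a one-parameter normal mean model with known variance; the data and posterior samples below are simulated for the example and are not from the paper.

```python
import math
import random

random.seed(0)

def deviance(theta, data, sigma=1.0):
    """D(theta) = -2 log-likelihood of a Normal(theta, sigma^2) model."""
    return sum((x - theta) ** 2 / sigma ** 2 + math.log(2 * math.pi * sigma ** 2)
               for x in data)

data = [random.gauss(2.0, 1.0) for _ in range(50)]

# Posterior of the mean under a flat prior: Normal(xbar, sigma^2 / n)
xbar = sum(data) / len(data)
post = [random.gauss(xbar, 1.0 / math.sqrt(len(data))) for _ in range(5000)]

mean_dev = sum(deviance(t, data) for t in post) / len(post)
dev_at_mean = deviance(sum(post) / len(post), data)
p_d = mean_dev - dev_at_mean         # effective number of parameters
dic = dev_at_mean + 2 * p_d
print(round(p_d, 2), round(dic, 1))  # p_d lands near 1 for this one-parameter model
```

Here `p_d` plays the same role as the BCMC measures in the abstract: it quantifies the effective number of parameters, and adding it to a deviance yields a model-comparison criterion.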

  14. A Conditioned Model for Choice of Mode Under Information

    Directory of Open Access Journals (Sweden)

    Agha Faisal Habib Pathan

    2013-07-01

    This paper examines the influence of time and cost information obtained from different sources on the choice of mode of Leeds' long-distance travellers. The choice of mode was investigated through modal attributes provided by at least two different information sources, which might provide contrary or corroborating information, rather than through actual attributes. The experiment included a telephone-administered questionnaire with RP (Revealed Preference) questions and an SP (Stated Preference) exercise dealing with the choice of modes conditioned by the information received from various sources. Information on travel time and cost was provided from two different information sources for each mode to facilitate the conditioning of mode choice on corroborating/contradictory information. The research employs a wide range of modelling methodologies and examines a range of traditional and newly developed calibration and estimation procedures, including Mixed Logit models with individual-specific parameters and the newly developed RRM (Random Regret Minimisation) framework. The study confirms that the market share of the modes increases when information sources show decreased travel time and cost values, and shows that the maximum shares are achieved when different information sources give the same information to the travellers. The study found that pre-trip time information has more influence on mode choice when derived from websites than from other sources. Pre-trip information on costs was, however, less influential when derived from websites than from other sources.
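
A mode-choice share conditioned on corroborating or contradictory reports can be sketched with a plain multinomial logit; the utility coefficients, modes and reported attributes below are hypothetical, and the paper's Mixed Logit and RRM specifications are not reproduced here.

```python
import math

def logit_shares(utilities):
    """Multinomial logit: P_i = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities.values())  # subtract the max for numerical stability
    expv = {k: math.exp(v - m) for k, v in utilities.items()}
    z = sum(expv.values())
    return {k: e / z for k, e in expv.items()}

def utility(time_min, cost_gbp, b_time=-0.02, b_cost=-0.05):
    """Hypothetical linear utility in reported travel time and cost."""
    return b_time * time_min + b_cost * cost_gbp

# Two information sources report (time, cost) for each mode; averaging them is
# one simple way to condition the choice on corroborating/contradictory reports.
reports = {"rail": [(140, 45.0), (150, 42.0)], "coach": [(260, 18.0), (250, 20.0)]}
avg = {m: (sum(t for t, _ in r) / len(r), sum(c for _, c in r) / len(r))
       for m, r in reports.items()}
shares = logit_shares({m: utility(t, c) for m, (t, c) in avg.items()})
print({m: round(s, 3) for m, s in shares.items()})  # -> {'rail': 0.726, 'coach': 0.274}
```

Lowering the reported time or cost of a mode raises its share, matching the study's qualitative finding about decreased attribute values from information sources.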

  15. Quality assurance: Importance of systems and standard operating procedures.

    Science.gov (United States)

    Manghani, Kishu

    2011-01-01

    It is mandatory for sponsors of clinical trials and contract research organizations alike to establish, manage and monitor their quality control and quality assurance systems, and their integral standard operating procedures and other quality documents, to provide high-quality products and services that fully satisfy customer needs and expectations. Quality control and quality assurance systems together constitute the key quality systems. Quality control and quality assurance are parts of quality management. Quality control is focused on fulfilling quality requirements, whereas quality assurance is focused on providing confidence that quality requirements are fulfilled. The quality systems must be commensurate with the company's business objectives and business model. Top management commitment and its active involvement are critical in order to ensure at all times the adequacy, suitability, effectiveness and efficiency of the quality systems. Effective and efficient quality systems can promote timely registration of drugs by eliminating waste and the need for rework, with overall financial and social benefits to the company.

  16. Acquisition of certification on quality assurance system ISO9002 in the Tokai Reprocessing Center

    Energy Technology Data Exchange (ETDEWEB)

    Masui, Jinichi; Kobayashi, Kentaro; Iwasaki, Shogo; Fukanoki, Shinji [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai Reprocessing Center, Tokai, Ibaraki (Japan)

    2002-03-01

    On September 6th 2001, Tokai Reprocessing Center obtained Certification under Quality Assurance System ISO9002: 2nd edition 1994 (JIS Z9902: 1998)-Model for quality assurance in production, installation and servicing. In Tokai Reprocessing Plant, quality assurance activities have been undertaken to contribute to the safety and stable operation of the plant based on the JEAG4101 since 1983. Since 1995, the establishment of a quality assurance system based on the ISO9000 series has been underway, and with the fire and explosion incident at the Bituminization Demonstration Facility as a turning point, this activity has been accelerated and certification obtained under ISO9002. These procedures have strengthened quality assurance activities in the plant operation and transparency of the business has been improved for society through an objective evaluation conducted by the International Organization for Standardization. This report describes the details of quality assurance activities until the acquisition of certification and the outline of the established quality assurance system. (author)

  17. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    Science.gov (United States)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance (OSMA) defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing

  18. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    Science.gov (United States)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing this

  19. Majoring in Information Systems: An Examination of Role Model Influence

    Science.gov (United States)

    Akbulut, Asli Y.

    2016-01-01

    The importance of role models on individuals' academic and career development and success has been widely acknowledged in the literature. The purpose of this study was to understand the influence of role models on students' decisions to major in information systems (IS). Utilizing a model derived from the social cognitive career theory, we…

  20. A qualitative model for temporal reasoning with incomplete information

    Energy Technology Data Exchange (ETDEWEB)

    Geffner, H. [Universidad Simon Bolivar, Caracas (Venezuela)

    1996-12-31

    We develop a qualitative framework for temporal reasoning with incomplete information that features a modeling language based on rules and a semantics based on infinitesimal probabilities. The framework relates logical and probabilistic models, and accommodates in a natural way features that have been found problematic in other models, such as non-determinism, action qualifications, parallel actions, and abduction to actions and fluents.

  1. The Pitts/Stripling Model of Information Literacy.

    Science.gov (United States)

    Veltze, Linda

    2003-01-01

    Examines the Pitts/Stripling model of information literacy, describing its key aspects and showing the relationship between the model and Stripling's vision of the library media center for the twenty-first century. The model promotes caring, student-centered, holistic, humanistic, and realistic library practices that are respectful of the…

  2. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  3. Food Security Information Platform Model Based on Internet of Things

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2015-06-01

    To meet the tracking and tracing requirements of food supply chain management and food quality and safety, this study builds a food security information platform using Internet of Things technology. With reference to the EPC standard, using RFID technology, adopting an SOA model, and building on SCOR core processes, the platform establishes traceability information for the whole process from source to consumption. It provides food information to consumers and government food safety regulators, strengthens food identity verification, prevents food misidentification, and offers good practices for food safety traceability.
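
The whole-process traceability chain described above can be sketched as an EPCIS-style event log keyed by RFID tag; the class names, step labels and EPC URIs below are illustrative assumptions, not the paper's schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TraceEvent:
    """One EPCIS-style observation: what happened to a tagged item, where, when."""
    epc: str        # Electronic Product Code carried by the RFID tag
    step: str       # supply-chain step (SCOR-like: source, make, deliver, ...)
    location: str
    timestamp: str

@dataclass
class TraceStore:
    events: List[TraceEvent] = field(default_factory=list)

    def record(self, epc: str, step: str, location: str, timestamp: str) -> None:
        self.events.append(TraceEvent(epc, step, location, timestamp))

    def trace(self, epc: str) -> List[Tuple[str, str, str]]:
        """Whole-process history of one item, from source to consumption."""
        return [(e.step, e.location, e.timestamp) for e in self.events if e.epc == epc]

store = TraceStore()
store.record("urn:epc:id:sgtin:001", "source",  "Farm A",      "2015-03-01")
store.record("urn:epc:id:sgtin:001", "make",    "Processor B", "2015-03-04")
store.record("urn:epc:id:sgtin:001", "deliver", "Retailer C",  "2015-03-07")
print(store.trace("urn:epc:id:sgtin:001"))
```

A consumer or regulator querying by tag retrieves the full ordered history, which is the identity-verification and traceability service the platform is meant to expose.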

  4. One decade of the Data Fusion Information Group (DFIG) model

    Science.gov (United States)

    Blasch, Erik

    2015-05-01

    The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst in providing semantic queries (through an ontology) so that the vast amount of available data can be indexed, accessed, retrieved, and processed. The second is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.

  5. Analyzing Traditional Medical Practitioners' Information-Seeking Behaviour Using Taylor's Information-Use Environment Model

    Science.gov (United States)

    Olatokun, Wole Michael; Ajagbe, Enitan

    2010-01-01

    This survey-based study examined the information-seeking behaviour of traditional medical practitioners using Taylor's information use model. Respondents comprised all 160 traditional medical practitioners that treat sickle cell anaemia. Data were collected using an interviewer-administered, structured questionnaire. Frequency and percentage…

  6. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    Science.gov (United States)

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows up a previous interview study…

  8. Information model for on-site inspection system

    Energy Technology Data Exchange (ETDEWEB)

    Bray, O.H.; Deland, S.

    1997-01-01

    This report describes the information model that was jointly developed as part of two FY93 LDRDs: (1) Information Integration for Data Fusion, and (2) Interactive On-Site Inspection System: An Information System to Support Arms Control Inspections. This report describes the purpose and scope of the two LDRD projects and reviews the prototype development approach, including the use of a GIS. Section 2 describes the information modeling methodology. Section 3 provides a conceptual data dictionary for the OSIS (On-Site Information System) model, which can be used in conjunction with the detailed information model provided in the Appendix. Section 4 discusses the lessons learned from the modeling and the prototype. Section 5 identifies the next steps: two alternate paths for future development. The long-term purpose of the On-Site Inspection LDRD was to show the benefits of an information system to support a wide range of on-site inspection activities for both offensive and defensive inspections. The database structure and the information system would support inspection activities under nuclear, chemical, biological, and conventional arms control treaties. This would allow a common database to be shared for all types of inspections, providing much greater cross-treaty synergy.

  9. Full feature data model for spatial information network integration

    Institute of Scientific and Technical Information of China (English)

    DENG Ji-qiu; BAO Guang-shu

    2006-01-01

    To address the difficulty of integrating data with different models in spatial information integration, the characteristics of raster structures, vector structures and mixed models were analyzed, and a hierarchical vector-raster integrative full-feature model was put forward by combining the advantages of the vector and raster models and using an object-oriented method. The data structures of the four basic features, i.e. point, line, surface and solid, were described. An application was analyzed and described, and the characteristics of this model were presented. In this model, all objects in the real world are divided into and described as features in a hierarchy, and all data are organized in vector form. This model can describe data based on feature, field, network and other models, and avoids the inability to integrate data based on different models, and to perform spatial analysis on them, in spatial information integration.
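
The four-feature hierarchy can be sketched as an object-oriented class tree; the class and method names below are illustrative assumptions, not the paper's definitions.

```python
class Feature:
    """Base feature: every real-world object carries a vector geometry."""
    def __init__(self, fid, coords):
        self.fid = fid
        self.coords = coords  # all data organized in vector form

    def dimension(self):
        raise NotImplementedError

class Point(Feature):
    def dimension(self):
        return 0

class Line(Feature):
    def dimension(self):
        return 1

class Surface(Feature):
    def dimension(self):
        return 2

class Solid(Feature):
    def dimension(self):
        return 3

# A common hierarchy for the four basic features lets feature-, field- and
# network-based data share one vector representation instead of separate code paths.
features = [Point("p1", [(0, 0)]),
            Line("l1", [(0, 0), (1, 1)]),
            Surface("s1", [(0, 0), (1, 0), (1, 1)])]
print([f.dimension() for f in features])  # [0, 1, 2]
```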

  10. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others

    1997-08-01

    This is China's first case study of high-resolution reservoir geological modelling using outcrop information. The key to the modelling process is to build a prototype model and use it as a geological knowledge bank. Outcrop information used in geological modelling includes seven aspects: (1) determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) determining the model's direction based on the paleocurrent statistics; (4) estimating the sandbody communication by photomosaic and profiles; (6) estimating reservoir properties distribution within the sandbody by lithofacies analysis; and (7) building the reservoir model at sandbody scale by architectural element analysis and 3-D sampling. A high-resolution reservoir geological model of the Youshashan oil field has been built using this method.

  11. Modernization of software quality assurance

    Science.gov (United States)

    Bhaumik, Gokul

    1988-01-01

    Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products provides a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following points, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition of a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper explains how this definition was developed and how it is used.

  12. Quality assurance in production and use of special form radioactive material - focal points in BAM approvals

    Energy Technology Data Exchange (ETDEWEB)

    Rolle, A.; Buhlemann, L. [Bundesanstalt fuer Materialforschung und -pruefung (BAM), Berlin (Germany)]

    2004-07-01

    BAM, as the competent authority for approval of special form radioactive material, attaches great importance to a detailed audit of the required quality assurance programs for design, manufacture, testing, documentation, use, maintenance and inspection. Applicants have to submit, together with the application documentation, information on general arrangements for quality assurance as well as on quality assurance in production and in operation. Fields where BAM has often found deficiencies are leak test methods, weld seam quality and the safety level after use.

  13. Quality assurance of qualitative analysis

    DEFF Research Database (Denmark)

    Ríos, Ángel; Barceló, Damiá; Buydens, Lutgarde

    2003-01-01

    The European Commission has supported the G6MA-CT-2000-01012 project on "Metrology of Qualitative Chemical Analysis" (MEQUALAN), which was developed during 2000-2002. The final result is a document produced by a group of scientists with expertise in different areas of chemical analysis, metrology...... and quality assurance. One important part of this document deals, therefore, with aspects involved in analytical quality assurance of qualitative analysis. This article shows the main conclusions reported in the document referring to the implementation of quality principles in qualitative analysis...

  14. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

    Theoretical and Applied Solutions in Multi Scale Mapping. Users have come to expect instant access to up-to-date geographical information with global coverage, presented at widely varying levels of detail as digital and paper products, and customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (National Mapping Agencies (NMAs), government departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review

  15. Communicate and collaborate by using building information modeling

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    Building Information Modeling (BIM) represents a new approach within the Architecture, Engineering, and Construction (AEC) industry, one that encourages collaboration and engagement of all stakeholders on a project. This study discusses the potential of adopting BIM as a communication...

  16. Semantic-Sensitive Web Information Retrieval Model for HTML Documents

    CERN Document Server

    Bassil, Youssef

    2012-01-01

    With the advent of the Internet, a new era of digital information exchange has begun. Currently, the Internet encompasses more than five billion online sites, and this number is increasing exponentially every day. Fundamentally, Information Retrieval (IR) is the science and practice of storing documents and retrieving information from within these documents. Mathematically, IR systems are at their core based on a feature vector model coupled with a term weighting scheme that weights terms in a document according to their significance with respect to the context in which they appear. Practically, the Vector Space Model (VSM), Term Frequency (TF), and Inverse Document Frequency (IDF) are among other long-established techniques employed in mainstream IR systems. However, present IR models only target generic-type text documents, in that they do not consider specific file formats such as HTML web documents. This paper proposes a new semantic-sensitive web information retrieval model for HTML documents. It consists of a...
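    As background to the VSM/TF-IDF machinery this abstract refers to, a minimal sketch of term weighting and cosine ranking; the toy corpus and tokenizer are invented for illustration, and real systems add stemming, normalization, and smoothing:

```python
# Minimal sketch of the classic vector space model with tf*idf weighting
# and cosine similarity. Corpus and whitespace tokenizer are illustrative.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Return one {term: weight} vector per document using tf * idf."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                       # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity of two sparse {term: weight} vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = ["web information retrieval", "semantic web documents",
        "information retrieval model"]
vecs = tfidf_vectors(docs)
print(round(cosine(vecs[0], vecs[2]), 3))
```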

  17. A Participatory Model for Multi-Document Health Information Summarisation

    Directory of Open Access Journals (Sweden)

    Dinithi Nallaperuma

    2017-03-01

    Increasing availability of and access to health information has been a paradigm shift in healthcare provision, as it empowers patients and practitioners alike. Besides awareness, significant time savings and process efficiencies can be achieved through effective summarisation of healthcare information. Relevance and accuracy are key concerns when generating summaries for such documents. Despite advances in automated summarisation approaches, the role of participation has not been explored. In this paper, we propose a new model for multi-document health information summarisation that takes into account the role of participation. The updated IS user participation theory was extended to explicate these roles. The proposed model integrates both extractive and abstractive summarisation processes with continuous participatory inputs to each phase. The model was implemented as a client-server application and evaluated by both domain experts and health information consumers. Results from the evaluation phase indicate that the model is successful in generating relevant and accurate summaries for diverse audiences.

  18. Recommendations concerning energy information model documentation, public access, and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wood, D.O.; Mason, M.J.

    1979-10-01

    A review is presented of the Energy Information Administration (EIA) response to Congressional and management concerns, relating specifically to energy information system documentation, public access to EIA systems, and scientific/peer evaluation. The relevant organizational and policy responses of EIA are discussed. An analysis of the model development process and approaches to, and organization of, model evaluation is presented. Included is a survey of model evaluation studies. A more detailed analysis of the origins of the legislated documentation and public access requirements is presented in Appendix A, and the results of an informal survey of other agency approaches to public access and evaluation is presented in Appendix B. Appendix C provides a survey of non-EIA activities relating to model documentation and evaluation. Twelve recommendations to improve EIA's procedures for energy information system documentation, evaluation activities, and public access are determined. These are discussed in detail. (MCW)

  19. Research and Development of Information Retrieval Models and Their Applications.

    Science.gov (United States)

    Fox, Edward A.

    1989-01-01

    This introduction to a special issue devoted to modeling data, information, and knowledge briefly describes the origins of the papers presented and the topics covered, which include: Boolean logic; probability theory; artificial intelligence; organizing and encoding information and data; and characteristics of users of retrieval systems. (12…

  20. Proposing a Metaliteracy Model to Redefine Information Literacy

    Science.gov (United States)

    Jacobson, Trudi E.; Mackey, Thomas P.

    2013-01-01

    Metaliteracy is envisioned as a comprehensive model for information literacy to advance critical thinking and reflection in social media, open learning settings, and online communities. At this critical time in higher education, an expansion of the original definition of information literacy is required to include the interactive production and…

  1. A MODEL INFORMATION SYSTEM FOR THE ADULT EDUCATION PROFESSION.

    Science.gov (United States)

    DECROW, ROGER

    A MODEL OF INFORMATION SERVICES FOR THE ADULT EDUCATION PROFESSION PROVIDES FOR--(1) ACCESS TO THE LITERATURE THROUGH BIBLIOGRAPHIES, REVIEWS, AND MECHANIZED RETRIEVAL, (2) PHYSICAL ACCESS (MAINLY IN MICROFORM), (3) SPECIALIZED INFORMATION SERVICES LINKED WITH ONE ANOTHER AND THE ERIC CLEARINGHOUSE ON ADULT EDUCATION, (4) COORDINATION, RESEARCH,…

  2. SHIR competitive information diffusion model for online social media

    Science.gov (United States)

    Liu, Yun; Diao, Su-Meng; Zhu, Yi-Xiang; Liu, Qing

    2016-11-01

    In online social media, opinion divergences and differentiations generally exist as a result of individuals' extensive participation and personalization. In this paper, a Susceptible-Hesitated-Infected-Removed (SHIR) model is proposed to study the dynamics of competitive dual information diffusion. The proposed model extends the classical SIR model by adding hesitators as a neutralized state of dual information competition; both hesitators and stable spreaders facilitate information dissemination. Examining the impact of the diffusion parameters shows that the final density of stiflers increases monotonically as the infection rate increases and the removal rate decreases, and that the advantaged information, i.e. the one with the larger stable transition rate, dominates the overall influence of the pair. The density of spreaders of the disadvantaged information grows slightly with its stable transition rate, while the total number of spreaders of both informations and the relaxation time remain almost unchanged. Moreover, simulations imply that the final result of the competition is closely related to the ratio of the stable transition rates of the dual information: if the two rates are nearly the same, a slight reduction of the smaller one brings a significant disadvantage in its propagation coverage. Additionally, the relationship between the ratio of final stiflers and the ratio of stable transition rates exhibits a power-law characteristic.
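    The S-H-I-R transition structure described here can be illustrated with a mean-field (ODE) sketch; the specific rate equations and parameter names (lam, theta1, theta2, delta) are assumptions for illustration and may differ from the paper's exact formulation:

```python
# Illustrative mean-field sketch of an SHIR-style competitive diffusion
# model: susceptibles S become hesitators H on contact with spreaders of
# either message, hesitators settle into stable spreaders I1 or I2 at
# their "stable transition rates", and spreaders retire to stiflers R.
# All equations and parameters are assumptions, not the paper's own.

def shir(steps=20000, dt=0.01, lam=0.3, theta1=0.2, theta2=0.1, delta=0.1):
    S, H, I1, I2, R = 0.99, 0.0, 0.005, 0.005, 0.0
    for _ in range(steps):                  # forward Euler integration
        contact = lam * S * (I1 + I2)       # susceptibles meet either message
        dS = -contact
        dH = contact - (theta1 + theta2) * H
        dI1 = theta1 * H - delta * I1       # hesitators settle on message 1
        dI2 = theta2 * H - delta * I2
        dR = delta * (I1 + I2)              # spreaders become stiflers
        S += dS * dt; H += dH * dt
        I1 += dI1 * dt; I2 += dI2 * dt; R += dR * dt
    return S, H, I1, I2, R

S, H, I1, I2, R = shir()
print(round(R, 3))  # final stifler density
```

With theta1 > theta2, message 1 accumulates the larger spreader population, mirroring the "advantaged information" behavior the abstract reports.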

  3. Marketing information systems in units of business information: a proposed model

    Directory of Open Access Journals (Sweden)

    Ana Maria Pereira

    2016-04-01

    Full Text Available Introduction: It proposes a theoretical model of marketing information system, which provides qualitiy attributes informations, such as: accuracy, economy, flexibility, reliability, relevance, simplicity and verifiability to the decision-makers of business organizations, based on the systemic vision and marketing theories. Objective: Present a model of marketing information system for business units, identifying the requirements, skills and abilities that the market demands of the librarian and his or hers integration. Methodology: Literature review that enabled the theoretic knowledge to propose the model. Results: The proposed model consists of five stages and constituent of subsystems that were not identified in existing marketing information systems, where it is confirmed that the organization of information is necessary for the development of the organization. Conclusions: It was identified that the librarian is an active agent, a mediator of information in marketing information systems in business units, must be present at all levels of the process and provide the administrators a greater credibility in the decisions taken.

  4. Fisher information and quantum potential well model for finance

    Energy Technology Data Exchange (ETDEWEB)

    Nastasiuk, V.A., E-mail: nasa@i.ua

    2015-09-25

    The probability distribution function (PDF) for prices on financial markets is derived by extremization of Fisher information. It is shown how on that basis the quantum-like description for financial markets arises and different financial market models are mapped by quantum mechanical ones. - Highlights: • The financial Schrödinger equation is derived using the principle of minimum Fisher information. • Statistical models for price variation are mapped by the quantum models of coupled particle. • The model of quantum particle in parabolic potential well corresponds to Efficient market.
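    The extremization referred to here follows a standard minimum-Fisher-information pattern; the constraint terms and symbols below are illustrative, not taken from the paper. With $\psi(x) \equiv \sqrt{p(x)}$, the Fisher information of the price PDF is

```latex
I[p] \;=\; \int \frac{1}{p(x)}\left(\frac{\mathrm{d}p}{\mathrm{d}x}\right)^{2}\mathrm{d}x
      \;=\; 4\int \psi'(x)^{2}\,\mathrm{d}x .
```

Extremizing $I[p]$ subject to normalization $\int \psi^{2}\,\mathrm{d}x = 1$ and an expected cost $\int V(x)\,\psi^{2}\,\mathrm{d}x$ via Lagrange multipliers $\mu$ and $\lambda$, and varying with respect to $\psi$, yields a stationary Schrödinger-type equation

```latex
-4\,\psi''(x) \;+\; \lambda\,V(x)\,\psi(x) \;=\; \mu\,\psi(x),
```

whose ground state for a parabolic well $V(x) \propto x^{2}$ is Gaussian, matching the efficient-market case the abstract mentions.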

  5. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements......Tracing between informal requirements and formal models is challenging. A method for such tracing should permit to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we...

  6. Model choice considerations and information integration using analytical hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC; Ross, Timothy J. [UNM

    2010-10-15

    Using the theory of information-gap for decision-making under severe uncertainty, it has been shown that comparing model output to experimental data involves irrevocable trade-offs between fidelity-to-data, robustness-to-uncertainty and confidence-in-prediction. We illustrate a strategy for information integration by gathering and aggregating all available data, knowledge, theory, experience, and similar applications. Such integration of information becomes important when the physics is difficult to model, when observational data are sparse or difficult to measure, or both. To aggregate the available information, we take an inference perspective. Models are not rejected, nor wasted, but can be integrated into a final result. We show an example of information integration using Saaty's Analytic Hierarchy Process (AHP), integrating theory, simulation output and experimental data. We used expert elicitation to determine weights for two models and two experimental data sets, by forming pair-wise comparisons between model output and experimental data. In this way we transform epistemic and/or statistical strength from one field of study into another branch of physical application. The price to pay for utilizing all available knowledge is that the inferences drawn from the integrated information must be accounted for, and the costs can be considerable. Focusing on inferences and inference uncertainty (IU) is one way to understand complex information.
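    The pairwise-comparison weighting step described above can be sketched with a toy reciprocal matrix; the 4x4 judgments (two models, two experimental data sets) are invented for illustration, and priorities are taken from the principal eigenvector as in Saaty's method:

```python
# Sketch of the AHP weighting step: a reciprocal pairwise-comparison
# matrix is reduced to priority weights via its principal eigenvector.
# The judgment values below are illustrative, not from the paper.
import numpy as np

A = np.array([
    [1.0, 2.0, 4.0, 3.0],    # row i, col j: how strongly item i is
    [1/2, 1.0, 3.0, 2.0],    # preferred over item j (Saaty 1-9 scale);
    [1/4, 1/3, 1.0, 1/2],    # lower triangle holds the reciprocals
    [1/3, 1/2, 2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # priority weights, sum to 1

# Saaty's consistency index: CI = (lambda_max - n) / (n - 1);
# near zero means the expert judgments are nearly consistent.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print(np.round(w, 3), round(float(ci), 3))
```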

  7. Food Information System Construction Based on DEA Model

    Directory of Open Access Journals (Sweden)

    AoTian Peng

    2015-03-01

    This study improves the traditional DEA model so that it reflects subjective preference, yielding a preference-sequence DEA model; it proposes a method to resolve the ranking dilemma by cross-comparing average efficiency rates against the effective units, and cites a case for demonstration. Both at home and abroad, implementation of evaluation systems for food information system construction remains at a low level; one reason is that the evaluation system for food information systems lags behind and is imperfect.

  8. Formal Modeling for Information Appliance Using Abstract MVC Architecture

    OpenAIRE

    Arichika, Yuji; Araki, Keijiro

    2004-01-01

    In information appliance development, it is important to separate core functions from display functions, because information appliances have various user interfaces and display functions that change frequently. Using the MVC architecture is one way to separate display functions from core functions. However, MVC is an implementation architecture, and there are gaps in deriving an abstract model from it. On the other hand, formal methods are known to be useful for constructing abstract models. Therefore we intend t...

  9. Quarterly Bayesian DSGE Model of Pakistan Economy with Informality

    OpenAIRE

    2013-01-01

    In this paper we use the Bayesian methodology to estimate the structural and shocks' parameters of the DSGE model in Ahmad et al. (2012). This model includes formal and informal firms both at intermediate and final goods production levels. Households derive utility from leisure, real money balances and consumption. Each household is treated as a unit of labor which is a composite of formal (skilled) and informal (unskilled) labor. The formal (skilled) labor is further divided into types "r" a...

  10. Quality-assurance plan for groundwater activities, U.S. Geological Survey, Washington Water Science Center

    Science.gov (United States)

    Kozar, Mark D.; Kahle, Sue C.

    2013-01-01

    This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data

  11. National Cancer Information Service in Italy: an information points network as a new model for providing information for cancer patients.

    Science.gov (United States)

    Truccolo, Ivana; Bufalino, Rosaria; Annunziata, Maria Antonietta; Caruso, Anita; Costantini, Anna; Cognetti, Gaetana; Florita, Antonio; Pero, Dina; Pugliese, Patrizia; Tancredi, Roberta; De Lorenzo, Francesco

    2011-01-01

    The international literature reports that good information and communication are fundamental components of a therapeutic process. They contribute to improving the relationship between patients and health care professionals, to facilitating therapeutic compliance and adherence, and to informed consent in innovative clinical trials. We report the results of a multicentric national initiative that developed a network of 17 information structures: 16 Information Points located in the major state-funded certified cancer centers and general hospitals across Italy, and a national Help-line at the nonprofit organization AIMaC (the Italian oncologic patients, families and friends association); the initiative also updated the already existing services, with the aim of creating the National Cancer Information Service (SION). The project is the result of a series of pilot and research projects funded by the Italian Ministry of Health. The Information Service model proposed is based on some fundamental elements: 1) human interaction with experienced operators, adequately trained in communication and information, complemented with 2) virtual interaction (help line, Internet, blog, forum and social network); 3) informative material adequate for both scientific accuracy and communicative style; 4) adequate locations for appropriate positioning and privacy (adequate visibility); 5) appropriate advertising. The first results of these initiatives contributed to introducing issues related to "Communication and Information to patients" as a "Public Health Instrument" into the National Cancer Plan approved by the Ministry of Health for the years 2010-2012.

  12. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. The objectives are: to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains (ITI, QRPH) were modeled in detail. We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and the functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus information managers and IHE

  13. The Spectral Mixture Models: A Minimum Information Divergence Approach

    Science.gov (United States)

    2010-04-01

    Bayesian Information Criterion. Developing a metric that measures the fitness of different models is beyond the scope of our discussion. ... If the model does not fit the data, then the results are questionable or perhaps wrong. Various information criteria have been proposed, such as the Akaike and ... LABORATORY INFORMATION DIRECTORATE: THE SPECTRAL MIXTURE MODELS
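    The excerpt above names the standard model-selection information criteria; a minimal sketch of the Akaike and Bayesian criteria computed from a fitted model's log-likelihood (the likelihood values and parameter counts are invented for illustration, lower scores being better):

```python
# Sketch of the two information criteria the excerpt mentions. Values
# for log-likelihood, parameter count k, and sample size n are invented.
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical comparison: a 3-component mixture (k=3) vs a 5-component
# mixture (k=5) fitted to n=100 observations.
print(aic(-120.0, 3), round(bic(-120.0, 3, 100), 1))
print(aic(-118.5, 5), round(bic(-118.5, 5, 100), 1))
```

BIC penalizes the extra parameters more heavily than AIC for n > 7 or so, which is why the two criteria can disagree on the preferred mixture size.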

  14. Change of Geographic Information Service Model in Mobile Context

    Institute of Scientific and Technical Information of China (English)

    REN Fu; DU Qingyun

    2005-01-01

    This research examines how mobility, a topic completely different from yet tightly related to space, provides new approaches and methods to promote the further development of geographic information services, and accumulates basic experience for related information systems in the wide field of location based services. This paper analyzes the meaning of mobility and the resulting change in the geographic information service model, and describes the differences and correlation between M-GIS and traditional GIS. It sets out a technical framework for geographic information services in a mobile context and provides a case study.

  15. Information behavior versus communication: application models in multidisciplinary settings

    Directory of Open Access Journals (Sweden)

    Cecília Morena Maria da Silva

    2015-05-01

    This paper deals with information behavior as support for models of communication design in the areas of Information Science, Library and Music. The communication models proposition is based on the models of Tubbs and Moss (2003) and Garvey and Griffith (1972), adapted by Hurd (1996) and Wilson (1999). Therefore, the questions arose: (i) what informational skills are required of librarians who act as mediators in the scholarly communication process, and what is the informational behavior of users in the educational environment? (ii) what are the needs of music-related researchers, and how do they produce, seek, use and access the scientific knowledge of their area? and (iii) how do the contexts involved in scientific collaboration processes influence the scientific production of the information science field in Brazil? The article includes a literature review on information behavior and its insertion in scientific communication, considering the influence of the context and/or situation of the objects involved in the motivating issues. The hypothesis is that user information behavior in different contexts and situations influences the definition of a scientific communication model. Finally, it is concluded that the same concept, or a set of concepts, can be used from different perspectives, thus reaching different results.

  16. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Today's organizations are exposed to a huge diversity of information and information assets produced in different systems, such as KMS, financial and accounting systems, official and industrial automation systems, and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model give organizations a strong incentive to implement cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, first, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, comprising 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, an appropriate methodology is presented for guiding an organization in architecting its information security in a cloud computing environment. The presented cloud computing security architecture resulting from the proposed methodology, and the presented classification model, were discussed and verified using the Delphi method and expert comments.

  17. Mission Assurance: Issues and Challenges

    Science.gov (United States)

    2010-07-15

    JFQ), Summer 1995. [9] Alberts, C.J. & Dorofee, A.J., "Mission Assurance Analysis Protocol (MAAP): Assessing Risk in Complex Environments..." ... "CAMUS: Automatically Mapping Cyber Assets to Missions and Users," Proc. of the 2010 Military Communications Conference (MILCOM 2009), 2009. [23

  18. Employer-Led Quality Assurance

    Science.gov (United States)

    Tyszko, Jason A.

    2017-01-01

    Recent criticism of higher education accreditation has prompted calls for reform and sparked interest in piloting alternative quality assurance methods that better address student learning and employment outcomes. Although this debate has brought much needed attention to improving the outcomes of graduates and safeguarding federal investment in…

  19. Quality Assurance Program. QAP Workbook.

    Science.gov (United States)

    Pelavin Research Inst., Washington, DC.

    The Quality Assurance Program (QAP) workbook is intended to assist institutions of higher education conduct qualitative and quantitative evaluations of their financial aid operations in relation to requirements of Title IV of the Higher Education Act. The workbook provides a structured approach for incorporating a cyclical Title IV QA system into…

  20. Redefining and expanding quality assurance.

    Science.gov (United States)

    Robins, J L

    1992-12-01

    To meet the current standards of excellence necessary for blood establishments, we have learned from industry that a movement toward organization-wide quality assurance/total quality management must be made. Everyone in the organization must accept responsibility for participating in providing the highest quality products and services. Quality must be built into processes and design systems to support these quality processes. Quality assurance has been redefined to include a quality planning function described as the most effective way of designing quality into processes. A formalized quality planning process must be part of quality assurance. Continuous quality improvement has been identified as the strategy every blood establishment must support while striving for error-free processing as the long-term objective. The auditing process has been realigned to support and facilitate this same objective. Implementing organization-wide quality assurance/total quality management is one proven plan for guaranteeing the quality of the 20 million products that are transfused into 4 million patients each year and for moving toward the new order.

  1. Quality Assurance: One School's Response.

    Science.gov (United States)

    Wittemann, K. Joseph

    1990-01-01

    Since 1987, the Virginia Commonwealth University School of Dentistry has established a system of committee responsibilities for quality assurance, involving the committees for clinical affairs, academic performance, safety and therapeutics, and a council composed largely of department chairs. Additional review of procedures and records management…

  2. [Quality assurance in interventional cardiology].

    Science.gov (United States)

    Gülker, H

    2009-10-01

    Quality assurance in clinical studies aiming at approval of pharmaceutical products is subject to strict rules, controls and auditing regulations. Comparable instruments for ensuring quality in diagnostic and therapeutic procedures are not available in interventional cardiology, or in other fields of cardiovascular medicine. Quality assurance simply consists of "quality registers" with basic data that are not externally controlled. Based on the experience of clinical studies and their long history of standardization, it must be assumed that these data may be severely flawed and thus inappropriate for setting standards for diagnostic and therapeutic strategies. The precondition for quality assurance is quality data. In invasive coronary angiography and intervention, medical indications, the decision between interventional and surgical revascularization, technical performance and after-care are essential aspects affecting the quality of diagnostics and therapy. Quality data are externally controlled data. Collecting quality data requires an appropriate infrastructure, which does not yet exist; investments are needed both to build up and to sustain the necessary preconditions. As long as there is no infrastructure and no investment, there will be no quality data, but simply registers of data that are not a proven basis for significant assurance and enhancement of quality in interventional coronary cardiology. Georg Thieme Verlag KG Stuttgart, New York.

  3. Proposing a Metaliteracy Model to Redefine Information Literacy

    Directory of Open Access Journals (Sweden)

    Trudi E. Jacobson

    2013-12-01

    Full Text Available Metaliteracy is envisioned as a comprehensive model for information literacy to advance critical thinking and reflection in social media, open learning settings, and online communities. At this critical time in higher education, an expansion of the original definition of information literacy is required to include the interactive production and sharing of original and repurposed digital materials. Metaliteracy provides an overarching and unifying framework that builds on the core information literacy competencies while addressing the revolutionary changes in how learners communicate, create, and distribute information in participatory environments. Central to the metaliteracy model is a metacognitive component that encourages learners to continuously reflect on their own thinking and literacy development in these fluid and networked spaces. This approach leads to expanded competencies for adapting to the ongoing changes in emerging technologies and for advancing critical thinking and empowerment for producing, connecting, and distributing information as independent and collaborative learners.

  4. An information search model for online social Networks - MOBIRSE

    Directory of Open Access Journals (Sweden)

    J. A. Astaiza

    2015-12-01

    Full Text Available Online Social Networks (OSNs) have been gaining great importance among Internet users in recent years. These are sites where it is possible to meet people, and to publish and share content in a way that is both easy and free of charge. As a result, the volume of information contained in these websites has grown exponentially, and web search has consequently become an important tool for users to easily find information relevant to their social networking objectives. Making use of ontologies and user profiles can make these searches more effective. This article presents a model for Information Retrieval in OSNs (MOBIRSE) based on user profiles and ontologies, which aims to improve the relevance of the information retrieved on these websites. The social network Facebook was chosen as the case study and as the instance of the proposed model. The model was validated using measures such as Precision at k and the Kappa statistic to assess its efficiency.
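The validation measures named in this abstract can be computed in a few lines. A minimal sketch, assuming a ranked list of retrieved document IDs, a set of relevant IDs, and two raters' label lists (all names and data are hypothetical, not taken from the paper):

```python
def precision_at_k(retrieved, relevant, k):
    """Fraction of the top-k retrieved items that are actually relevant."""
    top_k = retrieved[:k]
    if not top_k:
        return 0.0
    return sum(1 for item in top_k if item in relevant) / len(top_k)

def cohens_kappa(labels_a, labels_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(labels_a)
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n  # observed agreement
    cats = set(labels_a) | set(labels_b)
    # chance agreement: product of each rater's marginal frequency per category
    p_e = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in cats)
    return (p_o - p_e) / (1 - p_e)
```

For example, if two of the top three retrieved documents are relevant, Precision at 3 is 2/3; identical rater labelings give a Kappa of 1.0.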

  5. Scope of Building Information Modeling (BIM) in India

    Directory of Open Access Journals (Sweden)

    Mahua Mukherjee

    2009-01-01

    Full Text Available Design communication is gradually changing from 2D-based drawings to an integrated 3D digital interface. Building Information Modeling (BIM) is a model-based design concept in which buildings are built virtually before they get built out in the field, with data models organized for complete integration of all relevant factors in the building lifecycle. BIM also manages the information exchange between the AEC (Architects, Engineers, Contractors) professionals, strengthening the interaction within the design team. BIM is shared knowledge about the information for decision making during the building's lifecycle. There is still much to be learned about the opportunities and implications of this tool. This paper deals with a status check of BIM application in India; to that end, a survey has been designed to gauge the acceptance of BIM to date, while this application is widely accepted throughout the industry in many countries for managing project information, with capabilities for cost control and facilities management.

  6. Software Assurance Curriculum Project Volume 3: Master of Software Assurance Course Syllabi

    Science.gov (United States)

    2011-07-01

    Nicola. “Computer-Aided Support for Secure Tropos.” Automated Software Engineering 14, 3 (September 2007): 341–364. • Zannone, Nicola. “The Si...that are specific to software assurance, such as CLASP and Secure Tropos. Discuss the pros and cons of standard development process models...CLASP or Secure Tropos could be applied to the project. 4 Teach BSIMM, SAFECode and OWASP best practices. Discuss the pros and cons of security

  7. Assurance in Agent-Based Systems

    Energy Technology Data Exchange (ETDEWEB)

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-05-10

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates the surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems.

  8. Challenges of an Information Model for Federating Virtualized Infrastructures

    NARCIS (Netherlands)

    van der Ham, J.; Papagianni, C.; Stéger, J.; Mátray, P.; Kryftis, Y.; Grosso, P.; Lymberopoulos, L.

    2011-01-01

    Users of the Future Internet will expect seamless and secure access to virtual resources distributed across multiple domains. These federated platforms are the core of the Future Internet. It is clear that information models, and concrete implementation in data models, are necessary prerequisites fo

  9. Using Diagnostic Text Information to Constrain Situation Models

    NARCIS (Netherlands)

    Dutke, S.; Baadte, C.; Hähnel, A.; Hecker, U. von; Rinck, M.

    2010-01-01

    During reading, the model of the situation described by the text is continuously accommodated to new text input. The hypothesis was tested that readers are particularly sensitive to diagnostic text information that can be used to constrain their existing situation model. In 3 experiments, adult part

  10. Agent-based model of information spread in social networks

    CERN Document Server

    Lande, D V; Berezin, B O

    2016-01-01

    We propose evolution rules for the multiagent network and determine statistical patterns in the life cycle of agents - information messages. The main statistical pattern discussed concerns the number of likes and reposts for a message; according to the modeling results, this distribution corresponds to a Weibull distribution. We examine the proposed model using data from Twitter, an online social networking service.
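The Weibull claim above can be checked against the distribution's closed-form CDF. A minimal sketch, assuming the standard two-parameter (shape, scale) form; the inverse-transform sampler is included only to illustrate how an empirical like/repost distribution could be compared against the fitted CDF:

```python
import math

def weibull_cdf(x, shape, scale):
    """Two-parameter Weibull CDF: F(x) = 1 - exp(-(x/scale)^shape)."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def weibull_sample(shape, scale, rng):
    """Inverse-transform sampling: x = scale * (-ln(1 - U))^(1/shape)."""
    u = rng.random()
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)
```

A goodness-of-fit check then amounts to comparing the empirical fraction of messages below a threshold with the value of `weibull_cdf` at that threshold.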

  11. Information Search Process Model: How Freshmen Begin Research.

    Science.gov (United States)

    Swain, Deborah E.

    1996-01-01

    Investigates Kuhlthau's Search Process Model for information seeking using two Freshmen English classes. Data showed that students followed the six stages Kuhlthau proposed and suggest extensions to the model, including changing the order of the tasks, iterating and combining steps, and revising search goals based on social and interpersonal…

  12. Changing Models for Researching Pedagogy with Information and Communications Technologies

    Science.gov (United States)

    Webb, M.

    2013-01-01

    This paper examines changing models of pedagogy by drawing on recent research with teachers and their students as well as theoretical developments. In relation to a participatory view of learning, the paper reviews existing pedagogical models that take little account of the use of information and communications technologies as well as those that…

  13. Quiz Games as a model for Information Hiding

    OpenAIRE

    Bank, Bernd; Heintz, Joos; Matera, Guillermo; Montana, Jose L.; Pardo, Luis M.; Paredes, Andres Rojas

    2015-01-01

    We present a general computation model inspired in the notion of information hiding in software engineering. This model has the form of a game which we call quiz game. It allows in a uniform way to prove exponential lower bounds for several complexity problems of elimination theory.

  14. The Sanctuary Model of Trauma-Informed Organizational Change

    Science.gov (United States)

    Bloom, Sandra L.; Sreedhar, Sarah Yanosy

    2008-01-01

    This article features the Sanctuary Model[R], a trauma-informed method for creating or changing an organizational culture. Although the model is based on trauma theory, its tenets have application in working with children and adults across a wide diagnostic spectrum. Originally developed in a short-term, acute inpatient psychiatric setting for…

  15. The value of structural information in the VAR model

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2003-01-01

    textabstractEconomic policy decisions are often informed by empirical economic analysis. While the decision-maker is usually only interested in good estimates of outcomes, the analyst is interested in estimating the model. Accurate inference on the structural features of a model, such as cointegrati

  16. A Product Development Decision Model for Cockpit Weather Information System

    Science.gov (United States)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.

  17. A Product Development Decision Model for Cockpit Weather Information Systems

    Science.gov (United States)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.

  18. Ranking streamflow model performance based on Information theory metrics

    Science.gov (United States)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing each time series as a string of symbols drawn from a fixed alphabet, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based metrics of simulated and measured streamflow time series can provide an additional criterion for evaluating hydrologic model performance.
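The symbolization and mean-information-gain steps described above can be sketched as follows. This is an illustrative reading of the metric (entropy of (L+1)-blocks minus entropy of L-blocks) with quantile binning as an assumption, not the authors' exact implementation:

```python
from collections import Counter
import math

def symbolize(series, n_symbols=4):
    """Map each value to a quantile-based symbol in 0..n_symbols-1."""
    ranked = sorted(series)
    def symbol(x):
        k = sum(1 for v in ranked if v <= x)  # rank of x in the series
        return min(n_symbols - 1, (k - 1) * n_symbols // len(ranked))
    return [symbol(x) for x in series]

def shannon_entropy(blocks):
    """Shannon entropy (bits) of the empirical block distribution."""
    counts = Counter(blocks)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def mean_information_gain(symbols, L=1):
    """H(L+1-blocks) - H(L-blocks): new information per symbol given an L-history."""
    blocks_L = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1)]
    blocks_L1 = [tuple(symbols[i:i + L + 1]) for i in range(len(symbols) - L)]
    return shannon_entropy(blocks_L1) - shannon_entropy(blocks_L)
```

A perfectly constant series yields zero entropy and zero information gain; a maximally random series approaches log2(n_symbols) bits per symbol.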

  19. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    Science.gov (United States)

    Wetherholt, Martha

    2016-01-01

    To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were to: 1. Research background literature on current Agile processes; 2. Perform benchmark activities with other organizations involved in software Agile processes to determine best practices; 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes and enhance their ability to support reliable software assurance of NASA Agile-developed systems; 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.

  20. Reflection on Quality Assurance System of Higher Vocational Education under Big Data Era

    Directory of Open Access Journals (Sweden)

    Jiang Xinlan

    2015-01-01

    Full Text Available Big data has features such as Volume, Variety, Value and Velocity, which bring new opportunities and challenges for the construction of the Chinese quality assurance system for higher vocational education in the big data era. The current quality assurance system of higher vocational education has problems such as an incomplete set of responsible bodies, the lack of an integrated internal and external quality assurance system, unscientific quality standards and insufficient investment in quality assurance. Building such a system in the big data era requires a change in the guiding idea of quality assurance system construction, so as to realize a development trend of multiple responsible bodies and multiple layers for the educational quality assurance system, and to strengthen the construction of an information platform for the quality assurance system.

  1. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  2. Towards GLUE2 evolution of the computing element information model

    CERN Document Server

    Andreozzi, S; Field, L; Kónya, B

    2008-01-01

    A key advantage of Grid systems is the ability to share heterogeneous resources and services across traditional administrative and organizational domains. This ability enables virtual pools of resources to be created and assigned to groups of users. Resource awareness, the capability of users or user agents to have knowledge about the existence and state of resources, is required in order to utilize a resource. This awareness requires a description of the services and resources, typically defined via a community-agreed information model. One of the most popular information models, used by a number of Grid infrastructures, is the GLUE Schema, which provides a common language for describing Grid resources. Other approaches exist, but they follow different modeling strategies, and the presence of different flavors of information models for Grid resources is a barrier to inter-Grid interoperability. In order to solve this problem, the GLUE Working Group was started in the context of the Open Grid Forum. ...

  3. Technology of Developing of the Information Models of Nonstructurized Processes

    CERN Document Server

    Samojlov, V N

    2000-01-01

    In this paper, a multi-level algorithm for forming information models of nonstructurized processes is proposed. The basic components of the information model are classified in the form of structure-functional constituents and structure-functional types of a developing composite system. Systematic requirements and criteria for constructing the knowledge base and the data bank of the information model are formulated with the use of the proposed algorithm. As examples, results are shown for applying this systematic analysis to matching computational techniques and mathematical simulation in research studies of the structure-functional type "input-process-output", and for constructing the structure-functional model of the knowledge base for one of the implemented technological processes.

  4. Quality assurance for environmental analytical chemistry: 1980

    Energy Technology Data Exchange (ETDEWEB)

    Gladney, E.S.; Goode, W.E.; Perrin, D.R.; Burns, C.E.

    1981-09-01

    The continuing quality assurance effort by the Environmental Surveillance Group is presented. Included are all standard materials now in use, their consensus or certified concentrations, quality control charts, and all quality assurance measurements made by H-8 during 1980.

  5. Quality assurance - how to involve the employees

    DEFF Research Database (Denmark)

    Jørgensen, Michael Søgaard

    1996-01-01

    An overview of strategies for the involvement of employees in quality assurance development and implementation.

  6. Information-theoretic model selection applied to supernovae data

    CERN Document Server

    Biesiada, M

    2007-01-01

    There are several different theoretical ideas invoked to explain dark energy, with relatively little guidance as to which one of them might be right. Therefore the emphasis of ongoing and forthcoming research in this field is shifting from estimating specific parameters of a cosmological model to model selection. In this paper we apply an information-theoretic model selection approach based on the Akaike criterion as an estimator of Kullback-Leibler entropy. In particular, we present the proper way of ranking the competing models based on Akaike weights (in Bayesian language, posterior probabilities of the models). Out of many particular models of dark energy we focus on four: quintessence, quintessence with a time-varying equation of state, brane-world, and the generalized Chaplygin gas model, and test them on Riess' Gold sample. As a result we obtain that the best model - in terms of the Akaike criterion - is the quintessence model. The odds suggest that although there exist differences in the support given to specific scenario...
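The ranking by Akaike weights mentioned above follows a standard formula: w_i = exp(-D_i/2) / sum_j exp(-D_j/2), where D_i = AIC_i - AIC_min and AIC = 2k - 2 ln L. A minimal sketch (the log-likelihood and parameter-count values below are placeholders, not the paper's fits):

```python
import math

def aic(log_likelihood, n_params):
    """Akaike Information Criterion: 2k - 2 ln(L)."""
    return 2 * n_params - 2 * log_likelihood

def akaike_weights(aic_values):
    """Relative likelihood of each model, normalized so weights sum to 1."""
    a_min = min(aic_values)
    rel = [math.exp(-(a - a_min) / 2) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]
```

The weight ratio of two models depends only on their AIC difference: a difference of 2 gives odds of e to 1 in favor of the lower-AIC model.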

  7. The Institutionalization of Scientific Information: A Scientometric Model (ISI-S Model).

    Science.gov (United States)

    Vinkler, Peter

    2002-01-01

    Introduces a scientometric model (ISI-S model) for describing the institutionalization process of scientific information. ISI-S describes the information and knowledge systems of scientific publications as a global network of interdependent information and knowledge clusters that are dynamically changing by their content and size. (Author/LRW)

  8. Quality assurance in performance assessments

    Energy Technology Data Exchange (ETDEWEB)

    Maul, P.R.; Watkins, B.M.; Salter, P.; Mcleod, R [QuantiSci Ltd, Henley-on-Thames (United Kingdom)

    1999-01-01

    Following publication of the Site-94 report, SKI wishes to review how Quality Assurance (QA) issues could be treated in future work, both in undertaking their own Performance Assessment (PA) calculations and in scrutinising documents supplied by SKB (on planning a repository for spent fuel in Sweden). The aim of this report is to identify the key QA issues and to outline the nature and content of a QA plan which would be suitable for SKI, bearing in mind the requirements and recommendations of relevant standards. Emphasis is on issues which are specific to Performance Assessments for deep repositories for radioactive wastes, but consideration is also given to issues which need to be addressed in all large projects. Given the long time over which the performance of a deep repository system must be evaluated, the demonstration that a repository is likely to perform satisfactorily relies on the use of computer-generated model predictions of system performance. This raises particular QA issues which are generally not encountered in other technical areas (for instance, power station operations). The traceability of the arguments used is a key QA issue, as are conceptual model uncertainty and code verification and validation; these were all included in the consideration of overall uncertainties in the Site-94 project. Additional issues particularly relevant to SKI include how QA in a PA fits in with the general QA procedures of the organisation undertaking the work, and the relationship between QA as applied by the regulator and by the implementor of a repository development programme. Section 2 introduces the discussion of these issues by reviewing the standards and guidance which are available from national and international organisations. This is followed in Section 3 by a review of specific issues which arise from the Site-94 exercise. An outline procedure for managing QA issues in SKI is put forward as a basis for discussion in Section 4. It is hoped that

  9. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors.

    Science.gov (United States)

    Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th out of the 22 countries hit hardest by TB. Although much research has been carried out on this subject, this paper goes a step further by incorporating past knowledge into the model, using a Bayesian approach with an informative prior. Bayesian statistics is growing in popularity in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted in the classical and Bayesian frameworks, the latter with both non-informative and informative priors, using the South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with an informative prior, the South Africa General Household Survey datasets for the years 2011 to 2013 are used to set up priors for the 2014 model.
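The idea of building an informative prior from earlier survey years can be illustrated with the simplest conjugate case, a Beta prior on a binomial proportion. This is a toy sketch of the prior-updating principle, not the paper's generalized linear mixed model, and all numbers are hypothetical:

```python
def beta_posterior(prior_a, prior_b, successes, trials):
    """Conjugate update: Beta(a, b) prior + binomial data -> Beta posterior."""
    return prior_a + successes, prior_b + trials - successes

def beta_mean(a, b):
    """Posterior mean of a Beta(a, b) distribution."""
    return a / (a + b)
```

With a flat Beta(1, 1) prior, 3 cases in 10 trials give a posterior mean of 1/3; encoding earlier years' data as, say, a Beta(30, 70) prior pulls the same 2014 data toward the historical rate of 0.3.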

  10. A Novel Fuzzy Document Based Information Retrieval Model for Forecasting

    Directory of Open Access Journals (Sweden)

    Partha Roy

    2017-06-01

    Full Text Available Information retrieval systems are generally used to find the documents most appropriate to some query that comes dynamically from users. In this paper a novel Fuzzy Document based Information Retrieval Model (FDIRM) is proposed for the purpose of stock market index forecasting. The novelty of the proposed approach is a modified tf-idf scoring scheme used to predict the future trend of the stock market index. The contribution of this paper has two dimensions: 1) in the proposed system the simple time series is converted to an enriched fuzzy linguistic time series, with a unique approach to incorporating market sentiment-related information along with the price, and 2) a unique approach is followed in modeling the information retrieval (IR) system that converts a simple IR system into a forecasting system. From the performance comparison of FDIRM with standard benchmark models it can be affirmed that the proposed model has the potential to become a good forecasting model. The stock market data provided by Standard & Poor's CRISIL NSE Index 50 (CNX NIFTY-50) of the National Stock Exchange of India (NSE) is used to experiment with and validate the proposed model. The authentic data for validation and experimentation is obtained from http://www.nseindia.com, the official website of the NSE. A Java program is under construction to implement the model in real time with a graphical user interface.
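The paper's modified tf-idf scheme is not specified in this abstract, so the sketch below shows the standard tf-idf scoring it builds on, with toy trend-word "documents" as placeholder data (all names and tokens are invented for illustration):

```python
import math
from collections import Counter

def tf_idf_scores(documents, query_terms):
    """Score each document (a list of tokens) against query terms with plain tf-idf."""
    n_docs = len(documents)
    df = Counter()  # document frequency: in how many documents each term occurs
    for doc in documents:
        for term in set(doc):
            df[term] += 1
    scores = []
    for doc in documents:
        tf = Counter(doc)
        score = 0.0
        for term in query_terms:
            if df[term] == 0:
                continue  # term absent from the corpus contributes nothing
            idf = math.log(n_docs / df[term])
            score += (tf[term] / len(doc)) * idf
        scores.append(score)
    return scores
```

A query for "up" then ranks documents dominated by "up" tokens above those without them, which is the retrieval-as-forecasting intuition the abstract describes.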

  11. Applying XML for designing and interchanging information for multidimensional model

    Institute of Scientific and Technical Information of China (English)

    Lu Changhui; Deng Su; Zhang Weiming

    2005-01-01

    In order to exchange and share information among the conceptual models of a data warehouse, and to build a solid base for the integration and sharing of metadata, a new multidimensional conceptual model based on XML is presented and its DTD is defined, which can fully describe the various semantic characteristics of a multidimensional conceptual model. Following the multidimensional conceptual modeling technique based on UML, the mapping algorithm between the XML-based multidimensional conceptual model and the UML class diagram is described, and an application base for the wide use of this technique is given.
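A flavor of describing a multidimensional model in XML can be given with a small, hypothetical cube document; the element and attribute names below are invented for illustration and are not taken from the paper's DTD:

```python
import xml.etree.ElementTree as ET

def build_cube_model():
    """Build a toy XML description of a data cube: one fact, two dimensions."""
    cube = ET.Element("cube", name="Sales")
    ET.SubElement(cube, "fact", name="amount", type="decimal")
    for dim, levels in [("Time", ["year", "month"]), ("Product", ["category", "item"])]:
        d = ET.SubElement(cube, "dimension", name=dim)
        for lv in levels:
            ET.SubElement(d, "level", name=lv)  # hierarchy level within the dimension
    return ET.tostring(cube, encoding="unicode")
```

Because the result is plain XML text, it can be exchanged between tools and mapped to other representations (such as UML class diagrams) by walking the element tree.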

  12. Informed Principal Model and Contract in Supply Chain with Demand Disruption Asymmetric Information

    Directory of Open Access Journals (Sweden)

    Huan Zhang

    2016-01-01

    Full Text Available Because of its frequency and disastrous impact, supply chain disruption has caused extensive concern both in industry and in academia. In a supply chain with one manufacturer and one retailer, the demand faced by the retailer is uncertain and may additionally suffer disruption with some probability. Taking the demand disruption probability as the retailer's asymmetric information, an informed principal model with the retailer as the principal is explored to design the contract; through the contract, the retailer can reveal its information to the manufacturer. It is found that the high-risk retailer has an incentive to pretend to be the low-risk one. A separating contract is therefore derived through the low-information-intensity allocation, in which the order quantity and the transfer payment for the low-risk retailer are distorted upwards, while those of the high-risk retailer are not distorted. In order to reduce the signaling cost paid by the low-risk retailer, the interim efficient model is introduced, which again yields an order quantity and transfer payment distorted upwards, but less than before. In the numerical examples, with two different disruption probabilities, the informed principal contracts illustrate the application of the informed principal model in a supply chain with demand disruption.

  13. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

    The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not used to formalizing the flow of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from Buildingsmart, which is also ISO standard 29481 Part 1, and the Model View Definition (MVD) methodology developed by Buildingsmart and BLIS. The research also includes a review of concrete IDM development projects that have been carried out over the last five years. Although the study has identified interest in the IDM methodology in a number...

  14. Semantic Information Modeling for Emerging Applications in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Natarajan, Sreedhar; Simmhan, Yogesh; Prasanna, Viktor

    2012-04-16

    Smart Grid modernizes the power grid by integrating digital and information technologies. Millions of smart meters, intelligent appliances and communication infrastructures are under deployment, allowing advanced IT applications to be developed to secure and manage power grid operations. Demand response (DR) is one such emerging application, which optimizes electricity demand by curtailing or shifting power load when peak load occurs. Existing DR approaches are mostly based on static plans such as pricing policies and load shedding schedules. However, improvements to power management applications rely on data emanating from existing and new information sources as the Smart Grid information space grows. In particular, dynamic DR algorithms depend on information from smart meters that report interval-based power consumption measurements, HVAC systems that monitor buildings' heat and humidity, and even weather forecast services. In order for emerging Smart Grid applications to take advantage of this diverse data influx, extensible information integration is required. In this paper, we develop an integrated Smart Grid information model using Semantic Web techniques and present case studies of using semantic information for dynamic DR. We show that the semantic model facilitates information integration and knowledge representation for developing the next generation of Smart Grid applications.

  15. A Compositional Relevance Model for Adaptive Information Retrieval

    Science.gov (United States)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant to the current problem-solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However, most of this knowledge about contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and requires neither a priori statistical computation nor an extended training period. It is currently being implemented in the Computer Integrated Documentation system, which enables the integration of various technical documents in a hypertext framework.
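The feedback-driven, context-keyed scoring the abstract describes can be sketched as follows. The update rule (a simple +1/-1 increment) and the data layout are illustrative assumptions, not the paper's actual relevance network:

```python
# A minimal sketch of a context-sensitive relevance store in the spirit of a
# compositional relevance network: user feedback is recorded per
# (query, context) pair, and ranking aggregates over the user's active
# contexts. The +1/-1 update rule is an assumption for illustration.

from collections import defaultdict

class RelevanceStore:
    def __init__(self):
        # (query_term, context_tag) -> {reference: score}
        self.scores = defaultdict(lambda: defaultdict(float))

    def feedback(self, query, context, reference, useful):
        """Record user feedback: +1 if the reference helped, -1 otherwise."""
        self.scores[(query, context)][reference] += 1.0 if useful else -1.0

    def rank(self, query, contexts):
        """Aggregate scores over all of the user's active contexts."""
        totals = defaultdict(float)
        for ctx in contexts:
            for ref, s in self.scores[(query, ctx)].items():
                totals[ref] += s
        return sorted(totals, key=totals.get, reverse=True)

store = RelevanceStore()
store.feedback("fuel pump", "maintenance-task", "doc:FP-12", useful=True)
store.feedback("fuel pump", "maintenance-task", "doc:FP-03", useful=False)
store.feedback("fuel pump", "novice-user", "doc:FP-12", useful=True)
print(store.rank("fuel pump", ["maintenance-task", "novice-user"]))
```

Because scores live per context rather than per document alone, two users issuing the same query in different contexts can receive different rankings, which is the filtering-by-context behavior the model aims for.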

  16. MATHEMATICAL MODEL FOR CALCULATION OF INFORMATION RISKS FOR INFORMATION AND LOGISTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    A. G. Korobeynikov

    2015-05-01

    Full Text Available Subject of research. The paper presents a mathematical model for calculating the information risks that arise during the transport and distribution of material resources under uncertainty. Here, information risk means the danger of loss or damage resulting from the company's use of information technologies. Method. The solution is based on the transportation problem in its stochastic statement, drawing on the methods of mathematical modeling, graph theory, probability theory, and Markov chains. The mathematical model is constructed in several stages. At the initial stage, the capacity of different sites as a function of time is calculated from information received from the information and logistics system, the weight matrix is formed, and the digraph is constructed. A minimum route covering all specified vertices is then found with Dijkstra's algorithm. At the second stage, systems of Kolmogorov differential equations are formed using information about the calculated route. Their solutions give the probabilities that resources are located at a particular vertex as a function of time. At the third stage, the overall probability of traversing the whole route as a function of time is calculated using the multiplication theorem of probabilities. Information risk, as a function of time, is defined as the product of the greatest possible damage and the overall probability of traversing the whole route. In this case information risk is measured in units of damage, i.e., in the monetary unit that the information and logistics system operates with. Main results. The operability of the presented mathematical model is demonstrated on a concrete example of transporting material resources, in which the places of shipment and delivery, the routes and their capacities, the greatest possible damage, and the admissible risk are specified. The calculations presented in a diagram showed...
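The route-finding and risk steps of the abstract can be sketched as Dijkstra's algorithm over a weighted digraph followed by risk = maximum damage times the whole-route probability. The edge weights, per-leg probabilities, and damage value below are illustrative assumptions, not data from the paper:

```python
# A minimal sketch of the route-and-risk calculation: Dijkstra's shortest path
# over a weighted digraph, then risk = max_damage * P(whole route traversed),
# where P is the product of per-leg probabilities (multiplication theorem).
# All numeric values are assumed for illustration.

import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, weight), ...]}. Returns (cost, path)."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

graph = {"A": [("B", 2.0), ("C", 5.0)],
         "B": [("C", 1.0), ("D", 4.0)],
         "C": [("D", 1.0)]}
cost, route = dijkstra(graph, "A", "D")

# Assumed probability of successfully passing each leg of the found route;
# the whole-route probability is their product.
leg_prob = {("A", "B"): 0.95, ("B", "C"): 0.90, ("C", "D"): 0.98}
p_route = 1.0
for leg in zip(route, route[1:]):
    p_route *= leg_prob[leg]

max_damage = 100_000  # greatest possible damage, in monetary units (assumed)
risk = max_damage * p_route
print(route, risk)
```

In the paper's full model the per-leg probabilities come from solving Kolmogorov differential equations as functions of time; here they are fixed constants to keep the sketch self-contained.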

  17. Modeling of information diffusion in Twitter-like social networks under information overload.

    Science.gov (United States)

    Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin

    2014-01-01

    Due to information overload in social networks, it becomes increasingly difficult for users to find useful information that matches their interests. This paper considers Twitter-like social networks and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. A view scope is introduced to model the user's information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a user of a given type is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of the theoretical analysis, we conduct simulations; the simulation results agree closely with the theoretical results. These results are important for understanding diffusion dynamics in social networks, and the analysis framework can be extended to more realistic situations.
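The overload mechanism the abstract describes can be sketched with a simulation in which each user's view scope is a bounded buffer and diffusion efficiency is measured as the number of view scopes a message enters. The network size, degrees, forwarding probability, and forwarding rule below are illustrative assumptions, not the paper's parameters:

```python
# A minimal sketch of information diffusion under information overload:
# each user has a bounded "view scope" (a fixed-size buffer), and a message's
# reach is the number of view scopes it appears in after generation.
# All parameters and the forwarding rule are assumptions for illustration.

import random
from collections import deque

random.seed(42)

N = 200          # users in the network
FOLLOWERS = 10   # out-degree: each user pushes messages to 10 followers
VIEW_SCOPE = 5   # messages a user's view scope can hold at once
P_FORWARD = 0.2  # probability that a user forwards a message it sees

followers = {u: random.sample([v for v in range(N) if v != u], FOLLOWERS)
             for u in range(N)}
views = {u: deque(maxlen=VIEW_SCOPE) for u in range(N)}  # bounded view scopes

def diffuse(origin, msg):
    """Push msg from origin; count how many view scopes it appears in."""
    appearances = 0
    frontier = [origin]
    seen_by = set()
    while frontier:
        nxt = []
        for u in frontier:
            for f in followers[u]:
                if f in seen_by:
                    continue
                seen_by.add(f)
                views[f].append(msg)  # may evict older messages (overload)
                appearances += 1
                if random.random() < P_FORWARD:
                    nxt.append(f)     # f forwards to its own followers
        frontier = nxt
    return appearances

reach = diffuse(origin=0, msg="m1")
print(reach)
```

The bounded `deque` makes the overload effect concrete: once a view scope is full, each arriving message evicts an older one, so a message's chance of being seen and forwarded decays as traffic grows.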

  18. Quality Assurance in Open and Distance Education: a Case Study of Kota Open University

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    After a brief analysis of the concept of quality in open and distance education, information technology for quality assurance, and the Indian initiative for quality improvement, the paper examines the quality assurance measures at Kota Open University in the following areas: planning academic programmes; developing academic programmes; producing learning materials; implementing programmes; reviewing courses/programmes; and developing human resources.

  19. A non-linear model of information seeking behaviour

    Directory of Open Access Journals (Sweden)

    Allen E. Foster

    2005-01-01

    Full Text Available The results of a qualitative, naturalistic, study of information seeking behaviour are reported in this paper. The study applied the methods recommended by Lincoln and Guba for maximising credibility, transferability, dependability, and confirmability in data collection and analysis. Sampling combined purposive and snowball methods, and led to a final sample of 45 inter-disciplinary researchers from the University of Sheffield. In-depth semi-structured interviews were used to elicit detailed examples of information seeking. Coding of interview transcripts took place in multiple iterations over time and used Atlas-ti software to support the process. The results of the study are represented in a non-linear Model of Information Seeking Behaviour. The model describes three core processes (Opening, Orientation, and Consolidation) and three levels of contextual interaction (Internal Context, External Context, and Cognitive Approach), each composed of several individual activities and attributes. The interactivity and shifts described by the model show information seeking to be non-linear, dynamic, holistic, and flowing. The paper concludes by describing the whole model of behaviours as analogous to an artist's palette, in which activities remain available throughout information seeking. A summary of key implications of the model and directions for further research are included.

  20. 10 CFR 76.93 - Quality assurance.

    Science.gov (United States)

    2010-01-01

    10 CFR 76.93 (2010-01-01 edition), Title 10, Energy, NUCLEAR REGULATORY COMMISSION (CONTINUED), CERTIFICATION OF GASEOUS DIFFUSION PLANTS, Safety. § 76.93 Quality assurance. The Corporation shall establish, maintain, and execute a quality assurance program satisfying each...