WorldWideScience

Sample records for modeling information assurance

  1. Modeling Information Assurance

    National Research Council Canada - National Science Library

    Beauregard, Joseph

    2001-01-01

... U.S. military controls much of the world's most sensitive information, and since it cannot sacrifice the speed at which this information is currently processed and disseminated, it must find a way...

  2. A Computational Model and Multi-Agent Simulation for Information Assurance

    National Research Council Canada - National Science Library

    VanPutte, Michael

    2002-01-01

The field of information assurance (IA) is too complex for current modeling tools. While security analysts may understand individual mechanisms at a particular moment, the interactions among the mechanisms, combined with the evolving nature...

  3. Models for Information Assurance Education and Outreach: A Report on Year 1 Implementation

    Science.gov (United States)

    Wang, Jianjun

    2013-01-01

    On September 22, 2012, NSF announced its decision to fund a three-year project, "Models for Information Assurance Education and Outreach" (MIAEO). In the first year of grant operation, MIAEO has invited 18 high school students, two K-12 teachers, and two CSUB student assistants to conduct research explorations in the fields of…

  4. Models for Information Assurance Education and Outreach: A Report on Year 2 Implementation

    Science.gov (United States)

    Wang, Jianjun

    2014-01-01

"Models for Information Assurance Education and Outreach" (MIAEO) is an NSF-funded, three-year project to support hands-on explorations in "network security" and "cryptography" through the Research Experience Vitalizing Science-University Program (REVS-UP) at California State University, Bakersfield. In addition, the…

  5. Model-Based Assurance Case+ (MBAC+): Tutorial on Modeling Radiation Hardness Assurance Activities

    Science.gov (United States)

    Austin, Rebekah; Label, Ken A.; Sampson, Mike J.; Evans, John; Witulski, Art; Sierawski, Brian; Karsai, Gabor; Mahadevan, Nag; Schrimpf, Ron; Reed, Robert A.

    2017-01-01

This presentation will cover why modeling is useful for radiation hardness assurance cases, and also provide information on Model-Based Assurance Case+ (MBAC+), NASA's Reliability Maintainability Template, and Fault Propagation Modeling.

  6. Information Assurance Under Fire

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2000-01-01

    Information and Communication Technology (ICT) has an immense impact on the Military Mode of Operation. Modern Armed Forces are increasingly using commercial-off-the-shelf (COTS) hardware, software and ICT-services. Defence and government decision-making units and its supporting critical industries

  7. Risk Information Management Resource (RIMR): modeling an approach to defending against military medical information assurance brain drain

    Science.gov (United States)

    Wright, Willie E.

    2003-05-01

As Military Medical Information Assurance organizations face modern pressures to downsize and outsource, they battle the loss of knowledgeable people who leave and take with them what they know. This knowledge is increasingly recognized as an important resource, and organizations are now taking steps to manage it. In addition, as the pressures for globalization (Castells, 1998) increase, collaboration and cooperation are becoming more distributed and international. Knowledge sharing in a distributed international environment is becoming an essential part of Knowledge Management. This is a major shortfall in the current approach to capturing and sharing knowledge in Military Medical Information Assurance. This paper addresses this challenge by exploring the Risk Information Management Resource (RIMR) as a tool for sharing knowledge using the concept of Communities of Practice. RIMR is based on the framework of sharing and using knowledge, realized through three major components: people, process, and technology. The people component enables remote collaboration, supports communities of practice, and rewards and recognizes knowledge sharing while encouraging storytelling. The process component enhances knowledge capture and manages information. The technology component enhances system integration and data mining, utilizes intelligent agents, and exploits expert systems. These, coupled with the supporting activities of education and training, technology infrastructure, and information security, enable effective information assurance collaboration.

  8. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab.
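The parametric cost approach described above can be illustrated with a minimal sketch. The function form, coefficients, and module capacities below are hypothetical illustrations, not the SCM's actual equations or data:

```python
# Illustrative parametric life-cycle cost function for a waste management
# module. Coefficients and capacities are hypothetical; the real SCM
# equations are not reproduced here.
def module_life_cycle_cost(capacity_m3, unit_cost=1200.0,
                           scale_exponent=0.8, fixed_cost=5.0e6):
    """Estimate a module's life-cycle cost from its waste capacity.

    Cost scales sub-linearly with capacity (economies of scale), plus a
    fixed component covering design and decommissioning.
    """
    return fixed_cost + unit_cost * capacity_m3 ** scale_exponent

# Sum module costs across a hypothetical complex configuration
# (treatment, storage, and disposal capacities in cubic meters).
modules = {"treatment": 40_000, "storage": 25_000, "disposal": 60_000}
total = sum(module_life_cycle_cost(v) for v in modules.values())
```

A real parametric model would fit the coefficients to historical facility cost data and use distinct equations per module type and waste class.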

  9. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    International Nuclear Information System (INIS)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet the LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab

  10. SECURE MATHEMATICALLY- ASSURED COMPOSITION OF CONTROL MODELS

    Science.gov (United States)

    2017-09-27

Secure Mathematically-Assured Composition of Control Models. Rockwell Collins, September 2017. Final technical report, approved for public release. Contract number FA8750-12-9-0179; program element number 62303E.

  11. Development of an Instructional Quality Assurance Model in Nursing Science

    Science.gov (United States)

    Ajpru, Haruthai; Pasiphol, Shotiga; Wongwanich, Suwimon

    2011-01-01

The purpose of this study was to develop an instructional quality assurance model in nursing science. The study was divided into three phases: (1) to gather information for instructional quality assurance model development, (2) to develop an instructional quality assurance model in nursing science, and (3) to audit and assess the developed…

  12. CyberCIEGE: Gaming for Information Assurance

    OpenAIRE

    Irvine, Cynthia E.; Thompson, Michael F.; Allen, Ken

    2004-01-01

Cyber security students need to understand both the impact that poor security choices can have on an organization's health and the concrete steps that can improve security within it. In short, they must understand information assurance (IA) principles and how to apply them.

  13. Information Assurance and Forensic Readiness

    Science.gov (United States)

    Pangalos, Georgios; Katos, Vasilios

Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations, and policies are constantly added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics be applied to all investigatory, monitoring, and auditing activities. This would result in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

  14. Information Assurance Security in the Information Environment

    CERN Document Server

    Blyth, Andrew

    2006-01-01

Intended for IT managers and assets protection professionals, this work aims to bridge the gap between information security, information systems security, and information warfare. It covers topics such as the role of the corporate security officer, corporate cybercrime, electronic commerce and the global marketplace, cryptography, and more.

  15. INFORMATION ASSURANCE - INTELLIGENCE - INFORMATION SUPERIORITY RELATIONSHIP WITHIN NATO OPERATIONS

    Directory of Open Access Journals (Sweden)

    Gheorghe BOARU, Ioan-Mihai ILIEŞ

    2011-01-01

There is a tight relationship between information assurance, the intelligence cycle, and information superiority within NATO operations. The intelligence cycle has a discrete architecture and provides on-time and relevant intelligence products to the joint force commanders and to other authorized users in a specific joint area of operations. The intelligence cycle must follow the evolution of the operation. A permanent intelligence estimate will be performed during the military decision-making process and operations execution. Information superiority is one of the most powerful intelligence cycle achievements and decisively influences the success of NATO joint operations. Information superiority must be preserved and enhanced through information assurance. Information assurance is an information operation that must be planned by the military in charge of operation security or by non-military experts, executed by all personnel during the entire intelligence cycle life time, and employed during the planning and execution of NATO joint operations.

  16. Information Assurance and the Information Society

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    1999-01-01

Society is on the verge of a new era: the information age. Economic changes, a new way of looking at services, and new types of conflict are forecast. Some glimpses of these changes were noticed during the Persian Gulf War. Government decision units, organisations, society and critical industries

  17. Information Assurance and the Information Society

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    1998-01-01

Society is on the verge of a new era: the information age. Economic changes, a new way of looking at services, and new types of conflict are forecast. Some glimpses of these changes were noticed during the Persian Gulf War. Government decision units, organisations, society and critical industries

  18. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  19. Information Assurance in Saudi Organizations - An Empirical Study

    Science.gov (United States)

    Nabi, Syed Irfan; Mirza, Abdulrahman A.; Alghathbar, Khaled

This paper presents selected results of a survey conducted to provide much-needed insight into the status of information security in Saudi Arabian organizations. The purpose of this research is to report the state of information assurance in the Kingdom and to better understand the prevalent ground realities. The survey covered technical aspects of information security, risk management, and information assurance management. The results provide deep insights into the existing level of information assurance in various sectors that can be helpful in better understanding the intricate details of prevalent information security in the Kingdom. The results can also be very useful for information assurance policy makers in the government as well as in private sector organizations. Few empirical studies on information assurance governance are available in the literature, especially about the Middle East and Saudi Arabia; therefore, the results are invaluable for information security researchers in improving the understanding of information assurance in this region and the Kingdom.

  20. Review of the National Information Assurance Partnership (NIAP)

    National Research Council Canada - National Science Library

    Larsen, Gregory N; Burton, J. K; Cohen, Patricia A; Harvey, Rick A; Meeson, Reginald N; Nash, Michael S; Nash, Sarah H; Schneider, Edward A; Simpson, William R; Stytz, Martin R; Wheeler, David A

    2006-01-01

    This study was mandated by the National Strategy to Secure Cyberspace which requires the federal government to conduct a comprehensive review of the National Information Assurance Partnership (NIAP...

  1. Causal Models for Safety Assurance Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fulfillment of NASA's System-Wide Safety and Assurance Technology (SSAT) project at NASA requires leveraging vast amounts of data into actionable knowledge. Models...

  2. Can the Analytical Hierarchy Process Model Be Effectively Applied in the Prioritization of Information Assurance Defense In-Depth Measures? --A Quantitative Study

    Science.gov (United States)

    Alexander, Rodney T.

    2017-01-01

    Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…

  3. Integrating Information Assurance and Security into IT Education: A Look at the Model Curriculum and Emerging Practice

    Science.gov (United States)

    Dark, Melissa Jane; Ekstrom, Joseph J.; Lunt, Barry M.

    2006-01-01

    In December 2001 a meeting of interested parties from fifteen four-year IT programs from the US along with representatives from IEEE, ACM, and ABET (CITC-1) began work on the formalization of Information Technology as an accredited academic discipline. The effort has evolved into SIGITE, the ACM SIG for Information Technology Education. During…

  4. Mission Assurance Modeling and Simulation: A Cyber Security Roadmap

    Science.gov (United States)

    Gendron, Gerald; Roberts, David; Poole, Donold; Aquino, Anna

    2012-01-01

This paper proposes a cyber security modeling and simulation roadmap to enhance mission assurance governance and establish risk reduction processes within constrained budgets. The term mission assurance stems from risk management work by Carnegie Mellon's Software Engineering Institute in the late 1990s. By 2010, the Defense Information Systems Agency revised its cyber strategy and established the Program Executive Officer-Mission Assurance. This highlights a shift from simply protecting data to balancing risk and begins a necessary dialogue to establish a cyber security roadmap. The Military Operations Research Society has recommended a cyber community of practice, recognizing there are too few professionals having both cyber and analytic experience. The authors characterize the limited body of knowledge in this symbiotic relationship. This paper identifies operational and research requirements for mission assurance M&S supporting defense and homeland security. M&S techniques are needed for enterprise oversight of cyber investments, test and evaluation, policy, training, and analysis.

  5. Development of a model for assessing the impact of information assurance functionality on secure messaging system performance

    Science.gov (United States)

    Belur, Sheela V.; Gloster, Jonathan

    2007-04-01

An analytical performance model for a generic secure messaging system is formulated as a multi-class queuing network. The model assesses the impact of security features such as secret-key encryption/decryption, signature generation/verification, and certificate validation on overall performance. Findings of a sensitivity analysis with respect to message rate, WAN transmission link, SSL encryption, message size, and distance between servers are also presented. Finally, the paper outlines how the model can be adopted to evaluate performance-based architectural design options.
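The core idea, that each security feature adds service demand and therefore queuing delay, can be sketched with a single-queue approximation. This is an illustrative M/M/1 toy, not the paper's multi-class network, and all timing values are hypothetical:

```python
# Minimal queuing sketch of a secure messaging server. Illustrative only:
# the paper uses a multi-class queuing network, and all service times
# below are invented for demonstration.
def mm1_response_time(arrival_rate, service_time):
    """Mean response time of an M/M/1 queue (requires utilization < 1)."""
    utilization = arrival_rate * service_time
    assert utilization < 1.0, "queue is unstable at this load"
    return service_time / (1.0 - utilization)

# Per-message service demand: base processing plus security features.
base = 0.010      # seconds: parsing and routing
encrypt = 0.004   # seconds: secret-key encryption/decryption
sign = 0.006      # seconds: signature generation/verification
validate = 0.005  # seconds: certificate validation

service = base + encrypt + sign + validate
plain = mm1_response_time(arrival_rate=20.0, service_time=base)
secure = mm1_response_time(arrival_rate=20.0, service_time=service)
# Security features raise both the service demand and the queuing delay,
# so the gap between `secure` and `plain` widens as load increases.
```

Sweeping `arrival_rate` in such a sketch reproduces the qualitative sensitivity analysis the abstract describes: security overhead matters most near saturation.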

  6. A Socio-technical Analysis of Information Systems Security Assurance : A Case Study for Effective Assurance

    OpenAIRE

    Chaula, Job Asheri

    2006-01-01

    This thesis examines the concepts of Information System (IS) security assurance using a socio-technical framework. IS security assurance deals with the problem of estimating how well a particular security system will function efficiently and effectively in a specific operational environment. In such environments, the IS interact with other systems such as ethical, legal, operational and administrative. Security failure in any of these systems may result in security failure of the whole system...

  7. Information Assurance Intrusion Detection Sensor Database Design: Lessons Learned

    National Research Council Canada - National Science Library

    Spink, Brian

    2001-01-01

Current architectural trends in information assurance for the DOD focus on the fusion and correlation of large volumes of data collected across several intrusion detection systems and boundary devices...

  8. Critical Infrastructure Protection and Information Assurance (CIPIA) Fellow Program

    National Research Council Canada - National Science Library

    Chen, Peter

    2003-01-01

    LSU was one of the universities chosen to participate in the project of training new researchers to work on the Critical Infrastructure Protection and Information Assurance (CIPIA) areas. Three Ph.D...

  9. DoD Global Information Grid Mission Assurance

    National Research Council Canada - National Science Library

    Bargar, Anthony

    2008-01-01

    ... for espionage and the criminal theft of data. GIG mission assurance works to ensure the DoD is able to accomplish its critical missions when networks, services, or information are unavailable, degraded, or distrusted...

  10. Voice Biometrics for Information Assurance Applications

    National Research Council Canada - National Science Library

    Kang, George

    2002-01-01

    In 2002, the President of the United States established an organization within the DOD to develop and promulgate biometrics technologies to achieve security in information, information systems, weapons, and facilities...

  11. Information Assurance within the United States Air Force

    Science.gov (United States)

    Cherry, John D.

    2010-01-01

According to a 2009 Department of Defense (DoD) review of information assurance (IA) in the United States Air Force (USAF), cyber security is jeopardized because of information loss. This situation has occurred in large part because of less than optimal training practices or adherence to training protocols. The purpose of this study was…

  12. INFORMATION FLOW ASSURED BY ITC CONTINUITY PLANNING

    Directory of Open Access Journals (Sweden)

    Gabriel Cozgarea

    2009-05-01

Given the frequent use of complex processes and the large volume of information, it is imperative to manage the automated circulation of documents within a company's activity. The main advantage of such a system consists in documents waiting to be proces

  13. Integrated Reporting and Assurance of Sustainability Information: An Experimental Study on Professional Investors’ Information Processing

    NARCIS (Netherlands)

    Reimsbach, D.; Hahn, R.; Gürtürk, A.

    2018-01-01

    Sustainability-related non-financial information is increasingly deemed value relevant. Against this background, two recent trends in non-financial reporting are frequently discussed: integrated reporting and assurance of sustainability information. Using an established framework of information

  14. Evaluating Outsourcing Information Technology and Assurance Expertise by Small Non-Profit Organizations

    Science.gov (United States)

    Guinn, Fillmore

    2013-01-01

Small non-profit organizations outsource at least one information technology or information assurance process. Outsourcing of information technology and information assurance processes has increased every year. The purpose of the study was to determine the key reasons behind the choice to outsource information technology and information assurance processes. Using…

  15. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development but also Systems Safety, Reliability, and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).

  16. Strategic approach to information security and assurance in health research.

    Science.gov (United States)

    Akazawa, Shunichi; Igarashi, Manabu; Sawa, Hirofumi; Tamashiro, Hiko

    2005-09-01

Information security and assurance are an increasingly critical issue in health research. Whether health research be in genetics, new drugs, disease outbreaks, biochemistry, or effects of radiation, it deals with information that is highly sensitive and which could be targeted by rogue individuals or groups, corporations, national intelligence agencies, or terrorists, looking for financial, social, or political gains. The advent of the Internet and recent advances in information technologies have also dramatically increased opportunities for attackers to exploit sensitive and valuable information. Government agencies have deployed legislative measures to protect the privacy of health information and developed information security guidelines for epidemiological studies. However, risks are grossly underestimated and little effort has been made to strategically and comprehensively protect health research information by institutions, governments, and international communities. There is a need to enforce a set of proactive measures to protect health research information locally and globally. Such measures should be deployed at all levels but will be successful only if research communities collaborate actively, governments enforce appropriate legislative measures at the national level, and the international community develops quality standards, concluding treaties if necessary, at the global level. Proactive measures for the best information security and assurance would be achieved through a rigorous management process with a cycle of "plan, do, check, and act". Each health research entity, such as hospitals, universities, institutions, or laboratories, should implement this cycle and establish an authoritative security and assurance organization, program, and plan coordinated by a designated Chief Security Officer who will ensure implementation of the above process, putting appropriate security controls in place, with key focus areas such as policies and best practices, enforcement

  17. Planning Considerations for Defensive Information Warfare. Information Assurance

    National Research Council Canada - National Science Library

    1993-01-01

    If the Department of Defense is to maintain operational readiness and fulfill its national security responsibilities, the information infrastructure upon which it depends for information services must...

  18. A Rotational Blended Learning Model: Enhancement and Quality Assurance

    Science.gov (United States)

    Ghoul, Said

    2013-01-01

    Research on blended learning theory and practice is growing nowadays with a focus on the development, evaluation, and quality assurance of case studies. However, the enhancement of blended learning existing models, the specification of their online parts, and the quality assurance related specifically to them have not received enough attention.…

  19. [Role of medical information processing for quality assurance in obstetrics].

    Science.gov (United States)

    Selbmann, H K

    1983-06-01

The paradigm of problem-oriented assurance of the professional quality of medical care is a kind of "control loop system" consisting of the following 5 steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing, and statistics, can make substantial contributions, especially to the steps of observation, problem identification, and follow-up control. Perinatal data collection, which has already been introduced in 6 Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical assuring of quality with external aid". The clinics that participated in this programme were given the necessary aid in self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is entirely left to the responsibility of the clinics themselves -- voluntary cooperation and guarantee of remaining anonymous being a matter of course -- to draw their own conclusions from the collected data and to translate these into clinical everyday practice.

  20. Development, implementation and quality assurance of biokinetic models within CONRAD

    International Nuclear Information System (INIS)

    Nosske, D.; Birchall, A.; Blanchardon, E.; Breustedt, B.; Giussani, A.; Luciani, A.; Oeh, U.; Lopez, M. A.

    2008-01-01

The work of Task Group 5.2 'Research Studies on Biokinetic Models' of the CONRAD project is presented. New biokinetic models have been implemented by several European institutions. Quality assurance procedures included intercomparison of the results as well as quality assurance of model formulation. Additionally, the use of the models was examined, leading to proposals for tuning parameters. Stable isotope studies were evaluated with respect to their implications for the new models, and new biokinetic models were proposed on the basis of their results. Furthermore, the development of a biokinetic model describing the effects of decorporation of actinides by diethylenetriaminepentaacetic acid treatment was initiated. (authors)

  1. QAM: PROPOSED MODEL FOR QUALITY ASSURANCE IN CBSS

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Component-based software engineering (CBSE) / component-based development (CBD) lays emphasis on decomposition of engineered systems into functional or logical components with well-defined interfaces used for communication across the components. The component-based software development approach is based on the idea of developing software systems by selecting appropriate off-the-shelf components and then assembling them within a well-defined software architecture. Because this new development paradigm differs considerably from the traditional approach, quality assurance for component-based software development is a new topic in the software engineering research community. Because component-based software systems are developed through an underlying process different from that of traditional software, their quality assurance model should address both the process of the components and the process of the overall system. Quality assurance for component-based software systems during the life cycle is used to analyze the components for achievement of high-quality component-based software systems. Although some quality assurance techniques and the component-based approach to software engineering have been studied, there are still no clear and well-defined standards or guidelines for component-based software systems. Therefore, identification of quality assurance characteristics, quality assurance models, quality assurance tools and quality assurance metrics is urgently needed. As a major contribution of this paper, I propose QAM: a Quality Assurance Model for component-based software development, which covers component requirement analysis, component development, component certification, component architecture design, integration, testing, and maintenance.

  2. Quality Assurance Model for Digital Adult Education Materials

    Science.gov (United States)

    Dimou, Helen; Kameas, Achilles

    2016-01-01

    Purpose: This paper aims to present a model for the quality assurance of digital educational material that is appropriate for adult education. The proposed model adopts the software quality standard ISO/IEC 9126 and takes into account adult learning theories, Bloom's taxonomy of learning objectives and two instructional design models: Kolb's model…

  3. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  4. 21 CFR 20.114 - Data and information submitted pursuant to cooperative quality assurance agreements.

    Science.gov (United States)

    2010-04-01

    ... cooperative quality assurance agreements. 20.114 Section 20.114 Food and Drugs FOOD AND DRUG ADMINISTRATION... Records § 20.114 Data and information submitted pursuant to cooperative quality assurance agreements. Data and information submitted to the Food and Drug Administration pursuant to a cooperative quality...

  5. DoD Global Information Grid Mission Assurance

    National Research Council Canada - National Science Library

    Bargar, Anthony

    2008-01-01

    ...). However, the GIG was built for business efficiency instead of mission assurance against sophisticated adversaries who have demonstrated intent and proven their ability to use cyberspace as a tool...

  6. Quality assurance of weather data for agricultural system model input

    Science.gov (United States)

    It is well known that crop production and hydrologic variation on watersheds are weather related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...

  7. Statistical Model Selection for TID Hardness Assurance

    Science.gov (United States)

    Ladbury, R.; Gorelick, J. L.; McClure, S.

    2010-01-01

    Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just for qualification decisions, but also for quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic providing a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.
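    The part-to-part versus lot-to-lot decomposition the abstract describes can be sketched as a one-way random-effects ANOVA. This is a minimal method-of-moments illustration, not the paper's actual technique or data; the degradation values below are invented.

```python
import numpy as np

# Hypothetical TID degradation data (e.g., delta-Vth in mV) for one part type:
# 3 lots, 5 parts per lot.  Values are illustrative only.
data = np.array([
    [12.1, 11.8, 12.5, 12.0, 11.9],   # lot A
    [13.4, 13.1, 13.6, 13.0, 13.3],   # lot B
    [11.2, 11.5, 11.0, 11.4, 11.1],   # lot C
])
k, n = data.shape                     # number of lots, parts per lot

grand = data.mean()
lot_means = data.mean(axis=1)

# One-way random-effects ANOVA (balanced design, method of moments)
ms_between = n * ((lot_means - grand) ** 2).sum() / (k - 1)
ms_within = ((data - lot_means[:, None]) ** 2).sum() / (k * (n - 1))

var_part = ms_within                              # part-to-part variance
var_lot = max((ms_between - ms_within) / n, 0.0)  # lot-to-lot variance

print(f"part-to-part sigma: {var_part ** 0.5:.3f}")
print(f"lot-to-lot   sigma: {var_lot ** 0.5:.3f}")
```

    A large lot-to-lot component relative to part-to-part spread is one signal of the "out-of-family" behavior the abstract mentions.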

  8. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    Science.gov (United States)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  9. Statistical Modeling for Radiation Hardness Assurance: Toward Bigger Data

    Science.gov (United States)

    Ladbury, R.; Campola, M. J.

    2015-01-01

    New approaches to statistical modeling in radiation hardness assurance are discussed. These approaches yield quantitative bounds on flight-part radiation performance even in the absence of conventional data sources. This allows the analyst to bound radiation risk at all stages and for all decisions in the RHA process. It also allows optimization of RHA procedures for the project's risk tolerance.

  10. Statistical Modeling for Radiation Hardness Assurance

    Science.gov (United States)

    Ladbury, Raymond L.

    2014-01-01

    We cover the models and statistics associated with single event effects (and total ionizing dose), why we need them, and how to use them: what models are used, what errors exist in real test data, and what the models allow us to say about the DUT. In addition, we cover how to use other sources of data, such as historical, heritage, and similar-part data, and how to apply experience, physics, and expert opinion to the analysis. Also included are concepts of Bayesian statistics, data fitting, and bounding rates.
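    A standard example of "bounding rates" when a test observes few or no events is the classical Poisson upper confidence bound on an SEE cross section. This generic sketch is not taken from the talk; the fluence figure is hypothetical.

```python
import math

def upper_bound_cross_section(events: int, fluence: float, cl: float = 0.95) -> float:
    """Upper confidence bound on SEE cross section (cm^2) from Poisson counts.

    For zero events this reduces to -ln(1 - cl) / fluence.  For k > 0 events
    we solve the Poisson tail condition P(X <= k; mu) = 1 - cl by bisection.
    """
    lo, hi = 0.0, (events + 10.0) * 10.0   # bracket for the mean count mu
    for _ in range(200):
        mu = 0.5 * (lo + hi)
        tail = sum(math.exp(-mu) * mu**i / math.factorial(i)
                   for i in range(events + 1))
        if tail > 1 - cl:   # mu too small: tail probability still too large
            lo = mu
        else:
            hi = mu
    return 0.5 * (lo + hi) / fluence

# Zero events in a hypothetical fluence of 1e7 ions/cm^2:
print(upper_bound_cross_section(0, 1e7))   # ~3.0e-7 cm^2 at 95% confidence
```

    This is the frequentist counterpart of the Bayesian bounding the talk covers; with a Jeffreys prior the Bayesian bound is numerically similar for zero events.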

  11. Information Assurance in Networked Enterprises: MICSS Class Experiments and Industry Survey Analysis

    National Research Council Canada - National Science Library

    Ray, Parbati

    2001-01-01

    .... The surveys give an insight into how inter-networked companies use their ERP systems, what their current policies may be with respect to information management, and what their security and assurance problems may be...

  12. Virginia Tech named national Center of Academic Excellence in Information Assurance Education

    OpenAIRE

    Micale, Barbara L.

    2005-01-01

    Virginia Tech has been designated as a national Center of Academic Excellence in Information Assurance Education (CAEIAE) for academic years 2005-2008 by the National Security Agency (NSA) and Department of Homeland Security (DHS).

  13. A DOCTORAL PROGRAM WITH SPECIALIZATION IN INFORMATION SECURITY A High Assurance Constructive Security Approach

    OpenAIRE

    Irvine, Cynthia E.; Levin, Timothy E.

    2003-01-01

    A doctoral program in computer science with a specialization in information security is described. The focus of the program is constructive security. Key elements of the program are the strong computer science core upon which it builds, coursework on the theory and principles of information assurance, and a unifying research project. The doctoral candidate is a member of the project team, whose research contributes to the goals of the project and to fundamental advancements in high assurance ...

  14. 48 CFR 239.7102-3 - Information assurance contractor training and certification.

    Science.gov (United States)

    2010-10-01

    ... ACQUISITION OF INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-3 Information... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Information assurance... functional services for DoD information systems, or that require any appropriately cleared contractor...

  15. Quality Assurance Based on Descriptive and Parsimonious Appearance Models

    DEFF Research Database (Denmark)

    Nielsen, Jannik Boll; Eiríksson, Eyþór Rúnar; Kristensen, Rasmus Lyngby

    2015-01-01

    In this positional paper, we discuss the potential benefits of using appearance models in additive manufacturing, metal casting, wind turbine blade production, and 3D content acquisition. Current state of the art in acquisition and rendering of appearance cannot easily be used for quality assurance...... in these areas. The common denominator is the need for descriptive and parsimonious appearance models. By ‘parsimonious’ we mean with few parameters so that a model is useful both for fast acquisition, robust fitting, and fast rendering of appearance. The word ‘descriptive’ refers to the fact that a model should...

  16. Incorporating Global Information Security and Assurance in I.S. Education

    Science.gov (United States)

    White, Garry L.; Hewitt, Barbara; Kruck, S. E.

    2013-01-01

    Over the years, the news media has reported numerous information security incidents. Because of identity theft, terrorism, and other criminal activities, President Obama has made information security a national priority. Not only is information security and assurance an American priority, it is also a global issue. This paper discusses the…

  17. Report on probabilistic safety assessment (PSA) quality assurance in utilization of risk information

    International Nuclear Information System (INIS)

    2006-12-01

    Recently in Japan, introduction of nuclear safety regulations using risk information such as probabilistic safety assessment (PSA) has been considered and utilization of risk information in the rational and practical measures on safety assurance has made a progress to start with the operation or inspection area. The report compiled results of investigation and studies of PSA quality assurance in risk-informed activities in the USA. Relevant regulatory guide and standard review plan as well as issues and recommendations were reviewed for technical adequacy and advancement of probabilistic risk assessment technology in risk-informed decision making. Useful and important information to be referred as issues in PSA quality assurance was identified. (T. Tanaka)

  18. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    Full Text Available The subject of the research is software usability and the aim is the construction of a mathematical model for evaluating and assuring a set level of usability. The methodology of structural analysis, methods of multicriterion optimization and decision-making theory, the method of convolution, and scientific methods of analysis and analogy are used in the research. The result of the work is a model for automated evaluation and assurance of software usability that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.
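    The "method of convolution" in multicriterion optimization usually means folding per-criterion scores into a single index. A minimal additive-convolution sketch, with criteria names and weights invented for illustration (not the paper's actual model):

```python
def usability_index(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted-sum (linear convolution) of normalized usability criteria.

    Each score is assumed normalized to [0, 1]; weights are renormalized
    so the resulting index also lies in [0, 1].
    """
    total_w = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_w

# Hypothetical per-iteration measurements for an agile sprint:
scores = {"learnability": 0.8, "efficiency": 0.7, "satisfaction": 0.9}
weights = {"learnability": 0.5, "efficiency": 0.3, "satisfaction": 0.2}

print(round(usability_index(scores, weights), 3))
```

    Tracking this index per iteration is one simple way to "manage" usability in the sense the abstract describes: a drop below the set level triggers rework.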

  19. How to prepare for the next waves of Information Assurance issues?

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2006-01-01

    History repeats itself (l'histoire se répète). In general, each wave of new technology development shows a lack of security. The same lack of security can be found in the area of information and communications technology, resulting in a lack of Critical Information Infrastructure (CII) Assurance. By looking back, we can

  20. Assuring Integrity of Information Utility in Cyber-Learning Formats.

    Science.gov (United States)

    Morrison, James L.; Stein, Linda L.

    1999-01-01

    Describes a cyber-learning project for the World Wide Web developed by faculty and librarians at the University of Delaware that combined discovery learning with problem-based learning to develop critical thinking and quality management for information. Undergraduates were to find, evaluate, and use information to generate an Internet marketing…

  1. Coalition Information Assurance - Common Operating Picture (CIA-COP)

    National Research Council Canada - National Science Library

    Scheiderich, Louis

    2005-01-01

    .... Cyber Panel sought to provide high-level capabilities to help defend mission-critical information systems by monitoring them for signs of cyber attack and allowing operators to manage the operation...

  2. Engaging the Board: Corporate Governance and Information Assurance

    National Research Council Canada - National Science Library

    Anhal, Aarti

    2003-01-01

    .... Information and Communication Technologies (ICT) hold the potential to revitalise UK business, to spur economic growth and competitiveness, to revolutionise working practices and living environments as well as to transform government services...

  3. Assessing and Managing Risks to Information Assurance: A Methodological Approach

    National Research Council Canada - National Science Library

    Lamm, George

    2001-01-01

    .... Despite spending millions of dollars on firewalls, encryption technologies, and intrusion detection software, information infrastructure vulnerabilities and incidents continue to happen. These trends have a significant impact on military operations in the next decades.

  4. Information Assurance: Trends in Vulnerabilities, Threats, and Technologies

    National Research Council Canada - National Science Library

    Gansler, Jacques S; Binnendijk, Hans

    2004-01-01

    ... together leaders in the fields of military and commercial technology. The purpose of the meeting was to gain insight into the risks and vulnerabilities inherent in the use of information technology on the battlefield and in military...

  5. Assessing and Managing Risks to Information Assurance: A Methodological Approach

    Science.gov (United States)

    2001-05-01

    Figure 1). Computer system vulnerabilities have also increased according to CERT with JAVA and Windows operating systems at times reporting one... windstorms and water from floods and rain cause service outages to information systems and networks but are limited to geography and time. Manmade...Step B.2.4 by putting a magnifying glass on the probabilities of the risk scenarios. 110 Availability: Indicates a scenario for which the system provides

  6. SYN-OP-SYS™: A Computerized Management Information System for Quality Assurance and Risk Management

    OpenAIRE

    Thomas, David J.; Weiner, Jayne; Lippincott, Ronald C.

    1985-01-01

    SYN·OP·SYS™ is a computerized management information system for quality assurance and risk management. Computer software for the efficient collection and analysis of “occurrences” and the clinical data associated with these kinds of patient events is described. The system is evaluated according to certain computer design criteria, and the system's implementation is assessed.

  7. 77 FR 14955 - DoD Information Assurance Scholarship Program (IASP)

    Science.gov (United States)

    2012-03-14

    ...'' includes computer security, network security, cybersecurity, cyber operations, and other relevant IT...: The National Security Agency (NSA) is the Executive Administrator of the DoD Information Assurance... criteria for IA education and has been jointly designated by the Department of Homeland Security and the...

  8. Software quality assurance and information management, October 1986 to October 1992

    International Nuclear Information System (INIS)

    Hill, I.E.

    1993-01-01

    This report describes the work carried out by Cedar Design Systems Limited under contract PECD 7/9/384. The brief for the contract was initially to provide advice on Software Quality Assurance (SQA) as part of the CEC PACOMA project. This was later extended to include further SQA and information management tasks specific to the HMIP Radioactive Waste Disposal Assessments Research Programme. (Author)

  9. A Framework for Managing the Assured Information Sharing Lifecycle

    Science.gov (United States)

    2013-11-06

    Adam, A Poli- cy-based Approach to Smart Cloud Services, Service Research and Innovation Institute Global Conf., July 2012. • Sumit More, Mary...Research, June 2012. • Sumit More, Mary Mathews, Anupam Joshi and Tim Finin, A Knowledge-Based Approach To Intrusion Detection Modeling, Proc IEEE

  10. Analysis of data as information: quality assurance approach.

    Science.gov (United States)

    Ivankovic, D; Kern, J; Bartolic, A; Vuletic, S

    1993-01-01

    Describes a prototype module for data analysis of the health-care delivery system. It consists of three main parts: data/variable selection; algorithms for the analysis of quantitative and qualitative changes in the system; and interpretation and explanation of the results. Such a module, designed for primary health care, has been installed on a PC in the health manager's office. Data enter the information system through standard DBMS procedures, followed by calculation of a number of different indicators and of time series, as ordered sequences of indicators, according to the demands of the manager. The last procedure is "change analysis", with estimation of unexpected differences between and within units, e.g. health-care teams, as well as unexpected variabilities and trends. As an example, presents and discusses the diagnostic patterns of neurotic cases, referral patterns and preventive behaviour of GPs' teams.
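    A minimal sketch of the "change analysis" step, assuming a simple outlier test of an indicator's latest value against its own history. The threshold and the referral counts are invented, not from the paper.

```python
import statistics

def flag_change(series: list[float], z_threshold: float = 2.0) -> bool:
    """True if the last observation departs from the earlier ones by more
    than z_threshold sample standard deviations."""
    history, latest = series[:-1], series[-1]
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return abs(latest - mean) > z_threshold * sd

# Hypothetical monthly referral counts for one GP team; the jump to 48
# is the kind of "unexpected difference" the module would surface.
referrals_per_month = [31, 28, 33, 30, 29, 32, 48]
print(flag_change(referrals_per_month))
```

    A real module would of course use more robust statistics and compare across teams as well as within them; this only illustrates the within-unit case.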

  11. Developing a Framework for Evaluating Organizational Information Assurance Metrics Programs

    Science.gov (United States)

    2007-03-01

    security ((ISC)2 , 2006)      As another conceptual taxonomy, the DOD’s defense‐in‐depth concept  compares information security systems to a  medieval ...functions and technology with  medieval   tools, weapons, and attacks of the dark ages (Jones, 2005).    20    The International Standards Organization (ISO...California  Institute of Technology and three “Deep Space Network complexes around the    145  world,” as well as an “ astronomical  observatory” in California

  12. Applying Business Process Reengineering to the Marine Corps Information Assurance Certification and Accreditation Process

    Science.gov (United States)

    2009-09-01

    Database Management System DATO : Denial of Authority To Operate DIACAP: DoD Information Assurance Certification and Accreditation Program DII...level of risk, based on the implementation of an approved set of technical, managerial, and procedural safeguards. (CNSSI, 2006, p. 2) IA...concerned with risk elimination but rather risk minimization. The need for IA C&A in USMC Information Technology (IT) systems is based on the need to

  13. Model Based Mission Assurance in a Model Based Systems Engineering (MBSE) Framework: State-of-the-Art Assessment

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.

    2016-01-01

    This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.

  14. Guidance for implementing an environmental, safety, and health-assurance program. Volume 15. A model plan for line organization environmental, safety, and health-assurance programs

    Energy Technology Data Exchange (ETDEWEB)

    Ellingson, A.C.; Trauth, C.A. Jr.

    1982-01-01

    This is 1 of 15 documents designed to illustrate how an Environmental, Safety and Health (ES and H) Assurance Program may be implemented. The generic definition of ES and H Assurance Programs is given in a companion document entitled An Environmental, Safety and Health Assurance Program Standard. This particular document presents a model operational-level ES and H Assurance Program that may be used as a guide by an operational-level organization in developing its own plan. The model presented here reflects the guidance given in the total series of 15 documents.

  15. Audit of the informed consent process as a part of a clinical research quality assurance program.

    Science.gov (United States)

    Lad, Pramod M; Dahl, Rebecca

    2014-06-01

    Audits of the informed consent process are a key element of a clinical research quality assurance program. A systematic approach to such audits has not been described in the literature. In this paper we describe two components of the audit. The first is the audit of the informed consent document to verify adherence with federal regulations. The second component is comprised of the audit of the informed consent conference, with emphasis on a real time review of the appropriate communication of the key elements of the informed consent. Quality measures may include preparation of an informed consent history log, notes to accompany the informed consent, the use of an informed consent feedback tool, and the use of institutional surveys to assess comprehension of the informed consent process.

  16. Probability-informed testing for reliability assurance through Bayesian hypothesis methods

    International Nuclear Information System (INIS)

    Smith, Curtis; Kelly, Dana; Dezfuli, Homayoon

    2010-01-01

    Bayesian inference techniques play a central role in modern risk and reliability evaluations of complex engineering systems. These techniques allow the system performance data and any relevant associated information to be used collectively to calculate the probabilities of various types of hypotheses that are formulated as part of reliability assurance activities. This paper proposes a methodology based on Bayesian hypothesis testing to determine the number of tests that would be required to demonstrate that a system-level reliability target is met with a specified probability level. Recognizing that full-scale testing of a complex system is often not practical, testing schemes are developed at the subsystem level to achieve the overall system reliability target. The approach uses network modeling techniques to transform the topology of the system into logic structures consisting of series and parallel subsystems. The paper addresses the consideration of cost in devising subsystem level test schemes. The developed techniques are demonstrated using several examples. All analyses are carried out using the Bayesian analysis tool WinBUGS, which uses Markov chain Monte Carlo simulation methods to carry out inference over the network.

  17. A Model for Information

    Directory of Open Access Journals (Sweden)

    Paul Walton

    2014-09-01

    Full Text Available This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call "non-functional" attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi's questions "what is information?" and "the informational circle: how can information be assessed?" (which he numbers P1 and P12).

  18. Prospects for Evidence -Based Software Assurance: Models and Analysis

    Science.gov (United States)

    2015-09-01

    HTML5 . • Assurance for self-adaptive systems. Self-adaptiveness is becoming an essential feature of modern systems on the basis of requirements for...are interleaved at run-time. In related work, safe HTML5 subsets were considered as a way to provide a means to avoid certain exploits [1...web languages are often vague and they interact in ways that can be subtle and dangerous. We methodically surveyed the HTML5 specification and

  19. Information Assurance for Enterprise Resource Planning Systems: Risk Considerations in Public Sector Organizations

    Directory of Open Access Journals (Sweden)

    SHAHZAD NAEEM

    2016-10-01

    Full Text Available ERP (Enterprise Resource Planning) systems reveal and pose non-typical risks due to their dependencies on interlinked business operations and process reengineering. Understanding such risks is significant for planning and conducting assurance engagements on the reliability of these complicated computer systems, especially in the case of a distributed environment where data reside at multiple sites and risks are of a unique nature. Until now, there have been only brief pragmatic grounds on this public sector ERP issue. To analyze this subject, a partially structured consultation study was carried out with 15 skilled information systems auditors who are specialists in evaluating ERP systems risks. This methodology permitted obtaining more elaborate information about stakeholders' opinions and customer experiences. In addition, interviewees mentioned numerous basic execution troubles (e.g. inadequately skilled human resources and insufficient process reengineering attempts) that lead to enhanced hazards. It was also reported by the interviewees that risks currently vary across vendors and across applications. Eventually, in offering assurance on ERP systems, participants overwhelmingly stressed examining the process instead of the system end product.

  20. The evolving story of information assurance at the DoD.

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip LaRoche

    2007-01-01

    This document is a review of five documents on information assurance from the Department of Defense (DoD), namely 5200.40, 8510.1-M, 8500.1, 8500.2, and an "interim" document on DIACAP [9]. The five documents divide into three sets: (1) 5200.40 & 8510.1-M, (2) 8500.1 & 8500.2, and (3) the interim DIACAP document. The first two sets describe the certification and accreditation process known as "DITSCAP"; the last two sets describe the certification and accreditation process known as "DIACAP" (the second set applies to both processes). Each set of documents describes (1) a process, (2) a systems classification, and (3) a measurement standard. Appendices in this report (a) list the Phases, Activities, and Tasks of DITSCAP, (b) note the discrepancies between 5200.40 and 8510.1-M concerning DITSCAP Tasks and the System Security Authorization Agreement (SSAA), (c) analyze the DIACAP constraints on role fusion and on reporting, (d) map terms shared across the documents, and (e) review three additional documents on information assurance, namely DCID 6/3, NIST 800-37, and COBIT®.

  1. Information Assurance for Enterprise Resource Planning Systems: Risk Considerations in Public Sector Organizations

    International Nuclear Information System (INIS)

    Naeem, S.; Islam, M.H.

    2016-01-01

    ERP (Enterprise Resource Planning) systems reveal and pose non-typical risks due to their dependencies on interlinked business operations and process reengineering. Understanding such risks is significant for planning and conducting assurance engagements on the reliability of these complicated computer systems, especially in the case of a distributed environment where data reside at multiple sites and risks are of a unique nature. Until now, there have been only brief pragmatic grounds on this public sector ERP issue. To analyze this subject, a partially structured consultation study was carried out with 15 skilled information systems auditors who are specialists in evaluating ERP systems risks. This methodology permitted obtaining more elaborate information about stakeholders' opinions and customer experiences. In addition, interviewees mentioned numerous basic execution troubles (e.g. inadequately skilled human resources and insufficient process reengineering attempts) that lead to enhanced hazards. It was also reported by the interviewees that risks currently vary across vendors and across applications. Eventually, in offering assurance on ERP systems, participants overwhelmingly stressed examining the process instead of the system end product. (author)

  2. Governance and Public Sector Transformation in South Africa: Reporting and Providing Assurance on Service Delivery Information

    Directory of Open Access Journals (Sweden)

    Mariaan Roos

    2012-12-01

    Full Text Available Reporting on performance was legislatively established in South Africa in terms of the Public Finance Management Act, Act 1 of 1999, section 40(3)(a). The auditing of the reported information was legislated in the Public Audit Act, Act 25 of 2004, section 20(2)(c). The objectives of the article are firstly to provide an overview of the development and application of reporting on service delivery information, secondly to do the same for providing assurance on that information, and thirdly to reflect on challenges to the implementation thereof in South Africa. The aim of these objectives is to formulate possible future considerations for improved governance. As the central part of the methodology, a review of the literature on the reporting and auditing of non-financial information was conducted. The research included scrutiny of the different philosophies and approaches adopted by different countries to reporting and providing assurance on service delivery information; in this respect, the research reflects a comparative element. In South Africa the Auditor-General adopted a phasing-in approach. The development of the audit approach and audit procedures has reached a stable stage, nine years after the initial process started. The audit of performance information now forms an integral part of the regularity audit process. The analysis of audit findings for the period under study indicates a considerable improvement once the process was initiated, but stagnation persists in subsequent years. Numerous challenges remain around the application of performance reporting in South Africa, including non-compliance, the lack of sufficient and appropriate audit evidence, inconsistencies between the various strategic documents and the need to improve the usefulness of performance information. In conclusion the article proposes some steps to address the challenges.

  3. Comparing Information Assurance Awareness Training for End-Users: A Content Analysis Examination of Air Force and Defense Information Systems Agency User Training Modules

    National Research Council Canada - National Science Library

    Fruge, John W

    2008-01-01

    Today, the threats to information security and assurance are great. While there are many avenues for IT professionals to safeguard against these threats, many times these defenses prove useless against typical system users...

  4. Quality assurance of metabolomics.

    Science.gov (United States)

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas

    2015-01-01

    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, metabolite identification, to bioinformatics data-mining - is urgently needed to assure both quality of metabolomics data and reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demand quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools.

  5. Evaluation of a mandatory quality assurance data capture in anesthesia: a secure electronic system to capture quality assurance information linked to an automated anesthesia record.

    Science.gov (United States)

    Peterfreund, Robert A; Driscoll, William D; Walsh, John L; Subramanian, Aparna; Anupama, Shaji; Weaver, Melissa; Morris, Theresa; Arnholz, Sarah; Zheng, Hui; Pierce, Eric T; Spring, Stephen F

    2011-05-01

    Efforts to assure high-quality, safe, clinical care depend upon capturing information about near-miss and adverse outcome events. Inconsistent or unreliable information capture, especially for infrequent events, compromises attempts to analyze events in quantitative terms, understand their implications, and assess corrective efforts. To enhance reporting, we developed a secure, electronic, mandatory system for reporting quality assurance data linked to our electronic anesthesia record. We used the capabilities of our anesthesia information management system (AIMS) in conjunction with internally developed, secure, intranet-based, Web application software. The application is implemented with a backend allowing robust data storage, retrieval, data analysis, and reporting capabilities. We customized a feature within the AIMS software to create a hard stop in the documentation workflow before the end of anesthesia care time stamp for every case. The software forces the anesthesia provider to access the separate quality assurance data collection program, which provides a checklist for targeted clinical events and a free text option. After completing the event collection program, the software automatically returns the clinician to the AIMS to finalize the anesthesia record. The number of events captured by the departmental quality assurance office increased by 92% (95% confidence interval [CI] 60.4%-130%) after system implementation. The major contributor to this increase was the new electronic system. This increase has been sustained over the initial 12 full months after implementation. Under our reporting criteria, the overall rate of clinical events reported by any method was 471 events out of 55,382 cases or 0.85% (95% CI 0.78% to 0.93%). The new system collected 67% of these events (95% confidence interval 63%-71%). We demonstrate the implementation in an academic anesthesia department of a secure clinical event reporting system linked to an AIMS. 
The system enforces
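The event rates above are reported with 95% confidence intervals. The study's exact interval method is not stated in this excerpt, but a normal-approximation (Wald) interval on the 471/55,382 figure reproduces the reported 0.85% (0.78%-0.93%) rate closely; a minimal sketch:

```python
import math

def event_rate_ci(events, cases, z=1.96):
    """Wald (normal-approximation) 95% CI for an event proportion."""
    p = events / cases
    se = math.sqrt(p * (1 - p) / cases)
    return p, p - z * se, p + z * se

p, lo, hi = event_rate_ci(471, 55382)
print(f"rate={p:.2%}  95% CI: {lo:.2%}-{hi:.2%}")
# → rate=0.85%  95% CI: 0.77%-0.93%
```

The Wald lower bound rounds to 0.77% rather than the reported 0.78%, which suggests the authors used a slightly different interval (e.g., Wilson or exact binomial); the difference is in the last decimal place.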

  6. A Belief-Based Model of Air Traffic Controllers Performing Separation Assurance

    Science.gov (United States)

    Landry, S.J.

    2009-01-01

    A model of an air traffic controller performing a separation assurance task was produced. The model was designed to be simple to use and deploy in a simulator, but still provide realistic behavior. The model is based upon an evaluation of the safety function of the controller for separation assurance, and utilizes fast and frugal heuristics and belief networks to establish a knowledge set for the controller model. Based on this knowledge set, the controller acts to keep aircraft separated. Validation results are provided to demonstrate the model's performance.

  7. Engineering Information Security The Application of Systems Engineering Concepts to Achieve Information Assurance

    CERN Document Server

    Jacobs, Stuart

    2011-01-01

    Information security is the act of protecting information from unauthorized access, use, disclosure, disruption, modification, or destruction. This book discusses why information security is needed and how security problems can have widespread impacts. It covers the complete security lifecycle of products and services, starting with requirements and policy development and progressing through development, deployment, and operations, and concluding with decommissioning. Professionals in the sciences, engineering, and communications fields will turn to this resource to understand the many legal,

  8. Information Assurance and Information Technology: Training, Certification, and Personnel Management in the Department of Defense

    National Research Council Canada - National Science Library

    1999-01-01

    The DoD's warfighting Capability and the security of its information infrastructure are at great risk from attacks by foreign intelligence organizations, cyber-terrorists, and the incompetencies of some of its own users...

  9. Quality Assurance for Postgraduate Programs: Design of a Model Applied on a University in Chile

    Science.gov (United States)

    Careaga Butter, Marcelo; Meyer Aguilera, Eduardo; Badilla Quintana, María Graciela; Jiménez Pérez, Laura; Sepúlveda Valenzuela, Eileen

    2017-01-01

    The quality of Education in Chile is a controversial topic that has been in the public debate in the last several years. To ensure quality in graduate programs, accreditation is compulsory. The current article presents a model to improve the process of self-regulation. The main objective was to design a Model of Quality Assurance for Postgraduate…

  10. Development of Knowledge Management Model for Developing the Internal Quality Assurance in Educational Opportunity Expansion Schools

    Science.gov (United States)

    Pradabpech, Pipat; Chantarasombat, Chalard; Sriampai, Anan

    2015-01-01

    This research aimed: 1) to study the current situation and problems in KM, 2) to develop the KM Model, and 3) to evaluate the usage of the KM Model for developing the Internal Quality Assurance of Educational Opportunity Expansion Schools. There were 3 Phases of research implementation. Phase 1: the current situation and problems in KM, was…

  11. Applying the Concept of Minimal Essential to Maintain Operational Continuity and Attain Mission Assurance During Internal and External Attacks on the Information Environment

    National Research Council Canada - National Science Library

    McCallam, Dennis H; Luzwick, Perry

    2002-01-01

    .... Because network centric warfare and information superiority are important in achieving military successes, data resiliency for operational continuity is essential for achieving mission assurance...

  12. Critical Thinking Skills of Students through Mathematics Learning with ASSURE Model Assisted by Software Autograph

    Science.gov (United States)

    Kristianti, Y.; Prabawanto, S.; Suhendra, S.

    2017-09-01

    This study aims to examine the critical thinking ability of students who learn mathematics with the ASSURE learning model assisted by Autograph software. The design of this study was an experimental pre-test/post-test control group design. The experimental group received mathematics learning with the ASSURE model assisted by Autograph software and the control group received mathematics learning with the conventional model. The data were obtained from tests of critical thinking skills. This research was conducted at the junior high school level, with the research population being students of one junior high school in Subang Regency in the 2016/2017 school year and a research sample of two classes of grade VIII students. The research data were analysed quantitatively: the normalized gain levels of the two sample groups were compared using a one-way ANOVA test. The results show that mathematics learning with the ASSURE model assisted by Autograph software can improve the critical thinking ability of junior high school students, and is significantly better at improving those skills than the conventional model.
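The quantitative analysis described above (normalized gain compared between two groups with a one-way ANOVA) can be sketched in pure Python; the scores below are illustrative, not the study's data:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA: between-group over within-group mean square."""
    all_x = [x for g in groups for x in g]
    grand = sum(all_x) / len(all_x)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_b = len(groups) - 1
    df_w = len(all_x) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Illustrative pre/post scores for an experimental and a control class
exp = [normalized_gain(p, q) for p, q in [(40, 85), (50, 80), (30, 75)]]
ctl = [normalized_gain(p, q) for p, q in [(40, 60), (50, 65), (30, 50)]]
print(one_way_anova_F(exp, ctl))
```

The F statistic would then be compared against the F distribution with (df_b, df_w) degrees of freedom to obtain the significance level.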

  13. An Exploration of Professional Culture Differentials and Their Potential Impact on the Information Assurance Component of Optical Transmission Networks Design

    Science.gov (United States)

    Cuthrell, Michael Gerard

    2011-01-01

    Optical transmission networks are an integral component of the critical infrastructures for many nations. Many people believe that optical transmission networks are impenetrable. In actuality, these networks possess weaknesses that can be exploited to bring about harm. An emerging Information Assurance (IA) industry has as its goals: to…

  14. Building information modelling (BIM)

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2009-02-01

    Full Text Available The concept of a Building Information Model (BIM), also known as a Building Product Model (BPM), is nothing new. A short article on BIM will never cover the entire field, because it is a particularly complex field that is only recently beginning to receive...

  15. Development Design Model of Academic Quality Assurance at Private Islamic University Jakarta Indonesia

    Science.gov (United States)

    Suprihatin, Krebet; Bin Mohamad Yusof, Hj. Abdul Raheem

    2015-01-01

    This study aims to evaluate the practice of academic quality assurance in design model based on seven aspects of quality are: curriculum design, teaching and learning, student assessment, student selection, support services, learning resources, and continuous improvement. The design study was conducted in two stages. The first stage is to obtain…

  16. Quality Assurance in E-Learning: PDPP Evaluation Model and Its Application

    Science.gov (United States)

    Zhang, Weiyuan; Cheng, Y. L.

    2012-01-01

    E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of e-learning, however, is essential for the quality assurance of e-learning courses. This paper constructs a four-phase evaluation model for e-learning courses, which includes planning, development,…

  17. A Methodology, a Language, and a Tool to Provide Information Security Assurance Arguments

    National Research Council Canada - National Science Library

    Park, Joon

    2002-01-01

    .... To design a system that can be trusted or assess security properties in a system, the related assurance arguments need to be developed and described effectively in a well-organized format by means of a sound language...

  18. Quality assurance in Library and Information Schools in Europe: major trends and issues

    OpenAIRE

    Tammaro, Anna Maria

    2006-01-01

    In Europe, the internationalisation process of higher education, driven by the Bologna Process, has identified the objectives of improving quality assurance, transparency and recognition of qualifications. LIS guidelines for quality assurance and the recognition of professionals have been analysed to discover a common definition of quality, shared purposes and similar processes. Could European LIS schools collaborate toward a single accreditation system in Europe? The paper reports on the ...

  19. Book Review: Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions

    Directory of Open Access Journals (Sweden)

    Gary Kessler

    2009-09-01

    Full Text Available Knapp, K.J. (Ed.). (2009). Cyber Security and Global Information Assurance: Threat Analysis and Response Solutions. Hershey, NY: Information Science Reference. 434 + xxii pages, ISBN: 978-1-60566-326-5, US$195. Reviewed by Gary C. Kessler (gck@garykessler.net). I freely admit that this book was sent to me by the publisher for the expressed purpose of my writing a review and that I know several of the chapter authors. With that disclosure out of the way, let me say that the book is well worth the review (and I get to keep my review copy). The preface to the book cites the 2003 publication of The National Strategy to Secure Cyberspace by the White House, and the acknowledgement by the U.S. government that our economy and national security were fully dependent upon computers, networks, and the telecommunications infrastructure. This may have come as news to the general population but it was a long overdue public statement to those of us in the industry. The FBI's InfraGard program and the formation of the National Infrastructure Protection Center (NIPC) pre-dated this report by at least a half-dozen years, so the report was hardly earthshattering. And the fact that the bulk of the telecom infrastructure is owned by the private sector is a less advertised fact. Nonetheless, reminding the community of these facts is always a Good Thing and provides the raison d'être of this book. (see PDF for full review)

  20. Modeling E-learning quality assurance benchmarking in higher education

    NARCIS (Netherlands)

    Alsaif, Fatimah; Clementking, Arockisamy

    2014-01-01

    Online education programs have been growing rapidly. While it is somehow difficult to specifically quantify quality, many recommendations have been suggested to specify and demonstrate quality of online education touching on common areas of program enhancement and administration. To design a model

  1. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducability for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    and transforming that information into quantitative data. However, this process is frequently required in research and quality assurance contexts. The purpose of this study was to examine inter-rater reproducibility (agreement and reliability) among an inexperienced group of clinicians in extracting spinal...... of radiological training is not required in order to transform MRI-derived pathoanatomic information from a narrative format to a quantitative format with high reproducibility for research or quality assurance purposes....... a categorical electronic coding matrix. Decision rules were developed after initial coding in an effort to resolve ambiguities in narrative reports. This process was repeated a further three times using separate samples of 20 MRI reports until no further ambiguities were identified (total n=80). Reproducibility...
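The study above reports inter-rater reproducibility (agreement and reliability) for clinicians assigning categorical codes to MRI narrative reports. The specific statistic is not stated in this excerpt, but Cohen's kappa is the standard chance-corrected agreement measure for two raters coding categorical data; a minimal pure-Python sketch:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical codes."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Expected agreement if both raters coded independently at their own rates
    expected = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes extracted from four MRI reports by two raters
a = ["stenosis", "stenosis", "normal", "normal"]
b = ["stenosis", "normal",   "normal", "normal"]
print(cohens_kappa(a, b))  # → 0.5
```

Kappa of 1.0 indicates perfect agreement; values near 0 indicate agreement no better than chance.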

  2. Information Systems Efficiency Model

    Directory of Open Access Journals (Sweden)

    Milos Koch

    2017-07-01

    Full Text Available This contribution discusses the basic concept of creating a new model for the efficiency and effectiveness assessment of company information systems. The present trends in this field are taken into account, and the attributes of measuring the optimal solutions for a company's ICT are retained (implementation, functionality, service, innovations, safety, relationships, costs, etc.). The proposal of a new model of assessment comes from our experience with formerly implemented and employed methods, methods which we have modified in time and adapted to companies' needs but also to the necessities of our research that has been done through the ZEFIS portal. The most noteworthy of them is the HOS method that we have discussed in a number of forums. Its main feature is the fact that it respects the complexity of an information system in correlation with the balanced state of its individual parts.

  3. Implementing two nurse practitioner models of service at an Australian male prison: A quality assurance study.

    Science.gov (United States)

    Wong, Ides; Wright, Eryn; Santomauro, Damian; How, Raquel; Leary, Christopher; Harris, Meredith

    2018-01-01

    To examine the quality and safety of nurse practitioner services of two newly implemented nurse practitioner models of care at a correctional facility. Nurse practitioners could help to meet the physical and mental health needs of Australia's growing prison population; however, the nurse practitioner role has not previously been evaluated in this context. A quality assurance study conducted in an Australian prison where a primary health nurse practitioner and a mental health nurse practitioner were incorporated into an existing primary healthcare service. The study was guided by Donabedian's structure, processes and outcomes framework. Routinely collected information included surveys of staff attitudes to the implementation of the nurse practitioner models (n = 21 staff), consultation records describing clinical processes and time use (n = 289 consultations), and a patient satisfaction survey (n = 29 patients). Data were analysed descriptively and compared to external benchmarks where available. Over the two-month period, the nurse practitioners provided 289 consultations to 208 prisoners. The presenting problems treated indicated that most referrals were appropriate. A significant proportion of consultations involved medication review and management. Both nurse practitioners spent more than half of their time on individual patient-related care. Overall, multidisciplinary team staff agreed that the nurse practitioner services were necessary, safe, met patient need and reduced treatment delays. Findings suggest that the implementation of nurse practitioners into Australian correctional facilities is acceptable and feasible and has the potential to improve prisoners' access to health services. Structural factors (e.g., room availability and limited access to prisoners) may have reduced the efficiency of the nurse practitioners' clinical processes and service implementation. Results suggest that nurse practitioner models can be successfully integrated into a

  4. Modeling information technology effectiveness

    OpenAIRE

    Aleksander Lotko

    2005-01-01

    Numerous cases of systems failing to bring the expected results mean that investments in information technology are treated more and more carefully and are no longer privileged over other investments. This gives rise to the need for cost–effect calculations. Modeling IT effectiveness is a procedure which helps to bring system complexity under control. By using proper measures it is possible to perform an objective investment appraisal of the projects under consideration. In the paper, a framework of method...

  5. ASPECTS REGARDING THE ROLE OF INFORMATION TECHNOLOGIES IN THE ASSURANCE OF SUPPLY CHAIN MANAGEMENT PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Ilies Radu Ovidiu

    2013-07-01

    information technology such as Internet and ERP systems. Internet offers important opportunities to all partners from the supply chain to get information about consumption tendencies and changes in consumption request, virtual information about a product and the clients’ requests regarding the logistic services. As for ERP systems, it can be said that they mostly influence the designing of business processes, in order to assure coherence between them and the effective integration of different firm components. Even though the internal integration is an important aspect, an approach to management at the supply chain level, in an efficient and effective way, cannot be done without external integration with suppliers and clients. That is why we consider that companies belonging to the business field must focus on structuring key processes, to collaborate with their clients and suppliers and to integrate their internal systems, with the aim to support business operations.

  6. TU-G-BRD-02: Automated Systematic Quality Assurance Program for Radiation Oncology Information System Upgrades

    International Nuclear Information System (INIS)

    Zhang, B; Yi, B; Eley, J; Mutaf, Y; Rahman, S; D’Souza, W

    2015-01-01

    Purpose: To (1) describe an independent, automated, systematic software-based protocol for verifying clinical data accuracy/integrity for mitigation of data corruption/loss risks following radiation oncology information system (ROIS) upgrades; and (2) report on application of this approach in an academic/community practice environment. Methods: We propose a robust approach to perform quality assurance on the ROIS after an upgrade, targeting four data sources: (1) ROIS relational database; (2) ROIS DICOM interface; (3) ROIS treatment machine data configuration; and (4) ROIS-generated clinical reports. We investigated the database schema for differences between pre-/post-upgrade states. Paired DICOM data streams for the same object (such as RT-Plan/Treatment Record) were compared between pre-/post-upgrade states for data corruption. We examined machine configuration and related commissioning data files for changes and corruption. ROIS-generated treatment appointment and treatment parameter reports were compared to ensure patient encounter and treatment plan accuracy. This protocol was supplemented by an end-to-end clinical workflow test to verify essential ROIS functionality and the integrity of components interfaced during the patient care chain of activities. We describe the implementation of this protocol during a Varian ARIA system upgrade at our clinic. Results: We verified 1,638 data tables with 2.4 billion data records. For 222 under-treatment patients, 605 DICOM RT plans and 13,480 DICOM treatment records retrieved from the ROIS DICOM interface were compared, with no differences in fractions, doses delivered, or treatment parameters. We identified 82 new data tables and 78 amended/deleted tables consistent with the upgrade. Reports for 5,073 patient encounters over a 2-week horizon were compared and were identical to those before the upgrade. Content in 12,237 xml machine files was compared, with no differences identified. Conclusion: An independent QA
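The pre-/post-upgrade comparisons described above amount to diffing large keyed snapshots of records (database rows, DICOM objects, machine files). A simplified illustration of how such a check might be automated, with hypothetical field names, is to fingerprint each record and compare the keyed sets:

```python
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Stable hash of a record's clinically relevant fields."""
    canon = json.dumps(record, sort_keys=True)  # canonical serialization
    return hashlib.sha256(canon.encode()).hexdigest()

def diff_snapshots(pre: dict, post: dict):
    """Compare pre-/post-upgrade snapshots keyed by object ID."""
    added = sorted(set(post) - set(pre))
    removed = sorted(set(pre) - set(post))
    changed = sorted(k for k in set(pre) & set(post)
                     if fingerprint(pre[k]) != fingerprint(post[k]))
    return added, removed, changed

# Hypothetical RT-plan snapshots before and after an upgrade
pre = {"plan1": {"fractions": 30, "dose": 60.0}}
post = {"plan1": {"fractions": 30, "dose": 60.0}, "plan2": {"fractions": 5}}
print(diff_snapshots(pre, post))  # → (['plan2'], [], [])
```

Additions consistent with the upgrade (like the 82 new tables reported) show up in `added`; any entry in `changed` for an under-treatment patient would flag possible data corruption for manual review.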

  7. Multinational Quality Assurance

    Science.gov (United States)

    Kinser, Kevin

    2011-01-01

    Multinational colleges and universities pose numerous challenges to the traditional models of quality assurance that are designed to validate domestic higher education. When institutions cross international borders, at least two quality assurance protocols are involved. To guard against fraud and abuse, quality assurance in the host country is…

  8. Quality assurance

    International Nuclear Information System (INIS)

    Hiller, G.H.

    1979-01-01

    This compendium intends to give fast bibliographic information and to fill the visible gap between documentation and general bibliographic information. The reader is given an outline of quality assurance and some examples of techniques from the relevant literature. The practical engineer, who is always short of time, is thus offered a quick survey and a fast deepening of his understanding by means of literature dealing specifically with his unresolved problems. The manuscript has been kept in its original form in order to speed up its publication. The RKW technical department limited itself to checking its contents and the adherence to the established information goals. (orig.)

  9. MOLES Information Model

    Science.gov (United States)

    Ventouras, Spiros; Lawrence, Bryan; Woolf, Andrew; Cox, Simon

    2010-05-01

    The Metadata Objects for Linking Environmental Sciences (MOLES) model has been developed within the Natural Environment Research Council (NERC) DataGrid project [NERC DataGrid] to fill a missing part of the 'metadata spectrum'. It is a framework within which to encode the relationships between the tools used to obtain data, the activities which organised their use, and the datasets produced. MOLES is primarily of use to consumers of data, especially in an interdisciplinary context, to allow them to establish details of provenance, and to compare and contrast such information without recourse to discipline-specific metadata or private communications with the original investigators [Lawrence et al 2009]. MOLES is also of use to the custodians of data, providing an organising paradigm for the data and metadata. The work described in this paper is a high-level view of the structure and content of a recent major revision of MOLES (v3.3) carried out as part of a NERC DataGrid extension project. The concepts of MOLES v3.3 are rooted in the harmonised ISO model [Harmonised ISO model] - particularly in metadata standards (ISO 19115, ISO 19115-2) and the 'Observations and Measurements' conceptual model (ISO 19156). MOLES exploits existing concepts and relationships, and specialises information in these standards. A typical sequence of data capturing involves one or more projects under which a number of activities are undertaken, using appropriate tools and methods to produce the datasets. Following this typical sequence, the relevant metadata can be partitioned into the following main sections - helpful in mapping onto the most suitable standards from the ISO 19100 series. • Project section • Activity section (including both observation acquisition and numerical computation) • Observation section (metadata regarding the methods used to obtain the data, the spatial and temporal sampling regime, quality etc.) • Observation collection section The key concepts in

  10. Force Displacement Model of Compliant Mechanisms Using Assur Sub-Chains

    DEFF Research Database (Denmark)

    Durango, Sebastian; Correa, Jorge; Aristizabal, Mauricio

    2011-01-01

    This article develops a modular procedure to perform force-displacement modeling of planar flexure-based Compliant Mechanisms (CMs). The procedure is mostly suitable for planar lumped CMs. To achieve the position analysis of CMs requires: (i) to implement the kinematic analysis as in the case...... of ordinary mechanisms, (ii) to solve the equilibrium problem by means of a static analysis and (iii) to model the flexures' behavior through a deflection analysis. The novel contribution of this article relies on the fact that a division strategy of the CM into Assur sub-chains is implemented, so that any CM...
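As background to step (iii) above, the deflection analysis of a single lumped flexure is commonly based on Euler-Bernoulli beam theory: for a rectangular cantilever flexure, the tip deflection under a transverse load is δ = FL³/(3EI) with I = bt³/12. A minimal sketch (the numbers are illustrative, not from the article):

```python
def flexure_tip_deflection(F, L, E, b, t):
    """Small-deflection tip displacement of a rectangular cantilever flexure.

    F: transverse tip load [N], L: length [m], E: Young's modulus [Pa],
    b: width [m], t: thickness [m] (bending about the thin direction).
    """
    I = b * t ** 3 / 12.0          # second moment of area [m^4]
    return F * L ** 3 / (3.0 * E * I)

# Illustrative steel flexure: 20 mm long, 10 mm wide, 1 mm thick, 1 N load
d = flexure_tip_deflection(1.0, 0.02, 200e9, 0.01, 0.001)
print(d)  # ≈ 1.6e-5 m (16 µm)
```

In a lumped CM model, the reciprocal of this compliance becomes the spring constant assigned to the flexure joint within each Assur sub-chain.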

  11. Japanese Quality Assurance System Regarding the Provision of Material Accounting Reports and the Safeguards Relevant Information to the IAEA

    International Nuclear Information System (INIS)

    Goto, Y.; Namekawa, M.; Kumekawa, H.; Usui, A.; Sano, K.

    2015-01-01

    The provision of the safeguards relevant reports and information in accordance with the comprehensive safeguards agreement (CSA) and the additional protocol (AP) is the basis for the IAEA safeguards. The government of Japan (Japan Safeguards Office, JSGO) has believed that correct reports contribute to effective and efficient safeguards; therefore the domestic quality assurance system for reporting to the IAEA was already established at the time of the accession to the CSA in 1977. It consists of Code 10 interpretation (including seminars for operators in Japan), SSAC's checks for syntax errors, code and internal consistency (computer-based consistency checks between facilities) and the discussion with the IAEA on the facilities' measurement systems for bulk-handling facilities, which contributes to more accurate reports from operators. This spirit has been maintained for the entry into force of the AP. For example, questions and amplifications from the IAEA are taken into account in the review of the AP declaration before it is sent to the IAEA, and open source information such as news articles and scientific literature in Japanese is collected and translated into English, and the translated information is provided to the IAEA as supplementary information, which may contribute to broadening the IAEA information sources and to their comprehensive evaluation. The other safeguards relevant information, such as the mail-box information for SNRI at LEU fuel fabrication plants, is also checked by the JSGO's QC software before posting. The software was developed by JSGO and it checks data format, batch IDs, birth/death dates, shipper/receiver information and material description codes. This paper explains the history of the development of the Japanese quality assurance system regarding the reports and the safeguards relevant information provided to the IAEA. (author)
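The field-level checks attributed above to the JSGO QC software (data format, batch IDs, birth/death dates, material description codes) might look like the following sketch; the batch-ID pattern and the code list are hypothetical, chosen only to illustrate the style of pre-submission validation:

```python
import re

KNOWN_CODES = {"LEU", "HEU", "NU", "DU", "PU", "TH"}  # hypothetical code list

def validate_record(rec):
    """Field-level checks of the kind described: format, IDs, dates, codes."""
    errors = []
    if not re.fullmatch(r"[A-Z0-9-]+", rec.get("batch_id", "")):
        errors.append("bad batch_id format")
    birth, death = rec.get("birth_date"), rec.get("death_date")
    if birth and death and death < birth:  # ISO-8601 strings compare correctly
        errors.append("death date precedes birth date")
    if rec.get("material_code") not in KNOWN_CODES:
        errors.append("unknown material description code")
    return errors

print(validate_record({"batch_id": "B-001", "material_code": "LEU"}))  # → []
```

Records with a non-empty error list would be corrected before the report is posted to the IAEA, which is the point of running such checks domestically first.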

  12. Quality Assurance in E-Learning: PDPP Evaluation Model and its Application

    Directory of Open Access Journals (Sweden)

    Weiyuan Zhang

    2012-06-01

    Full Text Available E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of e-learning, however, is essential for the quality assurance of e-learning courses. This paper constructs a four-phase evaluation model for e-learning courses, which includes planning, development, process, and product evaluation, called the PDPP evaluation model. Planning evaluation includes market demand, feasibility, target student group, course objectives, and finance. Development evaluation includes instructional design, course material design, course Web site design, flexibility, student-student interaction, teacher/tutor support, technical support, and assessment. Process evaluation includes technical support, Web site utilization, learning interaction, learning evaluation, learning support, and flexibility. Product evaluation includes student satisfaction, teaching effectiveness, learning effectiveness, and sustainability. Using the PDPP model as a research framework, a purely e-learning course on Research Methods in Distance Education, developed by the School of Professional and Continuing Education at the University of Hong Kong (HKU SPACE) and jointly offered with the School of Distance Learning for Medical Education of Peking University (SDLME, PKU), was used as a case study. Sixty students from mainland China, Hong Kong, Macau, and Malaysia were recruited for this course. According to summative evaluation through a student e-learning experience survey, the majority of students were very satisfied/satisfied on all e-learning dimensions of this course. The majority of students thought that the learning effectiveness of this course was equivalent to, or even better than, face-to-face learning because of cross-border collaborative learning, student-centred learning, sufficient learning support, and learning flexibility. This study shows that a high quality of teaching and learning might be assured by

  13. Model for Electromagnetic Information Leakage

    OpenAIRE

    Mao Jian; Li Yongmei; Zhang Jiemin; Liu Jinming

    2013-01-01

    Electromagnetic leakage occurs in operating information equipment and can lead to information leakage. In order to discover the nature of the information in electromagnetic leakage, this paper combined electromagnetic theory with information theory as an innovative research method. It outlines a systematic model of electromagnetic information leakage, which theoretically describes the process of information leakage, intercept and reproduction based on electromagnetic radiation, and ana...

  14. Information Retrieval Models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Göker, Ayse; Davies, John

    2009-01-01

    Many applications that handle information on the internet would be completely inadequate without the support of information retrieval technology. How would we find information on the world wide web if there were no web search engines? How would we manage our email without spam filtering? Much of the

  15. Object Modeling and Building Information Modeling

    OpenAIRE

    Auråen, Hege; Gjemdal, Hanne

    2016-01-01

    The main part of this thesis is an online course (Small Private Online Course) entitled "Introduction to Object Modeling and Building Information Modeling". This supplementary report clarifies the choices made in the process of developing the course. The course examines the basic concepts of object modeling, modeling techniques and a modeling language (UML). Further, building information modeling (BIM) is presented as a modeling process, and the object modeling concepts in the BIM softw...

  16. Authentication Assurance Levels

    International Nuclear Information System (INIS)

    Kouzes, Richard T.; Cash, James R.; Devaney, David M.; Geelhood, Bruce D.; Hansen, Randy R.; Melton, Ronald B.; Pitts, W. Karl

    2002-01-01

    This Common Criteria approach has been applied to create a definition of Authentication Assurance Levels that can quantify the level of assurance reached for a system subject to a set of authentication procedures. The arms-control authentication application of the Common Criteria expands on more typical information security evaluations in that it must contend with information barriers and preclude sophisticated intentional subversion attempts.

  17. A causal model for the effectiveness of internal quality assurance for the health science area.

    Science.gov (United States)

    Seeorn, Kittiya

    2005-10-01

    The purposes of this research were 1) to study the effectiveness of Internal Quality Assurance (IQA) in the Health science area, and 2) to study the factors affecting the effectiveness of the IQA in the Health science area. A causal model was developed by the researcher comprising 6 exogenous latent variables (Attitude towards quality assurance, Teamwork, Staff training, Resource sufficiency, Organizational culture, and Leadership) and the endogenous latent variables: the effectiveness of the IQA, Student-centered approach, Decentralized administration, PDCA cycle of work (Plan-Do-Check-Act), and Staff job satisfaction. The research sample consisted of 108 health science faculties derived by a stratified random sampling technique. Data were collected by 10 questionnaires with reliabilities ranging from 0.79 to 0.96. Data analyses comprised descriptive statistics and Linear Structural Relationship (LISREL) analysis. The major findings were as follows: 1. The 4 dimensions of effectiveness of the IQA of the Health science areas were significantly higher at the .05 level after the Health science faculties applied the IQA programme according to the National Education Act of 1999. 2. The causal model of the effectiveness of the IQA was valid and fitted the empirical data. The 6 predictors accounted for 83% of the variance in the effectiveness of the IQA. Culture and Leadership were the predictors that significantly accounted for the effectiveness of the IQA.

  18. Data quality assessment in the routine health information system: an application of the Lot Quality Assurance Sampling in Benin.

    Science.gov (United States)

    Glèlè Ahanhanzo, Yolaine; Ouendo, Edgard-Marius; Kpozèhouen, Alphonse; Levêque, Alain; Makoutodé, Michel; Dramaix-Wilmet, Michèle

    2015-09-01

    Health information systems in developing countries are often faulted for the poor quality of the data generated and for the insufficient means implemented to improve system performance. This study examined data quality in the Routine Health Information System in Benin in 2012 and carried out a cross-sectional evaluation of the quality of the data using the Lot Quality Assurance Sampling method. The results confirm the insufficient quality of the data based on three criteria: completeness, reliability and accuracy. However, differences can be seen as the shortcomings are less significant for financial data and for immunization data. The method is simple, fast and can be proposed for current use at operational level as a data quality control tool during the production stage.
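Lot Quality Assurance Sampling, as used in the Benin study above, classifies each "lot" of data by drawing a small random sample and comparing the number of defective items against a decision value. A minimal sketch follows; the sample size and decision value are hypothetical, not those of the Benin study.

```python
import math

def lqas_accept(defects_found: int, sample_size: int, decision_value: int) -> bool:
    """Accept a lot (e.g. one facility's monthly reports) if the number of
    defective sampled items does not exceed the decision value."""
    if not 0 <= defects_found <= sample_size:
        raise ValueError("defects must lie between 0 and the sample size")
    return defects_found <= decision_value

def acceptance_probability(p: float, n: int, d: int) -> float:
    """Probability that a lot with true defect rate p is accepted:
    P(X <= d) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

# Hypothetical design: sample 19 items per lot, accept at most 2 defects.
print(lqas_accept(1, 19, 2))                          # lot accepted
print(round(acceptance_probability(0.30, 19, 2), 3))  # risk of accepting a 30%-defective lot
```

Plotting `acceptance_probability` against `p` gives the operating characteristic curve used to choose `n` and `d` for the desired misclassification risks.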

  19. Quality assurance of in-situ measurements of land surface albedo: A model-based approach

    Science.gov (United States)

    Adams, Jennifer; Gobron, Nadine; Widlowski, Jean-Luc; Mio, Corrado

    2016-04-01

    This paper presents the development of a model-based framework for assessing the quality of in-situ measurements of albedo used to validate land surface albedo products. Using a 3D Monte Carlo Ray Tracing (MCRT) radiative transfer model, a quality assurance framework is built based on simulated field measurements of albedo within complex 3D canopies and under various illumination scenarios. This method provides an unbiased approach in assessing the quality of field measurements, and is also able to trace the contributions of two main sources of uncertainty in field-measurements of albedo; those resulting from 1) the field measurement protocol, such as height or placement of field measurement within the canopy, and 2) intrinsic factors of the 3D canopy under specific illumination characteristics considered, such as the canopy structure and landscape heterogeneity, tree heights, ecosystem type and season.

  20. Assuring the privacy and security of transmitting sensitive electronic health information.

    Science.gov (United States)

    Peng, Charlie; Kesarinath, Gautam; Brinks, Tom; Young, James; Groves, David

    2009-11-14

    The interchange of electronic health records between healthcare providers and public health organizations has become an increasingly desirable tool in reducing healthcare costs, improving healthcare quality, and protecting population health. Assuring privacy and security in the nationwide sharing of Electronic Health Records (EHR) in an environment such as GRID has become a top challenge and concern. The Centers for Disease Control and Prevention (CDC) and Science Applications International Corporation (SAIC) have jointly conducted a proof-of-concept study to find and build a common secure and reliable messaging platform (the SRM Platform) to handle this challenge. The SRM Platform is built on the open standards of OASIS, World Wide Web Consortium (W3C) web-services standards, and Web Services Interoperability (WS-I) specifications to provide the secure transport of sensitive EHR or electronic medical records (EMR). Transmitted data may be in any digital form including text, data, and binary files, such as images. This paper identifies the business use cases, architecture, test results, and new connectivity options for disparate health networks among PHIN, NHIN, Grid, and others.

  1. Modeling spatiotemporal information generation

    NARCIS (Netherlands)

    Scheider, Simon; Gräler, Benedikt; Pebesma, Edzer; Stasch, Christoph

    2016-01-01

    Maintaining knowledge about the provenance of datasets, that is, about how they were obtained, is crucial for their further use. Contrary to what the overused metaphors of ‘data mining’ and ‘big data’ are implying, it is hardly possible to use data in a meaningful way if information about sources

  2. An Introduction to the Deputy Assistant Secretary of Defense for Information and Identity Assurance

    National Research Council Canada - National Science Library

    Lentz, Robert

    2008-01-01

    ...) is information dependent and relies on trusted information to function effectively. The DoD faces daily attacks on its networks and systems, ranging from curious kids to much more advanced, organized campaigns. The DASD(IIA...

  3. Quality assurance and marketing.

    Science.gov (United States)

    Demby, N A

    1985-07-01

    Although considerable efforts have been directed toward the development and utilization of marketing strategies for dental practices, little if any information exists in the specific area of the role quality assurance may play in marketing dental services. This article describes and analyzes the current relationship between quality assurance and marketing, given the complex array of factors on the horizon that may affect how dentistry is organized and delivered. It must become the role of the profession to see that the alliance between marketing and quality assurance continues and is utilized to assure the quality of care provided and accountability to the public.

  4. The Development of Evaluation Model for Internal Quality Assurance System of Dramatic Arts College of Bunditpattanasilpa Institute

    Science.gov (United States)

    Sinthukhot, Kittisak; Srihamongkol, Yannapat; Luanganggoon, Nuchwana; Suwannoi, Paisan

    2013-01-01

    The research purpose was to develop an evaluation model for the internal quality assurance system of the dramatic arts College of Bunditpattanasilpa Institute. The Research and Development method was used as research methodology which was divided into three phases; "developing the model and its guideline", "trying out the actual…

  5. A simple parametric model observer for quality assurance in computer tomography

    Science.gov (United States)

    Anton, M.; Khanin, A.; Kretz, T.; Reginatto, M.; Elster, C.

    2018-04-01

    Model observers are mathematical classifiers that are used for the quality assessment of imaging systems such as computer tomography. The quality of the imaging system is quantified by means of the performance of a selected model observer. For binary classification tasks, the performance of the model observer is defined by the area under its ROC curve (AUC). Typically, the AUC is estimated by applying the model observer to a large set of training and test data. However, the recording of these large data sets is not always practical for routine quality assurance. In this paper we propose as an alternative a parametric model observer that is based on a simple phantom, and we provide a Bayesian estimation of its AUC. It is shown that a limited number of repeatedly recorded images (10–15) is already sufficient to obtain results suitable for the quality assessment of an imaging system. A MATLAB® function is provided for the calculation of the results. The performance of the proposed model observer is compared to that of the established channelized Hotelling observer and the nonprewhitening matched filter for simulated images as well as for images obtained from a low-contrast phantom on an x-ray tomography scanner. The results suggest that the proposed parametric model observer, along with its Bayesian treatment, can provide an efficient, practical alternative for the quality assessment of CT imaging systems.
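The AUC used above is, in the nonparametric case, the probability that the observer scores a randomly chosen signal-present image higher than a signal-absent one. The sketch below computes that empirical AUC via the Wilcoxon-Mann-Whitney statistic; it is a generic illustration, not the paper's parametric or Bayesian estimator, and the observer scores are invented.

```python
from itertools import product

def empirical_auc(signal_scores, noise_scores):
    """Empirical AUC as the Wilcoxon-Mann-Whitney statistic: the fraction of
    (signal, noise) score pairs the observer ranks correctly, ties counting half."""
    pairs = list(product(signal_scores, noise_scores))
    wins = sum(1.0 if s > n else 0.5 if s == n else 0.0 for s, n in pairs)
    return wins / len(pairs)

# Toy model-observer scores for signal-present and signal-absent images.
print(empirical_auc([2.1, 1.8, 2.5, 1.2], [0.9, 1.5, 1.1, 2.0]))  # 0.8125
```

An AUC of 0.5 corresponds to chance performance and 1.0 to perfect discrimination, which is why the AUC serves as a single quality figure for the imaging chain.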

  6. A simple parametric model observer for quality assurance in computer tomography.

    Science.gov (United States)

    Anton, Mathias; Khanin, Alexander; Kretz, Tobias; Reginatto, Marcel; Elster, Clemens

    2018-02-26

    Model observers are mathematical classifiers that are used for the quality assessment of imaging systems such as computer tomography. The quality of the imaging system is quantified by means of the performance of a selected model observer. For binary classification tasks, the performance of the model observer is defined by the area under its ROC curve (AUC). Typically, the AUC is estimated by applying the model observer to a large set of training and test data. However, the recording of these large data sets is not always practical for routine quality assurance. In this paper we propose as an alternative a parametric model observer that is based on a simple phantom, and we provide a Bayesian estimation of its AUC. It is shown that a limited number of repeatedly recorded images (10-15) is already sufficient to obtain results suitable for the quality assessment of an imaging system. A MATLAB® function is provided for the calculation of the results. The performance of the proposed model observer is compared to that of the established channelized Hotelling observer (CHO) and the nonprewhitening matched filter (NPW) for simulated images as well as for images obtained from a low-contrast phantom on an x-ray tomography scanner. The results suggest that the proposed parametric model observer, along with its Bayesian treatment, can provide an efficient, practical alternative for the quality assessment of CT imaging systems.

  7. Quality assurance

    International Nuclear Information System (INIS)

    Kunich, M.P.; Vieth, D.L.

    1989-01-01

    This paper provides a point/counterpoint view of a quality assurance director and a project manager. It presents numerous aspects of quality assurance requirements along with analyses as to the value of each.

  8. Clinical pharmacology quality assurance program: models for longitudinal analysis of antiretroviral proficiency testing for international laboratories.

    Science.gov (United States)

    DiFrancesco, Robin; Rosenkranz, Susan L; Taylor, Charlene R; Pande, Poonam G; Siminski, Suzanne M; Jenny, Richard W; Morse, Gene D

    2013-10-01

    Among National Institutes of Health HIV Research Networks conducting multicenter trials, samples from protocols that span several years are analyzed at multiple clinical pharmacology laboratories (CPLs) for multiple antiretrovirals. Drug assay data are, in turn, entered into study-specific data sets that are used for pharmacokinetic analyses, merged to conduct cross-protocol pharmacokinetic analysis, and integrated with pharmacogenomics research to investigate pharmacokinetic-pharmacogenetic associations. The CPLs participate in a semiannual proficiency testing (PT) program implemented by the Clinical Pharmacology Quality Assurance program. Using results from multiple PT rounds, longitudinal analyses of recovery are reflective of accuracy and precision within/across laboratories. The objectives of this longitudinal analysis of PT across multiple CPLs were to develop and test statistical models that longitudinally: (1) assess the precision and accuracy of concentrations reported by individual CPLs and (2) determine factors associated with round-specific and long-term assay accuracy, precision, and bias using a new regression model. A measure of absolute recovery is explored as a simultaneous measure of accuracy and precision. Overall, the analysis outcomes assured 97% accuracy (within ±20% of the final target concentration) for all 21 drug concentration results reported for clinical trial samples by multiple CPLs. Using the Clinical Laboratory Improvement Act acceptance criterion of meeting criteria in ≥2 of 3 consecutive rounds, all 10 laboratories that participated in 3 or more rounds per analyte maintained Clinical Laboratory Improvement Act proficiency. Significant associations were present between magnitude of error and CPL (Kruskal-Wallis P < 0.001) and antiretroviral (Kruskal-Wallis P < 0.001).
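The acceptance logic summarized above (results within ±20% of the final target concentration, with criteria met in at least 2 of every 3 consecutive PT rounds) can be sketched as follows; the helper names and example data are hypothetical, and the real CLIA rules are more detailed than this.

```python
def within_acceptance(measured: float, target: float, tol: float = 0.20) -> bool:
    """Accuracy criterion: measured concentration within +/-20% of target."""
    return abs(measured - target) <= tol * target

def maintains_proficiency(rounds, window: int = 3, required: int = 2) -> bool:
    """Proficiency rule sketch: every window of 3 consecutive PT rounds must
    contain at least 2 rounds in which all reported results met the criterion.

    `rounds` is a list of PT rounds; each round is a list of
    (measured, target) concentration pairs for the analytes tested."""
    passed = [all(within_acceptance(m, t) for m, t in r) for r in rounds]
    return all(sum(passed[i:i + window]) >= required
               for i in range(len(passed) - window + 1))

# Hypothetical lab history: one analyte per round, one out-of-range round.
history = [[(95, 100)], [(130, 100)], [(102, 100)], [(99, 100)]]
print(maintains_proficiency(history))  # True
```

Two consecutive failing rounds would break the 2-of-3 rule and flag the laboratory.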

  9. An anaesthesia information management system as a tool for a quality assurance program: 10 years of experience.

    Science.gov (United States)

    Motamed, Cyrus; Bourgain, Jean Louis

    2016-06-01

    Anaesthesia Information Management Systems (AIMS) generate large amounts of data, which might be useful for quality assurance programs. This study was designed to highlight the multiple contributions of our AIMS system in extracting quality indicators over a period of 10 years. The study was conducted from 2002 to 2011. Two methods were used to extract anaesthesia indicators: the manual extraction of individual files for monitoring neuromuscular relaxation, and structured query language (SQL) extraction for the other indicators, which were postoperative nausea and vomiting (PONV), pain and sedation scores, pain-related medications, and postoperative hypothermia. For each indicator, a program of information/meetings and adaptation/suggestions for operating room and PACU personnel was initiated to improve quality assurance, while data were extracted each year. The study included 77,573 patients. The mean overall completeness of data for the initial years ranged from 55 to 85% and was indicator-dependent; completeness then improved to 95% for the last 5 years. The incidence of neuromuscular monitoring was initially 67% and then increased to 95% (P<0.05). The rate of pharmacological reversal remained around 53% throughout the study. Regarding SQL data, an improvement in severe postoperative pain and PONV scores was observed throughout the study, while mild postoperative hypothermia remained a challenge, despite efforts for improvement. The AIMS system permitted the follow-up of certain indicators through manual sampling, and of many more via SQL extraction, in a sustained and non-time-consuming way across years. However, it requires competent and especially dedicated resources to handle the database.

  10. 78 FR 54643 - Proposed Information Collection Request; Comment Request; Laboratory Quality Assurance Evaluation...

    Science.gov (United States)

    2013-09-05

    ... respond to a collection of information unless it displays a currently valid OMB control number. DATES... Paperwork Reduction Act, EPA is soliciting comments and information to enable it to: (i) Evaluate whether...) responsible for auditing Cryptosporidium laboratories; (2) provide written guidance to State/Regional COs; (3...

  11. Small satellite product assurance

    Science.gov (United States)

    Demontlivault, J.; Cadelec, Jacques

    1993-01-01

    In order to increase the interest in small satellites, their cost must be reduced; reducing product assurance costs induced by quality requirements is a major objective. For a logical approach, small satellites are classified in three main categories: satellites for experimental operations with a short lifetime, operational satellites manufactured in small series with long lifetime requirements, and operational satellites (long lifetime required) of which only a few models are produced. The various product assurance requirements are examined for each satellite category: general requirements for the space approach, reliability, electronic components, materials and processes, quality assurance, documentation, tests, and management. An ideal product assurance system integrates quality teams and engineering teams.

  12. Textual information access statistical models

    CERN Document Server

    Gaussier, Eric

    2013-01-01

    This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.

  13. Information Assurance Technologies for the Global Command and Control System (GCCS) Leading Edge Services (LES)

    National Research Council Canada - National Science Library

    O'Brien, Richard

    2001-01-01

    ... (LES) program was sponsored by DARPA's Information Systems Office. This report describes the different technology areas the program encompassed, summarized the major achievements of the program, and documents lessons learned and open issues...

  14. Federal Plan for Cyber Security and Information Assurance Research and Development

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — Powerful personal computers, high-bandwidth and wireless networking technologies, and the widespread use of the Internet have transformed stand-alone computing...

  15. DOD FINANCIAL MANAGEMENT: More Reliable Information Key to Assuring Accountability and Managing Defense Operations More Efficiently

    National Research Council Canada - National Science Library

    1999-01-01

    .... These problems not only hamper the department's ability to produce timely and accurate financial management information, but also significantly impair efforts to improve the economy and efficiency of its operations...

  16. DoD Information Assurance Certification and Accreditation Process (DIACAP) Survey and Decision Tree

    Science.gov (United States)

    2011-07-01

    CVC: Compliance and Validation Certification; DAA: designated accrediting authority; DATO: denial of authorization to operate; DIACAP: DoD Information... standard based on implementation of the best practices listed in paragraph 2.3. c. Direct the DSG to rename the Data Protection Committee to the... Information Grid (GIG)-based environment. Figure A-1. DoD IA program management. 1.1.1 DIACAP Background. a. Interim DIACAP signed 6 July 2006.

  17. Information Assurance as a System of Systems in the Submarine Force

    Science.gov (United States)

    2013-09-01

    Information Technology ambitions was called "Information Technology for the 21st Century" (Vena 1998). Vena focused on competencies of Navy Enlisted... embarking. The report clearly identified training requirements and core competencies for enlisted IT specialists. Vena stated, "Will IT training and... their problems by overemphasizing technology?" (Vena 1998, 2). In other words, the Navy needed to change how it trained those personnel who took care

  18. Statistical Modeling for Quality Assurance of Human Papillomavirus DNA Batch Testing.

    Science.gov (United States)

    Beylerian, Emily N; Slavkovsky, Rose C; Holme, Francesca M; Jeronimo, Jose A

    2018-03-22

    Our objective was to simulate the distribution of human papillomavirus (HPV) DNA test results from a 96-well microplate assay to identify results that may be consistent with well-to-well contamination, enabling programs to apply specific quality assurance parameters. For this modeling study, we designed an algorithm that generated the analysis population of 900,000 to simulate the results of 10,000 microplate assays, assuming discrete HPV prevalences of 12%, 13%, 14%, 15%, and 16%. Using binomial draws, the algorithm created a vector of results for each prevalence and reassembled them into 96-well matrices for results distribution analysis of the number of positive cells and number and size of cell clusters (≥2 positive cells horizontally or vertically adjacent) per matrix. For simulation conditions of 12% and 16% HPV prevalence, 95% of the matrices displayed the following characteristics: 5 to 17 and 8 to 22 total positive cells, 0 to 4 and 0 to 5 positive cell clusters, and largest cluster sizes of up to 5 and up to 6 positive cells, respectively. Our results suggest that screening programs in regions with an oncogenic HPV prevalence of 12% to 16% can expect 5 to 22 positive results per microplate in approximately 95% of assays and 0 to 5 positive result clusters with no cluster larger than 6 positive results. Results consistently outside of these ranges deviate from what is statistically expected and could be the result of well-to-well contamination. Our results provide guidance that laboratories can use to identify microplates suspicious for well-to-well contamination, enabling improved quality assurance.
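The simulation described above can be reproduced in miniature: draw each well of an 8 x 12 plate as an independent Bernoulli trial, then count positives and flood-fill clusters of horizontally or vertically adjacent positive wells. The flagging thresholds below are the approximately 95% ranges reported in the abstract (5 to 22 positives, largest cluster at most 6); the code itself is an illustrative sketch, not the authors' implementation.

```python
import random

ROWS, COLS = 8, 12  # 96-well microplate

def simulate_plate(prevalence: float, rng: random.Random):
    """One plate: each well is HPV-positive with the given prevalence."""
    return [[rng.random() < prevalence for _ in range(COLS)] for _ in range(ROWS)]

def count_positives(plate) -> int:
    return sum(cell for row in plate for cell in row)

def cluster_sizes(plate):
    """Sizes of clusters of >=2 positive wells that are horizontally or
    vertically adjacent, found by flood fill."""
    seen, sizes = set(), []
    for r in range(ROWS):
        for c in range(COLS):
            if plate[r][c] and (r, c) not in seen:
                stack, size = [(r, c)], 0
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < ROWS and 0 <= nx < COLS
                                and plate[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if size >= 2:
                    sizes.append(size)
    return sizes

def suspicious(plate, low=5, high=22, max_cluster=6) -> bool:
    """Flag plates outside the ~95% expected ranges reported in the abstract."""
    n = count_positives(plate)
    return not (low <= n <= high) or any(s > max_cluster for s in cluster_sizes(plate))

rng = random.Random(0)
plate = simulate_plate(0.14, rng)
print(count_positives(plate), cluster_sizes(plate), suspicious(plate))
```

Running many seeded plates per prevalence and tabulating the positive counts and cluster sizes reproduces the kind of reference distribution the study derives.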

  19. STEPP: A Grounded Model to Assure the Quality of Instructional Activities in e-Learning Environments

    Directory of Open Access Journals (Sweden)

    Hamdy AHMED ABDELAZIZ

    2013-07-01

    Full Text Available The present theoretical paper aims to develop a grounded model for designing instructional activities appropriate to e-learning and online learning environments. The suggested model is guided by cognitivist, constructivist, and connectivist learning principles to help online learners construct meaningful experiences and move from knowledge acquisition to knowledge creation. The proposed model consists of five dynamic and grounded domains that assure the quality of designing and using e-learning activities: the social, technological, epistemological, psychological, and pedagogical domains. Each of these domains needs four types of presence to reflect the design and application process of e-learning activities: cognitive presence, human presence, psychological presence, and mental presence. Applying the proposed model (STEPP) throughout all online and adaptive e-learning environments may improve the process of designing and developing e-learning activities to be used as mindtools for current and future learners.

  20. Quality assurance of high education

    Directory of Open Access Journals (Sweden)

    A. M. Aleksankov

    2016-01-01

    European and Russian approaches in Quality assurance will not appear. It means that ways of harmonizing European and Russian requirements for Study Programmes' Quality assurance can be found, and a logical part of implementing international Quality management schemes will be the accreditation of Russian Study Programmes by international organizations and networks. To ensure the effectiveness of such tasks, it is necessary to develop appropriate tools that help to formalize and systematize the procedures of Study Programmes' Quality assurance with regard to the requirements of European standards. The experience of St. Petersburg Polytechnic University (SPbPU) in Quality assurance of Study Programmes is discussed, in particular the development and appraisal of a Technique for monitoring of Study Programmes and of the Model for on-line Quality Assurance of Study Programmes with regard to the requirements of European standards, which were created within the TEMPUS project EQUASP («On-line (Electronic) Quality Assurance of Study Programmes») with the participation of SPbPU. Implementation of the proposed tools ensures the integrity and authenticity of information on all aspects of the realization of the educational process, fulfillment of all-European requirements on Study Programmes' accreditation, and harmonization of the Russian and European Higher education systems, and thus forms the basis for Study Programmes' accreditation by international organizations and networks. The Model for on-line Quality Assurance of Study Programmes is a powerful tool that allows bringing the process of Quality Assurance of Study Programmes into accord with European standards and guidelines, improving the quality of Programmes, and increasing their transparency and comparability.

  1. Modelling Choice of Information Sources

    Directory of Open Access Journals (Sweden)

    Agha Faisal Habib Pathan

    2013-04-01

    Full Text Available This paper addresses the significance of traveller information sources, including mono-modal and multimodal websites, for travel decisions. The research follows a decision paradigm developed earlier, involving an information acquisition process for travel choices, and identifies the abstract characteristics of new information sources that deserve further investigation (e.g. by incorporating these in models and studying their significance in model estimation). A Stated Preference experiment is developed and the utility functions are formulated by expanding the travellers' choice set to include different combinations of sources of information. In order to study the underlying choice mechanisms, the resulting variables are examined in models based on different behavioural strategies, including utility maximisation and minimising the regret associated with the foregone alternatives. This research confirmed that RRM (Random Regret Minimisation) theory can fruitfully be used and can provide important insights for behavioural studies. The study also analyses the properties of travel planning websites and establishes a link between travel choices and the content, provenance, design, presence of advertisements, and presentation of information. The results indicate that travellers give particular credence to government-owned sources and put more importance on their own previous experiences than on any other single source of information. Information from multimodal websites is more influential than that on train-only websites. This in turn is more influential than information from friends, while information from coach-only websites is the least influential. A website with less search time, specific information on users' own criteria, and real-time information is regarded as most attractive.
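The abstract does not spell out its RRM specification; as an assumption, the sketch below uses the classic Chorus (2010) regret function, in which the regret of alternative i sums ln(1 + exp(beta_m * (x_jm - x_im))) over competing alternatives j and attributes m, and the chosen alternative is the one with minimum regret. The attribute values and coefficients are invented for illustration.

```python
import math

def random_regret(attrs, betas):
    """Systematic regret R_i = sum_{j != i} sum_m ln(1 + exp(beta_m * (x_jm - x_im)))
    for each alternative i, given an attribute matrix and taste coefficients."""
    regrets = []
    for i, own in enumerate(attrs):
        r = 0.0
        for j, other in enumerate(attrs):
            if j == i:
                continue
            r += sum(math.log1p(math.exp(b * (other[m] - own[m])))
                     for m, b in enumerate(betas))
        regrets.append(r)
    return regrets

# Two hypothetical information sources scored on two attributes
# (e.g. low search time, real-time content), with invented coefficients.
regrets = random_regret([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.5])
print(regrets.index(min(regrets)))  # index of the least-regret alternative
```

Unlike utility maximisation, regret grows when a foregone alternative beats the chosen one on any attribute, which is what makes RRM sensitive to the composition of the choice set.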

  2. Information systems for administration, clinical documentation and quality assurance in an Austrian disease management programme.

    Science.gov (United States)

    Beck, Peter; Truskaller, Thomas; Rakovac, Ivo; Bruner, Fritz; Zanettin, Dominik; Pieber, Thomas R

    2009-01-01

    5.9% of the Austrian population is affected by diabetes mellitus. Disease Management is a structured treatment approach that is suitable for application to the diabetes mellitus area and often is supported by information technology. This article describes the information systems developed and implemented in the Austrian disease management programme for type 2 diabetes. Several workflows for administration as well as for clinical documentation have been implemented utilizing the Austrian e-Health infrastructure. De-identified clinical data is available for creating feedback reports for providers and programme evaluation.

  3. El Aseguramiento de los Informes de Sostenibilidad: Diferencias Sustanciales con la Auditoría de Cuentas (Sustainability Reports Assurance: Substantial Differences with Financial Auditing)

    Directory of Open Access Journals (Sweden)

    Amaia Zubiaurre

    2015-12-01

    Full Text Available Alongside companies' growing interest in communicating their commitment to sustainability, assurance of the information disclosed has increased, driven by stakeholders' interest in knowing its reliability. Initially, we explain the concept and benefits of sustainability reporting assurance. Subsequently, we focus on the differences between financial auditing and sustainability reports assurance and describe the main international assurance standards. Finally, we explain the main criticisms of assurance and some proposals for improvement. DOWNLOAD THIS PAPER FROM SSRN: http://ssrn.com/abstract=2690158

  4. Information Warfare: Legal, Regulatory, Policy and Organizational Considerations for Assurance. Second Edition.

    Science.gov (United States)

    1996-07-04

    2-2-1 State Computer Crime Statutes ... 2-2-2 Computer Crime Jurisdiction ... 2-4-1 Information Warfare Policy ... infrastructures from physical and cyber threats. * Propose statutory and regulatory changes. The Infrastructure Protection Task Force (IPTF): * Increase

  5. An evaluation model of educational quality assurance at junior high schools

    Directory of Open Access Journals (Sweden)

    Sugiyanta Sugiyanta

    2016-12-01

    Full Text Available The study aimed to develop an appropriate evaluation model of quality assurance (QA) for evaluating the programs of educational QA (EQA) at junior high schools. The study was a research and development study that followed the steps developed by Borg and Gall. The results show that the evaluation model of EQA in junior high schools consists of the implementation of the QA system and the performance of QA. The constructs for the instrument of QA system implementation consisted of planning, implementation, monitoring and evaluation, and revision, based on exploratory factor analysis at the significance level of 0.000. The constructs for the instrument of EQA performance consisted of: resource development; program and activity development; participation, satisfaction, knowledge change, attitude change, and behavior change of the school community; and social, economic, and school environmental development, based on exploratory factor analysis at the significance level of 0.000. The feasibility of the evaluation model is in a good category based on experts’, users’, and practitioners’ judgment and the evidence found in field testing.

  6. Assessment of different models to describe wax precipitation in flow assurance problems

    Energy Technology Data Exchange (ETDEWEB)

    Martos, C.; Coto, B.; Espada, J.J.; Robustillo, M.D. [Rey Juan Carlos Univ., Madrid (Spain). Dept. of Chemical and Environmental Technology; Pena, J.L. [Repsol-YPF, Madrid (Spain). Alfonso Cortina Technology Centre

    2008-07-01

    Paraffinic waxes found in crude oils cause flow assurance problems because these compounds can precipitate when temperature decreases during oil production, transport through pipelines or storage. The key variables involved in the wax precipitation process are the wax appearance temperature (WAT) and the wax precipitation curve (WPC). A good understanding of the liquid-solid equilibrium is required in order to model the precipitation process. However, new experimental data is needed to address this issue, particularly the composition of the raw crude oil, the amount of precipitated waxes against temperature and the nature of such waxes. Most models available in the literature require knowledge of the n-paraffin distribution of the crude oil, which can be determined using different chromatographic techniques. In this study, experimental WAT and WPC were determined by means of a recently developed multistage fractional precipitation procedure. The trapped crude oil in the precipitated mixtures at each temperature was quantified by the 1H NMR technique to establish the true amount of wax precipitated at each temperature. The n-paraffin distribution for the chosen crude oils was determined by chromatographic techniques. The predictive capabilities of the available models were verified by comparing experimental and predicted results. 3 refs.
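
The liquid-solid equilibrium underlying WAT prediction can be illustrated with a highly idealized sketch: assuming an ideal liquid solution and a pure-solid wax phase, the solubility of each n-paraffin follows ln x_i = −(ΔH_f,i/R)(1/T − 1/T_m,i), and the WAT is the highest temperature at which any component reaches its solubility limit. The n-paraffin properties below are hypothetical; the models assessed in the paper are considerably more elaborate:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def component_wat(z, dh_fus, t_melt):
    """Temperature at which an n-paraffin with liquid mole fraction z first
    precipitates, from ideal solubility: ln z = -(dH/R) * (1/T - 1/Tm)."""
    inv_t = 1.0 / t_melt - R * math.log(z) / dh_fus
    return 1.0 / inv_t

# Hypothetical distribution: (mole fraction, heat of fusion J/mol, melting point K)
paraffins = [
    (0.010, 55_000.0, 320.0),  # roughly C20-like
    (0.004, 70_000.0, 335.0),  # roughly C25-like
    (0.001, 90_000.0, 350.0),  # roughly C30-like
]

# WAT is set by the component that precipitates at the highest temperature
wat = max(component_wat(z, dh, tm) for z, dh, tm in paraffins)
```

Note how the heaviest, least abundant paraffin controls the WAT even at a mole fraction of 0.001, which is why an accurate heavy-end n-paraffin distribution matters so much to these models.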

  7. Information Theory: a Multifaceted Model of Information

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2003-06-01

    Full Text Available A contradictory and paradoxical situation that currently exists in information studies can be improved by the introduction of a new approach, called the general theory of information. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. The theory is built as a system of two classes of principles (ontological and axiological) and their consequences. Axiological principles, which explain how to measure and evaluate information and information processes, are presented in the second section of this paper. These principles systematize and unify different approaches, existing as well as possible, to the construction and utilization of information measures. Examples of such measures are Shannon’s quantity of information, the algorithmic quantity of information, and the volume of information. It is demonstrated that all other known directions of information theory may be treated within the general theory of information as particular cases.
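
As a concrete reminder of the first measure mentioned, Shannon's quantity of information for a discrete distribution is the entropy H(p) = −Σ p_i log₂ p_i, sketched below:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss;
# a biased coin carries less, since its outcome is less uncertain.
fair_coin = shannon_entropy([0.5, 0.5])
biased = shannon_entropy([0.9, 0.1])
```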

  8. INFORMATION MODEL OF SOCIAL TRANSFORMATIONS

    Directory of Open Access Journals (Sweden)

    Мария Васильевна Комова

    2013-09-01

    Full Text Available The social transformation is considered as a process of qualitative changes of the society, creating a new level of organization in all areas of life, in different social formations, and in societies of different types of development. The purpose of the study is to create a universal model for studying social transformations based on understanding them as the consequence of information exchange processes in the society. After defining the conceptual model of the study, the author uses the following methods: the descriptive method, analysis, synthesis, and comparison. Information, objectively existing in all elements and systems of the material world, is an integral attribute of the society's transformation as well. The information model of social transformations is based on the definition of the society's transformation as the change in the information that functions in the society’s information space. The study of social transformations is the study of information flows circulating in the society and characterized by different spatial, temporal, and structural states. Social transformations are a highly integrated system of social processes and phenomena, the nature, course and consequences of which are affected by factors representing the whole complex of material objects. The integrated information model of social transformations foresees the interaction of the following components: social memory, information space, and the social ideal. To determine the dynamics and intensity of social transformations the author uses the notions of "information threshold of social transformations" and "information pressure". Thus, the universal nature of information leads to considering social transformations as a system of information exchange processes. Social transformations can be extended to any episteme actualized by social needs. The establishment of an information threshold makes it possible to simulate the course of social development, to predict the

  9. Spatio-Temporal Nonlinear Filtering With Applications to Information Assurance and Counter Terrorism

    Science.gov (United States)

    2011-11-14

    ourselves with LAPD and Long Beach PD in order to develop direct comparisons between our models and real field data from spatially extended urban...otherwise, P_∞(T_A > ν) = λ_A^ν, where ν > 0; in general, lim_{A→∞} λ_A = 1. Also, Pollak and Tartakovsky [127] provide sufficient conditions for λ_A to be an...is to detect the changepoint as soon as possible. However, in many scenarios such as detecting pollutants and biological warfare agents, the change

  10. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence Tools. In order to build analytic reports for Executive Information Systems (EIS) in an organization, a multidimensional model must be designed based on the organization's business model. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.
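
A multidimensional model of the kind surveyed here is commonly organized as a star schema: a fact table of numeric measures keyed by dimensions, which reports then aggregate ("roll up") along those dimensions. A minimal pure-Python sketch with hypothetical sales data (not a model from the paper):

```python
from collections import defaultdict

# Fact table: each row references dimension members and carries a measure
facts = [
    {"quarter": "Q1", "region": "North", "revenue": 120.0},
    {"quarter": "Q1", "region": "South", "revenue": 80.0},
    {"quarter": "Q2", "region": "North", "revenue": 150.0},
    {"quarter": "Q2", "region": "South", "revenue": 95.0},
]

def roll_up(facts, dimension, measure):
    """Aggregate a measure along one dimension (an OLAP roll-up)."""
    totals = defaultdict(float)
    for row in facts:
        totals[row[dimension]] += row[measure]
    return dict(totals)

by_quarter = roll_up(facts, "quarter", "revenue")  # {"Q1": 200.0, "Q2": 245.0}
by_region = roll_up(facts, "region", "revenue")    # {"North": 270.0, "South": 175.0}
```

The same fact table supports aggregation along any dimension, which is what makes the multidimensional model suitable for the ad-hoc strategic queries EIS reports serve.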

  11. Regional technical cooperation model project, IAEA - RER/2/2004 'Quality control and quality assurance for nuclear analytical techniques'

    International Nuclear Information System (INIS)

    Arikan, P.

    2002-01-01

    An analytical laboratory should produce high-quality analytical data through the use of analytical measurements that are accurate, reliable and adequate for the intended purpose. This objective can be accomplished in a cost-effective manner under a planned and documented quality system of activities. It is well known that serious deficiencies can occur in laboratory operations when insufficient attention is given to the quality of the work. Quality requires not only a thorough knowledge of the laboratory's purpose and operation, but also the dedication of the management and operating staff to standards of excellence. Laboratories employing nuclear and nuclear-related analytical techniques are sometimes confronted with performance problems which prevent them from becoming accepted and respected by clients, such as industry, government and regulatory bodies, and from being eligible for contracts. The International Standard ISO 17025 has been produced as the result of extensive experience in the implementation of ISO/IEC Guide 25:1990 and EN 45001:1989, both of which it now replaces. It contains all of the requirements that testing and calibration laboratories must meet if they wish to demonstrate that they operate a quality system that is technically competent and are able to generate technically valid results. The use of ISO 17025 should facilitate cooperation between laboratories and other bodies to assist in the exchange of information and experience, and in the harmonization of standards and procedures. IAEA model project RER/2/004, entitled 'Quality Assurance/Quality Control in Nuclear Analytical Techniques', was initiated in 1999 as a Regional TC project in East European countries to assist Member State laboratories in the region to install a complete quality system according to the ISO/IEC 17025 standard. 12 laboratories from 11 countries, plus the Agency's Laboratories in Seibersdorf, have been selected as participants to undergo exercises and training with the

  12. An Analysis of Department of Defense Instruction 8500.2 'Information Assurance (IA) Implementation.'

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip LaRoche

    2012-01-01

    The Department of Defense (DoD) provides its standard for information assurance in its Instruction 8500.2, dated February 6, 2003. This Instruction lists 157 'IA Controls' for nine 'baseline IA levels.' Aside from distinguishing IA Controls that call for elevated levels of 'robustness' and grouping the IA Controls into eight 'subject areas', 8500.2 does not examine the nature of this set of controls, determining, for example, which controls do not vary in robustness, how this set of controls compares with other such sets, or even which controls are required for all nine baseline IA levels. This report analyzes (1) the IA Controls, (2) the subject areas, and (3) the baseline IA levels. For example, this report notes that there are only 109 core IA Controls (which this report refers to as 'ICGs'), that 43 of these core IA Controls apply without variation to all nine baseline IA levels, and that an additional 31 apply with variations. This report maps the IA Controls of 8500.2 to the controls in NIST 800-53 and ITGI's CoBIT. The result of this analysis and mapping, as shown in this report, serves as a companion to 8500.2. (An electronic spreadsheet accompanies this report.)

  13. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  14. Compass model-based quality assurance for stereotactic VMAT treatment plans.

    Science.gov (United States)

    Valve, Assi; Keyriläinen, Jani; Kulmala, Jarmo

    2017-12-01

    The aim was to use Compass as a model-based quality assurance (QA) tool for stereotactic body radiation therapy (SBRT) and stereotactic radiation therapy (SRT) volumetric modulated arc therapy (VMAT) treatment plans calculated with the Eclipse treatment planning system (TPS). Twenty clinical stereotactic VMAT SBRT and SRT treatment plans were blindly selected for evaluation, covering four treatment sites: prostate, brain, lung and body. The plans were evaluated against dose-volume histogram (DVH) parameters and 2D and 3D gamma analysis. The dose calculated with the Eclipse TPS was compared to the Compass calculated dose (CCD) and the Compass reconstructed dose (CRD). The maximum differences in the mean dose of the planning target volume (PTV) were 2.7 ± 1.0% between the AAA and Acuros XB calculation algorithm TPS doses, -7.6 ± 3.5% between the Eclipse TPS dose and the CCD dose, and -5.9 ± 3.7% between the Eclipse TPS dose and the CRD dose for both Eclipse calculation algorithms, respectively. 2D gamma analysis was not able to identify all the cases that 3D gamma analysis flagged for further verification. Compass is suitable for QA of SBRT and SRT treatment plans. However, the QA process should include a wide set of DVH-based dose parameters, and 3D gamma analysis should be the preferred method when performing clinical patient QA. The results suggest that Compass should not be used for field sizes smaller than 3 × 3 cm², or that the beam model should be adjusted separately for small (FS ≤ 3 cm) and large (FS > 3 cm) field sizes.
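
Gamma analysis, used above to compare Eclipse and Compass doses, combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A simplified 1D global-gamma sketch with hypothetical dose profiles (not the Compass implementation; clinical tools work on 2D/3D grids with interpolation):

```python
import math

def gamma_pass_rate(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """1D global gamma: for each reference point, take the minimum over
    evaluated points of sqrt((dose diff / (dd*Dmax))^2 + (distance / dta)^2).
    Returns the fraction of reference points with gamma <= 1."""
    d_max = max(ref_dose)
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        g = min(
            math.sqrt(((ed - rd) / (dd * d_max)) ** 2 + ((ep - rp) / dta) ** 2)
            for ep, ed in zip(eval_pos, eval_dose)
        )
        gammas.append(g)
    return sum(g <= 1.0 for g in gammas) / len(gammas)

# Hypothetical profiles sampled every 1 mm: a uniform 1% dose offset passes
positions = list(range(11))
planned = [100.0] * 11
measured = [101.0] * 11
pass_rate = gamma_pass_rate(positions, planned, positions, measured)
```

With a 3%/3 mm criterion the 1% offset yields a 100% pass rate, while a 10% offset fails every point, which mirrors how the criterion trades off dose error against spatial displacement.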

  15. UST Financial Assurance Information

    Data.gov (United States)

    U.S. Environmental Protection Agency — Subtitle I of the Resource Conservation and Recovery Act, as amended by the Hazardous Waste Disposal Act of 1984, brought underground storage tanks (USTs) under...

  16. Quality assurance

    International Nuclear Information System (INIS)

    1996-01-01

    The main efforts of the Nuclear Regulatory Authority of the Slovak Republic (NRA SR) were focused on supporting the development of quality assurance programmes at the responsible organizations, the Bohunice V-1 and V-2 and Mochovce NPPs, and on their inspection. Development of the level-two documentation of a partial quality assurance programme for NPP operation continued at Mochovce NPP. Most of the documentation has been submitted to NRA SR for comments and approval. NRA SR invited a mission of French experts to Mochovce NPP to review the preparation and performance of internal audits, which should be beneficial for improving this kind of activity at the NPP. Bohunice NPP continued the development of a partial quality assurance programme for operation. The Quality Assurance Programme was submitted to NRA SR for approval. Based on a request of the Bohunice NPPs, NRA SR consulted on the draft quality assurance programme developed by Siemens for the 'Basic Design' stage of the V-1 NPP upgrading. The programme had not been submitted for approval to NRA SR prior to completion of the works by Siemens. Based on an internal audit that had been performed, corrective measures were proposed to meet requirements on the review and approval of suppliers' quality assurance programmes. Requirements related to quality assurance at nuclear installations were prepared to be incorporated into the principles of an act on the peaceful use of nuclear power in the Slovak Republic.

  17. DOE financial assurance presentation

    International Nuclear Information System (INIS)

    Huck, R.

    1990-01-01

    The presentation topic is California's approach to license application review in meeting financial assurances for the proposed Ward Valley site. The purpose of the presentation is to provide information on the specific financial assurance provisions contained in 10 CFR Part 61 and how California intends to satisfy those requirements. Also, as rate setter, California intends to demonstrate how it will assure allowable costs to the rate base through a financial prudency review. The key provisions of financial assurance are: 10 CFR Section 61.61 - This provision requires an applicant to demonstrate its ability to finance licensed activities; 10 CFR Section 61.62 - This provision requires an applicant to provide assurance that sufficient funds will be available for site closure and stabilization; and 10 CFR Section 61.63 - This provision requires an applicant to provide 'a copy of a binding arrangement, such as a lease, between the applicant and the disposal site owner, so that sufficient funds will be available to cover the costs of the institutional control period.' To assist California in determining the financial assurance compliance to be demonstrated by the applicant for the Part 61 requirements, there is NUREG guidance document 1199, 'Standard Format and Content of a License Application for a Low-Level Radioactive Waste (LLRW) Disposal Facility.' The detailed financial assurance provisions of NUREG 1199 are in turn embodied in NUREG 1200, 'Standard Review Plan for the Review of a License Application for a LLRW Disposal Facility.'

  18. The Information Technology Model Curriculum

    Science.gov (United States)

    Ekstrom, Joseph J.; Gorka, Sandra; Kamali, Reza; Lawson, Eydie; Lunt, Barry; Miller, Jacob; Reichgelt, Han

    2006-01-01

    The last twenty years has seen the development of demand for a new type of computing professional, which has resulted in the emergence of the academic discipline of Information Technology (IT). Numerous colleges and universities across the country and abroad have responded by developing programs without the advantage of an existing model for…

  19. Data Quality Objectives and Criteria for Basic Information, Acceptable Uncertainty, and Quality-Assurance and Quality-Control Documentation

    Science.gov (United States)

    Granato, Gregory E.; Bank, Fred G.; Cazenas, Patricia A.

    1998-01-01

    The Federal Highway Administration and State transportation agencies have the responsibility of determining and minimizing the effects of highway runoff on water quality; therefore, they have been conducting an extensive program of water-quality monitoring and research during the last 25 years. The objectives and monitoring goals of highway runoff studies have been diverse, because the highway community must address many different questions about the characteristics and impacts of highway runoff. The Federal Highway Administration must establish that available data and procedures that are used to assess and predict pollutant loadings and impacts from highway stormwater runoff are valid, current, and technically supportable. This report examines criteria for evaluating water-quality data and resultant interpretations. The criteria used to determine if data are valid (useful for intended purposes), current, and technically supportable are derived from published materials from the Federal Highway Administration, the U.S. Environmental Protection Agency, the Intergovernmental Task Force on Monitoring Water Quality, the U.S. Geological Survey and from technical experts throughout the U.S. Geological Survey. Water-quality data that are documented to be meaningful, representative, complete, precise, accurate, comparable, and admissible as legal evidence will meet the scientific, engineering, and regulatory needs of highway agencies. Documentation of basic information, such as compatible monitoring objectives and program design features; metadata (when, where, and how data were collected as well as who collected and analyzed the data); ancillary information (explanatory variables and study-site characteristics); and legal requirements are needed to evaluate data. 
Documentation of sufficient quality-assurance and quality-control information to establish the quality and uncertainty in the data and interpretations also are needed to determine the comparability and utility of

  20. Solving multi-customer FPR model with quality assurance and discontinuous deliveries using a two-phase algebraic approach.

    Science.gov (United States)

    Chiu, Yuan-Shyi Peter; Chou, Chung-Li; Chang, Huei-Hsin; Chiu, Singa Wang

    2016-01-01

    A multi-customer finite production rate (FPR) model with quality assurance and a discontinuous delivery policy was investigated in a recent paper (Chiu et al. in J Appl Res Technol 12(1):5-13, 2014) using a differential calculus approach. This study employs mathematical modeling along with a two-phase algebraic method to resolve this specific multi-customer FPR model. As a result, the optimal replenishment lot size and number of shipments can be derived without using differential calculus. Such a straightforward method may assist practitioners with insufficient knowledge of calculus in learning and managing real multi-customer FPR systems more effectively.
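
For orientation, the classic single-customer finite production rate (EPQ) lot size that such models generalize is Q* = sqrt(2KD / (h(1 − D/P))). The sketch below computes only this textbook baseline with hypothetical parameters; the paper's multi-customer model with quality assurance and discontinuous deliveries involves additional cost terms:

```python
import math

def epq_lot_size(demand, setup_cost, holding_cost, production_rate):
    """Textbook finite-production-rate (EPQ) lot size.
    demand D and production_rate P share the same time unit, D < P."""
    return math.sqrt(
        2.0 * setup_cost * demand
        / (holding_cost * (1.0 - demand / production_rate))
    )

# Hypothetical parameters: D=3000/yr, K=$450/setup, h=$9/unit/yr, P=10000/yr
q_star = epq_lot_size(3000.0, 450.0, 9.0, 10000.0)
```

Because the (1 − D/P) factor discounts holding cost during the production run, the EPQ lot size always exceeds the corresponding EOQ for the same D, K and h.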

  1. [Model of a uniform system for surgical statistics and quality assurance in surgery].

    Science.gov (United States)

    Herwig, H; Wolff, H; Gastinger, I; Lippert, H

    1987-01-01

    This is a progress report on work carried out to draft a coherent system for documentation and quality assurance in surgery. Methods and value of the system are described, with particular reference being made to results obtained from pilot studies in the Region of Suhl.

  2. Quality assurance

    OpenAIRE

    Cauchi, Maurice A.M.

    1993-01-01

    The concept of quality assurance refers more specifically to the process of objectifying and clearly enunciating goals, and providing means of assessing the outcomes. In this article the author mentions four fundamental elements of quality assurance which should be applied in the medical profession in Malta. These elements should relate to professional performance, resource utilisation, risk management and patient satisfaction. The aim of the medical professionals in Malta is to provide the b...

  3. Model for deployment of a Quality Assurance System in the nuclear fuel cycle facilities using Project Management techniques

    International Nuclear Information System (INIS)

    Lage, Ricardo F.; Ribeiro, Saulo F.Q.

    2015-01-01

    Nuclear Safety is the main goal in any nuclear facility. In this sense, the Norm CNEN-NN-1.16 classifies quality assurance as a management system to be deployed and implemented by the organization to achieve safety goals. Quality Assurance is a set of systematic and planned actions necessary to provide adequate confidence that a structure, system, component or installation will work satisfactorily in service. Hence, the Quality Assurance System (QAS) is a complete and comprehensive methodology, going far beyond a quality management plan from the perspective of project management. Fundamentally, the QAS requirements cover all activities that influence quality, involving organization, human resources, procurement, nuclear safety, projects, procedures and communication. Coordination of all these elements requires a great effort by the responsible team, because it usually involves different areas and different levels of hierarchy within the organization. The objectives and desired benefits should be well set for everyone to understand what is to be achieved and how to achieve it. The support of senior management is critical at this stage, providing the guidelines and resources necessary for the work to proceed clearly and efficiently, on time, within cost and with a defined scope. The methodology of project management processes can be applied to facilitate and expedite the implementation of this system. Many of the principles of the QAS are correlated with knowledge areas of project management. The proposed model for implementation of a QAS in nuclear fuel cycle facilities considered the best project management practices according to the Project Management Body of Knowledge (PMBOK, 5th edition) of the Project Management Institute (PMI), which is widely regarded as good practice around the world. Since the model was defined, the deployment process becomes more practical and efficient, providing reduction in deployment time, better management of human

  4. Model for deployment of a Quality Assurance System in the nuclear fuel cycle facilities using Project Management techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lage, Ricardo F.; Ribeiro, Saulo F.Q., E-mail: rflage@gmail.com, E-mail: quintao.saulo@gmail.com [Industrias Nucleares do Brasil (INB), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Safety is the main goal in any nuclear facility. In this sense, the Norm CNEN-NN-1.16 classifies quality assurance as a management system to be deployed and implemented by the organization to achieve safety goals. Quality Assurance is a set of systematic and planned actions necessary to provide adequate confidence that a structure, system, component or installation will work satisfactorily in service. Hence, the Quality Assurance System (QAS) is a complete and comprehensive methodology, going far beyond a quality management plan from the perspective of project management. Fundamentally, the QAS requirements cover all activities that influence quality, involving organization, human resources, procurement, nuclear safety, projects, procedures and communication. Coordination of all these elements requires a great effort by the responsible team, because it usually involves different areas and different levels of hierarchy within the organization. The objectives and desired benefits should be well set for everyone to understand what is to be achieved and how to achieve it. The support of senior management is critical at this stage, providing the guidelines and resources necessary for the work to proceed clearly and efficiently, on time, within cost and with a defined scope. The methodology of project management processes can be applied to facilitate and expedite the implementation of this system. Many of the principles of the QAS are correlated with knowledge areas of project management. The proposed model for implementation of a QAS in nuclear fuel cycle facilities considered the best project management practices according to the Project Management Body of Knowledge (PMBOK, 5th edition) of the Project Management Institute (PMI), which is widely regarded as good practice around the world. Since the model was defined, the deployment process becomes more practical and efficient, providing reduction in deployment time, better management of human

  5. Building Information Modeling Comprehensive Overview

    Directory of Open Access Journals (Sweden)

    Sergey Kalinichuk

    2015-07-01

    Full Text Available The article provides a comprehensive review of the recently accelerated development of Information Technology within the project market, including industrial, engineering, procurement and construction. The author's aim is to cover the last decades of growth of Information and Communication Technology in the construction industry, in particular Building Information Modeling, and to show that the problem of choosing an effective project realization method has not only retained its urgency but has become one of the major conditions of intensive technology development. All of this has created a great impulse toward shortening project duration and has led to the development of various schedule compression techniques, which have become a focus of modern construction.

  6. Flow assurance

    Energy Technology Data Exchange (ETDEWEB)

    Mullins, O.C.; Dong, C. [Schlumberger-Doll Research Center, Cambridge, MA (United States); Elshahawi, H. [Shell Exploration and Production Company, The Hague (Netherlands)

    2008-07-01

    This study emphasized the need for considering flow assurance for producing oil and gas, particularly in high cost areas such as deepwater. Phase behaviour studies, sticking propensities, and interfacial interactions have been investigated in many laboratory studies using asphaltenes, wax, hydrates, organic and inorganic scale, and even diamondoids. However, the spatial variation of reservoir fluids has received little attention, despite the fact that it is one of the most important factors affecting flow assurance. This issue was difficult to address in a systematic way in the past because of cost constraints. Today, reservoir fluid variation and flow assurance can be considered at the outset of a project given the technological advances in downhole fluid analysis. This study described the origins of reservoir fluid compositional variations and the controversies surrounding them. It also described the indispensable chemical analytical technology. The impact of these reservoir fluid compositional variations on flow assurance considerations was also discussed. A methodology that accounts for these variations at the outset in flow assurance evaluation was also presented.

  7. Financial assurances

    International Nuclear Information System (INIS)

    Paton, R.F.

    1990-01-01

    US Ecology is a full service waste management company. The company operates two of the nation's three existing low-level radioactive waste (LLRW) disposal facilities and has prepared and submitted license applications for two new LLRW disposal facilities in California and Nebraska. The issue of financial assurances is an important aspect of site development and operation. Proper financial assurances help to ensure that uninterrupted operation, closure and monitoring of a facility will be maintained throughout the project's life. Unfortunately, this aspect of licensing is not like others, where acceptance can be gauged by examining approved computer codes, site performance standards or specific technical formulas. There is no standard financial assurance plan. Each site should develop its requirements based upon the conditions of the site, the type of design, existing state or federal controls, and realistic assessments of future financial needs. Financial assurances at US Ecology's existing sites in Richland, Washington, and Beatty, Nevada, have been in place for several years and are accomplished in a variety of ways: corporate guarantees, corporate capital funds, third party liability insurance, and post-closure/long-term care funds. In addressing financial assurances, one can divide the issue into three areas: site development/operations, third party damages, and long-term care/cleanup.

  8. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics from Common Criteria/ISO15408, the Centre for Internet Security guidelines, and NSA configuration guidelines, together with the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and of architectural guidelines such as ISO7498-2 will be explained. Business-process-level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the presentation will explore the appropriate usage of these standards, and the paper will discuss the standards' approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all of the mentioned standards. The proposed approach's 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and of a defined risk and security space for modeling and measuring.

  9. Study on quality assurance for high-level radioactive waste disposal project (2). Quality assurance system for the site characterization phase in the Yucca Mountain Project

    International Nuclear Information System (INIS)

    Takada, Susumu

    2006-01-01

    The objective of this report is to assist related organizations in the development of quality assurance systems for a high-level radioactive waste disposal system. This report presents detailed information with which related organizations can begin the development of quality assurance systems at an initial phase of repository development for a high-level radioactive waste disposal program, including data qualification, model validation, systems and facilities for quality assurance (e.g., technical data management system, sample management facility, etc.), and QA program applicability (items and activities). These descriptions are based on information in the QA program for the Yucca Mountain Project (YMP), such as the U.S. Department of Energy (DOE) Quality Assurance Requirements and Description (QARD), DOE/RW-0333P, quality implementing procedures, and reports produced under those procedures. Additionally, this report includes some brief recommendations for the development of quality assurance systems, such as the establishment of quality assurance requirements and measures for establishing a QA system. (author)

  10. Migration modelling as a tool for quality assurance of food packaging.

    Science.gov (United States)

    Brandsch, J; Mercea, P; Rüter, M; Tosa, V; Piringer, O

    2002-01-01

    The current potential for the use of migration modelling for studying polyolefin packaging materials (low- and high-density polyethylene and polypropylene) is summarized and demonstrated with practical examples. For these polymers, an upper limit of migration into foodstuffs can be predicted with a high degree of statistical confidence. The only analytical information needed for modelling in such cases is the initial concentration of the migrant in the polymer matrix. For polyolefins of unknown origin or newly developed materials with new properties, a quick experimental method is described for obtaining the characteristic matrix parameter needed for migration modelling. For easy handling of both the experimental results and the diffusion model, user-friendly software has been developed. An additional aim of the described method is the determination of the migrant partition between polymer and food or food simulant, and of the specific contribution of the migrant's molecular structure to the diffusion coefficient. For migration modelling of packaging materials with multilayer structures, a numerical solution of the diffusion equation is described. This procedure has also been applied for modelling migration into solid or highly viscous foodstuffs.
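The kind of worst-case estimate described above can be sketched numerically. The following is an illustrative sketch only, assuming simple Fickian diffusion from a semi-infinite polymer layer and a Piringer-type upper-bound correlation for the diffusion coefficient; the coefficient value (`a_p = 11.5`, often quoted for LDPE) and the function names are assumptions, not taken from the paper:

```python
import math

def piringer_diffusion_coeff(molar_mass, temp_kelvin, a_p=11.5):
    """Upper-bound diffusion coefficient (cm^2/s) from a Piringer-type
    correlation; a_p = 11.5 is a value often quoted for LDPE (illustrative)."""
    return 1e4 * math.exp(a_p - 0.1351 * molar_mass ** (2.0 / 3.0)
                          + 0.003 * molar_mass - 10454.0 / temp_kelvin)

def migration_per_area(c_p0, diff_coeff, t_seconds):
    """Short-time worst-case specific migration (mg/cm^2) from a
    semi-infinite polymer layer: m/A = 2 * c_p0 * sqrt(D * t / pi),
    with c_p0 the initial migrant concentration in mg per cm^3 of polymer.
    Valid only while a small fraction of the migrant has left the polymer."""
    return 2.0 * c_p0 * math.sqrt(diff_coeff * t_seconds / math.pi)

# Example: a 200 g/mol additive in LDPE at 40 degC, 10 days of contact
D = piringer_diffusion_coeff(200.0, 313.15)
m = migration_per_area(0.5, D, 10 * 24 * 3600)
```

Note the square-root time dependence: quadrupling the contact time doubles the predicted migration, which is characteristic of the short-time diffusion regime.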

  11. Quality assurance manual: Volume 2, Appendices

    International Nuclear Information System (INIS)

    Oijala, J.E.

    1988-06-01

    This paper contains quality assurance information for departments of the Stanford Linear Accelerator Center. The particular quality assurance policies and standards discussed cover: Mechanical Systems; the Klystron and Microwave Department; the Electronics Department; Plant Engineering; the Accelerator Department; Purchasing; and the Experimental Facilities Department.

  12. Spiral model pilot project information model

    Science.gov (United States)

    1991-01-01

    The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.
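The idea of an IM as a set of product and process metrics measured and reported over a project's life can be sketched as a small data structure. All metric names and thresholds below are hypothetical illustrations, not taken from the NASA report:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    """One IM entry: a measurement reported throughout the project."""
    name: str
    kind: str              # "product" (e.g. defect density) or "process"
    target: float          # programmatic guideline the value must stay under
    history: list = field(default_factory=list)

    def report(self, value):
        """Record a measurement; return True if it meets the guideline."""
        self.history.append(value)
        return value <= self.target

# Hypothetical information model for one spiral
im = [
    Metric("open_defects_per_ksloc", "product", target=2.0),
    Metric("requirements_volatility_pct", "process", target=10.0),
]
within_guidelines = all(m.report(v) for m, v in zip(im, [1.4, 12.5]))
```

Tracking both kinds of metric in one structure mirrors the SM's point that the IM measures the process as well as the product.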

  13. Possibilities for Using TAM and Technology Frames Models to Assess the Acceptance of New Technologies in the Chilean Higher Education Quality Assurance

    Directory of Open Access Journals (Sweden)

    Luis González-Bravo

    2015-05-01

    Full Text Available This essay reviews the importance of assessing the degree of acceptance of new technologies in the Chilean higher education institutions, as an input for managing quality assurance. Technology Acceptance and Technology Frames models are described, emphasizing their benefits in this field. Understanding and facilitating the process of new technologies acceptance in the organizations, by identifying those elements which hinder it, allows improving the implementation of quality assurance mechanisms in order to make the educational process more efficient and effective.

  14. Importance of information about distribution for assuring secured and safe food! : What is food traceability system? Considering from an approach of Mr. Kazuo Sawauchi, Manager, Advanced Automation Company, Yamatake Corporation

    Science.gov (United States)

    Morita, Utako


  15. Quality assurance

    International Nuclear Information System (INIS)

    Gillespie, B.M.; Gleckler, B.P.

    1995-01-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results

  16. Software Quality Assurance Audits Guidebooks

    Science.gov (United States)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  17. Printed Circuit Board Quality Assurance

    Science.gov (United States)

    Sood, Bhanu

    2016-01-01

    PCB assurance summary: PCB assurance activities are informed by risk in the context of the project, and lessons learned are applied across projects for continuous improvement. Newer component technologies and smaller, finer-pitch devices lead to tighter and more demanding PCB designs, identifying new research areas in materials, designs, structures and test methods.

  18. Nuclear fuel quality assurance

    International Nuclear Information System (INIS)

    1976-01-01

    Full text: Quality assurance is used extensively in the design, construction and operation of nuclear power plants. This methodology is applied to all activities affecting the quality of a nuclear power plant in order to obtain confidence that an item or a facility will perform satisfactorily in service. Although the achievement of quality is the responsibility of all parties participating in a nuclear power project, establishment and implementation of the quality assurance programme for the whole plant is a main responsibility of the plant owner. For the plant owner, the main concern is to achieve control over the quality of purchased products or services through contractual arrangements with the vendors. In the case of purchase of nuclear fuel, the application of quality assurance might be faced with several difficulties because of the lack of standardization in nuclear fuel and the proprietary information of the fuel manufacturers on fuel design specifications and fuel manufacturing procedures. The problems of quality assurance for purchase of nuclear fuel were discussed in detail during the seminar. Due to the lack of generally acceptable standards, the successful application of the quality assurance concept to the procurement of fuel depends on how much information can be provided by the fuel manufacturer to the utility which is purchasing fuel, and in what form and how early this information can be provided. The extent of information transfer is basically set out in the individual vendor-utility contracts, with some indirect influence from the requirements of regulatory bodies. Any conflict that exists appears to come from utilities which desire more extensive control over the product they are buying. There is a reluctance on the part of vendors to permit close insight of the purchasers into their design and manufacturing procedures, but there nevertheless seems to be an increasing trend towards release of more information to the purchasers. 
It appears that

  19. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existent theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. Quality assurance handbook for measurement laboratories

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1984-10-01

    This handbook provides guidance in the application of quality assurance to measurement activities. It is intended to help those persons making measurements in applying quality assurance to their work activities by showing how laboratory practices and quality assurance requirements are integrated to provide control within those activities. The use of the guidance found in this handbook should help provide consistency in the interpretation of quality assurance requirements across all types of measurement laboratories. This handbook also can assist quality assurance personnel in understanding the relationships between laboratory practices and quality assurance requirements. The handbook is composed of three chapters and several appendices. Basic guidance is provided by the three chapters. In Chapter 1, the role of quality assurance in obtaining quality data and the importance of such data are discussed. Chapter 2 presents the elements of laboratory quality assurance in terms of practices that can be used in controlling work activities to assure the acquisition of quality data. Chapter 3 discusses the implementation of laboratory quality assurance. The appendices provide supplemental information to give the users a better understanding of the following: what is quality assurance; why quality assurance is required; where quality assurance requirements come from; how those requirements are interpreted for application to laboratory operations; how the elements of laboratory quality assurance relate to various laboratory activities; and how a quality assurance program can be developed

  1. Reservoir Model Information System: REMIS

    Science.gov (United States)

    Lee, Sang Yun; Lee, Kwang-Wu; Rhee, Taehyun; Neumann, Ulrich

    2009-01-01

    We describe a novel data visualization framework named Reservoir Model Information System (REMIS) for the display of complex and multi-dimensional data sets in oil reservoirs. It aims to facilitate visual exploration and analysis of data sets, as well as user collaboration. Our framework consists of two main modules: the data access point module and the data visualization module. In the data access point module, the Phrase-Driven Grammar System (PDGS) is adopted to help users formulate data queries and visualization descriptions by selecting graphical icons in a menu or on a map with step-by-step visual guidance; it also integrates data source applications and external visualization tools. For the data visualization module, we implemented our first prototype of an interactive volume viewer named REMVR to classify and visualize geo-spatial data sets. By combining PDGS and REMVR, REMIS assists users in describing visualizations and exploring data so that they can easily find desired data and uncover interesting or meaningful relationships, including trends and exceptions, in oil reservoir model data.
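The step-by-step, menu-guided query formulation that PDGS provides can be illustrated with a toy phrase grammar. Everything here (the grammar, the phrase names, and the helper function) is an invented sketch of the general idea, not the actual REMIS interface:

```python
# Toy phrase grammar: each phrase lists the phrases that may follow it.
GRAMMAR = {
    "<start>":    ["show"],
    "show":       ["porosity", "saturation"],  # data-set phrases
    "porosity":   ["in"],
    "saturation": ["in"],
    "in":         ["block_A", "block_B"],      # region phrases (icons on a map)
}

def next_choices(phrase):
    """Menu icons that are legal after the current partial phrase."""
    last = phrase[-1] if phrase else "<start>"
    return GRAMMAR.get(last, [])

# Building a query one guided step at a time: at each step the menu
# offers only the grammatically legal continuations.
query = []
for icon in ("show", "porosity", "in", "block_A"):
    assert icon in next_choices(query)
    query.append(icon)
```

Because the grammar constrains each step, the user can only ever assemble a well-formed query, which is the point of phrase-driven guidance.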

  2. Parsimonious Language Models for Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Robertson, Stephen; Zaragoza, Hugo

    We systematically investigate a new approach to estimating the parameters of language models for information retrieval, called parsimonious language models. Parsimonious language models explicitly address the relation between levels of language models that are typically used for smoothing. As such,
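The record is truncated, but the core estimation idea in the parsimonious language modeling literature is an EM procedure that shifts probability mass away from terms the background (collection) model already explains well. A minimal sketch, with illustrative parameter values (the smoothing weight `lam` and pruning threshold `eps` are assumptions):

```python
from collections import Counter

def parsimonious_lm(doc_tokens, background, lam=0.1, iters=20, eps=1e-4):
    """EM estimate of a parsimonious document model.

    E-step: credit each term occurrence to the document model only in
    proportion to how much better it explains the term than the background
    model does; M-step: renormalize and prune near-zero terms.
    `background` must give a nonzero probability to every document term.
    """
    tf = Counter(doc_tokens)
    p = {t: c / len(doc_tokens) for t, c in tf.items()}  # init: MLE
    for _ in range(iters):
        e = {t: tf[t] * lam * p[t] / (lam * p[t] + (1 - lam) * background[t])
             for t in p}
        total = sum(e.values())
        p = {t: v / total for t, v in e.items() if v / total > eps}
    return p

# A stop-word that is frequent in the background loses most of its mass:
bg = {"the": 0.5, "reservoir": 0.01}
model = parsimonious_lm(["the", "the", "reservoir"], bg)
```

The pruning step is what makes the model parsimonious: terms whose occurrences are fully explained by the background end up with (near) zero probability and are dropped.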

  3. [New avenues to quality assurance--a model project for recording bedsore incidence].

    Science.gov (United States)

    Steingass, S; Klein, B; Hube, G; Pavel, K; Walter, K; Weiss, V

    2002-11-01

    Bedsores can usually be avoided by adequate care and preventive measures. In the context of the local agenda process, a local district office (Landratsamt) and inspection units initiated a variety of activities to raise awareness in health institutions and to contribute to an increase in the quality of life of the persons concerned. Nearly all nursing care homes, domiciliary services and hospitals participated in a pilot study, which was accompanied by the Fraunhofer IAO in Stuttgart. The objectives of the pilot study were to implement internal quality assurance, to sensitise staff to the topic, and to collect comparable data to enable benchmarking. Using a software tool, the institutions recorded data on care days and on days spent with bedsores, broken down by care level, from July until September 2001. The major result was that, although the institutions had already shown a decreasing incidence of bedsores since discussion of the project began, bedsore quotas could be further decreased from 2.15 to 1.84%.

  4. Refreshing Information Literacy: Learning from Recent British Information Literacy Models

    Science.gov (United States)

    Martin, Justine

    2013-01-01

    Models play an important role in helping practitioners implement and promote information literacy. Over time models can lose relevance with the advances in technology, society, and learning theory. Practitioners and scholars often call for adaptations or transformations of these frameworks to articulate the learning needs in information literacy…

  5. Item Information in the Rasch Model

    NARCIS (Netherlands)

    Engelen, Ron J.H.; van der Linden, Willem J.; Oosterloo, Sebe J.

    1988-01-01

    Fisher's information measure for the item difficulty parameter in the Rasch model and its marginal and conditional formulations are investigated. It is shown that expected item information in the unconditional model equals information in the marginal model, provided the assumption of sampling
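The truncated record concerns Fisher information for the item difficulty parameter in the Rasch model. As a sketch of the underlying quantity: for a single dichotomous response, the information about the difficulty b (like that about the ability theta) is P(1 − P), peaking where ability matches difficulty. Function names here are illustrative:

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a correct response,
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information about the difficulty parameter b contributed
    by one response from an examinee of ability theta: I = P * (1 - P)."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

# Information is maximal (0.25) when theta == b, i.e. P = 0.5,
# and falls off symmetrically as the ability-difficulty gap grows.
```

Because I depends on theta and b only through their difference, information about b under a given sampling of abilities mirrors the familiar test-information results for theta.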

  6. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  7. Vega flow assurance system

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Marit; Munaweera, Sampath

    2010-07-01

    Vega is a gas condensate field located off the west coast of Norway and developed as a tie-in to the Gjoea platform. The operator is Statoil, and production startup is estimated for the end of 2010. The flow assurance challenges are high reservoir pressure and temperature, hydrate and wax control, liquid accumulation, and monitoring of the well/template production rates. The Vega Flow Assurance System (FAS) is a software system that supports monitoring and operation of the field. The FAS is based on FlowManager(TM), designed for real-time systems. This is a flexible tool with its own steady-state multiphase and flow assurance models. Due to the long flowlines and the dynamic behavior, the multiphase flow simulator OLGA is also integrated in the system. The Vega FAS will be used as: - an online monitoring tool - an offline what-if simulation and validation tool - an advisory control system for well production allocation. (Author)

  8. 7 CFR 652.7 - Quality assurance.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Quality assurance. 652.7 Section 652.7 Agriculture... assurance. (a) NRCS will review, in consultation with the Farm Service Agency, as appropriate, the quality... information obtained through its quality assurance process, documentation submitted by the technical service...

  9. Model Information Exchange System (MIXS).

    Science.gov (United States)

    2013-08-01

    Many travel demand forecast models operate at state, regional, and local levels. While they share the same physical network in overlapping geographic areas, they use different and uncoordinated modeling networks. This creates difficulties for models ...

  10. Directory of Energy Information Administration Models 1994

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994.

  11. Directory of Energy Information Administration Models 1994

    International Nuclear Information System (INIS)

    1994-07-01

    This directory revises and updates the 1993 directory and includes 15 models of the National Energy Modeling System (NEMS). Three other new models in use by the Energy Information Administration (EIA) have also been included: the Motor Gasoline Market Model (MGMM), Distillate Market Model (DMM), and the Propane Market Model (PPMM). This directory contains descriptions about each model, including title, acronym, purpose, followed by more detailed information on characteristics, uses and requirements. Sources for additional information are identified. Included in this directory are 37 EIA models active as of February 1, 1994

  12. Information technology model for evaluating emergency medicine teaching

    Science.gov (United States)

    Vorbach, James; Ryan, James

    1996-02-01

    This paper describes work in progress to develop an Information Technology (IT) model and supporting information system for the evaluation of clinical teaching in the Emergency Medicine (EM) Department of North Shore University Hospital. In the academic hospital setting student physicians, i.e. residents, and faculty function daily in their dual roles as teachers and students respectively, and as health care providers. Databases exist that are used to evaluate both groups in either academic or clinical performance, but rarely has this information been integrated to analyze the relationship between academic performance and the ability to care for patients. The goal of the IT model is to improve the quality of teaching of EM physicians by enabling the development of integrable metrics for faculty and resident evaluation. The IT model will include (1) methods for tracking residents in order to develop experimental databases; (2) methods to integrate lecture evaluation, clinical performance, resident evaluation, and quality assurance databases; and (3) a patient flow system to monitor patient rooms and the waiting area in the Emergency Medicine Department, to record and display status of medical orders, and to collect data for analyses.

  13. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  14. Improving of Quality Control and Quality Assurance in 14C and 3H Laboratory; Participation in the IAEA Model Project

    International Nuclear Information System (INIS)

    Obelic, B.

    2001-01-01

    Full text: Users of laboratories' analytical results increasingly require demonstrable proof of the reliability and credibility of the results against internationally accepted standards, because the economic, ecological, medical and legal decisions based on laboratory results need to be accepted nationally and internationally. The credibility, respect and opportunities of the laboratories are improved when objective evidence of the reliability and quality of the results can be given. This is achieved through the inculcation of a quality culture: well-defined procedures, controls and operational checks characteristic of quality assurance and quality control (QA/QC). In 1999 the IAEA launched a two-and-a-half-year model project entitled Quality Control and Quality Assurance of Nuclear Analytical Techniques, with the participation of laboratories using alpha, beta and/or gamma spectrometry from CEE and NIS countries. The project set out to introduce and implement QA principles in accordance with the ISO-17025 guide, leading eventually to a level at which the QA system is self-sustaining and might be appropriate for formal accreditation or certification by the respective national authorities. Activities within the project consist of semi-annual reports, two training workshops, two inspection visits to the laboratories by IAEA experts, and proficiency tests. The following topics were considered: organisation requirements, acceptance criteria and non-conformance management of QC, internal and external method validation, statistical analyses and uncertainty evaluation, standard operating procedures, and quality manual documentation. The 14C and 3H Laboratory of the Rudjer Boskovic Institute has been one of the ten laboratories participating in the Project. In the Laboratory, all the procedures required for quality control were already included implicitly, while during the Model Project much effort has been devoted to the elaboration of explicit documentation. 
Since the beginning

  15. Optimal Quality Assurance Systems for Agricultural Outputs

    OpenAIRE

    Miguel Carriquiry; Bruce A. Babcock; Roxana Carbone

    2003-01-01

    New quality assurance systems (QASs) are being put in place to facilitate the flow of information about agricultural and food products. But what constitutes a proper mix of public and private efforts in setting up QASs is an unsettled question. A better understanding of private sector incentives for setting up such systems will help clarify what role the public sector might have in establishing standards. We contribute to this understanding by modeling the optimal degree of "stringency" or as...

  16. Complementarity of Historic Building Information Modelling and Geographic Information Systems

    Science.gov (United States)

    Yang, X.; Koehl, M.; Grussenmeyer, P.; Macher, H.

    2016-06-01

    In this paper, we discuss the potential of integrating semantically rich models from both Building Information Modelling (BIM) and Geographical Information Systems (GIS) to build detailed 3D historic models. BIM contributes to the creation of a digital representation having all physical and functional building characteristics in several dimensions, e.g. XYZ (3D), time and the non-architectural information that is necessary for construction and management of buildings. GIS has potential in handling and managing spatial data, especially in exploring spatial relationships, and is widely used in urban modelling. However, when considering heritage modelling, the specificity of irregular historical components makes it problematic to create the enriched model from its complex architectural elements obtained from point clouds. Therefore, some open issues limiting historic building 3D modelling will be discussed in this paper: how to deal with the complex elements composing historic buildings in a BIM and GIS environment, how to build the enriched historic model, and why construct different levels of detail? By addressing these problems, the conceptualization, documentation and analysis of enriched Historic Building Information Modelling are developed and compared to traditional 3D models aimed primarily at visualization.

  17. [Integrated quality assurance].

    Science.gov (United States)

    Bögel, K; Stöhr, K

    1994-07-01

    The definition of terms and connotation of "Quality", "Quality Assurance" and "Integration" lead to an analysis and understanding of inhibiting and fostering factors of the "Health Triad" of people, animals and environment. Although "Quality" is largely or ultimately determined by the consumer, there are considerable differences as this term is applied by (a) the individual consumer, (b) the dynamic producer defending or gaining markets, (c) those engaged in traditional product manufacturing, or (d) governments setting (minimum) requirements for the sake of free trade. "Quality Assurance" offers cooperation of partners all along the food chain from "pasture to table". The managerial process turned into a continuum of responsibility and agreement on processes and product characteristics. This overcomes the disadvantages of strategies stressing distinct defense barriers. In practice this philosophy of a predominant role of defence barriers proved largely partnership destructive, in that it permitted to shift responsibilities for failures and to claim administrative competence according to momentary situations and interests. "Integrated Quality Assurance" means mutual agreement of two or more partners along the food chain (e. g. feed producers, farmers, animal health industry, veterinarians and food processors) on product characteristics and production methods. It involves essential system elements including facilities, materials, manpower, information, transport, management etc. Different principles and procedures of quality assurance have been introduced in practice, including agriculture and food processing. These different approaches are not mutually exclusive but largely of complementary nature.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. The Information Warfare Life Cycle Model

    Directory of Open Access Journals (Sweden)

    Brett van Niekerk

    2011-03-01

    Full Text Available Information warfare (IW is a dynamic and developing concept, which constitutes a number of disciplines. This paper aims to develop a life cycle model for information warfare that is applicable to all of the constituent disciplines. The model aims to be scalable and applicable to civilian and military incidents where information warfare tactics are employed. Existing information warfare models are discussed, and a new model is developed from the common aspects of these existing models. The proposed model is then applied to a variety of incidents to test its applicability and scalability. The proposed model is shown to be applicable to multiple disciplines of information warfare and is scalable, thus meeting the objectives of the model.

  19. Assuring quality.

    Science.gov (United States)

    Eaton, K A; Reynolds, P A; Mason, R; Cardell, R

    2008-08-09

    All those involved in education have a strong motivation to ensure that all its aspects, including content and teaching practice, are of the highest standard. This paper describes how agencies such as the Quality Assurance Agency for Higher Education (QAA) and the General Dental Council (GDC) have established frameworks and specifications to monitor the quality of education provided in dental schools and other institutes that provide education and training for dentists and dental care professionals (DCPs). It then considers quality issues in programme and course development, techniques for assessing the quality of education, including content and presentation, and the role of students. It goes on to review the work that has been done in developing quality assessment for distance learning in dentistry. It concludes that, to date, much of the work on quality applies to education as a whole and that the assessment of the quality of e-learning in dentistry is in its infancy.

  20. The Safety Journey: Using a Safety Maturity Model for Safety Planning and Assurance in the UK Coal Mining Industry

    Directory of Open Access Journals (Sweden)

    Patrick Foster

    2013-02-01

    Full Text Available A Safety Maturity Model was developed for use in UK coal mining operations in order to assess the level of compliance and effectiveness with a recently introduced standards based safety management system. The developed model allowed for a “self-assessment” of the maturity to be undertaken by teams from the individual sites. Assessments were undertaken at all sites (surface and underground and in some cases within each site (e.g., underground operations, surface coal preparation plant. Once the level of maturity was established, improvement plans were developed to improve the maturity of individual standards that were weaker than the average and/or improve the maturity as a whole. The model was likened to a journey as there was a strong focus on continual improvement and effectiveness of the standards, rather than pure compliance. The model has been found to be a practical and useful tool by sites as a means of identifying strengths and weaknesses within their systems, and as a means of assurance with the safety management system standards.

  1. Building Program Models Incrementally from Informal Descriptions.

    Science.gov (United States)

    1979-10-01

Report SCI.ICS.U.79.2, "Building Program Models Incrementally from Informal Descriptions", by Brian P. McCune, Stanford University, Department of Computer Science, October 1979 (technical report; research sponsored by the Defense Advanced Research Projects Agency).

  2. How ISO/IEC 17799 can be used for base lining information assurance among entities using data mining for defense, homeland security, commercial, and other civilian/commercial domains

    Science.gov (United States)

    Perry, William G.

    2006-04-01

One goal of database mining is to draw unique and valid perspectives from multiple data sources. Insights that are fashioned from closely-held data stores are likely to possess a high degree of reliability. The degree of information assurance comes into question, however, when external databases are accessed, combined and analyzed to form new perspectives. ISO/IEC 17799, Information technology-Security techniques-Code of practice for information security management, can be used to establish a higher level of information assurance among disparate entities using data mining in the defense, homeland security, commercial and other civilian/commercial domains. Organizations that meet ISO/IEC information security standards have identified and assessed risks, threats and vulnerabilities and have taken significant proactive steps to meet their unique security requirements. The ISO standards address twelve domains: risk assessment and treatment, security policy, organization of information security, asset management, human resources security, physical and environmental security, communications and operations management, access control, information systems acquisition, development and maintenance, information security incident management, business continuity management, and compliance. Analysts can be relatively confident that if organizations are ISO 17799 compliant, a high degree of information assurance is likely to be a characteristic of the data sets being used. The reverse may be true. Extracting, fusing and drawing conclusions based upon databases with a low degree of information assurance may be fraught with all of the hazards that come from knowingly using bad data to make decisions. Using ISO/IEC 17799 as a baseline for information assurance can help mitigate these risks.
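The baselining idea in this abstract can be sketched as a simple coverage score over the twelve ISO/IEC 17799 domains. The domain names are taken from the abstract; the scoring function and the example assessment are invented here purely for illustration, not part of the standard itself.

```python
# Hypothetical sketch: score an organization's coverage of the twelve
# ISO/IEC 17799 domains named in the abstract. The boolean assessments
# are invented; a real audit would grade each domain in far more depth.
ISO_17799_DOMAINS = [
    "risk assessment and treatment", "security policy",
    "organization of information security", "asset management",
    "human resources security", "physical and environmental security",
    "communications and operations management", "access control",
    "information systems acquisition, development and maintenance",
    "information security incident management",
    "business continuity management", "compliance",
]

def assurance_score(assessment):
    """Fraction of domains covered; a crude proxy for baseline assurance."""
    covered = sum(bool(assessment.get(d, False)) for d in ISO_17799_DOMAINS)
    return covered / len(ISO_17799_DOMAINS)

assessment = {d: True for d in ISO_17799_DOMAINS}
assessment["compliance"] = False  # one domain not yet covered
print(f"coverage: {assurance_score(assessment):.0%}")
```

An analyst comparing data sources could use such a score only as a first filter; the abstract's point is that a low score flags data whose provenance deserves scrutiny.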

  3. Topic modelling in the information warfare domain

    CSIR Research Space (South Africa)

    De Waal, A

    2013-11-01

Full Text Available In this paper the authors provide context to Topic Modelling as an Information Warfare technique. Topic modelling is a technique that discovers latent topics in unstructured and unlabelled collections of documents. The topic structure can be searched...

  4. Melvin Defleur's Information Communication Model: Its Application ...

    African Journals Online (AJOL)

    The paper discusses Melvin Defleur's information communication model and its application to archives administration. It provides relevant examples in which archives administration functions involve the communication process. Specific model elements and their application in archives administration are highlighted.

  5. Representing energy technologies in top-down economic models using bottom-up information

    Energy Technology Data Exchange (ETDEWEB)

    McFarland, J.R. [M.I.T., Cambridge, MA (United States). Technology and Policy Program; Reilly, J.M. [M.I.T., Cambridge, MA (United States). Joint Program on the Science and Policy of Global Change; Herzog, H.J. [M.I.T., Cambridge, MA (United States). Laboratory for Energy and the Environment

    2004-07-01

    The rate and magnitude of technological change is a critical component in estimating future anthropogenic carbon emissions. We present a methodology for modeling low-carbon emitting technologies within the MIT Emissions Prediction and Policy Analysis (EPPA) model, a computable general equilibrium (CGE) model of the world economy. The methodology translates bottom-up engineering information for two carbon capture and sequestration (CCS) technologies in the electric power sector into the EPPA model and discusses issues that arise in assuring an accurate representation and realistic market penetration. We find that coal-based technologies with sequestration penetrate, despite their higher cost today, because of projected rising natural gas prices. (author)

  6. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support High-Level Waste (HLW) melter development.

  7. Spinal Cord Injury Model System Information Network

    Science.gov (United States)

... The University of Alabama at Birmingham Spinal Cord Injury Model System (UAB-SCIMS) maintains this Information Network ...

  8. Quality Assurance in the Presence of Variability

    Science.gov (United States)

    Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus

    Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of those strategies, the so called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
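The comprehensive strategy described above can be sketched as exhaustive enumeration of product variants. The chapter itself uses model checking against temporal logic properties; the toy below substitutes a simple invariant check, and the feature names and property are invented for illustration.

```python
from itertools import product

# Hypothetical product line with three optional features. The comprehensive
# strategy checks every derivable product, so the variant space is 2^n.
OPTIONAL_FEATURES = ["encryption", "logging", "remote_access"]

def all_products(features):
    """Enumerate every variant: each optional feature is either in or out."""
    for choices in product([False, True], repeat=len(features)):
        yield dict(zip(features, choices))

def satisfies_property(variant):
    """Example property: remote access must never ship without encryption."""
    return not (variant["remote_access"] and not variant["encryption"])

defects = [v for v in all_products(OPTIONAL_FEATURES) if not satisfies_property(v)]
print(f"{len(defects)} of {2 ** len(OPTIONAL_FEATURES)} variants violate the property")
```

The exponential growth of the variant space is exactly the complexity challenge the chapter addresses; real product lines need symbolic techniques rather than this brute-force enumeration.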

  9. A Software Hub for High Assurance Model-Driven Development and Analysis

    National Research Council Canada - National Science Library

    Cleaveland, Rance; Sims, Steve; Hansel, David; DuVarney, Dan

    2007-01-01

    .... As part of this six-month effort a translator was implemented from the commercially popular modeling notations Simulink/Stateflow into the SAL input notation for the SALSA analysis tool, and several...

  10. HASP - The High Assurance Security Program

    OpenAIRE

    Naval Postgraduate School (U.S.); Center for Information Systems Studies Security and Research (CISR)

    2011-01-01

    This program provides a unifying conceptual framework and management structure for long range planning and coordination of focused Information Assurance research projects. The primary program goal is to support the strengthening of assurance provided by the National Information Infrastructure. Our approach includes the research and development of high assurance networks, systems, components and tools, and the open dissemination of outputs from those efforts, such as code and documentation.

  11. Quality assurance criteria for Waste Isolation Pilot Plant performance assessment modeling

    International Nuclear Information System (INIS)

    1995-07-01

    The US Department of Energy (DOE) is developing the Waste Isolation Pilot Plant (WIPP) as a deep geologic repository for transuranic (TRU) and TRU-mixed wastes generated by DOE Defense Program activities. Regulatory agencies, including the Environmental Protection Agency (EPA) and New Mexico Environment Department, will be forced to rely upon system modeling to determine the potential compliance of the WIPP facility with federal regulations. Specifically, long-term modeling efforts are focused on compliance with 40 CFR Part 268, ''Land Disposal Restrictions,'' and 40 CFR Part 191, ''Environmental Radiation Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level, and Transuranic Radioactive Wastes.'' DOE plans to use the similar conceptual models and numerical codes to demonstrate compliance under both of these regulations. Sandia National Laboratories (SNL) has been developing a system model that will be used to demonstrate potential waste migration from the WIPP facility. Because the geologic system underlying the WIPP site is not completely understood, the software code to model the system must be developed to exacting standards for its predictions to be reliable and defensible. This is a complex model that consists of many submodules used to describe various migration pathways and processes that affect potential waste migration

  12. Directory of Energy Information Administration models 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This directory revises and updates the Directory of Energy Information Administration Models 1995, DOE/EIA-0293(95), Energy Information Administration (EIA), U.S. Department of Energy, July 1995. Four models have been deleted in this directory as they are no longer being used: (1) Market Penetration Model for Ground-Water Heat Pump Systems (MPGWHP); (2) Market Penetration Model for Residential Rooftop PV Systems (MPRESPV-PC); (3) Market Penetration Model for Active and Passive Solar Technologies (MPSOLARPC); and (4) Revenue Requirements Modeling System (RRMS).

  13. Conceptual Modeling of Time-Varying Information

    DEFF Research Database (Denmark)

    Gregersen, Heidi; Jensen, Christian Søndergaard

    2004-01-01

    A wide range of database applications manage information that varies over time. Many of the underlying database schemas of these were designed using the Entity-Relationship (ER) model. In the research community as well as in industry, it is common knowledge that the temporal aspects of the mini-world...... are important, but difficult to capture using the ER model. Several enhancements to the ER model have been proposed in an attempt to support the modeling of temporal aspects of information. Common to the existing temporally extended ER models, few or no specific requirements to the models were given...

  14. Organization-based Model-driven Development of High-assurance Multiagent Systems

    Science.gov (United States)

    2009-02-27

...abstract qualities. And, fourth, we analyze the generated policies for conflicts. For quality metrics, ISO/IEC 9126 (ISO, 1991) defines a set of software product quality characteristics; quality of achieved goals falls under the ISO software quality of Functionality.

  15. ONLINE MODEL OF EDUCATION QUALITY ASSURANCE EQUASP IMPLEMENTATION: EXPERIENCE OF VYATKA STATE UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Valentin Pugach

    2015-10-01

Full Text Available The article is devoted to the problem of assessing the quality of higher education. In the Russian Federation, quality assessment of the educational services provided by state-accredited universities has recently been carried out by the state, represented by the Ministry of Education and Science. State universities have modelled internal systems of education quality assessment in accordance with the methodology proposed by the Ministry of Education and Science. Currently more attention is paid to independent assessment of education quality, which is the basis of professional public accreditation. The project "EQUASP", financed within the framework of the TEMPUS programme, addresses the problem of implementing the methodology of the online model of independent higher education quality assessment in the practice of Russian universities. The proposed model for assessing the quality of education is based on the use of five standards. The authors have done a comparative analysis of the model of higher education quality assessment existing in Vyatka State University and the model of education quality assessment offered by the European universities participating in the EQUASP project. The authors have presented the main results of the investigation of this problem and some suggestions for improving the model of education quality assessment used by Vyatka State University.

  16. An information maximization model of eye movements

    Science.gov (United States)

    Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra

    2005-01-01

    We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
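The fixation rule in this abstract ("fixate next at the location that minimizes uncertainty") can be sketched as a greedy loop over an uncertainty map. The grid values and the crude foveation model below are invented for illustration; the paper's actual model accounts for the resolution fall-off from fovea to periphery, which this sketch only caricatures by zeroing a neighbourhood.

```python
# Toy sketch of sequential information maximization for eye movements:
# repeatedly fixate the location with the highest remaining uncertainty,
# then mark a small neighbourhood around the fixation as resolved.
def next_fixation(uncertainty):
    """Pick the location whose uncertainty is largest."""
    return max(uncertainty, key=uncertainty.get)

def fixate(uncertainty, loc, radius=1):
    """Zero out uncertainty within `radius` of the fixation (crude foveation)."""
    x0, y0 = loc
    return {
        (x, y): 0.0 if abs(x - x0) <= radius and abs(y - y0) <= radius else u
        for (x, y), u in uncertainty.items()
    }

# Invented 4x4 uncertainty map: uncertainty grows toward one corner.
uncertainty = {(x, y): float(x + y) for x in range(4) for y in range(4)}
scanpath = []
for _ in range(3):
    loc = next_fixation(uncertainty)
    scanpath.append(loc)
    uncertainty = fixate(uncertainty, loc)
print(scanpath)
```

The greedy choice at each step is what makes the rule "simple" in the abstract's sense: no look-ahead, just the current uncertainty landscape.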

  17. Construction of a business model to assure financial sustainability of biobanks.

    Science.gov (United States)

    Warth, Rainer; Perren, Aurel

    2014-12-01

    Biobank-suisse (BBS) is a collaborative network of biobanks in Switzerland. Since 2005, the network has worked with biobank managers towards a Swiss biobanking platform that harmonizes structures and procedures. The work with biobank managers has shown that long-term, sustainable financing is difficult to obtain. In this report, three typical biobank business models are identified and their characteristics analyzed. Five forces analysis was used to understand the competitive environment of biobanks. Data provided by OECD was used for financial estimations. The model was constructed using the business model canvas tool. The business models identified feature financing influenced by the economic situation and the research budgets in a given country. Overall, the competitive environment for biobanks is positive. The bargaining power with the buyer is negative since price setting and demand prediction is difficult. In Switzerland, the healthcare industry collects approximately 5600 U.S. dollars per person and year. If each Swiss citizen paid 0.1% (or 5 U.S. dollars) of this amount to Swiss biobanks, 45 million U.S. dollars could be collected. This compares to the approximately 10 million U.S. dollars made available for cohort studies, longitudinal studies, and pathology biobanks through science funding. With the same approach, Germany, the United States, Canada, France, and the United Kingdom could collect 361, 2634, 154, 264, and 221 million U.S. dollars, respectively. In Switzerland and in other countries, an annual fee less than 5 U.S. dollars per person is sufficient to provide biobanks with sustainable financing. This inspired us to construct a business model that not only includes the academic and industrial research sectors as customer segment, but also includes the population. The revenues would be collected as fees by the healthcare system. In Italy and Germany, a small share of healthcare spending is already used to finance selected clinical trials. 
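The revenue figures in this abstract follow from simple per-capita arithmetic. The per-capita healthcare spend is taken from the abstract; the Swiss population figure is a rough round number assumed here for illustration.

```python
# Back-of-the-envelope check of the abstract's Swiss figure:
# 0.1% of per-capita healthcare spending, collected across the population.
PER_CAPITA_HEALTHCARE_USD = 5600   # Switzerland, from the abstract
FEE_SHARE = 0.001                  # 0.1% of healthcare spend
SWISS_POPULATION = 8_000_000       # assumed round figure, not from the abstract

fee_per_person = PER_CAPITA_HEALTHCARE_USD * FEE_SHARE
annual_revenue = fee_per_person * SWISS_POPULATION
print(f"{fee_per_person:.2f} USD/person -> {annual_revenue / 1e6:.0f} million USD/year")
```

The result, roughly 45 million USD per year from a fee under 6 USD per person, matches the abstract's claim and dwarfs the ~10 million USD currently available through science funding.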

  18. Quality Assurance Project Plan - Modeling the Impact of Hydraulic Fracturing on Water Resources Based on Water Acquisition Scenarios

    Science.gov (United States)

    This planning document describes the quality assurance/quality control activities and technical requirements that will be used during the research study. The goal of this project is to evaluate the potential impacts of large volume water withdrawals.

  19. The Feasibility of Quality Function Deployment (QFD) as an Assessment and Quality Assurance Model

    Science.gov (United States)

    Matorera, D.; Fraser, W. J.

    2016-01-01

    Business schools are globally often seen as structured, purpose-driven, multi-sector and multi-perspective organisations. This article is based on the response of a graduate school to an innovative industrial Quality Function Deployment-based model (QFD), which was to be adopted initially in a Master's degree programme for quality assurance…

  20. Codifying Information Assurance Controls for Department of Defense (DoD) Supervisory Control and Data Acquisition (SCADA) Systems

    Science.gov (United States)

    2010-03-01

Some of the listed certifications are vendor-specific, such as the Cisco Certified Network Associate (CCNA) and Server Plus (Server+), while others are vendor-neutral, such as the Certified Information Systems Security Professional (CISSP). Figure 11 shows sample cyber security certifications; Figure 12 relates the DoD Directive to certifications including CCNA, ITIL, Net+, GSEC, Server+ and CISSP.

  1. The Science of Mission Assurance

    Directory of Open Access Journals (Sweden)

    Kamal Jabbour

    2011-01-01

    Full Text Available The intent of this article is to describe—and prescribe—a scientific framework for assuring mission essential functions in a contested cyber environment. Such a framework has profound national security implications as the American military increasingly depends on cyberspace to execute critical mission sets. In setting forth this prescribed course of action, the article will first decompose information systems into atomic processes that manipulate information at all six phases of the information lifecycle, then systematically define the mathematical rules that govern mission assurance.

  2. Auditors’ Perceptions of Reasonable Assurance the Effectiveness of the Audit Risk Model. Case from Iran

    OpenAIRE

    Hashem Valipour; Javad Moradi; Hajar Moazaminezhad

    2012-01-01

Despite the definition of the concept of reasonable assurance in auditing standards, the results of some studies indicate a meaningful difference in how different auditors perceive this basic concept (Law, 2008, 180). The results of other research also indicate that auditors' perceptions of the effectiveness of the audit risk model, which rests on general risk-based auditing principles, vary (Arense, 2006, 148). In so doing, aiming at studying the proof fo...

  3. Model transformation based information system modernization

    Directory of Open Access Journals (Sweden)

    Olegas Vasilecas

    2013-03-01

Full Text Available Information systems become outdated increasingly quickly because of the rapidly changing business environment. Usually, small changes are not sufficient to adapt complex legacy information systems to changing business needs. New functionality should be installed with the requirement of putting business data at the smallest possible risk. This paper analyses information system modernization problems and proposes a method for information system modernization. It involves transforming programming code into an abstract syntax tree metamodel (ASTM) and a model-based transformation from ASTM into the knowledge discovery metamodel (KDM). The method is validated on an example for the SQL language.
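The first step of the method above, lifting program text into a tree of typed nodes, has a minimal analogue in Python's standard `ast` module. This is not the ASTM/KDM tooling the paper uses, only an illustration of what a syntax-tree representation of code looks like; the sample source line is invented.

```python
import ast

# Parse a small piece of source code into an abstract syntax tree and list
# the node types, the raw material a code-to-model transformation works on.
source = "def total(prices): return sum(prices)"
tree = ast.parse(source)
node_types = [type(node).__name__ for node in ast.walk(tree)]
print(node_types)
```

A transformation like the paper's would then map such nodes onto metamodel elements (e.g. a `FunctionDef` node onto a KDM callable unit) rather than merely listing them.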

  4. Directory of energy information administration models 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-13

This updated directory has been published annually; after this issue, it will be published only biennially. The Disruption Impact Simulator Model in use by EIA is included. Model descriptions have been updated according to revised documentation approved during the past year. This directory contains descriptions of each model, including title, acronym, and purpose, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. Included are 37 EIA models active as of February 1, 1995. The first group is the National Energy Modeling System (NEMS) models. The second group is all other EIA models that are not part of NEMS. Appendix A identifies major EIA modeling systems and the models within these systems. Appendix B is a summary of the "Annual Energy Outlook" Forecasting System.

  5. Information-Theoretic Perspectives on Geophysical Models

    Science.gov (United States)

    Nearing, Grey

    2016-04-01

To test any hypothesis about any dynamic system, it is necessary to build a model that places that hypothesis into the context of everything else that we know about the system: initial and boundary conditions and interactions between various governing processes (Hempel and Oppenheim, 1948; Cartwright, 1983). No hypothesis can be tested in isolation, and no hypothesis can be tested without a model (for a geoscience-related discussion see Clark et al., 2011). Science is (currently) fundamentally reductionist in the sense that we seek some small set of governing principles that can explain all phenomena in the universe, and such laws are ontological in the sense that they describe the object under investigation (Davies, 1990 gives several competing perspectives on this claim). However, since we cannot build perfect models of complex systems, any model that does not also contain an epistemological component (i.e., a statement, like a probability distribution, that refers directly to the quality of the information from the model) is falsified immediately (in the sense of Popper, 2002) given only a small number of observations. Models necessarily contain both ontological and epistemological components, and what this means is that the purpose of any robust scientific method is to measure the amount and quality of information provided by models. I believe that any viable philosophy of science must be reducible to this statement. The first step toward a unified theory of scientific models (and therefore a complete philosophy of science) is a quantitative language that applies to both ontological and epistemological questions. Information theory is one such language: Cox's (1946) theorem (see Van Horn, 2003) tells us that probability theory is the (only) calculus that is consistent with Classical Logic (Jaynes, 2003; chapter 1), and information theory is simply the integration of convex transforms of probability ratios (integration reduces density functions to scalar
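The "convex transforms of probability ratios" mentioned in the abstract are exemplified by the Kullback-Leibler divergence, the expected log ratio between two distributions. The two discrete distributions below are invented for illustration; in the abstract's terms, the divergence measures how much information an observed distribution carries against a model's epistemological statement.

```python
import math

# Kullback-Leibler divergence over discrete distributions, the canonical
# integral (here a sum) of a convex transform of probability ratios.
def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

model = [0.5, 0.3, 0.2]        # a model's predictive distribution (invented)
observed = [0.4, 0.4, 0.2]     # empirical frequencies (invented)
print(f"D(observed || model) = {kl_divergence(observed, model):.4f} nats")
```

The divergence is zero exactly when model and observations agree, and grows as the model's information about the system degrades, which is the quantitative handle on model quality the abstract argues for.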

  6. Quality Assurance - Construction

    DEFF Research Database (Denmark)

    Gaarslev, Axel

    1996-01-01

Contains three main chapters: 1. Quality Assurance initiated by external demands; 2. Quality Assurance initiated by internal company goals; 3. Innovation strategies.

  7. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    Science.gov (United States)

    Melo, E Correa

    2003-08-01

The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  8. Quality assurance target for community-based breast cancer screening in China: a model simulation.

    Science.gov (United States)

    Yang, Lan; Wang, Jing; Cheng, Juan; Wang, Yuan; Lu, Wenli

    2018-03-07

We aimed to clarify the feasibility of a community-based screening strategy for breast cancer in Tianjin, China; to identify the factors that most significantly influenced its feasibility; and to identify the reference range for quality control. A state-transition Markov model simulated a hypothetical cohort of 100,000 healthy women; the starting age was set at 35 years and the time horizon was set to 50 years. The primary outcome for the model was the incremental cost-utility ratio (ICUR), defined as the program's cost per quality-adjusted life year (QALY) gained. Three screening strategies provided by community health services for women aged 35 to 69 years were compared with regard to different screening intervals. The probability of the ICUR being below 20,272 USD (i.e., triple the annual per-capita gross domestic product [3 GDPs]) per QALY saved was 100% for the annual screening strategy and for screening every three years. Only when the attendance rate was > 50% did the probability of annual screening being cost effective exceed 95%. The probability of the annual screening strategy being cost effective could reach 95% for a willingness-to-pay (WTP) of 2 GDPs when the compliance rate for transfer was > 80%. When 10% of stage I tumors were detected by screening, the probability of the annual screening strategy being cost effective would be up to 95% for a WTP > 3 GDPs. Annual community-based breast cancer screening was cost effective for a WTP of 3 GDPs based on the incidence of breast cancer in Tianjin, China. Measures are needed to keep performance indicators at a desirable level for the cost-effectiveness of breast cancer screening.
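The decision rule underlying this abstract is the standard one: a strategy is cost effective when its ICUR, the incremental cost divided by the incremental QALYs, falls below the willingness-to-pay threshold. The threshold is taken from the abstract; the cost and QALY figures below are invented placeholders, not the study's results.

```python
# Sketch of the ICUR decision rule. Only the WTP threshold comes from the
# abstract; all cost/QALY inputs are hypothetical.
WTP_3GDP_USD = 20_272  # 3x annual per-capita GDP, from the abstract

def icur(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-utility ratio: extra cost per extra QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

ratio = icur(cost_new=3_200_000, qaly_new=400, cost_old=1_000_000, qaly_old=250)
print(f"ICUR = {ratio:.0f} USD/QALY, cost effective: {ratio < WTP_3GDP_USD}")
```

The study's probabilistic statements ("100% probability of the ICUR being below 3 GDPs") come from running this comparison across many sampled parameter sets of the Markov model rather than a single point estimate.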

  9. Evaluation of plan quality assurance models for prostate cancer patients based on fully automatically generated Pareto-optimal treatment plans.

    Science.gov (United States)

    Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F

    2016-06-07

IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created using the conventional trial-and-error treatment planning process. Consequently, it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. Therefore, we created a golden standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients. Next, the dataset was used to assess the performance of a treatment planning QA model that uses the overlap volume histogram (OVH). 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients. Next it was applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden standard plans for the rectum Dmean, V65, and V75, and the Dmean of the anus and the bladder. For the rectum, the prediction errors (predicted minus achieved) were only -0.2 ± 0.9 Gy (mean ± 1 SD) for Dmean, -1.0 ± 1.6% for V65, and -0.4 ± 1.1% for V75. For the Dmean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients only led to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated. This dataset can be used to train new treatment planning QA models and to validate and compare existing ones, and it has been made publicly available. The OVH model was highly accurate

  10. A Model for an Electronic Information Marketplace

    Directory of Open Access Journals (Sweden)

    Wei Ge

    2005-11-01

    Full Text Available As the information content on the Internet increases, the task of locating desired information and assessing its quality becomes increasingly difficult. This development causes users to be more willing to pay for information that is focused on specific issues, verifiable, and available upon request. Thus, the nature of the Internet opens up the opportunity for information trading. In this context, the Internet can be used not only to close the transaction, but also to deliver the product, the desired information, to the user. Early attempts to implement such business models have fallen short of expectations. In this paper, we discuss the limitations of such practices and present a modified business model for information trading, which uses a reverse auction approach together with a multiple-buyer price discovery process.

  11. Information modeling for interoperable dimensional metrology

    CERN Document Server

    Zhao, Y; Brown, Robert; Xu, Xun

    2014-01-01

    This book analyzes interoperability issues in dimensional metrology systems and describes information modeling techniques. Coverage includes theory, techniques and key technologies, and explores new approaches for solving real-world interoperability problems.

  12. Geospatial Information System Capability Maturity Models

    Science.gov (United States)

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  13. Information Dynamics in Networks: Models and Algorithms

    Science.gov (United States)

    2016-09-13

    In this project, we investigated how network structure interplays with higher-level processes in online social networks. Related publication: A Note on Modeling Retweet Cascades on Twitter, Workshop on Algorithms and Models for the Web Graph, 09-DEC-15.

  14. Heat loss model for flow assurance in a deep water riser

    Science.gov (United States)

    Soetikno, Darmadi; Rodiah, Isti; Islahuddin, Muhammad; Kania, Riska A. P.; Gunawan, Agus Y.; Sukarno, Pudjo; Permadi, Asep K.; Soewono, Edy

    2014-03-01

    The study investigates the heat loss of oil flow in a riser. This heat loss occurs because of the difference between the oil temperature in the riser and the surrounding sea water temperature, and it causes the formation of wax that may disturb the flow. Heat loss can be reduced by installing an insulator in the riser or by selecting appropriate pipeline specifications, so it is necessary to determine the possible locations and specifications of the insulator and pipeline. A mathematical model is formulated by considering the oil temperature and its flow velocity. Assuming that the density variation is small, the fluid is treated as incompressible. Furthermore, numerical solutions with finite difference methods are presented with some hypothetical data to give an overview of how the system works. Two surrounding conditions are taken into account, i.e. with and without sea current. From the simulation, the location of wax formation can be predicted. At a certain depth region of the sea, where the sea current is present, a greater heat loss takes place and wax may form immediately. To overcome the formation of wax, we can control parameters such as the conductivity and wall thickness of the pipe.
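
    The 1-D energy balance behind such a model can be marched numerically; a minimal explicit finite-difference sketch, with all physical values hypothetical (the paper's actual model also couples flow velocity and sea-current conditions):

```python
import math

def riser_temperature_profile(T_in, T_sea, U, D, m_dot, cp, L, n):
    """March the steady-state oil temperature along a riser with an explicit
    finite-difference step of the 1-D energy balance:
        m_dot * cp * dT/dx = -U * pi * D * (T - T_sea)
    where U is an overall heat transfer coefficient and D the pipe diameter.
    """
    dx = L / n
    T = T_in
    profile = [T]
    for _ in range(n):
        T += -U * math.pi * D * (T - T_sea) / (m_dot * cp) * dx
        profile.append(T)
    return profile

# Hypothetical values: 80 C oil entering a 500 m riser in 5 C sea water.
profile = riser_temperature_profile(T_in=80.0, T_sea=5.0, U=20.0, D=0.25,
                                    m_dot=30.0, cp=2000.0, L=500.0, n=500)
print(f"outlet temperature: {profile[-1]:.1f} C")
```

    A smaller heat transfer coefficient U (a better insulator, or a thicker pipe wall) flattens the profile and pushes the predicted wax-formation depth further down the riser.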

  15. Introduction to quality assurance

    International Nuclear Information System (INIS)

    Kaden, W.

    1980-01-01

    In today's interpretation 'quality assurance' means 'good management'. Quality assurance has to cover all phases of a work, but all quality assurance measures must be adapted to the relevance and complexity of the actual task. Examples are given for the preparation of quality classes, the organization of quality assurance during design and manufacturing and for auditing. Finally, efficiency and limits of quality assurance systems are described. (orig.)

  16. THE INFORMATION MODEL «SOCIAL EXPLOSION»

    Directory of Open Access Journals (Sweden)

    Alexander Chernyavskiy

    2012-01-01

    Full Text Available The article examines and analyses the construction of the information model «social explosion», which corresponds to the recent «colour» revolutions. The analysis of the model makes it possible to identify effective approaches to the initiation of such an explosion through the use of contemporary information communications such as cellular networks and the mobile Internet.

  17. [About the elaboration of the unique informational space of the medical service of the Armed Forces and the improvement of informational assurance of its system of control].

    Science.gov (United States)

    Shappo, V V; Stoliar, V P; Zubkov, A D

    2007-12-01

    The current period of development of the medical service of the Armed Forces of the Russian Federation and of its system of control is characterized by a growing need among officials of control departments for current, reliable, timely and comprehensive information. This information is necessary for the qualitative solution of concrete missions. The growth in the volume and importance of information calls for research into new methods and ways of raising control effectiveness. One way of resolving this problem is the elaboration of a unified informational space of the medical service of the Armed Forces of the RF. Realizing the prospects for improved information acquisition and processing for the control of the medical service makes it possible to create this unified informational space, to realize centralized control, and to solve the missions of building and reforming the system of control and medical supply of the Forces.

  18. THE MODEL OF INTERACTION BETWEEN INSURANCE INTERMEDIARIES AND INSURANCE COMPANIES IN THE ASSURANCE OF SUSTAINABLE DEVELOPMENT OF THE INSURANCE MARKET

    Directory of Open Access Journals (Sweden)

    Nataliia Kudriavska

    2017-11-01

    Full Text Available The purpose of this paper is to investigate the model of interaction between insurance intermediaries and insurance companies in the assurance of sustainable development of the insurance market. The methodology is based on recent studies and books. The paper underlines the importance of the potency and effectiveness of this model and its influence on the stability of the insurance market. It analyses the European experience and the specifics of the Ukrainian insurance market, and characterizes the main ways of improving the model and of its practical realization. Results. The problems that exist in the broker market in general are connected with an ineffective state policy. In particular, many laws, acts and resolutions are absent that would explain what a broker has to do in case of different problems with insurance companies, other brokers and clients. At the same time, there is a problem of distrust of national brokers, which provokes a decline in the demand for their services. However, it is possible to solve these problems. Practical implications. For this, several steps are necessary. The first is to implement resolutions that regulate relationships between insurance brokers and insurance companies and clearly define the model of their interaction; this model affects the stability of the insurance market in general. The second is to find methods of increasing the insurance culture of the population (for example, by way of advertisement). The third is to solve problems connected with the appearance of foreign brokers in the insurance market of Ukraine. The Ukrainian market of insurance brokers is not sufficiently developed, which is why it needs major changes and reforms. Value/originality. Among the alternatives for the strategic development of insurance, the methods of quick liberalization and of gradual development are distinguished. According to the liberal way, it is possible to transfer to the

  19. Software Quality Assurance Plan for GoldSim Models Supporting the Area 3 and Area 5 Radioactive Waste Management Site Performance Assessment Program

    International Nuclear Information System (INIS)

    Gregory J. Shott, Vefa Yucel

    2007-01-01

    This Software Quality Assurance Plan (SQAP) applies to the development and maintenance of GoldSim models supporting the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) performance assessments (PAs) and composite analyses (CAs). Two PA models have been approved by the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) as of November 2006 for the PA maintenance work undertaken by National Security Technologies, LLC (NSTec). NNSA/NSO asked NSTec to assume the custodianship of the models for future development and maintenance. The models were initially developed by Neptune and Company (N and C)

  20. Software Quality Assurance Plan for GoldSim Models Supporting the Area 3 and Area 5 Radioactive Waste Management Sites Performance Assessment Program

    Energy Technology Data Exchange (ETDEWEB)

    Gregory J. Shott, Vefa Yucel

    2007-01-03

    This Software Quality Assurance Plan (SQAP) applies to the development and maintenance of GoldSim models supporting the Area 3 and Area 5 Radioactive Waste Management Sites (RWMSs) performance assessments (PAs) and composite analyses (CAs). Two PA models have been approved by the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) as of November 2006 for the PA maintenance work undertaken by National Security Technologies, LLC (NSTec). NNSA/NSO asked NSTec to assume the custodianship of the models for future development and maintenance. The models were initially developed by Neptune and Company (N&C).

  1. Enterprise Modelling for an Educational Information Infrastructure

    NARCIS (Netherlands)

    Widya, I.A.; Michiels, E.F.; Volman, C.J.A.M.; Pokraev, S.; de Diana, I.P.F.; Filipe, J.; Sharp, B.; Miranda, P.

    2001-01-01

    This paper reports the modelling exercise of an educational information infrastructure that aims to support the organisation of teaching and learning activities suitable for a wide range of didactic policies. The modelling trajectory focuses on capturing invariant structures of relations between

  2. Millennial Students' Mental Models of Information Retrieval

    Science.gov (United States)

    Holman, Lucy

    2009-01-01

    This qualitative study examines first-year college students' online search habits in order to identify patterns in millennials' mental models of information retrieval. The study employed a combination of modified contextual inquiry and concept mapping methodologies to elicit students' mental models. The researcher confirmed previously observed…

  3. Click Model-Based Information Retrieval Metrics

    NARCIS (Netherlands)

    Chuklin, A.; Serdyukov, P.; de Rijke, M.

    2013-01-01

    In recent years many models have been proposed that are aimed at predicting clicks of web search users. In addition, some information retrieval evaluation metrics have been built on top of a user model. In this paper we bring these two directions together and propose a common approach to converting

  4. Bayesian Modeling of Cerebral Information Processing

    OpenAIRE

    Labatut, Vincent; Pastor, Josette

    2001-01-01

    International audience; Modeling explicitly the links between cognitive functions and networks of cerebral areas is necessitated both by the understanding of the clinical outcomes of brain lesions and by the interpretation of activation data provided by functional neuroimaging techniques. At this global level of representation, the human brain can be best modeled by a probabilistic functional causal network. Our modeling approach is based on the anatomical connection pattern, the information ...

  5. Information retrieval models foundations and relationships

    CERN Document Server

    Roelleke, Thomas

    2013-01-01

    Information Retrieval (IR) models are a core component of IR research and IR systems. The past decade brought a consolidation of the family of IR models, which by 2000 consisted of relatively isolated views on TF-IDF (Term-Frequency times Inverse-Document-Frequency) as the weighting scheme in the vector-space model (VSM), the probabilistic relevance framework (PRF), the binary independence retrieval (BIR) model, BM25 (Best-Match Version 25, the main instantiation of the PRF/BIR), and language modelling (LM). Also, the early 2000s saw the arrival of divergence from randomness (DFR).Regarding in

  6. Quality Assurance Tracking System - R7 (QATS-R7)

    Data.gov (United States)

    U.S. Environmental Protection Agency — This is metadata documentation for the Quality Assurance Tracking System - R7, an EPA Region 7 resource that tracks information on quality assurance reviews. Also...

  7. Optimal information diffusion in stochastic block models.

    Science.gov (United States)

    Curato, Gianbiagio; Lillo, Fabrizio

    2016-09-01

    We use the linear threshold model to study the diffusion of information on a network generated by the stochastic block model. We focus our analysis on a two-community structure where the initial set of informed nodes lies only in one of the two communities and we look for optimal network structures, i.e., those maximizing the asymptotic extent of the diffusion. We find that, constraining the mean degree and the fraction of initially informed nodes, the optimal structure can be assortative (modular), core-periphery, or even disassortative. We then look for minimal cost structures, i.e., those for which a minimal fraction of initially informed nodes is needed to trigger a global cascade. We find that the optimal networks are assortative but with a structure very close to a core-periphery graph, i.e., a very dense community linked to a much more sparsely connected periphery.
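
    The setup described, seeds confined to one community of a two-block SBM, can be simulated directly; a minimal sketch with hypothetical parameters, not the paper's actual experimental configuration:

```python
import random

def sbm_two_block(n, p_in, p_out, seed=0):
    """Generate a two-community stochastic block model (n nodes per block)
    as an adjacency list: within-block edges with probability p_in,
    between-block edges with probability p_out."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(2 * n)}
    for i in range(2 * n):
        for j in range(i + 1, 2 * n):
            p = p_in if (i < n) == (j < n) else p_out
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def linear_threshold_cascade(adj, seeds, theta):
    """Linear threshold model: a node activates once the fraction of its
    active neighbours reaches theta; iterate to a fixed point."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, nbrs in adj.items():
            if v not in active and nbrs:
                if sum(1 for u in nbrs if u in active) / len(nbrs) >= theta:
                    active.add(v)
                    changed = True
    return active

# Seeds only in community 0, as in the paper's setup (values hypothetical).
adj = sbm_two_block(n=50, p_in=0.2, p_out=0.05, seed=1)
seeds = range(10)  # initially informed nodes, all in one community
final = linear_threshold_cascade(adj, seeds, theta=0.3)
print(f"diffusion reached {len(final)} of {len(adj)} nodes")
```

    Sweeping p_in against p_out at fixed mean degree is one way to compare assortative, core-periphery and disassortative structures by the asymptotic extent of the cascade.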

  8. A linguistic model of informed consent.

    Science.gov (United States)

    Marta, J

    1996-02-01

    The current disclosure model of informed consent ignores the linguistic complexity of any act of communication, and the increased risk of difficulties in the special circumstances of informed consent. This article explores, through linguistic analysis, the specificity of informed consent as a speech act, a communication act, and a form of dialogue, following on the theories of J.L. Austin, Roman Jakobson, and Mikhail Bakhtin, respectively. In the proposed model, informed consent is a performative speech act resulting from a series of communication acts which together constitute a dialogic, polyphonic, heteroglossial discourse. It is an act of speech that results in action being taken after a conversation has happened where distinct individuals, multiple voices, and multiple perspectives have been respected, and convention observed and recognized. It is more meaningful and more ethical for both patient and physician, in all their human facets including their interconnectedness.

  9. Information technology and innovative drainage management practices for selenium load reduction from irrigated agriculture to provide stakeholder assurances and meet contaminant mass loading policy objectives

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, N.W.T.

    2009-10-15

    Many perceive the implementation of environmental regulatory policy, especially concerning non-point source pollution from irrigated agriculture, as being less efficient in the United States than in many other countries. This is partly a result of the stakeholder involvement process but is also a reflection of the inability to make effective use of Environmental Decision Support Systems (EDSS) to facilitate technical information exchange with stakeholders and to provide a forum for innovative ideas for controlling non-point source pollutant loading. This paper describes one of the success stories where a standardized Environmental Protection Agency (EPA) methodology was modified to better suit regulation of a trace element in agricultural subsurface drainage and information technology was developed to help guide stakeholders, provide assurances to the public and encourage innovation while improving compliance with State water quality objectives. The geographic focus of the paper is the western San Joaquin Valley where, in 1985, evapoconcentration of selenium in agricultural subsurface drainage water, diverted into large ponds within a federal wildlife refuge, caused teratogenecity in waterfowl embryos and in other sensitive wildlife species. The fallout from this environmental disaster was a concerted attempt by State and Federal water agencies to regulate non-point source loads of the trace element selenium. The complexity of selenium hydrogeochemistry, the difficulty and expense of selenium concentration monitoring and political discord between agricultural and environmental interests created challenges to the regulation process. Innovative policy and institutional constructs, supported by environmental monitoring and the web-based data management and dissemination systems, provided essential decision support, created opportunities for adaptive management and ultimately contributed to project success. 
The paper provides a retrospective on the contentious planning

  10. Predictive modelling of evidence informed teaching

    OpenAIRE

    Zhang, Dell; Brown, C.

    2017-01-01

    In this paper, we analyse the questionnaire survey data collected from 79 English primary schools about the situation of evidence informed teaching, where the evidence could come from research journals or conferences. Specifically, we build a predictive model to see what external factors could help to close the gap between teachers' belief and behaviour in evidence informed teaching, which is the first of its kind to our knowledge. The major challenge, from the data mining perspective, is th...

  11. Quality assurance manual: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Oijala, J.E.

    1988-06-01

    This paper contains quality assurance information on departments of the Stanford Linear Accelerator Center. Particular quality assurance policies and standards are discussed for: Mechanical Systems; the Klystron and Microwave Department; the Electronics Department; Plant Engineering; the Accelerator Department; Purchasing; and the Experimental Facilities Department. (LSP)

  12. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  13. Teacher Reaction to ICP Quality Assurance Procedures.

    Science.gov (United States)

    Leonard, Ann

    An integral part of the Quality Assurance Manual developed by Southwest Regional Laboratory (SWRL) to accompany the Kindergarten Program is the end-of-program assessment of the Instructional Concepts Program (ICP). Following completion of ICP Quality Assurance assessment, four teachers were interviewed in order to gather information pertinent to…

  14. Information Assurance in Sensor Networks

    Science.gov (United States)

    2009-09-15

    intrusion detection systems can be classified as host-based IDS, such as Haystack [78] and MIDAS [79], or multi-host-based IDS, such as NIDES [80] and CSM ... mislabeled training data with D_t, get back the sampling data set S_e and slice it into the majority data set e_1 and the minority data set e_2 ... e_2 with d_t, get back a sampling minority data set g_t, where there are m_st data inside. (5) For each example x_i ∈ g_t, find its k_2 nearest

  15. International Planetary Data Alliance (IPDA) Information Model

    Science.gov (United States)

    Hughes, John Steven; Beebe, R.; Guinness, E.; Heather, D.; Huang, M.; Kasaba, Y.; Osuna, P.; Rye, E.; Savorskiy, V.

    2007-01-01

    This document is the third deliverable of the International Planetary Data Alliance (IPDA) Archive Data Standards Requirements Identification project. The goal of the project is to identify a subset of the standards currently in use by NASA's Planetary Data System (PDS) that are appropriate for internationalization. As shown in the highlighted sections of Figure 1, the focus of this project is the Information Model component of the Data Architecture Standards, namely the object models, a data dictionary, and a set of data formats.

  16. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  17. Data Quality Assurance Governance

    OpenAIRE

    Montserrat Gonzalez; Stephanie Suhr

    2016-01-01

    This deliverable describes the ELIXIR-EXCELERATE Quality Management Strategy, addressing EXCELERATE Ethics requirement no. 5 on Data Quality Assurance Governance. The strategy describes the essential procedures and practices within ELIXIR-EXCELERATE concerning planning of quality management, performing quality assurance and controlling quality. It also depicts the overall organisation of ELIXIR with emphasis on authority and specific responsibilities related to quality assurance.

  18. Data Model Management for Space Information Systems

    Science.gov (United States)

    Hughes, J. Steven; Crichton, Daniel J.; Ramirez, Paul; Mattmann, chris

    2006-01-01

    The Reference Architecture for Space Information Management (RASIM) suggests the separation of the data model from software components to promote the development of flexible information management systems. RASIM allows the data model to evolve independently from the software components and results in a robust implementation that remains viable as the domain changes. However, the development and management of data models within RASIM are difficult and time consuming tasks involving the choice of a notation, the capture of the model, its validation for consistency, and the export of the model for implementation. Current limitations to this approach include the lack of ability to capture comprehensive domain knowledge, the loss of significant modeling information during implementation, the lack of model visualization and documentation capabilities, and exports being limited to one or two schema types. The advent of the Semantic Web and its demand for sophisticated data models has addressed this situation by providing a new level of data model management in the form of ontology tools. In this paper we describe the use of a representative ontology tool to capture and manage a data model for a space information system. The resulting ontology is implementation independent. Novel on-line visualization and documentation capabilities are available automatically, and the ability to export to various schemas can be added through tool plug-ins. In addition, the ingestion of data instances into the ontology allows validation of the ontology and results in a domain knowledge base. Semantic browsers are easily configured for the knowledge base. For example the export of the knowledge base to RDF/XML and RDFS/XML and the use of open source metadata browsers provide ready-made user interfaces that support both text- and facet-based search. 
This paper will present the Planetary Data System (PDS) data model as a use case and describe the import of the data model into an ontology tool

  19. Study on geo-information modelling

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana

    2006-01-01

    Roč. 5, č. 5 (2006), s. 1108-1113 ISSN 1109-2777 Institutional research plan: CEZ:AV0Z10750506 Keywords: control GIS * geo-information modelling * uncertainty * spatial temporal approach * Web Services Subject RIV: BC - Control Systems Theory

  20. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives...

  1. Technical Note: Validation of halo modeling for proton pencil beam spot scanning using a quality assurance test pattern

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Liyong, E-mail: linl@uphs.upenn.edu; Huang, Sheng; Kang, Minglei; Solberg, Timothy D.; McDonough, James E.; Ainsley, Christopher G. [Department of Radiation Oncology, University of Pennsylvania, 3400 Civic Center Boulevard, Philadelphia, Pennsylvania 19104 (United States)

    2015-09-15

    Purpose: The purpose of this paper is to demonstrate the utility of a comprehensive test pattern in validating calculation models that include the halo component (low-dose tails) of proton pencil beam scanning (PBS) spots. Such a pattern has been used previously for quality assurance purposes to assess spot shape, position, and dose. Methods: In this study, a scintillation detector was used to measure the test pattern in air at isocenter for two proton beam energies (115 and 225 MeV) of two IBA universal nozzles (UN #1 and UN #2). Planar measurements were compared with calculated dose distributions based on the weighted superposition of location-independent (UN #1) or location-dependent (UN #2) spot profiles, previously measured using a pair-magnification method and between two nozzles. Results: Including the halo component below 1% of the central dose is shown to improve the gamma-map comparison between calculation and measurement from 94.9% to 98.4% using 2 mm/2% criteria for the 115 MeV proton beam of UN #1. In contrast, including the halo component below 1% of the central dose does not improve the gamma agreement for the 115 MeV proton beam of UN #2, due to the cutoff of the halo component at off-axis locations. When location-dependent spot profiles are used for calculation instead of spot profiles at central axis, the gamma agreement is improved from 98.0% to 99.5% using 2 mm/2% criteria. The two nozzles clearly have different characteristics, as a direct comparison of measured data shows a passing rate of 89.7% for the 115 MeV proton beam. At 225 MeV, the corresponding gamma comparisons agree better between measurement and calculation, and between measurements in the two nozzles. Conclusions: In addition to confirming the primary component of individual PBS spot profiles, a comprehensive test pattern is useful for the validation of the halo component at off-axis locations, especially for low energy protons.
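
    The 2 mm/2% gamma comparison quoted above can be illustrated with a simplified 1-D global gamma computation; a sketch over hypothetical profile data, not the scintillation-detector planes used in the study:

```python
import math

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm, dd_pct):
    """Simplified 1-D global gamma: for each reference point, minimise the
    combined distance-to-agreement / dose-difference metric over the
    evaluated profile. Dose difference is normalised to the reference max."""
    d_max = max(ref_dose)
    gammas = []
    for xr, dr in zip(ref_pos, ref_dose):
        g = min(
            math.sqrt(((xe - xr) / dta_mm) ** 2 +
                      ((de - dr) / (dd_pct / 100.0 * d_max)) ** 2)
            for xe, de in zip(eval_pos, eval_dose)
        )
        gammas.append(g)
    return gammas

def passing_rate(gammas):
    """Percentage of reference points with gamma <= 1."""
    return 100.0 * sum(1 for g in gammas if g <= 1.0) / len(gammas)

# Hypothetical measured vs calculated profile, 2 mm / 2% criteria.
pos = [0.0, 2.0, 4.0, 6.0, 8.0]          # mm
measured   = [10.0, 50.0, 100.0, 50.0, 10.0]
calculated = [10.5, 49.0, 99.0, 51.5, 10.2]
g = gamma_index(pos, measured, pos, calculated, dta_mm=2.0, dd_pct=2.0)
print(f"gamma passing rate: {passing_rate(g):.1f}%")
```

    Including the low-dose halo in the calculated profile lowers the dose-difference term in exactly the tail regions where the comparisons above improved.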

  2. Asset Condition, Information Systems and Decision Models

    CERN Document Server

    Willett, Roger; Brown, Kerry; Mathew, Joseph

    2012-01-01

    Asset Condition, Information Systems and Decision Models, is the second volume of the Engineering Asset Management Review Series. The manuscripts provide examples of implementations of asset information systems as well as some practical applications of condition data for diagnostics and prognostics. The increasing trend is towards prognostics rather than diagnostics, hence the need for assessment and decision models that promote the conversion of condition data into prognostic information to improve life-cycle planning for engineered assets. The research papers included here serve to support the on-going development of Condition Monitoring standards. This volume comprises selected papers from the 1st, 2nd, and 3rd World Congresses on Engineering Asset Management, which were convened under the auspices of ISEAM in collaboration with a number of organisations, including CIEAM Australia, Asset Management Council Australia, BINDT UK, and Chinese Academy of Sciences, Beijing University of Chemical Technology, Chin...

  3. Concepts of nuclear quality assurance

    International Nuclear Information System (INIS)

    Randers, G.; Morris, P.A.; Pomeroy, D.

    1976-01-01

    While the safety record of the nuclear industry continues to be excellent, the forced outage rates for recent years continue to be 15% or more. Quality assurance, therefore, needs to be applied not only to nuclear safety matters, but to the goals of increased productivity and reduced construction and operating costs. Broadening the application of the general concept of quality assurance in this way leads to the introduction of reliability technology. The total activity might better be called reliability assurance. That effective quality assurance systems do pay off is described by examples from the utility industry, from a manufacturer of instruments and systems and from the experience of Westinghouse Electric Company's manufacturing divisions. The special situation of applying quality assurance to nuclear fuel is discussed. Problems include the lack of a fully developed regulatory policy in this area, incomplete understanding of the mechanism for pellet-clad interaction failures, incomplete access to manufacturers design and process information, inability to make desirable changes on a timely basis and inadequate feedback of irradiation experience. (author)

  4. Modeling decisions information fusion and aggregation operators

    CERN Document Server

    Torra, Vicenc

    2007-01-01

    Information fusion techniques and aggregation operators produce the most comprehensive, specific datum about an entity using data supplied from different sources, thus enabling us to reduce noise, increase accuracy, summarize and extract information, and make decisions. These techniques are applied in fields such as economics, biology and education, while in computer science they are particularly used in fields such as knowledge-based systems, robotics, and data mining. This book covers the underlying science and application issues related to aggregation operators, focusing on tools used in practical applications that involve numerical information. Starting with detailed introductions to information fusion and integration, measurement and probability theory, fuzzy sets, and functional equations, the authors then cover the following topics in detail: synthesis of judgements, fuzzy measures, weighted means and fuzzy integrals, indices and evaluation methods, model selection, and parameter extraction. The method...
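
    Two of the aggregation operators the book covers, the weighted mean and ordered weighted averaging (OWA), can be sketched in a few lines; the readings and weights below are hypothetical:

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: weights attach importance to sources."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, values))

def owa(values, weights):
    """Ordered weighted averaging (OWA): weights attach importance to ranks
    of the sorted values, so the same operator family covers min, max,
    median and trimmed means."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

readings = [0.7, 0.9, 0.2]                       # hypothetical sensor readings to fuse
print(weighted_mean(readings, [0.5, 0.3, 0.2]))  # source-weighted fusion
print(owa(readings, [0.0, 1.0, 0.0]))            # OWA median of three
```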

  5. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list were one day established, any tool that did not meet the essential criteria would be excluded from it. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement to enable the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Ontological modeling of electronic health information exchange.

    Science.gov (United States)

    McMurray, J; Zhu, L; McKillop, I; Chen, H

    2015-08-01

    Investments of resources to purposively improve the movement of information between health system providers are currently made with imperfect information. No inventories of system-level electronic health information flows currently exist, nor do measures of inter-organizational electronic information exchange. Using Protégé 4, an open-source OWL Web ontology language editor and knowledge-based framework, we formalized a model that decomposes inter-organizational electronic health information flow into derivative concepts such as diversity, breadth, volume, structure, standardization and connectivity. The ontology was populated with data from a regional health system and the flows were measured. Individual instances' properties were inferred from their class associations as determined by their data and object property rules. It was also possible to visualize interoperability activity for regional analysis and planning purposes. A property called Impact was created from the total number of patients or clients that a health entity in the region served in a year, and the total number of health service providers or organizations with whom it exchanged information in support of clinical decision-making, diagnosis or treatment. Identifying providers with a high Impact but low Interoperability score could assist planners and policy-makers to optimize technology investments intended to electronically share patient information across the continuum of care. Finally, we demonstrated how linked ontologies were used to identify logical inconsistencies in self-reported data for the study. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Regulatory inspection of the implementation of quality assurance programmes

    International Nuclear Information System (INIS)

    1989-01-01

    This Manual provides guidance to Member States in the organization and performance of their regulatory inspection functions regarding the implementation of nuclear power plant quality assurance programmes. It addresses the interface between, and is consistent with, the IAEA Nuclear Safety Standards (NUSS programme) documents on quality assurance and governmental organization. The Manual offers a practical model and examples for performing regulatory inspections to ensure that the quality assurance programme is operating satisfactorily in the siting, design, manufacturing, construction, commissioning, operation and decommissioning of nuclear power plants. The primary objective is to confirm that the licensee has the capability to manage and control the effective performance of all quality assurance responsibilities during all phases of a nuclear power project. The guidance provided through this Manual for proper establishment and execution of the regulatory inspections helps to enforce the effective implementation of the quality assurance programme as a management control system that the nuclear industry should establish and use in attaining the safety and reliability objectives for nuclear installations. This enforcement action by national regulatory bodies and the emphasis on the purposes and advantages of quality assurance as an important management tool integrated within the total project task have been recommended by the IAEA International Nuclear Safety Advisory Group (INSAG). The primary intended users of this Manual are the management personnel and high level staff from regulatory bodies but it will also be helpful to management personnel from nuclear utilities and vendors. They all are inevitable partners in a nuclear power project and this document offers all of them valuable information on the better accomplishment of quality assurance activities to ensure the common objective of safe and reliable nuclear power production

  8. Health equity monitoring for healthcare quality assurance.

    Science.gov (United States)

    Cookson, R; Asaria, M; Ali, S; Shaw, R; Doran, T; Goldblatt, P

    2018-02-01

    Population-wide health equity monitoring remains isolated from mainstream healthcare quality assurance. As a result, healthcare organizations remain ill-informed about the health equity impacts of their decisions - despite becoming increasingly well-informed about quality of care for the average patient. We present a new and improved analytical approach to integrating health equity into mainstream healthcare quality assurance, illustrate how this approach has been applied in the English National Health Service, and discuss how it could be applied in other countries. We illustrate the approach using a key quality indicator that is widely used to assess how well healthcare is co-ordinated between primary, community and acute settings: emergency inpatient hospital admissions for ambulatory care sensitive chronic conditions ("potentially avoidable emergency admissions", for short). Whole-population data for 2015 on potentially avoidable emergency admissions in England were linked with neighborhood deprivation indices. Inequality within the populations served by 209 clinical commissioning groups (CCGs: care purchasing organizations with mean population 272,000) was compared against two benchmarks - national inequality and inequality within ten similar populations - using neighborhood-level models to simulate the gap in indirectly standardized admissions between most and least deprived neighborhoods. The modelled inequality gap for England was 927 potentially avoidable emergency admissions per 100,000 people, implying 263,894 excess hospitalizations associated with inequality. Against this national benchmark, 17% of CCGs had significantly worse-than-benchmark equity, and 23% significantly better. The corresponding figures were 11% and 12% respectively against the similar populations benchmark. Deprivation-related inequality in potentially avoidable emergency admissions varies substantially between English CCGs serving similar populations, beyond expected statistical variation.
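The gap reported above rests on indirect standardization: observed admissions are compared with the admissions expected if national age-specific rates applied locally. A minimal sketch of that calculation (all rates, populations and admission counts below are invented for illustration; the paper's neighbourhood-level simulation models are not reproduced):

```python
# National age-specific admission rates (per person-year) - illustrative numbers
national_rate = {"0-44": 0.004, "45-64": 0.012, "65+": 0.045}

def expected_admissions(pop_by_age):
    """Admissions expected if national age-specific rates applied locally."""
    return sum(n * national_rate[band] for band, n in pop_by_age.items())

def standardised_per_100k(observed, pop_by_age):
    """Indirectly standardised admissions per 100,000 population."""
    total_pop = sum(pop_by_age.values())
    crude_national = expected_admissions(pop_by_age) / total_pop
    return observed / expected_admissions(pop_by_age) * crude_national * 100_000

# Two neighbourhoods with identical age structure but different deprivation
pop = {"0-44": 6000, "45-64": 2500, "65+": 1500}
most_deprived_obs, least_deprived_obs = 160, 95   # observed yearly admissions

gap = (standardised_per_100k(most_deprived_obs, pop)
       - standardised_per_100k(least_deprived_obs, pop))
print(round(gap))  # → 650 excess admissions per 100,000 people
```

With a shared age structure the ratio of observed to expected admissions drives the whole gap, which is why deprivation-related differences show up so directly in this indicator.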

  9. Modeling Information-Seeking Dialogues: The Conversational Roles (COR) Model.

    Science.gov (United States)

    Sitter, Stefan; Stein, Adelheit

    1996-01-01

    Introduces a generic, application-independent model of human-computer information-seeking dialog, the Conversational Roles (COR) Model, and reviews the theoretical background. COR is represented as a recursive state-transition-network that determines legitimate types and possible sequences of dialog acts, and categorizes dialog acts on the basis…

  10. Acceptance model of a Hospital Information System.

    Science.gov (United States)

    Handayani, P W; Hidayanto, A N; Pinem, A A; Hapsari, I C; Sandhyaduhita, P I; Budi, I

    2017-03-01

    The purpose of this study is to develop a model of Hospital Information System (HIS) user acceptance focusing on human, technological, and organizational characteristics for supporting government eHealth programs. This model was then tested to see which hospital type in Indonesia would benefit from the model to resolve problems related to HIS user acceptance. This study used qualitative and quantitative approaches with case studies at four privately owned hospitals and three government-owned hospitals, which are general hospitals in Indonesia. The respondents involved in this study are low-level and mid-level hospital management officers, doctors, nurses, and administrative staff who work at medical record, inpatient, outpatient, emergency, pharmacy, and information technology units. Data was processed using Structural Equation Modeling (SEM) and AMOS 21.0. The study concludes that non-technological factors, such as human characteristics (i.e. compatibility, information security expectancy, and self-efficacy) and organizational characteristics (i.e. management support, facilitating conditions, and user involvement), which have a level of significance of p < 0.05, significantly influence HIS user acceptance; hospitals should therefore weigh these alongside technological factors to better plan for HIS implementation. Support from management is critical to the sustainability of HIS implementation to ensure HIS is easy to use and provides benefits to the users as well as hospitals. Finally, this study could assist hospital management and IT developers, as well as researchers, to understand the obstacles faced by hospitals in implementing HIS. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. The application of quality assurance

    International Nuclear Information System (INIS)

    Lovatt, G.B.

    1988-01-01

    The paper concerns the application of quality assurance to structures, systems and components for the design, construction and operation of nuclear power plant and fuel reprocessing plant. A description is given of:- the requirements for quality assurance, the establishment of quality assurance arrangements, quality assurance documents structure, and quality assurance manuals and programmes. Quality assurance procedures and auditing are also discussed. (U.K.)

  12. Quality assurance of nuclear energy

    International Nuclear Information System (INIS)

    1994-12-01

    It consists of 14 chapters, covering: an outline of quality assurance of nuclear energy, standards of quality assurance, business quality assurance, design quality assurance, purchase quality assurance, production quality assurance, test warranty, operation warranty, maintenance warranty, nuclear power fuel manufacture warranty, computer software warranty, research and development warranty, and quality audit.

  13. [Quality assurance in human genetic testing].

    Science.gov (United States)

    Stuhrmann-Spangenberg, Manfred

    2015-02-01

    Advances in technical developments of genetic diagnostics for more than 50 years, as well as the fact that human genetic testing is usually performed only once in a lifetime, with additional impact for blood relatives, determine the extraordinary importance of quality assurance in human genetic testing. Abidance by laws, directives, and guidelines plays a major role. This article aims to present the major laws, directives, and guidelines with respect to quality assurance of human genetic testing, paying careful attention to internal and external quality assurance. The information on quality assurance of human genetic testing was obtained through a web-based search of the web pages that are referred to in this article. Further information was retrieved from publications of the German Society of Human Genetics and through a PubMed search using the terms quality + assurance + genetic + diagnostics. The most important laws, directives, and guidelines for quality assurance of human genetic testing are the gene diagnostics law (GenDG), the directive of the Federal Medical Council for quality control of clinical laboratory analysis (RiliBÄK), and the S2K guideline for human genetic diagnostics and counselling. In addition, voluntary accreditation under DIN EN ISO 15189:2013 offers a highly recommended contribution towards quality assurance of human genetic testing. Legal restraints on quality assurance of human genetic testing as mentioned in § 5 GenDG are fulfilled once RiliBÄK requirements are followed.

  14. Information model for learning nursing terminology.

    Science.gov (United States)

    Nytun, Jan Pettersen; Fossum, Mariann

    2014-01-01

    Standardized terminologies are introduced in healthcare with the intention of improving information quality, which is important for enhancing the quality of healthcare itself. The International Classification for Nursing Practice (ICNP®) is a unified language system that presents an ontology for nursing terminology; it is meant for documentation of nursing diagnoses, nursing interventions and patient outcomes. This paper presents an information model and an application for teaching nursing students how to use ICNP to assist in the planning of nursing care. The model is an integration of ICNP and our catalog ontology, patient journal ontology, and ontology defining task sets. The application for learning nursing terminology offers descriptions of patient situations and then prompts the student to supply nursing statements for diagnoses, goals and interventions. The nursing statements may be selected from catalogues containing premade solutions based on ICNP, or they may be constructed directly by selecting terms from ICNP.

  15. Mutual information in the Tangled Nature Model

    DEFF Research Database (Denmark)

    Jones, Dominic; Sibani, Paolo

    2010-01-01

    We consider the concept of mutual information in ecological networks, and use this idea to analyse the Tangled Nature model of co-evolution. We show that this measure of correlation has two distinct behaviours depending on how we define the network in question: if we consider only the network of viable species this measure increases, whereas for the whole system it decreases. It is suggested that these are complementary behaviours that show how ecosystems can become both more stable and better adapted.
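The correlation measure involved is the standard information-theoretic one. A minimal sketch, computed here from discrete presence/absence time series (the occupancy encoding and the data are invented for illustration; this is the textbook definition of mutual information, not the Tangled Nature simulation itself):

```python
import math

def mutual_information(xs, ys):
    """Mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = {}, {}, {}
    for x, y in zip(xs, ys):
        px[x] = px.get(x, 0) + 1
        py[y] = py.get(y, 0) + 1
        pxy[(x, y)] = pxy.get((x, y), 0) + 1
    # I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    return sum(
        (c / n) * math.log2(c * n / (px[x] * py[y]))
        for (x, y), c in pxy.items()
    )

# Toy occupancy time series (1 = species present in a sample)
a = [1, 1, 0, 0, 1, 1, 0, 0]
b = [1, 1, 0, 0, 1, 1, 0, 0]   # perfectly correlated with a
c = [1, 0, 1, 0, 1, 0, 1, 0]   # statistically independent of a
print(mutual_information(a, b))  # → 1.0
print(mutual_information(a, c))  # → 0.0
```

Averaging such pairwise values over the edges of a species-interaction network gives one way to track whether correlations across the ecosystem are growing or shrinking over time.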

  16. Building Information Modelling in Denmark and Iceland

    DEFF Research Database (Denmark)

    Jensen, Per Anker; Jóhannesson, Elvar Ingi

    2013-01-01

    Purpose – The purpose of this paper is to explore the implementation of building information modelling (BIM) in the Nordic countries of Europe with particular focus on the Danish building industry with the aim of making use of its experience for the Icelandic building industry. Design/methodology/approach – The research is based on two separate analyses. In the first part, the deployment of information and communication technology (ICT) in the Icelandic building industry is investigated and compared with the other Nordic countries. In the second part the experience in Denmark from implementing and working ... for making standards and guidelines related to BIM. Public building clients are also encouraged to consider initiating projects based on making simple building models of existing buildings in order to introduce the BIM technology to the industry. Icelandic companies are recommended to start implementing BIM...

  17. Building an environment model using depth information

    Science.gov (United States)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robot and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or in the case of telerobots serve as interface between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with the changes in the environment and to allow exploring entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine and update or generate a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes: Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results show great promise for dealing with noisy input data. The performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
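The voxel bookkeeping described above can be sketched compactly: each range reading carves free space along the sensor ray and marks the hit cell as occupied. A minimal sketch (the class name, the fixed one-voxel ray step, and the update rule are assumptions for illustration, not the paper's algorithm):

```python
import numpy as np

UNKNOWN, VOID, FULL = 0, 1, 2

class EnvironmentModel:
    """Cubic-voxel grid with attributes Unknown/Void/Full, updated from range data."""
    def __init__(self, shape, voxel_size):
        self.grid = np.full(shape, UNKNOWN, dtype=np.uint8)
        self.voxel_size = voxel_size

    def integrate_ray(self, origin, direction, measured_range):
        """Mark voxels along the ray Void up to the sensed hit; the hit voxel Full."""
        direction = np.asarray(direction, float)
        direction /= np.linalg.norm(direction)
        steps = round(measured_range / self.voxel_size)
        for i in range(steps + 1):
            p = np.asarray(origin, float) + i * self.voxel_size * direction
            idx = tuple(int(c / self.voxel_size) for c in p)
            if any(j < 0 or j >= s for j, s in zip(idx, self.grid.shape)):
                return  # ray left the modelled volume
            self.grid[idx] = FULL if i == steps else VOID

model = EnvironmentModel(shape=(20, 20, 20), voxel_size=0.1)
# A range reading of 0.5 m straight ahead along +x from near the origin
model.integrate_ray(origin=(0.05, 0.05, 0.05), direction=(1, 0, 0), measured_range=0.5)
print((model.grid == VOID).sum(), (model.grid == FULL).sum())  # → 5 1
```

Repeating the update from several sensing positions refines the model: Unknown cells shrink as rays sweep through them, which is the refinement behaviour the abstract describes.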

  18. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction cases employing time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database are presented. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  19. Geographical information modelling for land resource survey

    OpenAIRE

    Bruin, de, S.

    2000-01-01

    The increasing popularity of geographical information systems (GIS) has at least three major implications for land resources survey. Firstly, GIS allows alternative and richer representation of spatial phenomena than is possible with the traditional paper map. Secondly, digital technology has improved the accessibility of ancillary data, such as digital elevation models and remotely sensed imagery, and the possibilities of incorporating these into target database production. Thirdly, owing to...

  20. Formal Information Model for Representing Production Resources

    OpenAIRE

    Siltala, Niko; Järvenpää, Eeva; Lanz, Minna

    2017-01-01

    Part 2: Intelligent Manufacturing Systems; International audience; This paper introduces a concept and associated descriptions to formally describe physical production resources for modular and reconfigurable production systems. These descriptions are source of formal information for (automatic) production system design and (re-)configuration. They can be further utilized during the system deployment and execution. The proposed concept and the underlying formal resource description model is c...

  1. [Quality assurance in colorectal cancer in Europe AD 2011].

    Science.gov (United States)

    Mroczkowski, P; Hac, S; Lippert, H; Kube, R

    2013-12-01

    Malignant tumours are the second largest cause of death in Europe. Colorectal cancer takes second place within this group and is responsible for every eighth tumour-related death. Surgical quality assurance requires a prospective observational study; no other study design is feasible. A complete recording of all treated patients is a prerequisite for quality assurance. Currently, there are quality assurance programmes in Sweden, Norway, Denmark, Great Britain, Spain, Belgium and the Netherlands, as well as the multinational study for patients from Germany, Poland and Italy. These projects deliver comprehensive information regarding the treatment of colorectal cancer. However, this information is deeply rooted in the organisation of the health-care system in the given country and is not easily transferable into international settings. Also, an interpretation of the collected data is often possible only within the given health-care system. First, unified initial diagnostics - covering the local extent of disease and the exclusion or confirmation of distant metastases - is a prerequisite for quality assurance. Until these criteria are unified, any comparison is limited, including a comparison of survival. Second, quality of life is not recorded in any of the current projects. Third, the main focus of a quality assurance project must be on therapy-dependent factors. The most sensible method of quality control remains the connection of preoperative diagnostics (estimate of a best-case scenario), the surgical technique (the actual result) and a standardised pathological examination (evaluation of the actual result). These parameters can be recorded and compared within a quality assurance project regardless of the limitations of the national health-care systems. There is no alternative to a unified diagnostics model and a unified histopathological evaluation; a complete picture of treatment quality is likewise not possible without systematic analysis of quality of life.

  2. Quality Assurance Project Plan Development Tool

    Science.gov (United States)

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  3. INFORMATIONAL MODEL OF MENTAL ROTATION OF FIGURES

    Directory of Open Access Journals (Sweden)

    V. A. Lyakhovetskiy

    2016-01-01

    Full Text Available Subject of Study. The subject of research is the information structure of objects' internal representations and operations over them, used by man to solve the problem of mental rotation of figures. To analyze this informational structure we considered not only classical dependencies of the correct answers on the angle of rotation, but also the other dependencies obtained recently in cognitive psychology. Method. The language of technical computing Matlab R2010b was used for developing the information model of the mental rotation of figures. Such model parameters as the number of bits in the internal representation, an error probability in a single bit, discrete rotation angle, comparison threshold, and the degree of difference during rotation can be changed. Main Results. The model reproduces qualitatively such psychological dependencies as the linear increase of time of correct answers and the number of errors with the angle of rotation for identical figures, and the "flat" dependence of the time of correct answers and the number of errors on the angle of rotation for mirror-like figures. The simulation results suggest that mental rotation is an iterative process of finding a match between the two figures, each step of which can lead to a significant distortion of the internal representation of the stored objects. Matching is carried out within the internal representations that have no high invariance to rotation angle. Practical Significance. The results may be useful for understanding the role of learning (including learning with a teacher) in the development of effective information representation and operations on them in artificial intelligence systems.
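The iterative-matching interpretation can be sketched as follows: rotate a noisy internal representation step by step, comparing it against the probe figure after each step. The bit-vector encoding, the cyclic shift standing in for rotation, and all parameter values are invented for illustration; this is not the authors' Matlab model:

```python
import random

def mentally_rotate(figure, probe, step_deg=45, flip_p=0.02, threshold=0.9, seed=1):
    """Iteratively rotate an internal bit-vector representation of `figure` until
    it matches `probe`, or a full turn has been tried. Each rotation step may
    corrupt bits, modelling a noisy internal representation.
    Returns (matched, number_of_rotation_steps)."""
    rng = random.Random(seed)
    current = list(figure)
    for step in range(360 // step_deg + 1):
        overlap = sum(a == b for a, b in zip(current, probe)) / len(probe)
        if overlap >= threshold:
            return True, step
        # one discrete rotation step (a cyclic shift stands in for rotation)
        current = current[-1:] + current[:-1]
        # distortion: each bit may flip with probability flip_p
        current = [b ^ 1 if rng.random() < flip_p else b for b in current]
    return False, step

figure = [1, 0, 1, 1, 0, 0, 1, 0]
probe = figure[-2:] + figure[:-2]                   # same figure rotated two steps
print(mentally_rotate(figure, probe, flip_p=0.0))   # → (True, 2) with noise off
```

With `flip_p > 0`, larger rotations take more steps and accumulate more distortion, qualitatively reproducing the linear growth of both response time and error rate with rotation angle that the abstract reports.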

  4. Summarization of clinical information: a conceptual model.

    Science.gov (United States)

    Feblowitz, Joshua C; Wright, Adam; Singh, Hardeep; Samal, Lipika; Sittig, Dean F

    2011-08-01

    To provide high-quality and safe care, clinicians must be able to optimally collect, distill, and interpret patient information. Despite advances in text summarization, only limited research exists on clinical summarization, the complex and heterogeneous process of gathering, organizing and presenting patient data in various forms. Our objective was to develop a conceptual model for describing and understanding clinical summarization in both computer-independent and computer-supported clinical tasks. Based on extensive literature review and clinical input, we developed a conceptual model of clinical summarization to lay the foundation for future research on clinician workflow and automated summarization using electronic health records (EHRs). Our model identifies five distinct stages of clinical summarization: (1) Aggregation, (2) Organization, (3) Reduction and/or Transformation, (4) Interpretation and (5) Synthesis (AORTIS). The AORTIS model describes the creation of complex, task-specific clinical summaries and provides a framework for clinical workflow analysis and directed research on test results review, clinical documentation and medical decision-making. We describe a hypothetical case study to illustrate the application of this model in the primary care setting. Both practicing physicians and clinical informaticians need a structured method of developing, studying and evaluating clinical summaries in support of a wide range of clinical tasks. Our proposed model of clinical summarization provides a potential pathway to advance knowledge in this area and highlights directions for further research. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Revitalizing quality assurance

    International Nuclear Information System (INIS)

    Hawkins, F.C.

    1998-01-01

    The image of someone inspecting or auditing often comes to mind when people hear the term quality assurance. Although partially correct, this image is not the complete picture. The person doing the inspecting or auditing is probably part of a traditional quality assurance organization, but that organization is only one aspect of a properly conceived and effectively implemented quality assurance system whose goal is improved facility safety and reliability. This paper introduces the underlying philosophies and basic concepts of the International Atomic Energy Agency's new quality assurance initiative that began in 1991 as part of a broad Agency-wide program to enhance nuclear safety. The first product of that initiative was publication in 1996 of a new Quality Assurance Code 50-C/SG-Q and fourteen related Safety Guides. This new suite of documents provide the technical and philosophical foundation upon which Member States can base their quality assurance programs. (author)

  6. Investigating accident causation through information network modelling.

    Science.gov (United States)

    Griffin, T G C; Young, M S; Stanton, N A

    2010-02-01

    Management of risk in complex domains such as aviation relies heavily on post-event investigations, requiring complex approaches to fully understand the integration of multi-causal, multi-agent and multi-linear accident sequences. The Event Analysis of Systemic Teamwork methodology (EAST; Stanton et al. 2008) offers such an approach based on network models. In this paper, we apply EAST to a well-known aviation accident case study, highlighting communication between agents as a central theme and investigating the potential for finding agents who were key to the accident. Ultimately, this work aims to develop a new model based on distributed situation awareness (DSA) to demonstrate that the risk inherent in a complex system is dependent on the information flowing within it. By identifying key agents and information elements, we can propose proactive design strategies to optimize the flow of information and help work towards avoiding aviation accidents. Statement of Relevance: This paper introduces a novel application of an holistic methodology for understanding aviation accidents. Furthermore, it introduces an ongoing project developing a nonlinear and prospective method that centralises distributed situation awareness and communication as themes. The relevance of the findings is discussed in the context of current ergonomic and aviation issues of design, training and human-system interaction.

  7. Building Information Modelling for Smart Built Environments

    Directory of Open Access Journals (Sweden)

    Jianchao Zhang

    2015-01-01

    Full Text Available Building information modelling (BIM) provides architectural 3D visualization and a standardized way to share and exchange building information. Recently, there has been an increasing interest in using BIM, not only for design and construction, but also the post-construction management of the built facility. With the emergence of smart built environment (SBE) technology, which embeds most spaces with smart objects to enhance the building's efficiency, security and the comfort of its occupants, there is a need to understand and address the challenges BIM faces in the design, construction and management of future smart buildings. In this paper, we investigate how BIM can contribute to the development of SBE. Since BIM is designed to host information of the building throughout its life cycle, our investigation has covered phases from architecture design to facility management. Firstly, we extend BIM for the design phase to provide material/device profiling and the information exchange interface for various smart objects. Next, we propose a three-layer verification framework to assist BIM users in identifying possible defects in their SBE design. For the post-construction phase, we have designed a facility management tool to provide advanced energy management of smart grid-connected SBEs, where smart objects, as well as distributed energy resources (DERs), are deployed.

  8. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
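The ±2%/2 mm gamma criterion used in these comparisons combines a dose-difference tolerance with a distance-to-agreement tolerance: a point passes if some nearby reference point is close in both dose and position. A simplified 1-D, globally normalised sketch (the grid, the toy depth-dose curves and the ±1% error are illustrative only):

```python
import math

def gamma_pass_rate(ref, meas, positions, dose_tol=0.02, dist_tol=2.0):
    """1-D gamma analysis with global dose normalisation.
    ref/meas: dose arrays sampled on the same `positions` grid (mm).
    Returns the fraction of measured points with gamma <= 1."""
    d_max = max(ref)
    passed = 0
    for x_m, d_m in zip(positions, meas):
        # gamma^2 = min over reference points of (dose term)^2 + (distance term)^2
        gamma_sq = min(
            ((d_m - d_r) / (dose_tol * d_max)) ** 2 + ((x_m - x_r) / dist_tol) ** 2
            for x_r, d_r in zip(positions, ref)
        )
        passed += gamma_sq <= 1.0
    return passed / len(meas)

# Toy depth-dose curves on a 1 mm grid; measurement has a uniform +1% dose error
positions = [float(x) for x in range(50)]
ref = [100.0 * math.exp(-0.03 * x) for x in positions]
meas = [d * 1.01 for d in ref]
print(gamma_pass_rate(ref, meas, positions))  # → 1.0 (all points pass 2%/2 mm)
```

Clinical gamma tools search in 2-D or 3-D with sub-voxel interpolation, but the pass/fail logic is the same composite of dose difference and distance to agreement shown here.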

  9. Relevance of information warfare models to critical infrastructure ...

    African Journals Online (AJOL)

    This article illustrates the relevance of information warfare models to critical infrastructure protection. Analogies between information warfare models and those of information security and information systems were used to deconstruct the models into their fundamental components, which are discussed. The models were applied ...

  10. Modelling the Replication Management in Information Systems

    Directory of Open Access Journals (Sweden)

    Cezar TOADER

    2017-01-01

    In the modern economy, the benefits of Web services are significant because they facilitate the automation of activities in Internet-distributed businesses as well as cooperation between organizations through the interconnection of processes running in their computer systems. This paper presents the development stages of a model for a reliable information system. It describes communication between processes within a distributed system, based on message exchange, and also presents the problem of distributed agreement among processes. A list of objectives for fault-tolerant systems is defined and a framework model for distributed systems is proposed. This framework distinguishes between management operations and execution operations. The proposed model promotes the use of a central process specifically designed for the coordination and control of the other application processes. The execution phases and the protocols for the management and execution components are presented. This model of a reliable system could be a foundation for an entire class of distributed system models based on the management of the replication process.

  11. Sustainability Product Properties in Building Information Models

    Science.gov (United States)

    2012-09-01

    [Abstract garbled in extraction; recoverable fragments describe sustainability property sets for building information models, e.g. Pset_Material_Sustainability_US with a ThermalResistance property for an element in hr·ft²·°F/Btu (K·m²/W), sustainable information properties associated with fixtures (such as a "Private 1.6 LPF" toilet fixture) viewable in a model checker, and an energy performance requirement of a minimum of 30% better than ASHRAE 90.1-2004, with energy efficiency among the key conservation strategies.]

  12. A focus on building information modelling.

    Science.gov (United States)

    Ryan, Alison

    2014-03-01

    With the Government Construction Strategy requiring a strengthening of the public sector's capability to implement Building Information Modelling (BIM) protocols, the goal being that all central government departments will adopt, as a minimum, collaborative Level 2 BIM by 2016, Alison Ryan of consulting engineers DSSR explains the principles behind BIM, its history and evolution, and some of the considerable benefits it can offer. These include lowering capital project costs through enhanced co-ordination, cutting carbon emissions, and the ability to manage facilities more efficiently.

  13. Microbial Performance of Food Safety Control and Assurance Activities in a Fresh Produce Processing Sector Measured Using a Microbial Assessment Scheme and Statistical Modeling

    DEFF Research Database (Denmark)

    Njage, Patrick Murigu Kamau; Sawe, Chemutai Tonui; Onyango, Cecilia Moraa

    2017-01-01

    Current approaches such as inspections, audits, and end product testing cannot detect the distribution and dynamics of microbial contamination. Despite the implementation of current food safety management systems, foodborne outbreaks linked to fresh produce continue to be reported. A microbial assessment scheme and statistical modeling were used to systematically assess the microbial performance of core control and assurance activities in five Kenyan fresh produce processing and export companies. Generalized linear mixed models and correlated random-effects joint models for multivariate clustered data, followed by empirical Bayes estimates, enabled analysis of the probability of contamination across critical sampling locations (CSLs) and factories as a random effect. Salmonella spp. and Listeria monocytogenes were not detected in the final products. However, none of the processors attained ...

  14. Norwegian Corporate Accounts : Documentation and quality assurance of SNF’s and NHH’s database of accounting and company information for Norwegian companies

    OpenAIRE

    Berner, Endre; Mjøs, Aksel; Olving, Marius

    2014-01-01

    This working paper describes the database used by the Institute for Research in Economics (SNF) and the Norwegian School of Economics (NHH) in research based on companies' accounts. The objective of this working paper and the pertaining data files with accounting and company data is to document and quality assure the database covering all Norwegian enterprises and groups for the years 1992 to 2011, with some exceptions in the first years. The working paper is a translation of Arbeidsnotat 18/...

  15. Building information modelling (BIM): now and beyond

    Directory of Open Access Journals (Sweden)

    Salman Azhar

    2012-12-01

    Building Information Modeling (BIM), also called n-D Modeling or Virtual Prototyping Technology, is a revolutionary development that is quickly reshaping the Architecture-Engineering-Construction (AEC) industry. BIM is both a technology and a process. The technology component of BIM helps project stakeholders to visualize what is to be built in a simulated environment in order to identify any potential design, construction or operational issues. The process component enables close collaboration and encourages integration of the roles of all stakeholders on a project. The paper presents an overview of BIM with a focus on its core concepts, applications in the project life cycle, and benefits for project stakeholders, with the help of case studies. The paper also elaborates on risks and barriers to BIM implementation and on future trends.

  16. Parsimonious modeling with information filtering networks

    Science.gov (United States)

    Barfuss, Wolfram; Massara, Guido Previde; Di Matteo, T.; Aste, Tomaso

    2016-12-01

    We introduce a methodology to construct parsimonious probabilistic models. This method makes use of information filtering networks to produce a robust estimate of the global sparse inverse covariance from a simple sum of local inverse covariances computed on small subparts of the network. Being based on local and low-dimensional inversions, this method is computationally very efficient and statistically robust, even for the estimation of the inverse covariance of high-dimensional, noisy, and short time series. Applied to financial data, our method is computationally more efficient than state-of-the-art methodologies such as Glasso, producing, in a fraction of the computation time, models that can have equivalent or better performance but with a sparser inference structure. We also discuss performance with sparse factor models, where we notice that relative performance decreases with the number of factors. The local nature of this approach allows us to perform computations in parallel and provides a tool for dynamical adaptation by partial updating when the properties of some variables change, without the need to recompute the whole model. This makes the approach particularly suitable for handling big data sets with large numbers of variables. Examples of practical application for forecasting, stress testing, and risk allocation in financial systems are also provided.
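The "sum of local inverse covariances" idea can be sketched for a decomposable (chordal) dependence structure, where the global precision matrix is the sum of zero-padded clique inverses minus the separator inverses. This is a hedged toy illustration (hypothetical function; in the paper, information filtering networks such as the TMFG supply the cliques and separators):

```python
import numpy as np

def local_inverse_covariance(cov, cliques, separators):
    """Assemble a sparse precision matrix from local inversions.

    For a decomposable dependence structure, the global inverse covariance
    equals the sum of the inverses of the covariance restricted to each
    clique, minus the inverses restricted to each separator, each padded
    with zeros. Only small matrices are ever inverted.
    """
    J = np.zeros_like(cov)
    for c in cliques:
        idx = np.ix_(c, c)
        J[idx] += np.linalg.inv(cov[idx])
    for s in separators:
        idx = np.ix_(s, s)
        J[idx] -= np.linalg.inv(cov[idx])
    return J

# Toy example: an AR(1)-style Markov chain x0 - x1 - x2 - x3, whose exact
# precision matrix is tridiagonal. Cliques are the edges, separators the
# shared interior nodes.
rho = 0.6
cov = np.array([[rho ** abs(i - j) for j in range(4)] for i in range(4)])
J = local_inverse_covariance(cov,
                             cliques=[[0, 1], [1, 2], [2, 3]],
                             separators=[[1], [2]])
print(np.allclose(J, np.linalg.inv(cov)))  # exact for this chain structure
```

Each inversion here is 2×2 or 1×1, which is what makes the approach cheap and parallelizable for high-dimensional covariance matrices.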

  17. Multiscale information modelling for heart morphogenesis

    International Nuclear Information System (INIS)

    Abdulla, T; Imms, R; Summers, R; Schleich, J M

    2010-01-01

    Science is made feasible by the adoption of common systems of units. As research has become more data intensive, especially in the biomedical domain, it requires the adoption of a common system of information models, to make explicit the relationship between one set of data and another, regardless of format. This is being realised through the OBO Foundry to develop a suite of reference ontologies, and NCBO Bioportal to provide services to integrate biomedical resources and functionality to visualise and create mappings between ontology terms. Biomedical experts tend to be focused at one level of spatial scale, be it biochemistry, cell biology, or anatomy. Likewise, the ontologies they use tend to be focused at a particular level of scale. There is increasing interest in a multiscale systems approach, which attempts to integrate between different levels of scale to gain understanding of emergent effects. This is a return to physiological medicine with a computational emphasis, exemplified by the worldwide Physiome initiative, and the European Union funded Network of Excellence in the Virtual Physiological Human. However, little work has been done on how information modelling itself may be tailored to a multiscale systems approach. We demonstrate how this can be done for the complex process of heart morphogenesis, which requires multiscale understanding in both time and spatial domains. Such an effort enables the integration of multiscale metrology.

  18. Multiscale information modelling for heart morphogenesis

    Science.gov (United States)

    Abdulla, T.; Imms, R.; Schleich, J. M.; Summers, R.

    2010-07-01

    Science is made feasible by the adoption of common systems of units. As research has become more data intensive, especially in the biomedical domain, it requires the adoption of a common system of information models, to make explicit the relationship between one set of data and another, regardless of format. This is being realised through the OBO Foundry to develop a suite of reference ontologies, and NCBO Bioportal to provide services to integrate biomedical resources and functionality to visualise and create mappings between ontology terms. Biomedical experts tend to be focused at one level of spatial scale, be it biochemistry, cell biology, or anatomy. Likewise, the ontologies they use tend to be focused at a particular level of scale. There is increasing interest in a multiscale systems approach, which attempts to integrate between different levels of scale to gain understanding of emergent effects. This is a return to physiological medicine with a computational emphasis, exemplified by the worldwide Physiome initiative, and the European Union funded Network of Excellence in the Virtual Physiological Human. However, little work has been done on how information modelling itself may be tailored to a multiscale systems approach. We demonstrate how this can be done for the complex process of heart morphogenesis, which requires multiscale understanding in both time and spatial domains. Such an effort enables the integration of multiscale metrology.

  19. Quality assurance in radiodiagnosis

    International Nuclear Information System (INIS)

    Ghilardi Netto, T.; Sao Paulo Univ., Ribeirao Preto

    1983-01-01

    The following topics are dealt with: 1) the importance of the application of a quality assurance program in radiodiagnosis, with its main consequences: improvement of image quality, reduction of the patient exposure rate, and cost reduction; and 2) how to introduce quality assurance control in the radiodiagnostic area. (M.A.) [pt

  20. Quality assurance program

    International Nuclear Information System (INIS)

    Brooks, G.L.

    The concept of levels of quality assurance as applied to CANDU-type nuclear power plant components, i.e. maintaining an appropriate cost/benefit ratio, is introduced. The design process itself has quality assurance features by virtue of multi-level review. (E.C.B.)

  1. High assurance services computing

    CERN Document Server

    2009-01-01

    Covers service-oriented technologies in different domains, including high assurance systems. Assists software engineers from industry and government laboratories who develop mission-critical software, and simultaneously provides academia with a practitioner's outlook on the problems of high-assurance software development.

  2. Laboratory quality assurance

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-01-01

    The elements (principles) of quality assurance can be applied to the operation of the analytical chemistry laboratory to provide an effective tool for indicating the competence of the laboratory and for helping to upgrade competence if necessary. When used, those elements establish the planned and systematic actions necessary to provide adequate confidence in each analytical result reported by the laboratory (the definition of laboratory quality assurance). The elements, as used at the Hanford Engineering Development Laboratory (HEDL), are discussed; they are: qualification of analysts, written methods, sample receiving and storage, quality control, audit, and documentation. To establish a laboratory quality assurance program, a laboratory QA program plan is prepared to specify how the elements are to be implemented in laboratory operation. Benefits that can be obtained from using laboratory quality assurance are given. Experience at HEDL has shown that laboratory quality assurance is not a burden but a useful and valuable tool for the analytical chemistry laboratory.

  3. SYNTHESIS OF INFORMATION MODEL FOR ALTERNATIVE FUNCTIONAL DIAGNOSTICS PROCEDURE

    OpenAIRE

    P. F. Shchapov; R. P. Miguschenko

    2014-01-01

    Probabilistic approaches in information theory and in the information theory of measurement were considered, which allow the expected amount of information to be calculated and analyzed for models of measuring conversions and for the tasks of encoding random measurement signals. A probabilistic model of the transformation of diagnostic information and of the diagnostic procedure was developed. Conditions for obtaining the maximum amount of diagnostic information were found.

  4. Quality assurance programme for isotope diagnostic laboratories

    International Nuclear Information System (INIS)

    Krasznai, Istvan

    1987-01-01

    It is suggested that quality assurance systems be introduced in laboratories, in accordance with the recommendations of the IAEA and WHO, taking local circumstances into consideration. It is emphasized that a quantitative increase in workload must not endanger its quality; diagnostic information must be undistorted, reproducible, and gathered with the minimum radiation burden. National authorities are requested to strengthen their supervision. Recommendations on quality assurance methods are given for medical isotope diagnostic laboratories. (author)

  5. Technology development and quality assurance in industrial-scale solar heating plants. The GroSol information platform; Technikentwicklung und Qualitaetssicherung bei grossen Solarwaermeanlagen. Informationsplattform GroSol

    Energy Technology Data Exchange (ETDEWEB)

    Luchterhand, Jens [Solarpraxis AG, Berlin (Germany)

    2009-07-01

    The project ''Technology development and quality assurance'' is one of a number of BMU-funded measures to develop a market for industrial-scale solar thermal systems. The project is intended to help remove technical obstacles in the planning, installation and operation of industrial-scale solar thermal systems. First results are presented here. The website describing the project is scheduled for autumn 2009 and will be updated regularly until the project ends in September 2011. (orig./AKB)

  6. Modeling the reemergence of information diffusion in social network

    Science.gov (United States)

    Yang, Dingda; Liao, Xiangwen; Shen, Huawei; Cheng, Xueqi; Chen, Guolong

    2018-01-01

    Information diffusion in networks is an important research topic in various fields. Existing studies either focus on modeling the process of information diffusion, e.g., independent cascade model and linear threshold model, or investigate information diffusion in networks with certain structural characteristics such as scale-free networks and small world networks. However, there are still several phenomena that have not been captured by existing information diffusion models. One of the prominent phenomena is the reemergence of information diffusion, i.e., a piece of information reemerges after the completion of its initial diffusion process. In this paper, we propose an optimized information diffusion model by introducing a new informed state into traditional susceptible-infected-removed model. We verify the proposed model via simulations in real-world social networks, and the results indicate that the model can reproduce the reemergence of information during the diffusion process.
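The record describes extending the susceptible-infected-removed model with an informed state so that diffusion can reemerge after its initial wave. As a rough mean-field, discrete-time sketch of that reemergence idea (not the authors' model; the rates beta, gamma and the reactivation rate xi are assumed for illustration), a small flow from removed back to active spreading keeps the process from going extinct:

```python
def simulate_diffusion(n=10000.0, beta=0.3, gamma=0.1, xi=0.002,
                       i0=10.0, steps=300):
    """Mean-field, discrete-time diffusion with a reemergence flow.

    S -> I at rate beta*S*I/N (spreading), I -> R at rate gamma (losing
    interest), plus a small R -> I reactivation flow xi*R that stands in
    for the informed state allowing finished diffusion to flare up again.
    Returns the number of active spreaders at each step.
    """
    s, i, r = n - i0, i0, 0.0
    history = []
    for _ in range(steps):
        new_inf = beta * s * i / n   # fresh spreaders
        new_rec = gamma * i          # spreaders losing interest
        re_inf = xi * r              # reemergence: removed become active again
        s -= new_inf
        i += new_inf + re_inf - new_rec
        r += new_rec - re_inf
        history.append(i)
    return history

history = simulate_diffusion()
# With xi > 0 the active population settles near xi*R/gamma instead of
# decaying to zero, i.e. a persistent low level of "reemerged" diffusion.
print(f"peak: {max(history):.0f}, final: {history[-1]:.0f}")
```

With xi set to 0 the same code reduces to a plain discrete SIR wave that dies out, which is the behavior the authors' informed state is meant to go beyond.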

  7. Design-reliability assurance program application to ACP600

    International Nuclear Information System (INIS)

    Zhichao, Huang; Bo, Zhao

    2012-01-01

    ACP600 is a new nuclear power plant technology developed by CNNC in China, based on Generation III NPP design experience and general safety goals. The ACP600 Design Reliability Assurance Program (D-RAP) is implemented as an integral part of the ACP600 design process. A RAP is a formal management system which assures the collection of important characteristic information about plant performance throughout each phase of its life, and directs the use of this information in analytical and management processes specifically designed to meet two objectives: confirming plant goals and identifying cost-effective improvements. In general, a typical reliability assurance program has four broad functional elements: 1) goals and performance criteria; 2) management systems and implementing procedures; 3) analytical tools and investigative methods; and 4) information management. In this paper we apply the D-RAP technical and risk-informed requirements and establish RAM and PSA models to optimize the ACP600 design. Compared with the previous design process, the D-RAP is better suited to higher design targets and requirements, allowing more creativity through easier implementation of technical breakthroughs. By using D-RAP, plant goals, system goals, performance criteria and safety criteria are easier to realize, and the design can be optimized and made more rational.

  8. CRISP. Information Security Models and Their Economics

    International Nuclear Information System (INIS)

    Gustavsson, R.; Mellstrand, P.; Tornqvist, B.

    2005-03-01

    The deliverable D1.6 includes background material and specifications of a CRISP framework on protection of information assets related to power net management and to the management of business operations related to energy services. During the project the CRISP consortium discovered that the original description of WP 1.6 was not adequate for the project as such. The main insight was that the original emphasis on cost-benefit analysis of security protection measures came too early to address in the project. This issue is of course crucial in itself, but it requires new models of consequence analysis that still remain to be developed, especially for the new business models we are investigating in the CRISP project. The updated and approved version of the WP1.6 description, together with the likewise updated WP2.4 focus on dependable ICT support of power grid operations, constitutes an integrated approach towards dependable and secure future utilities and their business processes. This document (D1.6) is a background to deliverable D2.4. Together they provide a dependability and security framework for the three CRISP experiments in WP3.

  9. The study on the quality assurance of performance assessment for the disposal system

    International Nuclear Information System (INIS)

    Fusaeda, Shigeki; Yanagisawa, Ichiro; Katsurai, Kiyomichi; Ueda, Noriaki; Takeishi, Masayuki; Ida, Toshio; Imamura, Naoko

    1999-02-01

    The purpose of the performance assessment of the geological disposal system in the second progress report is to quantitatively evaluate performance in the near-field. For this purpose, validation of performance models and quality assurance of the data used in the performance assessment are important technical subjects. To address them, the quality of the procedures for analysis work and data acquisition must be assured, in addition to the quality assurance of data, models and analysis codes. Furthermore, assuring the results of the performance assessment by integrating these qualities is an important matter. The following studies have been performed in order to improve the computer environment for controlling quality information relating to the performance assessment, and to develop an integrated quality assurance system that can ensure the reliability of the results of the performance assessment in the second progress report. (1) Study of the quality assurance framework. In order to assure the reliability of MESHNOTE3, we have carried out validation analyses based on experimental and in-situ data, and we have revised the quality assurance manual so that it is applicable to preparing documents. We have carried out validation analysis and planning based on the experimental data acquired from 'Measurement of Apparent Diffusion Coefficient of 99Tc in Compacted Bentonite with Fe Powder', and confirmed the validity of MESHNOTE3. We have added a postscript on the management of analysis documents to the quality assurance manual. (2) Development of the quality assurance computer system. In order to improve the reliability of the analysis results and to use the quality assurance program efficiently, the quality assurance computer system based on the analysis management system CAPASA has been improved as follows: a database for radionuclide transport calculations that can control the geometry of engineered barriers, data relating to glass dissolution and dose rate

  10. Quality management and quality assurance

    International Nuclear Information System (INIS)

    Pieroni, N.

    1991-01-01

    The main common difficulties found in the implementation of effective Quality Management and Quality Assurance Programmes are presented, based on the recommendations of the IAEA International Nuclear Safety Advisory Group, the information collected by IAEA experts participating in its meetings, and the results of IAEA Operational Safety Review Team missions. The difficulties were identified in several areas; the most relevant root causes can be characterized as a lack of understanding of quality principles and difficulty of implementation by the responsible management. The IAEA programme, which attempts to provide advice and support in the implementation of an effective quality programme, is described through a number of activities, including: preparation of practical guidelines, training programmes for management personnel, assistance in building up qualified manpower, and promotion of the quest for excellence through the exchange of experience in the implementation of effective Quality Management and Quality Assurance Programmes in nuclear power plants with good performance records. (Z.S.)

  11. Utilization of building information modeling in infrastructure’s design and construction

    Science.gov (United States)

    Zak, Josef; Macadam, Helen

    2017-09-01

    Building Information Modeling (BIM) is a concept that has gained its place in the design, construction and maintenance of buildings in the Czech Republic during recent years. This paper describes the usage, applications and potential benefits and disadvantages connected with the implementation of BIM principles in the preparation and construction of infrastructure projects. Part of the paper describes the status of BIM implementation in the Czech Republic, and there is a review of several virtual design and construction practices in the Czech Republic. Examples of best practice are presented from current infrastructure projects. The paper further summarizes experience with new technologies gained from the application of BIM-related workflows. The focus is on BIM model utilization for machine control systems on site, quality assurance, quality management and construction management.

  12. Quality assurance for online nursing courses.

    Science.gov (United States)

    Little, Barbara Battin

    2009-07-01

    Nurse educators and students have been expressing concern about the quality of online education for more than a decade. Models, standards, benchmarks, and peer review processes now offer tools for assuring the quality of online education and provide documentation for evaluation and accreditation processes. Standards provide the basis for initial course design, thus decreasing the need for revisions to correct weaknesses. This article reviews the literature on standards and quality assurance processes for online courses. Recommendations for the use of standards, peer review, and quality assurance of online courses are discussed.

  13. Quality Assurance in Radiotherapy

    Science.gov (United States)

    Mckenzie, Alan

    A common feature of the Radiotherapy Centres where there have been major accidents involving incorrect radiotherapy treatment is that they did not operate good Quality Assurance systems. A Quality Assurance system is sometimes called a Quality Management system, and it is designed to give assurance that quality standards are being met. One of the "spin offs" from operating a Quality Management system is that it reduces the likelihood of a radiotherapy accident. A detailed account of how to set up a quality system in radiotherapy has been given in an ESTRO booklet [2].

  14. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling that were identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  15. Information Model and Its Element for Displaying Information on Technical Condition of Objects of Integrated Information System

    OpenAIRE

    Kovalenko, Anna; Smirnov, Alexey; Kovalenko, Alexander; Dorensky, Alexander; Коваленко, А. С.; Смірнов, О. А.; Коваленко, О. В.; Доренський, О. П.

    2016-01-01

    The suggested information elements for the system displaying information on the technical condition of the integrated information system meet the essential requirements of information presentation: they correlate with the real object simply and very accurately. The suggested model for displaying information on the technical condition of the objects of the integrated information system improves the efficiency of the technical diagnostics operator in evaluating the information about the...

  16. Contexts for concepts: Information modeling for semantic interoperability

    NARCIS (Netherlands)

    Oude Luttighuis, P.H.W.M.; Stap, R.E.; Quartel, D.

    2011-01-01

    Conceptual information modeling is a well-established practice, aimed at preparing the implementation of information systems, the specification of electronic message formats, and the design of information processes. Today's ever more connected world however poses new challenges for conceptual

  17. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    Science.gov (United States)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.

  18. Evaluation of financial assurance alternatives of licensees

    International Nuclear Information System (INIS)

    Douglas, J.N.

    1995-09-01

    The Uranium and Thorium Mining Regulations of the Atomic Energy Control Act require that applicants/licensees indicate to the AECB what financial assurance plans they have made to fund the decommissioning plan they propose to put in place. We have determined through our own business knowledge from other projects, as well as information provided by contacts in the banking, accounting, legal, investment and insurance communities, what financial assurance plans might be available. We have tabulated these alternatives, included explanations of how each might be implemented, and recorded advantages and disadvantages of each alternative to both the AECB and the applicant/licensee. In addition we have ranked the alternatives in order of most suitable to least suitable, from the AECB's perspective. Although these financial assurance mechanisms have been tabulated with a view to decommissioning of a uranium mine, they could be used in other licence or business arrangements that require financial assurance. (author). 3 tabs., 1 fig

  19. Quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Groth, S.; Meghzifene, A.; Tatsuzaki, H.; Levin, V.; Izewska, J.

    2001-01-01

    Quality assurance in the management of a patient receiving radiation therapy, and the role of the radiation oncologist and medical physicist in this process, is described. The constraints on available personnel are recognised, and the need for further education resources and IAEA activities in education for both groups is described. IAEA activities in the clinical and dosimetric aspects, and the resultant publications and education, have contributed to a culture of quality assurance. (author)

  20. RAVEN Quality Assurance Activities

    Energy Technology Data Exchange (ETDEWEB)

    Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  1. The Nature of Information Science: Changing Models

    Science.gov (United States)

    Robinson, Lyn; Karamuftuoglu, Murat

    2010-01-01

    Introduction: This paper considers the nature of information science as a discipline and profession. Method: It is based on conceptual analysis of the information science literature, and consideration of philosophical perspectives, particularly those of Kuhn and Peirce. Results: It is argued that information science may be understood as a field of…

  2. National Space Science Data Center Information Model

    Science.gov (United States)

    Bell, E. V.; McCaslin, P.; Grayzeck, E.; McLaughlin, S. A.; Kodis, J. M.; Morgan, T. H.; Williams, D. R.; Russell, J. L.

    2013-12-01

    The National Space Science Data Center (NSSDC) was established by NASA in 1964 to provide for the preservation and dissemination of scientific data from NASA missions. It has evolved to support distributed, active archives that were established in the Planetary, Astrophysics, and Heliophysics disciplines through a series of Memoranda of Understanding. The disciplines took over responsibility for working with new projects to acquire and distribute data for community researchers while the NSSDC remained vital as a deep archive. Since 2000, NSSDC has been using the Archive Information Package to preserve data over the long term. As part of its effort to streamline the ingest of data into the deep archive, the NSSDC developed and implemented a data model of desired and required metadata in XML. This process, in use for roughly five years now, has been successfully used to support the identification and ingest of data into the NSSDC archive, most notably those data from the Planetary Data System (PDS) submitted under PDS3. A series of software packages (X-ware) were developed to handle the submission of data from the PDS nodes utilizing a volume structure. An XML submission manifest is generated at the PDS provider site prior to delivery to NSSDC. The manifest ensures the fidelity of PDS data delivered to NSSDC. Preservation metadata is captured in an XML object when NSSDC archives the data. With the recent adoption by the PDS of the XML-based PDS4 data model, there is an opportunity for the NSSDC to provide additional services to the PDS such as the preservation, tracking, and restoration of individual products (e.g., a specific data file or document), which was unfeasible in the previous PDS3 system. The NSSDC is modifying and further streamlining its data ingest process to take advantage of the PDS4 model, an important consideration given the ever-increasing amount of data being generated and archived by orbiting missions at the Moon and Mars, other active projects
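
    The abstract says the XML submission manifest "ensures the fidelity" of data delivered to NSSDC. One common way such a check is implemented, assumed here purely for illustration (the NSSDC's actual manifest format is not given), is per-file checksums verified on receipt:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large archive volumes fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_manifest(manifest: dict[str, str], root: Path) -> list[str]:
    """Return the names of delivered files whose digest does not match the
    manifest entry; an empty list means the delivery arrived intact."""
    return [name for name, digest in manifest.items()
            if sha256_of(root / name) != digest]
```

    The manifest itself (here a plain dict) would in practice be parsed from the provider's XML before the comparison.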

  3. Information sharing systems and teamwork between sub-teams: a mathematical modeling perspective

    Science.gov (United States)

    Tohidi, Hamid; Namdari, Alireza; Keyser, Thomas K.; Drzymalski, Julie

    2017-12-01

    Teamwork contributes to considerable improvements in both the quality and quantity of the final outcome, and collaboration between team members brings substantial progress to any business. Assembling an appropriate team, however, requires weighing many factors. Team size affects a team's effectiveness, so determining the ideal team size is of paramount importance. In addition, information technology plays an increasingly differentiating role in productivity, and adopting appropriate information sharing systems can improve efficiency, especially in competitive markets with numerous competing producers. Transmitting information to individuals is essential to assure an improvement in team performance. In this paper, a model of teamwork and its organizational structure are presented. Furthermore, a mathematical model is proposed to characterize a group of sub-teams according to two criteria: team size and information technology. The effect of information technology on the performance of the team and sub-teams, as well as the optimum size of the team and sub-teams from a productivity perspective, are studied. Moreover, a quantitative sensitivity analysis is presented to analyze the interaction between these two factors through a sharing system.

  4. Quality assurance and organizational effectiveness in hospitals.

    OpenAIRE

    Hetherington, R W

    1982-01-01

    The purpose of this paper is to explore some aspects of a general theoretical model within which research on the organizational impacts of quality assurance programs in hospitals may be examined. Quality assurance is conceptualized as an organizational control mechanism, operating primarily through increased formalization of structures and specification of procedures. Organizational effectiveness is discussed from the perspective of the problem-solving theory of organizations, wherein effecti...

  5. Online Cancer Information Seeking: Applying and Extending the Comprehensive Model of Information Seeking.

    Science.gov (United States)

    Van Stee, Stephanie K; Yang, Qinghua

    2017-10-30

    This study applied the comprehensive model of information seeking (CMIS) to online cancer information and extended the model by incorporating an exogenous variable: interest in online health information exchange with health providers. A nationally representative sample from the Health Information National Trends Survey 4 Cycle 4 was analyzed to examine the extended CMIS in predicting online cancer information seeking. Findings from a structural equation model supported most of the hypotheses derived from the CMIS, as well as the extension of the model related to interest in online health information exchange. In particular, socioeconomic status, beliefs, and interest in online health information exchange predicted utility. Utility, in turn, predicted online cancer information seeking, as did information-carrier characteristics. An unexpected but important finding from the study was the significant, direct relationship between cancer worry and online cancer information seeking. Theoretical and practical implications are discussed.

  6. Quality Assurance in Higher Education in Zimbabwe

    Science.gov (United States)

    Garwe, Evelyn Chiyevo

    2014-01-01

    The purpose of this paper is to furnish local and global stakeholders with detailed information regarding the development and current status of quality assurance in the Zimbabwean higher education sector. The study used document analysis, observation and interviews with key informants as sources of data. This paper addresses the dearth of…

  7. The Effects of a Computer-Assisted Teaching Material, Designed According to the ASSURE Instructional Design and the ARCS Model of Motivation, on Students' Achievement Levels in a Mathematics Lesson and Their Resulting Attitudes

    Science.gov (United States)

    Karakis, Hilal; Karamete, Aysen; Okçu, Aydin

    2016-01-01

    This study examined the effects that computer-assisted instruction had on students' attitudes toward a mathematics lesson and toward learning mathematics with computer-assisted instruction. The computer software we used was based on the ASSURE Instructional Systems Design and the ARCS Model of Motivation, and the software was designed to teach…

  8. Organization and Implementation of Online Cytology Quality Assurance Program – Georgian Experience

    Directory of Open Access Journals (Sweden)

    Kldiashvili Ekaterina

    2017-07-01

    Full Text Available Medical information systems (MIS) are at the heart of information technology (IT) implementation policies in healthcare systems around the world, and different architectures and application models of MIS have been developed. Despite obvious advantages and benefits, the application of MIS in everyday practice is slow. Based on an analysis of the existing models of MIS in Georgia, a multi-user web-based approach has been created. This article presents the architecture of the system and its application to cytology quality assurance programs. Five hundred Georgian-language electronic medical records from the cervical screening activity, illustrated by images, were selected for the quality assurance program. The primary goal of the MIS is patient management; however, the system can also be used for quality assurance programs. The ideal of healthcare in the information age must be to create a situation where healthcare professionals spend more time creating knowledge from medical information and less time managing medical information. The application of easily available and adaptable technology, together with improved infrastructure, is the basis for eHealth applications. The MIS is a promising and practical technology solution and can be used for cytology quality assurance programs.

  9. Thermodynamics of information processing based on enzyme kinetics: An exactly solvable model of an information pump

    Science.gov (United States)

    Cao, Yuansheng; Gong, Zongping; Quan, H. T.

    2015-06-01

    Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012), 10.1073/pnas.1204263109] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013), 10.1103/PhysRevLett.111.030602], we propose a minimal model of an information pump and an information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information encoded in the bit stream, or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.

  10. Creation and usage of component model in projecting information systems

    OpenAIRE

    Urbonas, Paulius

    2004-01-01

    The purpose of this project was to create an information system using a component model. When new information systems are built, the same models are often rebuilt from scratch; by realizing a system with a component model, old components can be reused when creating new systems. To demonstrate the advantages of the component model, an information system was created for the company “Vilseda”. For the created components to be reusable in the future, they were designed according to their types (graphical user interface, data and function reques...

  11. A Dynamic Model of Information and Entropy

    Directory of Open Access Journals (Sweden)

    Stuart D. Walker

    2010-01-01

    Full Text Available We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electro-magnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system.

  12. Model of intelligent information searching system

    International Nuclear Information System (INIS)

    Yastrebkov, D.I.

    2004-01-01

    A brief description of the technique to search for electronic documents in large archives as well as drawbacks is presented. A solution close to intelligent information searching systems is proposed. (author)

  13. Empirical modeling of information communication technology usage ...

    African Journals Online (AJOL)

    Hennie

    2015-11-01

    behavioural intention and teachers' information ... Tackling ICT usage problems in emerging economies like Nigeria and South Africa requires in-depth research. ... ment was also defined as the perception of inherent ...

  14. A Linguistically Motivated Probabilistic Model of Information Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    1998-01-01

    This paper presents a new probabilistic model of information retrieval. The most important modeling assumption made is that documents and queries are defined by an ordered sequence of single terms. This assumption is not made in well known existing models of information retrieval, but is essential
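
    The snippet above does not reproduce the model's formula. Probabilistic term-sequence models of this family are, however, commonly realized as linearly interpolated unigram language models, and the sketch below assumes that form; the parameter names and the smoothing weight are ours, not necessarily the paper's.

```python
import math
from collections import Counter

def lm_score(query: list[str], doc: list[str],
             collection: list[list[str]], lam: float = 0.5) -> float:
    """Log-probability of an ordered query term sequence under a linearly
    interpolated unigram model: P(t|d) = lam*tf(t,d)/|d| + (1-lam)*cf(t)/|C|.
    This is an assumed smoothing scheme, shown for illustration only."""
    tf = Counter(doc)                                # term counts in this document
    cf = Counter(t for d in collection for t in d)   # term counts in the collection
    csize = sum(cf.values())
    score = 0.0
    for t in query:
        p = lam * tf[t] / len(doc) + (1 - lam) * cf[t] / csize
        if p == 0.0:
            return float("-inf")                     # term unseen anywhere
        score += math.log(p)
    return score
```

    Ranking then amounts to sorting documents by `lm_score` for a fixed query.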

  15. Information, complexity and efficiency: The automobile model

    Energy Technology Data Exchange (ETDEWEB)

    Allenby, B. [Lucent Technologies (United States)]|[Lawrence Livermore National Lab., CA (United States)

    1996-08-08

    The new, rapidly evolving field of industrial ecology - the objective, multidisciplinary study of industrial and economic systems and their linkages with fundamental natural systems - provides strong ground for believing that a more environmentally and economically efficient economy will be more information intensive and complex. Information and intellectual capital will be substituted for the more traditional inputs of materials and energy in producing a desirable, yet sustainable, quality of life. While at this point this remains a strong hypothesis, the evolution of the automobile industry can be used to illustrate how such substitution may, in fact, already be occurring in an environmentally and economically critical sector.

  16. Modeling behavioral considerations related to information security.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Moyano, I. J.; Conrad, S. H.; Andersen, D. F. (Decision and Information Sciences); (SNL); (Univ. at Albany)

    2011-01-01

    The authors present experimental and simulation results of an outcome-based learning model for the identification of threats to security systems. This model integrates judgment, decision-making, and learning theories to provide a unified framework for the behavioral study of upcoming threats.

  17. Modeling Uncertain Context Information via Event Probabilities

    NARCIS (Netherlands)

    van Bunningen, A.H.; Feng, L.; Apers, Peter M.G.; de Keijzer, Ander; de Keijzer, A.; van Keulen, M.; van Keulen, Maurice

    2006-01-01

    To be able to support context-awareness in an Ambient Intelligent (AmI) environment, one needs a way to model context. Previous research shows that a good way to model context is using Description Logics (DL). Since context data is often coming from sensors and therefore exhibits uncertain

  18. Quality assurance of sterilized products: verification of a model relating frequency of contaminated items and increasing radiation dose

    International Nuclear Information System (INIS)

    Khan, A.A.; Tallentire, A.; Dwyer, J.

    1977-01-01

    Values of the γ-radiation resistance parameters (k and n of the 'multi-hit' expression) for Bacillus pumilus E601 spores and Serratia marcescens cells have been determined and the constancy of values for a given test condition demonstrated. These organisms, differing by a factor of about 50 in k value, have been included in a laboratory test system for use in verification of a model describing the dependence of the proportion of contaminated items in a population of items on radiation dose. The proportions of contaminated units of the test system at various γ-radiation doses have been measured for different initial numbers and types of organisms present in units either singly or together. Using the model, the probabilities of contaminated units for corresponding sets of conditions have been evaluated together with associated variances. Measured proportions and predicted probabilities agree well, showing that the model holds in a laboratory contrived situation. (author)
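
    The abstract names the 'multi-hit' expression with parameters k and n but does not reproduce it. One standard form, assumed here purely for illustration (not necessarily the authors' exact fit), combines multi-hit survival with Poisson-distributed survivors per item:

```python
import math

def survival_fraction(dose: float, k: float, n: float) -> float:
    """Multi-hit survival: fraction of organisms surviving dose D.
    S(D) = 1 - (1 - exp(-k*D))**n  -- assumed standard form."""
    return 1.0 - (1.0 - math.exp(-k * dose)) ** n

def p_contaminated(dose: float, n0: float, k: float, n: float) -> float:
    """Probability that an item still carries at least one viable organism,
    assuming survivors per item are Poisson with mean n0 * S(D)."""
    m = n0 * survival_fraction(dose, k, n)
    return 1.0 - math.exp(-m)
```

    Plotting `p_contaminated` against dose for the two organisms' k values would reproduce the kind of dose-response comparison the study verified.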

  19. Models of organisation and information system design | Mohamed ...

    African Journals Online (AJOL)

    We devote this paper to the models of organisation, and see which is best suited to provide a basis for information processing and transmission. In this respect we shall be dealing with four models of organisation, namely: the classical mode, the behavioural model, the systems model and the cybernetic model of ...

  20. Geographical information modelling for land resource survey

    NARCIS (Netherlands)

    Bruin, de S.

    2000-01-01

    The increasing popularity of geographical information systems (GIS) has at least three major implications for land resources survey. Firstly, GIS allows alternative and richer representation of spatial phenomena than is possible with the traditional paper map. Secondly, digital technology has

  1. Model driven geo-information systems development

    NARCIS (Netherlands)

    Morales Guarin, J.M.; Ferreira Pires, Luis; van Sinderen, Marten J.; Williams, A.D.

    Continuous change of user requirements has become a constant for geo-information systems. Designing systems that can adapt to such changes requires an appropriate design methodology that supports abstraction, modularity and other mechanisms to capture the essence of the system and help controlling

  2. A new model of information behaviour based on the Search Situation Transition schema (keywords: information searching, information behaviour, information retrieval, information seeking)

    Directory of Open Access Journals (Sweden)

    Nils Pharo

    2004-01-01

    Full Text Available This paper presents a conceptual model of information behaviour. The model is part of the Search Situation Transition method schema. The method schema is developed to discover and analyse interplay between phenomena traditionally analysed as factors influencing either information retrieval or information seeking. In this paper the focus is on the model's five main categories: the work task, the searcher, the social/organisational environment, the search task, and the search process. In particular, the search process and its sub-categories search situation and transition and the relationship between these are discussed. To justify the method schema an empirical study was designed according to the schema's specifications. In the paper a subset of the study is presented analysing the effects of work tasks on Web information searching. Findings from this small-scale study indicate a strong relationship between the work task goal and the level of relevance used for judging resources during search processes.

  3. Research on network information security model and system construction

    OpenAIRE

    Wang Haijun

    2016-01-01

    This paper briefly describes the impact of the big data era on China's network policy, which brings both opportunities and challenges to network information security. It reviews the internationally accepted basic models and characteristics of network information security, and analyses those characteristics and their relationships. On the basis of the NIST security model, the paper describes three security control schemes in the safety management model and the...

  4. Heterogeneous information network model for equipment-standard system

    Science.gov (United States)

    Yin, Liang; Shi, Li-Chen; Zhao, Jun-Yan; Du, Song-Yang; Xie, Wen-Bo; Yuan, Fei; Chen, Duan-Bing

    2018-01-01

    Entity information networks describe structural relationships between entities. Taking advantage of their extensibility and heterogeneity, entity information networks are increasingly applied to relationship modeling. In recent years, many studies of entity information network modeling have been proposed, but few of them concentrate on equipment-standard systems, which have multi-layer, multi-dimension and multi-scale properties. In order to deal efficiently with complex issues in equipment-standard systems, such as standard revising, standard controlling, and production designing, a heterogeneous information network model for equipment-standard systems is proposed in this paper. Three types of entities and six types of relationships are considered in the proposed model, and correspondingly, several different similarity-measuring methods are used in the modeling process. The experiments show that the heterogeneous information network model established in this paper reflects relationships between entities accurately, and the modeling process performs well in terms of time consumption.
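
    A heterogeneous information network of the kind described (typed entities, typed relationships, similarity measures over them) can be sketched with a minimal typed adjacency structure. The entity and relation names below ("equipment", "cites", "standard") are invented for illustration and are not the paper's actual schema, nor is the Jaccard-style similarity necessarily one of its six measures.

```python
from collections import defaultdict

class HeteroGraph:
    """Minimal heterogeneous information network: typed nodes, typed edges."""
    def __init__(self):
        self.node_type = {}
        self.adj = defaultdict(set)        # (src, relation) -> set of dst nodes

    def add_node(self, node: str, ntype: str):
        self.node_type[node] = ntype

    def add_edge(self, src: str, relation: str, dst: str):
        self.adj[(src, relation)].add(dst)

    def metapath_neighbours(self, start: str, relations: list[str]) -> set[str]:
        """Nodes reachable from start by following the given relation sequence."""
        frontier = {start}
        for rel in relations:
            frontier = {d for s in frontier for d in self.adj[(s, rel)]}
        return frontier

def shared_neighbour_similarity(g: HeteroGraph, a: str, b: str,
                                relations: list[str]) -> float:
    """Jaccard overlap of meta-path neighbourhoods (an illustrative measure)."""
    na, nb = g.metapath_neighbours(a, relations), g.metapath_neighbours(b, relations)
    return len(na & nb) / len(na | nb) if na | nb else 0.0
```

    Two pieces of equipment citing overlapping sets of standards would score high under this measure.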

  5. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining the assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the base for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.
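
    The derivation step described above, embedding mandatory conformance rules while refining the domain-level model into a platform-independent one, can be caricatured as a rule-based transformation over a model structure. The rules and field names below are invented placeholders for illustration; they are not actual ISO/IEC 27001 clauses.

```python
# Toy sketch of a CIM -> PIM derivation step in a model-driven pipeline.
# The "mandatory rules" here (every asset needs an owner and a risk entry)
# are invented examples, not real ISO/IEC 27001 requirements.

def derive_pim(cim: dict) -> dict:
    """Refine a computation-independent model by embedding mandatory rules."""
    pim = dict(cim)
    for asset in pim.get("assets", []):
        asset.setdefault("owner", "UNASSIGNED")
        asset.setdefault("risk_assessed", False)
    return pim

def conformance_gaps(pim: dict) -> list[str]:
    """List the rule violations remaining in the platform-independent model."""
    gaps = []
    for asset in pim.get("assets", []):
        if asset["owner"] == "UNASSIGNED":
            gaps.append(f"{asset['name']}: no owner")
        if not asset["risk_assessed"]:
            gaps.append(f"{asset['name']}: no risk assessment")
    return gaps
```

    A platform-specific model would then map each remaining gap onto concrete controls for the target platform.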

  6. Global Information Enterprise (GIE) Modeling and Simulation (GIESIM)

    National Research Council Canada - National Science Library

    Bell, Paul

    2005-01-01

    ... AND S) toolkits into the Global Information Enterprise (GIE) Modeling and Simulation (GIESim) framework to create effective user analysis of candidate communications architectures and technologies...

  7. A process Approach to Information Services: Information Search Process (ISP Model

    Directory of Open Access Journals (Sweden)

    Hamid Keshavarz

    2010-12-01

    Full Text Available Information seeking is a behavior emerging out of the interaction between the information seeker and the information system, and should be regarded as an episodic process so as to meet users' information needs and to take different roles at its different stages. The present article introduces a process approach to information services in libraries using the Carol Collier Kuhlthau model. In this model, information seeking is regarded as a process consisting of six stages, in each of which users have different thoughts, feelings and actions, and librarians correspondingly take different roles at each stage. These six stages are derived from constructivist learning theory based on the uncertainty principle. Despite some shortcomings, this model may be regarded as a new solution for rendering modern information services in libraries, especially in relation to new information environments and media.

  8. Quality assurance for health and environmental chemistry: 1989

    International Nuclear Information System (INIS)

    Gautier, M.A.; Gladney, E.S.; Koski, N.L.; Jones, E.A.; Phillips, M.B.; O'Malley, B.T.

    1990-12-01

    This report documents the continuing quality assurance efforts of the Health and Environmental Chemistry Group (HSE-9) at the Los Alamos National Laboratory. The philosophy, methodology, computing resources, and laboratory information management system used by the quality assurance program to encompass the diversity of analytical chemistry practiced in the group are described. Included in the report are all quality assurance reference materials used, along with their certified or consensus concentrations, and all analytical chemistry quality assurance measurements made by HSE-9 during 1989. 38 refs., 8 figs., 3 tabs

  9. Quality assurance and evidence in career guidance in Europe

    DEFF Research Database (Denmark)

    Plant, Peter

    2011-01-01

    Quality assurance and evidence in career guidance in Europe is based on a particular, positivistic model. Other approaches are largely neglected.

  10. Providing Continuous Assurance

    NARCIS (Netherlands)

    Kocken, Jonne; Hulstijn, Joris

    2017-01-01

    It has been claimed that continuous assurance can be attained by combining continuous monitoring by management, with continuous auditing of data streams and the effectiveness of internal controls by an external auditor. However, we find that in existing literature the final step to continuous

  11. Quality Assurance for All

    Science.gov (United States)

    Cheung, Peter P. T.; Tsui, Cecilia B. S.

    2010-01-01

    For higher education reform, most decision-makers aspire to achieving a higher participation rate and a respectable degree of excellence with diversity at the same time. But very few know exactly how. External quality assurance is a fair basis for differentiation but there can be doubt and resistance in some quarters. Stakeholder interests differ…

  12. Mission Operations Assurance

    Science.gov (United States)

    Faris, Grant

    2012-01-01

    Integrate the mission operations assurance function into the flight team providing: (1) value added support in identifying, mitigating, and communicating the project's risks and, (2) being an essential member of the team during the test activities, training exercises and critical flight operations.

  13. Quality assurance. 6. ed.

    International Nuclear Information System (INIS)

    Masing, W.

    1979-01-01

    Brief introduction to the quality sector. After some explanations of the terms of quality, feature, and defect, the article discusses the planning of quality and testing, industrial metrology, the test risk, quality assurance, quality enhancement, quality cost, and organisational problems. (RW) [de

  14. Medicine in Ancient Assur

    DEFF Research Database (Denmark)

    Arbøll, Troels Pank

    This dissertation is a microhistorical study of a single individual named Kiṣir-Aššur who practiced medicine in the ancient city of Assur (modern northern Iraq) in the 7th century BCE. The study provides the first detailed analysis of one healer’s education and practice in ancient Mesopotamia...

  15. Quality Assurance Program Description

    Energy Technology Data Exchange (ETDEWEB)

    Halford, Vaughn Edward [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ryder, Ann Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Effective May 1, 2017, led by a new executive leadership team, Sandia began operating within a new organizational structure. National Technology and Engineering Solutions of Sandia (Sandia’s) Quality Assurance Program (QAP) was established to assign responsibilities and authorities, define workflow policies and requirements, and provide for the performance and assessment of work.

  16. Building Information Modeling in engineering teaching

    DEFF Research Database (Denmark)

    Andersson, Niclas; Andersson, Pernille Hammar

    2010-01-01

    The application of Information and Communication Technology (ICT) in construction supports business as well as project processes by providing integrated systems for communication, administration, quantity takeoff, time scheduling, cost estimating, and progress control, among other things. The rapid ... In engineering education there is an obvious aim to provide students with sufficient disciplinary knowledge in science and engineering principles. The implementation of ICT in engineering education requires, however, that valuable time and teaching efforts are spent on adequate software training needed...

  17. Display of the information model accounting system

    OpenAIRE

    Matija Varga

    2011-01-01

    This paper presents the accounting information system in public companies, business technology matrix and data flow diagram. The paper describes the purpose and goals of the accounting process, matrix sub-process and data class. Data flow in the accounting process and the so-called general ledger module are described in detail. Activities of the financial statements and determining the financial statements of the companies are mentioned as well. It is stated how the general ledger...

  18. On Modelling Hybrid Uncertainty in Information

    Science.gov (United States)

    2007-02-01

    room increases the likelihood of death (vulnerability) while increasing the number of rooms with one window off the passage decreases the likelihood of ... death. In a similar manner, the number of components in physical systems determines the likelihood of system failure regardless of the type of ... information, in: Proc. 1996 Interdisciplinary Conference on Intelligent Systems: A Semiotic Perspective (NIST), pp. 133-140. 48. Joslyn, C. (1993) Some ...

  19. Chapter 8: Quality assurance

    International Nuclear Information System (INIS)

    2001-01-01

    The main efforts of Nuclear Regulatory Authority of the Slovak Republic (UJD) have been focused on inspection of quality assurance programmes of Slovak Power Stations, plc. and its daughter companies at Bohunice and Mochovce. Two quality assurance inspections in the area of periodical in service inspections (V-2 units) and tests of selected equipment (NPP V-2 units) and operation control (V-1 units) has been performed at NPPs Bohunice. One violation of decree on quality assurance of selected equipment has been found in the area of documentation archiving. The inspection concerning the implementation of quality assurance programme for operation of NPP Mochovce in the area of operation control has been performed focused on safety aspects of operation, operational procedures, control of operational events and feedback from operational experience. The results of this inspection were positive. Inspection of implementation of quality assurance programme for operation of radioactive waste repository (RU RAW) at the Mochovce location has been performed focused on receiving of containers, with radioactive wastes, containers handling, radiation monitoring, activities of documentation control and radiation protection at the repository site. No serious deficiencies have been found out. Also one inspection of experimental nuclear installations of VUJE Trnava at Jaslovske Bohunice site has been performed focused on procurement control, quality audits, documentation and quality records control when performing activities at experimental nuclear installations. The activity on development of internal quality assurance system continued. The implementation of this system will assure quality and effective fulfilment enlarged tasks of UJD with limited resources for its activity. The analyses of possible use of existing internal administrative control documentation as a basis for future quality system procedures was performed in co-operation with an external specialised organisation. 

  20. Software quality assurance plan for PORFLOW-3D

    International Nuclear Information System (INIS)

    Maheras, S.J.

    1993-03-01

    This plan describes the steps taken by the Idaho National Engineering Laboratory Subsurface and Environmental Modeling Unit personnel to implement software quality assurance procedures for the PORFLOW-3D computer code. PORFLOW-3D was used to conduct radiological performance assessments at the Savannah River Site. Software quality assurance procedures for PORFLOW-3D include software acquisition, installation, testing, operation, maintenance, and retirement. Configuration control and quality assurance procedures are also included or referenced in this plan.

  1. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Science.gov (United States)

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  2. Development and reproducibility evaluation of a Monte Carlo-based standard LINAC model for quality assurance of multi-institutional clinical trials.

    Science.gov (United States)

    Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki

    2014-11-01

    Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data are compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
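An agreement check like the PDD comparison quoted above can be expressed as a maximum local percentage difference between calculated and reference depth-dose curves. The sketch below is illustrative only: the function name and the sample dose values are invented, not taken from the paper or from the Golden Beam Data.

```python
# Sketch: quantify agreement between a calculated percentage-depth-dose (PDD)
# curve and reference beam data as the maximum local percentage difference.
# All names and numbers are illustrative, not taken from the paper.

def max_local_difference(calculated, reference):
    """Largest local percentage difference between two dose curves.

    Both inputs are dose values sampled at the same depths; each local
    difference is normalised to the reference value at that depth.
    """
    return max(
        abs(c - r) / r * 100.0
        for c, r in zip(calculated, reference)
    )

# Invented 6 MV PDD samples (percent dose at matched depths).
reference_pdd = [100.0, 86.5, 66.2, 50.8, 38.9]
calculated_pdd = [100.0, 86.1, 66.6, 50.5, 39.1]

diff = max_local_difference(calculated_pdd, reference_pdd)
print(f"max local difference: {diff:.2f}%")  # stays under a 1.0% tolerance
```

In practice such point-by-point checks are complemented by gamma analysis, which also accounts for distance-to-agreement, but the local-difference form matches the tolerances quoted in the abstract.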

  3. BYU Food Quality Assurance Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Quality Assurance Lab is located in the Eyring Science Center in the department of Nutrition, Dietetics, and Food Science. The Quality Assurance Lab has about 10...

  4. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Science.gov (United States)

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  5. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  6. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    Full Text Available This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  7. Model-driven design of geo-information services

    NARCIS (Netherlands)

    Morales Guarin, J.M.; Morales Guarin, Javier Marcelino

    2004-01-01

    This thesis presents a method for the development of distributed geo-information systems. The method is organised around the design principles of modularity, reuse and replaceability. The method enables the modelling of both behavioural and informational aspects of geo-information systems in an

  8. An Information-Processing Model of Crisis Management.

    Science.gov (United States)

    Egelhoff, William G.; Sen, Falguni

    1992-01-01

    Develops a contingency model for managing a variety of corporate crises. Views crisis management as an information-processing situation and organizations that must cope with crisis as information-processing systems. Attempts to fit appropriate information-processing mechanisms to different categories of crises. (PRA)

  9. Creating Quality Assurance and International Transparency for Quality Assurance Agencies

    DEFF Research Database (Denmark)

    Kristoffersen, Dorte; Lindeberg, Tobias

    2004-01-01

    The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the need..., on the one hand, to advance internationalisation of quality assurance of higher education and, on the other hand, to allow for the differences in national approaches to quality assurance. The paper will focus on two issues: first, the strengths and weaknesses of the method employed and of the use of the ENQA...

  10. Marketing information systems in units of business information: a proposed model

    OpenAIRE

    Ana Maria Pereira; Carla Campos Pereira*

    2016-01-01

    Introduction: It proposes a theoretical model of a marketing information system which provides decision-makers in business organizations with information having quality attributes such as accuracy, economy, flexibility, reliability, relevance, simplicity and verifiability, based on a systemic vision and marketing theories. Objective: Present a model of a marketing information system for business units, identifying the requirements, skills and abilities that the market demands of the libraria...

  11. Display of the information model accounting system

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2011-12-01

    Full Text Available This paper presents the accounting information system in public companies, business technology matrix and data flow diagram. The paper describes the purpose and goals of the accounting process, matrix sub-process and data class. Data flow in the accounting process and the so-called general ledger module are described in detail. Activities of the financial statements and determining the financial statements of the companies are mentioned as well. It is stated how the general ledger module should function and what characteristics it must have. Line graphs will depict indicators of the company’s business success, indebtedness and company’s efficiency coefficients based on financial balance reports, and profit and loss report.

  12. Information metric on instanton moduli spaces in nonlinear σ models

    International Nuclear Information System (INIS)

    Yahikozawa, Shigeaki

    2004-01-01

    We study the information metric on instanton moduli spaces in two-dimensional nonlinear σ models. In the CP^1 model, the information metric on the moduli space of one instanton with the topological charge Q=k (k≥1) is a three-dimensional hyperbolic metric, which corresponds to the Euclidean anti-de Sitter space-time metric in three dimensions, and the overall scale factor of the information metric is 4k^2/3; this means that the sectional curvature is -3/4k^2. We also calculate the information metric in the CP^2 model.
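The hyperbolic form described above can be written out explicitly. The following is a hedged sketch, reconstructed from the quoted scale factor and curvature rather than copied from the paper, in upper-half-space coordinates where (x_1, x_2) is the instanton position and λ > 0 its size:

```latex
% Sketch of the one-instanton information metric in the CP^1 model,
% reconstructed from the scale factor 4k^2/3 and the sectional curvature
% -3/(4k^2) quoted in the abstract.
ds^2 = \frac{4k^2}{3}\,
       \frac{d\lambda^2 + dx_1^2 + dx_2^2}{\lambda^2},
\qquad
K = -\frac{3}{4k^2}.
```

A metric of the form c(dλ² + |dx|²)/λ² is the standard upper-half-space model of hyperbolic 3-space, with constant sectional curvature −1/c; with c = 4k²/3 this reproduces both quoted values.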

  13. Towards dynamic reference information models: Readiness for ICT mass customisation

    NARCIS (Netherlands)

    Verdouw, C.N.; Beulens, A.J.M.; Trienekens, J.H.; Verwaart, D.

    2010-01-01

    Current dynamic demand-driven networks make great demands on, in particular, the interoperability and agility of information systems. This paper investigates how reference information models can be used to meet these demands by enhancing ICT mass customisation. It was found that reference models for

  14. A generalized model via random walks for information filtering

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Zhuo-Ming, E-mail: zhuomingren@gmail.com [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Kong, Yixiu [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland); Shang, Ming-Sheng, E-mail: msshang@cigit.ac.cn [Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, ChongQing, 400714 (China); Zhang, Yi-Cheng [Department of Physics, University of Fribourg, Chemin du Musée 3, CH-1700, Fribourg (Switzerland)

    2016-08-06

    There could exist a simple general mechanism lurking beneath the collaborative filtering and interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even enormous expansions of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy using hybrid degree information for objects of different popularity to achieve promising recommendation precision. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with the hybrid degree information improves the precision of recommendation.
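To illustrate how degree information enters such a random-walk recommender, the sketch below scores objects on a small user-object bipartite network. The `hybrid` exponent splits the normalisation between the source and target object degrees: `hybrid=1` gives mass-diffusion-style (probabilistic spreading) scores and `hybrid=0` heat-conduction-style scores. The function, its parameterisation and the toy network are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch of a degree-based random-walk recommender on a bipartite network.
# hybrid=1 ~ mass diffusion (normalise by the source object's degree);
# hybrid=0 ~ heat conduction (normalise by the target object's degree).
# All names and the toy network are illustrative assumptions.

def recommend(adjacency, user, hybrid=0.5):
    """adjacency[u][o] = 1 if user u collected object o.

    Returns scores for objects the given user has not yet collected.
    """
    n_users = len(adjacency)
    n_objects = len(adjacency[0])
    k_user = [sum(row) for row in adjacency]                 # user degrees
    k_obj = [sum(adjacency[u][o] for u in range(n_users))
             for o in range(n_objects)]                      # object degrees

    scores = {}
    for j in range(n_objects):
        if adjacency[user][j] or k_obj[j] == 0:
            continue  # skip already-collected or isolated objects
        s = 0.0
        for i in range(n_objects):
            if not adjacency[user][i] or k_obj[i] == 0:
                continue
            # resource spreads object i -> users -> object j
            overlap = sum(
                adjacency[u][i] * adjacency[u][j] / k_user[u]
                for u in range(n_users) if k_user[u] > 0
            )
            # hybrid normalisation mixing the two object degrees
            s += overlap / (k_obj[i] ** hybrid * k_obj[j] ** (1.0 - hybrid))
        scores[j] = s
    return scores

# Three users, four objects.
A = [
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
]
print(recommend(A, user=0, hybrid=1.0))  # mass-diffusion scores for objects 2, 3
```

Sweeping `hybrid` between 0 and 1 is one simple way to trade diversity against accuracy, in the spirit of the hybrid strategy the abstract proposes.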

  15. A generalized model via random walks for information filtering

    International Nuclear Information System (INIS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-01-01

    There could exist a simple general mechanism lurking beneath the collaborative filtering and interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account the degree information, the proposed generalized model can deduce the collaborative filtering and interdisciplinary physics approaches, and even enormous expansions of them. Furthermore, we analyze the generalized model with single and hybrid degree information on the process of random walk in bipartite networks, and propose a possible strategy using hybrid degree information for objects of different popularity to achieve promising recommendation precision. - Highlights: • We propose a generalized recommendation model employing the random walk dynamics. • The proposed model with single and hybrid degree information is analyzed. • A strategy with the hybrid degree information improves the precision of recommendation.

  16. Evaluation Model of Tea Industry Information Service Quality

    OpenAIRE

    Shi , Xiaohui; Chen , Tian’en

    2015-01-01

    International audience; According to the characteristics of tea industry information services, this paper builds a service quality evaluation index system for tea industry information service quality. R-cluster analysis and multiple regression have been used comprehensively to construct an evaluation model with high practicality and credibility. Proved by experiment, the evaluation model of information service quality has good precision, which has guidance significance to a certain extent to e...

  17. The Significance of Quality Assurance within Model Intercomparison Projects at the World Data Centre for Climate (WDCC)

    Science.gov (United States)

    Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.

    2014-12-01

    The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for data repositories, as they manage data through machine agents, so checks for homogeneity and consistency in early parts of the workflow become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5 and the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, together with the associated quality checks, are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with focus on the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such data citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, which consists of five maturity quality levels.

  18. Grid Technology and Quality Assurance

    International Nuclear Information System (INIS)

    Rippa, A.; Manieri, A.; Begin, M.E.; Di Meglio, A.

    2007-01-01

    Grid is one of the potential architectures of the coming years to support both research and commercial environments. Quality assurance techniques need both to adapt to these new architectures and to exploit them to improve their effectiveness. Software quality is a key issue in the Digital Era: industries as well as public administrations devote time to checking and verifying the quality of the ICT products and services they supply. The definition of automatic measurement of quality metrics is a key point for implementing effective QA methods. In this paper we propose a quality certification model, named the Grid-based Quality Certification Model (GQCM), that uses automatically calculable metrics to assess the quality of software applications; this model has been developed within the ETICS SSA4 activities and exploits grid technology for full automation of metrics calculation. It is however designed to be generic enough that it can be implemented using any automatic build and test tool. (Author)

  19. Levy Random Bridges and the Modelling of Financial Information

    OpenAIRE

    Hoyle, Edward; Hughston, Lane P.; Macrina, Andrea

    2009-01-01

    The information-based asset-pricing framework of Brody, Hughston and Macrina (BHM) is extended to include a wider class of models for market information. In the BHM framework, each asset is associated with a collection of random cash flows. The price of the asset is the sum of the discounted conditional expectations of the cash flows. The conditional expectations are taken with respect to a filtration generated by a set of "information processes". The information processes carry imperfect inf...

  20. MAGIC: Model and Graphic Information Converter

    Science.gov (United States)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  1. Quality assurance in microbiology

    Directory of Open Access Journals (Sweden)

    Arora D

    2004-01-01

    Full Text Available Quality assurance (QA) is the total process whereby the quality of laboratory reports can be guaranteed. The term quality control covers that part of QA which primarily concerns the control of errors in the performance of tests and verification of test results. All materials, equipment and procedures must be adequately controlled. Culture media must be tested for sterility and performance. Each laboratory must have standard operating procedures (SOPs). QA of the pre-analytical, analytical and post-analytical stages of microbiological procedures should be incorporated in the SOPs. The laboratory must be well lit with a dust-free, air-conditioned environment. Environmental conditions should be monitored. Supervisory and technical personnel should be well qualified. The laboratory should participate in external and internal quality assurance schemes.

  2. Quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    2003-03-01

    Good radiotherapy results and safety of treatment require the radiation to be optimally applied to a specified target area and the correct dose. According to international recommendations, the average uncertainty in therapeutic dose should not exceed 5%. The need for high precision in therapeutic dose requires quality assurance covering the entire radiotherapy process. Besides the physical and technical characteristics of the therapy equipment, quality assurance must include all radiotherapy equipment and procedures that are significant for the correct magnitude and precision of application of the therapeutic dose. The duties and responsibilities pertaining to various stages of treatment must also be precisely defined. These requirements may be best implemented through a quality system. The general requirements for supervision and quality assurance of medical radiation apparatus are prescribed in section 40 of the Radiation Act (592/1991, amendment 1142/1998) and in sections 18 and 32 of the Decree of the Ministry of Social Affairs and Health on the medical use of radiation (423/2000). Guide ST 2.2 imposes requirements on structural radiation shielding of radiotherapy equipment and the premises in which it is used, and on warning and safety arrangements. Guide ST 1.1 sets out the general safety principles for radiation practices and regulatory control procedure for the use of radiation. Guide ST 1.6 provides general requirements for operational measures in the use of radiation. This Guide sets out the duties of responsible parties (the party running a radiation practice) in respect of arranging and maintaining radiotherapy quality assurance. The principles set out in this Guide and Guide ST 6.3 may be applied to radionuclide therapy

  3. Power transformers quality assurance

    CERN Document Server

    Dasgupta, Indrajit

    2009-01-01

    About the Book: With the view to attain higher reliability in power system operation, quality assurance in the field of distribution and power transformers has claimed growing attention. Besides new developments in the material technology and manufacturing processes of transformers, regular diagnostic testing and maintenance of any engineering product may be ascertained by ensuring: right selection of materials and components and their quality checks; application of correct manufacturing processes and systems engineering; the user's awareness towards preventive maintenance. The

  4. Quality assurance in microbiology

    OpenAIRE

    Arora D

    2004-01-01

    Quality assurance (QA) is the total process whereby the quality of laboratory reports can be guaranteed. The term quality control covers that part of QA, which primarily concerns the control of errors in the performance of tests and verification of test results. All materials, equipment and procedures must be adequately controlled. Culture media must be tested for sterility and performance. Each laboratory must have standard operating procedures (SOPs). QA of pre-analytical, analytical and po...

  5. Introduction to quality assurance

    International Nuclear Information System (INIS)

    Raisic, N.

    1980-01-01

    Safety requirements set forth in regulatory requirements, codes and standards, as well as other requirements for various aspects of nuclear power plant design and operation, are strictly implemented through QA activities. The overall QA aim is to assure that the plant is soundly and correctly designed and that it is built, tested and operated in accordance with stringent quality standards and conservative engineering practices. In this way a high degree of freedom from faults and errors can be achieved. (orig.)

  6. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  7. Specified assurance level sampling procedure

    International Nuclear Information System (INIS)

    Willner, O.

    1980-11-01

    In the nuclear industry, design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus making available to the user a wide choice of plans, all designed to comply with a stated assurance level.
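Under a binomial model, the minimum sample size for a stated assurance level can be computed directly: for acceptance number c, find the smallest n such that a lot whose nonconforming fraction equals the tolerable limit p is accepted (c or fewer defectives found) with probability at most 1 − A. The sketch below is a generic illustration of this calculation, not a reproduction of the SAL tables; the function names and example numbers are assumptions.

```python
# Sketch: minimum attribute-sampling sample size for a stated assurance
# level, under a binomial model. Illustrative only, not the SAL tables.
from math import comb

def accept_probability(n, c, p):
    """P(accepting the lot): at most c defectives in a sample of n,
    when each item is nonconforming with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def minimum_sample_size(c, p, assurance):
    """Smallest n with accept_probability(n, c, p) <= 1 - assurance."""
    n = c + 1
    while accept_probability(n, c, p) > 1 - assurance:
        n += 1
    return n

# 90% assurance that the lot is no more than 10% nonconforming,
# for acceptance numbers 0 through 2:
for c in (0, 1, 2):
    print(f"c={c}: n={minimum_sample_size(c, p=0.10, assurance=0.90)}")
```

Larger acceptance numbers require larger samples for the same assurance level, which is the trade-off the SAL tables let the user choose among.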

  8. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP systems. ERP system functions and architecture are studied. A model of malicious impacts on the levels of the ERP system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches, is developed, built on a partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique, and the FRAP technique.

  9. Measurement quality assurance

    International Nuclear Information System (INIS)

    Eisenhower, E.H.

    1988-01-01

    The quality of a radiation protection program can be no better than the quality of the measurements made to support it. In many cases, that quality is unknown and is merely implied on the basis of a calibration of a measuring instrument. If that calibration is inappropriate or is performed improperly, the measurement result will be inaccurate and misleading. Assurance of measurement quality can be achieved if appropriate procedures are followed, including periodic quality control actions that demonstrate adequate performance. Several national measurement quality assurance (MQA) programs are operational or under development in specific areas. They employ secondary standards laboratories that provide a high-quality link between the National Bureau of Standards and measurements made at the field use level. The procedures followed by these secondary laboratories to achieve MQA will be described, as well as plans for similar future programs. A growing general national interest in quality assurance, combined with strong specific motivations for MQA in the area of ionizing radiation, will provide continued demand for appropriate national programs. Such programs must, however, employ procedures that are cost effective and must be developed with participation by all affected parties

  10. 222-S Laboratory Quality Assurance Plan. Revision 1

    International Nuclear Information System (INIS)

    Meznarich, H.K.

    1995-01-01

    This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical service. This document follows the U.S. Department of Energy (DOE) issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.

  11. [Is there adequate care for patients with psychosomatic disorders in Austria? Analysis of the need and a proposal for a model of quality assurance in Austrian psychosomatic medicine].

    Science.gov (United States)

    Leitner, Anton; Pieh, Christoph; Matzer, Franziska; Fazekas, Christian

    2013-01-01

    Quality assurance in psychosomatic medicine in Austria is currently based on a voluntary continuing medical education programme in psychosocial, psychosomatic and psychotherapeutic medicine. It is questionable whether psychosomatic care can be sufficiently provided in this manner. In addition, a broadly based proposal to create a subspecialty in psychosomatic medicine in order to facilitate quality assurance is investigated. The necessity to reorganize psychosomatic care was explored through semi-structured qualitative interviews with experts. Data-based analyses probed the labour market for the proposed subspecialty, and the literature was reviewed to examine the cost-benefit ratio of psychosomatic treatment. All experts expressed a need to restructure psychosomatic care in Austria. Examples exist of psychosomatic treatment with an efficient cost-benefit relation in diverse medical settings. Establishing a subspecialty in psychosomatic medicine seems feasible and could contribute to increased quality assurance and the nationwide provision of psychosomatic care.

  12. An information spreading model based on online social networks

    Science.gov (United States)

    Wang, Tao; He, Juanjuan; Wang, Xiaoxia

    2018-01-01

    Online social platforms have become very popular in recent years. In addition to spreading information, users can review or collect information on online social platforms. Based on the information spreading rules of online social networks, a new information spreading model, the IRCSS model, is proposed in this paper. It includes a sharing mechanism, a reviewing mechanism, a collecting mechanism and a stifling mechanism. Mean-field equations are derived to describe the dynamics of the IRCSS model. Moreover, the steady states of reviewers, collectors and stiflers and the effects of parameters on the peak values of reviewers, collectors and sharers are analyzed. Finally, numerical simulations are performed on different networks. Results show that the collecting and reviewing mechanisms, as well as the connectivity of the network, make information spread farther and faster; compared to the WS and ER networks, the speed of reviewing, sharing and collecting information is fastest on the BA network.

  13. Agricultural information dissemination using ICTs: A review and analysis of information dissemination models in China

    Directory of Open Access Journals (Sweden)

    Yun Zhang

    2016-03-01

    Full Text Available Over the last three decades, China's agriculture sector has been transformed from traditional to modern practice through the effective deployment of Information and Communication Technologies (ICTs). Information processing and dissemination have played a critical role in this transformation process. Many studies relating to agricultural information services have been conducted in China, but few have attempted to provide a comprehensive review and analysis of different information dissemination models and their applications. This paper aims to review and identify the ICT-based information dissemination models in China and to share the knowledge and experience of applying emerging ICTs to disseminate agricultural information to farmers and farm communities to improve productivity and economic, social and environmental sustainability. The paper reviews and analyzes the development stages of China's agricultural information dissemination systems and different mechanisms for agricultural information service development and operations. Seven ICT-based information dissemination models are identified and discussed. Success cases are presented. The findings provide a useful direction for researchers and practitioners in developing future ICT-based information dissemination systems. It is hoped that this paper will also help other developing countries to learn from China's experience and best practice in their endeavor to apply emerging ICTs in agricultural information dissemination and knowledge transfer.

  14. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from an information-processing view.

  15. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening with information-processing algorithms. The theoretical analysis and the simulation results show that the model accords with memory phenomena from an information-processing view. PMID:27876847

  16. Quality assurance for geologic investigations

    International Nuclear Information System (INIS)

    Delvin, W.L.; Gustafson, L.D.

    1983-01-01

    A quality assurance handbook was written to provide guidance in the application of quality assurance to geologic work activities associated with the National Waste Terminal Storage (NWTS) Program. It is intended to help geoscientists and NWTS program managers in applying quality assurance to their work activities and projects by showing how technical and quality assurance practices are integrated to provide control within those activities and projects. The use of the guidance found in this handbook should help provide consistency in the interpretation of quality assurance requirements across the various geologic activities within the NWTS Program. This handbook can also assist quality assurance personnel in understanding the relationships between technical and quality assurance practices. This paper describes the handbook.

  17. Quality Assurance in Higher Education: A Review of Literature

    Science.gov (United States)

    Ryan, Tricia

    2015-01-01

    This paper examines the literature surrounding quality assurance in global higher education. It provides an overview of accreditation as a mechanism to ensure quality in higher education, examines models of QA, and explores the concept of quality (including definitions of quality and quality assurance). In addition, this paper provides a review of…

  18. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others]

    1997-08-01

    This is China's first case study of high-resolution reservoir geological modelling using outcrop information. The key to the modelling process is to build a prototype model and use it as a geological knowledge bank. Outcrop information used in geological modelling includes seven aspects: (1) determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) determining the model's direction based on paleocurrent statistics; (4) estimating sandbody communication by photomosaics and profiles; (6) estimating reservoir property distribution within sandbodies by lithofacies analysis; and (7) building the reservoir model at sandbody scale by architectural element analysis and 3-D sampling. A high-resolution reservoir geological model of the Youshashan oil field has been built using this method.

  19. Assuring quality in high-consequence engineering

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, Marcey L.; Kolb, Rachel R.

    2014-03-01

    In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.

  20. SWMM 5 REDEVELOPMENT QUALITY ASSURANCE PROGRAM

    Science.gov (United States)

    EPA recently released a new version of the Storm Water Management Model (SWMM) that combines a new interface with a completely re-written computational engine. The SWMM redevelopment project proceeded under a Quality Assurance Project Plan (QAPP) that describes methods and proced...

  1. Quality Assurance in Distance and Open Learning

    Science.gov (United States)

    Mahafzah, Mohammed Hasan

    2012-01-01

    E-learning has become an increasingly important teaching and learning mode in educational institutions and corporate training. The evaluation of E-learning, however, is essential for the quality assurance of E-learning courses. This paper constructs a three-phase evaluation model for E-learning courses, which includes development, process, and…

  2. MCNP trademark Software Quality Assurance plan

    International Nuclear Information System (INIS)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 900

  3. One decade of the Data Fusion Information Group (DFIG) model

    Science.gov (United States)

    Blasch, Erik

    2015-05-01

    The revision of the Joint Directors of Laboratories (JDL) information fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. The first is the role of the analyst in providing semantic queries (through an ontology) so that the vast amount of available data can be indexed, accessed, retrieved, and processed. The second is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last is the interpretation of the resolved information, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments of the last decade demonstrate the usability of the DFIG model in bringing together the user (analyst or operator) and the machine (information fusion or manager) in systems design.

  4. Conceptual Modeling of Events as Information Objects and Change Agents

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    Traditionally, semantic data models have not supported the modeling of behavior. We present an event modeling approach that can be used to extend semantic data models like the entity-relationship model and the functional data model. We model an event as a two-sided phenomenon that is seen as a totality of an information object and a change agent. When an event is modeled as an information object it is comparable to an entity that exists only at a specific point in time; it has attributes and can be used for querying and specification of constraints. When an event is modeled as a change agent it is comparable to an executable transaction schema. Finally, we briefly compare our approach to object-oriented approaches based on encapsulated objects.

  5. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    Science.gov (United States)

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows up a previous interview study…

  6. Research on new information service model of the contemporary library

    International Nuclear Information System (INIS)

    Xin Pingping; Lu Yan

    2010-01-01

    With the development of the internet and multimedia technology, information service models in the contemporary library comprise both traditional and digital information services. Libraries in every country strive to integrate voluminous information and complex technology in back-end management while making the front-end interface ever more convenient for users. The essential characteristics of contemporary library information services are integration and human-centredness. In this article, we describe in detail several new information service models of the contemporary library, such as individualized service, reference service and strategic information service. (authors)

  7. A Participatory Model for Multi-Document Health Information Summarisation

    Directory of Open Access Journals (Sweden)

    Dinithi Nallaperuma

    2017-03-01

    Full Text Available Increasing availability of, and access to, health information has produced a paradigm shift in healthcare provision, as it empowers patients and practitioners alike. Besides awareness, significant time savings and process efficiencies can be achieved through effective summarisation of healthcare information. Relevance and accuracy are key concerns when generating summaries for such documents. Despite advances in automated summarisation approaches, the role of participation has not been explored. In this paper, we propose a new model for multi-document health information summarisation that takes into account the role of participation. The updated IS user participation theory was extended to explicate these roles. The proposed model integrates both extractive and abstractive summarisation processes with continuous participatory inputs to each phase. The model was implemented as a client-server application and evaluated by both domain experts and health information consumers. Results from the evaluation phase indicate that the model is successful in generating relevant and accurate summaries for diverse audiences.

  8. A model for information retrieval driven by conceptual spaces

    OpenAIRE

    Tanase, D.

    2015-01-01

    A retrieval model describes the transformation of a query into a set of documents. The question is: what drives this transformation? For semantic information retrieval models, this transformation is driven by the content and structure of the semantic models. In this case, Knowledge Organization Systems (KOSs) are the semantic models that encode the meaning employed for monolingual and cross-language retrieval. The focus of this research is the relationship between these meanings’ repre...

  9. Fisher information and quantum potential well model for finance

    International Nuclear Information System (INIS)

    Nastasiuk, V.A.

    2015-01-01

    The probability distribution function (PDF) for prices on financial markets is derived by extremization of Fisher information. It is shown how the quantum-like description of financial markets arises on that basis and how different financial market models are mapped to quantum mechanical ones. - Highlights: • The financial Schrödinger equation is derived using the principle of minimum Fisher information. • Statistical models of price variation are mapped to quantum models of a coupled particle. • The model of a quantum particle in a parabolic potential well corresponds to the efficient market.
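    The record does not reproduce the derivation. As a hedged sketch of the standard minimum-Fisher-information route to a Schrödinger-like equation (the constraint functions f_k and multipliers below are generic placeholders, not symbols from the cited paper):

    ```latex
    % Fisher information of a price-return density p(x):
    I[p] \;=\; \int \frac{\bigl(p'(x)\bigr)^2}{p(x)}\,dx .
    % Extremize I[p] subject to observed market constraints f_k,
    % with Lagrange multipliers \lambda_k:
    \delta\Bigl( I[p] \;-\; \sum_k \lambda_k \int f_k(x)\,p(x)\,dx \Bigr) \;=\; 0 .
    % Substituting p = \psi^2 gives I[p] = 4 \int \bigl(\psi'(x)\bigr)^2 dx,
    % and the Euler--Lagrange equation takes a stationary Schrodinger-like form:
    -\,\psi''(x) \;-\; \tfrac{1}{4}\Bigl(\sum_k \lambda_k f_k(x)\Bigr)\,\psi(x) \;=\; 0 .
    ```

    Under this reading, a quadratic constraint f(x) = x² plays the role of the parabolic potential well mentioned in the highlights.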

  10. Generalisation of geographic information cartographic modelling and applications

    CERN Document Server

    Mackaness, William A; Sarjakoski, L Tiina

    2011-01-01

    Theoretical and Applied Solutions in Multi-Scale Mapping. Users have come to expect instant access to up-to-date geographical information with global coverage, presented at widely varying levels of detail as digital and paper products, and customisable data that can readily be combined with other geographic information. These requirements present an immense challenge to those supporting the delivery of such services (national mapping agencies, government departments, and private businesses). Generalisation of Geographic Information: Cartographic Modelling and Applications provides a detailed review.

  11. Proposing a Metaliteracy Model to Redefine Information Literacy

    Science.gov (United States)

    Jacobson, Trudi E.; Mackey, Thomas P.

    2013-01-01

    Metaliteracy is envisioned as a comprehensive model for information literacy to advance critical thinking and reflection in social media, open learning settings, and online communities. At this critical time in higher education, an expansion of the original definition of information literacy is required to include the interactive production and…

  12. A hierarchical modeling of information seeking behavior of school ...

    African Journals Online (AJOL)

    The aim of this study was to investigate the information-seeking behavior of school teachers in public primary schools in rural areas of Nigeria and to draw up a model of their information-seeking behavior. A cross-sectional survey design was employed to carry out the research. Findings showed that the ...

  13. Modeling of Information Security Strategic Planning Methods and Expert Assessments

    Directory of Open Access Journals (Sweden)

    Alexander Panteleevich Batsula

    2014-09-01

    Full Text Available The paper addresses the problem of increasing the level of information security. As a result, a method for increasing the level of information security is developed through modeling of strategic planning, SWOT analysis, and expert assessments.

  14. Exploring a "Gap" Model of Information Services Quality.

    Science.gov (United States)

    Kettinger, William J.; Lee, Choong C.

    1995-01-01

    Outlines information systems (IS) service quality improvement to cope with a customer-driven IS environment resulting from the growth of end-user computing, information technology decentralization, and alternative sources of supply. It adapts a conceptual "gap" model from the marketing field as a framework for IS service quality management. (67…

  15. An information model of a centralized admission campaign in ...

    African Journals Online (AJOL)

    The aim of the work is to structure individual application environments of the information model of a centralized admission campaign in higher education institutions in Russia by modifying the corresponding structure of the Federal information system supporting state final examination and admission procedures. The ...

  16. Dietary information improves cardiovascular disease risk prediction models.

    Science.gov (United States)

    Baik, I; Cho, N H; Kim, S H; Shin, C

    2013-01-01

    Data are limited on cardiovascular disease (CVD) risk prediction models that include dietary predictors. Using known risk factors and dietary information, we constructed and evaluated CVD risk prediction models. Data for modeling came from population-based prospective cohort studies comprising 9026 men and women aged 40-69 years. At baseline, all were free of known CVD and cancer, and they were followed up for CVD incidence over an 8-year period. We used Cox proportional hazards regression analysis to construct a traditional risk factor model, an office-based model, and two diet-containing models, and we evaluated these models by calculating the Akaike information criterion (AIC), C-statistics, integrated discrimination improvement (IDI), net reclassification improvement (NRI) and a calibration statistic. We constructed diet-containing models with significant dietary predictors such as poultry, legume, carbonated soft drink or green tea consumption. Adding dietary predictors to the traditional model yielded a decrease in AIC (delta AIC=15), a 53% increase in relative IDI, and improvements in NRI (category-free NRI=0.14 versus the traditional model; category-free NRI=0.08, P<0.01, versus the office-based model). The calibration plots for risk prediction demonstrated that the inclusion of dietary predictors contributes to better agreement in persons at high risk for CVD. C-statistics for the four models were acceptable and comparable. We suggest that dietary information may be useful in constructing CVD risk prediction models.
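    The AIC comparison reported above can be illustrated with a small sketch. The log-likelihoods and parameter counts below are invented for illustration (chosen so that delta AIC happens to equal 15); they are not values from the cited cohort study.

    ```python
    # Comparing a risk model without and with dietary predictors by the
    # Akaike information criterion (AIC). Hypothetical numbers only.

    def aic(log_likelihood, n_params):
        """AIC = 2k - 2*ln(L); lower values indicate a better trade-off
        between goodness of fit and model complexity."""
        return 2 * n_params - 2 * log_likelihood

    # Hypothetical fitted log-likelihoods:
    ll_traditional = -1520.0   # traditional risk-factor model, 8 parameters
    ll_with_diet   = -1508.5   # plus 4 dietary predictors, 12 parameters

    aic_traditional = aic(ll_traditional, 8)
    aic_with_diet = aic(ll_with_diet, 12)
    delta_aic = aic_traditional - aic_with_diet
    print(delta_aic)  # positive delta favors the diet-containing model
    ```

    By convention the model with the lower AIC is preferred; each extra parameter costs 2 points, so the four dietary predictors must improve the log-likelihood by more than 4 in total before the diet-containing model wins.
    
    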

  17. Model choice considerations and information integration using analytical hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC; Ross, Timothy J. [UNM

    2010-10-15

    Using information-gap theory for decision-making under severe uncertainty, it has been shown that comparing model output to experimental data involves irrevocable trade-offs between fidelity-to-data, robustness-to-uncertainty and confidence-in-prediction. We illustrate a strategy for information integration by gathering and aggregating all available data, knowledge, theory, experience, and similar applications. Such integration of information becomes important when the physics is difficult to model, when observational data are sparse or difficult to measure, or both. To aggregate the available information, we take an inference perspective. Models are neither rejected nor wasted, but can be integrated into a final result. We show an example of information integration using Saaty's Analytic Hierarchy Process (AHP), integrating theory, simulation output and experimental data. We used expert elicitation to determine weights for two models and two experimental data sets by forming pair-wise comparisons between model output and experimental data. In this way we transform epistemic and/or statistical strength from one field of study into another branch of physical application. The price of utilizing all available knowledge is that the inferences drawn from the integrated information must be accounted for, and the costs can be considerable. Focusing on inferences and inference uncertainty (IU) is one way to understand complex information.
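    The weighting step described above can be sketched as follows. This is a generic AHP priority calculation using the geometric-mean approximation; the 4x4 pairwise-comparison matrix is a hypothetical illustration, not the authors' elicited judgments.

    ```python
    # Sketch of Saaty's AHP weighting for [model A, model B, experiment 1,
    # experiment 2]. Entry [i][j] states how strongly item i is preferred
    # over item j on Saaty's 1-9 scale (with reciprocals below the diagonal).

    def ahp_weights(matrix):
        """Approximate AHP priority weights by the geometric-mean method:
        take the geometric mean of each row, then normalize to sum to 1."""
        n = len(matrix)
        geo_means = []
        for row in matrix:
            prod = 1.0
            for a in row:
                prod *= a
            geo_means.append(prod ** (1.0 / n))
        total = sum(geo_means)
        return [g / total for g in geo_means]

    # Hypothetical reciprocal pairwise judgments:
    comparisons = [
        [1,   3,   1/2, 2],
        [1/3, 1,   1/4, 1],
        [2,   4,   1,   3],
        [1/2, 1,   1/3, 1],
    ]

    weights = ahp_weights(comparisons)
    print([round(w, 3) for w in weights])  # normalized weights, sum to 1
    ```

    The geometric-mean method closely approximates the principal-eigenvector weights Saaty defined, and in a full AHP study one would also check the consistency ratio of the judgment matrix before trusting the weights.
    
    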

  18. SU-F-T-276: Source Modeling and VMAT Quality Assurance Referring to the TrueBeam Representative Beam Data for Eclipse

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Q [Beijing Hospital, Beijing (China)

    2016-06-15

    Purpose: To study quality assurance (QA) of volumetric modulated arc therapy (VMAT) after 6MV and 10MV photon beam source modeling, referring to the Varian TrueBeam representative beam data for Eclipse. Methods: The source model needs specific measured beam data, such as PDDs and profiles, diagonal profiles, output factors (OFs), and the MLC transmission factor (TF) and dosimetric leaf gap (DLG). We downloaded the representative data from the myVarian website, which includes TrueBeam 4MV-15MV photon beam data and 6MeV-22MeV electron beam data in w2CAD file format for use with Eclipse and in Excel spreadsheet format for use in data comparison. The beam data in w2CAD format can be imported into the Eclipse system and calibrated for use, as appropriate. We used a PTW MP3 water tank to measure the beam data for some typical field sizes and compared the measured data with the representative data. We found that the PDDs, profiles and OFs are similar. However, according to some papers and our measurements, we determined our MLC TF and DLG to be 1.58 and 1.33 (6MV), and 1.79 and 1.57 (10MV), respectively. After configuring the anisotropic analytical algorithm (AAA) with the representative data in Eclipse, we also performed dosimetric verification for 88 VMAT plans. Results: The end-to-end test procedures of VMAT were performed for the 6MV and 10MV energy modes. The NE Farmer ion chamber measurements showed mean differences of 1.2% (6MV, 38 cases) and 1.2% (10MV, 50 cases) between measurement and calculation; the Sun Nuclear ArcCheck measurements demonstrated mean gamma pass rates as follows: 98.9%, 93.2% and 61.0% for 6MV, and 98.9%, 91.9% and 59.5% for 10MV, using 3%/3mm, 2%/2mm and 1%/1mm criteria with a 10% threshold, respectively. Conclusion: The representative data are applicable to our TrueBeam for VMAT plans, though our MLC factors differ slightly, and patient-specific QA results are good.

  19. Quality assurance services

    International Nuclear Information System (INIS)

    For over 20 years the quality assurance services at the Springfields Laboratories have been concerned with manufacturing both simple and complex engineering products to the highest standard. The scientists working there have considerable expertise in the practical application of quality control and the development and design of inspection and non-destructive testing equipment. The folder contains six sheets or leaflets illustrating the work and equipment. The subjects are the mechanical standards laboratory, non-destructive testing, the digitising table, the peripheral camera, automated measurement, data handling and presentation, and the computer controlled three axis co-ordinate measuring machine. (U.K.)

  20. SHIR competitive information diffusion model for online social media

    Science.gov (United States)

    Liu, Yun; Diao, Su-Meng; Zhu, Yi-Xiang; Liu, Qing

    2016-11-01

    In online social media, opinion divergences and differentiation generally exist as a result of individuals' extensive participation and personalization. In this paper, a Susceptible-Hesitated-Infected-Removed (SHIR) model is proposed to study the dynamics of competitive dual information diffusion. The proposed model extends the classical SIR model by adding hesitators as a neutral state in the competition between the two pieces of information. Both hesitators and stable spreaders facilitate information dissemination. Examining the impacts of the diffusion parameters, we find that the final density of stiflers increases monotonically as the infection rate increases and the removal rate decreases, and that the advantaged information with the larger stable transition rate dominates the overall influence of the dual information. The density of the disadvantaged information's spreaders grows slightly with the increase of its stable transition rate, while the total spreaders of the dual information and the relaxation time remain almost unchanged. Moreover, simulations imply that the final result of the competition is closely related to the ratio of the stable transition rates of the dual information. If the stable transition rates are nearly the same, a slight reduction of the smaller one produces a significant disadvantage in its propagation coverage. Additionally, the relationship between the ratio of final stiflers and the ratio of stable transition rates exhibits a power-law characteristic.
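    A rough numerical sketch of a single-information SHIR-style compartment chain is shown below. The record does not reproduce the paper's mean-field equations, so the transition structure (susceptible to hesitated on contact, hesitated to infected at a "stable transition" rate, infected to removed at a removal rate) and all parameter values are assumptions for illustration only; the competitive dual-information version would track two such chains coupled through shared susceptibles.

    ```python
    # Forward-Euler integration of an assumed SHIR mean-field system on a
    # well-mixed population. All rates are hypothetical.

    def simulate_shir(beta=0.4, alpha=0.3, mu=0.1, steps=2000, dt=0.05):
        S, H, I, R = 0.99, 0.0, 0.01, 0.0  # initial population fractions
        for _ in range(steps):
            new_hesitant = beta * S * I    # susceptibles contact spreaders
            new_infected = alpha * H       # hesitators commit to spreading
            new_removed  = mu * I          # spreaders turn into stiflers
            S -= dt * new_hesitant
            H += dt * (new_hesitant - new_infected)
            I += dt * (new_infected - new_removed)
            R += dt * new_removed
        return S, H, I, R

    final = simulate_shir()
    print([round(x, 3) for x in final])  # fractions still sum to ~1
    ```

    Because every outflow of one compartment is an inflow of another, the four fractions stay normalized; sweeping the assumed stable transition rate alpha is one way to reproduce qualitatively the monotonic effects the abstract describes.
    
    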

  1. Modeling Interoperable Information Systems with 3LGM² and IHE.

    Science.gov (United States)

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. Our objectives are to link IHE concepts to 3LGM² concepts within the 3LGM² tool; to describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models; and to describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts, such as domains, integration profiles, actors and transactions, to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was modeled with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus information managers and IHE

  2. A Process Model for Goal-Based Information Retrieval

    Directory of Open Access Journals (Sweden)

    Harvey Hyman

    2014-12-01

    Full Text Available In this paper we examine the domain of information search and propose a "goal-based" approach to study search strategy. We describe "goal-based information search" using a framework of Knowledge Discovery. We identify two Information Retrieval (IR) goals using the constructs of Knowledge Acquisition (KA) and Knowledge Explanation (KE). We classify these constructs into two specific information problems: an exploration-exploitation problem and an implicit-explicit problem. Our proposed framework is an extension of prior work in this domain, applying an IR Process Model originally developed for Legal-IR and adapted to Medical-IR. The approach in this paper is guided by the recent ACM-SIG Medical Information Retrieval (MedIR) Workshop definition: "methodologies and technologies that seek to improve access to medical information archives via a process of information retrieval."

  3. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we describe an incremental approach to requirements validation and systems modelling. Formal modelling facilitates a high degree of automation: it serves for validation and traceability. The foundation for our approach are requirements that are structured according to the WRSPM reference model. We provide a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements...

  4. How informative are slip models for aftershock forecasting?

    Science.gov (United States)

    Bach, Christoph; Hainzl, Sebastian

    2013-04-01

    Coulomb stress changes (ΔCFS) have been recognized as a major trigger mechanism for earthquakes, in particular aftershock distributions and the spatial patterns of ΔCFS are often found to be correlated. However, the Coulomb stress calculations are based on slip inversions and the receiver fault mechanisms which both contain large uncertainties. In particular, slip inversions are usually non-unique and often differ strongly for the same earthquakes. Here we want to address the information content of those inversions with respect to aftershock forecasting. Therefore we compare the slip models to randomized fractal slip models which are only constrained by fault information and moment magnitude. The uncertainty of the aftershock mechanisms is considered by using many receiver fault orientations, and by calculating ΔCFS at several depth layers. The stress change is then converted into an aftershock probability map utilizing a clock advance model. To estimate the information content of the slip models, we use an Epidemic Type Aftershock Sequence (ETAS) model approach introduced by Bach and Hainzl (2012), where the spatial probability density of direct aftershocks is related to the ΔCFS calculations. Besides the directly triggered aftershocks, this approach also takes secondary aftershock triggering into account. We quantify our results by calculating the information gain of the randomized slip models relative to the corresponding published slip model. As case studies, we investigate the aftershock sequences of several well-known main shocks such as 1992 Landers, 1999 Hector Mine, 2004 Parkfield, 2002 Denali. First results show a huge difference in the information content of slip models. For some of the cases up to 90% of the random slip models are found to perform better than the originally published model, for some other cases only few random models are found performing better than the published slip model.
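The information-gain comparison described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the study's spatial ETAS likelihood calculation: for each observed aftershock, compare the probability a candidate forecast assigned to its location with the probability the reference forecast assigned, and average the log ratios.

```python
import math

def information_gain(probs_model, probs_reference):
    """Mean log-likelihood gain per observed aftershock of one forecast
    over another. Both lists hold the probability each model assigned to
    the cell in which an aftershock actually occurred (toy values)."""
    assert len(probs_model) == len(probs_reference)
    n = len(probs_model)
    return sum(math.log(p / q) for p, q in zip(probs_model, probs_reference)) / n

# A forecast that concentrates probability where aftershocks actually
# occur gains information over a diffuse reference forecast.
gain = information_gain([0.30, 0.25, 0.40], [0.10, 0.10, 0.10])
```

A positive gain means the candidate model (here, a published or randomized slip model converted to a ΔCFS-based probability map) outperforms the reference on the observed sequence.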

  5. Data retrieval systems and models of information situations

    International Nuclear Information System (INIS)

    Jankowski, L.

    1984-01-01

    Demands placed on data retrieval systems and their basic parameters are given. According to the stage of development of data collection and processing, data retrieval systems may be divided into systems for the simple recording and provision of data, systems for recording and providing data with integrated statistical functions, and logical information systems. The structure of these information systems is characterized, as are the methods of processing and representing facts. The notion of ''artificial intelligence'' in the development of logical information systems is defined. In logical information systems related to nuclear research, the structure of representing knowledge in diverse forms of the model is decisive. The main model elements are the characteristics of the data, the forms of representation and the program. Depending on the structure of the data, the structure of the preparatory and transformation algorithms, and the aim of the system, it is possible to classify data retrieval systems related to nuclear research and technology into five logical information models: linear, identification, advisory, theory-experiment and problem-solving models. The characteristics of these models are given, together with examples of data retrieval systems for the individual models. (E.S.)

  6. Modeling the Informal Economy in Mexico. A Structural Equation Approach

    OpenAIRE

    Brambila Macias, Jose

    2008-01-01

    This paper uses annual data for the period 1970-2006 to estimate and investigate the evolution of the Mexican informal economy. To do so, we model the informal economy as a latent variable and try to explain it through relationships between possible cause and indicator variables using structural equation modeling (SEM). Our results indicate that the Mexican informal sector at the beginning of the 1970s initially accounted for 40 percent of GDP while slightly decreasing to s...

  7. Quality-Assurance Program Plan

    International Nuclear Information System (INIS)

    Kettell, R.A.

    1981-05-01

    This Quality Assurance Program Plan (QAPP) describes the Quality Assurance Program applied to the waste management activities conducted by AESD-Nevada Operations at the E-MAD Facility located in Area 25 of the Nevada Test Site. The AESD-Nevada Operations QAPP provides the necessary systematic and administrative controls to assure that activities affecting quality, safety, reliability, and maintainability during design, procurement, fabrication, inspection, shipment, testing, and storage are conducted in accordance with established requirements.

  8. Quality assurance and reliability

    International Nuclear Information System (INIS)

    Normand, J.; Charon, M.

    1975-01-01

    Concern for obtaining high-quality products which will function properly when required to do so is nothing new - it is one manifestation of a conscientious attitude to work. However, the complexity and cost of equipment and the consequences of even temporary immobilization are such that it has become necessary to make special arrangements for obtaining high-quality products and examining what one has obtained. Each unit within an enterprise must examine its own work or arrange for it to be examined; a unit whose specific task is quality assurance is responsible for overall checking, but does not relieve other units of their responsibility. Quality assurance is a form of mutual assistance within an enterprise, designed to remove the causes of faults as far as possible. It begins very early in a project and continues through the ordering stage, construction, start-up trials and operation. Quality and hence reliability are the direct result of what is done at all stages of a project. They depend on constant attention to detail, for even a minor piece of poor workmanship can, in the case of an essential item of equipment, give rise to serious operational difficulties

  9. Quality assurance records system

    International Nuclear Information System (INIS)

    1979-01-01

    This Safety Guide was prepared as part of the Agency's programme, referred to as the NUSS programme, for establishing Codes of Practice and Safety Guides relating to nuclear power plants. It supplements the IAEA Code of Practice on Quality Assurance for Safety in Nuclear Power Plants (IAEA Safety Series No.50-C-QA), which requires that for each nuclear power plant a system for the generation, identification, collection, indexing, filing, storing, maintenance and disposition of quality assurance records shall be established and executed in accordance with written procedures and instructions. The purpose of this Safety Guide is to provide assistance in the establishment and operation of such a system. An orderly established and maintained records system is considered to be part of the means of providing a basis for an appropriate level of confidence that the activities which affect the quality of a nuclear power plant have been performed in accordance with the specific requirements and that the required quality has been achieved and is maintained

  10. Quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Tripathi, U.B.

    1998-01-01

    Quality assurance in radiotherapy embodies all those procedures that ensure consistency of the clinical prescription and correct fulfilment of that prescription as regards dose to the target volume, together with minimal dose to normal tissue, minimal exposure of occupational workers and adequate patient monitoring aimed at determining the end result of the treatment. This definition aptly describes the role of quality assurance (QA) in radiotherapy practice. QA covers the needs of the different systems and sub-systems of the equipment, dose measuring equipment and techniques, dose delivery methodologies, the treatment planning system, plan evaluation, follow-up, etc. It should clearly define the tolerance limits, action and intervention levels and test frequencies for different test parameters. This paper dwells on some of these topics in some detail, while only passing references are made to others. The rationale for tolerance limits and test frequencies is discussed. Attention is also focussed on the definitions and implementation of the action and intervention levels.

  11. Information behavior versus communication: application models in multidisciplinary settings

    Directory of Open Access Journals (Sweden)

    Cecília Morena Maria da Silva

    2015-05-01

    Full Text Available This paper deals with information behavior as support for models of communication design in the areas of Information Science, Library and Music. The communication models proposition is based on the models of Tubbs and Moss (2003), Garvey and Griffith (1972), adapted by Hurd (1996), and Wilson (1999). Therefore, the questions arose: (i) what are the informational skills required of librarians who act as mediators in the scholarly communication process, and what is the informational behavior of users in the educational environment?; (ii) what are the needs of music-related researchers, and how do they produce, seek, use and access the scientific knowledge of their area?; and (iii) how do the contexts involved in scientific collaboration processes influence the scientific production of the information science field in Brazil? The article includes a literature review on information behavior and its insertion in scientific communication, considering the influence of the context and/or situation of the objects involved in the motivating issues. The hypothesis is that user information behavior in different contexts and situations influences the definition of a scientific communication model. Finally, it is concluded that the same concept or a set of concepts can be used in different perspectives, thus reaching different results.

  12. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge diversity of information and information assets that are produced in different systems, such as KMS, financial and accounting systems, official and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. Several benefits of this model cause organizations to have a strong tendency toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, first, according to the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model were discussed and verified according to the Delphi method and expert comments.

  13. Information matrix estimation procedures for cognitive diagnostic models.

    Science.gov (United States)

    Liu, Yanlou; Xin, Tao; Andersson, Björn; Tian, Wei

    2018-03-06

    Two new methods to estimate the asymptotic covariance matrix for marginal maximum likelihood estimation of cognitive diagnosis models (CDMs), the inverse of the observed information matrix and the sandwich-type estimator, are introduced. Unlike several previous covariance matrix estimators, the new methods take into account both the item and structural parameters. The relationships between the observed information matrix, the empirical cross-product information matrix, the sandwich-type covariance matrix and the two approaches proposed by de la Torre (2009, J. Educ. Behav. Stat., 34, 115) are discussed. Simulation results show that, for a correctly specified CDM and Q-matrix or with a slightly misspecified probability model, the observed information matrix and the sandwich-type covariance matrix exhibit good performance with respect to providing consistent standard errors of item parameter estimates. However, with substantial model misspecification only the sandwich-type covariance matrix exhibits robust performance. © 2018 The British Psychological Society.
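As a minimal illustration of the sandwich-type estimator discussed above, here is a scalar sketch using a Bernoulli MLE rather than an actual CDM (all data and names are hypothetical): the variance estimate combines the observed information H with the cross-product of per-observation scores B as H⁻¹BH⁻¹.

```python
def sandwich_variance(xs):
    """Sandwich-type variance estimate for the MLE of a Bernoulli
    probability -- a scalar stand-in for the item-parameter case."""
    n = len(xs)
    p = sum(xs) / n                      # maximum likelihood estimate
    # Observed information: minus the sum of second derivatives
    # of the per-observation log-likelihoods, evaluated at the MLE.
    H = sum(x / p**2 + (1 - x) / (1 - p)**2 for x in xs)
    # Empirical cross-product of the per-observation scores.
    B = sum((x / p - (1 - x) / (1 - p))**2 for x in xs)
    return B / H**2                      # H^{-1} B H^{-1} in scalar form

xs = [1, 0, 1, 1, 0, 1, 0, 1]
var = sandwich_variance(xs)
```

For a correctly specified model, B and H agree at the MLE and the sandwich collapses to the usual inverse-information variance, here p(1-p)/n; under misspecification the two parts diverge, which is why only the sandwich remains robust.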

  14. Scope of Building Information Modeling (BIM) in India

    Directory of Open Access Journals (Sweden)

    Mahua Mukherjee

    2009-01-01

    Full Text Available The design communication is gradually changing from a 2D-based to an integrated 3D digital interface. Building Information Modeling (BIM) is a model-based design concept, in which buildings are built virtually before they get built out in the field, with data models organized for complete integration of all relevant factors in the building lifecycle. BIM also manages the information exchange between the AEC (Architects, Engineers, Contractors) professionals to strengthen the interaction within the design team. BIM is shared knowledge about the information for decision making during a building's lifecycle. There is still much to be learned about the opportunities and implications of this tool. This paper deals with a status check of BIM application in India; to do that, a survey has been designed to check the acceptance of BIM to date, while this application is widely accepted throughout the industry in many countries for managing project information, with capabilities for cost control and facilities management.

  15. Microsoft Repository Version 2 and the Open Information Model.

    Science.gov (United States)

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  16. On the Enterprise Modelling of an Educational Information Infrastructure

    NARCIS (Netherlands)

    Widya, I.A.; Volman, C.J.A.M.; Pokraev, S.; de Diana, I.P.F.; Michiels, E.F.; Miranda, P.; Sharp, B.; Pakstas, A.; Filipe, J.

    2002-01-01

    This paper reports the modelling exercise of an educational information infrastructure that aims to support the organisation of teaching and learning activities suitable for a wide range of didactic policies. The modelling trajectory focuses on capturing invariant structures of relations between

  17. User-Oriented and Cognitive Models of Information Retrieval

    DEFF Research Database (Denmark)

    Ingwersen, Peter; Järvelin, Kalervo; Skov, Mette

    2017-01-01

    The domain of user-oriented and cognitive information retrieval (IR) is first discussed, followed by a discussion on the dimensions and types of models one may build for the domain. The focus of the present entry is on the models of user-oriented and cognitive IR, not on their empirical...

  18. The value of structural information in the VAR model

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2003-01-01

    textabstractEconomic policy decisions are often informed by empirical economic analysis. While the decision-maker is usually only interested in good estimates of outcomes, the analyst is interested in estimating the model. Accurate inference on the structural features of a model, such as

  19. Museum Information System of Serbia recent approach to database modeling

    OpenAIRE

    Gavrilović, Goran

    2007-01-01

    The paper offers an illustration of the main parameters for museum database design (a case study of the Integrated Museum Information System of Serbia). A simple case of museum data model development and implementation is described. The main aim is to present the advantages of the ORM (Object Role Modeling) methodology, using Microsoft Visio as suitable software support for formalizing museum business rules.

  20. Informatics in radiology: an information model of the DICOM standard.

    Science.gov (United States)

    Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L

    2011-01-01

    The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010
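The real-world portion of such a model, patients containing studies containing images, can be sketched as plain classes. The class names, attributes, and UID values below are simplified hypothetical stand-ins, not the DICOM Ontology's actual terms:

```python
from dataclasses import dataclass, field

# Toy real-world model: a patient has studies, a study has images.
# In the actual ontology these real-world entities are then linked
# to the DICOM information entities that encode them.
@dataclass
class Image:
    sop_instance_uid: str          # hypothetical example UID

@dataclass
class Study:
    study_instance_uid: str
    images: list = field(default_factory=list)

@dataclass
class Patient:
    patient_id: str
    studies: list = field(default_factory=list)

p = Patient("P001")
s = Study("1.2.840.99")
s.images.append(Image("1.2.840.99.1"))
p.studies.append(s)
```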

  1. Changing Models for Researching Pedagogy with Information and Communications Technologies

    Science.gov (United States)

    Webb, M.

    2013-01-01

    This paper examines changing models of pedagogy by drawing on recent research with teachers and their students as well as theoretical developments. In relation to a participatory view of learning, the paper reviews existing pedagogical models that take little account of the use of information and communications technologies as well as those that…

  2. Quality assurance of fuel elements

    International Nuclear Information System (INIS)

    Hoerber, J.

    1980-01-01

    The quality assurance activities for reactor fuel elements are based on a quality assurance system which embodies the requirements resulting from the specifications, regulations of the authorities, national standards and international rules and regulations. Quality assurance related to the production of reactor fuel is shown for PWR fuel elements in all typical fabrication steps, such as conversion into UO2 powder, pelletizing, rod manufacture and assembling. A wide range of destructive and nondestructive techniques is applied. Quality assurance is verified not only by testing techniques but also by process monitoring by means of parameter control in production and testing procedures. (RW)

  3. Information-based models for finance and insurance

    Science.gov (United States)

    Hoyle, Edward

    2010-10-01

    In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The `information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect to (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
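The simplest member of this family, the Brownian information process of BHM, can be sketched as follows: the signal term sigma·t·X_T is obscured by an independent Brownian bridge that vanishes at times 0 and T, so the market's view of X_T sharpens as T approaches. All parameter values here are arbitrary illustrations:

```python
import math
import random

def simulate_information_process(x_T, sigma, T, n_steps, rng):
    """One path of xi_t = sigma * t * x_T + beta_t, where beta is a
    Brownian bridge pinned to zero at t = 0 and t = T (sketch)."""
    dt = T / n_steps
    # Simulate a Brownian motion, then pin it into a bridge.
    w = [0.0]
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    ts = [i * dt for i in range(n_steps + 1)]
    bridge = [w[i] - (t / T) * w[-1] for i, t in enumerate(ts)]
    return [sigma * t * x_T + b for t, b in zip(ts, bridge)]

rng = random.Random(42)
path = simulate_information_process(x_T=1.0, sigma=0.8, T=1.0,
                                    n_steps=100, rng=rng)
```

Because the bridge noise dies away at T, the terminal value of the path reveals sigma·T·X_T exactly; pricing then amounts to taking discounted conditional expectations of X_T given the path so far.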

  4. Auditory information coding by modeled cochlear nucleus neurons.

    Science.gov (United States)

    Wang, Huan; Isik, Michael; Borst, Alexander; Hemmert, Werner

    2011-06-01

    In this paper we use information theory to quantify the information in the output spike trains of modeled cochlear nucleus globular bushy cells (GBCs). GBCs are part of the sound localization pathway. They are known for their precise temporal processing, and they code amplitude modulations with high fidelity. Here we investigated the information transmission for a natural sound, a recorded vowel. We conclude that the maximum information transmission rate for a single neuron was close to 1,050 bits/s, which corresponds to a value of approximately 5.8 bits per spike. For quasi-periodic signals like voiced speech, the transmitted information saturated as word duration increased. In general, approximately 80% of the available information from the spike trains was transmitted within about 20 ms. Transmitted information for speech signals concentrated around formant frequency regions. The efficiency of neural coding was above 60% up to the highest temporal resolution we investigated (20 μs). The increase in transmitted information to that precision indicates that these neurons are able to code information with extremely high fidelity, which is required for sound localization. On the other hand, only 20% of the information was captured when the temporal resolution was reduced to 4 ms. As the temporal resolution of most speech recognition systems is limited to less than 10 ms, this massive information loss might be one of the reasons which are responsible for the lack of noise robustness of these systems.
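The kind of calculation behind such bits-per-spike figures can be illustrated, in heavily simplified form, with a plug-in mutual-information estimate over discretized stimulus and response symbols. This toy version ignores the bias corrections needed for real spike-train data:

```python
import math
from collections import Counter

def mutual_information(stimuli, responses):
    """Plug-in estimate (in bits) of the mutual information between
    two discretized symbol sequences of equal length."""
    n = len(stimuli)
    p_s = Counter(stimuli)
    p_r = Counter(responses)
    p_sr = Counter(zip(stimuli, responses))
    mi = 0.0
    for (s, r), c in p_sr.items():
        p_joint = c / n
        # log2 of p(s,r) / (p(s) * p(r))
        mi += p_joint * math.log2(p_joint * n * n / (p_s[s] * p_r[r]))
    return mi

# Perfectly informative responses transmit one bit per presentation
# of a two-symbol stimulus.
stim = [0, 1, 0, 1, 0, 1, 0, 1]
resp = [0, 1, 0, 1, 0, 1, 0, 1]
mi = mutual_information(stim, resp)   # -> 1.0 bit
```

Dividing such an information estimate by the mean spike count per window is what yields a bits-per-spike figure like the 5.8 bits per spike reported above.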

  5. A Product Development Decision Model for Cockpit Weather Information System

    Science.gov (United States)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.

  6. A Product Development Decision Model for Cockpit Weather Information Systems

    Science.gov (United States)

    Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin

    2003-01-01

    There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
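The Kano component of such a decision model classifies each candidate feature from paired answers to a "functional" (feature present) and "dysfunctional" (feature absent) survey question. A sketch using the commonly published Kano evaluation table; the example answers are hypothetical:

```python
# Rows index the functional answer, columns the dysfunctional answer.
# A=attractive, O=one-dimensional, M=must-be, I=indifferent,
# R=reverse, Q=questionable.
ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]
TABLE = [
    ["Q", "A", "A", "A", "O"],   # functional answer: like
    ["R", "I", "I", "I", "M"],   # functional answer: must-be
    ["R", "I", "I", "I", "M"],   # functional answer: neutral
    ["R", "I", "I", "I", "M"],   # functional answer: live-with
    ["R", "R", "R", "R", "Q"],   # functional answer: dislike
]

def kano_category(functional, dysfunctional):
    """Map one respondent's answer pair to a Kano category."""
    return TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]

# A feature liked when present and disliked when absent is
# one-dimensional: satisfaction scales with its performance.
cat = kano_category("like", "dislike")   # -> "O"
```

Aggregating these categories across respondents tells the product team which weather-information features are must-haves versus differentiators, which then feeds the QFD prioritization.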

  7. Implications of Information Theory for Computational Modeling of Schizophrenia.

    Science.gov (United States)

    Silverstein, Steven M; Wibral, Michael; Phillips, William A

    2017-10-01

    Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
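The first foundational quantity mentioned, Shannon entropy, is easy to make concrete. A short sketch computing the entropy of observed symbol sequences (the sequences themselves are arbitrary illustrations, not clinical data):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits of an observed symbol sequence,
    using empirical symbol frequencies."""
    n = len(symbols)
    counts = Counter(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four equiprobable symbols carry 2 bits each; a highly predictable
# stream carries far less information per symbol.
h_uniform = shannon_entropy("abcdabcdabcd")   # -> 2.0 bits
h_skewed = shannon_entropy("aaaaaaaaaaab")
```

Metrics of this kind are what allow disruptions of neural goal functions, such as reduced compression or degraded signal-to-noise ratio, to be quantified rather than only described.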

  8. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  9. Towards GLUE2 evolution of the computing element information model

    CERN Document Server

    Andreozzi, S; Field, L; Kónya, B

    2008-01-01

    A key advantage of Grid systems is the ability to share heterogeneous resources and services between traditional administrative and organizational domains. This ability enables virtual pools of resources to be created and assigned to groups of users. Resource awareness, the capability of users or user agents to have knowledge about the existence and state of resources, is required in order to utilize the resource. This awareness requires a description of the services and resources, typically defined via a community-agreed information model. One of the most popular information models, used by a number of Grid infrastructures, is the GLUE Schema, which provides a common language for describing Grid resources. Other approaches exist; however, they follow different modeling strategies. The presence of different flavors of information models for Grid resources is a barrier to enabling inter-Grid interoperability. In order to solve this problem, the GLUE Working Group was started in the context of the Open Grid Forum. ...

  10. Organization model and formalized description of nuclear enterprise information system

    International Nuclear Information System (INIS)

    Yuan Feng; Song Yafeng; Li Xudong

    2012-01-01

    The organization model is one of the most important models of a Nuclear Enterprise Information System (NEIS). A scientific and reasonable organization model is a prerequisite for the robustness and extensibility of an NEIS, and is also the foundation for the integration of heterogeneous systems. First, the paper describes the conceptual model of the NEIS in an ontology chart, which provides a consistent semantic framework for the organization. It then discusses the relations between the concepts in detail. Finally, it gives a formalized description of the organization model of the NEIS based on a six-tuple array. (authors)

  11. The case of sustainability assurance: constructing a new assurance service

    NARCIS (Netherlands)

    O'Dwyer, B.

    2011-01-01

    This paper presents an in-depth longitudinal case study examining the processes through which practitioners in two Big 4 professional services firms have attempted to construct sustainability assurance (independent assurance on sustainability reports). Power’s (1996, 1997, 1999, 2003) theorization

  12. A Novel Fuzzy Document Based Information Retrieval Model for Forecasting

    Directory of Open Access Journals (Sweden)

    Partha Roy

    2017-06-01

Full Text Available Information retrieval systems are generally used to find documents that are most appropriate according to some query that comes dynamically from users. In this paper a novel Fuzzy Document based Information Retrieval Model (FDIRM) is proposed for the purpose of Stock Market Index forecasting. The novelty of the proposed approach is a modified tf-idf scoring scheme to predict the future trend of the stock market index. The contribution of this paper has two dimensions: (1) in the proposed system the simple time series is converted to an enriched fuzzy linguistic time series with a unique approach of incorporating market sentiment related information along with the price, and (2) a unique approach is followed while modeling the information retrieval (IR) system which converts a simple IR system into a forecasting system. From the performance comparison of FDIRM with standard benchmark models it can be affirmed that the proposed model has the potential of becoming a good forecasting model. The stock market data provided by Standard & Poor's CRISIL NSE Index 50 (CNX NIFTY-50) of the National Stock Exchange of India (NSE) is used to experiment with and validate the proposed model. The authentic data for validation and experimentation is obtained from http://www.nseindia.com, which is the official website of NSE. A Java program is under construction to implement the model in real time with a graphical user interface.
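The abstract above builds on tf-idf scoring. A minimal sketch of the standard tf-idf weighting that FDIRM modifies (the documents below are illustrative, not the paper's fuzzy linguistic time series):

```python
import math
from collections import Counter

def tf_idf_scores(docs):
    """Compute tf-idf weights for each term in each document.

    tf(t, d) = count of t in d / total terms in d
    idf(t)   = log(N / df(t)), where df(t) = number of docs containing t
    """
    n_docs = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for tokens in tokenized:
        df.update(set(tokens))  # each doc counts a term at most once
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        total = len(tokens)
        scores.append({t: (c / total) * math.log(n_docs / df[t])
                       for t, c in tf.items()})
    return scores

docs = ["market rises on strong earnings",
        "market falls on weak earnings",
        "index flat amid mixed sentiment"]
weights = tf_idf_scores(docs)
# "market" appears in 2 of 3 docs, so its idf (and weight) is lower
# than that of a term unique to one document, such as "rises".
```

A modified scheme like the paper's would replace these raw term counts with fuzzy linguistic terms derived from price and sentiment data.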

  13. A Model-Driven Development Method for Management Information Systems

    Science.gov (United States)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems, applying the model-driven development method to a component of the model theory approach. The experiment has shown that development effort is reduced by more than 30%.

  14. Creating Quality Assurance and International Transparency for Quality Assurance Agencies

    DEFF Research Database (Denmark)

    Kristoffersen, Dorte; Lindeberg, Tobias

    2004-01-01

The paper presents the experiences gained in the pilot project on mutual recognition conducted by the quality assurance agencies in the Nordic countries and the future perspective for international quality assurance of national quality assurance agencies. The background of the project was the need, on the one hand, to advance internationalisation of quality assurance of higher education and, on the other hand, to allow for the differences in the national approaches to quality assurance. The paper will focus on two issues: first, the strengths and weaknesses of the method employed and of the use of the ENQA-membership provision as a basis for the evaluative procedure; and second, the pros and cons of using mutual recognition as an international evaluative procedure compared with other approaches.

  15. A non-linear model of information seeking behaviour

    Directory of Open Access Journals (Sweden)

    Allen E. Foster

    2005-01-01

Full Text Available The results of a qualitative, naturalistic study of information seeking behaviour are reported in this paper. The study applied the methods recommended by Lincoln and Guba for maximising credibility, transferability, dependability, and confirmability in data collection and analysis. Sampling combined purposive and snowball methods, and led to a final sample of 45 inter-disciplinary researchers from the University of Sheffield. In-depth semi-structured interviews were used to elicit detailed examples of information seeking. Coding of interview transcripts took place in multiple iterations over time and used Atlas-ti software to support the process. The results of the study are represented in a non-linear Model of Information Seeking Behaviour. The model describes three core processes (Opening, Orientation, and Consolidation) and three levels of contextual interaction (Internal Context, External Context, and Cognitive Approach), each composed of several individual activities and attributes. The interactivity and shifts described by the model show information seeking to be non-linear, dynamic, holistic, and flowing. The paper concludes by describing the whole model of behaviours as analogous to an artist's palette, in which activities remain available throughout information seeking. A summary of key implications of the model and directions for further research are included.

  16. Akaike information criterion to select well-fit resist models

    Science.gov (United States)

    Burbine, Andrew; Fryer, David; Sturtevant, John

    2015-03-01

    In the field of model design and selection, there is always a risk that a model is over-fit to the data used to train the model. A model is well suited when it describes the physical system and not the stochastic behavior of the particular data collected. K-fold cross validation is a method to check this potential over-fitting to the data by calibrating with k-number of folds in the data, typically between 4 and 10. Model training is a computationally expensive operation, however, and given a wide choice of candidate models, calibrating each one repeatedly becomes prohibitively time consuming. Akaike information criterion (AIC) is an information-theoretic approach to model selection based on the maximized log-likelihood for a given model that only needs a single calibration per model. It is used in this study to demonstrate model ranking and selection among compact resist modelforms that have various numbers and types of terms to describe photoresist behavior. It is shown that there is a good correspondence of AIC to K-fold cross validation in selecting the best modelform, and it is further shown that over-fitting is, in most cases, not indicated. In modelforms with more than 40 fitting parameters, the size of the calibration data set benefits from additional parameters, statistically validating the model complexity.
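The ranking criterion described above is the standard AIC, AIC = 2k − 2 ln L̂, which needs only one calibration per candidate model. A minimal sketch of ranking candidate model forms by AIC (model names, parameter counts, and log-likelihood values are hypothetical, not from the study):

```python
def aic(k, log_likelihood):
    """Akaike information criterion: 2k - 2*ln(L_hat); lower is better."""
    return 2 * k - 2 * log_likelihood

# Hypothetical candidate resist modelforms:
# (name, number of fitted parameters, maximized log-likelihood).
candidates = [
    ("modelform_A", 12, -350.0),
    ("modelform_B", 25, -331.5),
    ("modelform_C", 41, -330.9),  # extra terms barely improve likelihood
]

ranked = sorted(candidates, key=lambda m: aic(m[1], m[2]))
best = ranked[0][0]
# modelform_B wins: its AIC (713.0) is lowest; modelform_C's 16 extra
# parameters add a penalty of 32 for a likelihood gain of only ~1.2.
```

This is the over-fitting guard the abstract describes: added parameters must buy enough likelihood to offset the 2k penalty.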

  17. A Compositional Relevance Model for Adaptive Information Retrieval

    Science.gov (United States)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.

  18. Semantic Information Modeling for Emerging Applications in Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qunzhi; Natarajan, Sreedhar; Simmhan, Yogesh; Prasanna, Viktor

    2012-04-16

Smart Grid modernizes the power grid by integrating digital and information technologies. Millions of smart meters, intelligent appliances and communication infrastructures are under deployment, allowing advanced IT applications to be developed to secure and manage power grid operations. Demand response (DR) is one such emerging application, which optimizes electricity demand by curtailing/shifting power load when peak load occurs. Existing DR approaches are mostly based on static plans such as pricing policies and load shedding schedules. However, improvements to power management applications rely on data emanating from existing and new information sources as the Smart Grid information space grows. In particular, dynamic DR algorithms depend on information from smart meters that report interval-based power consumption measurements, HVAC systems that monitor buildings' heat and humidity, and even weather forecast services. In order for emerging Smart Grid applications to take advantage of the diverse data influx, extensible information integration is required. In this paper, we develop an integrated Smart Grid information model using Semantic Web techniques and present case studies of using semantic information for dynamic DR. We show the semantic model facilitates information integration and knowledge representation for developing the next generation of Smart Grid applications.

  19. Technical Reference Suite Addressing Challenges of Providing Assurance for Fault Management Architectural Design

    Science.gov (United States)

    Fitz, Rhonda; Whitman, Gerek

    2016-01-01

    Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the

  20. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

Full Text Available In this paper the findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage which still affects the building. The laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created and is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  1. FESA Quality Assurance

    CERN Multimedia

    CERN. Geneva

    2015-01-01

FESA is a framework used by 100+ developers at CERN to design and implement the real-time software used to control the accelerators. Each new version must be tested and qualified to ensure that no backward compatibility issues have been introduced and that there is no major bug which might prevent accelerator operations. Our quality assurance approach is based on code review and a two-level testing process. The first level consists of unit tests (Python unittest and Google Test for C++). The second level consists of integration tests running in an isolated test environment. We also use a continuous integration service (Bamboo) to ensure the tests are executed periodically and bugs are caught early. In the presentation, we will explain the reasons why we took this approach, the results, and some thoughts on the pros and cons.
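The first testing level mentioned above looks like this in miniature with Python unittest (the function under test and the test names are illustrative, not taken from FESA):

```python
import unittest

def scale_setpoint(value, factor):
    """Hypothetical real-time control helper under test."""
    if factor <= 0:
        raise ValueError("factor must be positive")
    return value * factor

class ScaleSetpointTest(unittest.TestCase):
    def test_scales_value(self):
        self.assertEqual(scale_setpoint(10, 2), 20)

    def test_rejects_non_positive_factor(self):
        with self.assertRaises(ValueError):
            scale_setpoint(10, 0)

# Run the suite programmatically, as a CI service such as Bamboo would.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ScaleSetpointTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The second level (integration tests in an isolated environment) would exercise the same code against real framework services rather than in-process stubs.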

  2. Construction quality assurance report

    International Nuclear Information System (INIS)

    Roscha, V.

    1994-01-01

This report provides a summary of the construction quality assurance (CQA) observation and test results, including: the results of the geosynthetic and soil materials conformance testing; the observation and testing results associated with the installation of the soil liners; the observation and testing results associated with the installation of the HDPE geomembrane liner systems; the observation and testing results associated with the installation of the leachate collection and removal systems; the observation and testing results associated with the installation of the working surfaces; the observation and testing results associated with the in-plant manufacturing process; a summary of submittal reviews by Golder Construction Services, Inc.; the submittal and certification of the piping material specifications; the observation and verification associated with the Acceptance Test Procedure results of the operational equipment functions; and a summary of the ECNs which are incorporated into the project

  3. Evaluating procedural modelling for 3D models of informal settlements in urban design activities

    Directory of Open Access Journals (Sweden)

    Victoria Rautenbach

    2015-11-01

Full Text Available Three-dimensional (3D) modelling and visualisation is one of the fastest growing application fields in geographic information science. 3D city models are being researched extensively for a variety of purposes and in various domains, including urban design, disaster management, education and computer gaming. These models typically depict urban business districts (downtown) or suburban residential areas. Despite informal settlements being a prevailing feature of many cities in developing countries, 3D models of informal settlements are virtually non-existent. 3D models of informal settlements could be useful in various ways, e.g. to gather information about the current environment in the informal settlements, to design upgrades, to communicate these and to educate inhabitants about environmental challenges. In this article, we described the development of a 3D model of the Slovo Park informal settlement in the City of Johannesburg Metropolitan Municipality, South Africa. Instead of using time-consuming traditional manual methods, we followed the procedural modelling technique. Visualisation characteristics of 3D models of informal settlements were described and the importance of each characteristic in urban design activities for informal settlement upgrades was assessed. Next, the visualisation characteristics of the Slovo Park model were evaluated. The results of the evaluation showed that the 3D model produced by the procedural modelling technique is suitable for urban design activities in informal settlements. The visualisation characteristics and their assessment are also useful as guidelines for developing 3D models of informal settlements. In future, we plan to empirically test the use of such 3D models in urban design projects in informal settlements.

  4. 3D Modeling from Photos Given Topological Information.

    Science.gov (United States)

    Kim, Young Min; Cho, Junghyun; Ahn, Sang Chul

    2016-09-01

Reconstructing 3D models given single-view 2D information is inherently an ill-posed problem and requires additional information such as a shape prior or user input. We introduce a method to generate multiple 3D models of a particular category given corresponding photographs when the topological information is known. While there is a wide range of shapes for an object of a particular category, the basic topology usually remains constant. In consequence, the topological prior needs to be provided only once for each category and can be easily acquired by consulting an existing database of 3D models or by user input. The input of topological description is only connectivity information between parts; this is in contrast to previous approaches that have required users to interactively mark individual parts. Given the silhouette of an object and the topology, our system automatically finds a skeleton and generates a textured 3D model by jointly fitting multiple parts. The proposed method, therefore, opens the possibility of generating a large number of 3D models by consulting a massive number of photographs. We demonstrate examples of the topological prior and reconstructed 3D models using photos.

  5. An information propagation model considering incomplete reading behavior in microblog

    Science.gov (United States)

    Su, Qiang; Huang, Jiajia; Zhao, Xiande

    2015-02-01

Microblog is one of the most popular communication channels on the Internet, and has already become the third largest source of news and public opinions in China. Although researchers have studied information propagation in microblogs using epidemic models, previous studies have not considered the incomplete reading behavior among microblog users; therefore, such models cannot fit real situations well. In this paper, we proposed an improved model entitled Microblog-Susceptible-Infected-Removed (Mb-SIR) for information propagation that explicitly considers the user's incomplete reading behavior. We also tested the effectiveness of the model using real data from Sina Microblog. We demonstrate that the newly proposed model is more accurate in describing information propagation in microblogs. In addition, we also investigate the effects of the critical model parameters, e.g., reading rate, spreading rate, and removed rate, through numerical simulations. The simulation results show that, compared with the other parameters, the reading rate plays the most influential role in information propagation performance in microblogs.
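A minimal numerical sketch of an SIR-style spreading model in which the effective spreading rate is gated by a reading rate, in the spirit of Mb-SIR (the equations and parameter values are illustrative, not the paper's exact formulation):

```python
def simulate_mb_sir(beta, r, gamma, s0=0.99, i0=0.01, steps=2000, dt=0.01):
    """Euler integration of an SIR-style model where only the fraction r
    of posts that are actually read can spread the information.

    beta: spreading rate, r: reading rate, gamma: removed rate.
    Returns final (susceptible, infected, removed) fractions.
    """
    s, i, rm = s0, i0, 0.0
    for _ in range(steps):
        new_inf = beta * r * s * i   # reading rate gates new infections
        rec = gamma * i
        s -= new_inf * dt
        i += (new_inf - rec) * dt
        rm += rec * dt
    return s, i, rm

# A lower reading rate leaves a larger final susceptible fraction,
# i.e. the information reaches fewer users overall.
s_low, _, _ = simulate_mb_sir(beta=0.8, r=0.3, gamma=0.2)
s_high, _, _ = simulate_mb_sir(beta=0.8, r=0.9, gamma=0.2)
```

This also illustrates the paper's headline finding qualitatively: varying r changes the effective reproduction number beta*r/gamma directly, so the reading rate dominates the outcome.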

  6. Models, Metaphors and Symbols for Information and Knowledge Systems

    Directory of Open Access Journals (Sweden)

    David Williams

    2014-01-01

Full Text Available A literature search indicates that Data, Information and Knowledge continue to be placed into a hierarchical construct where it is considered that information is more valuable than data and that information can be processed into becoming precious knowledge. Wisdom continues to be added to the model to further confuse the issue. This model constrains our ability to think more logically about how and why we develop knowledge management systems to support and enhance knowledge-intensive processes, tasks or projects. This paper seeks to summarise development of the Data-Information-Knowledge-Wisdom hierarchy, explore the extensive criticism of it and present a more logical (and accurate) construct for the elements of intellectual capital when developing and managing Knowledge Management Systems.

  7. GRAMMAR RULE BASED INFORMATION RETRIEVAL MODEL FOR BIG DATA

    Directory of Open Access Journals (Sweden)

    T. Nadana Ravishankar

    2015-07-01

Full Text Available Though Information Retrieval (IR) in big data has been an active field of research for the past few years, the popularity of the native languages presents a unique challenge in big data information retrieval systems. There is a need to retrieve information which is present in English and display it in the native language for users. This aim of cross-language information retrieval is complicated by unique features of the native languages such as: morphology, compound word formations, word spelling variations, ambiguity, word synonyms, the influence of other languages, etc. To overcome some of these issues, the native language is modeled using a grammar rule based approach in this work. The advantage of this approach is that the native language is modeled and its unique features are encoded using a set of inference rules. This rule base, coupled with the customized ontological system, shows considerable potential and is found to show better precision and recall.

  8. Information Reference Models for European Pork Supply Networks - Identifying Gaps in Information Infrastructures

    DEFF Research Database (Denmark)

    Lehmann, Richard J.; Hermansen, John Erik; Fritz, Melanie

    2011-01-01

Several global developments, such as diminishing production resources, limits in the availability of water and the growing demand for bio-energy, as well as sector-wide crises (e.g. BSE, swine fever, dioxin), have led to a changing attitude of society towards the consequences of the food system's activities for social, economic and environmental issues, captured in the term of sustainability. As a consequence, consumers show increasing interest in the characteristics of food and, in turn, in the availability of related information and guarantees. The paper introduces different information reference models for European pork supply networks, which give an aggregated overview about information availability and exchange in the pork sector, identify additional information demands of decision makers at different stages of pork production, and identify gaps in the existing information infrastructure.

  9. Project Specific Quality Assurance Plan

    International Nuclear Information System (INIS)

    Pedersen, K.S.

    1995-01-01

    This Quality Assurance Project Plan (QAPP) identifies the Westinghouse Hanford Co. (WHC) Quality Assurance (QA) program requirements for all contractors involved in the planning and execution of the design, construction, testing and inspection of the 200 Area Effluent BAT/AKART Implementation, Project W-291

  10. Quality assurance of operating instructions

    International Nuclear Information System (INIS)

    Asmuss, G.

    1992-01-01

    It is pointed out that the quality assurance at nuclear power stations must be supported by national and international regulations. Quality assurance is explained using the example of the design of a pressurised water reactor. The operating and emergency manuals are discussed and examples for their structure put forward. The significance of updating is emphasised. 15 figs., 19 refs

  11. Recent Trends in Quality Assurance

    Science.gov (United States)

    Amaral, Alberto; Rosa, Maria Joao

    2010-01-01

    In this paper we present a brief description of the evolution of quality assurance in Europe, paying particular attention to its relationship to the rising loss of trust in higher education institutions. We finalise by analysing the role of the European Commission in the setting up of new quality assurance mechanisms that tend to promote…

  12. Quality assurance of qualitative analysis

    DEFF Research Database (Denmark)

    Ríos, Ángel; Barceló, Damiá; Buydens, Lutgarde

    2003-01-01

and quality assurance. One important part of this document deals, therefore, with aspects involved in analytical quality assurance of qualitative analysis. This article shows the main conclusions reported in the document referring to the implementation of quality principles in qualitative analysis: traceability, reliability (uncertainty), validation, and internal/external quality control for qualitative methods.

  13. QANU - Quality Assurance Netherlands Universities

    DEFF Research Database (Denmark)

    Jensen, Henrik Toft; Maria E., Weber; Vyt, André

The Quality Assurance Netherlands Universities (QANU) underwent an ENQA-coordinated external review in 2016. The review was chaired by Henrik Toft Jensen, Research fellow at Roskilde University (RUC), Denmark.

  14. R&D software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Hood, F.C.

    1991-10-01

Research software quality assurance (QA) requirements must be adequate to strengthen development or modification objectives, but flexible enough not to restrict creativity. Application guidelines are needed for the different kinds of research and development (R&D) software activities to assure project objectives are achieved.

  15. MATHEMATICAL MODEL FOR CALCULATION OF INFORMATION RISKS FOR INFORMATION AND LOGISTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    A. G. Korobeynikov

    2015-05-01

Full Text Available Subject of research. The paper deals with a mathematical model for the calculation of information risks arising during the transport and distribution of material resources under conditions of uncertainty. Here, information risks mean the danger of losses or damage arising from the company's use of information technologies. Method. The solution is based on the transportation problem in a stochastic statement, drawing on methods of mathematical modeling theory, graph theory, probability theory, and Markov chains. The mathematical model is created in several stages. At the initial stage, the capacity of different sites as a function of time is calculated on the basis of information received from the information and logistics system, the weight matrix is formed, and the digraph is constructed. Then the minimum route covering all specified vertices is found by means of Dijkstra's algorithm. At the second stage, systems of Kolmogorov differential equations are formed using information about the calculated route. The resulting solutions give the probabilities that resources are located at a concrete vertex as a function of time. At the third stage, the general probability of passing the whole route as a function of time is calculated on the basis of the multiplication theorem of probabilities. Information risk, as a function of time, is defined as the product of the greatest possible damage and the general probability of passing the whole route. In this case information risk is measured in units of damage, corresponding to the monetary unit in which the information and logistics system operates. Main results. The operability of the presented mathematical model is shown on a concrete example of the transportation of material resources where the places of shipment and delivery, the routes and their capacity, the greatest possible damage, and the admissible risk are specified.
The calculations presented on a diagram showed
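The route-finding and final risk steps described above can be sketched as follows. The graph, per-leg probabilities, and damage figure are hypothetical, and the static per-leg probabilities stand in for the time-dependent solutions of the Kolmogorov equations; the risk formula follows the abstract's definition (greatest possible damage times the probability of passing the whole route):

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest route by edge weight; graph: {u: {v: weight}}."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:          # walk predecessors back to the source
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Hypothetical transport digraph: edge weights are travel costs.
graph = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4},
         "C": {"D": 1}, "D": {}}
route = dijkstra(graph, "A", "D")

# Hypothetical probability of successfully passing each leg.
leg_ok = {("A", "B"): 0.99, ("B", "C"): 0.95, ("C", "D"): 0.98}
p_route = 1.0
for leg in zip(route, route[1:]):
    p_route *= leg_ok[leg]     # multiplication theorem of probabilities

max_damage = 100_000           # greatest possible damage, monetary units
risk = max_damage * p_route    # risk in units of damage, per the abstract
```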

  16. Semantic reasoning with XML-based biomedical information models.

    Science.gov (United States)

    O'Connor, Martin J; Das, Amar

    2010-01-01

The Extensible Markup Language (XML) is increasingly being used for biomedical data exchange. The parallel growth in the use of ontologies in biomedicine presents opportunities for combining the two technologies to leverage the semantic reasoning services provided by ontology-based tools. There are currently no standardized approaches for taking XML-encoded biomedical information models and representing and reasoning with them using ontologies. To address this shortcoming, we have developed a workflow and a suite of tools for transforming XML-based information models into domain ontologies encoded using OWL. In this study, we applied semantic reasoning methods to these ontologies to automatically generate domain-level inferences. We successfully used these methods to develop semantic reasoning methods for information models in the HIV and radiological image domains.

  17. A generalized model via random walks for information filtering

    Science.gov (United States)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

There could exist a simple general mechanism lurking beneath the collaborative filtering and interdisciplinary physics approaches which have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of the random walk in bipartite networks. Taking into account degree information, the proposed generalized model can deduce collaborative filtering, the interdisciplinary physics approaches, and even substantial extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy that uses hybrid degree information for objects of different popularity to achieve promising recommendation precision.
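One of the physics approaches the abstract says the generalized model reduces to is two-step mass diffusion (ProbS) on the user-object bipartite network. A minimal sketch, with an illustrative interaction matrix (not data from the paper):

```python
def probs_scores(A, user):
    """Two-step mass-diffusion (ProbS) random walk on a user-object
    bipartite network; A is a binary users x objects matrix (lists)."""
    n_users, n_objs = len(A), len(A[0])
    k_obj = [sum(A[u][o] for u in range(n_users)) for o in range(n_objs)]
    k_usr = [sum(row) for row in A]
    # Step 1: each object the user collected spreads one unit of
    # resource equally among the users who collected it.
    on_users = [sum(A[u][o] * A[user][o] / k_obj[o] for o in range(n_objs))
                for u in range(n_users)]
    # Step 2: each user spreads the received resource equally back
    # to the objects they collected.
    scores = [sum(A[u][o] * on_users[u] / k_usr[u] for u in range(n_users))
              for o in range(n_objs)]
    # Do not re-recommend objects the target user already collected.
    return [0.0 if A[user][o] else s for o, s in enumerate(scores)]

A = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]
rec = probs_scores(A, user=0)   # only object 2 is a candidate here
```

The generalized model replaces the uniform splitting in both steps with degree-dependent weights; the pure heat-conduction variant and the hybrids follow from different choices of those weights.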

  18. Quality assurance feedback as a nursing management strategy.

    Science.gov (United States)

    Brannon, D; Bucher, J A

    1989-01-01

    Quality assurance and effective nurse management can be viewed as intersecting goals. Objective feedback derived from quality assurance data is a potentially powerful means of enhancing nurses' performance and job satisfaction. The use of automated information systems to provide such direct feedback offers the additional advantage of recognizing nurses as self-monitoring, self-correcting professionals. The need, opportunity, and challenge involved in meshing quality assurance with human resource management through computer-generated feedback are discussed in the context of the home health care setting.

  19. Information Governance: A Model for Security in Medical Practice

    Directory of Open Access Journals (Sweden)

    Patricia A.H. Williams

    2007-03-01

    Information governance is becoming an important aspect of organisational accountability. Given that information is an integral asset of most organisations, protection of this asset will increasingly rely on organisational capabilities in security. In the medical arena this information is primarily sensitive patient-based information. Previous research has shown that applying security measures is a low priority for primary care medical practice and that awareness of the risks is seriously underestimated. Consequently, information security governance will be a key issue for medical practice in the future. Information security governance is a relatively new term and there is little existing research into how to meet governance requirements. The limited research that exists describes information security governance frameworks at a strategic level. However, since medical practice already lags in the implementation of appropriate security, such a strategic definition may not be practical, although it is obviously desirable. This paper describes an ongoing action research project in the area of medical information security, and presents a tactical-approach model aimed at addressing information security governance and the protection of medical data.

  20. Information Models of Acupuncture Analgesia and Meridian Channels

    Directory of Open Access Journals (Sweden)

    Chang Hua Zou

    2010-12-01

    Acupuncture and meridian channels have been major components of Chinese and East Asian medicine, especially for analgesia, for over 2000 years. In recent decades, electroacupuncture (EA) analgesia has been applied clinically and experimentally. However, results have been contradictory between different treatment frequencies, and between active and placebo treatments; the mechanisms of the treatments and of the related meridian channels are still unknown. In this study, we propose the new term infophysics therapy and develop information models of acupuncture (or EA) analgesia and meridian channels, to understand the mechanisms and to explain the contradictory results, based on Western theories of information, trigonometry and Fourier series, and physics, as well as published biomedical data. We attempt to build a bridge between Chinese and Western medicine by investigating Eastern acupuncture analgesia and meridian channels with Western sciences. We model the meridians as a physiological system constructed mostly of interstices in or between other physiological systems, and treat the frequencies, amplitudes, and wave numbers of electric field intensity (EFI) as information data. Our modeling results demonstrate that information regulated with acupuncture (or EA) differs from pain information; we offer answers to the contradictory published results and suggest that the mechanisms of acupuncture (or EA) analgesia may chiefly involve information regulation of the frequencies and amplitudes of EFI, as well as neuronal transmitters such as endorphins.

  1. Quality assurance, information tracking, and consumer labeling

    International Nuclear Information System (INIS)

    Caswell, Julie A. E-mail: caswell@resecon.umass.edu

    2006-01-01

    Reducing marine-based public health risk requires strict control of several attributes of seafood products, often including the location and conditions of catch or aquaculture, processing, and handling throughout the supply chain. Buyers are also likely to be interested in other attributes of these products, such as eco-friendliness or taste. Development of markets for improved safety, as well as for other quality attributes, requires effective certification and tracking of these attributes as well as their communication to buyers. Several challenges must be met if labeling, particularly consumer labeling, is to support the development of markets for improved seafood safety.

  2. Information Assurance Due to IFRS Adoption

    OpenAIRE

    Ungureanu Mihaela

    2012-01-01

    Currently, cross-border business operations are commonplace and capital markets know no territorial limits. Standardization and harmonization have imposed a uniformity of accounting terms in national and international regulations. In this context, direct reference is made to accounting rules that comply with the European Directives and the International Financial Reporting Standards (IAS/IFRS), which rest on two fundamental concepts: performance and financial position, an...

  3. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurance (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than the direct costs and benefits to the building owner and occupants. The level of detail given to the various issues reflects the current state of knowledge in those scientific areas and the authors' ability to determine that state of knowledge, rather than directly reflecting the importance of the issues; the report intentionally does not focus specifically on energy issues. It describes work in progress, is intended as a resource, and can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  4. Automating linear accelerator quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Eckhause, Tobias; Thorwarth, Ryan; Moran, Jean M., E-mail: jmmoran@med.umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109-5010 (United States); Al-Hallaq, Hania; Farrey, Karl [Department of Radiation Oncology and Cellular Oncology, The University of Chicago, Chicago, Illinois 60637 (United States); Ritter, Timothy [Ann Arbor VA Medical Center, Ann Arbor, Michigan 48109 (United States); DeMarco, John [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California, 90048 (United States); Pawlicki, Todd; Kim, Gwe-Ya [UCSD Medical Center, La Jolla, California 92093 (United States); Popple, Richard [Department of Radiation Oncology, University of Alabama Birmingham, Birmingham, Alabama 35249 (United States); Sharma, Vijeshwar; Park, SungYong [Karmanos Cancer Institute, McLaren-Flint, Flint, Michigan 48532 (United States); Perez, Mario; Booth, Jeremy T. [Royal North Shore Hospital, Sydney, NSW 2065 (Australia)

    2015-10-15

    Purpose: The purpose of this study was twofold: first, to develop an automated, streamlined quality assurance (QA) program for use by multiple centers; second, to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. Methods: The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool that can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations that exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. Results: For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm.
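
    The log-file analysis step, checking observed component positions against tolerance thresholds, can be sketched as follows. Component names, record layout, and tolerance values are illustrative only, not those used in the study.

```python
# Minimal sketch: flag log-file deviations that exceed per-component QA
# tolerances.  The component names and tolerance values below are
# illustrative assumptions, not the study's published thresholds.
TOLERANCES_MM = {"mlc_leaf": 1.0, "jaw": 1.0, "gantry_sag": 0.5}

def flag_deviations(records, tolerances=TOLERANCES_MM):
    """records: iterable of (component, expected_mm, actual_mm) tuples.
    Returns the (component, deviation_mm) pairs that exceed tolerance."""
    flagged = []
    for component, expected, actual in records:
        deviation = abs(actual - expected)
        if deviation > tolerances[component]:
            flagged.append((component, deviation))
    return flagged
```

    A production tool would additionally trend these deviations over time per Linac, which is how the consortium compared machine behavior across centers.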

  5. Towards GLUE 2: evolution of the computing element information model

    Science.gov (United States)

    Andreozzi, S.; Burke, S.; Field, L.; Kónya, B.

    2008-07-01

    A key advantage of Grid systems is the ability to share heterogeneous resources and services across traditional administrative and organizational domains. This ability enables virtual pools of resources to be created and assigned to groups of users. Resource awareness, the capability of users or user agents to have knowledge about the existence and state of resources, is required in order to utilize them. This awareness requires a description of the services and resources, typically defined via a community-agreed information model. One of the most popular information models, used by a number of Grid infrastructures, is the GLUE Schema, which provides a common language for describing Grid resources. Other approaches exist; however, they follow different modeling strategies. The presence of different flavors of information models for Grid resources is a barrier to inter-Grid interoperability. To solve this problem, the GLUE Working Group was started in the context of the Open Grid Forum. The purpose of the group is to oversee a major redesign of the GLUE Schema, one that considers the successful modeling choices and flaws that have emerged from practical experience, as well as modeling choices from other initiatives. In this paper, we present the status of the new model for describing computing resources as the first output from the working group, with the aim of dissemination and of soliciting feedback from the community.

  6. Petri net modeling of encrypted information flow in federated cloud

    Science.gov (United States)

    Khushk, Abdul Rauf; Li, Xiaozhong

    2017-08-01

    Solutions proposed for cost-effective cloud systems combine secure private clouds with less secure public clouds. The need to locate applications within different clouds poses a security risk to the information flow of the entire system. This study addresses the risk by assigning security levels from a given lattice to the entities of a federated cloud system. A dynamic, flow-sensitive security model featuring Bell-LaPadula procedures is explored that tracks and authenticates secure information flow in federated clouds. Additionally, a Petri net model is considered as a case study to represent the proposed system and further validate its performance.
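
    The Bell-LaPadula rules the abstract refers to can be sketched over a simple totally ordered lattice of security levels: a subject may read only at or below its own level (no read up) and write only at or above it (no write down). The level names below are illustrative, not taken from the paper.

```python
# Bell-LaPadula over a totally ordered lattice of security levels.
# Level names are illustrative placeholders.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

def can_read(subject_level, object_level):
    """Simple-security property: no read up."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    """Star (*) property: no write down."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

    In a federated cloud, each cloud, application, and data item would carry such a level, and every transition in the Petri net model would be guarded by these two checks.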

  7. Quality assurance when documenting chemical hazards to health and environment

    International Nuclear Information System (INIS)

    Guttormsen, R.; Modahl, S.I.; Tufto, P.A.; Buset, H.

    1991-01-01

    In a joint project between the Norwegian Petroleum Directorate (NPD), the State Pollution Control Agency (SFT) and Conoco Norway Inc. (CNI), we have evaluated the use of quality assurance principles in the development and distribution of information about chemicals. Assuring the quality of the documentation depends above all on: the work of international organizations; the content of national and international guidelines and criteria documents; the use of product registers; activities in manufacturers' organizations; and the role of importers and agents. These aspects have been evaluated. Recommendations are given in this paper concerning: definition of responsibilities in regulations, standards and guidelines; feedback of experience and coordination through international work; application of quality assurance principles in the use of information technology in international organizations and in manufacturers' organizations; and use of quality assurance principles in the validation of data.

  8. Information density converges in dialogue: Towards an information-theoretic model.

    Science.gov (United States)

    Xu, Yang; Reitter, David

    2018-01-01

    The principle of entropy rate constancy (ERC) states that language users distribute information such that words tend to be equally predictable given previous contexts. We examine the applicability of this principle to spoken dialogue, as previous findings rest primarily on written text. The study takes into account the joint-activity nature of dialogue and its topic shift mechanisms, which differ from those of monologue. It examines how the information contributions from the two dialogue partners interactively evolve as the discourse develops. The increase of local sentence-level information density (predicted by ERC) is shown to apply to dialogue overall. However, when the interlocutors' different roles in introducing new topics are identified, their contributions in information content display a new converging pattern. We draw explanations for this pattern from multiple perspectives. First, casting dialogue as an information exchange system would mean that the pattern results from the two interlocutors maintaining their own contexts rather than sharing one. Second, we present empirical evidence that a model of Interactive Alignment may include information density to explain the effect. Third, we argue that building common ground is a process analogous to information convergence. Thus, we put forward an information-theoretic view of dialogue, under which some existing theories of human dialogue may eventually be unified. Copyright © 2017 Elsevier B.V. All rights reserved.
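
    Sentence-level information density of the kind examined here is typically estimated as mean per-word surprisal under a language model. A crude stand-in, using a unigram model estimated from the dialogue itself rather than the trained n-gram models used in ERC studies, might look like:

```python
import math
from collections import Counter

def sentence_information_density(sentences):
    """Mean per-word surprisal (in bits) of each sentence under a unigram
    model estimated from all the sentences.  A toy stand-in for the n-gram
    language models used in entropy-rate studies; assumes whitespace
    tokenization and non-empty sentences."""
    words = [w for s in sentences for w in s.split()]
    counts = Counter(words)
    total = len(words)

    def surprisal(w):
        # -log2 P(w) under the unigram model
        return -math.log2(counts[w] / total)

    return [sum(surprisal(w) for w in s.split()) / len(s.split())
            for s in sentences]
```

    Plotting these per-sentence densities against sentence position is the basic test of ERC: the principle predicts a gradual increase as discourse context accumulates.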

  9. Methods of Software Quality Assurance under a Nuclear Quality Assurance Program

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon

    2005-01-01

    This paper addresses a substantial implementation of software quality assurance under a nuclear quality assurance program. The relationship between the responsibilities of a top-level nuclear quality assurance program, such as ASME/NQA-1, and its lower-level software quality assurance is described. Software quality assurance activities and procedures during the software development life cycle are also described.

  10. Automated Physico-Chemical Cell Model Development through Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Ortoleva

    2005-11-29

    The objective of this project was to develop predictive models of the chemical responses of microbial cells to variations in their surroundings. The application of these models is the optimization of environmental remediation and energy-producing biotechnical processes. The principles on which our project is based are as follows: chemical thermodynamics and kinetics; automation of calibration through information theory; integration of multiplex data (e.g., cDNA microarrays, NMR, proteomics), cell modeling, and bifurcation theory to overcome cellular complexity; and the use of multiplex data and information theory to calibrate and run an incomplete model. In this report we review four papers summarizing key findings and a web-enabled, multiple-module workflow we have implemented that consists of a set of interoperable systems-biology computational modules.

  11. An Integrative Behavioral Model of Information Security Policy Compliance

    Directory of Open Access Journals (Sweden)

    Sang Hoon Kim

    2014-01-01

    The authors identify the behavioral factors that influence organization members' compliance with information security policy, on the basis of neutralization theory, the theory of planned behavior, and protection motivation theory. According to the theory of planned behavior, members' attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to explain information system security policy violations. Based on protection motivation theory, it was inferred that the expected efficacy could have an impact on compliance intentions. By this reasoning, an integrative behavioral model and eight hypotheses were derived. Data were collected by survey; 194 of 207 questionnaires were usable. The causal model was tested with PLS. The reliability, validity, and model fit were found to be statistically significant. The hypothesis tests showed that seven of the eight hypotheses were supported. The theoretical implications of this study are as follows: (1) the study is expected to serve as a baseline for future research on organization members' compliance with information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. The study also has practical implications. First, it can provide a guideline to support the successful execution of the strategic establishment for the implementation of information system security policies in organizations. Second, it proves the need for education and training

  12. An integrative behavioral model of information security policy compliance.

    Science.gov (United States)

    Kim, Sang Hoon; Yang, Kyung Hoon; Park, Sunyoung

    2014-01-01

    The authors identify the behavioral factors that influence organization members' compliance with information security policy, on the basis of neutralization theory, the theory of planned behavior, and protection motivation theory. According to the theory of planned behavior, members' attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to explain information system security policy violations. Based on protection motivation theory, it was inferred that the expected efficacy could have an impact on compliance intentions. By this reasoning, an integrative behavioral model and eight hypotheses were derived. Data were collected by survey; 194 of 207 questionnaires were usable. The causal model was tested with PLS. The reliability, validity, and model fit were found to be statistically significant. The hypothesis tests showed that seven of the eight hypotheses were supported. The theoretical implications of this study are as follows: (1) the study is expected to serve as a baseline for future research on organization members' compliance with information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. The study also has practical implications. First, it can provide a guideline to support the successful execution of the strategic establishment for the implementation of information system security policies in organizations. Second, it proves the need for education and training programs suppressing

  13. Bayesian inference with information content model check for Langevin equations

    DEFF Research Database (Denmark)

    Krog, Jens F. C.; Lomholt, Michael Andersen

    2017-01-01

    The Bayesian data analysis framework has proven to be a systematic and effective method of parameter inference and model selection for stochastic processes. In this work we introduce an information content model check which may serve as a goodness-of-fit test, like the chi-square procedure, to complement conventional Bayesian analysis. We demonstrate this extended Bayesian framework on a system of Langevin equations, where coordinate-dependent mobilities and measurement noise hinder the normal mean-squared-displacement approach.
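
    The forward model in such an analysis, an overdamped Langevin equation with coordinate-dependent mobility and diffusivity, can be simulated with a simple Euler-Maruyama scheme; a Bayesian analysis would then infer the mobility and diffusivity parameters from observed trajectories. This sketch shows only the forward simulation, not the authors' information content model check.

```python
import numpy as np

def simulate_langevin(x0, mobility, force, D, dt, n_steps, rng):
    """Euler-Maruyama integration of an overdamped Langevin equation
        dx = mobility(x) * force(x) * dt + sqrt(2 * D(x) * dt) * dW,
    the kind of forward model whose parameters a Bayesian analysis
    would infer.  mobility, force, D are callables of position x."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        xi = x[i]
        drift = mobility(xi) * force(xi) * dt
        noise = np.sqrt(2.0 * D(xi) * dt) * rng.standard_normal()
        x[i + 1] = xi + drift + noise
    return x
```

    With coordinate-dependent `mobility` and `D`, the naive mean-squared-displacement fit breaks down, which is exactly why the abstract argues for a model check on top of the Bayesian inference.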

  14. ASSESSING INDIVIDUAL PERFORMANCE ON INFORMATION TECHNOLOGY ADOPTION: A NEW MODEL

    OpenAIRE

    Diah Hari Suryaningrum

    2012-01-01

    This paper proposes a new model for assessing individual performance in information technology adoption. The model was derived from two different theories: the decomposed theory of planned behavior and task-technology fit theory. Although many researchers have tried to extend these theories, some of their efforts may lack theoretical grounding. To overcome this problem and enhance the coherence of the integration, I used a theory from social scien...

  15. Building Information Modeling (BIM) for Indoor Environmental Performance Analysis

    DEFF Research Database (Denmark)

    The report is part of a research assignment carried out by students in the 5 ECTS course "Project Byggeri" [entitled: Building Information Modeling (BIM) – Modeling & Analysis], during the 3rd semester of the master's degree in Civil and Architectural Engineering, Department of Engineering, Aarhus University. The report includes seven papers describing BIM for sustainability, each concentrating on an individual topic regarding Indoor Environment Performance Analysis.

  16. Tracers and traceability: implementing the cirrus parameterisation from LACM in the TOMCAT/SLIMCAT chemistry transport model as an example of the application of quality assurance to legacy models

    Directory of Open Access Journals (Sweden)

    A. M. Horseman

    2010-03-01

    A new modelling tool for the investigation of the large-scale behaviour of cirrus clouds has been developed. It combines two existing models: the TOMCAT/SLIMCAT chemistry transport model (nupdate library version 0.80, script mpc346_l) and the cirrus parameterisation of Ren and MacKenzie (LACM; implementation not versioned). The development process employed a subset of best-practice software engineering and quality assurance processes, selected to be viable for small-scale projects while maintaining the same traceability objectives. The application of the software engineering and quality control processes during the development has been shown not to be a great overhead, and their use has benefited the developers as well as the end users of the results. We provide a step-by-step guide to the implementation of traceability tailored to the production of geo-scientific research software, as distinct from commercial and operational software. Our recommendations include: maintaining a living "requirements list"; explicit consideration of unit, integration, and acceptance testing; and automated revision/configuration control, including control of analysis tool scripts and programs.

    Initial testing of the resulting model against satellite and in-situ measurements has been promising. The model produces representative results for both the spatial distribution of the frequency of occurrence of cirrus ice and the drying of air as it moves across the tropical tropopause. The model is now ready for more rigorous quantitative testing, but will require the addition of a vertical wind velocity downscaling scheme to better represent extratropical continental cirrus.

  17. Extending 3D city models with legal information

    Science.gov (United States)

    Frank, A. U.; Fuhrmann, T.; Navratil, G.

    2012-10-01

    3D city models represent existing physical objects and their topological and functional relations. In everyday life, the rights and responsibilities connected to these objects, primarily legally defined rights and obligations but also other socially and culturally established rights, are of importance. The rights and obligations are defined in various laws, and it is often difficult to identify the rules applicable to a particular case. Existing 2D cadastres show civil-law rights and obligations, and plans to extend them to provide information about public-law restrictions on land use are under way in several countries. It is tempting to design extensions to 3D city models to provide information about legal rights in 3D. The paper analyses the different types of information that are needed to reduce conflicts and to facilitate decisions about land use. We identify the role that 3D city models augmented with 3D planning information can play, but do not advocate a general conversion from 2D to 3D for the legal cadastre. Space is not isotropic: the up/down dimension is in practice very different from the two-dimensional plane, and this difference must be respected when designing spatial information systems. The conclusions are: (1) continue the current regime for ownership of apartments, which is not ownership of a 3D volume but co-ownership of a building with exclusive use of some rooms; such exclusive use rights could be shown in a 3D city model; (2) ownership of 3D volumes for complex and unusual building situations can be reported in a 3D city model, but is not required everywhere; (3) indicate restrictions on land use and building in 3D city models, with links to the legal sources.

  18. Computational Methods for Physical Model Information Management: Opening the Aperture

    International Nuclear Information System (INIS)

    Moser, F.; Kirgoeze, R.; Gagne, D.; Calle, D.; Murray, J.; Crowley, J.

    2015-01-01

    The volume, velocity, and diversity of data available to analysts are growing exponentially, increasing the demands on analysts to stay abreast of developments in their areas of investigation. In parallel with the growth in data, technologies have been developed to efficiently process and store data, and to effectively extract information suitable for the development of a knowledge base capable of supporting inferential (decision logic) reasoning over semantic spaces. These technologies and methodologies, in effect, allow for automated discovery and mapping of information to specific steps in the Physical Model (Safeguards' standard reference of the nuclear fuel cycle). This paper describes and demonstrates an integrated service under development at the IAEA that utilizes machine learning techniques, computational natural language models, Bayesian methods, and semantic/ontological reasoning capabilities to process large volumes of (streaming) information and associate relevant, discovered information with the appropriate process step in the Physical Model. The paper details how this capability will consume open-source and controlled information sources, be integrated with other capabilities within the analysis environment, and provide the basis for a semantic knowledge base suitable for hosting future mission-focused applications. (author)

  19. An architecture model for multiple disease management information systems.

    Science.gov (United States)

    Chen, Lichin; Yu, Hui-Chu; Li, Hao-Chun; Wang, Yi-Van; Chen, Huang-Jen; Wang, I-Ching; Wang, Chiou-Shiang; Peng, Hui-Yu; Hsu, Yu-Ling; Chen, Chi-Huang; Chuang, Lee-Ming; Lee, Hung-Chang; Chung, Yufang; Lai, Feipei

    2013-04-01

    Disease management is a program which attempts to overcome the fragmentation of the healthcare system and improve the quality of care. Many studies have proven the effectiveness of disease management. However, case managers spend the majority of their time on documentation and on coordinating the members of the care team. They need a tool to support their daily practice and to optimize an inefficient workflow. Several discussions have indicated that information technology plays an important role in the era of disease management, and applications have been developed; yet it is inefficient to develop an information system for each disease management program individually. The aim of this research is to support the work of disease management, reform the inefficient workflow, and propose an architecture model that enhances reusability and saves time in information system development. The proposed architecture model was successfully implemented in two disease management information systems, and the result was evaluated through reusability analysis, time-consumed analysis, pre- and post-implementation workflow analysis, and a user questionnaire survey. The reusability of the proposed model was high, less than half the time was consumed, and the workflow was improved. The overall user response is positive, and the supportiveness during the daily workflow is high. The system empowers case managers with better information and leads to better decision making.

  20. On the Enterprise Modelling of an Educational Information Infrastructure

    NARCIS (Netherlands)

    Michiels, E.F.; Widya, I.A.; Volman, C.J.A.M.; Pokraev, S.; de Diana, I.P.F.

    2000-01-01

    In this report, we present the outcomes of exercising a design trajectory for the modelling of an educational information infrastructure. The infrastructure aims to support the organisation of teaching and learning activities, independently of any particular didactic policy. The design