WorldWideScience

Sample records for document model quality

  1. CNEA's quality system documentation

    International Nuclear Information System (INIS)

    Mazzini, M.M.; Garonis, O.H.

    1998-01-01

    Full text: To obtain an effective and coherent documentation system suitable for CNEA's Quality Management Program, we decided to organize CNEA's quality documentation as follows: a) Level 1: quality manual; b) Level 2: procedures; c) Level 3: quality plans; d) Level 4: instructions; e) Level 5: records and other documents. The objective of this work is to present a standardization of the documentation of CNEA's quality system covering facilities, laboratories, services, and R and D activities. Considering the diversity of criteria and formats used by different departments when elaborating documentation, and since each of them generally reflects the same quality management policy, we proposed a common system in order to improve the documentation and avoid unnecessary waste of time and cost. This will allow each sector to focus on its specific documentation. The quality manuals of the atomic centers fulfill rule 3.6.1 of the Nuclear Regulatory Authority and the Safety Series 50-C/SG-Q of the International Atomic Energy Agency. They are prepared by groups of competent and highly trained people from different departments. The normative procedures are elaborated with the same methodology as the quality manuals. The quality plans, which describe the organizational structure of the working groups and the appropriate documentation, will complement the quality manuals of the facilities, laboratories, services, and research and development activities of the atomic centers. Responsibility for approval of the normative documentation is assigned to the management in charge of administering economic and human resources, in order to fulfill the institutional objectives. A further improvement, aimed at eliminating processes that add no value, is the inclusion of all the quality system's normative documentation in the CNEA intranet. (author)

  2. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    Science.gov (United States)

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer
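
    To make the regression structure concrete, here is a minimal Python sketch of a SPARROW-like nonlinear flux model fitted with scipy; the toy data, the variable names (source_load, soil_permeability, travel_time) and the single-source exponential-decay form are illustrative assumptions, not the USGS implementation.

      # Illustrative, simplified SPARROW-style nonlinear regression (not the USGS code).
      # Assumed toy model: stream flux = beta * source_load
      #                    * exp(-alpha * soil_permeability)   (land-to-water delivery)
      #                    * exp(-k * travel_time)              (in-stream first-order decay)
      import numpy as np
      from scipy.optimize import curve_fit

      def sparrow_like_flux(X, beta, alpha, k):
          source_load, soil_permeability, travel_time = X
          return beta * source_load * np.exp(-alpha * soil_permeability) * np.exp(-k * travel_time)

      rng = np.random.default_rng(0)
      n = 200
      source_load = rng.uniform(10, 1000, n)        # e.g. fertilizer input, kg/yr
      soil_permeability = rng.uniform(0, 5, n)      # watershed attribute
      travel_time = rng.uniform(0, 10, n)           # days in the stream network
      true_flux = sparrow_like_flux((source_load, soil_permeability, travel_time), 0.3, 0.25, 0.05)
      observed = true_flux * rng.lognormal(0, 0.2, n)   # multiplicative monitoring error

      params, cov = curve_fit(sparrow_like_flux,
                              (source_load, soil_permeability, travel_time),
                              observed, p0=[0.1, 0.1, 0.01])
      print("estimated (beta, alpha, k):", params)

    Calibrating such a model against monitored loads is what yields the source and decay coefficients that the report then uses for prediction and uncertainty estimates.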

  3. E-nursing documentation as a tool for quality assurance.

    Science.gov (United States)

    Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros

    2006-01-01

    The article presents the results of a project describing the reengineering of nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and, consequently, quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The model of the new documentation enables tracing, transparency, selectivity, monitoring, and analysis. All these factors lead to improvements in the health system as well as to improved safety of patients and members of nursing teams. The documentation was developed for three health care segments: the secondary and tertiary level, dispensaries, and community health care. The new quality introduced to the documentation process by information and communication technology is presented by a database model and a software prototype for managing documentation.

  4. Theoretical and Practical Aspects of Logistic Quality Management System Documentation Development Process

    Directory of Open Access Journals (Sweden)

    Linas Šaulinskas

    2013-12-01

    This paper addresses aspects of the development of logistics quality management system documentation and suggests models for quality management system documentation development, hierarchical documentation systems, and authorization and approval. It also identifies logistics processes and a responsibilities model, and presents a detailed document development and approval process that can be applied in practice. Our results are based upon an analysis of advanced Lithuanian and foreign corporate business practices, a review of the current literature, and recommendations of quality management system standards.

  5. Quality control of the documentation process in electronic economic activities

    Directory of Open Access Journals (Sweden)

    Krutova A.S.

    2017-06-01

    It is argued that the main tool for providing adequate information resources for electronic economic activities in social and economic relations is quality control of documentation processes, as the basis of a global information space. The problems of evaluating information resources in the documentation process are addressed in two directions: development of tools for assessing the efficiency of system components (qualitative assessment) and development of mathematical modelling tools (quantitative evaluation). A qualitative assessment of the electronic documentation of economic activity covers performance, efficiency of communication, document management efficiency, effectiveness of flow control operations, and effectiveness of relationship management. The proposed concept of quality control of electronic documentation of economic activity includes the following components: the level of workflow; adequacy of information forms; consumer quality of documents; quality attributes; type of input data; condition of monitoring systems; and the organizational level of the documentation process. The components of the control system for electronic documents of economic entities are substantiated, and the components of an IT audit of the management system are identified: compliance audit, audit of internal control, detailed multilevel analysis, and a corporate risk assessment methodology. The stages and methods of processing electronic transactions of economic activity during condition monitoring of electronic economic activity are also described.

  6. Quality improvement in clinical documentation: does clinical governance work?

    Directory of Open Access Journals (Sweden)

    Dehghan M

    2013-12-01

    Mahlegha Dehghan,1 Dorsa Dehghan,2 Akbar Sheikhrabori,3 Masoume Sadeghi,4 Mehrdad Jalalian5 1Department of Medical Surgical Nursing, School of Nursing and Midwifery, Kerman University of Medical Sciences, Kerman, 2Department of Pediatric Nursing, School of Nursing and Midwifery, Islamic Azad University Kerman Branch, Kerman, 3Department of Medical Surgical Nursing, School of Nursing and Midwifery, Kerman University of Medical Sciences, Kerman, 4Research Center for Modeling in Health, Institute of Futures Studies in Health, Kerman University of Medical Sciences, Kerman, 5Electronic Physician Journal, Mashhad, Iran Introduction: The quality of nursing documentation is still a challenge in the nursing profession and, thus, in the health care industry. One major quality improvement program is clinical governance, whose mission is to continuously improve the quality of patient care and overcome service quality problems. The aim of this study was to identify whether clinical governance improves the quality of nursing documentation. Methods: A quasi-experimental method was used to show nursing documentation quality improvement after a 2-year clinical governance implementation. Two hundred twenty randomly selected nursing documents were assessed structurally and by content using a valid and reliable researcher-made checklist. Results: There were no differences in nurses' demographic data before and after the 2 years (P>0.05), and the nursing documentation score did not improve after the 2-year clinical governance program. Conclusion: Although some efforts were made to improve nursing documentation through clinical governance, these were not sufficient and more attempts are needed. Keywords: nursing documentation, clinical governance, quality improvement, nursing record

  7. SGHWR - quality assurance documentation

    International Nuclear Information System (INIS)

    Garrard, R.S.; Caulfield, J.

    1976-01-01

    The quality assurance program for a modern power station such as an SGHWR type reactor plant must include a record of quality achievement. The case history record which is evidence of the actual quality of the plant and is a data bank of design, manufacture, and results of inspections and tests, is described. Documentation distribution, which keeps all key areas informed of plant item quality status, and the retrieval and storage of information, are briefly discussed. (U.K.)

  8. Document and author promotion strategies in the secure wiki model

    DEFF Research Database (Denmark)

    Lindberg, Kasper; Jensen, Christian D.

    2012-01-01

    Wiki systems form a subclass of the more general Open Collaborative Authoring Systems, where content is created by a user community. The ability of anyone to edit the content is, at the same time, their strength and their weakness. Anyone can write documents that improve the value of the wiki-system, but this also means that anyone can introduce errors into documents, either by accident or on purpose. A security model for wiki-style authoring systems, called the Secure Wiki Model, has previously been proposed to address this problem. This model is designed to prevent corruption of good quality documents by limiting updates to such documents to users who have demonstrated their ability to produce documents of similar or better quality. While this security model prevents all users from editing all documents, it does respect the wiki philosophy by allowing any author who has produced documents of a certain …
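
    As a rough illustration of the promotion rule described above, the following Python sketch encodes documents and authors with integer quality levels and grants edit rights only to authors whose demonstrated level matches or exceeds the document's; the class names, the levels and the promotion logic are assumptions for illustration, not the published Secure Wiki Model.

      # Illustrative sketch of a secure-wiki style edit policy (assumed data model, not the published one).
      from dataclasses import dataclass, field

      @dataclass
      class Author:
          name: str
          level: int = 0          # demonstrated quality level

      @dataclass
      class Document:
          title: str
          level: int              # assessed quality level of the current revision
          history: list = field(default_factory=list)

      def may_edit(author: Author, doc: Document) -> bool:
          # An author may update a document only if they have demonstrated the
          # ability to produce documents of similar or better quality.
          return author.level >= doc.level

      def record_review(author: Author, accepted_level: int) -> None:
          # Promote the author when a reviewed contribution reaches a higher level.
          author.level = max(author.level, accepted_level)

      alice = Author("alice", level=1)
      page = Document("Main page", level=2)
      print(may_edit(alice, page))      # False: the document outranks the author
      record_review(alice, 2)           # a level-2 contribution was accepted elsewhere
      print(may_edit(alice, page))      # True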

  9. Quality documentation challenges for veterinary clinical pathology laboratories.

    Science.gov (United States)

    Sacchini, Federico; Freeman, Kathleen P

    2008-05-01

    An increasing number of veterinary laboratories worldwide have obtained or are seeking certification based on international standards, such as the International Organization for Standardization/International Electrotechnical Commission 17025. Compliance with any certification standard or quality management system requires quality documentation, an activity that may present several unique challenges in the case of veterinary laboratories. Research specifically addressing quality documentation is conspicuously absent in the veterinary literature. This article provides an overview of the quality system documentation needed to comply with a quality management system with an emphasis on preparing written standard operating procedures specific for veterinary laboratories. In addition, the quality documentation challenges that are unique to veterinary clinical pathology laboratories are critically evaluated against the existing quality standards and discussed with respect to possible solutions and/or recommended courses of action. Documentation challenges include the establishment of quality requirements for veterinary tests, the use or modification of human analytic methods for animal samples, the limited availability of quality control materials satisfactory for veterinary clinical pathology laboratories, the limited availability of veterinary proficiency programs, and the complications in establishing species-specific reference intervals.

  10. Environmental Restoration Remedial Action quality assurance requirements document

    International Nuclear Information System (INIS)

    1991-01-01

    This document defines the quality assurance requirements for the US Department of Energy-Richland Operations Office Environmental Restoration Remedial Action program at the Hanford Site. The Environmental Restoration Remedial Action program implements significant commitments made by the US Department of Energy in the Hanford Federal Facility Agreement and Consent Order entered into with the Washington State Department of Ecology and the US Environmental Protection Agency. This document combines quality assurance requirements from various source documents into one set of requirements for use by the US Department of Energy-Richland Operations Office and other Environmental Restoration Remedial Action program participants. This document will serve as the basis for developing Quality Assurance Program Plans and implementing procedures by the participants. The requirements of this document will be applied to activities affecting quality, using a graded approach based on the importance of the item, service, or activity to the program objectives. The Quality Assurance Program that will be established using this document as the basis, together with other program and technical documents, form an integrated management control system for conducting the Environmental Restoration Remedial Action program activities in a manner that provides safety and protects the environment and public health

  11. Towards web documents quality assessment for digital humanities scholars

    NARCIS (Netherlands)

    Ceolin, D.; Noordegraaf, Julia; Aroyo, L.M.; van Son, C.M.; Nejdl, Wolfgang; Hall, Wendy; Parigi, Paolo; Staab, Steffen

    2016-01-01

    We present a framework for assessing the quality of Web documents, and a baseline of three quality dimensions: trustworthiness, objectivity and basic scholarly quality. Assessing Web document quality is a "deep data" problem necessitating approaches to handle both data size and complexity.

  12. Quality Management of Measurements incl. Documentation

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    The 'Metro-E-Learn' project (with the Chair for Quality Management and Manufacturing-Oriented Metrology, …-Nürnberg, Germany) proposes to develop and implement a coherent learning and competence chain that leads from introductory and foundation e-courses in initial manufacturing engineering studies towards higher-level courses. Course sections include machine tool testing, the role of manufacturing metrology for QM (9), inspection planning (10), quality management of measurements incl. documentation (11), and advanced manufacturing measurement technology (12). The present report, which represents section 11 (Quality management of measurements incl. documentation) of the e-learning system, is part of the contribution of the Department of Manufacturing Engineering and Management (IPL) / Centre for Geometrical Metrology (CGM) at the Technical University of Denmark to the 'Metro-E-Learn' project under the EU MINERVA programme.

  13. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way in which the brain deals with images and speech. In the field of NLP, topic models are one of the important approaches to modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose the Event Model, which is unsupervised and based on the language processing mechanisms described by neurolinguistics, to model documents. In the Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in those events. The Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words using deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear method that is completely different from topic models. The Event Model achieves state-of-the-art results on document retrieval tasks.
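
    A minimal sketch of the two stages named in the abstract, word learning followed by a linear dimensionality reduction of document vectors, using off-the-shelf components; the use of gensim Word2Vec, mean word vectors and PCA is an assumption for illustration and not the authors' implementation.

      # Illustrative two-stage pipeline in the spirit of the abstract (assumed tools, not the authors' code):
      # stage 1 - learn word vectors; stage 2 - linear dimensionality reduction of document vectors.
      import numpy as np
      from gensim.models import Word2Vec
      from sklearn.decomposition import PCA
      from sklearn.metrics.pairwise import cosine_similarity

      docs = [
          "the committee approved the new quality manual",
          "nurses documented patient care in the electronic record",
          "the audit checked quality of the clinical documentation",
          "the reactor quality assurance program was revised",
      ]
      tokenized = [d.split() for d in docs]

      # Stage 1: word learning (a small Word2Vec model stands in for "deep learning" here).
      w2v = Word2Vec(sentences=tokenized, vector_size=32, min_count=1, epochs=100, seed=1)

      # Represent each document by the mean of its word vectors.
      doc_vecs = np.array([np.mean([w2v.wv[w] for w in toks], axis=0) for toks in tokenized])

      # Stage 2: linear dimensionality reduction to a low-dimensional document vector.
      low_dim = PCA(n_components=2).fit_transform(doc_vecs)

      # Toy retrieval: rank documents against the first one by cosine similarity.
      scores = cosine_similarity(low_dim[:1], low_dim)[0]
      for rank in np.argsort(-scores):
          print(f"{scores[rank]:.2f}  {docs[rank]}")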

  14. ARCHITECTURE SOFTWARE SOLUTION TO SUPPORT AND DOCUMENT MANAGEMENT QUALITY SYSTEM

    Directory of Open Access Journals (Sweden)

    Milan Eric

    2010-12-01

    One of the bases of the JUS ISO 9000 series of standards is quality system documentation. The architecture of the quality system documentation depends on the complexity of the business system. Establishing efficient management of quality system documentation is of great importance for the business system, both in the phase of introducing the quality system and in further stages of its improvement. The study describes the architecture and capability of software solutions to support and manage quality system documentation in accordance with the requirements of standards such as ISO 9001:2001, ISO 14001:2005, and HACCP.

  15. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as the source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are reflected in its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
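
    A small sketch of how a quality-requirements "spectrum" could be computed for a requirements document and a design document and then compared; the keyword dictionary and the cosine comparison are invented for illustration and only approximate the technique the paper describes.

      # Illustrative "quality requirements spectrum" (assumed keyword-counting scheme, not the authors' tool).
      import math
      import re
      from collections import Counter

      QUALITY_KEYWORDS = {          # assumed mapping from quality characteristics to indicator terms
          "performance": {"response", "latency", "throughput", "fast"},
          "security":    {"encrypt", "authenticate", "authorization", "audit"},
          "usability":   {"usable", "accessible", "learnable", "ui"},
          "reliability": {"failover", "recovery", "availability", "retry"},
      }

      def spectrum(text: str) -> dict:
          words = Counter(re.findall(r"[a-z]+", text.lower()))
          return {q: sum(words[t] for t in terms) for q, terms in QUALITY_KEYWORDS.items()}

      def cosine(a: dict, b: dict) -> float:
          keys = QUALITY_KEYWORDS.keys()
          dot = sum(a[k] * b[k] for k in keys)
          na = math.sqrt(sum(a[k] ** 2 for k in keys))
          nb = math.sqrt(sum(b[k] ** 2 for k in keys))
          return dot / (na * nb) if na and nb else 0.0

      requirements = "The system shall encrypt all data and authenticate users; response time under 200 ms."
      design = "The cache keeps latency low; nightly audit logs are kept; encrypt backups before storage."
      print(spectrum(requirements))
      print(spectrum(design))
      print("agreement:", round(cosine(spectrum(requirements), spectrum(design)), 2))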

  16. Builders Challenge Quality Criteria Support Document

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-06-01

    This document provides guidance to U.S. home builders participating in the Builders Challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). Homes also must meet the Builders Challenge Quality Criteria.

  17. Document Models

    Directory of Open Access Journals (Sweden)

    A.A. Malykh

    2017-08-01

    In this paper, the concept of locally simple models is considered. Locally simple models are arbitrarily complex models built from relatively simple components. A lot of practically important domains of discourse can be described as locally simple models, for example, business models of enterprises and companies. Up to now, research in human reasoning automation has been mainly concentrated around the most intellectually intensive activities, such as automated theorem proving. On the other hand, the retailer business model is formed from "jobs", and each "job" can be modelled and automated more or less easily. At the same time, the whole retailer model as an integrated system is extremely complex. In this paper, we offer a variant of the mathematical definition of a locally simple model. This definition is intended for modelling a wide range of domains. Therefore, we also must take into account perceptual and psychological issues. Logic is elitist, and if we want to attract as many people as possible to our models, we need to hide this elitism behind some metaphor to which 'ordinary' people are accustomed. As such a metaphor, we use the concept of a document, so our locally simple models are called document models. Document models are built in the paradigm of semantic programming. This allows us to achieve another important goal: to make the document models executable. Executable models are models that can act as practical information systems in the described domain of discourse. Thus, if our model is executable, then programming becomes redundant. The direct use of a model, instead of its programming coding, brings important advantages, for example, a drastic cost reduction for development and maintenance. Moreover, since the model is sound and not dissolved within programming modules, we can directly apply AI tools, in particular, machine learning. This significantly expands the possibilities for automation and …

  18. Theoretical foundations: Formalized temporal models for hyperlinked multimedia documents

    NARCIS (Netherlands)

    B. Meixner (Britta)

    2018-01-01

    Consistent linking and accurate synchronization of multimedia elements in hypervideos or multimedia documents are essential to provide a good quality of experience to viewers. Temporal models are needed to define relationships and constraints between multimedia elements and create an …

  19. The Janus Head Article - On Quality in the Documentation Process

    Directory of Open Access Journals (Sweden)

    Henrik Andersen

    2006-03-01

    The god Janus in Roman mythology was a two-faced god; each face had its own view of the world. Our idea behind the Janus Head article is to give you two different and maybe even contradicting views on a certain topic. In this issue the topic is quality in the documentation process. In the first half of this issue’s Janus Head Article, translators from the international company Grundfos give us their view of quality and how quality is managed in the documentation process at Grundfos. In the second half of the Janus Head Article, scholars from the University of Southern Denmark describe and discuss quality in the documentation process at Grundfos from a researcher’s point of view.

  20. The Janus Head Article - On Quality in the Documentation Process

    Directory of Open Access Journals (Sweden)

    Henrik Andersen

    2012-08-01

    The god Janus in Roman mythology was a two-faced god; each face had its own view of the world. Our idea behind the Janus Head article is to give you two different and maybe even contradicting views on a certain topic. In this issue the topic is quality in the documentation process. In the first half of this issue’s Janus Head Article, translators from the international company Grundfos give us their view of quality and how quality is managed in the documentation process at Grundfos. In the second half of the Janus Head Article, scholars from the University of Southern Denmark describe and discuss quality in the documentation process at Grundfos from a researcher’s point of view.

  1. Web-based X-ray quality control documentation.

    Science.gov (United States)

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal
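
    The last step described above, a home page linking the PDF survey files, can be automated with a short script; the following sketch assumes a folder layout of surveys/<room>/<file>.pdf, which is an illustrative assumption rather than the department's actual setup.

      # Illustrative index-page generator for PDF survey reports (assumed folder layout: surveys/<room>/<file>.pdf).
      from html import escape
      from pathlib import Path

      def build_index(survey_root: str = "surveys", out_file: str = "index.html") -> None:
          rows = []
          for pdf in sorted(Path(survey_root).rglob("*.pdf")):
              room = pdf.parent.name
              rows.append(f'<li>{escape(room)}: <a href="{pdf.as_posix()}">{escape(pdf.name)}</a></li>')
          html = "<html><body><h1>X-ray QC surveys</h1><ul>\n" + "\n".join(rows) + "\n</ul></body></html>"
          Path(out_file).write_text(html, encoding="utf-8")

      if __name__ == "__main__":
          build_index()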

  2. Impact of Quality Improvement Educational Interventions on Documented Adherence to Quality Measures for Adults with Crohn's Disease.

    Science.gov (United States)

    Greene, Laurence; Sapir, Tamar; Moreo, Kathleen; Carter, Jeffrey D; Patel, Barry; Higgins, Peter D R

    2015-09-01

    In recent years, leading organizations in inflammatory bowel disease (IBD) have developed quality measures for the care of adults with Crohn's disease or ulcerative colitis. We used chart audits to assess the impact of quality improvement educational activities on documented adherence to Physician Quality Reporting System measures for IBD. Twenty community-based gastroenterologists were recruited to participate in baseline chart audits (n = 200), a series of 4 accredited educational activities with feedback, and follow-up chart audits (n = 200). Trained abstractors reviewed randomly selected charts of adults with moderate or severe Crohn's disease. The charts were retrospectively abstracted for physicians' documented performance of the 2013 Physician Quality Reporting System IBD quality measures. We compared the physicians' baseline and posteducation rates of documented adherence with 10 of these measures. In a secondary analysis, we compared preeducation with posteducation difference scores of low-performing physicians, those whose baseline documentation rates were in the lowest quartile, and the rest of the cohort. At baseline, documentation of mean provider-level adherence to the 10 quality measures ranged from 3% to 98% (grand mean = 35.8%). In the overall analysis, baseline and posteducation rates of documented adherence did not differ significantly for any of the measures. However, for 4 measures, preeducation to posteducation difference scores were significantly greater among low performers than physicians in the highest 3 quartiles. The results of this preliminary pragmatic study indicate that quality improvement education affords the potential to improve adherence to Physician Quality Reporting System quality measures for IBD among low-performing gastroenterologists.

  3. Air quality model guideline

    International Nuclear Information System (INIS)

    Idriss, A.; Spurrell, F.

    2009-06-01

    Alberta Environment has developed a guideline for operations and proposed operations that require approvals under the province's Environmental Protection and Enhancement Act or that operate under a code of practice for emissions to the atmosphere. In an effort to ensure consistency in the use of dispersion models for regulatory applications in Alberta, this document provided detailed guidance on suitable methods and approaches that should be employed to assess air quality from emission sources, specifically the information required to demonstrate that a source meets the Alberta ambient air quality objectives. The document outlined the statutory authority and provided an overview of the approach. It provided detailed advice on the types and uses of dispersion models, with particular reference to the modelling protocol, input data, and output interpretation. Guidance on the application of regulatory models was also presented. Various models were described and their intended uses were explained. Internet addresses for different modelling resources were also offered. Finally, some information about regional modelling in the province of Alberta was discussed. 40 refs., 4 tabs., 7 figs., 3 appendices.

  4. Development of the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to assess paper-based and electronic resident records.

    Science.gov (United States)

    Wang, Ning; Björvell, Catrin; Hailey, David; Yu, Ping

    2014-12-01

    To develop the Quality of Australian Nursing Documentation in Aged Care (QANDAC) instrument to measure the quality of paper-based and electronic resident records. The instrument was based on the nursing process model and on three attributes of documentation quality identified in a systematic review. The development process involved five phases following approaches to designing criterion-referenced measures. The face and content validity and the inter-rater reliability of the instrument were estimated using a focus group approach and a consensus model. The instrument contains 34 questions in three sections: completion of nursing history and assessment, description of the care process, and meeting the requirements of data entry. Estimates of the validity and inter-rater reliability of the instrument gave satisfactory results. The QANDAC instrument may be a useful audit tool for quality improvement and research in aged care documentation. © 2013 ACOTA.

  5. Daily Encounter Cards-Evaluating the Quality of Documented Assessments.

    Science.gov (United States)

    Cheung, Warren J; Dudek, Nancy; Wood, Timothy J; Frank, Jason R

    2016-10-01

    Concerns over the quality of work-based assessment (WBA) completion have resulted in faculty development and rater training initiatives. Daily encounter cards (DECs) are a common form of WBA used in ambulatory care and shift work settings. A tool is needed to evaluate initiatives aimed at improving the quality of completion of this widely used form of WBA. The completed clinical evaluation report rating (CCERR) was designed to provide a measure of the quality of documented assessments on in-training evaluation reports. The purpose of this study was to provide validity evidence to support using the CCERR to assess the quality of DEC completion. Six experts in resident assessment grouped 60 DECs into 3 quality categories (high, average, and poor) based on how informative each DEC was for reporting judgments of the resident's performance. Eight supervisors (blinded to the expert groupings) scored the 10 most representative DECs in each group using the CCERR. Mean scores were compared to determine if the CCERR could discriminate based on DEC quality. Statistically significant differences in CCERR scores were observed between all quality groups (P < .001). A generalizability analysis demonstrated that the majority of score variation was due to differences in DECs, and the reliability with a single rater was 0.95. The CCERR is a reliable and valid tool to evaluate DEC quality. It can serve as an outcome measure for studying interventions targeted at improving the quality of assessments documented on DECs.

  6. Quality of nursing documentation: Paper-based health records versus electronic-based health records.

    Science.gov (United States)

    Akhu-Zaheya, Laila; Al-Maaitah, Rowaida; Bany Hani, Salam

    2018-02-01

    To assess and compare the quality of paper-based and electronic-based health records. The comparison examined three criteria: content, documentation process and structure. Nursing documentation is a significant indicator of the quality of patient care delivery. It can be either paper-based or organised within the system known as the electronic health record. Nursing documentation must be completed to the highest standards, to ensure the safety and quality of healthcare services. However, the evidence is not clear on which of the two forms of documentation (paper-based versus electronic health records) is of higher quality. A retrospective, descriptive, comparative design was used to address the study's purposes. A convenience sample of patients' records, from two public hospitals, was audited using the Cat-ch-Ing audit instrument. The sample consisted of 434 records for both paper-based health records and electronic health records from medical and surgical wards. Electronic health records were better than paper-based health records in terms of process and structure. In terms of quantity and quality of content, paper-based records were better than electronic health records. The study affirmed the poor quality of nursing documentation and the lack of nurses' knowledge and skills in the nursing process and its application in both paper-based and electronic-based systems. Both forms of documentation revealed drawbacks in terms of content, process and structure. This study provided important information, which can guide policymakers and administrators in identifying effective strategies aimed at enhancing the quality of nursing documentation. Policies and actions to ensure quality nursing documentation at the national level should focus on improving nursing knowledge, competencies, practice in the nursing process, enhancing the work environment and nursing workload, as well as strengthening the capacity building of nursing practice to improve the quality of nursing care and …

  7. 34 CFR 200.89 - MEP allocations; Re-interviewing; Eligibility documentation; and Quality control.

    Science.gov (United States)

    2010-07-01

    34 CFR 200.89 (Education regulations of the offices of the Department) covers MEP allocations; re-interviewing; eligibility documentation; and quality control. Paragraph (a) addresses the allocation of funds under the MEP for fiscal years, taking into account factors such as the number of recruiters, the size or growth of the local migratory child population, and the effectiveness of local quality control.

  8. Quality and correlates of medical record documentation in the ambulatory care setting

    Directory of Open Access Journals (Sweden)

    Simon Steven R

    2002-12-01

    Background: Documentation in the medical record facilitates the diagnosis and treatment of patients. Few studies have assessed the quality of outpatient medical record documentation, and to the authors' knowledge, none has conclusively determined the correlates of chart documentation. We therefore undertook the present study to measure the rates of documentation of quality of care measures in an outpatient primary care practice setting that utilizes an electronic medical record. Methods: We reviewed electronic medical records from 834 patients receiving care from 167 physicians (117 internists and 50 pediatricians) at 14 sites of a multi-specialty medical group in Massachusetts. We abstracted information for five measures of medical record documentation quality: smoking history, medications, drug allergies, compliance with screening guidelines, and immunizations. From other sources we determined physicians' specialty, gender, year of medical school graduation, and self-reported time spent teaching and in patient care. Results: Among internists, unadjusted rates of documentation were 96.2% for immunizations, 91.6% for medications, 88% for compliance with screening guidelines, 61.6% for drug allergies, and 37.8% for smoking history. Among pediatricians, rates were 100% for immunizations, 84.8% for medications, 90.8% for compliance with screening guidelines, 50.4% for drug allergies, and 20.4% for smoking history. While certain physician and patient characteristics correlated with some measures of documentation quality, documentation varied depending on the measure. For example, female internists were more likely than male internists to document smoking history (odds ratio [OR], 1.90; 95% confidence interval [CI], 1.27 – 2.83) but were less likely to document drug allergies (OR, 0.51; 95% CI, 0.35 – 0.75). Conclusions: Medical record documentation varied depending on the measure, with room for improvement in most domains. A variety of …

  9. Improving documentation of a beta-blocker quality measure through an anesthesia information management system and real-time notification of documentation errors.

    Science.gov (United States)

    Nair, Bala G; Peterson, Gene N; Newman, Shu-Fang; Wu, Wei-Ying; Kolios-Morris, Vickie; Schwid, Howard A

    2012-06-01

    Continuation of perioperative beta-blockers for surgical patients who are receiving beta-blockers prior to arrival for surgery is an important quality measure (SCIP-Card-2). For this measure to be considered successful, the name, date, and time of the perioperative beta-blocker must be documented. Alternatively, if the beta-blocker is not given, the medical reason for not administering it must be documented. Before the study was conducted, the institution lacked a highly reliable process to document the date and time of self-administration of beta-blockers prior to hospital admission. Because of this, compliance with the beta-blocker quality measure was poor (approximately 65%). To improve this measure, the anesthesia care team was made responsible for documenting perioperative beta-blockade. Clear documentation guidelines were outlined, and an electronic Anesthesia Information Management System (AIMS) was configured to facilitate complete documentation of the beta-blocker quality measure. In addition, real-time electronic alerts were generated using Smart Anesthesia Messenger (SAM), an internally developed decision-support system, to notify users concerning incomplete beta-blocker documentation. Weekly compliance for perioperative beta-blocker documentation before the study was 65.8 +/- 16.6%, which served as the baseline value. When the anesthesia care team started documenting the perioperative beta-blocker in AIMS, compliance was 60.5 +/- 8.6% (p = .677 compared with baseline). Electronic alerts with SAM improved documentation compliance to 94.6 +/- 3.5%. The findings highlight the need to (1) clearly assign responsibility for documentation and (2) enhance features in the electronic medical systems to alert the user concerning incomplete documentation.
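
    A minimal sketch of the kind of completeness check that could drive such a real-time alert, assuming a dictionary-shaped anesthesia record; the field names and alert texts are illustrative assumptions, not the SAM implementation.

      # Illustrative completeness check for the SCIP-Card-2 beta-blocker measure (assumed record fields).
      from typing import Optional

      def beta_blocker_alert(record: dict) -> Optional[str]:
          """Return an alert message if documentation is incomplete, else None."""
          given = record.get("beta_blocker_given")
          if given:
              missing = [f for f in ("drug_name", "date", "time") if not record.get(f)]
              if missing:
                  return "Beta-blocker documented as given but missing: " + ", ".join(missing)
          elif given is False:
              if not record.get("reason_not_given"):
                  return "Beta-blocker not given: document the medical reason."
          else:
              return "Document whether the perioperative beta-blocker was given."
          return None

      print(beta_blocker_alert({"beta_blocker_given": True, "drug_name": "metoprolol", "date": "2012-05-01"}))
      print(beta_blocker_alert({"beta_blocker_given": False, "reason_not_given": "bradycardia"}))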

  10. Daily Encounter Cards—Evaluating the Quality of Documented Assessments

    Science.gov (United States)

    Cheung, Warren J.; Dudek, Nancy; Wood, Timothy J.; Frank, Jason R.

    2016-01-01

    Background: Concerns over the quality of work-based assessment (WBA) completion have resulted in faculty development and rater training initiatives. Daily encounter cards (DECs) are a common form of WBA used in ambulatory care and shift work settings. A tool is needed to evaluate initiatives aimed at improving the quality of completion of this widely used form of WBA. Objective: The completed clinical evaluation report rating (CCERR) was designed to provide a measure of the quality of documented assessments on in-training evaluation reports. The purpose of this study was to provide validity evidence to support using the CCERR to assess the quality of DEC completion. Methods: Six experts in resident assessment grouped 60 DECs into 3 quality categories (high, average, and poor) based on how informative each DEC was for reporting judgments of the resident's performance. Eight supervisors (blinded to the expert groupings) scored the 10 most representative DECs in each group using the CCERR. Mean scores were compared to determine if the CCERR could discriminate based on DEC quality. Results: Statistically significant differences in CCERR scores were observed between all quality groups (P < .001). A generalizability analysis demonstrated the majority of score variation was due to differences in DECs. The reliability with a single rater was 0.95. Conclusions: The CCERR is a reliable and valid tool to evaluate DEC quality. It can serve as an outcome measure for studying interventions targeted at improving the quality of assessments documented on DECs. PMID:27777675

  11. Quality assurance when documenting chemical hazards to health and environment

    International Nuclear Information System (INIS)

    Guttormsen, R.; Modahl, S.I.; Tufto, P.A.; Buset, H.

    1991-01-01

    In a joint project between The Norwegian Petroleum Directorate (NPD), the State Pollution Control Agency (SFT) and Conoco Norway Inc. (CNI) we have evaluated the use of quality assurance principles in connection with the development and distribution of information about chemicals. Assuring the quality of the documentation depends first of all on: the work in international organizations; the content of national and international guidelines and criteria documents; the use of product registers; activities in manufacturers' organizations; and the role of importers and agents. These are the aspects that have been evaluated. Recommendations are given in this paper concerning: definition of responsibilities in regulations, standards and guidelines; feedback of experience and coordination through international work; application of quality assurance principles in the use of information technology in international organizations and in manufacturers' organizations; and use of quality assurance principles in the validation of data.

  12. Model documentation report: Short-Term Hydroelectric Generation Model

    International Nuclear Information System (INIS)

    1993-08-01

    The purpose of this report is to define the objectives of the Short-Term Hydroelectric Generation Model (STHGM), describe its basic approach, and provide details on the model structure. This report is intended as a reference document for model analysts, users, and the general public. Documentation of the model is in accordance with the Energy Information Administration's (EIA) legal obligation to provide adequate documentation in support of its models (Public Law 94-385, Section 57.b.2). The STHGM performs a short-term (18- to 27-month) forecast of hydroelectric generation in the United States using an autoregressive integrated moving average (ARIMA) time series model with precipitation as an explanatory variable. The model results are used as input for the Short-Term Energy Outlook.
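
    A minimal sketch of an ARIMA forecast with precipitation as an exogenous regressor, in the spirit of the STHGM but with invented monthly data and an arbitrary (1,1,1) order; it uses statsmodels and is not the EIA specification.

      # Illustrative ARIMA forecast with precipitation as an exogenous regressor (assumed toy data, not EIA's model).
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(7)
      months = pd.date_range("2010-01", periods=120, freq="MS")
      month_num = np.asarray(months.month)
      precip = 80 + 30 * np.sin(2 * np.pi * month_num / 12) + rng.normal(0, 10, 120)   # mm per month
      hydro_gen = 20 + 0.15 * precip + rng.normal(0, 2, 120).cumsum() * 0.05            # GWh, loosely tied to precipitation

      series = pd.Series(hydro_gen, index=months)
      model = SARIMAX(series, exog=precip.reshape(-1, 1), order=(1, 1, 1)).fit(disp=False)

      # Assume normal precipitation for the next 24 months and forecast generation.
      future_precip = np.full((24, 1), precip.mean())
      forecast = model.forecast(steps=24, exog=future_precip)
      print(forecast.head())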

  13. Process of technical document quality assessment | Djebabra ...

    African Journals Online (AJOL)

    The most used instrument in training and scientific research is obviously the book which occupies since always a place of choice. Indeed, the book and more particularly the technical book are is used as a support, as well in the basic training as in the document is of good quality in order to have confidence in the services ...

  14. Quality improvement in documentation of postoperative care nursing using computer-based medical records

    DEFF Research Database (Denmark)

    Olsen, Susanne Winther

    2013-01-01

    on the template with quantitative data showed satisfactory documentation of postoperative care nursing in 67% (18% to 92%; mean [min-max]) of the scores. The template for documentation using qualitative descriptions was used by 63% of the nurses, but the keywords were used to a varying degree, that is, from 0......Postanesthesia nursing should be documented with high quality. The purpose of this retrospective case-based study on 49 patients was to analyze the quality of postoperative documentation in the two existing templates and, based on this audit, to suggest a new template for documentation. The audit...

  15. Model documentation report: Transportation sector model of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  16. World energy projection system: Model documentation

    Science.gov (United States)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas to produce projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.
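
    The accounting identity behind such a framework, consumption as GDP times energy intensity with an assumed intensity decline, can be sketched in a few lines; the growth rates and intensities below are invented for illustration and are not EIA figures.

      # Illustrative WEPS-style accounting projection (invented numbers; not EIA data).
      # energy consumption(t) = GDP(t) * energy intensity(t), with intensity declining at an assumed rate.
      def project_consumption(gdp_base, gdp_growth, intensity_base, intensity_decline, years):
          out = []
          gdp, intensity = gdp_base, intensity_base
          for year in years:
              out.append((year, gdp * intensity))
              gdp *= 1 + gdp_growth
              intensity *= 1 - intensity_decline
          return out

      # Assumed inputs: GDP in trillion dollars, intensity in quadrillion Btu per trillion dollars of GDP.
      for year, quads in project_consumption(gdp_base=30.0, gdp_growth=0.03,
                                             intensity_base=6.0, intensity_decline=0.01,
                                             years=range(1992, 1998)):
          print(year, round(quads, 1), "quadrillion Btu")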

  17. World energy projection system: Model documentation

    International Nuclear Information System (INIS)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas to produce projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO) (Figure 1). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.

  18. The Role of Documentation Quality in Anesthesia-Related Closed Claims: A Descriptive Qualitative Study.

    Science.gov (United States)

    Wilbanks, Bryan A; Geisz-Everson, Marjorie; Boust, Rebecca R

    2016-09-01

    Clinical documentation is a critical tool in supporting care provided to patients. Sound documentation provides a picture of clinical events that can be used to improve patient care. However, many other uses for clinical documentation are equally important. Such documentation informs clinical decision support tools, creates a legal record of patient care, assists in financial reimbursement of services, and serves as a repository for secondary data analysis. Conversely, poor documentation can impair patient safety and increase malpractice risk exposure by reflecting poor or inaccurate information that ultimately may guide patient care decisions. Through an examination of anesthesia-related closed claims, a descriptive qualitative study explored the antecedents and consequences of documentation quality in the claims reviewed. A secondary data analysis utilized a database generated by the American Association of Nurse Anesthetists Foundation closed claim review team. Four major themes emerged from the analysis. Themes 1, 2, and 4 primarily describe how poor documentation quality can have negative consequences for clinicians. The third theme primarily describes how poor documentation quality can negatively affect patient safety.

  19. QUALITY OF NURSING DOCUMENTATION AND NURSE’S OBJECTIVE WORKLOAD BASED ON TIME AND MOTION STUDY (TMS)

    Directory of Open Access Journals (Sweden)

    Mira Amelynda Prakosa

    2017-02-01

    Introduction: The quality of documentation can decrease because of poor completion of documentation forms. Workload is one of the factors that can influence the completion of documentation. This study aimed to analyze the correlation between nurses' objective workload and the quality of nursing documentation in RSU Haji. Method: The design of this study was descriptive correlational with a cross-sectional approach. The population was the nurses working in the Marwah 3 and 4 inpatient wards of RSU Haji Surabaya. A sample of 14 respondents was selected by simple random sampling. The independent variable was nurses' objective workload and the dependent variable was the quality of nursing documentation. The data were analyzed using logistic regression. Result: Nurses' objective workload in RSU Haji was 72%. There was no correlation between nurses' objective workload and the completeness of nursing documentation (P = 0.999), nor between nurses' objective workload and the accuracy of nursing documentation (P = 0.999). Discussion: This study concluded that nurses' objective workload was low and that the quality of nursing documentation was reasonably accurate and complete. Future researchers should provide precise operational definitions so that the factors affecting the quality of documentation can be captured and the workload of nurses in RSU Haji can approach the ideal. Keywords: nurses, quality of nursing documentation, objective workload

  20. Environmental restoration remedial action quality assurance requirements document

    International Nuclear Information System (INIS)

    Cote, R.F.

    1991-01-01

    The Environmental Restoration Remedial Action Quality Assurance Requirements Document (DOE/RL 90-28) defines the quality assurance program requirements for the US Department of Energy-Richland Field Office Environmental Restoration Remedial Action Program at the Hanford Site, Richland, Washington. This paper describes the objectives outlined in DOE/RL 90-28. The Environmental Restoration Remedial Action Program implements significant commitments made by the US Department of Energy in the Hanford Federal Facility Agreement and Consent Order entered into with the Washington State Department of Ecology and the US Environmental Protection Agency.

  1. Nonlinear filtering for character recognition in low quality document images

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2014-09-01

    Optical character recognition in scanned printed documents is a well-studied task in which capture conditions such as sheet position, illumination, contrast, and resolution are controlled. Nowadays, it is often more practical to capture documents with mobile devices than with a scanner. As a consequence, the quality of document images is frequently poor owing to the presence of geometric distortions, nonhomogeneous illumination, low resolution, etc. In this work we propose to use multiple adaptive nonlinear composite filters for the detection and classification of characters. Computer simulation results obtained with the proposed system are presented and discussed.
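
    A minimal sketch of correlation-based character detection, the building block that composite filters elaborate on: a character template is cross-correlated with a noisy page via the FFT and the peak location is reported. The synthetic image and simple template are illustrative assumptions, not the paper's adaptive nonlinear composite filters.

      # Illustrative correlation-based character detection (toy template; not the paper's adaptive nonlinear filters).
      import numpy as np

      def correlate2d_fft(image, template):
          # Zero-mean the template so flat background regions score near zero.
          t = template - template.mean()
          pad = np.zeros_like(image, dtype=float)
          pad[: t.shape[0], : t.shape[1]] = t
          # Circular cross-correlation via the FFT; the peak marks the best template alignment.
          return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(pad))))

      # Toy "page": background noise with one bright 5x5 cross (our "character") at row 20, column 30.
      rng = np.random.default_rng(3)
      page = rng.normal(0, 0.1, (64, 64))
      cross = np.zeros((5, 5))
      cross[2, :] = 1.0
      cross[:, 2] = 1.0
      page[20:25, 30:35] += cross

      scores = correlate2d_fft(page, cross)
      peak = np.unravel_index(np.argmax(scores), scores.shape)
      print("detected character at (row, col):", peak)     # expected near (20, 30)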

  2. Discussion on the compilation of document of iso quality management system for radiation sterilization enterprises

    International Nuclear Information System (INIS)

    Li Chunhong; Ha Yiming; Zhou Hongjie; Feng Zhiguo; Wang Feng

    2006-01-01

    Based on the characteristics of radiation sterilization enterprises and the requirements of ISO 9001, ISO 13485 and ISO 11137, the compilation of the quality manual, procedure documents and technological documents during certification of an ISO quality management system for radiation sterilization enterprises is discussed. (authors)

  3. Clinical decision support improves quality of telephone triage documentation--an analysis of triage documentation before and after computerized clinical decision support.

    Science.gov (United States)

    North, Frederick; Richards, Debra D; Bremseth, Kimberly A; Lee, Mary R; Cox, Debra L; Varkey, Prathibha; Stroebel, Robert J

    2014-03-20

    Clinical decision support (CDS) has been shown to be effective in improving medical safety and quality, but there is little information on how telephone triage benefits from CDS. The aim of our study was to compare triage documentation quality associated with the use of a clinical decision support tool, ExpertRN©. We examined 50 triage documents before and after a CDS tool was used in nursing triage. To control for the effects of CDS training we had an additional control group of triage documents created by nurses who were trained in the CDS tool, but who did not use it in selected notes. The CDS intervention cohort of triage notes was compared to both the pre-CDS notes and the CDS-trained (but not using CDS) cohort. Cohorts were compared using the documentation standards of the American Academy of Ambulatory Care Nursing (AAACN). We also compared triage note content (documentation of associated positive and negative features relating to the symptoms, self-care instructions, and warning signs to watch for), and documentation defects pertinent to triage safety. Three of five AAACN documentation standards were significantly improved with CDS. There was a mean of 36.7 symptom features documented in triage notes for the CDS group but only 10.7 symptom features in the pre-CDS cohort (p < 0.0001) and 10.2 for the cohort that was CDS-trained but not using CDS (p < 0.0001). The difference between the mean of 10.7 symptom features documented in the pre-CDS cohort and the mean of 10.2 symptom features documented in the CDS-trained but not using CDS cohort was not statistically significant (p = 0.68). CDS significantly improves triage note documentation quality. CDS-aided triage notes had significantly more information about symptoms, warning signs, and self-care. The changes in triage documentation appeared to be the result of the CDS alone and not due to any CDS training that came with the CDS intervention. Although this study shows that CDS can improve documentation, further study is needed.

  4. International Nuclear Model personal computer (PCINM): Model documentation

    International Nuclear Information System (INIS)

    1992-08-01

    The International Nuclear Model (INM) was developed to assist the Energy Information Administration (EIA), U.S. Department of Energy (DOE) in producing worldwide projections of electricity generation, fuel cycle requirements, capacities, and spent fuel discharges from commercial nuclear reactors. The original INM was developed, maintained, and operated on a mainframe computer system. In spring 1992, a streamlined version of INM was created for use on a microcomputer utilizing CLIPPER and PCSAS software. This new version is known as PCINM. This documentation is based on the new PCINM version. This document is designed to satisfy the requirements of several categories of users of the PCINM system including technical analysts, theoretical modelers, and industry observers. This document assumes the reader is familiar with the nuclear fuel cycle and each of its components. This model documentation contains four chapters and seven appendices. Chapter Two presents the model overview containing the PCINM structure and process flow, the areas for which projections are made, and input data and output reports. Chapter Three presents the model technical specifications showing all model equations, algorithms, and units of measure. Chapter Four presents an overview of all parameters, variables, and assumptions used in PCINM. The appendices present the following detailed information: variable and parameter listings, variable and equation cross reference tables, source code listings, file layouts, sample report outputs, and model run procedures. 2 figs

  5. Entity models for trigger-reaction documents

    NARCIS (Netherlands)

    Khalid, M.A.; Marx, M.; Makkes, M.X.

    2008-01-01

    We define the notion of an entity model for a special kind of document popular on the web: an article followed by a list of reactions to that article, usually by many authors and usually in reverse chronological order. We call these documents trigger-reaction pairs. The entity model describes which

  6. Quality and Capacity Document 2011. Part 2; Kwaliteits- en Capaciteitsdocument 2011. Deel 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-01-15

    In accordance with Article 21 of the Dutch Electricity Law 1998 this quality and capacity document was drafted. The document describes (a) the level of quality which is aimed at and the scores over the last years; (b) the quality management system; (c) the estimated results of the total need for transport for the period 2012-2021; (d) bottlenecks and solutions required to meet the demand for future transport capacity; and (e) measures to be taken for maintenance and replacements. This document comprises chapters 6-8. The remaining chapters can be found in parts 1, 3 and 4. [Translated from Dutch] In accordance with Article 21 of the Electricity Act 1998, this document was drawn up. In this document TenneT (a) indicates which quality level is being pursued and what the scores have been over the last three years; (b) describes the quality management system; (c) describes the results of the estimate of the total transport demand for the period 2012-2021; (d) describes the bottlenecks and solutions needed to meet the future demand for transport capacity; and (e) describes the measures concerning replacement and maintenance. This document contains chapters 6-8. The remaining chapters can be found in parts 1, 3 and 4.

  7. Quality and Capacity Document 2011. Part 1; Kwaliteits- en Capaciteitsdocument 2011. Deel 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-01-15

    In accordance with Article 21 of the Dutch Electricity Law 1998 this quality and capacity document was drafted. The document describes (a) the level of quality which is aimed at and the scores over the last years; (b) the quality management system; (c) the estimated results of the total need for transport for the period 2012-2021; (d) bottlenecks and solutions required to meet the demand for future transport capacity; and (e) measures to be taken for maintenance and replacements. This document comprises chapters 1-5. The remaining chapters can be found in parts 2, 3 and 4. [Translated from Dutch] In accordance with Article 21 of the Electricity Act 1998, this document was drawn up. In this document TenneT (a) indicates which quality level is being pursued and what the scores have been over the last three years; (b) describes the quality management system; (c) describes the results of the estimate of the total transport demand for the period 2012-2021; (d) describes the bottlenecks and solutions needed to meet the future demand for transport capacity; and (e) describes the measures concerning replacement and maintenance. This document contains chapters 1-5. The remaining chapters can be found in parts 2, 3 and 4.

  8. Quality and Capacity Document 2011. Part 4; Kwaliteits- en Capaciteitsdocument 2011. Deel 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-01-15

    In accordance with Article 21 of the Dutch Electricity Law 1998 this quality and capacity document was drafted. The document describes (a) the level of quality which is aimed at and the scores over the last years; (b) the quality management system; (c) the estimated results of the total need for transport for the period 2012-2021; (d) bottlenecks and solutions required to meet the demand for future transport capacity; and (e) measures to be taken for maintenance and replacements. This document comprises chapters 10-12. The remaining chapters can be found in parts 1, 2 and 3. [Translated from Dutch] In accordance with Article 21 of the Electricity Act 1998, this document was drawn up. In this document TenneT (a) indicates which quality level is being pursued and what the scores have been over the last three years; (b) describes the quality management system; (c) describes the results of the estimate of the total transport demand for the period 2012-2021; (d) describes the bottlenecks and solutions needed to meet the future demand for transport capacity; and (e) describes the measures concerning replacement and maintenance. This document contains chapters 10-12. The remaining chapters can be found in parts 1, 2 and 3.

  9. Poor Documentation of Inflammatory Bowel Disease Quality Measures in Academic, Community, and Private Practice.

    Science.gov (United States)

    Feuerstein, Joseph D; Castillo, Natalia E; Siddique, Sana S; Lewandowski, Jeffrey J; Geissler, Kathy; Martinez-Vazquez, Manuel; Thukral, Chandrashekhar; Leffler, Daniel A; Cheifetz, Adam S

    2016-03-01

    Quality measures are used to standardize health care and monitor quality of care. In 2011, the American Gastroenterological Association established quality measures for inflammatory bowel disease (IBD), but there has been limited documentation of compliance from different practice settings. We reviewed charts from 367 consecutive patients with IBD seen at academic practices, 217 patients seen at community practices, and 199 patients seen at private practices for compliance with 8 outpatient measures. Records were assessed for IBD history, medications, comorbidities, and hospitalizations. We also determined the number of patient visits to gastroenterologists in the past year, whether patients had a primary care physician at the same institution, and whether they were seen by a specialist in IBD or in conjunction with a trainee, and reviewed physician demographics. A univariate and multivariate statistical analysis was performed to determine which factors were associated with compliance with all core measures. Screening for tobacco abuse was the most frequently assessed core measure (89.6% of patients; n = 701 of 783), followed by location of IBD (80.3%; n = 629 of 783), and assessment for corticosteroid-sparing therapy (70.8%; n = 275 of 388). The least frequently evaluated measures were pneumococcal immunization (16.7% of patients; n = 131 of 783), bone loss (25%; n = 126 of 505), and influenza immunization (28.7%; n = 225 of 783). Only 5.8% of patients (46 of 783) had all applicable core measures documented (24 in academic practice, none in community practice, and 22 in private practice). In the multivariate model, year of graduation from fellowship (odds ratio [OR], 2.184; 95% confidence interval [CI], 1.522-3.134) was associated with documentation of all core measures. We found poor documentation of IBD quality measures in academic, community, and private gastroenterology practices. Interventions are necessary to improve reporting of quality measures. Copyright © 2016 AGA Institute. Published by Elsevier Inc.
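    An odds ratio with a 95% confidence interval like the one quoted above is typically obtained from a logistic regression of "all core measures documented" on candidate predictors. The sketch below shows that calculation on synthetic data; the predictor names, coefficients, and sample are hypothetical, not the study's variables or results.

```python
# Illustrative sketch: odds ratios and 95% CIs from a logistic regression on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 783
years_since_fellowship = rng.normal(10, 5, n)           # hypothetical predictor
ibd_specialist = rng.integers(0, 2, n)                   # hypothetical predictor
logit = -3 + 0.08 * years_since_fellowship + 0.5 * ibd_specialist
all_measures_documented = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([years_since_fellowship, ibd_specialist]))
fit = sm.Logit(all_measures_documented, X).fit(disp=False)
print("odds ratios:", np.exp(fit.params))
print("95% CIs:", np.exp(fit.conf_int()), sep="\n")
```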

  10. Wilmar joint market model, Documentation

    International Nuclear Information System (INIS)

    Meibom, P.; Larsen, Helge V.; Barth, R.; Brand, H.; Weber, C.; Voll, O.

    2006-01-01

    The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. The Joint Market Model (JMM) constitutes one of these sub-models. This report documents the Joint Market model (JMM). The documentation describes: 1. The file structure of the JMM. 2. The sets, parameters and variables in the JMM. 3. The equations in the JMM. 4. The looping structure in the JMM. (au)
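    To give a concrete feel for the kind of optimization a joint market model solves, the toy below dispatches three hypothetical generating units at least cost subject to an hourly demand balance and capacity limits. It is only an illustration of the problem class; the costs, capacities, and demand are made up and this is not the JMM's equation system.

```python
# Toy least-cost dispatch: minimize generation cost s.t. generation = demand, unit limits.
from scipy.optimize import linprog

costs = [20.0, 35.0, 80.0]        # EUR/MWh for three hypothetical units
capacity = [400.0, 300.0, 200.0]  # MW upper bounds
demand = 650.0                    # MW to be served this hour

result = linprog(
    c=costs,
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],       # generation must equal demand
    bounds=[(0.0, cap) for cap in capacity],     # unit capacity limits
    method="highs",
)
print(result.x, result.fun)   # dispatch per unit (MW) and total cost
```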

  11. Wilmar joint market model, Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Meibom, P.; Larsen, Helge V. [Risoe National Lab. (Denmark); Barth, R.; Brand, H. [IER, Univ. of Stuttgart (Germany); Weber, C.; Voll, O. [Univ. of Duisburg-Essen (Germany)

    2006-01-15

    The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. The Joint Market Model (JMM) constitutes one of these sub-models. This report documents the Joint Market model (JMM). The documentation describes: 1. The file structure of the JMM. 2. The sets, parameters and variables in the JMM. 3. The equations in the JMM. 4. The looping structure in the JMM. (au)

  12. Model documentation renewable fuels module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1997 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves three purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. Finally, such documentation facilitates continuity in EIA model development by providing information sufficient to perform model enhancements and data updates as part of EIA's ongoing mission to provide analytical and forecasting information systems.

  13. Model documentation renewable fuels module of the National Energy Modeling System

    International Nuclear Information System (INIS)

    1997-04-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1997 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves three purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. Finally, such documentation facilitates continuity in EIA model development by providing information sufficient to perform model enhancements and data updates as part of EIA's ongoing mission to provide analytical and forecasting information systems.

  14. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    Science.gov (United States)

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  15. TECHNICAL GUIDANCE DOCUMENT: CONSTRUCTION QUALITY MANAGEMENT FOR REMEDIAL ACTION AND REMEDIAL DESIGN WASTE CONTAINMENT SYSTEMS

    Science.gov (United States)

    This Technical Guidance Document is intended to augment the numerous construction quality control and construction quality assurance (CQC and CQA) documents that are available for materials associated with waste containment systems developed for Superfund site remediation. In ge...

  16. Documenting Quality Improvement and Patient Safety Efforts: The Quality Portfolio. A Statement from the Academic Hospitalist Taskforce

    OpenAIRE

    Taylor, Benjamin B.; Parekh, Vikas; Estrada, Carlos A.; Schleyer, Anneliese; Sharpe, Bradley

    2013-01-01

    Physicians increasingly investigate, work, and teach to improve the quality of care and safety of care delivery. The Society of General Internal Medicine Academic Hospitalist Task Force sought to develop a practical tool, the quality portfolio, to systematically document quality and safety achievements. The quality portfolio was vetted with internal and external stakeholders including national leaders in academic medicine. The portfolio was refined for implementation to include an outlined fr...

  17. Discharge documentation of patients discharged to subacute facilities: a three-year quality improvement process across an integrated health care system.

    Science.gov (United States)

    Gandara, Esteban; Ungar, Jonathan; Lee, Jason; Chan-Macrae, Myrna; O'Malley, Terrence; Schnipper, Jeffrey L

    2010-06-01

    Effective communication among physicians during hospital discharge is critical to patient care. Partners Healthcare (Boston) has been engaged in a multi-year process to measure and improve the quality of documentation of all patients discharged from its five acute care hospitals to subacute facilities. Partners first engaged stakeholders to develop a consensus set of 12 required data elements for all discharges to subacute facilities. A measurement process was established and later refined. Quality improvement interventions were then initiated to address measured deficiencies and included education of physicians and nurses, improvements in information technology, creation of or improvements in discharge documentation templates, training of hospitalists to serve as role models, feedback to physicians and their service chiefs regarding reviewed cases, and case manager review of documentation before discharge. To measure improvement in quality as a result of these efforts, rates of simultaneous inclusion of all 12 applicable data elements ("defect-free rate") were analyzed over time. Some 3,101 discharge documentation packets of patients discharged to subacute facilities from January 1, 2006, through September 2008 were retrospectively studied. During the 11 monitored quarters, the defect-free rate increased from 65% to 96%; improvements were seen in documentation of preadmission medication lists, allergies, follow-up, and warfarin information. Institution of rigorous measurement, feedback, and multidisciplinary, multimodal quality improvement processes improved the inclusion of data elements in discharge documentation required for safe hospital discharge across a large integrated health care system.
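    The "defect-free rate" used above is simply the share of discharge packets that contain every applicable required element. A minimal sketch of that metric follows; the element names and sample packets are invented for illustration and only a subset of the 12 elements is shown.

```python
# Minimal sketch of a "defect-free rate": fraction of packets with all required elements.
REQUIRED = {"medication_list", "allergies", "follow_up", "warfarin_info"}  # illustrative subset

def defect_free_rate(packets):
    """packets: list of dicts mapping element name -> documented (True/False)."""
    complete = sum(all(p.get(e, False) for e in REQUIRED) for p in packets)
    return complete / len(packets) if packets else 0.0

sample = [
    {"medication_list": True, "allergies": True, "follow_up": True, "warfarin_info": True},
    {"medication_list": True, "allergies": False, "follow_up": True, "warfarin_info": True},
]
print(f"defect-free rate: {defect_free_rate(sample):.0%}")
```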

  18. [Quality management and strategic consequences of assessing documentation and coding under the German Diagnostic Related Groups system].

    Science.gov (United States)

    Schnabel, M; Mann, D; Efe, T; Schrappe, M; V Garrel, T; Gotzen, L; Schaeg, M

    2004-10-01

    The introduction of the German Diagnostic Related Groups (D-DRG) system requires redesigning administrative patient management strategies. Wrong coding leads to inaccurate grouping and endangers the reimbursement of treatment costs. This situation emphasizes the roles of documentation and coding as factors in economic success. The aims of this study were to assess the quantity and quality of initial documentation and coding (ICD-10 and OPS-301) and to find operative strategies to improve efficiency and strategic means to ensure optimal documentation and coding quality. In a prospective study, documentation and coding quality were evaluated in a standardized way by weekly assessment. Clinical data from 1385 inpatients were processed for initial correctness and quality of documentation and coding. Principal diagnoses were found to be accurate in 82.7% of cases, inexact in 7.1%, and wrong in 10.1%. Effects on financial returns occurred in 16%. Based on these findings, an optimized, interdisciplinary, and multiprofessional workflow on medical documentation, coding, and data control was developed. A workflow incorporating regular assessment of documentation and coding quality is required by the DRG system to ensure efficient accounting of hospital services. Interdisciplinary and multiprofessional cooperation is recognized to be an important factor in establishing an efficient workflow in medical documentation and coding.

  19. EIA model documentation: Petroleum Market Model of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-30

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models (Public Law 94-385, section 57.b.2). The PMM models petroleum refining activities, the marketing of products, and the production of natural gas liquids and domestic methanol, and projects petroleum product prices and sources of supply for meeting demand. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption.

  20. Documentation for the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page describes the WARM documentation files and provides links to all documentation files associated with EPA’s Waste Reduction Model (WARM). The page includes a brief summary of the chapters documenting the greenhouse gas emission and energy factors.

  1. Practice and documentation of palliative sedation: a quality improvement initiative

    Science.gov (United States)

    McKinnon, M.; Azevedo, C.; Bush, S.H.; Lawlor, P.; Pereira, J.

    2014-01-01

    Background: Palliative sedation (PS), the continuous use of sedating doses of medication to intentionally reduce consciousness and relieve refractory symptoms at end of life, is ethically acceptable if administered according to standards of best practice. Procedural guidelines outlining the appropriate use of PS and the need for rigorous documentation have been developed. As a quality improvement strategy, we audited the practice and documentation of PS on our palliative care unit (PCU). Methods: A pharmacy database search of admissions in 2008 identified, for a subsequent chart review, patients who had received either a continuous infusion of midazolam (≥10 mg/24 h), regular parenteral dosing of methotrimeprazine (≥75 mg daily), or regular phenobarbital. Documentation of the decision-making process, consent, and medication use was collected using a data extraction form based on current international PS standards. Results: Interpretation and comparison of data were difficult because of an apparent lack of a consistent operational definition of PS. Patient records had no specific documentation in relation to PS initiation, to clearly identified refractory symptoms, and to informed consent in 60 (64.5%), 43 (46.2%), and 38 (40.9%) charts, respectively. Variation in the medications used was marked: 54 patients (58%) were started on a single agent and 39 (42%) on multiple agents. The 40 patients (43%) started on midazolam alone received a mean daily dose of 21.4 mg (standard deviation: 24.6 mg). Conclusions: The lack of documentation and standardized practice of PS on our PCU has prompted a quality improvement program to address those gaps. These findings also highlight the importance of conducting research and developing clinical guidelines in this area. PMID:24764700

  2. EIA model documentation: Electricity market module - electricity fuel dispatch

    International Nuclear Information System (INIS)

    1997-01-01

    This report documents the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM) as it was used for EIA's Annual Energy Outlook 1997. It replaces previous documentation dated March 1994 and subsequent yearly update revisions. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This document serves four purposes. First, it is a reference document providing a detailed description of the model for reviewers and potential users of the EFD including energy experts at the Energy Information Administration (EIA), other Federal agencies, state energy agencies, private firms such as utilities and consulting firms, and non-profit groups such as consumer and environmental groups. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports. Third, it facilitates continuity in model development by providing documentation which details model enhancements that were undertaken for AEO97 and since the previous documentation. Last, because the major use of the EFD is to develop forecasts, this documentation explains the calculations, major inputs and assumptions which were used to generate the AEO97.

  3. EIA model documentation: Electricity market module - electricity fuel dispatch

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-01

    This report documents the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM) as it was used for EIA's Annual Energy Outlook 1997. It replaces previous documentation dated March 1994 and subsequent yearly update revisions. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This document serves four purposes. First, it is a reference document providing a detailed description of the model for reviewers and potential users of the EFD including energy experts at the Energy Information Administration (EIA), other Federal agencies, state energy agencies, private firms such as utilities and consulting firms, and non-profit groups such as consumer and environmental groups. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports. Third, it facilitates continuity in model development by providing documentation which details model enhancements that were undertaken for AEO97 and since the previous documentation. Last, because the major use of the EFD is to develop forecasts, this documentation explains the calculations, major inputs and assumptions which were used to generate the AEO97.

  4. Integrity Based Access Control Model for Multilevel XML Document

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; FENG Xue-bin; HUANG Zhi; ZHENG Ming-hui

    2008-01-01

    XML's increasing popularity highlights the security demand for XML documents. A mandatory access control model for XML documents is presented on the basis of an investigation of the functional dependencies of XML documents and a discussion of the integrity properties of multilevel XML documents. Then, the algorithms for decomposing/recovering a multilevel XML document into/from single-level documents are given, and the manipulation rules for typical operations of XQuery and XUpdate - QUERY, INSERT, UPDATE, and REMOVE - are elaborated. The multilevel XML document access model can meet the requirements of sensitive information processing applications.
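    To make the decomposition idea concrete, the sketch below prunes a multilevel XML document down to a single-level view by removing elements whose security label exceeds the reader's clearance. The "level" attribute, the label ordering, and the pruning rule are illustrative assumptions, not the algorithms of the cited paper.

```python
# Rough sketch: derive a single-level view of a labeled XML document by pruning
# elements whose security label dominates the reader's clearance.
import xml.etree.ElementTree as ET

LEVELS = {"public": 0, "internal": 1, "secret": 2}  # hypothetical label lattice

def prune(element, clearance):
    """Recursively remove child elements labeled above the reader's clearance."""
    for child in list(element):
        label = LEVELS.get(child.get("level", "public"), 0)
        if label > clearance:
            element.remove(child)
        else:
            prune(child, clearance)

doc = ET.fromstring(
    '<report level="public">'
    '<summary level="public">ok to share</summary>'
    '<detail level="secret">restricted content</detail>'
    '</report>'
)
prune(doc, LEVELS["internal"])
print(ET.tostring(doc, encoding="unicode"))  # the "secret" element is gone
```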

  5. Model documentation: Renewable Fuels Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it related to the production of the 1994 Annual Energy Outlook (AEO94) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves two purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. Of these six, four are documented in the following chapters: municipal solid waste, wind, solar and biofuels. Geothermal and wood are not currently working components of NEMS. The purpose of the RFM is to define the technological and cost characteristics of renewable energy technologies, and to pass these characteristics to other NEMS modules for the determination of mid-term forecasted renewable energy demand.

  6. Document management in engineering construction

    International Nuclear Information System (INIS)

    Liao Bing

    2008-01-01

    Document management is an important part of systematic quality management, which is one of the key factors in ensuring construction quality. In engineering construction, quality management and document management must work together at all times to ensure construction quality. Quality management ensures that documents are correctly generated and adopted, so that their completeness, accuracy and systematic organization satisfy the filing requirements. Document management ensures that documents are correctly transferred during construction, and that various testimonies such as files and records are kept for the engineering construction and its quality management. This paper addresses document management in engineering construction based on the interworking of quality management and document management. (author)

  7. Modeling documents with Generative Adversarial Networks

    OpenAIRE

    Glover, John

    2016-01-01

    This paper describes a method for using Generative Adversarial Networks to learn distributed representations of natural language documents. We propose a model that is based on the recently proposed Energy-Based GAN, but instead uses a Denoising Autoencoder as the discriminator network. Document representations are extracted from the hidden layer of the discriminator and evaluated both quantitatively and qualitatively.
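    One ingredient described above, a denoising autoencoder whose hidden layer yields the document representation, can be sketched compactly. Layer sizes, the noise level, and the toy training loop below are illustrative assumptions; the GAN generator and the energy-based training of the cited paper are omitted.

```python
# Compact sketch: denoising autoencoder over bag-of-words vectors; the hidden layer
# serves as the document representation. Not the full GAN setup of the paper.
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    def __init__(self, vocab_size=2000, hidden_size=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden_size), nn.ReLU())
        self.decoder = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, noise_std=0.3):
        corrupted = x + noise_std * torch.randn_like(x)  # denoising corruption
        hidden = self.encoder(corrupted)                  # document representation
        return self.decoder(hidden), hidden

model = DenoisingAutoencoder()
docs = torch.rand(8, 2000)                                # fake bag-of-words batch
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                                        # toy training loop
    reconstruction, representation = model(docs)
    loss = nn.functional.mse_loss(reconstruction, docs)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print(representation.shape)                               # (8, 128) document embeddings
```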

  8. Semantic Document Model to Enhance Data and Knowledge Interoperability

    Science.gov (United States)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. With the aim of achieving this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents as uniquely identified and semantically annotated composite resources, that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and distant semantic document repositories for document content units (CUs) over Semantic Web protocols.
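    The machine-processable side of such a semantic document can be pictured as RDF triples describing a uniquely identified content unit. The small sketch below uses rdflib for that purpose; the namespace, property names, and URIs are invented for illustration and are not the SDM ontology itself.

```python
# Small sketch: describing a document content unit with RDF triples (illustrative vocabulary).
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

EX = Namespace("http://example.org/sdm/")          # hypothetical namespace
g = Graph()
g.bind("ex", EX)
g.bind("dcterms", DCTERMS)

cu = URIRef("http://example.org/docs/report.docx#section-2")   # a content unit
g.add((cu, RDF.type, EX.ContentUnit))
g.add((cu, DCTERMS.title, Literal("Quarterly results")))
g.add((cu, EX.partOf, URIRef("http://example.org/docs/report.docx")))

print(g.serialize(format="turtle"))
```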

  9. Model documentation report: Macroeconomic Activity Module (MAM) of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 1997 (AEO 97). The report catalogues and describes the module assumptions, computational methodology, parameter estimation techniques, and mainframe source code. This document serves three purposes. First, it is a reference document for model analysts, users, and the public, providing a detailed description of the NEMS MAM used for the AEO 1997 production runs. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. Third, it facilitates continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements as future projects.

  10. Model documentation report: Transportation sector model of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    Over the past year, several modifications have been made to the NEMS Transportation Model, incorporating greater levels of detail and analysis in modules previously represented in the aggregate or under a profusion of simplifying assumptions. This document is intended to amend those sections of the Model Documentation Report (MDR) which describe these superseded modules. Significant changes have been implemented in the LDV Fuel Economy Model, the Alternative Fuel Vehicle Model, the LDV Fleet Module, and the Highway Freight Model. The relevant sections of the MDR have been extracted from the original document, amended, and are presented in the following pages. A brief summary of the modifications follows: In the Fuel Economy Model, modifications have been made which permit the user to employ more optimistic assumptions about the commercial viability and impact of selected technological improvements. This model also explicitly calculates the fuel economy of an array of alternative fuel vehicles (AFV`s) which are subsequently used in the estimation of vehicle sales. In the Alternative Fuel Vehicle Model, the results of the Fuel Economy Model have been incorporated, and the program flows have been modified to reflect that fact. In the Light Duty Vehicle Fleet Module, the sales of vehicles to fleets of various size are endogenously calculated in order to provide a more detailed estimate of the impacts of EPACT legislation on the sales of AFV`s to fleets. In the Highway Freight Model, the previous aggregate estimation has been replaced by a detailed Freight Truck Stock Model, where travel patterns, efficiencies, and energy intensities are estimated by industrial grouping. Several appendices are provided at the end of this document, containing data tables and supplementary descriptions of the model development process which are not integral to an understanding of the overall model structure.

  11. Model documentation report: Industrial sector demand module of the National Energy Modeling System

    International Nuclear Information System (INIS)

    1997-01-01

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Model. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code. This document serves three purposes. First, it is a reference document providing a detailed description of the NEMS Industrial Model for model analysts, users, and the public. Second, this report meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. Third, it facilitates continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements as future projects. The NEMS Industrial Demand Model is a dynamic accounting model, bringing together the disparate industries and uses of energy in those industries, and putting them together in an understandable and cohesive framework. The Industrial Model generates mid-term (up to the year 2015) forecasts of industrial sector energy demand as a component of the NEMS integrated forecasting system. From the NEMS system, the Industrial Model receives fuel prices, employment data, and the value of industrial output. Based on the values of these variables, the Industrial Model passes back to the NEMS system estimates of consumption by fuel types

  12. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  13. Scientific criteria document for the development of an interim provincial water quality objective for aniline

    Energy Technology Data Exchange (ETDEWEB)

    Angelow, R.V.; Bazinet, N.

    1996-11-01

    The purpose of this document is to develop an interim provincial water quality objective for aniline for the protection of aquatic life in Ontario. It reviews the sources of aniline in the environment, its environmental fate and properties, acute and chronic toxicity as determined from results reported in the literature on toxicity tests using vertebrates and invertebrates, the bioaccumulation of aniline in the environment, mutagenic effects, and threshold aniline concentrations affecting fish odour and taste. The document then explains the derivation of the interim water quality objective. Water quality criteria for aniline developed in other jurisdictions are noted.

  14. EIA model documentation: Petroleum market model of the national energy modeling system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-28

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA`s legal obligation to provide adequate documentation in support of its models. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level.

  15. EIA model documentation: Petroleum market model of the national energy modeling system

    International Nuclear Information System (INIS)

    1995-01-01

    The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level

  16. Fisher Sand & Gravel New Mexico, Inc. General Air Quality Permit: Related Documents

    Science.gov (United States)

    Documents related to the Fisher Sand & Gravel – New Mexico, Inc., Grey Mesa Gravel Pit General Air Quality Permit for New or Modified Minor Source Stone Quarrying, Crushing, and Screening Facilities in Indian Country.

  17. Testing the Q-DIO as an instrument to measure the documented quality of nursing diagnoses, interventions, and outcomes.

    NARCIS (Netherlands)

    Muller-Staub, M.; Lunney, M.; Lavin, M.A.; Needham, I.; Odenbreit, M.; Achterberg, T. van

    2008-01-01

    PURPOSE: To describe pilot testing of Quality of Diagnoses, Interventions and Outcomes (Q-DIO), an instrument to measure quality of nursing documentation. DESIGN: Instrument testing was performed using a random, stratified sample of 60 nursing documentations representing hospital nursing with and

  18. Software quality assurance documentation for the release of NUFT 2.0 for HP platforms

    International Nuclear Information System (INIS)

    Fernandez, M.W.; Preckshot, G.G.; Johnson, G.L.

    1998-01-01

    This document is the Individual Software Plan (ISP) for version 2.0 of the Non-isothermal Unsaturated-saturated Flow and Transport (NUFT) analysis computer program. This document addresses the applicable requirements of LLNL YMP procedure 033-YMP-QP 3.2, Section 4.2.1.1. The purpose of this ISP is to plan and organize the activities required to certify the NUFT code for quality-affecting work involving problems that include cross drift analysis of the Yucca Mountain Repository facility. NUFT is software for the solution of a class of coupled mass and heat transport problems in porous geologic media, including the Yucca Mountain Repository Cross Drift Problem (YMRCDP), also known as the Enhanced Characterization of the Repository Block (ECRB). Solution of this class of problems requires a suite of multiphase, multi-component models for numerical solution of non-isothermal flow and transport in porous media, with application to subsurface contaminant transport problems and, in particular, to the hydrology in and about the Yucca Mountain Repository Site. NUFT is acquired software, as defined by 033-YMP-QP 3.2, and a preliminary baseline of source code, electronic documentation, and paper documentation has been established as required by 033-YMP-QP 3.2, Section 4.1. NUFT runs on Sun Unix platforms with Solaris operating system version 5.5 and on HP-UX with operating system version 10.20. The product to be qualified under this ISP is the version running on HP-UX; it will be labeled Version 2.0h, where the h distinguishes the HP version from possible future versions qualified for Sun or other platforms. The scope of the plans and procedures outlined in this ISP is limited to the effort required to qualify NUFT for the class of problems identified in

  19. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    Science.gov (United States)

    Jones, Corey

    2012-01-01

    Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  20. Enhancing Documentation of Pressure Ulcer Prevention Interventions: A Quality Improvement Strategy to Reduce Pressure Ulcers.

    Science.gov (United States)

    Jacobson, Therese M; Thompson, Susan L; Halvorson, Anna M; Zeitler, Kristine

    2016-01-01

    Prevention of hospital-acquired pressure ulcers requires the implementation of evidence-based interventions. A quality improvement project was conducted to provide nurses with data on the frequency with which pressure ulcer prevention interventions were performed as measured by documentation. Documentation reports provided feedback to stakeholders, triggering reminders and reeducation. Intervention reports and modifications to the documentation system were effective both in increasing the documentation of pressure ulcer prevention interventions and in decreasing the number of avoidable hospital-acquired pressure ulcers.

  1. Air pollution and public health: a guidance document for risk managers.

    Science.gov (United States)

    Craig, Lorraine; Brook, Jeffrey R; Chiotti, Quentin; Croes, Bart; Gower, Stephanie; Hedley, Anthony; Krewski, Daniel; Krupnick, Alan; Krzyzanowski, Michal; Moran, Michael D; Pennell, William; Samet, Jonathan M; Schneider, Jurgen; Shortreed, John; Williams, Martin

    2008-01-01

    This guidance document is a reference for air quality policymakers and managers providing state-of-the-art, evidence-based information on key determinants of air quality management decisions. The document reflects the findings of five annual meetings of the NERAM (Network for Environmental Risk Assessment and Management) International Colloquium Series on Air Quality Management (2001-2006), as well as the results of supporting international research. The topics covered in the guidance document reflect critical science and policy aspects of air quality risk management including i) health effects, ii) air quality emissions, measurement and modeling, iii) air quality management interventions, and iv) clean air policy challenges and opportunities.

  2. Methodology study for documentation and 3D modelling of blast induced fractures

    International Nuclear Information System (INIS)

    Olsson, Mats; Markstroem, Ingemar; Pettersson, Anders

    2008-05-01

    The purpose of this activity as part of the Zuse project was to test whether it is possible to produce a 3D model of blast induced fractures around a tunnel and also to find a methodology suitable for large scale studies. The purpose of the studies is to increase the understanding of the excavation damage zone (EDZ) and the possibility of an existing continuous EDZ along the tunnel. For the investigation, an old test area in the Q tunnel at the Aespoe Hard Rock Laboratory was selected, where slabs were excavated in 2003 to investigate the fracture pattern around the contour holes of a blasted tunnel. The rock walls of the excavated niche were studied and documented in the tunnel, while the excavated rock slabs were documented above ground. The work flow included photo documentation of both sides. The photos taken in the tunnel had to be rectified and then the fractures were vectorized automatically in a vectorization program, generating AutoCad DWG-files as output. The vectorized fractures were then moved to MicroStation/RVS where they were interpreted and connected into continuous line strings. The digitized slab and rock sides were then moved to the correct position in 3D space. Finally, a 3D model was made in RVS where the fracture traces were connected into undulating fracture planes in 3D. The conclusion is that it is possible to build a 3D model; the model is presented in Chapter 3.5. However, the age and condition of the slabs may have influenced the quality of the model in this study. The quality of a model that can be built in a future investigation, should be much better if the surveys are adapted to the investigation at hand and the slabs and rock sides are fresh and in better condition. The validity of a model depends on the density of the investigation data. There is also always a risk of over interpretation; the wish to identify a fracture from one section to the next can lead to an interpretation of the fractures as more persistent than they actually
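    The rectification step mentioned in this workflow (warping oblique tunnel photos onto a metric plane before fracture vectorization) can be illustrated with a planar homography computed from a few surveyed reference points. The sketch below uses OpenCV; the file name, point coordinates, and target scale are placeholders, not survey data from the Aespoe study.

```python
# Minimal sketch: rectify an oblique photo of a (roughly planar) rock face using a
# homography from four reference points. All coordinates below are placeholders.
import cv2
import numpy as np

image = cv2.imread("rock_face.jpg")                       # hypothetical input photo
if image is None:
    raise SystemExit("rock_face.jpg not found - supply your own photo")

pixel_pts = np.float32([[120, 80], [900, 60], [940, 700], [90, 720]])  # marker pixels
world_pts = np.float32([[0, 0], [2000, 0], [2000, 1500], [0, 1500]])   # mm on the face

H, _ = cv2.findHomography(pixel_pts, world_pts)
rectified = cv2.warpPerspective(image, H, (2000, 1500))
cv2.imwrite("rock_face_rectified.png", rectified)
```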

  3. Methodology study for documentation and 3D modelling of blast induced fractures

    Energy Technology Data Exchange (ETDEWEB)

    Olsson, Mats (Swebrec - Swedish Blasting Research Centre, Luleaa (Sweden)); Markstroem, Ingemar; Pettersson, Anders (Golder Associates (Sweden))

    2008-05-15

    The purpose of this activity as part of the Zuse project was to test whether it is possible to produce a 3D model of blast induced fractures around a tunnel and also to find a methodology suitable for large scale studies. The purpose of the studies is to increase the understanding of the excavation damage zone (EDZ) and the possibility of an existing continuous EDZ along the tunnel. For the investigation, an old test area in the Q tunnel at the Aespoe Hard Rock Laboratory was selected, where slabs were excavated in 2003 to investigate the fracture pattern around the contour holes of a blasted tunnel. The rock walls of the excavated niche were studied and documented in the tunnel, while the excavated rock slabs were documented above ground. The work flow included photo documentation of both sides. The photos taken in the tunnel had to be rectified and then the fractures were vectorized automatically in a vectorization program, generating AutoCad DWG-files as output. The vectorized fractures were then moved to MicroStation/RVS where they were interpreted and connected into continuous line strings. The digitized slab and rock sides were then moved to the correct position in 3D space. Finally, a 3D model was made in RVS where the fracture traces were connected into undulating fracture planes in 3D. The conclusion is that it is possible to build a 3D model; the model is presented in Chapter 3.5. However, the age and condition of the slabs may have influenced the quality of the model in this study. The quality of a model that can be built in a future investigation, should be much better if the surveys are adapted to the investigation at hand and the slabs and rock sides are fresh and in better condition. The validity of a model depends on the density of the investigation data. There is also always a risk of over interpretation; the wish to identify a fracture from one section to the next can lead to an interpretation of the fractures as more persistent than they actually

  4. Aircraft model prototypes which have specified handling-quality time histories

    Science.gov (United States)

    Johnson, S. H.

    1978-01-01

    Several techniques for obtaining linear constant-coefficient airplane models from specified handling-quality time histories are discussed. The pseudodata method solves the basic problem, yields specified eigenvalues, and accommodates state-variable transfer-function zero suppression. The algebraic equations to be solved are bilinear, at worst. The disadvantages are reduced generality and no assurance that the resulting model will be airplane-like in detail. The method is fully illustrated for a fourth-order stability-axis small-motion model with three lateral handling-quality time histories specified. The FORTRAN program which obtains and verifies the model is included and fully documented.
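    As a point of comparison (this is not the pseudodata method of the report), one standard way to obtain a fourth-order constant-coefficient model with specified eigenvalues is state-feedback pole placement. The sketch below does this with SciPy; the A and B matrices and the target eigenvalues are made-up illustrative values.

```python
# Illustrative alternative: place specified eigenvalues in a 4th-order linear model
# via state-feedback pole placement (not the report's pseudodata method).
import numpy as np
from scipy.signal import place_poles

A = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [-1.0, -2.0, -3.0, -4.0]])
B = np.array([[0.0], [0.0], [0.0], [1.0]])
desired = np.array([-1.0, -2.0, -0.5 + 1.0j, -0.5 - 1.0j])  # target handling-quality modes

K = place_poles(A, B, desired).gain_matrix
A_closed = A - B @ K                       # closed-loop model with the specified eigenvalues
print(np.sort(np.linalg.eigvals(A_closed)))
```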

  5. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  6. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin; Shearer, Peter; Ampuero, Jean‐Paul; Lay, Thorne

    2016-01-01

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  7. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  8. A Cognitive Model of Document Use during a Research Project. Study I. Document Selection.

    Science.gov (United States)

    Wang, Peiling; Soergel, Dagobert

    1998-01-01

    Proposes a model of document selection by real users of a bibliographic retrieval system. Reports on Part I of a longitudinal study of decision making on document use by academics (25 faculty and graduate students in Agricultural Economics). Examines what components are relevant to the users' decisions and what cognitive process may have occurred…

  9. [Document management systems to support quality management systems at university hospitals - an interview-based study].

    Science.gov (United States)

    Holderried, Martin; Bökel, Ann-Catrin; Ochsmann, Elke

    2018-05-01

    In order to save and control the processes and quality of medical services, a suitable steering system of all relevant documents is essential from the point of view of clinical quality management. Systems supporting an automated steering system of documents are called document management systems (DMS), and they also enter the healthcare sector. The use of DMS in the German healthcare sector has hardly been investigated so far. To close this knowledge gap, interviews were carried out with German university hospitals over a six-month period and subjected to a qualitative content analysis according to Mayring. In total, 25 university hospitals agreed to participate in this study, 19 of which have been working with a digital DMS for about six years on average. There was a great variety among the IT systems used. Document management and usability of the DMS as well as its integration into existing IT structures were key decision-making criteria for the selection of a digital DMS. In general, the long-term usability of the DMS is supported by regular evaluation of one's own requirements for the system, administration and training programs. In addition, DMS have a positive effect on patient safety and the quality of medical care. Copyright © 2018. Published by Elsevier GmbH.

  10. Generic safety documentation model

    International Nuclear Information System (INIS)

    Mahn, J.A.

    1994-04-01

    This document is intended to be a resource for preparers of safety documentation for Sandia National Laboratories, New Mexico facilities. It provides standardized discussions of some topics that are generic to most, if not all, Sandia/NM facilities safety documents. The material provides a "core" upon which to develop facility-specific safety documentation. The use of the information in this document will reduce the cost of safety document preparation and improve consistency of information.

  11. World Energy Projection System model documentation

    International Nuclear Information System (INIS)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product, GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.
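
    A small worked example of the accounting identity implied by the energy-intensity assumption described above: projected energy demand is projected GDP multiplied by an assumed energy intensity (energy per unit of GDP). The figures below are illustrative placeholders, not WEPS data.

      # Illustrative only -- the figures are made up, not WEPS or IEO data.
      gdp_projection = {2025: 21.0, 2030: 24.5}    # trillion dollars (assumed)
      intensity = {2025: 5.2, 2030: 4.8}           # quadrillion Btu per trillion dollars (assumed)
      energy_demand = {yr: gdp_projection[yr] * intensity[yr] for yr in gdp_projection}
      print(energy_demand)                         # projected consumption (quadrillion Btu) by year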

  12. World Energy Projection System model documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal computer based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product, GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.

  13. Working document dispersion models

    International Nuclear Information System (INIS)

    Dop, H. van

    1988-01-01

    This report summarizes the most important results, since June 1985, of the collaboration between the RIVM (Dutch National Institute for Public Health and Environment Hygiene) and the KNMI (Royal Dutch Meteorological Institute) in the domain of dispersion models. It contains a short description of the current SOx/NOx model. Furthermore, it contains recommendations for modifications of some numerical-mathematical aspects and an impetus toward a more complete description of chemical processes in the atmosphere and of the (wet) deposition process. A separate chapter is devoted to the preparation of meteorological data that are relevant for dispersion as well as for atmospheric chemistry and deposition. This report serves as a working document for the final formulation of an acidifying- and oxidant-model. (H.W.). 69 refs.; 51 figs.; 13 tabs.; 3 schemes

  14. The changes in caregivers' perceptions about the quality of information and benefits of nursing documentation associated with the introduction of an electronic documentation system in a nursing home.

    Science.gov (United States)

    Munyisia, Esther N; Yu, Ping; Hailey, David

    2011-02-01

    To date few studies have compared nursing home caregivers' perceptions about the quality of information and benefits of nursing documentation in paper and electronic formats. With the increased interest in the use of information technology in nursing homes, it is important to obtain information on the benefits of newer approaches to nursing documentation so as to inform investment, organisational and care service decisions in the aged care sector. This study aims to investigate caregivers' perceptions about the quality of information and benefits of nursing documentation before and after the introduction of an electronic documentation system in a nursing home. A self-administered questionnaire survey was conducted three months before, and then six, 18 and 31 months after the introduction of an electronic documentation system. Further evidence was obtained through informal discussions with caregivers. Scores for questionnaire responses showed that the benefits of the electronic documentation system were perceived by the caregivers as provision of more accurate, legible and complete information, and reduction of repetition in data entry, with consequential managerial benefits. However, the relevance and reliability of information, and caregivers' communication and decision-making abilities, were perceived to be similar whether an electronic or a paper-based documentation system was used. Improvement in some perceptions about the quality of information and benefits of nursing documentation was evident in the measurement conducted six months after the introduction of the electronic system, but was not maintained 18 or 31 months later. The electronic documentation system was perceived to perform better than the paper-based system in some aspects, with subsequent benefits to management of aged care services. In other areas, perceptions of additional benefits from the electronic documentation system were not maintained. In a number of attributes, there

  15. Logistic and linear regression model documentation for statistical relations between continuous real-time and discrete water-quality constituents in the Kansas River, Kansas, July 2012 through June 2015

    Science.gov (United States)

    Foster, Guy M.; Graham, Jennifer L.

    2016-04-06

    The Kansas River is a primary source of drinking water for about 800,000 people in northeastern Kansas. Source-water supplies are treated by a combination of chemical and physical processes to remove contaminants before distribution. Advanced notification of changing water-quality conditions and cyanobacteria and associated toxin and taste-and-odor compounds provides drinking-water treatment facilities time to develop and implement adequate treatment strategies. The U.S. Geological Survey (USGS), in cooperation with the Kansas Water Office (funded in part through the Kansas State Water Plan Fund), and the City of Lawrence, the City of Topeka, the City of Olathe, and Johnson County Water One, began a study in July 2012 to develop statistical models at two Kansas River sites located upstream from drinking-water intakes. Continuous water-quality monitors have been operated and discrete water-quality samples have been collected on the Kansas River at Wamego (USGS site number 06887500) and De Soto (USGS site number 06892350) since July 2012. Continuous and discrete water-quality data collected during July 2012 through June 2015 were used to develop statistical models for constituents of interest at the Wamego and De Soto sites. Logistic models to continuously estimate the probability of occurrence above selected thresholds were developed for cyanobacteria, microcystin, and geosmin. Linear regression models to continuously estimate constituent concentrations were developed for major ions, dissolved solids, alkalinity, nutrients (nitrogen and phosphorus species), suspended sediment, indicator bacteria (Escherichia coli, fecal coliform, and enterococci), and actinomycetes bacteria. These models will be used to provide real-time estimates of the probability that cyanobacteria and associated compounds exceed thresholds and of the concentrations of other water-quality constituents in the Kansas River. The models documented in this report are useful for characterizing changes
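
    A minimal sketch (not the USGS implementation) of the two model types described above: a logistic model estimating the probability that a constituent exceeds a threshold, and a linear regression estimating a concentration, both driven by continuous surrogate readings. The file name, column names, and the geosmin threshold are illustrative assumptions.

      # Sketch only: hypothetical paired data set of continuous monitor readings matched
      # to discrete laboratory samples; all names and thresholds are assumptions.
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LinearRegression, LogisticRegression

      df = pd.read_csv("kansas_river_paired_samples.csv")                 # assumed file
      X = df[["turbidity", "specific_conductance", "water_temp"]].to_numpy()

      # Logistic model: probability that geosmin exceeds a taste-and-odor threshold.
      exceeds = (df["geosmin_ng_per_L"] > 5.0).astype(int)                # illustrative threshold
      logit = LogisticRegression().fit(X, exceeds)
      p_exceed = logit.predict_proba(X)[:, 1]                             # real-time probability estimate

      # Linear model: continuous estimate of a constituent concentration
      # (log-transformed, as is common in water-quality surrogate regressions).
      y = np.log10(df["total_nitrogen_mg_per_L"])
      linreg = LinearRegression().fit(X, y)
      tn_estimate = 10 ** linreg.predict(X)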

  16. [Design and development of a document management system for managing Standard Operational Procedure documents]

    Directory of Open Access Journals (Sweden)

    I Putu Susila Handika

    2017-09-01

    A Standard Operational Procedure (SOP) is an important document in a company because it helps improve the company's quality. PT. Global Retailindo Pratama is a retail company that uses the ISO 9001:2008 quality management standard. Currently, SOP documents at PT. Global Retailindo Pratama are still managed manually. The manual approach causes problems: the search and document distribution processes take quite a long time. This research aims to design and build a Document Management System to manage SOP documents. The system development model used in this research is the prototyping model. The application is web-based, with PHP as the programming language. Testing the application with Black Box Testing and Usability Testing shows that the Document Management System runs in accordance with the requirements and is easy to use, so that the SOP document management process becomes faster. Keywords: Document Management System, Standard Operational Procedure, Information System, PHP.

  17. Documentation of quality improvement exposure by internal medicine residency applicants.

    Science.gov (United States)

    Kolade, Victor O; Sethi, Anuradha

    2016-01-01

    Quality improvement (QI) has become an essential component of medical care in the United States. In residency programs, QI is a focus area of the Clinical Learning Environment Review visits conducted by the Accreditation Council for Graduate Medical Education. The readiness of applicants to internal medicine residency to engage in QI on day one is unknown. To document the reporting of QI training or experience in residency applications. Electronic Residency Application Service applications to a single internal medicine program were reviewed individually looking for reported QI involvement or actual projects in the curriculum vitae (CVs), personal statements (PSs), and letters of recommendation (LORs). CVs were also reviewed for evidence of education in QI such as completion of Institute for Healthcare Improvement (IHI) modules. Of 204 candidates shortlisted for interview, seven had QI items on their CVs, including one basic IHI certificate. Three discussed their QI work in their PSs, and four had recommendation letters describing their involvement in QI. One applicant had both CV and LOR evidence, so that 13 (6%) documented QI engagement. Practice of or instruction in QI is rarely mentioned in application documents of prospective internal medicine interns.

  18. Documentation package for the RFID temperature monitoring system (Model 9977 packages at NTS)

    International Nuclear Information System (INIS)

    Chen, K.; Tsai, H.

    2009-01-01

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system, depicted in the figure below, consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system, or it can be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The table of contents is: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  19. Multimodal document management in radiotherapy

    International Nuclear Information System (INIS)

    Fahrner, H.; Kirrmann, S.; Roehner, F.; Schmucker, M.; Hall, M.; Heinemann, F.

    2013-01-01

    Background and purpose: After incorporating treatment planning and the organisational model of treatment planning in the operating schedule system (BAS, 'Betriebsablaufsystem'), complete document qualities were embedded in the digital environment. The aim of this project was to integrate all documents independent of their source (paper-bound or digital) and to make content from the BAS available in a structured manner. As many workflow steps as possible should be automated, e.g. assigning a document to a patient in the BAS. Additionally, it must be guaranteed that it can be traced at all times who imported documents into the departmental system, when, how, and from which source. Furthermore, work procedures should be changed so that documentation conducted either directly in the departmental system or in external systems can be incorporated digitally and paper documents can be completely avoided (e.g. documents such as treatment certificates, treatment plans or documentation). It was a further aim, if possible, to automate the removal of paper documents from the departmental workflow, or even to make such paper documents superfluous. In this way, patient letters for follow-up appointments should be generated automatically from the BAS. Similarly, patient record extracts in the form of PDF files should be enabled, e.g. for controlling purposes. Method: The available document qualities were analysed in detail by a multidisciplinary working group (BAS-AG) and, after this examination and assessment of the possibility of modelling them in our departmental workflow (BAS), they were transcribed into a flow diagram. The gathered specifications were implemented in a test environment by the clinical and administrative IT group of the department of radiation oncology and, subsequent to a detailed analysis, introduced into clinical routine. Results: The department has succeeded, under the aforementioned criteria, in embedding all relevant documents in the departmental

  20. [Efficient document management by introduction of a new, dynamic quality manual application according to DIN EN ISO 9001:2000].

    Science.gov (United States)

    Pache, G; Saueressig, U; Baumann, T; Dürselen, L; Langer, M; Kotter, E

    2008-06-01

    Evaluation of the impact of a new, dynamic computer-aided quality manual application (QMA) regarding the acceptance and efficiency of a quality management system (QMS) according to DIN EN ISO 9001:2000. The QMA combines static pages of HTML with active content generated from an underlying database. Through user access rights, a hierarchy is defined to create and administer quality documents. Document workflow, feedback management and an employee survey were analyzed to compare the performance of the new QMA with the formerly used static version. Integration of a document editor and automated document re-approval accelerated the document process by an average of 10 min. In spite of a 60% increase in yearly document changes, the administration effort was reduced by approximately 160 h. Integration of the feedback management system into the QMA decreased handling time from an average of 16.5 to 3.4 days. Simultaneously, the number of feedback messages increased from 160 in 2005 to 306 in 2006. Employee satisfaction improved (old: 3.19+/-1.02, new: 1.91+/-0.8). The number of users who used the QMA more than once a week also increased from 29.5% to 60%. The computer-aided quality manual application constitutes the basis for the success of our QMS. The possibility to actively participate in the quality management process has led to broad acceptance and usage by the employees. The administration effort decreased tremendously compared with a conventional QMS.

  1. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations operate in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and a suitable way to describe them is the BPMN notation. Once processes have been described in BPMN, they should be checked to ensure their expected quality. A system (which could be automated) based on mathematical expressions of qualitative characteristics of process models (i.e., measures of quality of process models) can support such process checks. Our research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe this system, based on measures of the quality of process models, and to answer the associated scientific questions.
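
    As an illustration of how such measures can be computed automatically, the sketch below parses a BPMN 2.0 XML file and derives generic size and complexity counts; the concrete measures proposed by the authors are not reproduced here, and the model file name is hypothetical.

      # Sketch: generic size/complexity counts from a BPMN 2.0 XML file; the metrics
      # actually proposed by the authors are not reproduced here.
      import xml.etree.ElementTree as ET

      BPMN_NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

      def bpmn_size_metrics(path):
          root = ET.parse(path).getroot()
          tasks = root.findall(".//bpmn:task", BPMN_NS)
          gateways = (root.findall(".//bpmn:exclusiveGateway", BPMN_NS)
                      + root.findall(".//bpmn:parallelGateway", BPMN_NS)
                      + root.findall(".//bpmn:inclusiveGateway", BPMN_NS))
          flows = root.findall(".//bpmn:sequenceFlow", BPMN_NS)
          gateway_ids = {g.get("id") for g in gateways}
          # Rough control-flow complexity: sequence flows whose source is a gateway.
          cfc = sum(1 for f in flows if f.get("sourceRef") in gateway_ids)
          return {"tasks": len(tasks), "gateways": len(gateways),
                  "sequence_flows": len(flows), "control_flow_complexity": cfc}

      print(bpmn_size_metrics("order_process.bpmn"))   # hypothetical model file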

  2. Definition and documentation of engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, G.W. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial is an extract of a two-day workshop developed under the auspices of the Quality Engineering Department at Sandia National Laboratories. The presentation starts with basic definitions and addresses why processes should be defined and documented. It covers three primary topics: (1) process considerations and rationale, (2) approach to defining and documenting engineering processes, and (3) an IDEF0 model of the process for defining engineering processes.

  3. Assessment model validity document FARF31

    International Nuclear Information System (INIS)

    Elert, Mark; Gylling Bjoern; Lindgren, Maria

    2004-08-01

    -fractures with flowing water and rock with porosity accessible only by diffusion. The approach furthermore assumes that the properties within the two porosity domains are averaged and also that the transfer between the two domains is averaged. It is an important validation issue to verify that effective averaging of parameters can be performed and that suitable values can be derived. It can be shown that matrix interaction properties along a flow path can be integrated to an effective value and, if the matrix depth can be considered infinite, effective values may also be derived for the diffusion and sorption parameters. Thus, it is possible to derive effective parameters for sorbing radionuclides incorporating the total matrix effects along a flow path. This is strictly valid only for cases with no dispersion, but gives a good approximation as long as dispersion does not dominate the transport. FARF31 has been tested and compared with analytical solutions and other models and was found to correspond well within a wide range of input parameters. Support and documentation on how to use FARF31 are two important components to avoid calculation mistakes and obtain trustworthy results. The documentation describes handling and updates of the code. Test cases have been constructed which can be used to check updates and be used as templates. The development of the code is kept under source code control to fulfil quality assurance. The model is deemed to be well suited for performance assessments within the SKB framework

  4. The quality of paper-based versus electronic nursing care plan in Australian aged care homes: A documentation audit study.

    Science.gov (United States)

    Wang, Ning; Yu, Ping; Hailey, David

    2015-08-01

    The nursing care plan plays an essential role in supporting care provision in Australian aged care. The implementation of electronic systems in aged care homes was anticipated to improve documentation quality. Standardized nursing terminologies, developed to improve communication and advance the nursing profession, are not required in aged care practice. The language used by nurses in the nursing care plan and the effect of the electronic system on documentation quality in residential aged care need to be investigated. To describe documentation practice for the nursing care plan in Australian residential aged care homes and to compare the quantity and quality of documentation in paper-based and electronic nursing care plans. A nursing documentation audit was conducted in seven residential aged care homes in Australia. One hundred and eleven paper-based and 194 electronic nursing care plans, conveniently selected, were reviewed. The quantity of documentation in a care plan was determined by the number of phrases describing a resident problem and the number of goals and interventions. The quality of documentation was measured using 16 relevant questions in an instrument developed for the study. There was a tendency to omit 'nursing problem' or 'nursing diagnosis' in the nursing process by changing these terms (used in the paper-based care plan) to 'observation' in the electronic version. The electronic nursing care plan documented more signs and symptoms of resident problems and evaluation of care than the paper-based format (48.30 vs. 47.34 out of 60). Omission of the nursing problem or diagnosis from the nursing process may reflect a range of factors behind the practice that need to be understood. Further work is also needed on qualitative aspects of the nurse care plan, nurses' attitudes towards standardized terminologies and the effect of different documentation practice on care quality and resident outcomes. Copyright

  5. Modeling and Negotiating Service Quality

    Science.gov (United States)

    Benbernou, Salima; Brandic, Ivona; Cappiello, Cinzia; Carro, Manuel; Comuzzi, Marco; Kertész, Attila; Kritikos, Kyriakos; Parkin, Michael; Pernici, Barbara; Plebani, Pierluigi

    In this chapter the research problems of specifying and negotiating QoS and its corresponding quality documents are analyzed. For this reason, this chapter is separated into two main sections, Section 6.1 and 6.2, with each dedicated to one of the two problems, i.e., QoS specification and negotiation, respectively. Each section has a similar structure: they first introduce the problem and then, in the remaining subsections, review related work. Finally, the chapter ends with Section 6.3, which identifies research gaps and presents potential research challenges in QoS modelling, specification and negotiation.

  6. Delivery of completed irradiation vehicles and the quality assurance document to the High Flux Isotope Reactor for irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Petrie, Christian M. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States); McDuffee, Joel Lee [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States); Katoh, Yutai [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States); Terrani, Kurt A. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)

    2015-10-01

    This report details the initial fabrication and delivery of two Fuel Cycle Research and Development (FCRD) irradiation capsules (ATFSC01 and ATFSC02), with associated quality assurance documentation, to the High Flux Isotope Reactor (HFIR). The capsules and documentation were delivered by September 30, 2015, thus meeting the deadline for milestone M3FT-15OR0202268. These irradiation experiments are testing silicon carbide composite tubes in order to obtain experimental validation of thermo-mechanical models of stress states in SiC cladding irradiated under a prototypic high heat flux. This document contains a copy of the completed capsule fabrication request sheets, which detail all constituent components, pertinent drawings, etc., along with a detailed summary of the capsule assembly process performed by the Thermal Hydraulics and Irradiation Engineering Group (THIEG) in the Reactor and Nuclear Systems Division (RNSD). A complete fabrication package record is maintained by the THIEG and is available upon request.

  7. Efficient document management by introduction of a new, dynamic quality manual application according to DIN EN ISO 9001:2000

    International Nuclear Information System (INIS)

    Pache, G.; Saueressig, U.; Baumann, T.; Langer, M.; Kotter, E.

    2008-01-01

    Purpose: Evaluation of the impact of a new, dynamic computer-aided quality manual application (QMA) regarding the acceptance and efficiency of a quality management system (QMS) according to DIN EN ISO 9001:2000. Materials and Method: The QMA combines static pages of HTML with active content generated from an underlying database. Through user access rights, a hierarchy is defined to create and administer quality documents. Document workflow, feedback management and an employee survey were analyzed to compare the performance of the new QMA with the formerly used static version. Results: Integration of a document editor and automated document re-approval accelerated the document process by an average of 10 min. In spite of a 60% increase in yearly document changes, the administration effort was reduced by approximately 160 h. Integration of the feedback management system into the QMA decreased handling time from an average of 16.5 to 3.4 days. Simultaneously, the number of feedback messages increased from 160 in 2005 to 306 in 2006. Employee satisfaction improved (old: 3.19±1.02, new: 1.91±0.8). The number of users who used the QMA more than once a week also increased from 29.5% to 60%. Conclusion: The computer-aided quality manual application constitutes the basis for the success of our QMS. The possibility to actively participate in the quality management process has led to broad acceptance and usage by the employees. The administration effort decreased tremendously compared with a conventional QMS. (orig.)

  8. A Conceptual Model for Multidimensional Analysis of Documents

    Science.gov (United States)

    Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurlfluh, Gilles

    Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.

  9. Shoulder dystocia documentation: an evaluation of a documentation training intervention.

    Science.gov (United States)

    LeRiche, Tammy; Oppenheimer, Lawrence; Caughey, Sharon; Fell, Deshayne; Walker, Mark

    2015-03-01

    To evaluate the quality and content of nurse and physician shoulder dystocia delivery documentation before and after MORE training in shoulder dystocia management skills and documentation. Approximately 384 charts at the Ottawa Hospital General Campus involving a diagnosis of shoulder dystocia between the years of 2000 and 2006 excluding the training year of 2003 were identified. The charts were evaluated for 14 key components derived from a validated instrument. The delivery notes were then scored based on these components by 2 separate investigators who were blinded to delivery note author, date, and patient identification to further quantify delivery record quality. Approximately 346 charts were reviewed for physician and nurse delivery documentation. The average score for physician notes was 6 (maximum possible score of 14) both before and after the training intervention. The nurses' average score was 5 before and after the training intervention. Negligible improvement was observed in the content and quality of shoulder dystocia documentation before and after nurse and physician training.

  10. Motor Gasoline Market Model documentation report

    International Nuclear Information System (INIS)

    1993-09-01

    The purpose of this report is to define the objectives of the Motor Gasoline Market Model (MGMM), describe its basic approach and to provide detail on model functions. This report is intended as a reference document for model analysts, users, and the general public. The MGMM performs a short-term (6- to 9-month) forecast of demand and price for motor gasoline in the US market; it also calculates end of month stock levels. The model is used to analyze certain market behavior assumptions or shocks and to determine the effect on market price, demand and stock level

  11. Document Categorization with Modified Statistical Language Models for Agglutinative Languages

    Directory of Open Access Journals (Sweden)

    Tantug

    2010-11-01

    In this paper, we investigate the document categorization task with statistical language models. Our study mainly focuses on categorization of documents in agglutinative languages. Due to the productive morphology of agglutinative languages, the number of word forms encountered in naturally occurring text is very large. From the language modeling perspective, a large vocabulary results in serious data sparseness problems. In order to cope with this drawback, previous studies in various application areas suggest modified language models based on different morphological units. It is reported that performance improvements can be achieved with these modified language models. In our document categorization experiments, we use standard word form based language models as well as other modified language models based on root words, root words and part-of-speech information, truncated word forms and character sequences. Additionally, to find an optimum parameter set, multiple tests are carried out with different language model orders and smoothing methods. Similar to previous studies on other tasks, our experimental results on categorization of Turkish documents reveal that applying linguistic preprocessing steps for language modeling provides improvements over standard language models to some extent. However, it is also observed that a similar level of performance improvement can be achieved by simpler character-level or truncated word form models, which are language independent.
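
    A minimal sketch of the per-category language-model scoring idea described above, using add-one-smoothed word unigrams; the paper's morphological variants (root words, truncated forms, character sequences) and higher-order n-grams would plug into the same scoring scheme. The toy categories and training sentences are invented for illustration.

      # Sketch: per-category add-one-smoothed unigram language models; a document is
      # assigned to the category whose model gives it the highest log-likelihood.
      import math
      from collections import Counter

      def train_lm(docs):
          counts = Counter(w for d in docs for w in d.split())
          return counts, sum(counts.values()), len(counts) + 1

      def log_prob(doc, lm):
          counts, total, vocab = lm
          return sum(math.log((counts[w] + 1) / (total + vocab)) for w in doc.split())

      def classify(doc, lms):
          return max(lms, key=lambda label: log_prob(doc, lms[label]))

      # Toy categories and sentences, invented for illustration.
      lms = {"sports": train_lm(["the match ended in a draw", "the team won the cup"]),
             "finance": train_lm(["the market fell sharply", "shares of the bank rose"])}
      print(classify("the team lost the match", lms))   # -> 'sports'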

  12. Fast words boundaries localization in text fields for low quality document images

    Science.gov (United States)

    Ilin, Dmitry; Novikov, Dmitriy; Polevoy, Dmitry; Nikolaev, Dmitry

    2018-04-01

    The paper examines the problem of precise localization of word boundaries in document text zones. Document processing on a mobile device consists of document localization, perspective correction, localization of individual fields, finding words in separate zones, segmentation and recognition. When capturing an image with a mobile digital camera under uncontrolled conditions, digital noise, perspective distortions or glares may occur. Further document processing is complicated by the specifics of documents: layout elements, complex backgrounds, static text, document security elements, and a variety of text fonts. Moreover, the problem of word boundary localization has to be solved at runtime on a mobile CPU with limited computing capabilities under the specified restrictions. At the moment, there are several groups of methods optimized for different conditions. Methods for scanned printed text are fast but work only on high-quality images. Methods for text in the wild have an excessively high computational complexity and are thus hardly suitable for running on mobile devices as part of a mobile document recognition system. The method presented in this paper solves a more specialized problem than the task of finding text in natural images. It uses local features, a sliding window and a lightweight neural network in order to achieve an optimal speed-precision ratio. The algorithm takes 12 ms per field running on an ARM processor of a mobile device. The error rate for boundary localization on a test sample of 8000 fields is 0.3
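
    An illustrative sketch of the overall pipeline (a local feature computed in a sliding window, a per-window decision, and grouping of gap runs into boundaries). The authors use a lightweight neural network as the per-window classifier; here it is replaced by a simple ink-density threshold, and the toy image and parameter values are assumptions.

      # Sketch of the pipeline: a per-column ink-density feature smoothed by a sliding
      # window, a per-window "gap" decision (threshold instead of the authors' network),
      # and grouping of gap runs into word boundaries.
      import numpy as np

      def word_boundaries(binary_line, window=5, gap_thresh=0.02, min_gap=4):
          """binary_line: 2-D array (H x W), 1 = ink, 0 = background."""
          density = binary_line.mean(axis=0)                       # local feature per column
          smoothed = np.convolve(density, np.ones(window) / window, mode="same")
          is_gap = smoothed < gap_thresh                           # per-window decision
          boundaries, start = [], None
          for x, g in enumerate(is_gap):
              if g and start is None:
                  start = x
              elif not g and start is not None:
                  if x - start >= min_gap:
                      boundaries.append((start, x))
                  start = None
          return boundaries

      line = np.zeros((32, 200), dtype=np.uint8)                   # toy image with two "words"
      line[8:24, 10:80] = 1
      line[8:24, 120:190] = 1
      print(word_boundaries(line))                                 # left margin and the inter-word gap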

  13. [Recommendations for the control of documents and the establishment of a documentary system].

    Science.gov (United States)

    Vinner, E

    2013-06-01

    The quality management system that must be implemented in a medical biology laboratory (MBL) to meet the requirements of the standard NF EN ISO 15189 is based, among other things, on the creation and use by staff of an approved and updated documentary system. This documentary system consists of external documents (standards, suppliers' documents...) and internal documents (quality manual, procedures, instructions, technical and quality records...). A procedure for controlling the documentary system must be formalized. The documentary system should be modeled in order to identify the various procedures to be drafted and the risks incurred if a document were missing from this system. Each document must be indexed in a unique way, and document management must be carried out rigorously. The use of document management software is a great help in managing the life cycle of documents.

  14. Documentation package for the RFID temperature monitoring system (of Model 9977 packages at NTS).

    Energy Technology Data Exchange (ETDEWEB)

    Chen, K.; Tsai, H.; Decision and Information Sciences

    2009-02-20

    The technical basis for extending the Model 9977 shipping package periodic maintenance beyond the one-year interval to a maximum of five years is based on the performance of the O-ring seals and the environmental conditions. The DOE Packaging Certification Program (PCP) has tasked Argonne National Laboratory to develop a Radio-Frequency Identification (RFID) temperature monitoring system for use by the facility personnel at DAF/NTS. The RFID temperature monitoring system, depicted in the figure below, consists of the Mk-1 RFID tags, a reader, and a control computer mounted on a mobile platform that can operate as a stand-alone system, or it can be connected to the local IT network. As part of the Conditions of Approval of the CoC, the user must complete the prescribed training to become qualified and be certified for operation of the RFID temperature monitoring system. The training course will be administered by Argonne National Laboratory on behalf of the Headquarters Certifying Official. This is a complete documentation package for the RFID temperature monitoring system of the Model 9977 packagings at NTS. The documentation package will be used for training and certification. The table of contents is: Acceptance Testing Procedure of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Acceptance Testing Result of MK-1 RFID Tags for DOE/EM Nuclear Materials Management Applications; Performance Test of the Single Bolt Seal Sensor for the Model 9977 Packaging; Calibration of Built-in Thermistors in RFID Tags for Nevada Test Site; Results of Calibration of Built-in Thermistors in RFID Tags; Results of Thermal Calibration of Second Batch of MK-I RFID Tags; Procedure for Installing and Removing MK-1 RFID Tag on Model 9977 Drum; User Guide for RFID Reader and Software for Temperature Monitoring of Model 9977 Drums at NTS; Software Quality Assurance Plan (SQAP) for the ARG-US System; Quality Category for the RFID Temperature Monitoring System; The

  15. Documentation of the petroleum market model (PMM). Appendix: Model developer's report

    International Nuclear Information System (INIS)

    1994-01-01

    The Office of Integrated Analysis and Forecasting (OIAF) is required to provide complete model documentation to meet the EIA Model Acceptance Standards. The EIA Model Documentation: Petroleum Market Model of the National Energy Modeling System provides a complete description of the Petroleum Market Model's (PMM) methodology and its relation to other modules in the National Energy Modeling System (NEMS). This Model Developer's Report (MDR) serves as an appendix to the methodology documentation and provides an assessment of the sensitivity of PMM results to changes in input data. The MDR analysis for PMM is performed by varying several sets of input variables one-at-a-time and examining the effect on a set of selected output variables. The analysis is based on stand-alone, rather than integrated, National Energy Modeling System (NEMS) runs. This means that other NEMS modules are not responding to PMM outputs. The PMM models petroleum refining and marketing. The purpose of the PMM is to project petroleum product prices, refining activities, and movements of petroleum into the United States and among domestic regions. In addition, the PMM estimates capacity expansion and fuel consumption in the refining industry. The PMM is also used to analyze a wide variety of petroleum-related issues and policies, in order to foster better understanding of the petroleum refining and marketing industry and the effects of certain policies and regulations. The PMM simulates the operation of petroleum refineries in the United States, including the supply and transportation of crude oil to refineries, the regional processing of these raw materials into petroleum products, and the distribution of petroleum products to meet regional demands. The essential outputs of this model are product prices, a petroleum supply/demand balance, demands for refinery fuel use, and capacity expansion
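
    A generic sketch of the one-at-a-time sensitivity procedure described above: each input set is perturbed in turn, the model is re-run, and the change in the selected outputs is recorded. The run_model() stand-in and all variable names are placeholders, not the actual PMM interface.

      # Generic one-at-a-time sensitivity loop; run_model() and all variable names are
      # placeholders, not the actual PMM interface.
      def one_at_a_time(run_model, baseline_inputs, perturbations, outputs_of_interest):
          base = run_model(baseline_inputs)
          results = {}
          for name, perturbed_value in perturbations.items():
              scenario = dict(baseline_inputs, **{name: perturbed_value})
              run = run_model(scenario)
              results[name] = {k: run[k] - base[k] for k in outputs_of_interest}
          return results

      def run_model(inputs):   # toy stand-in for the refinery model
          return {"gasoline_price": 1.0 + 0.02 * inputs["crude_price"] - 0.1 * inputs["capacity"]}

      print(one_at_a_time(run_model,
                          {"crude_price": 60.0, "capacity": 1.0},
                          {"crude_price": 66.0},
                          ["gasoline_price"]))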

  16. SANSMIC design document.

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Paula D. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [GRAM, Inc., Albuquerque, NM (United States)

    2015-07-01

    The United States Strategic Petroleum Reserve (SPR) maintains an underground storage system consisting of caverns that were leached or solution mined in four salt domes located near the Gulf of Mexico in Texas and Louisiana. The SPR comprises more than 60 active caverns containing approximately 700 million barrels of crude oil. Sandia National Laboratories (SNL) is the geotechnical advisor to the SPR. As the most pressing need at the inception of the SPR was to create and fill storage volume with oil, the decision was made to leach the caverns and fill them simultaneously (leach-fill). Therefore, A.J. Russo developed SANSMIC in the early 1980s, which allows for a transient oil-brine interface (OBI), making it possible to model leach-fill and withdrawal operations. As the majority of caverns are currently filled to storage capacity, the primary uses of SANSMIC at this time are related to the effects of small and large withdrawals, expansion of existing caverns, and projecting future pillar-to-diameter ratios. SANSMIC was identified by SNL as a priority candidate for qualification. This report continues the quality assurance (QA) process by documenting the "as built" mathematical and numerical models that comprise this document. The program flow is outlined and the models are discussed in detail. Code features that were added later or were not documented previously have been expounded. No changes in the code's physics have occurred since the original documentation (Russo, 1981, 1983), although recent experiments may yield improvements to the temperature and plume methods in the future.

  17. Concordance Between Veterans' Self-Report and Documentation of Surrogate Decision Makers: Implications for Quality Measurement.

    Science.gov (United States)

    Garner, Kimberly K; Dubbert, Patricia; Lensing, Shelly; Sullivan, Dennis H

    2017-01-01

    The Measuring What Matters initiative of the American Academy of Hospice and Palliative Medicine and the Hospice and Palliative Nurses Association identified documentation of a surrogate decision maker as one of the top 10 quality indicators in the acute hospital and hospice settings. To better understand the potential implementation of this Measuring What Matters quality measure #8, Documentation of Surrogate, in outpatient primary care settings by describing primary care patients' self-reported identification and documentation of a surrogate decision maker. Examination of patient responses to self-assessment questions from advance health care planning educational groups conducted in one medical center primary care clinic and seven community-based outpatient primary care clinics. We assessed the concordance between patient reports of identifying and naming a surrogate decision maker and having completed an advance directive (AD) with the presence of an AD in the electronic medical record. Of veterans without a documented AD on file, more than half (66%) reported that they had talked with someone they trusted and nearly half (52%) reported that they had named someone to communicate their preferences. Our clinical project data suggest that many more veterans may have initiated communications with surrogate decision makers than is evident in the electronic medical record. System changes are needed to close the gap between veterans' plans for a surrogate decision maker and the documentation available to acute care health care providers. Published by Elsevier Inc.

  18. Experimental determination of chosen document elements parameters from raster graphics sources

    Directory of Open Access Journals (Sweden)

    Jiří Rybička

    2010-01-01

    The visual appearance of documents and their formal quality are considered to be as important as the quality of the content. The formal and typographical quality of documents can be evaluated by an automated system that processes raster images of documents. A document is described by a formal model that treats a page as an object and also as a set of elements, where page elements include text and graphic objects. All elements are described by parameters that depend on the element type. For future evaluation, mainly text objects are important. This paper describes the experimental determination of selected document element parameters from raster images. Image processing techniques are used, in which an image is represented as a matrix of dots and parameter values are extracted. Algorithms for parameter extraction from raster images were designed, aimed mainly at typographical parameters such as indentation, alignment, font size or spacing. The algorithms were tested on a set of 100 images of paragraphs or pages and provide very good results. The extracted parameters can be used directly for typographical quality evaluation.
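
    A minimal sketch, assuming a cleanly binarized paragraph image, of extracting two of the typographical parameters mentioned above (left indent of the first line and line-to-line spacing) from projection profiles; it is not the authors' algorithm, only an illustration of the kind of extraction involved.

      # Sketch: first-line indent and line spacing estimated from projection profiles
      # of a binarized paragraph image (assumption: clean binarization, 1 = ink).
      import numpy as np

      def indent_and_spacing(binary_page):
          """binary_page: 2-D array (H x W), 1 = ink, 0 = background."""
          lines, start = [], None
          for y, has_ink in enumerate(binary_page.sum(axis=1) > 0):
              if has_ink and start is None:
                  start = y
              elif not has_ink and start is not None:
                  lines.append((start, y))
                  start = None
          if start is not None:
              lines.append((start, binary_page.shape[0]))
          if not lines:
              return None, None
          top, bottom = lines[0]
          first_line_cols = binary_page[top:bottom].sum(axis=0)
          indent_px = int(np.argmax(first_line_cols > 0))          # first ink column of line 1
          tops = [t for t, _ in lines]
          spacing_px = float(np.mean(np.diff(tops))) if len(tops) > 1 else 0.0
          return indent_px, spacing_px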

  19. Pocket change: a simple educational intervention increases hospitalist documentation of comorbidities and improves hospital quality performance measures.

    Science.gov (United States)

    Sparks, Rachel; Salskov, Alex H; Chang, Anita S; Wentworth, Kelly L; Gupta, Pritha P; Staiger, Thomas O; Anawalt, Bradley D

    2015-01-01

    Complete documentation of patient comorbidities in the medical record is important for clinical care, hospital reimbursement, and quality performance measures. We designed a pocket card reminder and brief educational intervention aimed at hospitalists with the goal of improving documentation of 6 common comorbidities present on admission: coagulation abnormalities, metastatic cancer, anemia, fluid and electrolyte abnormalities, malnutrition, and obesity. Two internal medicine inpatient teams led by 10 hospitalist physicians at an academic medical center received the educational intervention and pocket card reminder (n = 520 admissions). Two internal medicine teams led by nonhospitalist physicians served as a control group (n = 590 admissions). Levels of documentation of 6 common comorbidities, expected length of stay, and expected mortality were measured at baseline and during the 9-month study period. The intervention was associated with increased documentation of anemia, fluid and electrolyte abnormalities, malnutrition, and obesity in the intervention group, both compared to baseline and compared to the control group during the study period. The expected length of stay increased in the intervention group during the study period. A simple educational intervention and pocket card reminder were associated with improved documentation and hospital quality measures at an academic medical center.

  20. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically generates two documents: the test requirements specification and the configuration item test documentation. The method allows dynamic test requirements to be implemented in dynamic models, so that dynamic test demand tracking can be generated easily. It automatically produces standardized test requirements and test documentation, addresses inconsistency and incompleteness in document-related content, and improves efficiency.

  1. Model documentation Renewable Fuels Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-01-01

    This report documents the objectives, analytical approach and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1996 Annual Energy Outlook forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described.

  2. Documentation of TRU biological transport model (BIOTRAN)

    Energy Technology Data Exchange (ETDEWEB)

    Gallegos, A.F.; Garcia, B.J.; Sutton, C.M.

    1980-01-01

    Inclusive of Appendices, this document describes the purpose, rationale, construction, and operation of a biological transport model (BIOTRAN). This model is used to predict the flow of transuranic elements (TRU) through specified plant and animal environments using biomass as a vector. The appendices are: (A) Flows of moisture, biomass, and TRU; (B) Intermediate variables affecting flows; (C) Mnemonic equivalents (code) for variables; (D) Variable library (code); (E) BIOTRAN code (Fortran); (F) Plants simulated; (G) BIOTRAN code documentation; (H) Operating instructions for BIOTRAN code. The main text is presented with a specific format which uses a minimum of space, yet is adequate for tracking most relationships from their first appearance to their formulation in the code. Because relationships are treated individually in this manner, and rely heavily on Appendix material for understanding, it is advised that the reader familiarize himself with these materials before proceeding with the main text.

  3. Documentation of TRU biological transport model (BIOTRAN)

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Garcia, B.J.; Sutton, C.M.

    1980-01-01

    Inclusive of Appendices, this document describes the purpose, rationale, construction, and operation of a biological transport model (BIOTRAN). This model is used to predict the flow of transuranic elements (TRU) through specified plant and animal environments using biomass as a vector. The appendices are: (A) Flows of moisture, biomass, and TRU; (B) Intermediate variables affecting flows; (C) Mnemonic equivalents (code) for variables; (D) Variable library (code); (E) BIOTRAN code (Fortran); (F) Plants simulated; (G) BIOTRAN code documentation; (H) Operating instructions for BIOTRAN code. The main text is presented with a specific format which uses a minimum of space, yet is adequate for tracking most relationships from their first appearance to their formulation in the code. Because relationships are treated individually in this manner, and rely heavily on Appendix material for understanding, it is advised that the reader familiarize himself with these materials before proceeding with the main text

  4. Evaluation of the functional performance and technical quality of an Electronic Documentation System of the Nursing Process.

    Science.gov (United States)

    de Oliveira, Neurilene Batista; Peres, Heloisa Helena Ciqueto

    2015-01-01

    To evaluate the functional performance and the technical quality of the Electronic Documentation System of the Nursing Process of the Teaching Hospital of the University of São Paulo. This was an exploratory-descriptive study. The Quality Model of regulatory standard 25010 and the Evaluation Process defined under regulatory standard 25040, both of the International Organization for Standardization/International Electrotechnical Commission, were used. The quality characteristics evaluated were: functional suitability, reliability, usability, performance efficiency, compatibility, security, maintainability and portability. The sample was made up of 37 evaluators. In the evaluation by the specialists in information technology, only the characteristic of usability obtained a rate of positive responses of less than 70%. For the nurse lecturers, all the quality characteristics obtained a rate of positive responses of over 70%. The staff nurses of the medical and surgical clinics (with experience in using the system) and staff nurses from other units of the hospital and from other health institutions (without experience in using the system) gave rates of positive responses of more than 70% with regard to functional suitability, usability, and security. However, performance efficiency, reliability and compatibility all obtained rates below the established parameter. The software achieved rates of positive responses of over 70% for the majority of the quality characteristics evaluated.
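
    A small sketch of the acceptance rule used above: for each quality characteristic, the share of positive responses is computed and compared against the 70% parameter. The data below are invented for illustration.

      # Sketch of the 70%-positive-responses acceptance rule; data are invented.
      from collections import defaultdict

      def positive_rates(responses, threshold=0.70):
          tally = defaultdict(lambda: [0, 0])               # characteristic -> [positive, total]
          for characteristic, is_positive in responses:
              tally[characteristic][1] += 1
              tally[characteristic][0] += int(is_positive)
          return {c: (pos / tot, pos / tot >= threshold) for c, (pos, tot) in tally.items()}

      sample = [("usability", True), ("usability", False), ("usability", True),
                ("security", True), ("security", True)]
      print(positive_rates(sample))                         # rate and pass/fail per characteristic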

  5. Quality Systems. A Thermodynamics-Related Interpretive Model

    Directory of Open Access Journals (Sweden)

    Stefano A. Lollai

    2017-08-01

    In the present paper, a Quality Systems Theory is presented. Certifiable Quality Systems are treated and interpreted in accordance with a Thermodynamics-based approach. Analysis is also conducted on the relationship between Quality Management Systems (QMSs) and systems theories. A measure of entropy is proposed for QMSs, including a virtual document entropy and an entropy linked to processes and organisation. QMSs are also interpreted in light of Cybernetics, and interrelations between Information Theory and quality are also highlighted. A measure for the information content of quality documents is proposed. Such parameters can be used as adequacy indices for QMSs. From the discussed approach, suggestions for organising QMSs are also derived. Further interpretive thermodynamic-based criteria for QMSs are also proposed. The work represents the first attempt to treat quality organisational systems according to a thermodynamics-related approach. At this stage, no data are available to compare statements in the paper.
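
    A minimal sketch of the kind of information-theoretic quantity involved: a Shannon entropy computed over the term distribution of a quality document. The paper's specific "virtual document entropy" and the process/organisation entropies are not reproduced here, and the example text is invented.

      # Sketch: Shannon entropy of the term distribution of a quality document, as an
      # example of an information-content measure; the example text is invented.
      import math
      from collections import Counter

      def document_entropy(text):
          words = text.lower().split()
          counts = Counter(words)
          n = len(words)
          return -sum((c / n) * math.log2(c / n) for c in counts.values())

      procedure_text = "record review approve distribute record review retire"
      print(f"H = {document_entropy(procedure_text):.3f} bits per term")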

  6. A retrospective quality assessment of pre-hospital emergency medical documentation in motor vehicle accidents in south-eastern Norway

    Directory of Open Access Journals (Sweden)

    Staff Trine

    2011-03-01

    Background: Few studies have evaluated pre-hospital documentation quality. We retrospectively assessed emergency medical service (EMS) documentation of key logistic, physiologic, and mechanistic variables in motor vehicle accidents (MVAs). Methods: Records from police, Emergency Medical Communication Centers (EMCC), and ground and air ambulances were retrospectively collected for 189 MVAs involving 392 patients. Documentation of Glasgow Coma Scale (GCS), respiratory rate (RR), and systolic blood pressure (SBP) was classified as exact values, RTS categories, clinical descriptions enabling post-hoc inference of RTS categories, or missing. The distributions of exact versus inferred RTS categories were compared (Chi-square test for trend). Results: 25% of ground and 11% of air ambulance records were unretrievable. Patient name, birth date, and transport destination were documented in >96% of ambulance records and 81% of EMCC reports. Only 54% of patient encounter times were transmitted to the EMCC, but 77% were documented in ground and 96% in air ambulance records. Ground ambulance records documented exact values of GCS in 48% and SBP in 53% of cases, exact RR in 10%, and RR RTS categories in 54%. Clinical descriptions made post-hoc inference of RTS categories possible in another 49% of cases for GCS, 26% for RR, and 20% for SBP. Air ambulance records documented exact values of GCS in 89% and SBP in 84% of cases, exact RR in 7%, and RR RTS categories in 80%. Overall, for lower RTS categories of GCS, RR, and SBP, the proportion of actual documented values to inferred values increased. Conclusion: EMS documentation of logistic and mechanistic variables was adequate. Patient physiology was frequently documented only as descriptive text. Our finding indicates a need for improved procedures, training, and tools for EMS documentation. Documentation is in itself a quality criterion for appropriate care and is crucial to trauma research.
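
    A post-hoc coding sketch of the classification step described above: mapping exact physiologic values to Revised Trauma Score (RTS) coded categories using the standard coded-value bands. This mirrors the idea of inferring RTS categories but is not the authors' abstraction instrument; the band limits should be checked against the original RTS definition before any real use.

      # Post-hoc coding sketch: exact physiologic values mapped to RTS coded categories
      # (standard bands; verify against the original RTS definition before real use).
      def rts_category(gcs=None, rr=None, sbp=None):
          cats = {}
          if gcs is not None:
              cats["GCS"] = 4 if gcs >= 13 else 3 if gcs >= 9 else 2 if gcs >= 6 else 1 if gcs >= 4 else 0
          if rr is not None:
              cats["RR"] = 4 if 10 <= rr <= 29 else 3 if rr > 29 else 2 if rr >= 6 else 1 if rr >= 1 else 0
          if sbp is not None:
              cats["SBP"] = 4 if sbp > 89 else 3 if sbp >= 76 else 2 if sbp >= 50 else 1 if sbp >= 1 else 0
          return cats

      print(rts_category(gcs=14, rr=22, sbp=80))   # -> {'GCS': 4, 'RR': 4, 'SBP': 3}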

  7. 76 FR 22665 - Release of Final Document Related to the Review of the National Ambient Air Quality Standards for...

    Science.gov (United States)

    2011-04-22

    ... criteria. The revised air quality criteria reflect advances in scientific knowledge on the effects of the... National Ambient Air Quality Standards, contains staff analyses of the scientific bases for alternative... Document Related to the Review of the National Ambient Air Quality Standards for Particulate Matter AGENCY...

  8. Comparison of the guidance documents in support of EU risk assessments with those for the derivation of EU water quality standards

    NARCIS (Netherlands)

    Vos JH; Janssen MPM; SEC

    2005-01-01

    Risks of both new and existing substances and of biocides in Europe are being evaluated using the Technical Guidance Document (TGD). The European Water Framework Directive refers to this document for establishing Environmental Quality Standards (EQSs) for water. Another guidance document for the

  9. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    International Nuclear Information System (INIS)

    1996-01-01

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996, (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues

  10. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-26

    The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  11. Fiscal impacts model documentation. Version 1.0

    International Nuclear Information System (INIS)

    Beck, S.L.; Scott, M.J.

    1986-05-01

    The Fiscal Impacts (FI) Model, Version 1.0 was developed under Pacific Northwest Laboratory's Monitored Retrievable Storage (MRS) Program to aid in development of the MRS Reference Site Environmental Document (PNL 5476). It computes estimates of 182 fiscal items for state and local government jurisdictions, using input data from the US Census Bureau's 1981 Survey of Governments and local population forecasts. The model can be adapted for any county or group of counties in the United States

  12. Documenting a best practice model for successful female inmate ...

    African Journals Online (AJOL)

    Documenting a best practice model for successful female inmate and female ex ... of men and women within the prison and correctional services as well as inform and ... and beyond, with scope for transforming it into a robust business model.

  13. Documentation of the Oil and Gas Supply Module (OGSM). Appendix, Model developers report

    International Nuclear Information System (INIS)

    1995-01-01

    The Office of Integrated Analysis and Forecasting (OIAF) is required to provide complete model documentation to meet the EIA Model Acceptance Standards. The Documentation for the Oil and Gas Supply Module (OGSM) provides a complete description of the OGSM methodology, structure, and relation to other modules in the National Energy Modeling System (NEMS). This Model Developers Report (MDR) serves as an appendix to the methodology documentation. This report provides an overview of the model and an assessment of the sensitivity of OGSM results to changes in input data or parameters

  14. [Prospective DRG coding : Improvement in cost-effectiveness and documentation quality of in-patient hospital care].

    Science.gov (United States)

    Geuss, S; Jungmeister, A; Baumgart, A; Seelos, R; Ockert, S

    2018-02-01

    In prospective reimbursement schemes a diagnosis-related group (DRG) is assigned to each case according to all coded diagnoses and procedures. This coding can be conducted retrospectively after the hospitalization (DC) or prospectively during the hospitalization (PC). The use of PC offers advantages in terms of cost-effectiveness and documentation quality without impairing patient safety. A retrospective analysis of all DRG records and billing data of a surgical department from 2012 to 2015 was carried out. PC was introduced in the vascular surgery unit (VS) in September 2013, while the remaining surgical units (RS) stayed with DC. The analysis focused on differences between VS and RS before and after the introduction of PC. The cost-effectiveness indicators were earnings (EBITDA), length of stay (LOS), the case mix index (CMI), and productivity in relation to the DRG benchmark (productivity index, PI). The number of recorded diagnoses/procedures (ND/NP) was the indicator of documentation quality. A total of 1703 VS cases and 27,679 RS cases were analyzed. After the introduction of PC, EBITDA per case increased in VS but not in RS (+3342 Swiss francs vs. +84, respectively), LOS decreased more in VS than in RS (-0.36 days vs. -0.03 days), and the PI increased in VS but decreased in RS (+0.131 vs. -0.032); the exact p-values were lost in the source record. Prospective coding thus improved cost-effectiveness relative to the DRG benchmark, i.e. it increased the PI, and the increasing ND indicates an improvement in documentation quality.
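    The cost-effectiveness indicators in this record (CMI and PI) are simple ratios over coded case data. The sketch below uses the conventional definitions, a case mix index as the mean DRG cost weight per case and a productivity index as benchmark cost over actual cost per case; these exact formulas are assumptions, since the abstract does not define them:

    ```python
    # Illustrative only: the study's precise definitions are not given in the
    # abstract; the formulas below are the conventional ones.

    def case_mix_index(cost_weights):
        """Mean DRG cost weight over all treated cases."""
        return sum(cost_weights) / len(cost_weights)

    def productivity_index(actual_cost_per_case, benchmark_cost_per_case):
        """Productivity relative to a DRG benchmark; values above 1 mean cheaper than benchmark."""
        return benchmark_cost_per_case / actual_cost_per_case

    weights = [0.8, 1.2, 2.5, 1.0]                     # hypothetical DRG cost weights
    print(round(case_mix_index(weights), 3))           # 1.375
    print(round(productivity_index(9500, 10200), 3))   # 1.074
    ```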

  15. 75 FR 39252 - Release of Final Documents Related to the Review of the National Ambient Air Quality Standards...

    Science.gov (United States)

    2010-07-08

    ... Quality Standards: Scope and Methods Plan for Health Risk and Exposure Assessment and Particulate Matter... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OAR-2007-0492; FRL-9171-8] Release of Final Documents...: Environmental Protection Agency (EPA). ACTION: Notice of Availability. SUMMARY: The Office of Air Quality...

  16. [Psychometric properties of Q-DIO, an instrument to measure the quality of documented nursing diagnoses, interventions and outcomes].

    NARCIS (Netherlands)

    Muller-Staub, M.; Lunney, M.; Lavin, M.A.; Needham, I.; Odenbreit, M.; Achterberg, T. van

    2010-01-01

    The instrument Q-DIO was developed from 2005 to 2006 to measure the quality of documented nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Testing the psychometric properties of the Q-DIO (Quality of nursing Diagnoses, Interventions and Outcomes) was the aim of the study.

  17. Forms of contractual documents for public gas distribution; Modeles de documents contractuels pour la distribution publique de gaz

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-12-01

    This document is a compilation of standard forms of concession agreements and of specifications for public gas distribution (general dispositions, granted network and works, connection to the granted network, gas quality, contracts and conditions of supply, gas prices, concession end and control, various dispositions, agreement between the town and the grantee, calculation of profit rate, gas retail prices, general conditions of supply). (J.S.)

  18. Improving of Quality Control and Quality Assurance in 14C and 3H Laboratory; Participation in the IAEA Model Project

    International Nuclear Information System (INIS)

    Obelic, B.

    2001-01-01

    Full text: Users of a laboratory's analytical results increasingly require demonstrable proof of the reliability and credibility of the results according to internationally accepted standards, because the economic, ecological, medical and legal decisions based on laboratory results need to be accepted nationally and internationally. The credibility, respect and opportunities of laboratories are improved when objective evidence of the reliability and quality of the results can be given. This is achieved by instilling a quality culture through well-defined procedures, controls and operational checks characteristic of quality assurance and quality control (QA/QC). In 1999 the IAEA launched a two-and-a-half year model project entitled Quality Control and Quality Assurance of Nuclear Analytical Techniques, with participation of laboratories using alpha, beta and/or gamma spectrometry from CEE and NIS countries. The project set out to introduce and implement QA principles in accordance with the ISO-17025 guide, leading eventually to a level at which the QA system is self-sustainable and might be appropriate for formal accreditation or certification by the respective national authorities. Activities within the project consist of semi-annual reports, two training workshops, two inspection visits of the laboratories by IAEA experts, and proficiency tests. The following topics were considered: organisation requirements, acceptance criteria and non-conformance management of QC, internal and external method validation, statistical analyses and uncertainty evaluation, standard operation procedures, and quality manual documentation. The 14C and 3H Laboratory of the Rudjer Boskovic Institute has been one of ten laboratories participating in the Project. In the Laboratory all the procedures required for quality control were already included implicitly, while during the Model Project much effort has been devoted to the elaboration of explicit documentation. Since the beginning

  19. THREE DIMENSIONAL MODELING VIA PHOTOGRAPHS FOR DOCUMENTATION OF A VILLAGE BATH

    Directory of Open Access Journals (Sweden)

    H. B. Balta

    2013-07-01

    Full Text Available The aim of this study is to support the conceptual discussions of architectural restoration with three-dimensional modeling of monuments based on photogrammetric survey. In this study, a 16th century village bath in Ulamış, Seferihisar, Izmir is modeled for documentation. Ulamış is one of the historical villages in the Seferihisar–Urla region in which the Turkish population first settled. The methodology was tested on an antique monument: a bath with a cubical form. Within the limits of this study, only the exterior of the bath was modeled. The presentation scale for the bath was determined as 1/50, considering the requirements of designing structural and architectural interventions within the scope of a restoration project. The three-dimensional model produced is a realistic document presenting the present situation of the ruin. Traditional plan, elevation and perspective drawings may be produced from the model, in addition to realistic textured renderings and wireframe representations. The model developed in this study provides an opportunity for presenting photorealistic details of historical morphologies in scale. Compared to conventional drawings, renders based on 3D models make it possible to convey architectural details such as color, material and texture. From these documents, relatively more detailed restitution hypotheses can be developed and intervention decisions can be taken. Finally, the principles derived from the case study can be used for 3D documentation of historical structures with irregular surfaces.

  20. Consumer Vehicle Choice Model Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Changzheng [ORNL; Greene, David L [ORNL

    2012-08-01

    In response to the Fuel Economy and Greenhouse Gas (GHG) emissions standards, automobile manufacturers will need to adopt new technologies to improve the fuel economy of their vehicles and to reduce the overall GHG emissions of their fleets. The U.S. Environmental Protection Agency (EPA) has developed the Optimization Model for reducing GHGs from Automobiles (OMEGA) to estimate the costs and benefits of meeting GHG emission standards through different technology packages. However, the model does not simulate the impact that increased technology costs will have on vehicle sales or on consumer surplus. As the model documentation states, “While OMEGA incorporates functions which generally minimize the cost of meeting a specified carbon dioxide (CO2) target, it is not an economic simulation model which adjusts vehicle sales in response to the cost of the technology added to each vehicle.” Changes in the mix of vehicles sold, caused by the costs and benefits of added fuel economy technologies, could make it easier or more difficult for manufacturers to meet fuel economy and emissions standards, and impacts on consumer surplus could raise the costs or augment the benefits of the standards. Because the OMEGA model does not presently estimate such impacts, the EPA is investigating the feasibility of developing an adjunct to the OMEGA model to make such estimates. This project is an effort to develop and test a candidate model. The project statement of work spells out the key functional requirements for the new model.

  1. A Participatory Model for Multi-Document Health Information Summarisation

    Directory of Open Access Journals (Sweden)

    Dinithi Nallaperuma

    2017-03-01

    Full Text Available Increasing availability of and access to health information have brought about a paradigm shift in healthcare provision, as they empower patients and practitioners alike. Besides awareness, significant time savings and process efficiencies can be achieved through effective summarisation of healthcare information. Relevance and accuracy are key concerns when generating summaries for such documents. Despite advances in automated summarisation approaches, the role of participation has not been explored. In this paper, we propose a new model for multi-document health information summarisation that takes into account the role of participation. The updated IS user participation theory was extended to explicate these roles. The proposed model integrates both extractive and abstractive summarisation processes with continuous participatory inputs to each phase. The model was implemented as a client-server application and evaluated by both domain experts and health information consumers. Results from the evaluation phase indicate that the model is successful in generating relevant and accurate summaries for diverse audiences.
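    For context, a generic frequency-based extractive step is sketched below; it is not the authors' participatory model, which combines extractive and abstractive phases with user feedback, but it shows the kind of sentence selection an extractive phase performs. All data and thresholds are invented for the example.

    ```python
    # Minimal frequency-based extractive summariser (illustrative only).
    import re
    from collections import Counter

    def extract_summary(documents, n_sentences=2):
        # Split documents into sentences and score each by mean word frequency.
        sentences = [s.strip() for doc in documents
                     for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip()]
        freq = Counter(re.findall(r"\w+", " ".join(documents).lower()))

        def score(sentence):
            tokens = re.findall(r"\w+", sentence.lower())
            return sum(freq[t] for t in tokens) / max(len(tokens), 1)

        return sorted(sentences, key=score, reverse=True)[:n_sentences]

    docs = ["Regular exercise lowers blood pressure. Exercise also improves sleep.",
            "Lower blood pressure reduces the risk of stroke."]
    print(extract_summary(docs))
    ```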

  2. AVIS: analysis method for document coherence

    International Nuclear Information System (INIS)

    Henry, J.Y.; Elsensohn, O.

    1994-06-01

    The present document gives a short insight into AVIS, a method for verifying the quality of technical documents. The paper presents the approach applied, which is based on the K.O.D. method, defines the quality criteria of a technical document, and describes the means of evaluating these criteria. (authors). 9 refs., 2 figs

  3. The documentation of product configuration systems: A framework and an IT solution

    DEFF Research Database (Denmark)

    Shafiee, Sara; Hvam, Lars; Haug, Anders

    2017-01-01

    for maintenance, further development, system quality and communication with domain experts. Product models are the main communication and documentation tools used in PCS projects. Recent studies have shown that up-to-date documentation for the PCS is often lacking due to the significant amount of work required...... system is proposed that is capable of retrieving knowledge from the PCS and thus generating the product model. Our framework and IT documentation system were developed and tested at a case company on five different projects. The results confirm that benefits can be achieved by using the proposed...

  4. [Psychometric properties of Q-DIO, an instrument to measure the quality of documented nursing diagnoses, interventions and outcomes].

    Science.gov (United States)

    Müller-Staub, Maria; Lunney, Margaret; Lavin, Mary Ann; Needham, Ian; Odenbreit, Matthias; van Achterberg, Theo

    2010-04-01

    The instrument Q-DIO was developed from 2005 to 2006 to measure the quality of documented nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Testing the psychometric properties of the Q-DIO (Quality of nursing Diagnoses, Interventions and Outcomes) was the aim of the study. Instrument testing included internal consistency, test-retest reliability, interrater reliability, item analyses, and an assessment of objectivity. To render variation in scores, a stratified random sample of 60 nursing documentations was drawn. The strata represented 30 nursing documentations with and 30 without application of a theory-based, standardised nursing language. Internal consistency of the subscale nursing diagnoses as process showed a Cronbach's alpha of 0.83 [0.78, 0.88]; nursing diagnoses as product 0.98 [0.94, 0.99]; nursing interventions 0.90 [0.85, 0.94]; and nursing-sensitive patient outcomes 0.99 [0.95, 0.99]. With a Cohen's kappa of 0.95, the intrarater reliability was good. The interrater reliability showed a kappa of 0.94 [0.90, 0.96]. Item analyses confirmed the fulfilment of criteria for degree of difficulty and discriminative validity of the items. In this study, the Q-DIO has been shown to be a reliable instrument. It allows the documented quality of nursing diagnoses, interventions and outcomes to be measured with and without implementation of theory-based, standardised nursing languages. Studies for further testing of the Q-DIO in other settings are recommended. The results implicitly support the use of nursing classifications such as NANDA, NIC and NOC.

  5. Development of an instrument to measure the quality of documented nursing diagnoses, interventions and outcomes: the Q-DIO.

    Science.gov (United States)

    Müller-Staub, Maria; Lunney, Margaret; Odenbreit, Matthias; Needham, Ian; Lavin, Mary Ann; van Achterberg, Theo

    2009-04-01

    This paper aims to report the development stages of an audit instrument to assess standardised nursing language. Because research-based instruments were not available, the instrument Quality of documentation of nursing Diagnoses, Interventions and Outcomes (Q-DIO) was developed. Standardised nursing languages such as nursing diagnoses, interventions and outcomes are being implemented worldwide and will be crucial for the electronic health record. The literature showed a lack of audit instruments to assess the quality of standardised nursing language in nursing documentation. A qualitative design was used for instrument development. Criteria were first derived from a theoretical framework and literature reviews. Second, the criteria were operationalized into items and eight experts assessed face and content validity of the Q-DIO. Criteria were developed and operationalized into 29 items. For each item, a three- or five-point scale was applied. The experts supported content validity and showed 88.25% agreement for the scores assigned to the 29 items of the Q-DIO. The Q-DIO provides a literature-based audit instrument for nursing documentation. The strength of Q-DIO is its ability to measure the quality of nursing diagnoses and related interventions and nursing-sensitive patient outcomes. Further testing of Q-DIO is recommended. Based on the results of this study, the Q-DIO provides an audit instrument to be used in clinical practice. Its criteria can set the stage for the electronic nursing documentation in electronic health records.

  6. Embedding Term Similarity and Inverse Document Frequency into a Logical Model of Information Retrieval.

    Science.gov (United States)

    Losada, David E.; Barreiro, Alvaro

    2003-01-01

    Proposes an approach to incorporate term similarity and inverse document frequency into a logical model of information retrieval. Highlights include document representation and matching; incorporating term similarity into the measure of distance; new algorithms for implementation; inverse document frequency; and logical versus classical models of…
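    Inverse document frequency itself is a simple corpus statistic. The sketch below shows only the standard IDF weighting; it is not the authors' logical model, which embeds such weights and term similarity in a propositional retrieval framework.

    ```python
    # Standard inverse document frequency: idf(t) = log(N / df(t)).
    import math
    from collections import Counter

    def idf(corpus):
        n_docs = len(corpus)
        df = Counter()
        for doc in corpus:
            df.update(set(doc.lower().split()))   # count each term once per document
        return {term: math.log(n_docs / count) for term, count in df.items()}

    docs = ["logical models of retrieval",
            "term similarity in retrieval",
            "classical probabilistic models"]
    weights = idf(docs)
    print(round(weights["retrieval"], 3))   # appears in 2 of 3 docs -> log(3/2)
    print(round(weights["similarity"], 3))  # appears in 1 of 3 docs -> log(3)
    ```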

  7. Fine‐Grained Mobile Application Clustering Model Using Retrofitted Document Embedding

    Directory of Open Access Journals (Sweden)

    Yeo‐Chan Yoon

    2017-08-01

    Full Text Available In this paper, we propose a fine‐grained mobile application clustering model using retrofitted document embedding. To automatically determine the clusters and their numbers with no predefined categories, the proposed model initializes the clusters based on title keywords and then merges similar clusters. For improved clustering performance, the proposed model distinguishes between an accurate clustering step with titles and an expansive clustering step with descriptions. During the accurate clustering step, an automatically tagged set is constructed as a result. This set is utilized to learn a high‐performance document vector. During the expansive clustering step, more applications are then classified using this document vector. Experimental results showed that the purity of the proposed model increased by 0.19, and the entropy decreased by 1.18, compared with the K‐means algorithm. In addition, the mean average precision improved by more than 0.09 in a comparison with a support vector machine classifier.
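    The purity and entropy figures quoted above are standard external clustering metrics. A small sketch of how they are commonly computed follows; the paper's exact evaluation protocol may differ in detail, and the example labels are invented.

    ```python
    # External clustering metrics: purity (higher is better) and entropy (lower is better).
    import math
    from collections import Counter, defaultdict

    def purity_and_entropy(cluster_ids, true_labels):
        n = len(true_labels)
        by_cluster = defaultdict(list)
        for c, y in zip(cluster_ids, true_labels):
            by_cluster[c].append(y)

        purity, entropy = 0.0, 0.0
        for members in by_cluster.values():
            counts = Counter(members)
            purity += counts.most_common(1)[0][1]          # majority-label count
            size = len(members)
            h = -sum((c / size) * math.log2(c / size) for c in counts.values())
            entropy += (size / n) * h                      # size-weighted cluster entropy
        return purity / n, entropy

    p, h = purity_and_entropy([0, 0, 1, 1, 1], ["game", "game", "game", "tool", "tool"])
    print(round(p, 2), round(h, 2))  # 0.8 0.55
    ```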

  8. Models and standards for production systems integration: Technological process and documents

    Directory of Open Access Journals (Sweden)

    Lečić Danica

    2005-01-01

    Full Text Available Electronic business requires production companies to collaborate with customers, suppliers and end users and to move towards electronic manufacturing. To achieve this goal, companies have to integrate their subsystems (Application to Application, A2A) and collaborate with their business partners (Business to Business, B2B). For this purpose, models and unique standards for integration are necessary. In this paper, ebXML and OAGI specifications are used to present a process metamodel as a UML class diagram and a standardized model of the Working Order document for the technological process in the form of an OAGI BOD XML document. Based on this, an example model of a technological process is presented as an activity diagram (DA) in XML form, together with the resulting appearance of the Working Order document. Rules for transforming the DA to XML are also presented.
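    To illustrate the kind of standardized XML business document discussed here, the sketch below assembles a minimal, hypothetical Working Order with Python's ElementTree. The element names are placeholders chosen for the example, not the actual OAGI BOD schema.

    ```python
    # Hypothetical Working Order document; element names are illustrative,
    # not taken from the actual OAGI BOD specification.
    import xml.etree.ElementTree as ET

    order = ET.Element("WorkingOrder", id="WO-2005-001")
    ET.SubElement(order, "Product").text = "Gear housing"
    steps = ET.SubElement(order, "TechnologicalProcess")
    for seq, (operation, station) in enumerate([("Turning", "Lathe-3"),
                                                ("Drilling", "Drill-1")], start=1):
        step = ET.SubElement(steps, "Operation", sequence=str(seq), workstation=station)
        step.text = operation

    print(ET.tostring(order, encoding="unicode"))
    ```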

  9. Documentation of 'Care-Packages' for Children in OECD's 2003 Tax/Ben Model, December 2006

    DEFF Research Database (Denmark)

    Hansen, Hans

    This working paper contains documentation for the modelling of schemes implemented in OECD's 2003 Tax/Ben model for use in the 'Carearchitecture' project. The documentation also includes schemes already in the model and used in the calculations for the project. The documented schemes include...... personal taxation, parental leave benefits, payment for childcare, child benefits and housing benefits in Denmark, Sweden, Norway, Finland, Great Britain and Germany....

  10. Hanford analytical services quality assurance requirements documents. Volume 1: Administrative Requirements

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1997-01-01

    The Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order Tri-Party Agreement and regulatory permit applications and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This ensures a uniform quality umbrella over analytical site activities predicated on the concepts contained in the HASQARD. Using HASQARD will ensure data of known quality and technical defensibility of the methods used to obtain that data. The HASQARD is made up of four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements

  11. Model documentation coal market module of the National Energy Modeling System

    International Nuclear Information System (INIS)

    1997-02-01

    This report documents the objectives and the conceptual and methodological approach used in the development of the Coal Production Submodule (CPS). It provides a description of the CPS for model analysts and the public. The Coal Market Module provides annual forecasts of prices, production, and consumption of coal

  12. Model documentation coal market module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This report documents the objectives and the conceptual and methodological approach used in the development of the Coal Production Submodule (CPS). It provides a description of the CPS for model analysts and the public. The Coal Market Module provides annual forecasts of prices, production, and consumption of coal.

  13. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    Science.gov (United States)

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method, DeepQA, based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physico-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance than Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/.
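    To make the single-model setup concrete, here is a minimal sketch that trains a small feed-forward regressor on per-model features to predict a quality score and then ranks candidate models. It is only an illustration: the feature names and data are invented, and a scikit-learn MLP stands in for the deep belief network that DeepQA actually uses.

    ```python
    # Simplified stand-in for single-model quality assessment: predict a quality
    # score from per-model features, then rank candidate models by that score.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Hypothetical features: [energy_term, secondary_structure_agreement, solvent_accessibility_score]
    X_train = rng.random((200, 3))
    y_train = 0.5 * X_train[:, 1] + 0.3 * X_train[:, 2] - 0.2 * X_train[:, 0] + 0.4  # synthetic scores

    model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)

    candidate_models = rng.random((5, 3))
    scores = model.predict(candidate_models)
    print("predicted quality scores:", np.round(scores, 3))
    print("best model index:", int(np.argmax(scores)))
    ```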

  14. MODELING THE TRANSPORT AND CHEMICAL EVOLUTION OF ONSHORE AND OFFSHORE EMISSIONS AND THEIR IMPACT ON LOCAL AND REGIONAL AIR QUALITY USING A VARIABLE-GRID-RESOLUTION AIR QUALITY MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Kiran Alapaty

    2003-12-01

    This document, the project's first semiannual report, summarizes the research performed from 04/17/2003 through 10/16/2003. Portions of the research in several of the project's eight tasks were completed, and results obtained are briefly presented. We have tested the applicability of two different atmospheric boundary layer schemes for use in air quality model simulations. Preliminary analysis indicates that a scheme that uses sophisticated atmospheric boundary physics resulted in better simulation of atmospheric circulations. We have further developed and tested a new surface data assimilation technique to improve meteorological simulations, which will also result in improved air quality model simulations. Preliminary analysis of results indicates that using the new data assimilation technique results in reduced modeling errors in temperature and moisture. Ingestion of satellite-derived sea surface temperatures into the mesoscale meteorological model led to significant improvements in simulated clouds and precipitation compared to that obtained using traditional analyzed sea surface temperatures. To enhance the capabilities of an emissions processing system so that it can be used with our variable-grid-resolution air quality model, we have identified potential areas for improvements. Also for use in the variable-grid-resolution air quality model, we have tested a cloud module offline for its functionality, and have implemented and tested an efficient horizontal diffusion algorithm within the model.

  15. Instructions for submittal and control of FFTF design documents and design related documentation

    International Nuclear Information System (INIS)

    Grush, R.E.

    1976-10-01

    This document provides the system and requirements for management of FFTF technical data prepared by Westinghouse Hanford (HEDL), and design contractors, the construction contractor and lower tier equipment suppliers. Included in this document are provisions for the review, approval, release, change control, and accounting of FFTF design disclosure and base documentation. Also included are provisions for submittal of other design related documents for review and approval consistent with applicable requirements of RDT-Standard F 2-2, ''Quality Assurance Program Requirements.''

  16. Model documentation: Natural Gas Transmission and Distribution Model of the National Energy Modeling System; Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-02-24

    The Natural Gas Transmission and Distribution Model (NGTDM) is a component of the National Energy Modeling System (NEMS) used to represent the domestic natural gas transmission and distribution system. NEMS is the third in a series of computer-based, midterm energy modeling systems used since 1974 by the Energy Information Administration (EIA) and its predecessor, the Federal Energy Administration, to analyze domestic energy-economy markets and develop projections. This report documents the archived version of NGTDM that was used to produce the natural gas forecasts used in support of the Annual Energy Outlook 1994, DOE/EIA-0383(94). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic design, provides detail on the methodology employed, and describes the model inputs, outputs, and key assumptions. It is intended to fulfill the legal obligation of the EIA to provide adequate documentation in support of its models (Public Law 94-385, Section 57.b.2). This report represents Volume 1 of a two-volume set. (Volume 2 will report on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.) Subsequent chapters of this report provide: (1) an overview of the NGTDM (Chapter 2); (2) a description of the interface between the National Energy Modeling System (NEMS) and the NGTDM (Chapter 3); (3) an overview of the solution methodology of the NGTDM (Chapter 4); (4) the solution methodology for the Annual Flow Module (Chapter 5); (5) the solution methodology for the Distributor Tariff Module (Chapter 6); (6) the solution methodology for the Capacity Expansion Module (Chapter 7); (7) the solution methodology for the Pipeline Tariff Module (Chapter 8); and (8) a description of model assumptions, inputs, and outputs (Chapter 9).

  17. Quality assurance application in the documentation of nuclear research reactor

    International Nuclear Information System (INIS)

    Nababan, N.

    1999-01-01

    For each nuclear research reactor a document control system should be established and should provide for the preparation, review, approval, issuance, distribution, revision and validation (where appropriate) of documents essential to the management, performance and verification of work. In the document control system the responsibilities of each participating organization or individual should be defined in writing. The types of document include, but are not limited to, documents comprising the QA program, safety requirements, maintenance and operating procedures, inspection instructions, inspection and test reports, assessment reports, drawings, data files, calculations, specifications, computer codes, purchase orders and related documents, vendor-supplied documents and work instructions. Management should identify the need for documents and should provide guidance to the organizations and people preparing them. The guidance should cover the status, scope and contents, and the policies, standards and codes which apply. It should also explain the need for feedback of experience. Plant modifications or the results of assessments could also give rise to the need for a new document.

  18. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    Science.gov (United States)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment", which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data are derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  19. Rubber stamp templates for improving clinical documentation: A paper-based, m-Health approach for quality improvement in low-resource settings.

    Science.gov (United States)

    Kleczka, Bernadette; Musiega, Anita; Rabut, Grace; Wekesa, Phoebe; Mwaniki, Paul; Marx, Michael; Kumar, Pratap

    2018-06-01

    The United Nations' Sustainable Development Goal #3.8 targets 'access to quality essential healthcare services'. Clinical practice guidelines are an important tool for ensuring quality of clinical care, but many challenges prevent their use in low-resource settings. Monitoring the use of guidelines relies on cumbersome clinical audits of paper records, and electronic systems face financial and other limitations. Here we describe a unique approach to generating digital data from paper using guideline-based templates, rubber stamps and mobile phones. The Guidelines Adherence in Slums Project targeted ten private sector primary healthcare clinics serving informal settlements in Nairobi, Kenya. Each clinic was provided with rubber stamp templates to support documentation and management of commonly encountered outpatient conditions. Participatory design methods were used to customize templates to the workflows and infrastructure of each clinic. Rubber stamps were used to print templates into paper charts, providing clinicians with checklists for use during consultations. Templates used bubble format data entry, which could be digitized from images taken on mobile phones. Besides rubber stamp templates, the intervention included booklets of guideline compilations, one Android phone for digitizing images of templates, and one data feedback/continuing medical education session per clinic each month. In this paper we focus on the effect of the intervention on documentation of three non-communicable diseases in one clinic. Seventy charts of patients enrolled in the chronic disease program (hypertension/diabetes, n=867; chronic respiratory diseases, n=223) at one of the ten intervention clinics were sampled. Documentation of each individual patient encounter in the pre-intervention (January-March 2016) and post-intervention period (May-July) was scored for information in four dimensions - general data, patient assessment, testing, and management. Control criteria included

  20. Model documentation for relations between continuous real-time and discrete water-quality constituents in Cheney Reservoir near Cheney, Kansas, 2001--2009

    Science.gov (United States)

    Stone, Mandy L.; Graham, Jennifer L.; Gatotho, Jackline W.

    2013-01-01

    Cheney Reservoir, located in south-central Kansas, is one of the primary water supplies for the city of Wichita, Kansas. The U.S. Geological Survey has operated a continuous real-time water-quality monitoring station in Cheney Reservoir since 2001; continuously measured physicochemical properties include specific conductance, pH, water temperature, dissolved oxygen, turbidity, fluorescence (wavelength range 650 to 700 nanometers; estimate of total chlorophyll), and reservoir elevation. Discrete water-quality samples were collected during 2001 through 2009 and analyzed for sediment, nutrients, taste-and-odor compounds, cyanotoxins, phytoplankton community composition, actinomycetes bacteria, and other water-quality measures. Regression models were developed to establish relations between discretely sampled constituent concentrations and continuously measured physicochemical properties to compute concentrations of constituents that are not easily measured in real time. The water-quality information in this report is important to the city of Wichita because it allows quantification and characterization of potential constituents of concern in Cheney Reservoir. This report updates linear regression models published in 2006 that were based on data collected during 2001 through 2003. The update uses discrete and continuous data collected during May 2001 through December 2009. Updated models to compute dissolved solids, sodium, chloride, and suspended solids were similar to previously published models. However, several other updated models changed substantially from previously published models. In addition to updating relations that were previously developed, models also were developed for four new constituents, including magnesium, dissolved phosphorus, actinomycetes bacteria, and the cyanotoxin microcystin. In addition, a conversion factor of 0.74 was established to convert the Yellow Springs Instruments (YSI) model 6026 turbidity sensor measurements to the newer YSI
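    A minimal sketch of the kind of surrogate regression this report describes, relating a discretely sampled constituent to a continuously measured property. The log-linear form is a common choice for turbidity and suspended solids, but the data values and coefficients below are invented; the published site-specific models differ.

    ```python
    # Illustrative surrogate regression: log10(suspended solids) vs. log10(turbidity).
    # Data values are made up; the report's actual models and coefficients differ.
    import numpy as np

    turbidity = np.array([3.1, 5.4, 12.0, 25.0, 40.0, 80.0])          # FNU, continuous sensor
    suspended_solids = np.array([4.0, 7.5, 18.0, 35.0, 60.0, 120.0])  # mg/L, discrete samples

    slope, intercept = np.polyfit(np.log10(turbidity), np.log10(suspended_solids), 1)

    def estimate_ssc(turb_fnu):
        """Estimate suspended-solids concentration from a real-time turbidity reading."""
        return 10 ** (intercept + slope * np.log10(turb_fnu))

    print(round(slope, 3), round(intercept, 3))
    print(round(float(estimate_ssc(20.0)), 1), "mg/L")
    ```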

  1. GASCAP: Wellhead Gas Productive Capacity Model documentation, June 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The Wellhead Gas Productive Capacity Model (GASCAP) has been developed by EIA to provide a historical analysis of the monthly productive capacity of natural gas at the wellhead and a projection of monthly capacity for 2 years into the future. The impact of drilling, oil and gas price assumptions, and demand on gas productive capacity are examined. Both gas-well gas and oil-well gas are included. Oil-well gas productive capacity is estimated separately and then combined with the gas-well gas productive capacity. This documentation report provides a general overview of the GASCAP Model, describes the underlying data base, provides technical descriptions of the component models, diagrams the system and subsystem flow, describes the equations, and provides definitions and sources of all variables used in the system. This documentation report is provided to enable users of EIA projections generated by GASCAP to understand the underlying procedures used and to replicate the models and solutions. This report should be of particular interest to those in the Congress, Federal and State agencies, industry, and the academic community, who are concerned with the future availability of natural gas

  2. Development of a mission-based funding model for undergraduate medical education: incorporation of quality.

    Science.gov (United States)

    Stagnaro-Green, Alex; Roe, David; Soto-Greene, Maria; Joffe, Russell

    2008-01-01

    Increasing financial pressures, along with a desire to realign resources with institutional priorities, have resulted in the adoption of mission-based funding (MBF) at many medical schools. The lack of inclusion of quality and the time and expense of developing and implementing mission-based funding are major deficiencies in the models reported to date. In academic year 2002-2003 New Jersey Medical School developed a model that included both quantity and quality in the education metric and that was departmentally based. Eighty percent of the undergraduate medical education allocation was based on the quantity of undergraduate medical education taught by the department ($7.35 million), and 20% ($1.89 million) was allocated based on the quality of the education delivered. Quality determinations were made by the educational leadership based on student evaluations and departmental compliance with educational administrative requirements. Evolution of the model has included the development of a faculty oversight committee and the integration of peer evaluation in the determination of educational quality. Six departments had a documented increase in quality over time, and one department had a transient decrease in quality. The MBF model has been well accepted by chairs, educational leaders, and faculty and has been instrumental in enhancing the stature of education at our institution.

  3. The Effect of Structured Decision-Making Techniques and Gender on Student Reaction and Quality of Written Documents.

    Science.gov (United States)

    Neal, Joan; Echternacht, Lonnie

    1995-01-01

    Experimental groups used four decision-making techniques--reverse brainstorming (RS), dialectical inquiry (DI), devil's advocacy (DA), and consensus--in evaluating writing assignments. The control group produced a better-quality document. Student reactions to the negative features of RS, DI, and DA were not significant. (SK)

  4. Vision of the Arc for Quality Documentation and for Closed Loop Control of the Welding Process

    DEFF Research Database (Denmark)

    Kristiansen, Morten; Kristiansen, Ewa; Jensen, Casper Houmann

    2014-01-01

    For gas metal arc welding, a vision system was developed that robustly monitors the position of the arc. The monitoring documents the welding quality indirectly, and a closed-loop fuzzy control was implemented to maintain even excess penetration. For welding experiments on a butt-joint with a V-groove and varying root gap, the system demonstrated increased welding quality compared to the system with no control. The system was implemented with a low-cost vision system, which makes it attractive for industrial welding automation systems.

  5. Detailed Design Documentation, without the Pain

    Science.gov (United States)

    Ramsay, C. D.; Parkes, S.

    2004-06-01

    Producing detailed forms of design documentation, such as pseudocode and structured flowcharts, to describe the procedures of a software system: (1) allows software developers to model and discuss their understanding of a problem and the design of a solution free from the syntax of a programming language, (2) facilitates deeper involvement of non-technical stakeholders, such as the customer or project managers, whose influence ensures the quality, correctness and timeliness of the resulting system, (3) forms comprehensive documentation of the system for its future maintenance, reuse and/or redeployment. However, such forms of documentation require effort to create and maintain. This paper describes a software tool which is currently being developed within the Space Systems Research Group at the University of Dundee, which aims to improve the utility of, and the incentive for, creating detailed design documentation for the procedures of a software system. The rationale for creating such a tool is briefly discussed, followed by a description of the tool itself, a summary of its perceived benefits, and plans for future work.

  6. Quality management system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Mu Sung

    2009-08-15

    This book deals with the ISO9001 quality management system. It includes a summary of the system (classification of quality, principles of quality management, and the definition, requirements and procedures of a quality management system) and an introduction to ISO9001 (the model of the ISO9001 quality management system, the ISO certification system, the structure of the ISO9001 standard, the requirements of the ISO9001 quality management system, the process approach and documentation of the system, and implementation cases of the ISO9001 quality management system).

  7. Quality management system

    International Nuclear Information System (INIS)

    Lee, Mu Sung

    2009-08-01

    This book deals with the ISO9001 quality management system. It includes a summary of the system (classification of quality, principles of quality management, and the definition, requirements and procedures of a quality management system) and an introduction to ISO9001 (the model of the ISO9001 quality management system, the ISO certification system, the structure of the ISO9001 standard, the requirements of the ISO9001 quality management system, the process approach and documentation of the system, and implementation cases of the ISO9001 quality management system).

  8. Model documentation Coal Market Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-30

    This report documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 1996 (AEO96). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of the CMM's three submodules: the Coal Production Submodule, the Coal Export Submodule, and the Coal Distribution Submodule.

  9. Mined Geologic Disposal System Requirements Document

    International Nuclear Information System (INIS)

    1993-01-01

    This Mined Geologic Disposal System Requirements document (MGDS-RD) describes the functions to be performed by, and the requirements for, a Mined Geologic Disposal System (MGDS) for the permanent disposal of spent nuclear fuel (SNF) and commercial and defense high level radioactive waste (HLW) in support of the Civilian Radioactive Waste Management System (CRWMS). The development and control of the MGDS-RD is quality-affecting work and is subject to the Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) Quality Assurance Requirements Document (QARD). As part of the technical requirements baseline, it is also subject to Baseline Management Plan controls. The MGDS-RD and the other program-level requirements documents have been prepared and managed in accordance with the Technical Document Preparation Plan (TDPP) for the Preparation of System Requirements Documents

  10. Avaluació documental : estudi comparatiu dels models de Catalunya, Canadà i Mèxic

    OpenAIRE

    Colás Ayllón, Natalia

    2017-01-01

    This work is a comparative study of different records appraisal models. The models chosen are those of Catalonia, Canada and Mexico. The objective of the study is to learn how records appraisal is carried out in our territory (Catalonia) and to compare it with the practice of other places, among them Canada, which was a pioneer as far as records appraisal is concerned, and Mexico, a state that has created its model more recently. P...

  11. Documenting Climate Models and Simulations: the ES-DOC Ecosystem in Support of CMIP

    Science.gov (United States)

    Pascoe, C. L.; Guilyardi, E.

    2017-12-01

    The results of climate models are of increasing and widespread importance. No longer is climate model output of sole interest to climate scientists and researchers in the climate change impacts and adaptation fields. Now non-specialists such as government officials, policy-makers, and the general public all have an increasing need to access climate model output and understand its implications. For this host of users, accurate and complete metadata (i.e., information about how and why the data were produced) is required to document the climate modeling results. Here we describe the ES-DOC community-governed project to collect and make available documentation of climate models and their simulations for the internationally coordinated modeling activity CMIP6 (Coupled Model Intercomparison Project, Phase 6). An overview of the underlying standards, key properties and features, the evolution from CMIP5, the underlying tools and workflows as well as what modelling groups should expect and how they should engage with the documentation of their contribution to CMIP6 is also presented.

  12. Reducing failures rate within the project documentation using Building Information Modelling, especially Level of Development

    Directory of Open Access Journals (Sweden)

    Prušková Kristýna

    2018-01-01

    Full Text Available The paper focuses on the differences between traditional modelling in 2D software and modelling with BIM technology. The research uncovers failures connected to the traditional way of designing and constructing project documentation, and mismatches within the project documentation are revealed and shown. A solution based on Building Information Modelling technology is outlined. As a reference, the experience of designing a specific building with both approaches to constructing project documentation is used: traditional modelling and BIM technology, in particular using the Level of Development. The output of this paper points to the benefits of using advanced technology in building design, namely Building Information Modelling and especially the Level of Development, which leads to a reduced failure rate within the project documentation.

  13. Model business letters, emails and other business documents

    CERN Document Server

    Taylor, Shirley

    2012-01-01

    For anyone who wants to communicate effectively in business, this is your complete reference guide for any form of written communication. Packed with over 500 sample documents, over 100 tips for better business writing and useful templates you can apply to your writing immediately, Model Business Letters will help you put the key rules of good business writing into action.

  14. 文件物件模型及其在XML文件處理之應用 Document Object Model and Its Application on XML Document Processing

    Directory of Open Access Journals (Sweden)

    Sinn-cheng Lin

    2001-06-01

    Full Text Available The Document Object Model (DOM) is an application-programming interface that can be applied to process XML documents. It defines the logical structure of a document, the interfaces for accessing it, and the methods for operating on it. In the DOM, an original document is mapped to a tree structure, so a computer program can easily traverse the tree and manipulate its nodes. In this paper, the fundamental models, definitions and specifications of the DOM are surveyed. We then create an experimental DOM system called XML On-Line Parser. The front-end of the system is a Web-based user interface for XML document input and parsed-result output, while the back-end is an ASP program that transforms the original document into a DOM tree for document manipulation. This on-line system can be used with a general-purpose web browser to check the well-formedness and the validity of XML documents.
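    As a short illustration of the DOM idea, the sketch below uses Python's built-in DOM implementation to parse an XML string into a tree, report whether it is well-formed, and walk its nodes. The paper's system performs this kind of parsing inside an ASP web application; the example here is only a generic stand-in.

    ```python
    # Parse an XML string into a DOM tree and traverse its nodes.
    from xml.dom import minidom
    from xml.parsers.expat import ExpatError

    xml_text = """<library>
      <book id="1"><title>XML Processing</title></book>
      <book id="2"><title>Document Object Model</title></book>
    </library>"""

    try:
        dom = minidom.parseString(xml_text)   # raises ExpatError if not well-formed
    except ExpatError as err:
        print("not well-formed:", err)
    else:
        for book in dom.getElementsByTagName("book"):
            title = book.getElementsByTagName("title")[0].firstChild.data
            print(book.getAttribute("id"), title)
    ```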

  15. Entropy and Graph Based Modelling of Document Coherence using Discourse Entities

    DEFF Research Database (Denmark)

    Petersen, Casper; Lioma, Christina; Simonsen, Jakob Grue

    2015-01-01

    We present two novel models of document coherence and their application to information retrieval (IR). Both models approximate document coherence using discourse entities, e.g. the subject or object of a sentence. Our first model views text as a Markov process generating sequences of discourse entities (entity n-grams); we use the entropy of these entity n-grams to approximate the rate at which new information appears in text, reasoning that as more new words appear, the topic increasingly drifts and text coherence decreases. Our second model extends the work of Guinaudeau & Strube [28] that represents text as a graph of discourse entities, linked by different relations, such as their distance or adjacency in text. We use several graph topology metrics to approximate different aspects of the discourse flow that can indicate coherence, such as the average clustering or betweenness of discourse...
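    The entropy idea in the first model can be illustrated with a small sketch: estimate the empirical distribution of discourse-entity bigrams and compute its Shannon entropy as a rough proxy for how quickly new information appears. The entity sequences below are invented, and the paper's full model adds smoothing and the graph-based features it describes.

    ```python
    # Shannon entropy of the empirical distribution of entity bigrams.
    import math
    from collections import Counter

    def entity_bigram_entropy(entity_sequence):
        bigrams = list(zip(entity_sequence, entity_sequence[1:]))
        counts = Counter(bigrams)
        total = len(bigrams)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Discourse entities (e.g., grammatical subjects/objects) extracted per sentence.
    coherent = ["model", "model", "entropy", "model", "entropy", "model"]
    drifting = ["model", "graph", "entity", "retrieval", "coherence", "metric"]
    print(round(entity_bigram_entropy(coherent), 2))  # lower entropy: topic stays put
    print(round(entity_bigram_entropy(drifting), 2))  # higher entropy: topic drifts
    ```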

  16. Quality Model Based on Cots Quality Attributes

    OpenAIRE

    Jawad Alkhateeb; Khaled Musa

    2013-01-01

    The quality of software is essential to corporations in making their commercial software. Good or poor quality of software plays an important role in systems such as embedded systems, real-time systems, and control systems that play an important part in human life. Software products or commercial off-the-shelf software are usually programmed based on a software quality model. In the software engineering field, each quality model contains a set of attributes or characteristics that drives i...

  17. USE OF IMAGE BASED MODELLING FOR DOCUMENTATION OF INTRICATELY SHAPED OBJECTS

    Directory of Open Access Journals (Sweden)

    M. Marčiš

    2016-06-01

    Full Text Available In the documentation of cultural heritage, we can encounter three-dimensional shapes and structures that are complicated to measure. Such objects include, for example, spiral staircases, timber roof trusses, historical furniture or folk costumes, where it is nearly impossible to effectively use traditional surveying or terrestrial laser scanning due to the shape of the object, its dimensions and the crowded environment. Current methods of digital photogrammetry can be very helpful in such cases, with the emphasis on automated processing of the extensive image data. The resulting high-resolution 3D models and 2D orthophotos are very important for the documentation of architectural elements and can serve as an ideal basis for vectorization and 2D drawing documentation. This contribution describes the various uses of image-based modelling for specific interior spaces and specific objects. The advantages and disadvantages of photogrammetric measurement of such objects in comparison to other surveying methods are reviewed.

  18. Use of Image Based Modelling for Documentation of Intricately Shaped Objects

    Science.gov (United States)

    Marčiš, M.; Barták, P.; Valaška, D.; Fraštia, M.; Trhan, O.

    2016-06-01

    In the documentation of cultural heritage, we can encounter three-dimensional shapes and structures that are complicated to measure. Such objects include, for example, spiral staircases, timber roof trusses, historical furniture or folk costumes, where it is nearly impossible to effectively use traditional surveying or terrestrial laser scanning due to the shape of the object, its dimensions and the crowded environment. Current methods of digital photogrammetry can be very helpful in such cases, with the emphasis on automated processing of the extensive image data. The resulting high-resolution 3D models and 2D orthophotos are very important for the documentation of architectural elements and can serve as an ideal basis for vectorization and 2D drawing documentation. This contribution describes the various uses of image-based modelling for specific interior spaces and specific objects. The advantages and disadvantages of photogrammetric measurement of such objects in comparison to other surveying methods are reviewed.

  19. Documentation of the DRI Model of the US economy, December 1993

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-28

    The Energy Information Administration (EIA) uses models of the US economy developed by Data Resources, Inc. (DRI) for conducting policy analyses, preparing forecasts for the Annual Energy Outlook, the Short-Term Energy Outlook, and related analyses in conjunction with EIA's National Energy Modeling System (NEMS) and its other energy market models. Both the DRI Model of the US Economy and the DRI Personal Computer Input-Output Model (PC-IO) were developed and are maintained by DRI as proprietary models. This report provides documentation, as required by EIA standards for the use of proprietary models; describes the theoretical basis, structure and functions of both DRI models; and contains brief descriptions of the models and their equations. Appendix A describes how the two large-scale models documented here are used to support the macroeconomic and interindustry modeling associated with the National Energy Modeling System. Appendix B is an article by Stephen McNees of the Federal Reserve Bank of Boston on "How Large are Economic Forecast Errors." This article assesses the forecast accuracy of a number of economic forecasting models (groups) and is attached as an independent assessment of the forecast accuracy of the DRI Model of the US Economy.

  20. An International Coordinated Effort to Further the Documentation & Development of Quality Assurance, Quality Control, and Best Practices for Oceanographic Observations

    Science.gov (United States)

    Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.

    2017-12-01

    Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and quantify quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations and data management and services. Leveraging existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.

  1. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    ... averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame ... average between the global quality and the local quality. Experimental results demonstrate that the combination of the global quality and local quality outperforms both sole global quality and local quality, as well as other quality models, in video quality assessment. In addition, the proposed video quality modeling algorithm can improve the performance of image quality metrics on video quality assessment compared to the normal averaged spatiotemporal pooling scheme.
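    The pooling idea in this record can be made concrete with a small sketch: per-frame local quality is a saliency-weighted average of block-level metric scores, and the video score mixes this local quality with plain global pooling. The weighting scheme, the mixing coefficient alpha and the toy data below are assumptions for illustration, not the published algorithm.

      # Assumed formulation of attention-weighted spatiotemporal pooling (illustrative).
      import numpy as np

      def local_frame_quality(block_scores, saliency):
          """Saliency-weighted average of block-level IQM scores for one frame."""
          w = saliency / (saliency.sum() + 1e-12)
          return float((block_scores * w).sum())

      def video_quality(frame_block_scores, frame_saliency, alpha=0.5):
          """Mix attention-weighted local quality with plain spatiotemporal averaging."""
          local = np.mean([local_frame_quality(b, s)
                           for b, s in zip(frame_block_scores, frame_saliency)])
          global_q = np.mean([b.mean() for b in frame_block_scores])
          return alpha * local + (1.0 - alpha) * global_q

      rng = np.random.default_rng(0)
      blocks = [rng.uniform(0.4, 1.0, (4, 4)) for _ in range(3)]    # 3 frames of block scores
      saliency = [rng.uniform(0.0, 1.0, (4, 4)) for _ in range(3)]  # matching saliency maps
      print(round(video_quality(blocks, saliency), 3))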

  2. Sandia Software Guidelines, Volume 2. Documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standards for software documentation, this volume provides guidance in the selection of an adequate document set for a software project and example formats for many types of software documentation. A tutorial on life cycle documentation is also provided. Extended document thematic outlines and working examples of software documents are available on electronic media as an extension of this volume.

  3. Application of models for exchange of electronic documents in complex administrative services

    International Nuclear Information System (INIS)

    Glavev, Victor

    2015-01-01

    The report presents the application of models for the exchange of electronic documents between different administrations in the government and business sectors. It shows the benefits of implementing electronic exchange of documents between different local offices of one administration in the government sector, such as a municipality, and the way it is useful for implementing complex administrative services.

  4. Application of models for exchange of electronic documents in complex administrative services

    Energy Technology Data Exchange (ETDEWEB)

    Glavev, Victor

    2015-11-30

    The report presents the application of models for the exchange of electronic documents between different administrations in the government and business sectors. It shows the benefits of implementing electronic exchange of documents between different local offices of one administration in the government sector, such as a municipality, and the way it is useful for implementing complex administrative services.

  5. QA programme documentation

    International Nuclear Information System (INIS)

    Scheibelt, L.

    1980-01-01

    The present paper deals with the following topics: The need for a documented Q.A. program; Establishing a Q.A. program; Q.A. activities; Fundamental policies; Q.A. policies; Quality objectives; Q.A. manual. (orig./RW)

  6. QUALITY IMPROVEMENT MODEL OF NURSING EDUCATION IN MUHAMMADIYAH UNIVERSITIES TOWARD COMPETITIVE ADVANTAGE

    Directory of Open Access Journals (Sweden)

    Abdul Aziz Alimul Hidayat

    2017-06-01

    Full Text Available Introduction: Most (90.6%) of nursing education in East Java was still of low quality (BAN-PT, 2012). This was because the quality improvement process in nursing education was generally conducted partially (random performance improvement). A possible solution was to identify a proper quality improvement model for nursing education toward competitive advantage. Method: This research used a survey to collect the data. The research sample was 16 Muhammadiyah Universities chosen using simple random sampling. The data were collected with questionnaires of 174 questions and a documentation study. Data were analysed using the Partial Least Squares (PLS) technique. Result: The profile of nursing education departments in Muhammadiyah Universities in Indonesia showed about 10 years since establishment, accreditation grade B, and on average more than three competing universities in the same city/regency. Based on the analysis of the quality improvement model of nursing education toward competitive advantage in Muhammadiyah Universities, quality improvement was directly affected by the focus on learning and the operational process through improved human resources management; the information system also directly affected quality improvement, as did the quality process components: leadership, human resources, focus on learning and operational process. Improvement of human resources was directly influenced by proper strategic planning, and strategic planning was directly influenced by leadership. Thus, in improving the quality of nursing education, the leadership role of the department, a proper information system, and improved human resources management must be implemented. Conclusion: The quality improvement model in nursing education was directly determined by the learning and operational process through human resources management, along with the information system, strategic planning factors, and leadership. The research finding could be developed in quality ...

  7. Importance of the documentation of the manual of quality and procedures handbook in the nuclear technology center

    International Nuclear Information System (INIS)

    Domech More, J.; Bolanos Hernandez, R.; Quitero Rosello, R.; Fernandez Rondon, M.; Milian Lorenzo, D.; Rodriguez Gual, M.

    1997-01-01

    This work presents the methodology used for the elaboration of the Quality Manual of the Nuclear Technology Center and of the technical Procedures Handbook for the execution of the Preliminary Safety Report of the Juragua Nuclear Power Plant, as well as the importance of this documentation for the work of the center.

  8. Nuclear power plants documentation system

    International Nuclear Information System (INIS)

    Schwartz, E.L.

    1991-01-01

    Since the amount of documents (type and quantity) necessary for the entire design of an NPP is very large, an overall and detailed identification, filing and retrieval system must be implemented. This is even more applicable to the FINAL QUALITY DOCUMENTATION of the plant, as stipulated by the IAEA Safety Codes and related guides. For this purpose a DOCUMENTATION MANUAL was developed, which describes in detail the aforementioned documentation system. Here we present the expected goals and results to be reached for the Angra 2 and 3 Project. (author)

  9. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  10. Quality models for audiovisual streaming

    Science.gov (United States)

    Thang, Truong Cong; Kim, Young Suk; Kim, Cheon Seog; Ro, Yong Man

    2006-01-01

    Quality is an essential factor in multimedia communication, especially in compression and adaptation. Quality metrics can be divided into three categories: within-modality quality, cross-modality quality, and multi-modality quality. Most research has so far focused on within-modality quality. Moreover, quality is normally considered only from the perceptual perspective. In practice, content may be drastically adapted, even converted to another modality. In this case, we should consider the quality from the semantic perspective as well. In this work, we investigate multi-modality quality from the semantic perspective. To model the semantic quality, we apply the concept of the "conceptual graph", which consists of semantic nodes and relations between the nodes. As a typical multi-modality example, we focus on an audiovisual streaming service. Specifically, we evaluate the amount of information conveyed by audiovisual content where both the video and audio channels may be strongly degraded, and audio may even be converted to text. In the experiments, we also consider the perceptual quality model of audiovisual content, so as to see the difference with the semantic quality model.

  11. Biomass Scenario Model Documentation: Data and References

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y.; Newes, E.; Bush, B.; Peterson, S.; Stright, D.

    2013-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model that represents the entire biomass-to-biofuels supply chain, from feedstock to fuel use. The BSM is a complex model that has been used for extensive analyses; the model and its results can be better understood if input data used for initialization and calibration are well-characterized. It has been carefully validated and calibrated against the available data, with data gaps filled in using expert opinion and internally consistent assumed values. Most of the main data sources that feed into the model are recognized as baseline values by the industry. This report documents data sources and references in Version 2 of the BSM (BSM2), which only contains the ethanol pathway, although subsequent versions of the BSM contain multiple conversion pathways. The BSM2 contains over 12,000 total input values, with 506 distinct variables. Many of the variables are opportunities for the user to define scenarios, while others are simply used to initialize a stock, such as the initial number of biorefineries. However, around 35% of the distinct variables are defined by external sources, such as models or reports. The focus of this report is to provide insight into which sources are most influential in each area of the supply chain.

  12. Regional technical cooperation model project, IAEA - RER/2/2004 'Quality control and quality assurance for nuclear analytical techniques'

    International Nuclear Information System (INIS)

    Arikan, P.

    2002-01-01

    An analytical laboratory should produce high quality analytical data through the use of analytical measurements that are accurate, reliable and adequate for the intended purpose. This objective can be accomplished in a cost-effective manner under a planned and documented quality system of activities. It is well known that serious deficiencies can occur in laboratory operations when insufficient attention is given to the quality of the work. It requires not only a thorough knowledge of the laboratory's purpose and operation, but also the dedication of the management and operating staff to standards of excellence. Laboratories employing nuclear and nuclear-related analytical techniques are sometimes confronted with performance problems which prevent them from becoming accepted and respected by clients, such as industry, government and regulatory bodies, and from being eligible for contracts. The International Standard ISO 17025 has been produced as the result of extensive experience in the implementation of ISO/IEC Guide 25:1990 and EN 45001:1989, and now replaces both of them. It contains all of the requirements that testing and calibration laboratories must meet if they wish to demonstrate that they operate a quality system that is technically competent, and are able to generate technically valid results. The use of ISO 17025 should facilitate cooperation between laboratories and other bodies to assist in the exchange of information and experience, and in the harmonization of standards and procedures. IAEA model project RER/2/004 entitled 'Quality Assurance/Quality Control in Nuclear Analytical Techniques' was initiated in 1999 as a Regional TC project in East European countries to assist Member State laboratories in the region to install a complete quality system according to the ISO/IEC 17025 standard. 12 laboratories from 11 countries plus the Agency's Laboratories in Seibersdorf have been selected as participants to undergo exercises and training with the ...

  13. A sediment resuspension and water quality model of Lake Okeechobee

    Science.gov (United States)

    James, R.T.; Martin, J.; Wool, T.; Wang, P.-F.

    1997-01-01

    The influence of sediment resuspension on the water quality of shallow lakes is well documented. However, a search of the literature reveals no deterministic mass-balance eutrophication models that explicitly include resuspension. We modified the Lake Okeechobee water quality model - which uses the Water Analysis Simulation Package (WASP) to simulate algal dynamics and phosphorus, nitrogen, and oxygen cycles - to include inorganic suspended solids and algorithms that: (1) define changes in depth with changes in volume; (2) compute sediment resuspension based on bottom shear stress; (3) compute partition coefficients for ammonia and ortho-phosphorus to solids; and (4) relate light attenuation to solids concentrations. The model calibration and validation were successful with the exception of dissolved inorganic nitrogen species, which did not correspond well to observed data in the validation phase. This could be attributed to an inaccurate formulation of algal nitrogen preference and/or the absence of nitrogen fixation in the model. The model correctly predicted that the lake is light-limited due to resuspended solids, and that algae are primarily nitrogen limited. The model simulation suggested that biological fluxes greatly exceed external loads of dissolved nutrients, and sediment-water interactions of organic nitrogen and phosphorus far exceed external loads. A sensitivity analysis demonstrated that parameters affecting resuspension, settling, sediment nutrient and solids concentrations, mineralization, algal productivity, and algal stoichiometry are factors requiring further study to improve our understanding of the Lake Okeechobee ecosystem.
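    For readers unfamiliar with shear-stress-driven resuspension, the short sketch below shows one common excess-shear-stress erosion law of the kind such algorithms use; the quadratic friction law, the critical shear stress and the erodibility constant are illustrative assumptions, not values from the Lake Okeechobee/WASP implementation.

      # Illustrative excess-shear-stress resuspension flux (assumed law and parameters).
      def bottom_shear_stress(rho_water, drag_coeff, velocity):
          """tau_b = rho * Cd * u^2 (quadratic friction law), in Pa."""
          return rho_water * drag_coeff * velocity ** 2

      def resuspension_flux(tau_b, tau_critical, erodibility):
          """Erosion flux E = M * max(0, tau_b / tau_c - 1), in g m^-2 s^-1."""
          return erodibility * max(0.0, tau_b / tau_critical - 1.0)

      tau_b = bottom_shear_stress(rho_water=1000.0, drag_coeff=0.003, velocity=0.15)
      print(f"bottom shear stress: {tau_b:.4f} Pa")
      print(f"resuspension flux:   {resuspension_flux(tau_b, 0.05, 0.5):.4f} g/m2/s")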

  14. Documentation on the development of the Swiss TIMES Electricity Model (STEM-E)

    International Nuclear Information System (INIS)

    Kannan, R.; Turton, H.

    2011-10-01

    This comprehensive report by the Paul Scherrer Institute PSI in Switzerland documents the development of the Swiss TIMES Electricity Model (STEM-E). This is a flexible model which explicitly depicts plausible pathways for the development of the Swiss electricity sector, while dealing with inter-temporal variations in demand and supply. TIMES is quoted as having the capability to portray the entire energy system from resource supply, through fuel processing, representation of infrastructures, conversion to secondary energy carriers, end-use technologies and energy service demands at end-use sectors. The background of the model's development and a reference energy system are described. Also, electricity end-use sectors and generating systems are examined, including hydropower, nuclear power, thermal generation and renewables. Environmental factors and the calibration of the model are discussed, as is the application of the model. The document is completed with an outlook, references and six appendices

  15. MELSAR: a mesoscale air quality model for complex terrain. Volume 2. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Allwine, K.J.; Whiteman, C.D.

    1985-04-01

    This final report is submitted as part of the Green River Ambient Model Assessment (GRAMA) project conducted at the US Department of Energy's Pacific Northwest Laboratory for the US Environmental Protection Agency. The GRAMA Program has, as its ultimate goal, the development of validated air quality models that can be applied to the complex terrain of the Green River Formation of western Colorado, eastern Utah and southern Wyoming. The Green River Formation is a geologic formation containing large reserves of oil shale, coal, and other natural resources. Development of these resources may lead to a degradation of the air quality of the region. Air quality models are needed immediately for planning and regulatory purposes to assess the magnitude of these regional impacts. This report documents one of the models being developed for this purpose within GRAMA - specifically a model to predict short averaging time (less than or equal to 24 h) pollutant concentrations resulting from the mesoscale transport of pollutant releases from multiple sources. MELSAR has not undergone any rigorous operational testing, sensitivity analyses, or validation studies. Testing and evaluation of the model are needed to gain a measure of confidence in the model's performance. This report consists of two volumes. This volume contains the Appendices, which include listings of the FORTRAN code; Volume 1 contains the model overview, technical description, and user's guide. 13 figs., 10 tabs.

  16. 3D MODELING FOR UNDERWATER ARCHAEOLOGICAL DOCUMENTATION: METRIC VERIFICATIONS

    Directory of Open Access Journals (Sweden)

    S. D’Amelio

    2015-04-01

    Full Text Available The survey in underwater environments has always presented considerable difficulties, both operative and technical, and this has sometimes made it difficult to use the survey techniques commonly employed for the documentation of Cultural Heritage in dry environments. This study concerns the evaluation, in terms of capability and accuracy, of the Autodesk 123D Catch software for the reconstruction of a three-dimensional model of an object in an underwater context. The subjects of the study are models generated from sets of photographs and from sets of frames extracted from video sequences. The study is based on a comparative method, using a reference model obtained with the laser scanner technique.

  17. Improving collaborative documentation in CMS

    International Nuclear Information System (INIS)

    Lassila-Perini, Kati; Salmi, Leena

    2010-01-01

    Complete and up-to-date documentation is essential for efficient data analysis in a large and complex collaboration like CMS. Good documentation reduces the time spent in problem solving for users and software developers. The scientists in our research environment do not necessarily have the interests or skills of professional technical writers. This results in inconsistencies in the documentation. To improve the quality, we have started a multidisciplinary project involving CMS user support and expertise in technical communication from the University of Turku, Finland. In this paper, we present possible approaches to study the usability of the documentation, for instance, usability tests conducted recently for the CMS software and computing user documentation.

  18. Documentation of pain care processes does not accurately reflect pain management delivered in primary care.

    Science.gov (United States)

    Krebs, Erin E; Bair, Matthew J; Carey, Timothy S; Weinberger, Morris

    2010-03-01

    Researchers and quality improvement advocates sometimes use review of chart-documented pain care processes to assess the quality of pain management. Studies have found that primary care providers frequently fail to document pain assessment and management. To assess documentation of pain care processes in an academic primary care clinic and evaluate the validity of this documentation as a measure of pain care delivered. Prospective observational study. 237 adult patients at a university-affiliated internal medicine clinic who reported any pain in the last week. Immediately after a visit, we asked patients to report the pain treatment they received. Patients completed the Brief Pain Inventory (BPI) to assess pain severity at baseline and 1 month later. We extracted documentation of pain care processes from the medical record and used kappa statistics to assess agreement between documentation and patient report of pain treatment. Using multivariable linear regression, we modeled whether documented or patient-reported pain care predicted change in pain at 1 month. Participants' mean age was 53.7 years, 66% were female, and 74% had chronic pain. Physicians documented pain assessment for 83% of visits. Patients reported receiving pain treatment more often (67%) than was documented by physicians (54%). Agreement between documentation and patient report was moderate for receiving a new pain medication (k = 0.50) and slight for receiving pain management advice (k = 0.13). In multivariable models, documentation of new pain treatment was not associated with change in pain (p = 0.134). In contrast, patient-reported receipt of new pain treatment predicted pain improvement (p = 0.005). Chart documentation underestimated pain care delivered, compared with patient report. Documented pain care processes had no relationship with pain outcomes at 1 month, but patient report of receiving care predicted clinically significant improvement. Chart review measures may not accurately
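    The two analyses named in this record (agreement measured with kappa statistics, and a multivariable regression of 1-month pain change on reported care) can be sketched as follows; the data, variable names and model form are invented for illustration and do not reproduce the study.

      # Illustrative sketch of the study's two analysis steps on synthetic data.
      import numpy as np
      import statsmodels.api as sm
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(1)
      n = 237
      documented_treatment = rng.integers(0, 2, n)   # chart: new pain treatment recorded
      reported_treatment = rng.integers(0, 2, n)     # patient: new pain treatment received
      baseline_bpi = rng.uniform(2, 8, n)            # baseline Brief Pain Inventory severity
      pain_change = rng.normal(0, 1, n) - 0.5 * reported_treatment   # toy 1-month outcome

      # Agreement between documentation and patient report.
      print("kappa:", round(cohen_kappa_score(documented_treatment, reported_treatment), 2))

      # Does (reported) treatment predict pain change, adjusting for baseline severity?
      X = sm.add_constant(np.column_stack([reported_treatment, baseline_bpi]))
      print(sm.OLS(pain_change, X).fit().params)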

  19. Lifecycle management for nuclear engineering project documents

    International Nuclear Information System (INIS)

    Zhang Li; Zhang Ming; Zhang Ling

    2010-01-01

    Nuclear engineering project documents are large in quantity and varied in type, the relationships between documents are complex, and document editions are updated frequently, so the documents are difficult to manage. Meanwhile, the safety of the project, and even nuclear safety, is seriously threatened by false documents and mistakes. In order to ensure the integrity, accuracy and validity of project documents, document lifecycle theory is applied to build the documents center, the records center, and the structure and database of a document lifecycle management system. Lifecycle management is applied to the documents of nuclear engineering projects from production to archiving, to satisfy the quality requirements of nuclear engineering projects. (authors)

  20. Effects of increased nurses’ workload on quality documentation of patient information at selected Primary Health Care facilities in Vhembe District, Limpopo Province

    Directory of Open Access Journals (Sweden)

    Rhulani C. Shihundla

    2016-05-01

    Full Text Available Background: Recording of information in multiple documents increases professional nurses' responsibilities and workload during working hours. There are multiple registers and books at Primary Health Care (PHC) facilities in which a patient's information is to be recorded for different services during a visit to a health professional. Antenatal patients coming for the first visit must be recorded in the following documents: tick register; Prevention of Mother-To-Child Transmission (PMTCT) register; consent form for HIV and AIDS testing; HIV Counselling and Testing (HCT) register (if tested positive for HIV and AIDS, this must then be recorded in the Antiretroviral Therapy (ART) wellness register); ART file with an accompanying single file, completion of which is time-consuming; tuberculosis (TB) suspects register; blood specimen register; maternity case record book and Basic Antenatal Care (BANC) checklist. Nurses forget to record information in some documents, which leads to the omission of important data. Omitting information might lead to mismanagement of patients. Some of the documents have incomplete and inaccurate information. As PHC facilities in Vhembe District render twenty-four-hour services through a call system, the same nurses are expected to resume duty at 07:00 the following morning. They are expected to work effectively, and when tired a nurse may record illegible information which may cause problems when the document is retrieved by the next person for continuity of care. Objectives: The objective of this study was to investigate and describe the effects of increased nurses' workload on quality documentation of patient information at PHC facilities in Vhembe District, Limpopo Province. Methods: The study was conducted in Vhembe District, Limpopo Province, where the effects of increased nurses' workload on quality documentation of information are currently experienced. The research design was explorative, descriptive and contextual in ...

  1. Development of quality assurance programme for prescribed ionizing radiation source testing. Recommendations

    International Nuclear Information System (INIS)

    1999-01-01

    The document gives guidance to those applying for a licence to perform ionizing radiation source acceptance tests and long-term stability tests, and provides information which should be known when introducing quality assurance systems in compliance with legislative requirements. It is envisaged that this document ('Recommendations') will form a basis for final Safety Guides to be issued by the State Office for Nuclear Safety, the Czech nuclear regulatory authority. The publication is organized as follows. Part 1 gives a glossary of basic terms in quality systems. Part 2 explains quality system principles, paying special attention to radiation safety issues, and describes the structure and scope of quality system documentation. Part 3 explains the individual elements of the quality system and gives practical examples. Part 4 deals with the quality assurance programme; using instructions and practical examples, this part shows how the quality system elements should be applied to long-term stability testing and acceptance testing. A model structure of 2nd degree documentation (guidelines) and a model testing protocol are given in annexes. (P.A.)

  2. Documentation, User Support, and Verification of Wind Turbine and Plant Models

    Energy Technology Data Exchange (ETDEWEB)

    Robert Zavadil; Vadim Zheglov; Yuriy Kazachkov; Bo Gong; Juan Sanchez; Jun Li

    2012-09-18

    As part of the Utility Wind Energy Integration Group (UWIG) and EnerNex's Wind Turbine Modeling Project, EnerNex has received ARRA (federal stimulus) funding through the Department of Energy (DOE) to further the progress of wind turbine and wind plant models. Despite the large existing and planned wind generation deployment, industry-standard models for wind generation have not been formally adopted. Models commonly provided for interconnection studies are not adequate for use in general transmission planning studies, where public, non-proprietary, documented and validated models are needed. NERC MOD (North American Electric Reliability Corporation) reliability standards require that power flow and dynamics models be provided, in accordance with regional requirements and procedures. The goal of this project is to accelerate the appropriate use of generic wind turbine models for transmission network analysis by: (1) Defining proposed enhancements to the generic wind turbine model structures that would allow representation of more advanced; (2) Comparative testing of the generic models against more detailed (and sometimes proprietary) versions developed by turbine vendors; (3) Developing recommended parameters for the generic models to best mimic the performance of specific commercial wind turbines; (4) Documenting results of the comparative simulations in an application guide for users; (5) Conducting technology transfer activities in regional workshops for dissemination of knowledge and information gained, and to engage electric power and wind industry personnel in the project while underway; (6) Designing of a "living" homepage to establish an online resource for transmission planners.

  3. Learning High-Order Filters for Efficient Blind Deconvolution of Document Photographs

    KAUST Repository

    Xiao, Lei

    2016-09-16

    Photographs of text documents taken by hand-held cameras can be easily degraded by camera motion during exposure. In this paper, we propose a new method for blind deconvolution of document images. Observing that document images are usually dominated by small-scale high-order structures, we propose to learn a multi-scale, interleaved cascade of shrinkage fields model, which contains a series of high-order filters to facilitate joint recovery of blur kernel and latent image. With extensive experiments, we show that our method produces high quality results and is highly efficient at the same time, making it a practical choice for deblurring high resolution text images captured by modern mobile devices. © Springer International Publishing AG 2016.

  4. Review and comparison of quality standards, guidelines and regulations for laboratories

    Directory of Open Access Journals (Sweden)

    Tjeerd A.M. Datema

    2011-12-01

    Full Text Available Background: The variety and number of laboratory quality standards, guidelines and regulations (hereafter: quality documents) makes it difficult to choose the most suitable one for establishing and maintaining a laboratory quality management system. Objectives: There is a need to compare the characteristics, suitability and applicability of quality documents in view of the increasing efforts to introduce quality management in laboratories, especially in clinical diagnostic laboratories in low income and middle income countries. This may provide valuable insights for policy makers developing national laboratory policies, and for laboratory managers and quality officers in choosing the most appropriate quality document for upgrading their laboratories. Method: We reviewed the history of quality document development and then selected a subset based on their current use. We analysed these documents following a framework for comparison of quality documents that was adapted from the Clinical Laboratory Standards Institute guideline GP26 Quality management system model for clinical laboratory services. Results: Differences were identified between national and international, and non-clinical and clinical quality documents. The most salient findings were the absence of provisions on occurrence management and customer service in almost all non-clinical quality documents, a low number of safety requirements aimed at protecting laboratory personnel in international quality documents and no requirements regarding ethical behaviour in almost all quality documents. Conclusion: Each laboratory needs to investigate whether national regulatory standards are present. These are preferred as they most closely suit the needs of laboratories in the country. A laboratory should always use both a standard and a guideline: a standard sums up the requirements for a quality management system, a guideline describes how quality management can be integrated in the laboratory

  5. Limited Documentation and Treatment Quality of Glycemic Inpatient Care in Relation to Structural Deficits of Heterogeneous Insulin Charts at a Large University Hospital.

    Science.gov (United States)

    Kopanz, Julia; Lichtenegger, Katharina M; Sendlhofer, Gerald; Semlitsch, Barbara; Cuder, Gerald; Pak, Andreas; Pieber, Thomas R; Tax, Christa; Brunner, Gernot; Plank, Johannes

    2018-02-09

    Insulin charts represent a key component in the inpatient glycemic management process. The aim was to evaluate the quality of structure, documentation, and treatment of diabetic inpatient care to design a new standardized insulin chart for a large university hospital setting. Historically grown blank insulin charts in use at 39 general wards were collected and evaluated for quality structure features. Documentation and treatment quality were evaluated in a consecutive snapshot audit of filled-in charts. The primary end point was the percentage of charts with any medication error. Overall, 20 different blank insulin charts with variable designs and significant structural deficits were identified. A medication error occurred in 55% of the 102 audited filled-in insulin charts, consisting of prescription and management errors in 48% and 16%, respectively. Charts of insulin-treated patients had more medication errors relative to patients treated with oral medication (P < ...). Based on international standards, a new insulin chart was developed to overcome these quality hurdles.

  6. Person-centred care in nursing documentation.

    LENUS (Irish Health Repository)

    Broderick, Margaret C

    2012-12-07

    BACKGROUND: Documentation is an essential part of nursing. It provides evidence that care has been carried out and contains important information to enhance the quality and continuity of care. Person-centred care (PCC) is an approach to care that is underpinned by mutual respect and the development of a therapeutic relationship between the patient and nurse. It is a core principle in standards for residential care settings for older people and is beneficial for both patients and staff (International Practice Development in Nursing and Healthcare, Chichester, Blackwell, 2008 and The Implementation of a Model of Person-Centred Practice in Older Person Settings, Dublin, Health Service Executive, 2010a). However, the literature suggests a lack of person-centredness within nursing documentation (International Journal of Older People Nursing 2, 2007, 263 and The Implementation of a Model of Person-Centred Practice in Older Person Settings, Dublin, Health Service Executive, 2010a). AIMS AND OBJECTIVES: To explore nursing documentation in long-term care, to determine whether it reflected a person-centred approach to care and to describe aspects of PCC as they appeared in nursing records. METHOD: A qualitative descriptive study using the PCN framework (Person-centred Nursing; Theory and Practice, Oxford, Wiley-Blackwell, 2010) as the context through which nursing assessments and care plans were explored. RESULTS: Findings indicated that many nursing records were incomplete, and information regarding psychosocial aspects of care was infrequent. There was evidence that nurses engaged with residents and worked with their beliefs and values. However, nursing documentation was not completed in consultation with the patient, and there was little to suggest that patients were involved in decisions relating to their care. IMPLICATIONS FOR PRACTICE: The structure of nursing documentation can be a major obstacle to the recording of PCC and appropriate care planning. Documentation

  7. The perspective awareness model - Eliciting multiple perspectives to formulate high quality decisions

    International Nuclear Information System (INIS)

    Boucher, Laurel

    2013-01-01

    A great deal of attention is given to the importance of communication in environmental remediation and radioactive waste management. However, very little attention is given to eliciting multiple perspectives so as to formulate high quality decisions. Plans that are based on a limited number of perspectives tend to be narrowly focused, whereas those that are based on a wide variety of perspectives tend to be comprehensive, of higher quality, and more apt to be put into application. In addition, existing methods of dialogue have built-in limitations in that they typically draw from the predominant thinking patterns, which focus on some areas but ignore others. This can result in clarity but a lack of comprehensiveness. This paper presents a Perspective Awareness Model which helps groups such as partnering teams, interagency teams, steering committees, and working groups elicit a wide range of perspectives and viewpoints. The paper begins by describing five factors that make cooperation among such groups challenging. Next, a Perspective Awareness Model that makes it possible to manage these five factors is presented. The two primary components of this model --- the eight 'Thinking Directions' and the 'Shared Documentation' --- are described in detail. Several examples are given to illustrate how the Perspective Awareness Model can be used to elicit multiple perspectives to formulate high quality decisions in the area of environmental remediation and radioactive waste management. (authors)

  8. Eigenvector space model to capture features of documents

    Directory of Open Access Journals (Sweden)

    Choi DONGJIN

    2011-09-01

    Full Text Available Eigenvectors are a special set of vectors associated with a linear system of equations. Because of their special properties, eigenvectors have been used widely in computer vision. When eigenvectors are applied to the information retrieval field, it is possible to obtain properties of a document corpus. To capture the properties of given documents, this paper conducts simple experiments to show that eigenvectors can also be used in document analysis. For the experiment, we use the short abstract documents of Wikipedia provided by DBpedia as the document corpus. To build the original square matrix, the popular tf-idf measure is used. After calculating the eigenvectors of the original matrix, each vector is plotted in a 3D graph to find what the eigenvector means in document processing.
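    A minimal sketch of such an eigenvector analysis is given below. Because a term-document tf-idf matrix is not square in general, the sketch assumes that the square matrix is a document-document cosine-similarity matrix built from tf-idf vectors; the paper's exact matrix construction may differ.

      # Assumed construction: tf-idf vectors -> document-document similarity -> eigenvectors.
      import numpy as np
      from sklearn.feature_extraction.text import TfidfVectorizer

      docs = [
          "quality model for software documentation",
          "eigenvector analysis of a document corpus",
          "documentation quality in nursing records",
          "information retrieval with document models",
      ]

      tfidf = TfidfVectorizer().fit_transform(docs)           # documents x terms, L2-normalised
      similarity = (tfidf @ tfidf.T).toarray()                # square document-document matrix

      eigenvalues, eigenvectors = np.linalg.eigh(similarity)  # symmetric matrix, so eigh is fine
      order = np.argsort(eigenvalues)[::-1]

      coords = eigenvectors[:, order[:3]]                     # coordinates for a 3D plot
      for doc, xyz in zip(docs, coords):
          print(xyz.round(3), doc)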

  9. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…
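    Since this abstract breaks off after naming the analytic hierarchy process (AHP), the sketch below only illustrates the standard AHP step of deriving criterion weights as the normalised principal eigenvector of a pairwise comparison matrix, together with the usual consistency check; the criteria and pairwise judgements are invented for illustration and are not taken from the HEQAM study.

      # Generic AHP weighting sketch (invented criteria and judgements).
      import numpy as np

      criteria = ["teaching", "e-services", "infrastructure", "administration"]
      # A[i, j]: how much more important criterion i is than criterion j (Saaty scale).
      A = np.array([
          [1.0, 3.0, 5.0, 4.0],
          [1/3, 1.0, 2.0, 2.0],
          [1/5, 1/2, 1.0, 1.0],
          [1/4, 1/2, 1.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
      cr = ci / 0.90                         # random index for n = 4 is about 0.90
      for c, w in zip(criteria, weights):
          print(f"{c:15s} {w:.3f}")
      print(f"consistency ratio: {cr:.3f}")  # below ~0.10 is conventionally acceptable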

  10. Contract models for public power distribution; Modeles de documents contractuels pour la distribution publique d'electricite

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This document gathers several models of contractual documents relative to the public distribution of electricity: grant conventions (for a syndicate of towns or for a single town), grant technical specifications (general dispositions, works relative to the granted network, services to users, tariffing, grant completion, various dispositions, local modalities between the granting authority and the grantee, third-party participation in connection costs, electricity purchase and sale prices, general delivery conditions for low-power deliveries). (J.S.)

  11. 78 FR 2394 - Release of Draft Document Related to the Review of the National Ambient Air Quality Standards for...

    Science.gov (United States)

    2013-01-11

    ... Quality Standards, External Review Draft (PA). This document was prepared as part of the current review of... "anonymous access" system, which means the EPA will not know your identity or contact information unless you... extent of all identifiable effects on public health or welfare which may be expected from the presence of...

  12. No Reference Video-Quality-Assessment Model for Monitoring Video Quality of IPTV Services

    Science.gov (United States)

    Yamagishi, Kazuhisa; Okamoto, Jun; Hayashi, Takanori; Takahashi, Akira

    Service providers should monitor the quality of experience of a communication service in real time to confirm its status. To do this, we previously proposed a packet-layer model that can be used for monitoring the average video quality of typical Internet protocol television content using parameters derived from transmitted packet headers. However, it is difficult to monitor the video quality per user using the average video quality because video quality depends on the video content. To accurately monitor the video quality per user, a model that can be used for estimating the video quality per video content rather than the average video quality should be developed. Therefore, to take into account the impact of video content on video quality, we propose a model that calculates the difference in video quality between the video quality of the estimation-target video and the average video quality estimated using a packet-layer model. We first conducted extensive subjective quality assessments for different codecs and video sequences. We then model their characteristics based on parameters related to compression and packet loss. Finally, we verify the performance of the proposed model by applying it to unknown data sets different from the training data sets used for developing the model.
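    The per-content correction idea can be sketched as follows: an average quality is first estimated from packet-header parameters, and a content-dependent difference term is then added on top. The functional forms and coefficients below are invented for illustration; the published model's equations and trained parameters differ.

      # Assumed shapes and made-up coefficients, for illustration only.
      import math

      def average_quality_packet_layer(bitrate_kbps, packet_loss_rate):
          """Average MOS for typical content, from packet-header parameters only."""
          coding_q = 1.0 + 3.8 * (1.0 - math.exp(-bitrate_kbps / 1500.0))  # saturates with bitrate
          loss_penalty = 2.5 * (1.0 - math.exp(-35.0 * packet_loss_rate))  # grows with loss
          return max(1.0, coding_q - loss_penalty)

      def per_content_quality(bitrate_kbps, packet_loss_rate, content_complexity):
          """Add a content-dependent difference term; complexity in [0, 1], where more
          complex content loses more quality at a given bitrate."""
          base = average_quality_packet_layer(bitrate_kbps, packet_loss_rate)
          difference = -1.2 * (content_complexity - 0.5) * math.exp(-bitrate_kbps / 3000.0)
          return min(5.0, max(1.0, base + difference))

      print(round(per_content_quality(2000, 0.001, content_complexity=0.8), 2))
      print(round(per_content_quality(2000, 0.001, content_complexity=0.2), 2))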

  13. Building America Best Practices Series Volume 8: Builders Challenge Quality Criteria Support Document

    Energy Technology Data Exchange (ETDEWEB)

    Baechler, Michael C.; Bartlett, Rosemarie; Gilbride, Theresa L.

    2010-11-01

    The U.S. Department of Energy (DOE) has posed a challenge to the homebuilding industry—to build 220,000 high-performance homes by 2012. Through the Builders Challenge, participating homebuilders will have an easy way to differentiate their best energy-performing homes from other products in the marketplace, and to make the benefits clear to buyers. This document was prepared by Pacific Northwest National Laboratory for DOE to provide guidance to U.S. home builders who want to accept the challenge. To qualify for the Builders Challenge, a home must score 70 or less on the EnergySmart Home Scale (E-Scale). The E-scale is based on the well-established Home Energy Rating System (HERS) index, developed by the Residential Energy Services Network (RESNET). The E-scale allows homebuyers to understand – at a glance – how the energy performance of a particular home compares with the performance of others. To learn more about the index and HERS Raters, visit www.natresnet.org. Homes also must meet the Builders Challenge criteria described in this document. To help builders meet the Challenge, guidance is provided in this report for each of the 29 criteria. Included with the guidance for each criterion are resources for more information and references for relevant codes and standards. The Builders Challenge Quality Criteria were originally published in Dec. 2008. They were revised and published as PNNL-18009 Rev 1.2 in Nov. 2009. This is version 1.3, published Nov 2010. Changes from the Nov 2009 version include adding a title page and updating the Energy Star windows criteria to the Version 5.0 criteria approved April 2009 and effective January 4, 2010. This document and other information about the Builders Challenge are available online at www.buildingamerica.gov/challenge.

  14. Forensic document analysis using scanning microscopy

    Science.gov (United States)

    Shaffer, Douglas K.

    2009-05-01

    The authentication and identification of the source of a printed document(s) can be important in forensic investigations involving a wide range of fraudulent materials, including counterfeit currency, travel and identity documents, business and personal checks, money orders, prescription labels, travelers checks, medical records, financial documents and threatening correspondence. The physical and chemical characterization of document materials - including paper, writing inks and printed media - is becoming increasingly relevant for law enforcement agencies, with the availability of a wide variety of sophisticated commercial printers and copiers which are capable of producing fraudulent documents of extremely high print quality, rendering these difficult to distinguish from genuine documents. This paper describes various applications and analytical methodologies using scanning electron microscopy/energy-dispersive (X-ray) spectroscopy (SEM/EDS) and related technologies for the characterization of fraudulent documents, and illustrates how their morphological and chemical profiles can be compared to (1) authenticate and (2) link forensic documents with a common source(s) in their production history.

  15. Emissions inventories for urban airshed model application in the Philadelphia AQCR (Air Quality Control Region)

    Energy Technology Data Exchange (ETDEWEB)

    1982-04-01

    This report documents the procedures used to develop the emissions input required by the Urban Airshed photochemical oxidant model. Ambient air quality data were gathered as part of another effort during the summer of 1979 in Philadelphia, to be used in the model validation effort. For 1979 and the 1987 projection year, ES compiled hour-by-hour emissions data for a representative weekday in the oxidant season. The pollutants inventoried are five categories of VOC required by the Airshed model, four categories of VOC defined in RAPS, NO, NO2, CO, SO2, and TSP. Point and area sources were considered, with the highway vehicle portion of the inventory being subcontracted to DVRPC. County-level area source data were allocated to a 502-cell grid system. Projections were made so that ozone air quality in 1987 could be investigated. ES developed annualized EIS/P&R data and data files containing temporal and VOC/NOx profiles in order to generate the data packets required by the Airshed model.
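    The gridding step mentioned above (allocating county-level area-source emissions to grid cells and hours) amounts to multiplying a county total by a spatial surrogate share and an hourly profile fraction. The sketch below shows that arithmetic; the surrogate shares, the hourly profile and the emission total are invented example values, not data from the Philadelphia inventory.

      # Illustrative allocation of a county's daily area-source emissions to cells and hours.
      county_daily_voc = 12.5   # tons/day for one county (example value)

      # Spatial surrogate: fraction of county activity (e.g. population) in each grid cell.
      cell_shares = {"cell_101": 0.40, "cell_102": 0.35, "cell_103": 0.25}

      # Temporal profile: fraction of the daily total emitted in each of 24 hours (sums to 1).
      hourly_profile = [0.025] * 8 + [0.05] * 16

      assert abs(sum(cell_shares.values()) - 1.0) < 1e-9
      assert abs(sum(hourly_profile) - 1.0) < 1e-9

      gridded = {cell: [county_daily_voc * share * f for f in hourly_profile]
                 for cell, share in cell_shares.items()}
      print(round(sum(gridded["cell_101"]), 3))   # daily VOC total allocated to cell_101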

  16. a Historical Timber Frame Model for Diagnosis and Documentation Before Building Restoration

    Science.gov (United States)

    Koehl, M.; Viale, A.; Reeb, S.

    2013-09-01

    The aim of the project described in this paper was to define a four-level timber frame survey mode for a historical building: the so-called "Andlau's Seigniory", Alsace, France. This historical building (domain) was built in the late XVIth century and is now in a stage of renovation in order to become a heritage interpretation centre. The measurement methods used combine total station measurements, photogrammetry and 3D terrestrial laser scanning. Different modelling workflows were tested and compared according to the data acquisition method, but also according to the characteristics of the reconstructed model in terms of accuracy and level of detail. 3D geometric modelling of the entire structure was performed, including modelling at a degree of detail adapted to the needs. The described 3D timber framework now exists in different versions, from a theoretical and geometrical one up to a very detailed one, in which measurements and the evaluation of deformation over time are possible. The virtually generated models, involving archaeologists, architects, historians and specialists in historical crafts, are intended to be used during the four stages of the project: (i) knowledge of the current state of needs for diagnosis and understanding of former construction techniques; (ii) preparation and evaluation of restoration steps; (iii) knowledge and documentation concerning the archaeological object; (iv) transmission and dissemination of knowledge through the implementation of museum animations. Among the generated models we can also find a documentation of the site in the form of virtual tours created from panoramic photographs before and during the restoration works. Finally, the timber framework model was structured and integrated into a 3D GIS, where the association of descriptive and complementary digital documents was possible. Both offer tools leading to the diagnosis, the understanding of the structure, knowledge dissemination, documentation and the ...

  17. The Influence of Syntactic Quality on Pragmatic Quality of Enterprise Process Models

    Directory of Open Access Journals (Sweden)

    Merethe Heggset

    2015-12-01

    Full Text Available As approaches and tools for process and enterprise modelling are maturing, these techniques are being taken into use on a large scale in an increasing number of organizations. In this paper we report on the use of process modelling in connection with the quality system of Statoil, a large Norwegian oil company, and in particular on the aspects found necessary to emphasize to achieve appropriate quality of the models in this organization. Based on the investigation of usage statistics and user feedback on models, we have identified that there are problems in comprehending some of the models. Some of these models have poorer syntactic quality than the average syntactic quality of models of the same size. An experiment with improving syntactic quality on some of these models has given mixed results, and it appears that certain syntactic errors hinder comprehension more than others.

  18. U.S. Department of Energy, Carlsbad Area Office quality assurance program document. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    The mission of the Carlsbad Area Office (CAO) is to protect human health and the environment by opening and operating the Waste Isolation Pilot Plant (WIPP) for safe disposal of TRU waste, and establishing an effective system for management of TRU waste from generation to disposal. To help in fulfilling this mission and to ensure that risks and environmental impacts are identified and minimized, and that safety, reliability, and performance are optimized, CAO's policy is to establish and maintain an effective quality assurance (QA) program that supports compliance with applicable Federal, State, and local regulations, and DOE orders and requirements. This document establishes QA program requirements for all programs, projects, and activities sponsored by CAO.

  19. Synthesis document on the long-term behavior of packages: operational document 'bituminous' 2004

    International Nuclear Information System (INIS)

    Tiffreau, C.

    2004-09-01

    This document was produced in the framework of the 1991 law on radioactive waste management. The 2004 synthesis document on the long-term behavior of bituminous sludge packages consists of two documents, the reference document and the operational document. This paper presents the operational model describing the alteration of the packages by water and the associated release of radioelements, as well as the gas source term and the swelling associated with self-irradiation and radiolysis of the bitumen. (A.L.B.)

  20. Multiple sclerosis documentation system (MSDS): moving from documentation to management of MS patients.

    Science.gov (United States)

    Ziemssen, Tjalf; Kempcke, Raimar; Eulitz, Marco; Großmann, Lars; Suhrbier, Alexander; Thomas, Katja; Schultheiss, Thorsten

    2013-09-01

    The long disease duration of multiple sclerosis and the increasing therapeutic options require an individualized therapeutic approach which should be carefully documented over years of observation. To move from MS documentation to innovative MS management, new computer- and internet-based tools can be implemented, as we demonstrate with the novel computer-based patient management system "multiple sclerosis management system 3D" (MSDS 3D). MSDS 3D allows documentation and management of visit schedules and mandatory examinations via defined study modules by integrating data input from various sources (patients, attending physicians and MS nurses). It provides forms for the documentation of patient visits as well as clinical and diagnostic findings. Information can be collected via interactive touch screens. Specific modules allow the management of highly efficacious treatments such as natalizumab or fingolimod. MSDS can be used to transfer the documented data to databases such as the registry of the German MS society or REGIMS. MSDS has already been implemented successfully in clinical practice and is currently being evaluated in a multicenter setting. High-quality management and documentation are crucial for improvements in clinical practice and research work.

  1. Air quality dispersion models from energy sources

    International Nuclear Information System (INIS)

    Lazarevska, Ana

    1996-01-01

    Along with the continuing development of new air quality models that cover more complex problems, the Clean Air Act, legislated by the US Congress, encouraged consistency and standardization of air quality model applications. As a result, the Guidelines on Air Quality Models were published, which are regularly reviewed by the Office of Air Quality Planning and Standards, EPA. These guidelines provide a basis for estimating the air quality concentrations used in assessing control strategies as well as defining emission limits. This paper presents a review and analysis of the recent versions of the models: Simple Terrain Stationary Source Model; Complex Terrain Dispersion Model; Ozone, Carbon Monoxide and Nitrogen Dioxide Models; Long Range Transport Model; Other Phenomena Models: Fugitive Dust/Fugitive Emissions, Particulate Matter, Lead, Air Pathway Analyses - Air Toxics as well as Hazardous Waste. 8 refs., 4 tabs., 2 ills

  2. Multisource data fusion for documenting archaeological sites

    Science.gov (United States)

    Knyaz, Vladimir; Chibunichev, Alexander; Zhuravlev, Denis

    2017-10-01

    The quality of archaeological site documentation is of great importance for preserving and investigating cultural heritage. Progress in developing new techniques and systems for data acquisition and processing creates an excellent basis for achieving a new quality of archaeological site documentation and visualization. Archaeological data have some specific features which have to be taken into account when they are acquired, processed and managed. First of all, it is necessary to gather information about findings that is as complete as possible, with no loss of information and no damage to artifacts. Remote sensing technologies are the most adequate and powerful means of satisfying this requirement. An approach to archaeological data acquisition and fusion based on remote sensing is proposed. It combines a set of photogrammetric techniques for obtaining geometrical and visual information at different scales and levels of detail with a pipeline for archaeological data documentation, structuring, fusion, and analysis. The proposed approach is applied to the documentation of the Bosporus archaeological expedition of the Russian State Historical Museum.

  3. Mental Status Documentation: Information Quality and Data Processes.

    Science.gov (United States)

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk assessment, prevention, identification, communication and mitigation of harm.

  4. Documenting the 'terroir' aspects of award-winning Danish conserves: a model for the development of authentic food products

    DEFF Research Database (Denmark)

    Christensen, Laurids Siig; Hoorfar, Jeffrey; Sorensen, J.

    2012-01-01

    An example of the qualities embedded in the foods produced by small companies based on small Danish islands, and reflections on the association between the qualities of the products and their geographical origin, is presented. This case study discusses how it can be ensured that the qualities in the products that can be tasted and documented truly reflect geographical origin, and ways in which authenticity can be guarded and exploited in marketing while at the same time increasing production volume.

  5. Does teaching of documentation of shoulder dystocia delivery through simulation result in improved documentation in real life?

    Science.gov (United States)

    Comeau, Robyn; Craig, Catherine

    2014-03-01

    Documentation of deliveries complicated by shoulder dystocia is a valuable communication skill necessary for residents to attain during residency training. Our objective was to determine whether the teaching of documentation of shoulder dystocia in a simulation environment would translate to improved documentation of the event in an actual clinical situation. We conducted a cohort study involving obstetrics and gynaecology residents in years 2 to 5 between November 2010 and December 2012. Each resident participated in a shoulder dystocia simulation teaching session and was asked to write a delivery note immediately afterwards. They were given feedback regarding their performance of the delivery and their documentation of the events. Following this, dictated records of shoulder dystocia deliveries immediately before and after the simulation session were identified through the Meditech system. An itemized checklist was used to assess the quality of residents' dictated documentation before and after the simulation session. All 18 eligible residents enrolled in the study, and 17 met the inclusion criteria. For 10 residents (59%) documentation of a delivery with shoulder dystocia was present both before and after the simulation session, for five residents (29%) it was only present before the session, and for two residents (12%) it was only present after the session. When residents were assessed as a group, there were no differences in the proportion of residents recording items on the checklist before and after the simulation session (P > 0.05 for all). Similarly, analysis of the performance of the 10 residents who had dictated documentation both before and after the session showed no differences in the number of elements recorded on dictations done before and after the simulation session (P > 0.05 for all). The teaching of shoulder dystocia documentation through simulation did not result in a measurable improvement in the quality of documentation of shoulder dystocia in actual clinical practice.

  6. Study on Software Quality Improvement based on Rayleigh Model and PDCA Model

    OpenAIRE

    Ning Jingfeng; Hu Ming

    2013-01-01

    As the software industry gradually matures, software quality is regarded as the life of a software enterprise. This article discusses how to improve software quality by applying the Rayleigh model and the PDCA model to software quality management. Combined with the defect removal effectiveness index, the PDCA model is used to address quality management objectives when the Rayleigh model is applied in bidirectional quality improvement strategies of software quality management, a...
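
    As a rough illustration of the Rayleigh defect model mentioned above, the following sketch computes the expected number of defects discovered per period from a total defect estimate and the time of peak discovery. This is a minimal Python sketch using the common Putnam-style Rayleigh form; the defect count and peak month in the example are hypothetical, not values from the article.

```python
import math

def rayleigh_cumulative_defects(t, total_defects, peak_time):
    """Cumulative defects expected to be discovered by time t under a Rayleigh curve.

    total_defects -- K, the total number of defects expected over the project life
    peak_time     -- t_m, the time at which the defect discovery rate peaks
    """
    b = 1.0 / (2.0 * peak_time ** 2)
    return total_defects * (1.0 - math.exp(-b * t ** 2))

def defects_found_in_period(t0, t1, total_defects, peak_time):
    """Defects expected to surface in the interval [t0, t1)."""
    return (rayleigh_cumulative_defects(t1, total_defects, peak_time)
            - rayleigh_cumulative_defects(t0, total_defects, peak_time))

# Hypothetical project: 500 expected defects, discovery rate peaking in month 6.
for month in range(1, 13):
    print(month, round(defects_found_in_period(month - 1, month, 500, 6), 1))
```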

  7. European Guidelines for Quality Assurance in Cervical Cancer Screening. Second edition--summary document.

    Science.gov (United States)

    Arbyn, M; Anttila, A; Jordan, J; Ronco, G; Schenck, U; Segnan, N; Wiener, H; Herbert, A; von Karsa, L

    2010-03-01

    European Guidelines for Quality Assurance in Cervical Cancer Screening have been initiated in the Europe Against Cancer Programme. The first edition established the principles of organised population-based screening and stimulated numerous pilot projects. The second multidisciplinary edition was published in 2008 and comprises approximately 250 pages divided into seven chapters prepared by 48 authors and contributors. Considerable attention has been devoted to organised, population-based programme policies which minimise adverse effects and maximise benefits of screening. It is hoped that this expanded guidelines edition will have a greater impact on countries in which screening programmes are still lacking and in which opportunistic screening has been preferred in the past. Other methodological aspects such as future prospects of human papillomavirus testing and vaccination in cervical cancer control have also been examined in the second edition; recommendations for integration of the latter technologies into European guidelines are currently under development in a related project supported by the European Union Health Programme. An overview of the fundamental points and principles that should support any quality-assured screening programme and key performance indicators are presented here in a summary document of the second guidelines edition in order to make these principles and standards known to a wider scientific community.

  8. Modelling air quality according to INSPIRE data specifications, ISO standards and national regulations

    Directory of Open Access Journals (Sweden)

    Pachelski Wojciech

    2017-12-01

    Full Text Available Protection of the environment is an activity of many institutions, organizations and communities from global to regional and local scales. Any activity in this area needs structured database records, using advanced methodology given, among others, in INSPIRE documents, ISO standards of the 19100 series, and national regulations. The goal of this paper is to analyse both the legal provisions related to air quality and the data sources associated with the prevention of air pollution. Furthermore, a UML application schema of the spatial data related to air protection is proposed for use by urban planners. An overview of the methodology of geographic information is also given, including the Unified Modelling Language (UML) and the basic concepts of conceptual models within the INSPIRE project. The study is based on the relevant literature and documents, on expert knowledge gained through urban planning practice, and on an analysis of spatial planning regulations. The UML application schema for different aspects related to air protection, as presented in this paper, is an example of how to apply the methodology in other fields of environmental protection as well. Spatial planners know how to improve air quality, but in the present state of the law they often suffer from a lack of planning tools for real actions. In spatial planners' work, an important issue is the availability of data that allow a thorough analysis of the area.

  9. Spatial data quality and coastal spill modelling

    International Nuclear Information System (INIS)

    Li, Y.; Brimicombe, A.J.; Ralphs, M.P.

    1998-01-01

    Issues of spatial data quality are central to the whole oil spill modelling process. Both model and data quality performance issues should be considered as indispensable parts of a complete oil spill model specification and testing procedure. This paper presents initial results of research that will emphasise to modeler and manager alike the practical issues of spatial data quality for coastal oil spill modelling. It is centred around a case study of Jiao Zhou Bay in the People's Republic of China. The implications for coastal oil spill modelling are discussed and some strategies for managing the effects of spatial data quality in the outputs of oil spill modelling are explored. (author)

  10. Continuous Release-Emergency Response Notification System and Priority Assessment Model: Model documentation

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of the model documentation is to provide a detailed description of the modeling and risk analysis procedures used in CR-ERNS/PAM to assist OSCs and other Superfund decision-makers in interpreting the system results. PAM is a screening-level model; to properly interpret PAM's outputs, the user must understand the limitations and uncertainties in the equations and data used to generate these results. Chapter 2 presents the system's fate and transport models and describes the assumptions associated with these equations. Chapter 3 describes PAM's auxiliary data bases and provides the source(s) of each parameter and the methods by which values were selected. Chapter 4 explains the methods and exposure assumptions used to estimate exposures to hazardous substances and to evaluate the risks and hazards associated with these exposures. Chapter 5 presents examples of reports generated by PAM and explains the meaning of the 'flags' assigned to hazardous substances, media, and facilities. Appendix A contains versions of the fate and transport equations used for radionuclides. Appendix B contains copies of PAM's reports

  11. Model documentation, Coal Market Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This report documents the objectives and the conceptual and methodological approach used in the development of the National Energy Modeling System's (NEMS) Coal Market Module (CMM) used to develop the Annual Energy Outlook 1998 (AEO98). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of CMM's two submodules. These are the Coal Production Submodule (CPS) and the Coal Distribution Submodule (CDS). CMM provides annual forecasts of prices, production, and consumption of coal for NEMS. In general, the CDS integrates the supply inputs from the CPS to satisfy demands for coal from exogenous demand models. The international area of the CDS forecasts annual world coal trade flows from major supply to major demand regions and provides annual forecasts of US coal exports for input to NEMS. Specifically, the CDS receives minemouth prices produced by the CPS, demand and other exogenous inputs from other NEMS components, and provides delivered coal prices and quantities to the NEMS economic sectors and regions.

  12. A review of hydrological/water-quality models

    Directory of Open Access Journals (Sweden)

    Liangliang GAO,Daoliang LI

    2014-12-01

    Full Text Available Water quality models are important in predicting changes in surface water quality for environmental management. A range of water quality models are widely used, but every model has its advantages and limitations for specific situations. The aim of this review is to provide a guide for researchers in selecting a suitable water quality model. Eight well-known water quality models were selected for this review: SWAT, WASP, QUALs, MIKE 11, HSPF, CE-QUAL-W2, ELCOM-CAEDYM and EFDC. Each model is described according to its intended use, development, simulation elements, basic principles and applicability (e.g., for rivers, lakes, reservoirs and estuaries). Currently, the most important trends for future model development are: (1) combination models: individual models cannot completely solve complex situations, so combined models are needed to obtain the most appropriate results; (2) application of artificial intelligence: mechanistic models combined with non-mechanistic models will provide more accurate results because of the realistic parameters derived from non-mechanistic models; and (3) integration with remote sensing, geographical information and global positioning systems (3S): 3S can solve problems requiring large amounts of data.

  13. Synthesis document on the long life behavior of packages: reference operational document ''CSD-C'' 2004

    International Nuclear Information System (INIS)

    Helie, M.

    2004-12-01

    This document is realized in the framework of the 1991 law on radioactive waste management. The 2004 synthesis document on the long-term behavior of standard packages of compacted wastes consists of two documents, the reference document and the operational document. This paper presents the operational model describing package alteration by water and the associated radionuclide release. (A.L.B.)

  14. Quality improvement in healthcare delivery utilizing the patient-centered medical home model.

    Science.gov (United States)

    Akinci, Fevzi; Patel, Poonam M

    2014-01-01

    Despite the fact that the United States dedicates so much of its resources to healthcare, the current healthcare delivery system still faces significant quality challenges. The lack of effective communication and coordination of care services across the continuum of care poses disadvantages for those requiring long-term management of their chronic conditions. This is why the new transformation in healthcare known as the patient-centered medical home (PCMH) can help restore confidence in our population that the healthcare services they receive are of the utmost quality and will effectively enhance their quality of life. Healthcare using the PCMH model is delivered with the patient at the center of the transformation and by reinvigorating primary care. The PCMH model strives to deliver effective quality care while attempting to reduce costs. In order to relieve some of our healthcare system's distresses, organizations can modify their delivery of care to be patient-centered. Enhanced coordination of services, better provider access, self-management, and a team-based approach to care represent some of the key principles of the PCMH model. Patients who can most benefit are those who require long-term management of their conditions, such as chronic disease and behavioral health patient populations. The PCMH is a feasible option for delivery reform, as pilot studies have documented successful outcomes. Controversy about the lack of a medical neighborhood has created concern about the overall sustainability of the medical home. The medical home can stand independently and continuously provide enhanced care services as a movement toward higher quality care while organizations and government policy assess what types of incentives to put into place for the full collaboration and coordination of care in the healthcare system.

  15. Effects of increased nurses' workload on quality documentation of patient information at selected Primary Health Care facilities in Vhembe District, Limpopo Province.

    Science.gov (United States)

    Shihundla, Rhulani C; Lebese, Rachel T; Maputle, Maria S

    2016-05-13

    Recording of information on multiple documents increases professional nurses' responsibilities and workload during working hours. There are multiple registers and books at Primary Health Care (PHC) facilities in which a patient's information is to be recorded for different services during a visit to a health professional. Antenatal patients coming for the first visit must be recorded in the following documents: tick register; Prevention of Mother-To-Child Transmission (PMTCT) register; consent form for HIV and AIDS testing; HIV Counselling and Testing (HCT) register (if tested positive for HIV and AIDS then this must be recorded in the Antiretroviral Therapy (ART) wellness register); ART file with an accompanying single file, completion of which is time-consuming; tuberculosis (TB) suspects register; blood specimen register; maternity case record book and Basic Antenatal Care (BANC) checklist. Nurses forget to record information in some documents, which leads to the omission of important data. Omitting information might lead to mismanagement of patients. Some of the documents have incomplete and inaccurate information. As PHC facilities in Vhembe District render twenty-four-hour services through a call system, the same nurses are expected to resume duty at 07:00 the following morning. They are expected to work effectively, and when tired a nurse may record illegible information which may cause problems when the document is retrieved by the next person for continuity of care. The objective of this study was to investigate and describe the effects of increased nurses' workload on quality documentation of patient information at PHC facilities in Vhembe District, Limpopo Province. The study was conducted in Vhembe District, Limpopo Province, where the effects of increased nurses' workload on quality documentation of information are currently experienced. The research design was explorative, descriptive and contextual in nature. The population consisted of all nurses who

  16. Model documentation of assessment and nursing diagnosis in the practice of nursing care management for nursing students

    OpenAIRE

    A. Aziz Alimul Hidayat; M. Kes

    2015-01-01

    Model documentation of assessment and nursing diagnosis in the practice of nursing care management is an integrated model for nursing care records, especially recording nursing assessment and diagnosis in one format. This model can reduce the time spent on recording in nursing care and makes it easier for students to understand the nursing diagnosis, so that nursing interventions are more effective. The purpose of this paper was to describe the integrated documentation form for nursing assessment...

  17. Prevalence of accurate nursing documentation in patient records

    NARCIS (Netherlands)

    Paans, Wolter; Sermeus, Walter; Nieweg, Roos; van der Schans, Cees

    2010-01-01

    AIM: This paper is a report of a study conducted to describe the accuracy of nursing documentation in patient records in hospitals. Background.  Accurate nursing documentation enables nurses to systematically review the nursing process and to evaluate the quality of care. Assessing nurses' reports

  18. FACSIM/MRS-1: Cask receiving and consolidation model documentation and user's guide

    International Nuclear Information System (INIS)

    Lotz, T.L.; Shay, M.R.

    1987-06-01

    The Pacific Northwest Laboratory (PNL) has developed a stochastic computer model, FACSIM/MRS, to assist in assessing the operational performance of the Monitored Retrievable Storage (MRS) waste-handling facility. This report provides the documentation and user's guide for the component FACSIM/MRS-1, which is also referred to as the front-end model. The FACSIM/MRS-1 model simulates the MRS cask-receiving and spent-fuel consolidation activities. The results of the assessment of the operational performance of these activities are contained in a second report, FACSIM/MRS-1: Cask Receiving and Consolidation Performance Assessment (Lotz and Shay 1987). The model of MRS canister storage and shipping operations is presented in FACSIM/MRS-2: Storage and Shipping Model Documentation and User's Guide (Huber et al. 1987). The FACSIM/MRS model uses the commercially available FORTRAN-based SIMAN (SIMulation ANalysis language) simulation package (Pegden 1982). SIMAN provides a set of FORTRAN-coded commands, called block operations, which are used to build detailed models of continuous or discrete events that make up the operations of any process, such as the operation of an MRS facility. The FACSIM models were designed to run on either an IBM-PC or a VAX minicomputer. The FACSIM/MRS-1 model is flexible enough to collect statistics concerning almost any aspect of the cask receiving and consolidation operations of an MRS facility. The MRS model presently collects statistics on 51 quantities of interest during the simulation. SIMAN reports the statistics with two forms of output: a SIMAN simulation summary and an optional set of SIMAN output files containing data for use by more detailed post processors and report generators
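
    To illustrate the kind of block-oriented, discrete-event simulation described above, a minimal event-queue sketch follows. It is plain Python rather than SIMAN, and the arrival rate, service times, and single-server FIFO queue discipline are hypothetical assumptions made for the example, not values from the FACSIM/MRS-1 documentation.

```python
import heapq
import random

def simulate_cask_receiving(n_casks=50, mean_interarrival=8.0,
                            mean_service=10.0, seed=1):
    """Toy discrete-event model: casks arrive at a single receiving/consolidation
    station, wait in a FIFO queue, and are processed one at a time.
    Returns the mean time (hours) a cask spends in the system, analogous to one
    of the many statistics FACSIM/MRS-1 collects during a simulation run."""
    rng = random.Random(seed)
    events = []            # (time, sequence, kind, cask_id)
    seq = 0
    t = 0.0
    for cask in range(n_casks):
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, seq, "arrive", cask))
        seq += 1

    server_free_at = 0.0
    arrival_time = {}
    time_in_system = []
    while events:
        now, _, kind, cask = heapq.heappop(events)
        if kind == "arrive":
            arrival_time[cask] = now
            start = max(now, server_free_at)        # wait if the station is busy
            server_free_at = start + rng.expovariate(1.0 / mean_service)
            heapq.heappush(events, (server_free_at, seq, "depart", cask))
            seq += 1
        else:                                        # departure
            time_in_system.append(now - arrival_time[cask])

    return sum(time_in_system) / len(time_in_system)

print("mean hours in system:", round(simulate_cask_receiving(), 1))
```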

  19. Functional requirements document for measuring emissions of airborne radioactive materials

    International Nuclear Information System (INIS)

    Glissmeyer, J.A.; Alvarez, J.L.; Hoover, M.D.; Newton, G.C.; McFarland, A.R.; Rodgers, J.C.

    1994-11-01

    This document states the general functional requirements for systems and procedures for measuring emissions of airborne radioactive materials from facilities administered by the Westinghouse Hanford Company (WHC). The following issues are addressed in this document: definition of the program objectives; selection of the overall approach to collecting the samples; sampling equipment design; and sampling equipment maintenance and quality assurance issues. The following issues are not addressed in this document: air sampling in work areas or containments; selection of specific on-line sample monitoring instrumentation; analyzing collected samples; and reporting and interpreting results. The document provides equipment design guidance that is performance based rather than prescriptive. Locations from which samples are obtained should exhibit mixing of the contaminants with the airstream and acceptable air flow characteristics. Sample collection equipment and effluent and sample flow elements should meet defined performance standards. Quality control and assurance requirements specific to sample collection, equipment inspection, and calibration are presented. Key sample collection performance requirements are summarized in Section 5.4. The intent of this document is to assist WHC in demonstrating a high quality of air emission measurements with verified system performance based on documented system design, testing, inspection, and maintenance.

  20. Documentation of a Model Action Plan to Deter Illicit Nuclear Trafficking

    International Nuclear Information System (INIS)

    Smith, D; Kristo, M; Niemeyer, S; Dudder, G

    2006-01-01

    Theft, illegal possession, smuggling, or attempted unauthorized sale of nuclear and radiological materials remains a worldwide problem. The Nuclear Smuggling International Technical Working Group (ITWG) has adopted a model action plan to guide investigation of these cases through a systematic approach to nuclear forensics. The model action plan was recently documented and provides recommendations concerning incident response, collection of evidence in conformance with required legal standards, laboratory sampling and distribution of samples, radioactive materials analysis, including categorization and characterization of samples, forensics analysis of conventional evidence, and case development including interpretation of forensic signatures

  1. Documentation of a model action plan to deter illicit nuclear trafficking

    International Nuclear Information System (INIS)

    Smith, D.K.; Kristo, M.J.; Niemeyer, S.; Dudder, G.B.

    2008-01-01

    Theft, illegal possession, smuggling, or attempted unauthorized sale of nuclear and radiological materials remains a worldwide problem. The Nuclear Smuggling International Technical Working Group (ITWG) has adopted a model action plan to guide investigation of these cases through a systematic approach to nuclear forensics. The model action plan was recently documented and provides recommendations concerning incident response, collection of evidence in conformance with required legal standards, laboratory sampling and distribution of samples, radioactive materials analysis, including categorization and characterization of samples, forensics analysis of conventional evidence, and case development including interpretation of forensic signatures. (author)

  2. ETM documentation update – including modelling conventions and manual for software tools

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    This is the final report for the DTU contribution to Socio Economic Research on Fusion (SERF), EFDA Technology Work programme 2013. The structure and contents of this report were presented at the EFDA-TIMES workshop in Garching, 12-13 December 2013. This report gives further background and references; it summarises the work done during 2013, and it also contains presentations for the promotion of fusion as a future element in the electricity generation mix and presentations for the modelling community concerning model development and model documentation, in particular for TIAM collaboration workshops.

  3. Quality Improvement Initiative on Pain Knowledge, Assessment, and Documentation Skills of Pediatric Nurses.

    Science.gov (United States)

    Margonary, Heather; Hannan, Margaret S; Schlenk, Elizabeth A

    2017-01-01

    Pain treatment begins with a nurse's assessment, which relies on effective assessment skills. Hospital settings have implemented pain assessment education, but there is limited evidence in pediatric transitional care settings. The purpose of this quality improvement (QI) initiative was to develop, implement, and evaluate an evidence-based pain education session with 20 nurses in a pediatric specialty hospital that provides transitional care. Specific aims were to assess nurses' knowledge and attitudes regarding pain, and to evaluate assessment skills based on nurses' documentation. A prospective pre-post design with three assessments (baseline, post-intervention, and one-month follow-up) was used. The Shriner's Pediatric Nurses' Knowledge and Attitudes Regarding Pain questionnaire and an electronic health record review were completed at each assessment. There was significant improvement in nurses' knowledge and attitudes regarding pain after the education session (F[2,6] = 50.281). Documentation of pain assessment by nurses significantly increased from 43.1% at baseline to 64.8% at post-intervention and 67.7% at follow-up (χ²[2] = 20.55). Nursing interventions for pain increased significantly, from 33.3% at baseline to 84.0% at post-intervention, and stabilized at 80.0% at follow-up (χ²[2] = 8.91, p = 0.012). Frequency of pain reassessments did not show a statistically significant change, decreasing from 77.8% at baseline to 44.0% at post-intervention and 40.0% at follow-up (χ²[2] = 3.538, p = 0.171). Nurses' pain knowledge and documentation of assessment skills were improved in this QI initiative.

  4. STREAMFLOW AND WATER QUALITY REGRESSION MODELING ...

    African Journals Online (AJOL)

    ... downstream Obigbo station show: consistent time-trends in degree of contamination; linear and non-linear relationships for water quality models against total dissolved solids (TDS), total suspended sediment (TSS), chloride, pH and sulphate; and non-linear relationship for streamflow and water quality transport models.

  5. Model documentation renewable fuels module of the National Energy Modeling System

    Science.gov (United States)

    1995-06-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogs and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost, and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.
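
    The levelized-cost comparison described above can be sketched as follows. This is a simplified illustration only, not EIA's actual NEMS formulation; the capital cost, O&M, capacity factor, and discount rate in the example are hypothetical.

```python
def levelized_cost(capital_cost, fixed_om, variable_om, fuel_cost,
                   capacity_factor, heat_rate, lifetime_years, discount_rate):
    """Very simplified levelized cost of electricity in $/MWh.

    capital_cost -- overnight capital cost, $/kW
    fixed_om     -- fixed O&M, $/kW-yr
    variable_om  -- variable O&M, $/MWh
    fuel_cost    -- fuel price, $/MMBtu (zero for most renewables)
    heat_rate    -- Btu/kWh (ignored when fuel_cost is zero)
    """
    # Capital recovery factor spreads the up-front cost into equal annual payments.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years
           / ((1 + discount_rate) ** lifetime_years - 1))
    mwh_per_kw_year = 8760 * capacity_factor / 1000.0
    annual_cost_per_kw = capital_cost * crf + fixed_om
    fuel_cost_per_mwh = fuel_cost * heat_rate / 1000.0
    return annual_cost_per_kw / mwh_per_kw_year + variable_om + fuel_cost_per_mwh

# Hypothetical wind plant: $1,500/kW, $30/kW-yr fixed O&M, 35% capacity factor.
print(round(levelized_cost(1500, 30, 0, 0, 0.35, 0, 25, 0.07), 1), "$/MWh")
```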

  6. Model documentation renewable fuels module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-06-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources--wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.

  7. Quality assured technical documentation for nuclear power generation

    International Nuclear Information System (INIS)

    Ault, M.P.

    1992-01-01

    Present day large scale industry in general is made up of highly complex technology subjected to many rigorous external controls and constraints. This is particularly so in the nuclear power industry where it is essential that materials and services provided during the phases of construction, commissioning and operations, conform precisely to requirements as specified. Failure to do this could lead to unit shut-down and loss of income. For over 25 years, a central unit within the Central Electricity Generating Board (CEGB) developed an enviable reputation for the production of high class technical documentation essential during power station commissioning and operations phases. Following privatization of the electricity supply industry in 1991 the unit became a stand-alone organization and since 1989 has been known as Technical Publications Management Services (TPMS). TPMS with its many years of experience now offers its services to industry in general as well as to the electricity supply industry. Work currently being undertaken by TPMS is described here. Recent contracts obtained for work at Sizewell and for Severn Trent Water indicate the continuing and expanding need for specialist documentation services. (author)

  8. Triangular clustering in document networks

    Energy Technology Data Exchange (ETDEWEB)

    Cheng Xueqi; Ren Fuxin [Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190 (China); Zhou Shi [Department of Computer Science, University College London, Malet Place, London WC1E 6BT (United Kingdom); Hu Maobin [School of Engineering Science, University of Science and Technology of China, Hefei 230026 (China)], E-mail: cxq@ict.ac.cn, E-mail: renfuxin@software.ict.ac.cn, E-mail: s.zhou@adastral.ucl.ac.uk, E-mail: humaobin@ustc.edu.cn

    2009-03-15

    Document networks have the characteristic that a document node, e.g. a webpage or an article, carries meaningful content. Properties of document networks are not only affected by topological connectivity between nodes, but are also strongly influenced by the semantic relation between the content of the nodes. We observed that document networks have a large number of triangles and a high value clustering coefficient. Also there is a strong correlation between the probability of formation of a triangle and the content similarity among the three nodes involved. We propose the degree-similarity product (DSP) model, which well reproduces these properties. The model achieves this by using a preferential attachment mechanism that favours the linkage between nodes that are both popular and similar. This work is a step forward towards a better understanding of the structure and evolution of document networks.
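
    A minimal sketch of the degree-similarity preferential attachment idea follows. It is an illustrative reconstruction rather than the authors' code: the topic-vector content representation, the cosine similarity measure, and the parameter values are assumptions made for the example.

```python
import random

def dsp_network(n_nodes, n_topics=5, m_links=2, seed=0):
    """Toy degree-similarity product (DSP) growth model.

    Each document node carries a random topic vector; a newly added node links to
    existing nodes with probability proportional to degree * cosine similarity,
    so nodes that are both popular and similar are favoured.  The topic dimension,
    links per new node, and similarity measure are illustrative choices."""
    rng = random.Random(seed)

    def cosine(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return num / den if den else 0.0

    content = [[rng.random() for _ in range(n_topics)] for _ in range(n_nodes)]
    edges = {(0, 1)}                       # seed network: a single link
    degree = {0: 1, 1: 1}

    for new in range(2, n_nodes):
        weights = [degree[v] * cosine(content[new], content[v]) for v in range(new)]
        targets = set()
        while len(targets) < min(m_links, new):
            targets.add(rng.choices(range(new), weights=weights, k=1)[0])
        degree[new] = 0
        for v in targets:
            edges.add((v, new))
            degree[v] += 1
            degree[new] += 1
    return edges

print(len(dsp_network(200)), "edges in the toy DSP network")
```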

  9. DOCUMENT REPRESENTATION FOR CLUSTERING OF SCIENTIFIC ABSTRACTS

    Directory of Open Access Journals (Sweden)

    S. V. Popova

    2014-01-01

    Full Text Available The key issue of the present paper is the clustering of narrow-domain short texts, such as scientific abstracts. The work is based on observations made when improving the performance of a key phrase extraction algorithm. An extended stop-words list was used that was built automatically for the purposes of key phrase extraction and made possible a considerable quality enhancement of the phrases extracted from scientific publications. A description of the stop-words list creation procedure is given. The main objective is to investigate the possibilities of increasing the performance and/or speed of clustering by using the above-mentioned list of stop-words as well as information about lexeme parts of speech. In the latter case a vocabulary is applied for the document representation which contains not all the words that occurred in the collection, but only nouns and adjectives or their sequences encountered in the documents. Two base clustering algorithms are applied: k-means and hierarchical clustering (average agglomerative method). The results show that the use of an extended stop-words list and an adjective-noun document representation makes it possible to improve the performance and speed of k-means clustering. In a similar case for the average agglomerative method a decline in performance quality may be observed. It is shown that the use of adjective-noun sequences for document representation lowers the clustering quality for both algorithms and can be justified only when a considerable reduction of feature space dimensionality is necessary.
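
    A minimal sketch of a stop-word-filtered document representation feeding k-means is shown below. It is illustrative only: the abstracts and the extended stop-words list are invented, and the part-of-speech filtering step discussed in the paper is omitted here.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Illustrative abstracts and extended stop-words list (not the authors' data).
abstracts = [
    "We propose a clustering method for short scientific abstracts.",
    "A water quality model is calibrated for river basins.",
    "Key phrase extraction improves abstract clustering performance.",
    "Hydrological models predict surface water quality in lakes.",
]
extended_stop_words = ["we", "a", "for", "is", "in", "the", "propose", "method"]

# Vectorize with the extended stop-words list; a part-of-speech filter that keeps
# only nouns and adjectives (as discussed in the paper) would further restrict
# the vocabulary but is not shown here.
vectorizer = TfidfVectorizer(stop_words=extended_stop_words, lowercase=True)
X = vectorizer.fit_transform(abstracts)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)
```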

  10. Management Documentation: Indicators & Good Practice at Cultural Heritage Places

    Science.gov (United States)

    Eppich, R.; Garcia Grinda, J. L.

    2015-08-01

    Documentation for cultural heritage places usually refers to describing the physical attributes, surrounding context, condition or environment; most of the time with images, graphics, maps or digital 3D models in their various forms with supporting textural information. Just as important as this type of information is the documentation of managerial attributes. How do managers of cultural heritage places collect information related to financial or economic well-being? How are data collected over time measured, and what are significant indicators for improvement? What quality of indicator is good enough? Good management of cultural heritage places is essential for conservation longevity, preservation of values and enjoyment by the public. But how is management documented? The paper will describe the research methodology, selection and description of attributes or indicators related to good management practice. It will describe the criteria for indicator selection and why they are important, how and when they are collected, by whom, and the difficulties in obtaining this information. As importantly it will describe how this type of documentation directly contributes to improving conservation practice. Good practice summaries will be presented that highlight this type of documentation including Pamplona and Ávila, Spain and Valletta, Malta. Conclusions are drawn with preliminary recommendations for improvement of this important aspect of documentation. Documentation of this nature is not typical and presents a unique challenge to collect, measure and communicate easily. However, it is an essential category that is often ignored yet absolutely essential in order to conserve cultural heritage places.

  11. Efficient document management by introduction of a new, dynamic quality manual application according to DIN EN ISO 9001:2000; Effiziente Qualitaetsmanagementdokumentation durch Einfuehrung eines neuen, dynamischen Qualitaetsmanagementhandbuchs nach DIN EN ISO 9001:2000

    Energy Technology Data Exchange (ETDEWEB)

    Pache, G.; Saueressig, U.; Baumann, T.; Langer, M.; Kotter, E. [Universitaetsklinikum Freiburg (Germany). Abt. Roentgendiagnostik; Duerselen, L. [DxD Consulting, Degern (Germany)

    2008-06-15

    Purpose: Evaluation of the impact of a new, dynamic computer-aided quality manual application (QMA) on the acceptance and efficiency of a quality management system (QMS) according to DIN EN ISO 9001:2000. Materials and Method: The QMA combines static HTML pages with active content generated from an underlying database. Through user access rights, a hierarchy is defined to create and administer quality documents. Document workflow, feedback management and an employee survey were analyzed to compare the performance of the new QMA with the formerly used static versions. Results: Integration of a document editor and automated document re-approval accelerated the document process by an average of 10 min. In spite of a 60% increase in yearly document changes, the administration effort was reduced by approximately 160 h. Integration of the feedback management system into the QMA decreased handling time from an average of 16.5 to 3.4 days. Simultaneously the number of feedback messages increased from 160 in 2005 to 306 in 2006. Employee satisfaction improved (old: 3.19 ± 1.02, new: 1.91 ± 0.8). The number of users who used the QMA more than once a week also increased from 29.5% to 60%. Conclusion: The computer-aided quality manual application constitutes the basis for the success of our QMS. The possibility to actively participate in the quality management process has led to broad acceptance and usage by the employees. The administration effort could be greatly reduced compared to the conventional QMS. (orig.)

  12. KNOWLEDGE AND VALORIZATION OF HISTORICAL SITES THROUGH 3D DOCUMENTATION AND MODELING

    Directory of Open Access Journals (Sweden)

    E. Farella

    2016-06-01

    Full Text Available The paper presents the first results of an interdisciplinary project related to the 3D documentation, dissemination, valorization and digital access of archaeological sites. Besides the mere 3D documentation aim, the project has two goals: (i) to easily explore and share via web the references and results of the interdisciplinary work, including the interpretative process and the final reconstruction of the remains; (ii) to promote and valorize archaeological areas using reality-based 3D data and Virtual Reality devices. The method has been verified on the ruins of the archaeological site of Pausilypon, a maritime villa of the Roman period (Naples, Italy). Using Unity3D, the virtual tour of the heritage site was integrated and enriched with the surveyed 3D data, text documents, CAAD reconstruction hypotheses, drawings, photos, etc. In this way, starting from the actual appearance of the ruins (panoramic images), passing through the 3D digital surveying models and several other pieces of historical information, the user is able to access virtual contents and reconstructed scenarios, all in a single virtual, interactive and immersive environment. These contents and scenarios allow the user to derive documentation and geometrical information, understand the site, perform analyses, see interpretative processes, communicate historical information and valorize the heritage location.

  13. Physical aspects of quality assurance in radiotherapy: A protocol for quality control

    International Nuclear Information System (INIS)

    Aguirre, J.F.; Alfonso-Laguardia, R.; Andreo, P.; Brunetto, M.; Marenco-Zuniga, H.; Gutt, F.; Torres-Calderon, A.

    2000-06-01

    Consistent with the increasing requests from Member States for establishing radiotherapy programmes, an IAEA Technical Co-operation project was initiated in Latin America aimed at improving the physical aspects (as a complement to the clinical issues) of quality assurance in radiotherapy; this ARCAL XXX project (RLA/6/032) was classified as a Model Project of the IAEA. Among the important outcomes of the project were (i) the training of a considerable number of medical physicists in hospitals of the region, (ii) the development of a protocol for quality control procedures, and (iii) the organization of quality audit site visits (to the participant countries) where the implementation in hospitals of the developed quality control procedures is verified. The present publication is the protocol for quality control of the physical aspects of radiotherapy. It contains detailed procedures on what should be measured by a medical physicist in a radiotherapy treatment unit and related equipment, and how this should be done. The latter is addressed through several appendices, which make the document rather unique. The protocol was developed by medical physicists of the region for the professionals of the region, and it is the first document of this kind ever written in Spanish. A training course was organized in November 1998 (Havana, Cuba) where its practical implementation was taught. There are plans to have this document translated into different languages for the various regions having similar TC projects. (author)

  14. A Linear Algebra Measure of Cluster Quality.

    Science.gov (United States)

    Mather, Laura A.

    2000-01-01

    Discussion of models for information retrieval focuses on an application of linear algebra to text clustering, namely, a metric for measuring cluster quality based on the theory that cluster quality is proportional to the number of terms that are disjoint across the clusters. Explains term-document matrices and clustering algorithms. (Author/LRW)
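
    One simple reading of such a metric is the fraction of terms that occur in only one cluster of a term-document matrix; the sketch below computes it. This is an illustrative simplification, not necessarily the paper's exact formulation, and the toy matrix is invented.

```python
import numpy as np

def disjoint_term_fraction(term_doc, labels):
    """Fraction of terms that occur in documents of only one cluster.

    term_doc -- terms x documents matrix of counts
    labels   -- cluster label for each document column
    A higher fraction means the clusters share less vocabulary, which the article
    takes as a sign of higher cluster quality (simplified reading, not the
    paper's exact formulation)."""
    labels = np.asarray(labels)
    clusters = np.unique(labels)
    # For each term, count how many clusters contain at least one occurrence.
    present = np.stack([(term_doc[:, labels == c] > 0).any(axis=1) for c in clusters])
    clusters_per_term = present.sum(axis=0)
    used = clusters_per_term > 0
    return float((clusters_per_term[used] == 1).mean())

# Toy term-document matrix: 5 terms x 4 documents split into two clusters.
A = np.array([[2, 1, 0, 0],
              [1, 3, 0, 0],
              [0, 0, 2, 1],
              [0, 0, 1, 2],
              [1, 0, 1, 0]])   # the last term is shared across both clusters
print(disjoint_term_fraction(A, [0, 0, 1, 1]))   # -> 0.8
```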

  15. Quality Management and Innovation in Information Services - The case study of the Documentation Services of the University of Minho

    OpenAIRE

    Guedes, Susana

    2008-01-01

    [The present paper is based on a curricular traineeship, part of the Degree in Information Science, on the implementation of Quality Management Systems, carried out at the Documentation Services of the University of Minho (SDUM).] The SDUM have as their main purpose providing the best resources, services and easy access to the whole community of the University of Minho (formed by students of several areas, teachers, collaborators and researchers). In or...

  16. Establishment of ''Internal Rules'' and EDMS - Electronic Document Management System at NPP NEK

    International Nuclear Information System (INIS)

    Mandic, D.

    2012-01-01

    two applications (DCM - Document Control Module and QRM - Quality Records Management). Both computer applications were designed in order to fulfil the requirements of Criterion VI (Document Control) and Criterion XVII (Quality Assurance Records) of the US code [3]. In order to prevent confusion, clarifications regarding the terms ''documents'' and ''records'' are the following: Documents are an organized collection of information or objects that can be treated as a unit. A document may or may not meet the definition of a record. Records are a sub-set of all information or all documents held by a person or organisation. Records present information, regardless of physical form or characteristics, appropriate for preservation as evidence of the organization, functions, policies, decisions, procedures, operations, or other activities of the organization. Examples of where this information may reside are books, papers, maps, photographs, machine-readable electronic files, or other documentary materials. Quality Assurance Records related to the NPPs are the records which furnish documentary evidence of the quality of items and activities affecting quality. For the purpose of the standards [4] and [5], a document is considered a quality assurance record when it has been completed. (author)

  17. A documentation tool for product configuration systems - improving the documentation task

    DEFF Research Database (Denmark)

    Hvam, Lars; Jensen, Klaes Ladeby

    2005-01-01

    Configuration systems are increasingly applied to automate the configuration of complex products. A configuration system is an expert system designed to combine specified modules according to constraints. The constraints are stored as product data and rules in a product model, and one of the most essential tasks is thus to develop a complete and consistent product model which can reflect the actual product. A procedure for building product models has been developed at the Centre for Product Modelling (CPM), and the procedure has been successfully applied in several industrial companies. CPM's experience with the procedure and the hitherto empirical experience from companies having applied the procedure have revealed that there is a need for an IT-based documentation tool to support the process of constructing product configuration systems. Time can be saved by letting a documentation tool handle...

  18. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed
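
    A heavily simplified sketch of the cost-quality trade-off described above is given below. The functional forms, calibration constants, and the assumption that each percent of IV&V labor removes a fixed share of remaining defects are hypothetical and are not taken from the paper.

```python
def expected_defects(ksloc, defects_per_ksloc=5.0, env_factor=1.0, competence_factor=1.0):
    """Illustrative quality model: expected latent defects before IV&V.
    defects_per_ksloc, env_factor, and competence_factor are hypothetical
    calibration parameters, not values from the paper."""
    return ksloc * defects_per_ksloc * env_factor * competence_factor

def ivv_labor_fraction(target_defects, latent_defects, removal_per_pct_labor=0.04):
    """Illustrative cost/quality trade-off: the fraction of project labor spent on
    independent verification and validation needed to reach a target defect count,
    assuming each percent of labor removes a fixed share of the remaining defects."""
    fraction, remaining = 0.0, latent_defects
    while remaining > target_defects and fraction < 0.5:
        remaining *= (1.0 - removal_per_pct_labor)
        fraction += 0.01
    return round(fraction, 2)

latent = expected_defects(ksloc=100, env_factor=1.2)
print("latent defects:", latent)
print("IV&V labor fraction needed:", ivv_labor_fraction(200, latent))
```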

  19. Modeling Effects of Groundwater Basin Closure, and Reversal of Closure, on Groundwater Quality

    Science.gov (United States)

    Pauloo, R.; Guo, Z.; Fogg, G. E.

    2017-12-01

    Population growth, the expansion of agriculture, and climate uncertainties have accelerated groundwater pumping and overdraft in aquifers worldwide. In many agricultural basins, a water budget may be stable or not in overdraft, yet disconnected ground and surface water bodies can contribute to the formation of a "closed" basin, where water principally exits the basin as evapotranspiration. Although decreasing water quality associated with increases in Total Dissolved Solids (TDS) have been documented in aquifers across the United States in the past half century, connections between water quality declines and significant changes in hydrologic budgets leading to closed basin formation remain poorly understood. Preliminary results from an analysis with a regional-scale mixing model of the Tulare Lake Basin in California indicate that groundwater salinization resulting from open to closed basin conversion can operate on a decades-to-century long time scale. The only way to reverse groundwater salinization caused by basin closure is to refill the basin and change the hydrologic budget sufficiently for natural groundwater discharge to resume. 3D flow and transport modeling, including the effects of heterogeneity based on a hydrostratigraphic facies model, is used to explore rates and time scales of groundwater salinization and its reversal under different water and land management scenarios. The modeling is also used to ascertain the extent to which local and regional heterogeneity need to be included in order to appropriately upscale the advection-dispersion equation in a basin scale groundwater quality management model. Results imply that persistent managed aquifer recharge may slow groundwater salinization, and complete reversal may be possible at sufficiently high water tables.
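
    The salinization mechanism described above can be illustrated with a toy mass-balance mixing model. This is a sketch only; the aquifer volume, recharge rate, and TDS concentrations are hypothetical values, not those of the Tulare Lake Basin analysis.

```python
def simulate_closed_basin_tds(years, volume_km3, recharge_km3_yr, recharge_tds, initial_tds):
    """Toy regional-scale mixing model of groundwater salinization.

    In a closed basin the dominant outflow is evapotranspiration, which removes
    water but exports no salt, so total dissolved solids (TDS) accumulate.
    Units: volumes in km3, TDS in mg/L.  All values are hypothetical, not those
    of the Tulare Lake Basin analysis."""
    salt_mass = initial_tds * volume_km3              # mg/L * km3 (arbitrary mass units)
    for _ in range(years):
        salt_mass += recharge_tds * recharge_km3_yr   # salt enters with recharge/return flow
        # the recharged water is assumed to leave again as ET, so the stored
        # volume stays constant while the salt mass keeps accumulating
    return salt_mass / volume_km3

# e.g. a 500 km3 aquifer, 2 km3/yr recharge at 400 mg/L, starting at 300 mg/L
print(round(simulate_closed_basin_tds(100, 500, 2.0, 400, 300)), "mg/L after 100 years")
```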

  20. Health Information Technology, Patient Safety, and Professional Nursing Care Documentation in Acute Care Settings.

    Science.gov (United States)

    Lavin, Mary Ann; Harper, Ellen; Barr, Nancy

    2015-04-14

    The electronic health record (EHR) is a documentation tool that yields data useful in enhancing patient safety, evaluating care quality, maximizing efficiency, and measuring staffing needs. Although nurses applaud the EHR, they also indicate dissatisfaction with its design and cumbersome electronic processes. This article describes the views of nurses shared by members of the Nursing Practice Committee of the Missouri Nurses Association; it encourages nurses to share their EHR concerns with Information Technology (IT) staff and vendors and to take their place at the table when nursing-related IT decisions are made. In this article, we describe the experiential-reflective reasoning and action model used to understand staff nurses' perspectives, share committee reflections and recommendations for improving both documentation and documentation technology, and conclude by encouraging nurses to develop their documentation and informatics skills. Nursing issues include medication safety, documentation and standards of practice, and EHR efficiency. IT concerns include interoperability, vendors, innovation, nursing voice, education, and collaboration.

  1. Review of SKB's Code Documentation and Testing

    International Nuclear Information System (INIS)

    Hicks, T.W.

    2005-01-01

    evolution. Poly3D - a displacement discontinuity model developed at Stanford University and used by SKB to study the effects of movement on fractures that intersect canister deposition holes. UDEC, 3DEC, FLAC, and FLAC3D - geotechnical models developed by HCItasca, and used by SKB in thermo-hydro-mechanical analysis of repository host rock. M3 - a multivariate mixing and mass balance model developed by SKB to study the evolution of groundwater composition. The commercially available codes (CONNECTFLOW, ABAQUS, Poly3D, UDEC, 3DEC, FLAC, and FLAC3D) appear to have been subject to extensive testing, and the wide international usage of these codes offers a high level of confidence that they are fit for intended purpose. However, SKB has modified or developed some commercial codes in-house, and it is unclear whether these developments have become an integral part of, and have been subject to similar levels of testing as, the main code. Greater confidence in the applicability of the modified forms of these codes could be achieved if clear information on code usage and verification were available. Varying standards of code documentation have been identified for the SKB codes COMP23, FARF31, PROPER, the analytical radionuclide transport code, DarcyTools, and M3. The recent DarcyTools reports are of a high standard, providing comprehensive information on the model basis, code usage, and code verification and validation. User's guides and verification reports should be developed for all of SKB's codes that are of a similar standard to the DarcyTools documents and are consistent with appropriate software quality assurance (QA) procedures. To develop a greater understanding of suitable software documentation and testing standards, a brief review has been undertaken of software QA requirements in other radioactive waste disposal programmes. The review has provided useful insights into the type of code documentation that might be expected to accompany the submission of a repository

  2. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of a process approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a process warehouse and process mining methods.

  3. EMMC guidance on quality assurance for academic materials modelling software engineering

    OpenAIRE

    European Materials Modelling Council

    2015-01-01

    Proposed recommendations for software development in LEIT projects. This document presents the advice of software owners, commercial and academic, on what academic software could do to generate better quality software, ready to be used by third parties.

  4. Documentation of spectrom-32

    International Nuclear Information System (INIS)

    Callahan, G.D.; Fossum, A.F.; Svalstad, D.K.

    1989-01-01

    SPECTROM-32 is a finite element program for analyzing two-dimensional and axisymmetric inelastic thermomechanical problems related to the geological disposal of nuclear waste. The code is part of the SPECTROM series of special-purpose computer programs that are being developed by RE/SPEC Inc. to address many unique rock mechanics problems encountered in analyzing radioactive wastes stored in geologic formations. This document presents the theoretical basis for the mathematical models, the finite element formulation and solution procedure of the program, a description of the input data for the program, verification problems, and details about program support and continuing documentation. The computer code documentation is intended to satisfy the requirements and guidelines outlined in the document entitled Final Technical Position on Documentation of Computer Codes for High-Level Waste Management. The principal component models used in the program involve thermoelastic, thermoviscoelastic, thermoelastic-plastic, and thermoviscoplastic types of material behavior. Special material considerations provide for the incorporation of limited-tension material behavior and consideration of jointed material behavior. Numerous program options provide the capabilities for various boundary conditions, sliding interfaces, excavation, backfill, arbitrary initial stresses, multiple material domains, load incrementation, plotting database storage and access of results, and other features unique to the geologic disposal of radioactive wastes. Numerous verification problems that exercise many of the program options and illustrate the required data input and printed results are included in the documentation

  5. Farm-Level Effects of Soil Conservation and Commodity Policy Alternatives: Model and Data Documentation.

    Science.gov (United States)

    Sutton, John D.

    This report documents a profit-maximizing linear programming (LP) model of a farm typical of a major corn-soybean producing area in the Southern Michigan-Northern Indiana Drift Plain. Following an introduction, a complete description of the farm is provided. The next section presents the LP model, which is structured to help analyze after-tax…

  6. Model documentation, Renewable Fuels Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the Annual Energy Outlook 1998 (AEO98) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. For AEO98, the RFM was modified in three principal ways, introducing capital cost elasticities of supply for new renewable energy technologies, modifying biomass supply curves, and revising assumptions for use of landfill gas from municipal solid waste (MSW). In addition, the RFM was modified in general to accommodate projections beyond 2015 through 2020. Two supply elasticities were introduced, the first reflecting short-term (annual) cost increases from manufacturing, siting, and installation bottlenecks incurred under conditions of rapid growth, and the second reflecting longer term natural resource, transmission and distribution upgrade, and market limitations increasing costs as more and more of the overall resource is used. Biomass supply curves were also modified, basing forest products supplies on production rather than on inventory, and expanding energy crop estimates to include states west of the Mississippi River using information developed by the Oak Ridge National Laboratory. Finally, for MSW, several assumptions for the use of landfill gas were revised and extended.

  7. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mooney, Meghan E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sigrin, Benjamin O [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gleason, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Liu, Xiaobing [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-06

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.

  8. Quality assurance through fire protection documentation

    Energy Technology Data Exchange (ETDEWEB)

    Spitzer, Franz [KAEFER Industrie GmbH, Kirchheim (Germany); Winter, Harald [Harald Winter Software, Unterschleissheim (Germany)

    2010-07-01

Legal and organisational requirements regarding fire protection have become more and more stringent, and owners must take account of both public law and insurance law. A specialised fire protection management system - FiProMan - can assist in meeting these requirements quickly and efficiently by ensuring quality, supporting subsequent installation and repair work, and making maintenance plannable. In addition, the legal requirements regarding continuity are met. (orig.)

  9. Gaia DR2 documentation Chapter 3: Astrometry

    Science.gov (United States)

    Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.

    2018-04-01

This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely, the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries) which have been pre-processed and discussed in Chapter 2, the results of which were published in Fabricius et al. (2016). The models consist of reference systems and time scales; assumed linear stellar motion and relativistic light deflection; in addition to fundamental constants and the transformation of coordinate systems. Higher level inputs such as planetary and solar system ephemerides, Gaia tracking and orbit information, and initial quasar catalogues and BAM data are all needed for the processing described here. The astrometric calibration models are outlined, followed by the detailed processing steps that give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).

  10. Quality planning and executive force of program files in quality management

    International Nuclear Information System (INIS)

    Sun Danyu

    2008-01-01

This paper discusses quality planning in quality management. In quality planning, the quality objectives, quality responsibilities and procedures shall be developed; graded supervision shall be exercised; a quality assurance program shall be established; and requirements on resources and documents shall be defined. At the same time, the executive force of program documentation shall be intensified and enhanced, the implementation results shall be supervised and inspected, detailed check indicators shall be established, and requirements on how to improve product quality shall be put forward. (author)

  11. Model quality and safety studies

    DEFF Research Database (Denmark)

    Petersen, K.E.

    1997-01-01

The paper describes the EC initiative on model quality assessment and emphasizes some of the problems encountered in the selection of data from field tests used in the evaluation process. Further, it discusses the impact of model uncertainties in safety studies of industrial plants. The model ... that most of these have never been through a procedure of evaluation, but nonetheless are used to assist in making decisions that may directly affect the safety of the public and the environment. As a major funder of European research on major industrial hazards, DGXII is conscious of the importance ... a certain model is appropriate for use in solving a given problem. Further, the findings from the REDIPHEM project related to dense gas dispersion will be highlighted. Finally, the paper will discuss the need for model quality assessment in safety studies.

  12. Model documentation report: Commercial Sector Demand Module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. The NEMS Commercial Sector Demand Module is a simulation tool based upon economic and engineering relationships that models commercial sector energy demands at the nine Census Division level of detail for eleven distinct categories of commercial buildings. Commercial equipment selections are performed for the major fuels of electricity, natural gas, and distillate fuel, for the major services of space heating, space cooling, water heating, ventilation, cooking, refrigeration, and lighting. The algorithm also models demand for the minor fuels of residual oil, liquefied petroleum gas, steam coal, motor gasoline, and kerosene, the renewable fuel sources of wood and municipal solid waste, and the minor services of office equipment. Section 2 of this report discusses the purpose of the model, detailing its objectives, primary input and output quantities, and the relationship of the Commercial Module to the other modules of the NEMS system. Section 3 of the report describes the rationale behind the model design, providing insights into further assumptions utilized in the model development process to this point. Section 3 also reviews alternative commercial sector modeling methodologies drawn from existing literature, providing a comparison to the chosen approach. Section 4 details the model structure, using graphics and text to illustrate model flows and key computations.

  13. Use of an advanced document system in post-refuelling updating of nuclear power plant documentation

    International Nuclear Information System (INIS)

    Puech Suanzes, P.; Cortes Soler, M.

    1993-01-01

This paper discusses the results of the extensive use of an advanced document system to update documentation prepared by traditional methods and affected by changes in the period between two plant refuellings. The implementation of a system for the capture, retrieval and storage of drawings using optical discs is part of a plan to modernize production and management tools and to thus achieve better control of document configuration. These processes are consequently optimized in that: 1. The deterioration of drawings is halted with the help of an identical, updated, legible, reliable support for all users. 2. The time required to update documentation is reduced. Given the large number of drawings, the implementation method should effectively combine costs and time. The document management tools ensure optical disc storage control so that from the moment a drawing resides in the system, any modification to it is made through the system utilities, thus ensuring quality and reducing schedules. The system described was used to update the electrical drawings of Almaraz Nuclear Power Plant. Changes made during the eighth refuelling of Unit I were incorporated and the time needed to issue the updated drawings was reduced by one month. (author)

  14. Structured clinical documentation in the electronic medical record to improve quality and to support practice-based research in epilepsy.

    Science.gov (United States)

    Narayanan, Jaishree; Dobrin, Sofia; Choi, Janet; Rubin, Susan; Pham, Anna; Patel, Vimal; Frigerio, Roberta; Maurer, Darryck; Gupta, Payal; Link, Lourdes; Walters, Shaun; Wang, Chi; Ji, Yuan; Maraganore, Demetrius M

    2017-01-01

Using the electronic medical record (EMR) to capture structured clinical data at the point of care would be a practical way to support quality improvement and practice-based research in epilepsy. We describe our stepwise process for building structured clinical documentation support tools in the EMR that define best practices in epilepsy, and we describe how we incorporated these toolkits into our clinical workflow. These tools write notes and capture hundreds of fields of data including several score tests: Generalized Anxiety Disorder-7 items, Neurological Disorders Depression Inventory for Epilepsy, Epworth Sleepiness Scale, Quality of Life in Epilepsy-10 items, Montreal Cognitive Assessment/Short Test of Mental Status, and Medical Research Council Prognostic Index. The tools summarize brain imaging, blood laboratory, and electroencephalography results, and document neuromodulation treatments. The tools provide Best Practices Advisories and other clinical decision support when appropriate. The tools prompt enrollment in a DNA biobanking study. We have thus far enrolled 231 patients for initial visits, are starting our first annual follow-up visits, and provide a brief description of our cohort. We are sharing these EMR tools and captured data with other epilepsy clinics as part of a Neurology Practice Based Research Network, and are using the tools to conduct pragmatic trials using subgroup-based adaptive designs. © 2016 The Authors. Epilepsia published by Wiley Periodicals, Inc. on behalf of International League Against Epilepsy.
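
    As an illustration of the kind of structured scoring such toolkits capture, the sketch below computes a GAD-7 total from seven item responses scored 0-3 and maps it to the conventional severity bands. This is a generic Python sketch of the published scale, not the authors' EMR implementation; the function name and the example responses are hypothetical.

    ```python
    def score_gad7(items):
        """Score the Generalized Anxiety Disorder-7 questionnaire.

        `items` is a sequence of seven integer responses, each 0-3
        (0 = not at all ... 3 = nearly every day).  Returns the total
        score (0-21) and the conventional severity band.
        """
        if len(items) != 7 or any(i not in (0, 1, 2, 3) for i in items):
            raise ValueError("GAD-7 requires seven responses scored 0-3")
        total = sum(items)
        if total >= 15:
            severity = "severe"
        elif total >= 10:
            severity = "moderate"
        elif total >= 5:
            severity = "mild"
        else:
            severity = "minimal"
        return total, severity

    # Example: a patient answering mostly "several days"
    print(score_gad7([1, 1, 2, 1, 0, 1, 1]))   # (7, 'mild')
    ```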

  15. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst’s work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it will therefore be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse business process model design and quality measures. It was found that this area had already been the subject of research in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.

  16. Labour Quality Model for Organic Farming Food Chains

    OpenAIRE

    Gassner, B.; Freyer, B.; Leitner, H.

    2008-01-01

The debate on labour quality is controversial in science as well as in the organic agriculture community. Therefore, we reviewed literature on different labour quality models and definitions, and held key informant interviews on labour quality issues with stakeholders in a regionally oriented organic bread food chain. We developed a labour quality model with nine quality categories and discussed linkages to labour satisfaction, ethical values and IFOAM principles.

  17. ALS and TLS data fusion in cultural heritage documentation and modeling

    Directory of Open Access Journals (Sweden)

    A. Fryskowska

    2015-08-01

One of the most important aspects of documenting cultural heritage sites is acquiring detailed and accurate data. A popular method of storing 3D information about historical structures is using 3D models. These models are built based on terrestrial or aerial laser scanning data, but the two methods are seldom used together. Historical buildings usually have a very complex design, therefore the input data, on the basis of which their 3D models are built, must provide a high enough accuracy to model these complexities. The data processing methods used, as well as the modeling algorithms implemented, should be highly automated and universal. The main aim of the presented research was to analyze and compare various methods for extracting matching points. The article presents the results of combining data from ALS and TLS using reference points extracted both manually and automatically. Finally, the publication also includes an analysis of the accuracy of the data merging process.
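
    The abstract does not reproduce the authors' merging algorithm, but once matching reference points have been extracted from the ALS and TLS clouds, a common way to merge the data sets is to estimate a rigid (rotation plus translation) transform between the matched points, for example with the SVD-based Kabsch/Procrustes solution sketched below. The sketch is illustrative only; the function name and the toy coordinates are assumptions, not part of the published method.

    ```python
    import numpy as np

    def rigid_transform(src, dst):
        """Least-squares rotation R and translation t so that R @ src_i + t ~= dst_i.

        `src` and `dst` are (N, 3) arrays of matched reference points
        (e.g. TLS points and their ALS counterparts).
        """
        src_c = src - src.mean(axis=0)
        dst_c = dst - dst.mean(axis=0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflections
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
        t = dst.mean(axis=0) - r @ src.mean(axis=0)
        return r, t

    # Toy check: recover a known rotation about the z-axis plus an offset.
    theta = np.radians(30)
    r_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
    src = np.random.rand(10, 3) * 100.0
    dst = src @ r_true.T + np.array([5.0, -2.0, 0.3])
    r_est, t_est = rigid_transform(src, dst)
    print(np.allclose(r_est, r_true), np.allclose(t_est, [5.0, -2.0, 0.3]))
    ```

    In practice the transform estimated from a handful of reference points would typically be refined against the full clouds, for example with an iterative closest point step, before assessing the merging accuracy.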

  18. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  19. The Legal Forensic Model in Determining the Genuineness of Islamic Banking Documents and Their Application in Shariah Courts

    Directory of Open Access Journals (Sweden)

    Wan Abdul Fattah Wan Ismail

    2017-12-01

Falsification of documents does not only happen in civil courts. Shariah courts also face the same problems despite being ‘religiously’-oriented courts. It can be argued that, in the case of Malaysia, civil courts have clearer guidelines regarding the authentication of documents compared to Shariah courts. This study utilised a questionnaire survey as well as interviews in collecting data to measure the perceptions and opinions of relevant respondents with various stakes in the practice of law, from both Shariah and civil backgrounds. It should be noted that the key informants comprised forensic experts as well as Shariah and civil practitioners. Analysis of the collected data indicates that the necessity of forming a legal forensic model is supported by the majority of the participants, which therefore implies that a forensic model that makes the authentication of documents more structured, clear and practical must be formed in Shariah courts. The practice of civil courts in relation to the authentication of documents should be used as a model in Shariah courts so long as it complies with the principles of Islamic law.

  20. Effects of Meteorological Data Quality on Snowpack Modeling

    Science.gov (United States)

    Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.

    2017-12-01

Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of input data quality on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to the meteorological input, from raw input with minimal cleaning to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum violations and missing values, which could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or that does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
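
    As a sketch of the first, automatable category of cleaning described above (global range violations and missing values), the following Python fragment flags values outside fixed physical bounds and linearly interpolates short gaps. The bounds, the gap limit and the example series are illustrative assumptions, not the thresholds used operationally with iSnobal.

    ```python
    import numpy as np
    import pandas as pd

    def basic_qc(series, lower, upper, max_gap=3):
        """Range-check an hourly met series and fill short gaps.

        Values outside [lower, upper] are treated as missing, then runs of
        up to `max_gap` consecutive missing values are linearly interpolated.
        Longer gaps are left missing for manual (second-category) review.
        """
        cleaned = series.where((series >= lower) & (series <= upper))
        return cleaned.interpolate(limit=max_gap, limit_area="inside")

    # Example: air temperature (deg C) with a sensor spike and a short dropout.
    idx = pd.date_range("2017-01-01", periods=8, freq="h")
    temp = pd.Series([-5.2, -5.0, 999.0, -4.7, np.nan, np.nan, -4.1, -4.0], index=idx)
    print(basic_qc(temp, lower=-40.0, upper=45.0))
    ```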

  1. An Organizational Model for Excellence in Healthcare Delivery: Evidence From Winners of the Baldrige Quality Award.

    Science.gov (United States)

    Griffith, John R

Winners of the Baldrige National Quality Award in healthcare have documented top quartile clinical outcomes and patient satisfaction across a variety of American communities and a full spectrum of care. Their results also show high levels of satisfaction among physicians, nurses, and other workers, as well as effective financial performance. The managerial methods they use (collectively, the Baldrige model) are consistent with organizational theory literature and are found across all winners. The winners have sustained excellence after winning and expanded it by acquisition of other healthcare organizations. The model differs substantially from traditional management approaches in healthcare delivery. It is a comprehensive program that emphasizes a shared focus on excellence, systematically responsive management, evidence-based medicine, multidimensional measures and negotiated goals, improvement of work processes, thorough training, and extensive rewards. The model could be expanded on a much larger scale. Doing so successfully would substantially improve the quality and cost of healthcare, as well as the satisfaction and commitment of care providers and other staff. The opportunity deserves further study and trial by large healthcare delivery systems, insurers, and consulting companies.

  2. Assessment model validity document. NAMMU: A program for calculating groundwater flow and transport through porous media

    International Nuclear Information System (INIS)

    Cliffe, K.A.; Morris, S.T.; Porter, J.D.

    1998-05-01

NAMMU is a computer program for modelling groundwater flow and transport through porous media. This document provides an overview of the use of the program for geosphere modelling in performance assessment calculations and gives a detailed description of the program itself. The aim of the document is to give an indication of the grounds for having confidence in NAMMU as a performance assessment tool. In order to achieve this, the following topics are discussed. The basic premises of the assessment approach and the purpose and nature of the calculations that can be undertaken using NAMMU are outlined. The concepts of the validation of models and the considerations that can lead to increased confidence in models are described. The physical processes that can be modelled using NAMMU and the mathematical models and numerical techniques that are used to represent them are discussed in some detail. Finally, the grounds that would lead one to have confidence that NAMMU is fit for purpose are summarised.

  3. Parametric packet-based audiovisual quality model for IPTV services

    CERN Document Server

    Garcia, Marie-Neige

    2014-01-01

    This volume presents a parametric packet-based audiovisual quality model for Internet Protocol TeleVision (IPTV) services. The model is composed of three quality modules for the respective audio, video and audiovisual components. The audio and video quality modules take as input a parametric description of the audiovisual processing path, and deliver an estimate of the audio and video quality. These outputs are sent to the audiovisual quality module which provides an estimate of the audiovisual quality. Estimates of perceived quality are typically used both in the network planning phase and as part of the quality monitoring. The same audio quality model is used for both these phases, while two variants of the video quality model have been developed for addressing the two application scenarios. The addressed packetization scheme is MPEG2 Transport Stream over Real-time Transport Protocol over Internet Protocol. In the case of quality monitoring, that is the case for which the network is already set-up, the aud...
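
    The book's actual model coefficients are not reproduced in this record, but parametric audiovisual models of this family typically fuse the separate audio and video quality estimates through a linear term plus an audio-video interaction term. A minimal, purely illustrative sketch follows; all coefficient values are invented for demonstration and are not taken from the volume.

    ```python
    def audiovisual_quality(q_audio, q_video, a0=0.2, a1=0.10, a2=0.45, a3=0.12):
        """Combine audio and video quality estimates (both on a 1-5 MOS scale)
        into an audiovisual estimate using a linear + interaction form:

            Q_av = a0 + a1*Q_a + a2*Q_v + a3*Q_a*Q_v

        The coefficients here are placeholders; a real model would fit them
        to subjective test data for the target service and packetization.
        """
        q = a0 + a1 * q_audio + a2 * q_video + a3 * q_audio * q_video
        return min(5.0, max(1.0, q))   # clip to the MOS range

    print(audiovisual_quality(4.2, 3.1))
    ```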

  4. Digital watermarks in electronic document circulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Grigorievich Ivanenko

    2017-07-01

This paper reviews different protection methods for electronic documents and their strengths and weaknesses. Common attacks on electronic documents are analyzed. Digital signatures and ways of eliminating their flaws are studied. Different digital watermark embedding methods are described; they are divided into two types. The proposed solution for the protection of electronic documents is based on embedding digital watermarks, and a comparative analysis of the embedding methods is given. As a result, the most convenient method is suggested: reversible data hiding. It is noted that this technique excels at securing the integrity of the container and its digital watermark. A digital watermark embedding system should prevent illegal access to the digital watermark and its container. Digital watermark requirements for electronic document protection are formulated. The legal aspect of copyright protection is reviewed. Advantages of embedding digital watermarks in electronic documents are presented. Modern reversible data hiding techniques are studied. Distinctive features of digital watermark use in Russia are highlighted. A digital watermark serves as an additional layer of defense that is in most cases unknown to the violator. With an embedded digital watermark, it is impossible to misappropriate the authorship of the document, even if the intruder signs his name on it. Therefore, digital watermarks can act as an effective additional tool to protect electronic documents.
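
    As a concrete, simplified illustration of reversible data hiding (the technique the paper recommends), the sketch below embeds one bit into a pair of grey-level values by difference expansion and recovers both the bit and the original pair exactly. A practical scheme would additionally handle overflow and keep a location map of expandable pairs; the function names are hypothetical.

    ```python
    def embed_bit(x, y, bit):
        """Embed one bit into the pixel pair (x, y) by difference expansion."""
        l = (x + y) // 2          # integer average, recoverable after embedding
        h = x - y                 # difference, expanded to carry the bit
        h2 = 2 * h + bit
        return l + (h2 + 1) // 2, l - h2 // 2

    def extract_bit(x2, y2):
        """Recover the embedded bit and the original pixel pair."""
        h2 = x2 - y2
        l = (x2 + y2) // 2
        bit = h2 % 2              # Python's % is non-negative for a positive modulus
        h = h2 // 2               # floor division undoes the expansion
        return (l + (h + 1) // 2, l - h // 2), bit

    pair = (100, 98)
    marked = embed_bit(*pair, 1)
    restored, bit = extract_bit(*marked)
    print(marked, restored, bit)   # (102, 97) (100, 98) 1
    ```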

  5. Metrics for analyzing the quality of model transformations

    NARCIS (Netherlands)

    Amstel, van M.F.; Lange, C.F.J.; Brand, van den M.G.J.; Falcone, G.; Guéhéneuc, Y.G.; Lange, C.F.J.; Porkoláb, Z.; Sahraoui, H.A.

    2008-01-01

    Model transformations become increasingly important with the emergence of model driven engineering of, amongst others, objectoriented software systems. It is therefore necessary to define and evaluate the quality of model transformations. The goal of our research is to make the quality of model

  6. 45 CFR 641.16 - Preparation of environmental documents, generally.

    Science.gov (United States)

    2010-10-01

    ... affect climate and weather patterns; (3) May adversely affect air or water quality; (4) May affect... knowledge and expertise. (e) Type of environmental document. The type of environmental document required... bases for much of the year, and the need to obtain items or materials requiring long lead times), it may...

  7. Photographic Documentation of Emerald Spreadwing at TA-3, LANL

    Energy Technology Data Exchange (ETDEWEB)

    Foy, Bernard R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-20

    Los Alamos National Laboratory has a considerable amount of suitable habitat for odonates, or dragonflies and damselflies. Few of these have been properly documented, however. With photographic documentation, the quality and size of odonate habitat on land owned by the Department of Energy will become more apparent to land managers.

  8. Perceived Service Quality models: Are They Still Relevant?

    OpenAIRE

    Polyakova, Olga; Mirza, Mohammed T.

    2015-01-01

    This paper reviews the concept of perceived service quality and provides an update to the body of service quality knowledge. It consolidates the pathway of perceived service quality concept, from its emergence to the research model’s development. It also critically reviews service characteristics as prerequisites of perceived service quality conceptualisation. The examination of six perceived service quality models is intended to identify a superior model that could be used by further researc...

  9. A SYSTEMATIC STUDY OF SOFTWARE QUALITY MODELS

    OpenAIRE

    Dr.Vilas. M. Thakare; Ashwin B. Tomar

    2011-01-01

This paper aims to provide a basis for software quality model research, through a systematic study of papers. It identifies nearly seventy software quality research papers from journals and classifies papers as per research topic, estimation approach, study context and data set. The paper's results combined with other knowledge provide support for recommendations in future software quality model research: to increase the area of search for relevant studies, carefully select the papers within a set ...

  10. Stamp Detection in Color Document Images

    DEFF Research Database (Denmark)

    Micenkova, Barbora; van Beusekom, Joost

    2011-01-01

... moreover, it can be imprinted with a variable quality and rotation. Previous methods were restricted to detection of stamps of particular shapes or colors. The method presented in the paper includes segmentation of the image by color clustering and subsequent classification of candidate solutions ... by geometrical and color-related features. The approach allows for differentiation of stamps from other color objects in the document such as logos or texts. For the purpose of evaluation, a data set of 400 document images has been collected, annotated and made public. With the proposed method, recall of 83...
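
    The record is only a snippet, but the segmentation step it mentions (color clustering followed by classification of candidate regions) can be illustrated with a small k-means sketch. The cluster count and the chromaticity heuristic below are assumptions made for demonstration, not the parameters of the published method.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def color_candidate_mask(image_rgb, n_clusters=4, min_chroma=30.0):
        """Cluster pixel colors and keep the clearly chromatic clusters.

        `image_rgb` is an (H, W, 3) uint8 array.  Returns a boolean mask of
        pixels belonging to saturated (non-grey) clusters; these regions are
        the candidates later classified by shape and color features.
        """
        pixels = image_rgb.reshape(-1, 3).astype(float)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
        # A cluster is "chromatic" if its center's RGB channels differ strongly.
        chroma = km.cluster_centers_.max(axis=1) - km.cluster_centers_.min(axis=1)
        keep = np.where(chroma >= min_chroma)[0]
        return np.isin(km.labels_, keep).reshape(image_rgb.shape[:2])

    # Toy image: noisy grey background with a blue "stamp" block.
    rng = np.random.default_rng(0)
    img = np.clip(200 + rng.normal(0, 5, (64, 64, 3)), 0, 255).astype(np.uint8)
    img[20:40, 20:40] = (40, 60, 180)
    print(color_candidate_mask(img).sum())   # roughly the 400 pixels of the blue block
    ```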

  11. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    Science.gov (United States)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as
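
    The Toolbox itself is MATLAB, but the core idea of metadata-driven quality control rules that attach qualifier flags to individual values can be sketched in a few lines of Python. The rule syntax and flag codes below are invented for illustration and do not reproduce the GCE Data Toolbox rule language.

    ```python
    def apply_qc_rules(values, rules):
        """Assign a qualifier flag to each value from an ordered list of rules.

        Each rule is (predicate, flag); the first matching rule wins, missing
        values are flagged 'M', and values passing every check get '' (no flag).
        """
        flags = []
        for v in values:
            flag = ""
            for predicate, code in rules:
                if v is None or predicate(v):
                    flag = "M" if v is None else code
                    break
            flags.append(flag)
        return flags

    # Illustrative rules for a salinity sensor: I = invalid, Q = questionable.
    rules = [
        (lambda v: v < 0 or v > 42, "I"),     # outside the physically possible range
        (lambda v: v > 36, "Q"),              # possible but suspicious for this site
    ]
    salinity = [31.2, 44.0, None, 37.5, 30.8]
    print(list(zip(salinity, apply_qc_rules(salinity, rules))))
    # [(31.2, ''), (44.0, 'I'), (None, 'M'), (37.5, 'Q'), (30.8, '')]
    ```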

  12. Psychometric Evaluation of the D-Catch, an Instrument to Measure the Accuracy of Nursing Documentation.

    Science.gov (United States)

    D'Agostino, Fabio; Barbaranelli, Claudio; Paans, Wolter; Belsito, Romina; Juarez Vela, Raul; Alvaro, Rosaria; Vellone, Ercole

    2017-07-01

    To evaluate the psychometric properties of the D-Catch instrument. A cross-sectional methodological study. Validity and reliability were estimated with confirmatory factor analysis (CFA) and internal consistency and inter-rater reliability, respectively. A sample of 250 nursing documentations was selected. CFA showed the adequacy of a 1-factor model (chronologically descriptive accuracy) with an outlier item (nursing diagnosis accuracy). Internal consistency and inter-rater reliability were adequate. The D-Catch is a valid and reliable instrument for measuring the accuracy of nursing documentation. Caution is needed when measuring diagnostic accuracy since only one item measures this dimension. The D-Catch can be used as an indicator of the accuracy of nursing documentation and the quality of nursing care. © 2015 NANDA International, Inc.

  13. Evaluation of model quality predictions in CASP9

    KAUST Repository

    Kryshtafovych, Andriy

    2011-01-01

CASP has been assessing the state of the art in the a priori estimation of accuracy of protein structure prediction since 2006. The inclusion of model quality assessment category in CASP contributed to a rapid development of methods in this area. In the last experiment, 46 quality assessment groups tested their approaches to estimate the accuracy of protein models as a whole and/or on a per-residue basis. We assessed the performance of these methods predominantly on the basis of the correlation between the predicted and observed quality of the models on both global and local scales. The ability of the methods to identify the models closest to the best one, to differentiate between good and bad models, and to identify well modeled regions was also analyzed. Our evaluations demonstrate that even though global quality assessment methods seem to approach perfection point (weighted average per-target Pearson's correlation coefficients are as high as 0.97 for the best groups), there is still room for improvement. First, all top-performing methods use consensus approaches to generate quality estimates, and this strategy has its own limitations. Second, the methods that are based on the analysis of individual models lag far behind clustering techniques and need a boost in performance. The methods for estimating per-residue accuracy of models are less accurate than global quality assessment methods, with an average weighted per-model correlation coefficient in the range of 0.63-0.72 for the best 10 groups.
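
    The headline statistic in this assessment, a weighted average of per-target Pearson correlation coefficients between predicted and observed model quality, can be computed along the lines sketched below. The exact weighting used by the assessors is not reproduced here; in this sketch each target is simply weighted by the number of models it contributes, and the scores are invented.

    ```python
    import numpy as np

    def weighted_per_target_pearson(targets):
        """Average per-target Pearson r between predicted and observed quality.

        `targets` maps a target id to (predicted, observed) score arrays for
        all models submitted for that target; each target's r is weighted by
        the number of models it contributes (one simple weighting choice).
        """
        rs, weights = [], []
        for predicted, observed in targets.values():
            predicted, observed = np.asarray(predicted), np.asarray(observed)
            rs.append(np.corrcoef(predicted, observed)[0, 1])
            weights.append(len(predicted))
        return np.average(rs, weights=weights)

    targets = {
        "T0515": ([0.9, 0.7, 0.4, 0.2], [0.88, 0.65, 0.45, 0.30]),
        "T0531": ([0.8, 0.6, 0.5],      [0.70, 0.66, 0.40]),
    }
    print(round(weighted_per_target_pearson(targets), 3))
    ```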

  14. One multi-media environmental system with linkage between meteorology/ hydrology/ air quality models and water quality model

    Science.gov (United States)

    Tang, C.; Lynch, J. A.; Dennis, R. L.

    2016-12-01

The biogeochemical processing of nitrogen and associated pollutants is driven by meteorological and hydrological processes in conjunction with pollutant loading. There are feedbacks between meteorology and hydrology that will be affected by land-use change and climate change. Changes in meteorology will affect pollutant deposition. It is important to account for those feedbacks and produce internally consistent simulations of meteorology, hydrology, and pollutant loading to drive the (watershed/water quality) biogeochemical models. In this study, the ecological response to emission reductions in streams in the Potomac watershed was evaluated. First, we simulated the deposition using the fully coupled Weather Research & Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model; second, we created the hydrological data with the offline-linked Variable Infiltration Capacity (VIC) and WRF models. Lastly, we investigated the water quality with one comprehensive environmental model, namely the linkage of the CMAQ, WRF, VIC and Model of Acidification of Groundwater In Catchment (MAGIC) models, from 2002 to 2010. The simulated results (such as NO3, SO4, and SBC) fit the observed values well. The linkage provides a generally accurate, well-tested tool for evaluating the sensitivity of acidification and other biogeochemical processes to varying meteorology and environmental changes, with the capability to comprehensively explore strategic policy and management design.

  15. A new parameterization for integrated population models to document amphibian reintroductions.

    Science.gov (United States)

    Duarte, Adam; Pearl, Christopher A; Adams, Michael J; Peterson, James T

    2017-09-01

    Managers are increasingly implementing reintroduction programs as part of a global effort to alleviate amphibian declines. Given uncertainty in factors affecting populations and a need to make recurring decisions to achieve objectives, adaptive management is a useful component of these efforts. A major impediment to the estimation of demographic rates often used to parameterize and refine decision-support models is that life-stage-specific monitoring data are frequently sparse for amphibians. We developed a new parameterization for integrated population models to match the ecology of amphibians and capitalize on relatively inexpensive monitoring data to document amphibian reintroductions. We evaluate the capability of this model by fitting it to Oregon spotted frog (Rana pretiosa) monitoring data collected from 2007 to 2014 following their reintroduction within the Klamath Basin, Oregon, USA. The number of egg masses encountered and the estimated adult and metamorph abundances generally increased following reintroduction. We found that survival probability from egg to metamorph ranged from 0.01 in 2008 to 0.09 in 2009 and was not related to minimum spring temperatures, metamorph survival probability ranged from 0.13 in 2010-2011 to 0.86 in 2012-2013 and was positively related to mean monthly temperatures (logit-scale slope = 2.37), adult survival probability was lower for founders (0.40) than individuals recruited after reintroduction (0.56), and the mean number of egg masses per adult female was 0.74. Our study is the first to test hypotheses concerning Oregon spotted frog egg-to-metamorph and metamorph-to-adult transition probabilities in the wild and document their response at multiple life stages following reintroduction. Furthermore, we provide an example to illustrate how the structure of our integrated population model serves as a useful foundation for amphibian decision-support models within adaptive management programs. The integration of multiple, but
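
    To make the reported logit-scale temperature effect concrete, the short sketch below converts an intercept and the slope of 2.37 into metamorph survival probabilities via the inverse logit. Only the slope is taken from the abstract; the intercept and the standardized temperature values are assumptions made for illustration.

    ```python
    import math

    def survival_probability(std_temp, intercept=-1.5, slope=2.37):
        """Inverse-logit survival as a function of standardized mean monthly temperature.

        logit(phi) = intercept + slope * temp  ->  phi = 1 / (1 + exp(-logit)).
        The slope (2.37) is the value reported for metamorph survival; the
        intercept and the standardization of temperature are assumptions here.
        """
        logit = intercept + slope * std_temp
        return 1.0 / (1.0 + math.exp(-logit))

    for temp in (-1.0, 0.0, 1.0):   # e.g. cool, average and warm years (standardized)
        print(temp, round(survival_probability(temp), 2))
    ```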

  16. AUDIT plan documenting method

    International Nuclear Information System (INIS)

    Cornecsu, M.

    1995-01-01

The work describes a method of documenting the AUDIT plan on the basis of two quantitative elements resulting from quality assurance programme appraisal: the system function implementation degree, as established from the latest AUDIT performed, and the system function weight in the QAP, respectively appraised by taking into account their significance for the activities that are to be performed in the period for which the AUDITs are planned. (Author) 3 Figs., 2 Refs

  17. Patent office governance and patent system quality

    OpenAIRE

    PICARD, Pierre M.; VAN POTTELSBERGHE DE LA POTTERIE, Bruno

    2011-01-01

The present paper discusses the role of quality in patent systems from the perspective of patent offices' behavior and organization. After documenting original stylized facts, the paper presents a model in which patent offices set patent fees and the quality level of their examination processes. Various objectives of patent offices' governors are considered. We show that the quality of the patent system is maximal for the patent office that maximises either the social welfare or its own prof...

  18. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

The paper establishes the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate quality assurance and improvement methods and tools are identified. The main manufacturing process principles are investigated in order to scrutinize one general model of the manufacturing process and to define the manufacturing process preparation level. The development and introduction of the operational quality improvement model are based on research conducted on the application possibilities of these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for the practical and systematic application of quality improvement methods and tools.

  19. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations, I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the ’goodness’ of a business process model.

  20. Model of Quality Management System Using Malcolm Baldrige Criteria in Nursing Education in Surabaya

    Directory of Open Access Journals (Sweden)

    A. Aziz Alimul Hidayat

    2015-04-01

Introduction: The quality of most nursing education programs in Surabaya is still at a low level. This is due to the fact that processes and job performance have not yet been integrated in a systematic and flexible way that is in line with the capacity of the organization and the needs of graduates. This study aims to develop a model of the quality management system of a Nursing bachelor's degree program based on the Malcolm Baldrige Criteria for Performance Excellence. Method: The method used is a cross-sectional survey design. This research was conducted with a sample of eight institutions and twenty-four respondents. The data were collected by means of interviews, questionnaires and documentation. Analysis of the data used Partial Least Squares (PLS). Result: The results showed that (1) leadership affects the study program profile, and the profile affects job performance; (2) leadership affects strategic planning, strategic planning affects the focus on human resources, and the focus on human resources affects the focus on processes and finally affects job performance as well; (3) customer focus affects leadership, and leadership affects strategic planning; as a result, strategic planning affects the focus on human resources, which similarly affects the focus on processes and finally job performance; (4) all variables are affected by measurement, analysis and knowledge management, except strategic planning. Discussion: Based on the above results, a model of the quality management system can be developed using the Malcolm Baldrige criteria for the purpose of increasing the quality of a Nursing Study Program. On the other hand, this model can be used as a reference for the organization at the level of the Nursing Study Program (Strategic Business Unit) to restructure the performance of the college in global competition. Keywords: model of quality management system, nursing study program, Malcolm Baldrige criteria for performance excellence

  1. Putting people into water quality modelling.

    Science.gov (United States)

    Strickert, G. E.; Hassanzadeh, E.; Noble, B.; Baulch, H. M.; Morales-Marin, L. A.; Lindenschmidt, K. E.

    2017-12-01

Water quality in the Qu'Appelle River Basin, Saskatchewan is under pressure due to nutrient pollution entering the river system from major cities, industrial zones and agricultural areas. Among these stressors, agricultural activities are basin-wide; therefore, they are the largest non-point source of water pollution in this region. The dynamics of agricultural impacts on water quality are complex and stem from decisions and activities of two distinct stakeholder groups, namely grain farmers and cattle producers, which have different business plans, values, and attitudes towards water quality. As a result, improving water quality in this basin requires engaging with stakeholders to: (1) understand their perspectives regarding a range of agricultural Beneficial Management Practices (BMPs) that can improve water quality in the region, (2) show them the potential consequences of their selected BMPs, and (3) work with stakeholders to better understand the barriers and incentives to implement the effective BMPs. To this end, we held a series of workshops in the Qu'Appelle River Basin with both groups of stakeholders to understand their viewpoints about alternative agricultural BMPs and their impact on water quality. Workshop participants were involved in a statement sorting activity (Q-sorts), group discussions, and a mapping activity. The workshop outcomes show that stakeholders had four distinct viewpoints about the BMPs that can improve water quality, i.e., flow and erosion control, fertilizer management, cattle site management, as well as mixed cattle and wetland management. Accordingly, to simulate the consequences of stakeholder-selected BMPs, a conceptual water quality model was developed using System Dynamics (SD). The model estimates potential changes in water quality at the farm, tributary and regional scale in the Qu'Appelle River Basin under each stakeholder-selected BMP and/or combinations of them. The SD model was then used for real
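
    The conceptual System Dynamics model itself is not given in this record, but its basic building block, a stock of in-stream nutrient load with a BMP-dependent inflow and a first-order loss, can be sketched as a simple Euler integration. Every coefficient below is an assumption chosen for demonstration, not a calibrated value for the Qu'Appelle basin.

    ```python
    def simulate_nutrient_load(years=20, dt=0.25,
                               baseline_input=100.0,   # t N per year entering the reach
                               bmp_effectiveness=0.3,  # fraction of load removed by the BMP mix
                               loss_rate=0.8):         # first-order in-stream loss (1/year)
        """Euler integration of a single stock: d(load)/dt = inflow - loss_rate * load."""
        load, trajectory = 0.0, []
        steps = int(years / dt)
        for step in range(steps):
            inflow = baseline_input * (1.0 - bmp_effectiveness)
            load += dt * (inflow - loss_rate * load)
            trajectory.append((round((step + 1) * dt, 2), round(load, 1)))
        return trajectory

    # The stock approaches inflow / loss_rate = 70 / 0.8 = 87.5 t N as a steady state.
    print(simulate_nutrient_load()[-1])
    ```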

  2. Technical basis document for internal dosimetry

    International Nuclear Information System (INIS)

    Hickman, D.P.

    1991-01-01

This document provides the technical basis for the Chem-Nuclear Geotech (Geotech) internal dosimetry program. Geotech policy describes the intentions of the company in complying with radiation protection standards and the as low as reasonably achievable (ALARA) program. It uses this policy and applicable protection standards to derive acceptable methods and levels of bioassay to assure compliance. The models and computational methods used are described in detail within this document. From these models, dose-conversion factors and derived limits are computed. These computations are then verified using existing documentation and verification information or by demonstration of the calculations used to obtain the dose-conversion factors and derived limits. Recommendations for methods of optimizing the internal dosimetry program to provide effective monitoring and dose assessment for workers are provided in the last section of this document. This document is intended to be used in establishing an accredited dosimetry program in accordance with expected Department of Energy Laboratory Accreditation Program (DOELAP) requirements for the selected radionuclides provided in this document, including uranium mill tailing mixtures. Additions and modifications to this document and procedures derived from this document are expected in the future according to changes in standards and changes in programmatic mission.

  3. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III/trademark/, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III/trademark/ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations

  4. Technical basis document for internal dosimetry

    CERN Document Server

    Hickman, D P

    1991-01-01

This document provides the technical basis for the Chem-Nuclear Geotech (Geotech) internal dosimetry program. Geotech policy describes the intentions of the company in complying with radiation protection standards and the as low as reasonably achievable (ALARA) program. It uses this policy and applicable protection standards to derive acceptable methods and levels of bioassay to assure compliance. The models and computational methods used are described in detail within this document. From these models, dose-conversion factors and derived limits are computed. These computations are then verified using existing documentation and verification information or by demonstration of the calculations used to obtain the dose-conversion factors and derived limits. Recommendations for methods of optimizing the internal dosimetry program to provide effective monitoring and dose assessment for workers are provided in the last section of this document. This document is intended to be used in establishing an accredited dosi...

  5. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  6. Pedagogical Documentation as a Lens for Examining Equality in Early Childhood Education

    Science.gov (United States)

    Paananen, Maiju; Lipponen, Lasse

    2018-01-01

    In this paper, we consider pedagogical quality particularly as equal opportunities for participating in decision-making in preschool. Relying on Ferraris' [2013. "Documentality: Why it is necessary to leave traces." New York: Fordham University Press] theory of Documentality, we demonstrate how pedagogical documentation can contribute to…

  7. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

Quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluation of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) do not fit very well into non-profit activities, policies and programmes, because these are much more complex than the environment from which the quality models derive (for example, an assembly line). Quality of an academic study programme is very complex and understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. Both are then joined into the evaluation model, revealing which evaluation criteria can and should be evaluated by which stakeholder. The main research questions are posed and the research method for each dimension is listed.

  8. A parsimonious dynamic model for river water quality assessment.

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types ranging from detailed physical models to simplified conceptual models are available. Actually, a possible middle ground between detailed and simplified models may be parsimonious models that represent the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on data available for correct model application. When there is inadequate data, it is mandatory to focus on a simple river water quality model rather than detailed ones. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model input or parameters was done based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
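
    The conceptualization described above, a series of linear channels (pure delay) and linear reservoirs (dispersion) with first-order decay of the pollutant, can be illustrated with the short routing sketch below. The number of reservoirs, the residence times and the decay constant are placeholders, not the calibrated parameters of the Savena application.

    ```python
    from collections import deque

    def route_pollutograph(inflow_load, delay_steps=4, n_reservoirs=3, k_res=6.0,
                           decay=0.05, dt=1.0):
        """Route a BOD load series through a linear channel followed by a
        cascade of linear reservoirs with first-order decay.

        The channel shifts the wave by `delay_steps`; each reservoir obeys
        dS/dt = inflow - S/k_res - decay*S, and its outflow S/k_res feeds the next.
        """
        channel = deque([0.0] * delay_steps)
        storages = [0.0] * n_reservoirs
        outflow = []
        for load in inflow_load:
            channel.append(load)
            q = channel.popleft()                 # delayed wave leaves the channel
            for i in range(n_reservoirs):
                storages[i] += dt * (q - storages[i] / k_res - decay * storages[i])
                q = storages[i] / k_res           # reservoir outflow feeds downstream
            outflow.append(round(q, 2))
        return outflow

    pulse = [0, 0, 50, 80, 40, 10] + [0] * 24      # a short pollution wave (kg/h)
    print(route_pollutograph(pulse))
    ```

    The printed series shows the wave arriving later, flatter and slightly attenuated, which is exactly the delay, dispersion and decay behaviour the channel/reservoir conceptualization is meant to reproduce.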

  9. Modelling of Operative Report Documents for Data Integration into an openEHR-Based Enterprise Data Warehouse.

    Science.gov (United States)

    Haarbrandt, Birger; Wilschko, Andreas; Marschollek, Michael

    2016-01-01

    In order to integrate operative report documents from two operating room management systems into a data warehouse, we investigated the application of the two-level modelling approach of openEHR to create a shared data model. Based on the systems' analyses, a template consisting of 13 archetypes has been developed. Of these 13 archetypes, 3 have been obtained from the international archetype repository of the openEHR foundation. The remaining 10 archetypes have been newly created. The template was evaluated by an application system expert and through conducting a first test mapping of real-world data from one of the systems. The evaluation showed that by using the two-level modelling approach of openEHR, we succeeded in representing an integrated and shared information model for operative report documents. More research is needed to learn about the limitations of this approach in other data integration scenarios.

  10. Regulation No. 56/2006 Coll. of the Nuclear Regulatory Authority of the Slovak Republic dated as of January 12, 2006 on details concerning requirements for quality system documentation of authorisation holder, as well as details concerning quality requirements for nuclear installations, details concerning quality requirements for classified equipment and details concerning the scope of their approval

    International Nuclear Information System (INIS)

    2006-01-01

    This Regulation provides details of the requirements for the quality system documentation of the authorisation holder, details of the quality requirements for nuclear installations, details concerning quality requirements for classified equipment and details of the scope of their approval. This Regulation came into force on March 1, 2006.

  11. Assessment model validity document - HYDRASTAR. A stochastic continuum program for groundwater flow

    Energy Technology Data Exchange (ETDEWEB)

    Gylling, B. [Kemakta Konsult AB, Stockholm (Sweden); Eriksson, Lars [Equa Simulation AB, Sundbyberg (Sweden)

    2001-12-01

    The present document addresses validation of the stochastic continuum model HYDRASTAR, designed for Monte Carlo simulations of groundwater flow in fractured rocks. Here, validation is defined as a process to demonstrate that a model concept is fit for its purpose. Preferably, validation is carried out by comparison of model predictions with independent field observations and experimental measurements. In addition, other sources can also be used to confirm that the model concept gives acceptable results. One method is to compare results with those achieved using other model concepts for the same set of input data. Another method is to compare model results with analytical solutions. The model concept HYDRASTAR has been used in several studies, including performance assessments of hypothetical repositories for spent nuclear fuel. In the performance assessments, the main tasks for HYDRASTAR have been to calculate groundwater travel time distributions, repository flux distributions, path lines and their exit locations. The results have then been used by other model concepts to calculate the near-field release and far-field transport. The aim and framework for the validation process include describing the applicability of the model concept for its purpose in order to build confidence in the concept. Preferably, this is done by comparisons of simulation results with the corresponding field experiments or field measurements. Here, two comparisons with experimental results are reported. In both cases the agreement was reasonably fair. In the broader and more general context of the validation process, HYDRASTAR results have been compared with other models and analytical solutions. Commonly, the approximation calculations agree well with the medians of model ensemble results. Additional indications that HYDRASTAR is suitable for its purpose were obtained from the comparisons with results from other model concepts. Several verification studies have been made for

  12. Discrepancies in Communication Versus Documentation of Weight-Management Benchmarks

    Directory of Open Access Journals (Sweden)

    Christy B. Turer MD, MHS

    2017-02-01

    Full Text Available To examine gaps in communication versus documentation of weight-management clinical practices, communication was recorded during primary care visits with 6- to 12-year-old overweight/obese Latino children. Communication/documentation content was coded by 3 reviewers using communication transcripts and health-record documentation. Discrepancies in communication/documentation content codes were resolved through consensus. Bivariate/multivariable analyses examined factors associated with discrepancies in benchmark communication/documentation. Benchmarks were neither communicated nor documented in up to 42% of visits, and communicated but not documented or documented but not communicated in up to 20% of visits. Lowest benchmark performance rates were for laboratory studies (35%) and nutrition/weight-management referrals (42%). In multivariable analysis, overweight (vs obesity) was associated with 1.6 more discrepancies in communication versus documentation (P = .03). Many weight-management benchmarks are not met, not documented, or performed without being communicated. Enhanced communication with families and documentation in health records may promote lifestyle changes in overweight children and higher quality care for overweight children in primary care.

  13. 3D laser scanning techniques applying to tunnel documentation and geological mapping at Aespoe hard rock laboratory, Sweden

    International Nuclear Information System (INIS)

    Feng, Q.; Wang, G.; Roeshoff, K.

    2008-01-01

    3D terrestrial laser scanning is nowadays one of the most attractive methods for 3D mapping and documentation of rock faces and tunnels, and shows great potential to improve data quality and provide good solutions in rock engineering projects. In this paper, state-of-the-art methods are described for different possibilities for tunnel documentation and geological mapping based on 3D laser scanning data. Some results are presented from the case study performed at the Aespoe Hard Rock Laboratory, run by SKB, the Swedish Nuclear Fuel and Waste Management Co. Compared to traditional methods, 3D laser scanning techniques not only provide a rapid, 3D digital way of documenting tunnels, but also create the potential to achieve high-quality data, which might benefit different rock engineering project procedures, including field data acquisition, data processing, data retrieval and management, and also modelling and design. (authors)

  14. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with as representative example the Probabilistic XML model (PXML) of [10,9]. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism, that produce equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.
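    The "rather simple generic DAG-compression technique" mentioned above amounts to storing identical subtrees only once and sharing them; a minimal hash-consing sketch over plain nested tuples (not actual PXML, and with made-up element names) illustrates the idea.

```python
def tree_size(node):
    """Number of nodes in the uncompressed tree."""
    tag, *children = node
    return 1 + sum(tree_size(c) for c in children)

def dag_compress(node, table=None):
    """Generic DAG compression: identical subtrees get one shared id.
    A node is (tag, child, child, ...); returns (node id, shared table)."""
    if table is None:
        table = {}
    tag, *children = node
    key = (tag, tuple(dag_compress(c, table)[0] for c in children))
    if key not in table:
        table[key] = len(table)        # assign a new shared-node id
    return table[key], table

# toy XML-like tree with two identical <item> subtrees (hypothetical tags)
item = ("item", ("name",), ("price",))
doc = ("catalog", item, item, ("footer",))

root_id, table = dag_compress(doc)
print(f"tree nodes: {tree_size(doc)}, DAG nodes: {len(table)}")  # sharing saves nodes
```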

  15. Owning solutions: a collaborative model to improve quality in hospital care for Aboriginal Australians.

    Science.gov (United States)

    Durey, Angela; Wynaden, Dianne; Thompson, Sandra C; Davidson, Patricia M; Bessarab, Dawn; Katzenellenbogen, Judith M

    2012-06-01

    Well-documented health disparities between Aboriginal and Torres Strait Islander (hereafter referred to as Aboriginal) and non-Aboriginal Australians are underpinned by complex historical and social factors. The effects of colonisation, including racism, continue to impact negatively on Aboriginal health outcomes, despite being under-recognised and under-reported. Many Aboriginal people find hospitals unwelcoming and are reluctant to attend for diagnosis and treatment, particularly with few Aboriginal health professionals employed in these facilities. In this paper, scientific literature and reports on Aboriginal health care, methodology and cross-cultural education are reviewed to inform a collaborative model of hospital-based organisational change. The paper proposes a collaborative model of care to improve health service delivery by building capacity in Aboriginal and non-Aboriginal personnel by recruiting more Aboriginal health professionals, increasing knowledge and skills to establish good relationships between non-Aboriginal care providers and Aboriginal patients and their families, delivering quality care that is respectful of culture, and improving Aboriginal health outcomes. A key element of model design, implementation and evaluation is critical reflection on barriers and facilitators to providing respectful and culturally safe quality care at systemic, interpersonal and patient/family-centred levels. Nurses are central to addressing the current state of inequity and are pivotal change agents within the proposed model. © 2011 Blackwell Publishing Ltd.

  16. An Automated System for the Maintenance of Multiform Documentation

    Science.gov (United States)

    Rousseau, Bertrand; Ruggier, Mario; Smith, Matthiew

    Software documentation for the user often exists in several forms: paper, electronic, on-line help, etc. We have built a system to help with the writing and maintenance of such documentation, which relies on the FrameMaker product. As an example, we show how it is used to maintain the ADAMO documentation, delivered in 4 incarnations: paper, WWW hypertext, KUIP and running examples. The use of the system results in both time savings and quality improvements.

  17. 222-S Laboratory Quality Assurance Plan. Revision 1

    International Nuclear Information System (INIS)

    Meznarich, H.K.

    1995-01-01

    This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.

  18. Air quality and Atmospheric resources: Phase 1: Background document

    International Nuclear Information System (INIS)

    2001-01-01

    The Environment and Sustainable Development Indicators (ESDI) initiative, under the umbrella of the National Round Table on the Environment and the Economy (NRTEE), commissioned a study for the evaluation and the development of sustainable development indicators (SDIs) in the field of air quality and atmospheric resources. The report contained key information with regard to each indicator or indicator set, and no comprehensive comparative analysis was performed. The report was designed to be used as a technical reference. Where appropriate, SDIs developed by foreign organizations were included. The emphasis of the report was: (1) ambient air quality and human health effects, (2) air emissions having transboundary or global implications for ecosystem health and human health, and (3) demand on the atmosphere for environmental services. The bulk of the research was conducted on the Internet. The report was divided into three sections. A review of the availability of SDIs based on ambient air quality measures was discussed in the first section, while the second section was devoted to the availability of SDIs based on pollutant emission levels. The last section contained a systematic review of those SDIs used or being proposed along with the supporting data available to calculate SDI values. Some observations were also made touching on topics such as the abundance of ambient air quality information, the abundance of pollutant emissions information, the linkages between emissions and ambient air quality, the absence of forecasting, the indoor air quality gap, and the connections to human health. refs., 1 fig

  19. Technical Document on Control of Nitrogen Oxides From Municipal Waste Combustors

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  20. PSA applications. Good practices and documentation

    International Nuclear Information System (INIS)

    Dewailly, J.; Magne, L.

    1997-10-01

    In this paper, it is shown what the condensed documentation of the main strategic choices and technical assumptions related to a PSA could contain: how to select the internal and external initiating events, how to detail the plant configuration and the general organization of the plant and operating staff, how to highlight the assumptions related to physical models, etc. The proposals in this documentation are based on R and D experience with PSA (construction of PSA models, use of PSA models for operation or maintenance, PSA tools). This document also presents different types of rules or recommendations related to PSA modelling for various applications involved in nuclear power plant operation. Finally, the paper stresses the main difficulties encountered (appropriate use of uncertainties, communication of PSA results to non-specialist users) and it also outlines some prospects for the future. (author)

  1. Airline service quality evaluation: A review on concepts and models

    OpenAIRE

    Navid Haghighat

    2017-01-01

    This paper reviews the major service quality concepts and models which led to great developments in evaluating service quality, focusing on the improvement process of the models through discussing criticisms of each model. Criticisms against these models are discussed to clarify the development steps of newer models which led to the improvement of airline service quality models. The precise and accurate evaluation of service quality needs utilizing a reliable concept with comprehensive crite...

  2. Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)

    Science.gov (United States)

    1992-05-31

    system, and functionality for specifying the layout of the document. 3.7.16.1 FrameMaker. FrameMaker is a Commercial Off The Shelf (COTS) component...facilitating WYSIWYG creation of formatted reports with embedded graphics. FrameMaker is an advanced publishing tool that integrates word processing...available for the component FrameMaker: * Product evaluation reports in ASCII and postscript formats * Product assessment on line in model * Product

  3. Surface Flux Modeling for Air Quality Applications

    Directory of Open Access Journals (Sweden)

    Limei Ran

    2011-08-01

    Full Text Available For many gases and aerosols, dry deposition is an important sink of atmospheric mass. Dry deposition fluxes are also important sources of pollutants to terrestrial and aquatic ecosystems. The surface fluxes of some gases, such as ammonia, mercury, and certain volatile organic compounds, can be upward into the air as well as downward to the surface and therefore should be modeled as bi-directional fluxes. Model parameterizations of dry deposition in air quality models have been represented by simple electrical resistance analogs for almost 30 years. Uncertainties in surface flux modeling in global to mesoscale models are being slowly reduced as more field measurements provide constraints on parameterizations. However, at the same time, more chemical species are being added to surface flux models as air quality models are expanded to include more complex chemistry and are being applied to a wider array of environmental issues. Since surface flux measurements of many of these chemicals are still lacking, resistances are usually parameterized using simple scaling by water or lipid solubility and reactivity. Advances in recent years have included bi-directional flux algorithms that require a shift from pre-computation of deposition velocities to fully integrated surface flux calculations within air quality models. Improved modeling of the stomatal component of chemical surface fluxes has resulted from improved evapotranspiration modeling in land surface models and closer integration between meteorology and air quality models. Satellite-derived land use characterization and vegetation products and indices are improving model representation of spatial and temporal variations in surface flux processes. This review describes the current state of chemical dry deposition modeling, recent progress in bi-directional flux modeling, synergistic model development research with field measurements, and coupling with meteorological land surface models.
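    The electrical resistance analog mentioned above expresses the deposition velocity as the inverse of the aerodynamic, quasi-laminar and surface resistances acting in series; a minimal sketch with illustrative resistance and concentration values follows.

```python
def deposition_velocity(r_a, r_b, r_c):
    """Classic resistance analog for dry deposition: aerodynamic (Ra),
    quasi-laminar boundary layer (Rb) and surface/canopy (Rc) resistances
    act in series (units: s/m), so Vd = 1 / (Ra + Rb + Rc)."""
    return 1.0 / (r_a + r_b + r_c)

# illustrative daytime values over a vegetated surface (s/m)
v_d = deposition_velocity(r_a=30.0, r_b=20.0, r_c=100.0)
c_air = 100.0                         # illustrative pollutant concentration, ug/m3
flux = -v_d * c_air                   # downward (depositing) flux, ug m-2 s-1
print(f"Vd = {v_d * 100:.2f} cm/s, F = {flux:.3f} ug m-2 s-1")
```

    A bi-directional formulation would replace the fixed downward flux with the difference between the ambient concentration and a surface compensation point, but the series-resistance structure stays the same.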

  4. Review Document: Full Software Trigger

    CERN Document Server

    Albrecht, J; Raven, G

    2014-01-01

    This document presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. This document serves as input for the internal review towards the "DAQ, online and trigger TDR". The proposed trigger system is implemented entirely in software. In this document we show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic $pp$-collision rate, without prior event selections implemented in custom hardware and without relying upon a partial event reconstruction. A track finding efficiency of 98.8 % relative to offline can be achieved for tracks with $p_T >$ 500 MeV/$c$. The CPU time required for this reconstruction is about 40 % of the available budget. Proof-of-principle selections are presented which demonstrate that excellent performance is achievable using an inclusive beauty trigger, in addition to exclusive beauty and charm triggers. Finally, it is shown that exclusive beauty and charm selections that do not intr...

  5. A Distributed Agent Implementation of Multiple Species Flocking Model for Document Partitioning Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    The Flocking model, first proposed by Craig Reynolds, is one of the first bio-inspired computational collective behavior models and has many popular applications, such as animation. Our early research resulted in a flock clustering algorithm that can achieve better performance than the K-means or Ant clustering algorithms for data clustering. This algorithm generates a clustering of a given set of data through the embedding of the high-dimensional data items on a two-dimensional grid for efficient clustering result retrieval and visualization. In this paper, we propose a bio-inspired clustering model, the Multiple Species Flocking clustering model (MSF), and present a distributed multi-agent MSF approach for document clustering.
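    A heavily simplified sketch of the flocking idea behind such clustering is given below: document-agents on a 2-D grid move toward neighbours with similar term vectors and away from dissimilar ones. The similarity threshold, neighbourhood radius and step size are illustrative, and this is not the ORNL implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def flock_step(pos, vecs, radius=5.0, step=0.5, sim_thresh=0.3):
    """One update of a simplified flocking-style clustering: each agent
    moves toward neighbours with similar term vectors, away from others."""
    new = pos.copy()
    for i in range(len(pos)):
        move = np.zeros(2)
        for j in range(len(pos)):
            if i != j and np.linalg.norm(pos[j] - pos[i]) < radius:
                d = pos[j] - pos[i]
                move += d if cosine(vecs[i], vecs[j]) > sim_thresh else -d
        if np.linalg.norm(move) > 0:
            new[i] = pos[i] + step * move / np.linalg.norm(move)
    return new

# toy corpus: two topics, term vectors scattered around two centroids
vecs = np.vstack([rng.normal(0, 0.1, (10, 5)) + [1, 0, 0, 0, 0],
                  rng.normal(0, 0.1, (10, 5)) + [0, 0, 0, 0, 1]])
pos = rng.uniform(0, 10, (20, 2))           # agents scattered on a 2-D grid
for _ in range(100):
    pos = flock_step(pos, vecs)

# documents of the same topic should end up spatially closer together
within = [np.linalg.norm(pos[i] - pos[j]) for i in range(10) for j in range(i + 1, 10)]
between = [np.linalg.norm(pos[i] - pos[j]) for i in range(10) for j in range(10, 20)]
print(f"mean within-topic distance : {np.mean(within):.2f}")
print(f"mean between-topic distance: {np.mean(between):.2f}")
```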

  6. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, Erwin Johan Albert

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However, a recent survey suggests that the full benefits of standards are not achieved, due to quality issues. This paper presents a quality model for semantic IS standards, that should

  7. Developing a TQM quality management method model

    NARCIS (Netherlands)

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This

  8. Applying a sociolinguistic model to the analysis of informed consent documents.

    Science.gov (United States)

    Granero-Molina, José; Fernández-Sola, Cayetano; Aguilera-Manrique, Gabriel

    2009-11-01

    Information on the risks and benefits related to surgical procedures is essential for patients in order to obtain their informed consent. Some disciplines, such as sociolinguistics, offer insights that are helpful for patient-professional communication in both written and oral consent. Communication difficulties become more acute when patients make decisions through an informed consent document because they may sign this with a lack of understanding and information, and consequently feel deprived of their freedom to make their choice about different treatments or surgery. This article discusses findings from documentary analysis using the sociolinguistic SPEAKING model, which was applied to the general and specific informed consent documents required for laparoscopic surgery of the bile duct at Torrecárdenas Hospital, Almería, Spain. The objective of this procedure was to identify flaws when information was provided, together with its readability, its voluntary basis, and patients' consent. The results suggest potential linguistic communication difficulties, different languages being used, cultural clashes, asymmetry of communication between professionals and patients, assignment of rights on the part of patients, and overprotection of professionals and institutions.

  9. Production system with process quality control: modelling and application

    Science.gov (United States)

    Tsou, Jia-Chi

    2010-07-01

    Over the past decade, there has been a great deal of research dedicated to the study of quality and the economics of production. In this article, we develop a dynamic model which is based on the hypothesis of a traditional economic production quantity model. Taguchi's cost of poor quality is used to evaluate the cost of poor quality in the dynamic production system. A practical case from the automotive industry, which uses the Six-sigma DMAIC methodology, is discussed to verify the proposed model. This study shows that there is an optimal value of quality investment to make the production system reach a reasonable quality level and minimise the production cost. Based on our model, the management can adjust its investment in quality improvement to generate considerable financial return.
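    Taguchi's quadratic loss, which the article uses to cost poor quality, can be illustrated with a minimal sketch; the loss coefficient, target and process parameters below are invented for illustration and are not the article's model.

```python
def taguchi_loss(y, target, k):
    """Taguchi quadratic loss: cost grows with the squared deviation from target."""
    return k * (y - target) ** 2

def expected_unit_cost(prod_cost, k, target, mu, sigma):
    """Expected cost per unit = production cost + expected quality loss.
    For a normal characteristic, E[(Y - m)^2] = sigma^2 + (mu - m)^2."""
    return prod_cost + k * (sigma ** 2 + (mu - target) ** 2)

# illustrative: investing in quality improvement shrinks the process spread
for sigma in (0.50, 0.25, 0.10):
    c = expected_unit_cost(prod_cost=4.0, k=20.0, target=10.0, mu=10.02, sigma=sigma)
    print(f"sigma = {sigma:4.2f}  expected unit cost = {c:5.2f}")
```

    The trade-off the article describes is between the cost of the quality investment that reduces sigma and the quality-loss savings it buys.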

  10. Future of laser electrophotographic technology for color document printing

    Science.gov (United States)

    Shahin, Michael M.

    1997-04-01

    Recent years have witnessed the development of laser electrophotography as one of the major technologies for document printing, serving a wide range of market applications. With the evolution of color and market demand for color hard copy, electrophotography is again taking center stage to serve customer needs in quality, cost and convenience. Today, electrophotographic technology is used to offer products for color document printing for desktop, mid-volume and high-speed applications. Total cost of ownership, convenience and quality today favor the use of this technology over alternatives in many applications. Development of higher speed color electrophotographic engines demands very high-speed Raster Input Processors and pre-press applications that are expected to become available in the market during the next five years. This presentation will cover the changing environment of office communication and the continuing role of electrophotography in color document printing.

  11. IR and OLAP in XML document warehouses

    DEFF Research Database (Denmark)

    Perez, Juan Manuel; Pedersen, Torben Bach; Berlanga, Rafael

    2005-01-01

    In this paper we propose to combine IR and OLAP (On-Line Analytical Processing) technologies to exploit a warehouse of text-rich XML documents. In the system we plan to develop, a multidimensional implementation of a relevance modeling document model will be used for interactively querying...

  12. Improving the quality of nursing documentation: An action research project

    Directory of Open Access Journals (Sweden)

    Elisha M. Okaisu

    2014-12-01

    Conclusion: Improving nursing documentation involved complex challenges in this setting and demanded multiple approaches. Evidence-based practice was the foundation of the changes in systems required to produce visible improvement in practice. The involvement of leadership in these efforts was very important.

  13. [Development and integration of the Oncological Documentation System ODS].

    Science.gov (United States)

    Raab, G; van Den Bergh, M

    2001-08-01

    To simplify clinical routine and to improve medical quality without exceeding the existing resources; to intensify communication and cooperation between all institutions involved in patients' health care. The huge amount of documentation work required of physicians can no longer be done without modern tools of paperless data processing. The development of ODS was a tight cooperation between physician and technician, which resulted in a mutual understanding and led to a high level of user convenience. At present, all cases of gynecology, and especially gynecologic oncology, can be documented and processed by ODS. Users will easily adopt the system, as data entry within the different program areas follows the same rules. In addition, users can choose between individual input of data and assistants guiding them through highly specific areas of documentation. ODS is a modern, modular structured and very fast multiuser database environment for in- and outpatient documentation. It automatically generates many reports for day-to-day clinical business. Statistical routines help the user reflect on his work and its quality. Documentation of clinical trials according to the GCP guidelines can be done by ODS using the internet or offline data sharing. As ODS is the synthesis of a computer-based patient administration system and an oncological documentation database, it represents the basis for the construction of the electronic patient chart as well as the digital documentation of clinical trials. The introduction of this new technology to physicians and nurses has to be done slowly and carefully, in order to increase motivation and to improve the results.

  14. Managing the consistency of distributed documents

    OpenAIRE

    Nentwich, C.

    2005-01-01

    Many businesses produce documents as part of their daily activities: software engineers produce requirements specifications, design models, source code, build scripts and more; business analysts produce glossaries, use cases, organisation charts, and domain ontology models; service providers and retailers produce catalogues, customer data, purchase orders, invoices and web pages. What these examples have in common is that the content of documents is often semantically relate...

  15. 77 FR 4808 - Conference on Air Quality Modeling

    Science.gov (United States)

    2012-01-31

    ... Modeling AGENCY: U.S. Environmental Protection Agency (EPA). ACTION: Notice of conference. SUMMARY: The EPA will be hosting the Tenth Conference on Air Quality Modeling on March 13-15, 2012. Section 320 of the... First, Second, and Third Conferences on Air Quality Modeling as required by CAA Section 320 to help...

  16. Evaluation of Hierarchical Clustering Algorithms for Document Datasets

    National Research Council Canada - National Science Library

    Zhao, Ying; Karypis, George

    2002-01-01

    Fast and high-quality document clustering algorithms play an important role in providing intuitive navigation and browsing mechanisms by organizing large amounts of information into a small number of meaningful clusters...

  17. Using the Characteristics of Documents, Users and Tasks to Predict the Situational Relevance of Health Web Documents

    Directory of Open Access Journals (Sweden)

    Melinda Oroszlányová

    2017-09-01

    Full Text Available Relevance is usually estimated by search engines using document content, disregarding the user behind the search and the characteristics of the task. In this work, we look at relevance as framed in a situational context, calling it situational relevance, and analyze whether it is possible to predict it using document, user and task characteristics. Using an existing dataset composed of health web documents, relevance judgments for information needs, and user and task characteristics, we build a multivariate prediction model for situational relevance. Our model has an accuracy of 77.17%. Our findings provide insights into features that could improve the estimation of relevance by search engines, helping to conciliate the systemic and situational views of relevance. In the near future we will work on the automatic assessment of document, user and task characteristics.
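    The abstract does not state which multivariate model is used; purely as an illustration of the general shape of such a predictor, the sketch below fits a logistic regression on made-up document, user and task features and synthetic relevance labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400
# hypothetical document, user and task features (names are illustrative)
X = np.column_stack([
    rng.uniform(0, 1, n),      # document readability score
    rng.integers(0, 2, n),     # user has a health background (0/1)
    rng.uniform(0, 1, n),      # task complexity
])
# synthetic situational-relevance labels loosely tied to the features
y = (0.8 * X[:, 0] + 0.3 * X[:, 1] - 0.5 * X[:, 2]
     + rng.normal(0, 0.25, n) > 0.3).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(f"accuracy: {model.score(X_te, y_te):.2%}")   # analogous to the 77.17% reported
```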

  18. A Model of Electronic Document Management System for Limited Partnership

    OpenAIRE

    Faiqunisa, Faiqunisa; Nugroho, Eko; Santosa, Paulus Insap

    2013-01-01

    Both types of documents, electronic and non-electronic, are major components supporting organizational activities. In addition to documents, the ability of employees and managers to communicate effectively with other stakeholders is one of the important keys to achieving organizational goals. LP. XYZ is a consultancy in the field of Information Technology, but its electronic documents and archives are stored and managed on a server without a management information sy...

  19. PROCESS DOCUMENTATION: A MODEL FOR KNOWLEDGE MANAGEMENT IN ORGANIZATIONS.

    Science.gov (United States)

    Haddadpoor, Asefeh; Taheri, Behjat; Nasri, Mehran; Heydari, Kamal; Bahrami, Gholamreza

    2015-10-01

    Continuous and interconnected processes are a chain of activities that turn the inputs of an organization into its outputs and help achieve the partial and overall goals of the organization. These activities are carried out by means of two types of knowledge in the organization, called explicit and implicit knowledge. Among these, implicit knowledge is the knowledge that controls a major part of the activities of an organization, controls these activities internally, and will not be transferred to the process owners unless they are present during the organization's work. Therefore the goal of this study is the identification of implicit knowledge and its integration with explicit knowledge in order to improve human resources management, physical resource management, information resource management, training of new employees and other activities of Isfahan University of Medical Science. The project for documentation of activities in the department of health of Isfahan University of Medical Science was carried out in several stages. First, the main processes and related sub-processes were identified and categorized with the help of a planning expert. The categorization was carried out from smaller processes to larger ones. In this stage the experts of each process wrote down all their daily activities and organized them into general categories based on logical and physical relations between different activities. Then each activity was assigned a specific code. The computer software was designed after understanding the different parts of the processes, including main and sub-processes, and the categorization, which will be explained in the following sections. The findings of this study showed that documentation of activities can help expose implicit knowledge because all of the inputs and outputs of a process, along with the length, location, tools and different stages of the process, exchanged information, storage location of the information and information flow, can be identified using proper

  20. Use of an advanced document system in post-refuelling updating of nuclear power plant documentation; Utilizacion de un sistema documental avanzado en la actualizacion de documentacion post recarga

    Energy Technology Data Exchange (ETDEWEB)

    Puech Suanzes, P; Cortes Soler, M [Empresarios Agrupados, A.I.E., Madrid (Spain)

    1993-12-15

    This paper discusses the results of the extensive use of an advanced document system to update documentation prepared by traditional methods and affected by changes in the period between two plant refuellings. The implementation of a system for the capture, retrieval and storage of drawings using optical discs is part of a plan to modernize production and management tools and to thus achieve better control of document configuration. These processes are consequently optimized in that: 1. The deterioration of drawings is halted with the help of an identical, updated, legible, reliable support for all users. 2. The time required to update documentation is reduced. Given the large number of drawings, the implementation method should effectively combine costs and time. The document management tools ensure optical disc storage control so that, from the moment a drawing resides in the system, any modification to it is made through the system utilities, thus ensuring quality and reducing schedules. The system described was used to update the electrical drawings of the Almaraz Nuclear Power Plant. Changes made during the eighth refuelling of Unit I were incorporated and the time needed to issue the updated drawings was reduced by one month. (author)

  1. Opening the black box—Development, testing and documentation of a mechanistically rich agent-based model

    DEFF Research Database (Denmark)

    Topping, Chris J.; Høye, Toke; Olesen, Carsten Riis

    2010-01-01

    Although increasingly widely used in biology, complex adaptive simulation models such as agent-based models have been criticised for being difficult to communicate and test. This study demonstrates the application of pattern-oriented model testing, and a novel documentation procedure to present...... accessible description of the processes included in the model. Application of the model to a comprehensive historical data set supported the hypothesis that interference competition is the primary population regulating factor in the absence of mammal predators in the brown hare, and that the effect works...

  2. Documentation of spectrom-41

    International Nuclear Information System (INIS)

    Svalstad, D.K.

    1989-01-01

    SPECTROM-41 is a finite element heat transfer computer program developed to analyze thermal problems related to nuclear waste disposal. The code is part of the SPECTROM (Special Purpose Engineering Codes for Thermal/Rock Mechanics) series of special purpose finite element programs that are continually being developed by RE/SPEC Inc. (RSI) to address the many unique problems encountered in geologic formations. This document presents the theoretical basis for the mathematical model, the finite element formulation of the program, and a description of the input data for the program, along with details about program support and continuing documentation. The documentation is intended to satisfy the requirements and guidelines outlined in NUREG-0856. The principal component model used in the program is based on Fourier's law of conductance. Numerous program options provide the capability of considering various boundary conditions, material stratification and anisotropy, and time-dependent heat generation that are characteristic of problems involving the disposal of nuclear waste in geologic formations. Numerous verification problems are included in the documentation in addition to highlights of past and ongoing verification and validation efforts. A typical repository problem is solved using SPECTROM-41 to demonstrate the use of the program in addressing problems related to the disposal of nuclear waste
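    The core physics cited, Fourier's law, can be illustrated with a one-dimensional steady-state conduction sketch (finite differences rather than the program's finite elements); the slab geometry, conductivity, heat source and boundary temperatures below are illustrative and are not SPECTROM-41 inputs.

```python
import numpy as np

# 1-D steady conduction with internal heat generation:
#   d/dx( k dT/dx ) + q = 0 ,   Fourier's law: flux = -k dT/dx
L, n = 1.0, 51                 # slab thickness (m), grid points
k, q = 2.5, 500.0              # conductivity (W/m/K), heat source (W/m3)
dx = L / (n - 1)

A = np.zeros((n, n)); b = np.zeros(n)
A[0, 0] = A[-1, -1] = 1.0      # fixed-temperature boundaries
b[0], b[-1] = 100.0, 20.0      # hot side vs far-field side (deg C)
for i in range(1, n - 1):      # interior nodes: k*(T[i-1]-2T[i]+T[i+1])/dx^2 = -q
    A[i, i - 1] = A[i, i + 1] = k / dx ** 2
    A[i, i] = -2.0 * k / dx ** 2
    b[i] = -q

T = np.linalg.solve(A, b)
print(f"max temperature: {T.max():.1f} C at x = {T.argmax() * dx:.2f} m")
```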

  3. An effective quality model for evaluating mobile websites

    International Nuclear Information System (INIS)

    Hassan, W.U.; Nawaz, M.T.; Syed, T.H.; Naseem, A.

    2015-01-01

    The evolution of Web development in recent years has caused the emergence of the new area of mobile computing. The mobile phone has been transformed into a high-speed processing device capable of running processes which previously were supposed to run only on computers. Modern mobile phones now have the capability to process data at greater speeds than desktop systems and, with the inclusion of 3G and 4G networks, the mobile phone has become the prime choice for users to send and receive data from any device. As a result, there has been a major increase in mobile website demand and development, but due to the unique usage of websites on mobile as compared to desktop, there is a need to focus on the quality aspect of mobile websites. So, to increase and preserve the quality of mobile websites, a quality model is required which has to be designed specifically to evaluate mobile website quality. To design a mobile website quality model, a survey-based methodology is used to gather information regarding unique website usage on mobile from different users. On the basis of this information, a mobile website quality model is presented which aims to evaluate the quality of mobile websites. In the proposed model, some sub-characteristics are designed to evaluate mobile websites in particular. The result is a proposed model that aims to evaluate the features of a website which are important in the context of its deployment and its usability on the mobile platform. (author)

  4. FACSIM/MRS [Monitored Retrievable Storage]-2: Storage and shipping model documentation and user's guide

    International Nuclear Information System (INIS)

    Huber, H.D.; Chockie, A.D.; Hostick, C.J.; Otis, P.T.; Sovers, R.A.

    1987-06-01

    The Pacific Northwest Laboratory (PNL) has developed a stochastic computer model, FACSIM/MRS, to assist in assessing the operational performance of the Monitored Retrievable Storage (MRS) waste-handling facility. This report provides the documentation and user's guide for FACSIM/MRS-2, which is also referred to as the back-end model. The FACSIM/MRS-2 model simulates the MRS storage and shipping operations, which include handling canistered spent fuel and secondary waste in the shielded canyon cells, in onsite yard storage, and in repository shipping cask loading areas

  5. Quality Assurance Model for Digital Adult Education Materials

    Science.gov (United States)

    Dimou, Helen; Kameas, Achilles

    2016-01-01

    Purpose: This paper aims to present a model for the quality assurance of digital educational material that is appropriate for adult education. The proposed model adopts the software quality standard ISO/IEC 9126 and takes into account adult learning theories, Bloom's taxonomy of learning objectives and two instructional design models: Kolb's model…

  6. Improving PSA quality of KSNP PSA model

    International Nuclear Information System (INIS)

    Yang, Joon Eon; Ha, Jae Joo

    2004-01-01

    In the RIR (Risk-informed Regulation), PSA (Probabilistic Safety Assessment) plays a major role because it provides overall risk insights for the regulatory body and utility. Therefore, the scope, the level of details and the technical adequacy of PSA, i.e. the quality of PSA is to be ensured for the successful RIR. To improve the quality of Korean PSA, we evaluate the quality of the KSNP (Korean Standard Nuclear Power Plant) internal full-power PSA model based on the 'ASME PRA Standard' and the 'NEI PRA Peer Review Process Guidance.' As a working group, PSA experts of the regulatory body and industry also participated in the evaluation process. It is finally judged that the overall quality of the KSNP PSA is between the ASME Standard Capability Category I and II. We also derive some items to be improved for upgrading the quality of the PSA up to the ASME Standard Capability Category II. In this paper, we show the result of quality evaluation, and the activities to improve the quality of the KSNP PSA model

  7. A conceptual competitive intelligence quality assurance model

    Directory of Open Access Journals (Sweden)

    Tshilidzi Eric Nenzhelele

    2015-12-01

    Full Text Available Competitive Intelligence (CI) improves the quality of products and services, decision-making, and quality of life. However, it has been established that decision makers are not happy about the quality of CI. This is because enterprises fail in the quality assurance of CI. It has been concluded that most enterprises are clueless concerning CI quality assurance. Studies that previously attempted to resolve the CI quality problem were limited in scope and focused more on the quality of information than on overall CI quality. The purpose of this study is to propose a conceptual CI quality assurance model which will help in the quality assurance of CI. The research was qualitative in nature and used content analysis.

  8. Guide for the realization of Design Base Documents (DBD)

    International Nuclear Information System (INIS)

    Roca Mallofre, G. la

    2010-01-01

    Guide for improving the consistency and quality of the content of Design Base Documents. It is a short description of how to prepare and complete these documents, focusing on those aspects that can be more confusing and harder to interpret. This guide aims to clarify the term Design Base, distinguishing between production and safety, and it focuses on safety Design Base Documents and their values and references. It also emphasizes the difference between the support system and the interface system when there is a functional connection between different systems.

  9. A document preparation system in a large network environment

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, M.; Bouchier, S.; Sanders, C.; Sydoriak, S.; Wheeler, K.

    1988-01-01

    At Los Alamos National Laboratory, we have developed an integrated document preparation system that produces publication-quality documents. This system combines text formatters and computer graphics capabilities that have been adapted to meet the needs of users in a large scientific research laboratory. This paper describes the integration of document processing technology to develop a system architecture, based on a page description language, to provide network-wide capabilities in a distributed computing environment. We describe the Laboratory requirements, the integration and implementation issues, and the challenges we faced developing this system.

  10. Documentation for Grants Equal to Tax model: Volume 1, Technical description

    International Nuclear Information System (INIS)

    1986-01-01

    A computerized model, the Grants Equal to Tax (GETT) model, was developed to assist in evaluating the amount of federal grant monies that would go to state and local jurisdictions under the provisions outlined in the Nuclear Waste Policy Act of 1982. The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes levied by state and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 1 of the GETT model documentation is a technical description of the program and its capabilities providing (1) descriptions of the data management system and its procedures; (2) formulas for calculating taxes (illustrated with flow charts); (3) descriptions of tax data base variables for the Deaf Smith County, Texas, Richton Dome, Mississippi, and Davis Canyon, Utah, salt sites; and (4) data inputs for the GETT model. 10 refs., 18 figs., 3 tabs

  11. [Handbook for the preparation of evidence-based documents. Tools derived from scientific knowledge].

    Science.gov (United States)

    Carrión-Camacho, M R; Martínez-Brocca, M A; Paneque-Sánchez-Toscano, I; Valencia-Martín, R; Palomino-García, A; Muñoz-Durán, C; Tamayo-López, M J; González-Eiris-Delgado, C; Otero-Candelera, R; Ortega-Ruiz, F; Sobrino-Márquez, J M; Jiménez-García-Bóveda, R; Fernández-Quero, M; Campos-Pareja, A M

    2013-01-01

    This handbook is intended to be an accessible, easy-to-consult guide to help professionals produce or adapt Evidence-Based Documents. Such documents will help standardize both clinical practice and decision-making, with quality continuously monitored so that established references are complied with. The Evidence-Based Health Care Committee, part of the "Virgen del Rocío" University Hospital quality structure, proposed the preparation of a handbook for producing Evidence-Based Documents, including: a description of the products, their characteristics, qualities, uses, methodology of production, and scope of application of each of them. The handbook consists of seven Evidence-Based tools, one chapter on the critical analysis methodology of scientific literature, one chapter with internet resources, and some appendices with different assessment tools. This handbook provides practitioners with an opportunity to improve quality and a guideline to standardize clinical healthcare, managers with a strategy to promote and encourage the development of such documents in an effort to reduce clinical practice variability, and patients with the opportunity to take part in planning their own care. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.

  12. Application of Service Quality Model in Education Environment

    Directory of Open Access Journals (Sweden)

    Ting Ding Hooi

    2016-02-01

    Full Text Available Most of the ideas on service quality stem from the West. The massive developments in research in the West are of undeniable importance and have led to the generation and development of new ideas. These ideas were subsequently channelled to developing countries, where they were formulated and used in order to obtain a better approach to delivering service quality. There is ample to be learnt from the SERVQUAL service quality model, which has attained high acceptance in the West. Service quality in the education system is important to guarantee the effectiveness and quality of education. Effective and quality education will be able to produce quality graduates, who will contribute to the development of the nation. This paper discusses the application of the SERVQUAL model to the education environment.
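    The SERVQUAL computation applied in such studies is a gap score, perception minus expectation, per service-quality dimension; a minimal sketch with made-up 7-point survey scores follows.

```python
# SERVQUAL gap scores: quality gap = perception - expectation for each dimension
dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
expectation = {"tangibles": 6.1, "reliability": 6.5, "responsiveness": 6.3,
               "assurance": 6.4, "empathy": 6.0}      # illustrative 7-point scores
perception = {"tangibles": 5.8, "reliability": 5.2, "responsiveness": 5.5,
              "assurance": 6.0, "empathy": 5.4}

gaps = {d: perception[d] - expectation[d] for d in dimensions}
for d, g in gaps.items():
    print(f"{d:15s} gap = {g:+.1f}")                  # negative gap = quality shortfall
print(f"overall service quality (mean gap) = {sum(gaps.values()) / len(gaps):+.2f}")
```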

  13. ISO 9000 and the total quality management models

    OpenAIRE

    Pacios Lozano, Ana Reyes

    1997-01-01

    Establishes the most outstanding differences between the ISO 9000 norms and total quality management as forms or manners of managing quality used in some information services. Compares two models of total quality: the European Foundation for Quality Management and the Malcolm Baldrige Award.

  14. Improving the quality of clinical coding: a comprehensive audit model

    Directory of Open Access Journals (Sweden)

    Hamid Moghaddasi

    2014-04-01

    Full Text Available Introduction: The review of medical records with the aim of assessing the quality of codes has long been conducted in different countries. Auditing medical coding, as an instructive approach, could help to review the quality of codes objectively using defined attributes, and this in turn would lead to improvement of the quality of codes. Method: The current study aimed to present a model for auditing the quality of clinical codes. The audit model was formed after reviewing other audit models, considering their strengths and weaknesses. A clear definition was presented for each quality attribute and more detailed criteria were then set for assessing the quality of codes. Results: The audit tool (based on the quality attributes of legibility, relevancy, completeness, accuracy, definition and timeliness) led to the development of an audit model for assessing the quality of medical coding. The Delphi technique was then used to confirm the validity of the model. Conclusion: The comprehensive audit model designed could provide a reliable and valid basis for assessing the quality of codes, considering more quality attributes and their clear definition. The inter-observer check suggested in the method of auditing is of particular importance to ensure the reliability of coding.

  15. Developing Staffing Models to Support Population Health Management and Quality Outcomes in Ambulatory Care Settings.

    Science.gov (United States)

    Haas, Sheila A; Vlasses, Frances; Havey, Julia

    2016-01-01

    There are multiple demands and challenges inherent in establishing staffing models in ambulatory heath care settings today. If health care administrators establish a supportive physical and interpersonal health care environment, and develop high-performing interprofessional teams and staffing models and electronic documentation systems that track performance, patients will have more opportunities to receive safe, high-quality evidence-based care that encourages patient participation in decision making, as well as provision of their care. The health care organization must be aligned and responsive to the community within which it resides, fully invested in population health management, and continuously scanning the environment for competitive, regulatory, and external environmental risks. All of these challenges require highly competent providers willing to change attitudes and culture such as movement toward collaborative practice among the interprofessional team including the patient.

  16. Airline service quality evaluation: A review on concepts and models

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper reviews the major service quality concepts and models which led to great developments in evaluating service quality, focusing on the improvement process of the models through discussing criticisms of each model. Criticisms against these models are discussed to clarify the development steps of newer models which led to the improvement of airline service quality models. The precise and accurate evaluation of service quality needs utilizing a reliable concept with comprehensive criteria and effective measurement techniques as the fundamentals of a valuable framework. In this paper, the improvement of service quality models is described based on three major service quality concepts, the disconfirmation, performance and hierarchical concepts, which were developed subsequently. Reviewing various criteria and different measurement techniques, such as statistical analysis and multi-criteria decision making, assists researchers in gaining a clear understanding of the development of the evaluation framework in the airline industry. This study aims at promoting reliable frameworks for evaluating airline service quality in different countries and societies, given the economic, cultural and social aspects of each society.

  17. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, in order to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine if the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  18. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, in order to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine if the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.
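    The release-gating idea these two records describe, only delivering software whose predicted risk is below an agreed threshold, can be sketched as follows; the risk score, its weights and the feature names are invented for illustration and are not the models evaluated in the EMI study.

```python
from dataclasses import dataclass

@dataclass
class ReleaseCandidate:
    churn_loc: int          # lines changed since the last release
    open_defects: int
    test_coverage: float    # 0..1

def predicted_risk(c: ReleaseCandidate) -> float:
    """Toy risk score in [0, 1]; the weights are illustrative stand-ins
    for a fitted quality model, not the study's models."""
    raw = 0.00005 * c.churn_loc + 0.02 * c.open_defects + 0.6 * (1 - c.test_coverage)
    return min(raw, 1.0)

def gate_release(c: ReleaseCandidate, threshold: float = 0.35) -> bool:
    """Only deliver software whose predicted risk is below the agreed threshold."""
    return predicted_risk(c) < threshold

rc = ReleaseCandidate(churn_loc=4200, open_defects=3, test_coverage=0.72)
print(f"risk = {predicted_risk(rc):.2f}, release approved: {gate_release(rc)}")
```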

  19. The Digital technical documentation handbook

    CERN Document Server

    Schultz, Susan I; Kavanagh, Frank X; Morse, Marjorie J

    1993-01-01

    The Digital Technical Documentation Handbook describes the process of developing and producing technical user information at Digital Equipment Corporation. * Discusses techniques for making user information more effective * Covers the draft and review process, the production and distribution of printed and electronic media, archiving, indexing, testing for usability, and many other topics * Provides quality assurance checklists, contains a glossary and a bibliography of resources for technical communicators

  20. DockQ: A Quality Measure for Protein-Protein Docking Models.

    Directory of Open Access Journals (Sweden)

    Sankar Basu

    Full Text Available The state-of-the-art to assess the structural quality of docking models is currently based on three related yet independent quality measures: Fnat, LRMS, and iRMS as proposed and standardized by CAPRI. These quality measures quantify different aspects of the quality of a particular docking model and need to be viewed together to reveal the true quality, e.g. a model with relatively poor LRMS (>10Å) might still qualify as 'acceptable' with a decent Fnat (>0.50) and iRMS (<3.0Å). This is also the reason why the so-called CAPRI criteria for assessing the quality of docking models are defined by applying various ad-hoc cutoffs on these measures to classify a docking model into the four classes: Incorrect, Acceptable, Medium, or High quality. This classification has been useful in CAPRI, but since models are grouped in only four bins it is also rather limiting, making it difficult to rank models, correlate with scoring functions or use it as a target function in machine learning algorithms. Here, we present DockQ, a continuous protein-protein docking model quality measure derived by combining Fnat, LRMS, and iRMS into a single score in the range [0, 1] that can be used to assess the quality of protein docking models. By using DockQ on CAPRI models it is possible to almost completely reproduce the original CAPRI classification into Incorrect, Acceptable, Medium and High quality. An average PPV of 94% at 90% Recall demonstrates that there is no need to apply predefined ad-hoc cutoffs to classify docking models. Since DockQ recapitulates the CAPRI classification almost perfectly, it can be viewed as a higher resolution version of the CAPRI classification, making it possible to estimate model quality in a more quantitative way using Z-scores or sums of top-ranked models, which has been so valuable for the CASP community. The possibility to directly correlate a quality measure to a scoring function has been crucial for the development of scoring functions for
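    As commonly cited, the combination used by DockQ is an unweighted average of Fnat and scaled forms of LRMS and iRMS; the sketch below reproduces that form from memory, so the scaling constants (8.5 Å for LRMS, 1.5 Å for iRMS) should be verified against Basu and Wallner (2016) before use.

```python
def scaled_rms(rms, d0):
    """Map an RMSD (in Angstrom) into (0, 1]; smaller RMSD -> closer to 1."""
    return 1.0 / (1.0 + (rms / d0) ** 2)

def dockq(fnat, lrms, irms, d_lrms=8.5, d_irms=1.5):
    """DockQ-style combination of the three CAPRI measures into [0, 1].
    The averaging and the d0 constants follow the published formula as
    commonly cited; verify against the original paper before relying on it."""
    return (fnat + scaled_rms(lrms, d_lrms) + scaled_rms(irms, d_irms)) / 3.0

# a model around the CAPRI 'acceptable' regime (illustrative values)
print(f"DockQ = {dockq(fnat=0.55, lrms=9.0, irms=2.8):.2f}")
```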

  1. Reflexion on linear regression trip production modelling method for ensuring good model quality

    Science.gov (United States)

    Suprayitno, Hitapriya; Ratnasari, Vita

    2017-11-01

    Transport modelling is important. For certain cases the conventional model still has to be used, for which having a good trip production model is essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are that the sample must be able to represent the population characteristics and must produce an acceptable error at a certain confidence level. These principles do not yet seem to be well understood or applied in trip production modelling. It is therefore necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method for ensuring model quality. The research results are presented as follows. Statistics provides a method to calculate the span of a predicted value at a certain confidence level for linear regression, called the Confidence Interval of Predicted Value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that the sample composition can significantly change the model. Hence, a good R2 value does not always mean good model quality. These findings lead to three basic ideas for ensuring good model quality, i.e. reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having a good R2 value and a good Confidence Interval of Predicted Value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
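
    As a hedged illustration of the Confidence Interval of Predicted Value discussed above (not the authors' code), the sketch below fits a simple linear trip-production regression and reports the interval for new observations using statsmodels; the zone data are invented.

    ```python
    # Hedged sketch: confidence/prediction intervals for a linear trip-production model.
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical zone data: household counts vs. observed trip productions
    households = np.array([50, 120, 200, 310, 420, 510, 630, 720])
    trips      = np.array([180, 430, 690, 1050, 1400, 1720, 2100, 2380])

    X = sm.add_constant(households)          # intercept + slope model
    model = sm.OLS(trips, X).fit()

    X_new = sm.add_constant(np.array([80, 400, 680]))   # zones to predict
    pred = model.get_prediction(X_new)
    # summary_frame gives the fitted mean and the (wider) interval for new observations
    print(pred.summary_frame(alpha=0.05)[["mean", "obs_ci_lower", "obs_ci_upper"]])
    ```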

  2. Topology of Document Retrieval Systems.

    Science.gov (United States)

    Everett, Daniel M.; Cater, Steven C.

    1992-01-01

    Explains the use of a topological structure to examine the closeness between documents in retrieval systems and analyzes the topological structure of a vector-space model, a fuzzy-set model, an extended Boolean model, a probabilistic model, and a TIRS (Topological Information Retrieval System) model. Proofs for the results are appended. (17…

  3. Quality management system in Nuclear Medicine

    International Nuclear Information System (INIS)

    Peña Tornet, Adela; Torres Aroche, Leonel A.

    2016-01-01

    Establishing Quality Management Systems (QMS) in Nuclear Medicine (NM) services is a prerequisite for optimizing the efficacy and safety of the diagnostic and therapeutic procedures of this specialty and for steadily increasing the quality of the services provided to patients. Several international organizations, such as the IAEA, scientific specialty societies (SNM, EBNM, etc.) and national bodies stimulate and support their introduction; in our country it is also a requirement of the National Nuclear Safety Centre (CNSN). This paper presents the main experiences of our country related to the implementation of QMS and the tools developed for achieving this goal, such as: a) the QNUMED automated web environment for managing indicators and documentation in digital format; b) the development of prototypes and models for the implementation of the documentation system; c) the application of QUANUM requirements in conducting quality management audits in local services, including the QUANUM Tool; and d) human resource development issues in Quality Management. (author)

  4. Environmental restoration value engineering guidance document

    International Nuclear Information System (INIS)

    1995-07-01

    This document provides guidance on Value Engineering (VE). VE is an organized team effort led by a person trained in the methodology to analyze the functions of projects, systems, equipment, facilities, services, and processes for achieving the essential functions at the lowest life cycle cost while maintaining required performance, reliability, availability, quality, and safety. VE has proven to be a superior tool to improve up-front project planning, cut costs, and create a better value for each dollar spent. This document forms the basis for the Environmental Restoration VE Program, describes the VE process, and provides recommendations on when it can be most useful on ER projects

  5. Evaluation Model of Tea Industry Information Service Quality

    OpenAIRE

    Shi , Xiaohui; Chen , Tian’en

    2015-01-01

    International audience; According to the characteristics of tea industry information services, this paper builds a service quality evaluation index system for tea industry information service quality; R-cluster analysis and multiple regression are used in combination to construct an evaluation model with high practicality and credibility. As proved by experiment, the evaluation model of information service quality has good precision, which gives it a certain degree of guidance significance for e...

  6. Great Lakes water quality initiative technical support document for the procedure to determine bioaccumulation factors. Draft report

    International Nuclear Information System (INIS)

    1993-03-01

    The purpose of the document is to provide the technical information and rationale in support of the proposed procedures to determine bioaccumulation factors. Bioaccumulation factors, together with the quantity of aquatic organisms eaten, determine the extent to which people and wildlife are exposed to chemicals through the consumption of aquatic organisms. The more bioaccumulative a pollutant is, the more important the consumption of aquatic organisms becomes as a potential source of contaminants to humans and wildlife. Bioaccumulation factors are needed to determine both human health and wildlife tier I water quality criteria and tier II values. Also, they are used to define Bioaccumulative Chemicals of Concern among the Great Lakes Initiative universe of pollutants. Bioaccumulation factors range from less than one to several million

  7. The dynamic development of the muzzle imprint by contact gunshot: high-speed documentation utilizing the "skin-skull-brain model".

    Science.gov (United States)

    Thali, M J; Kneubuehl, B P; Dirnhofer, R; Zollinger, U

    2002-07-17

    Many contact gunshots produce a muzzle imprint in the skin of the victim. Different mechanisms have been discussed in the literature as being responsible for the creation of the muzzle imprint. Experimenting upon the synthetic, non-biological skin-skull-brain model, our goal was to document and study the creation of the muzzle imprint with the aid of high-speed photography. In our experiments we were able to document, with high-speed photography at exposure times in the nanosecond range, the bulging, the pressing against the muzzle, and the splitting of the artificial skin. Furthermore, it was possible to photographically record the backspatter pattern of synthetic tissue particles, and the soot and gunpowder cavity could be reproduced experimentally. In conclusion, the experiments completed with the skin-skull-brain model, using high-speed photography for documentation, show the promising possibilities of experimental ballistics with body models.

  8. Dynamic (G2) Model Design Document, 24590-WTP-MDD-PR-01-002, Rev. 12

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Yueying; Kruger, Albert A.

    2013-12-16

    The Hanford Tank Waste Treatment and Immobilization Plant (WTP) Statement of Work (Department of Energy Contract DE-AC27-01RV14136, Section C) requires the contractor to develop and use process models for flowsheet analyses and pre-operational planning assessments. The Dynamic (G2) Flowsheet is a discrete-time process model that enables the project to evaluate impacts to throughput from event-driven activities such as pumping, sampling, storage, recycle, separation, and chemical reactions. The model is developed by the Process Engineering (PE) department and is based on the Flowsheet Bases, Assumptions, and Requirements Document (24590-WTP-RPT-PT-02-005), commonly called the BARD. The terms Dynamic (G2) Flowsheet and Dynamic (G2) Model are interchangeable in this document. The foundation of this model is a dynamic material balance governed by prescribed initial conditions, boundary conditions, and operating logic. The dynamic material balance is achieved by tracking the storage and material flows within the plant at each time increment. The initial conditions include a feed vector that represents the waste compositions and delivery sequence of the Tank Farm batches, and the volumes and concentrations of solutions in process equipment before startup. The boundary conditions are the physical limits of the flowsheet design, such as piping, volumes, flowrates, operation efficiencies, and the physical and chemical environments that affect separations, phase equilibria, and reaction extents. The operating logic represents the rules and strategies of running the plant.
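
    The following is an illustrative sketch only (not the WTP G2 code) of the discrete-time material balance idea described above: a single vessel whose inventory advances by inflow minus outflow at each time increment, with a capacity boundary condition and a simple operating rule; all numbers are assumptions.

    ```python
    # Illustrative sketch: discrete-time material balance for one vessel with operating logic.
    def simulate_vessel(hours: int, inflow: float, pump_rate: float, capacity: float) -> list[float]:
        """Return the vessel inventory (m3) at the end of each hourly time increment."""
        inventory = 0.0
        history = []
        for _ in range(hours):
            inventory += inflow                       # feed received this increment
            if inventory > 0.8 * capacity:            # operating logic: pump out above 80% full
                inventory -= min(pump_rate, inventory)
            inventory = min(inventory, capacity)      # physical boundary condition
            history.append(inventory)
        return history

    print(simulate_vessel(hours=12, inflow=5.0, pump_rate=8.0, capacity=40.0))
    ```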

  9. Quality of peas modelled by a structural equation system

    DEFF Research Database (Denmark)

    Bech, A. C.; Juhl, H. J.; Hansen, M.

    2000-01-01

    in a PLS structural model with the Total Food Quality Model as starting point. The results show that texture and flavour do have approximately the same effect on consumers' perception of overall quality. Quality development goals for plant breeders would be to optimise perceived flavour directly...

  10. Adopting software quality measures for healthcare processes.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. A case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.

  11. A Document Analysis of Teacher Evaluation Systems Specific to Physical Education

    Science.gov (United States)

    Norris, Jason M.; van der Mars, Hans; Kulinna, Pamela; Kwon, Jayoun; Amrein-Beardsley, Audrey

    2017-01-01

    Purpose: The purpose of this document analysis study was to examine current teacher evaluation systems, understand current practices, and determine whether the instrumentation is a valid measure of teaching quality as reflected in teacher behavior and effectiveness specific to physical education (PE). Method: An interpretive document analysis…

  12. Development of knowledge models by linguistic analysis of lexical relationships in technical documents

    International Nuclear Information System (INIS)

    Seguela, Patrick

    2001-01-01

    This research thesis addresses the problem of knowledge acquisition and structuring from technical texts, and the use of this knowledge in the development of models. The author presents the Cameleon method which aims at extracting binary lexical relationships from technical texts by identifying linguistic markers. The relevance of this method is assessed in the case of four different corpora: a written technical corpus, an oral technical corpus, a corpus of texts of instructions, and a corpus of academic texts. The author reports the development of a model of representation of knowledge of a specific field by using lexical relationships. The method is then applied to develop a model used in document search within a knowledge management system [fr]

  13. Quality assurance program for environmental assessment of Savannah River Plant waste sites: Environmental information document

    International Nuclear Information System (INIS)

    Looney, B.B.; King, C.M.; Stephenson, D.E.

    1987-03-01

    Forty-eight locations, including surface impoundments and shallow land burial facilities, were identified that received a variety of radioactive and nonradioactive constituents during the past 35 years. Detailed environmental assessments of existing waste disposal areas, as well as new waste disposal techniques and the disposition of tritiated water, were completed to aid in an evaluation of the low-level, mixed and hazardous waste management activities. These assessments result in an estimation of the risk, or residual risk, posed by each disposal area to various receptors as a function of waste management alternative. For example, at existing waste sites, the closure actions evaluated were waste removal and closure, no waste removal and closure, and no action; several pathways/receptors were considered, including groundwater to river, groundwater to well, atmospheric transport, occupational exposure, direct exposure, and contamination followed by ingestion of crops and meat. Modeling of chemical transport in a variety of media was an integral part of the assessment process. The quality of the models used and the application of these models were assured by an explicit quality assurance program.

  14. The Algorithm Theoretical Basis Document for Level 1A Processing

    Science.gov (United States)

    Jester, Peggy L.; Hancock, David W., III

    2012-01-01

    The first process of the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software converts the Level 0 data into the Level 1A Data Products. The Level 1A Data Products are the time ordered instrument data converted from counts to engineering units. This document defines the equations that convert the raw instrument data into engineering units. Required scale factors, bias values, and coefficients are defined in this document. Additionally, required quality assurance and browse products are defined in this document.

  15. QAM: PROPOSED MODEL FOR QUALITY ASSURANCE IN CBSS

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Component-based software engineering (CBSE) / Component-Based Development (CBD) lays emphasis on decomposition of the engineered systems into functional or logical components with well-defined interfaces used for communication across the components. The component-based software development approach is based on the idea of developing software systems by selecting appropriate off-the-shelf components and then assembling them within a well-defined software architecture. Because the new software development paradigm is much different from the traditional approach, quality assurance for component-based software development is a new topic in the software engineering research community. Because component-based software systems are developed on an underlying process different from that of traditional software, their quality assurance model should address both the process of the components and the process of the overall system. Quality assurance for component-based software systems during the life cycle is used to analyze the components in order to achieve high-quality component-based software systems. Although some quality assurance techniques and the component-based approach to software engineering have been studied, there is still no clear and well-defined standard or set of guidelines for component-based software systems. Therefore, identification of quality assurance characteristics, quality assurance models, quality assurance tools and quality assurance metrics is urgently needed. As a major contribution of this paper, I have proposed QAM: a Quality Assurance Model for component-based software development, which covers component requirement analysis, component development, component certification, component architecture design, integration, testing, and maintenance.

  16. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model-generated air quality statistics and to develop, by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models.

  17. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio [Richland, WA; Calapristi, Augustin J [West Richland, WA; Crow, Vernon L [Richland, WA; Hetzler, Elizabeth G [Kennewick, WA; Turner, Alan E [Kennewick, WA

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.

  18. Use of speech-to-text technology for documentation by healthcare providers.

    Science.gov (United States)

    Ajami, Sima

    2016-01-01

    Medical records are a critical component of a patient's treatment. However, documentation of patient-related information is considered a secondary activity in the provision of healthcare services, often leading to incomplete medical records and patient data of low quality. Advances in information technology (IT) in the health system and the registration of information in electronic health records (EHR) using speech-to-text conversion software have facilitated service delivery. This narrative review is a literature search carried out with the help of libraries, books, conference proceedings, the databases Science Direct, PubMed, Proquest, Springer and SID (Scientific Information Database), and search engines such as Yahoo and Google. I used the following keywords and their combinations: speech recognition, automatic report documentation, voice to text software, healthcare, information, and voice recognition. Due to lack of knowledge of other languages, I searched all texts in English or Persian with no time limits. Of a total of 70, only 42 articles were selected. Speech-to-text conversion technology offers opportunities to improve the documentation process of medical records, reduce the cost and time of recording information, enhance the quality of documentation, improve the quality of services provided to patients, and support healthcare providers in legal matters. Healthcare providers should recognize the impact of this technology on service delivery.

  19. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models have been proposed and are available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so that they can be acceptable to practitioners, there is a need for clear discrimination of the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies in this regard. Journal and conference papers between the years 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community and will help quality assessment model developers in formulating newer models and also practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  20. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  1. Analysis of a risk prevention document using dependability techniques: a first step towards an effectiveness model

    Directory of Open Access Journals (Sweden)

    L. Ferrer

    2018-04-01

    Full Text Available Major hazard prevention is a main challenge given that it is specifically based on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law requires only a few specifications concerning their content; one can therefore question the impact on the general population relative to the way the document is concretely created. Hence, the purpose of our work is to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIM has to be produced by mayors and addressed to the public to provide information on major hazards affecting their municipalities. An analysis of the law compliance of the document is carried out thanks to the identification of regulatory detection elements. These are applied to a database of 30 DICRIMs. This analysis leads to a discussion on points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here its sections). Their results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define the failures and to identify detection elements. This permits the evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. Those results are obtained to build, in future works, a decision support model for the municipality (or specialised consulting firms) in charge of drawing up the documents.

  2. Analysis of a risk prevention document using dependability techniques: a first step towards an effectiveness model

    Science.gov (United States)

    Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc

    2018-04-01

    Major hazard prevention is a main challenge given that it is specifically based on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law requires only a few specifications concerning their content; one can therefore question the impact on the general population relative to the way the document is concretely created. Hence, the purpose of our work is to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIM has to be produced by mayors and addressed to the public to provide information on major hazards affecting their municipalities. An analysis of the law compliance of the document is carried out thanks to the identification of regulatory detection elements. These are applied to a database of 30 DICRIMs. This analysis leads to a discussion on points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here its sections). Their results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define the failures and to identify detection elements. This permits the evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. Those results are obtained to build, in future works, a decision support model for the municipality (or specialised consulting firms) in charge of drawing up the documents.

  3. SFM TECHNIQUE AND FOCUS STACKING FOR DIGITAL DOCUMENTATION OF ARCHAEOLOGICAL ARTIFACTS

    Directory of Open Access Journals (Sweden)

    P. Clini

    2016-06-01

    Full Text Available Digital documentation and high-quality 3D representation are increasingly requested in many disciplines and areas, due to the large number of technologies and the amount of data available for fast, detailed and quick documentation. This work investigates the area of medium and small sized artefacts and presents a fast and low-cost acquisition system that guarantees the creation of 3D models with a high level of detail, making the digitalization of cultural heritage a simple and fast procedure. The 3D models of the artefacts are created with the photogrammetric technique Structure from Motion, which makes it possible to obtain, in addition to three-dimensional models, high-definition images for a deeper study and understanding of the artefacts. For the survey of small objects (only a few centimetres), a macro lens and focus stacking are used; focus stacking is a photographic technique that consists in capturing a stack of images at different focus planes for each camera pose, so that it is possible to obtain a final image with a higher depth of field. The acquisition with the focus stacking technique has finally been validated against an acquisition with a Minolta laser triangulation scanner, which demonstrates a validity compatible with the allowable error in relation to the expected precision.

  4. Collaborative problem solving with a total quality model.

    Science.gov (United States)

    Volden, C M; Monnig, R

    1993-01-01

    A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.

  5. Personalized Metaheuristic Clustering Onto Web Documents

    Institute of Scientific and Technical Information of China (English)

    Wookey Lee

    2004-01-01

    Optimal clustering of web documents is known to be a complicated combinatorial optimization problem, and it is hard to develop a generally applicable optimal algorithm. An accelerated simulated annealing algorithm is developed for automatic web document classification. The web document classification problem is addressed as the problem of best describing a match between a web query and a hypothesized web object. The normalized term frequency and inverse document frequency coefficient is used as a measure of the match. Test beds are generated on-line during the search by transforming model web sites. As a result, web sites can be clustered optimally in terms of keyword vectors of the corresponding web documents.
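
    The sketch below is a hedged illustration, not the paper's accelerated algorithm: plain simulated annealing over cluster assignments of TF-IDF document vectors, maximizing average within-cluster cosine similarity. The documents, cooling schedule and scoring function are assumptions.

    ```python
    # Illustrative sketch: simulated annealing over TF-IDF cluster assignments.
    import math
    import random
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "web search ranking and retrieval",
        "query matching for web pages",
        "protein docking model quality",
        "structural quality of docking models",
    ]
    sim = cosine_similarity(TfidfVectorizer().fit_transform(docs))
    k = 2

    def score(assign):
        """Average pairwise similarity of documents sharing a cluster."""
        total, pairs = 0.0, 0
        for i in range(len(assign)):
            for j in range(i + 1, len(assign)):
                if assign[i] == assign[j]:
                    total += sim[i, j]
                    pairs += 1
        return total / pairs if pairs else 0.0

    random.seed(0)
    assign = [random.randrange(k) for _ in docs]
    best, best_score, temp = assign[:], score(assign), 1.0
    for _ in range(500):
        cand = assign[:]
        cand[random.randrange(len(docs))] = random.randrange(k)    # perturb one assignment
        delta = score(cand) - score(assign)
        if delta > 0 or random.random() < math.exp(delta / temp):  # accept worse moves early on
            assign = cand
            if score(assign) > best_score:
                best, best_score = assign[:], score(assign)
        temp *= 0.99                                                # geometric cooling
    print(best, round(best_score, 3))
    ```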

  6. Modelling End-User of Electronic-Government Service: The Role of Information quality, System Quality and Trust

    Science.gov (United States)

    Witarsyah Jacob, Deden; Fudzee, Mohd Farhan Md; Aizi Salamat, Mohamad; Kasim, Shahreen; Mahdin, Hairulnizam; Azhar Ramli, Azizul

    2017-08-01

    Many governments around the world increasingly use internet technologies such as electronic government to provide public services. These services range from providing the most basic informational websites to deploying sophisticated tools for managing interactions between government agencies and beyond government. Electronic government (e-government) aims to provide services that are more accurate, easily accessible, cost-effective and time-saving for the community. In this study, we develop a new model of e-government service adoption by extending the Unified Theory of Acceptance and Use of Technology (UTAUT) through the incorporation of additional variables: System Quality, Information Quality and Trust. The model is then tested using a large-scale, multi-site survey of 237 Indonesian citizens and validated using Structural Equation Modeling (SEM). The result indicates that the System Quality, Information Quality and Trust variables are proven to affect user behavior. This study extends the current understanding of the influence of System Quality, Information Quality and Trust factors for researchers, practitioners, and policy makers.
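
    As a rough, hedged illustration of the kind of structural model described above (the record does not say which SEM software the authors used), the sketch below fits a simple path model relating System Quality, Information Quality and Trust to use behaviour, assuming the semopy package and purely synthetic survey responses; all variable names are illustrative.

    ```python
    # Hedged sketch: a minimal SEM-style path model on synthetic data, assuming semopy.
    import numpy as np
    import pandas as pd
    import semopy

    rng = np.random.default_rng(0)
    n = 237
    sysq = rng.normal(size=n)
    infq = rng.normal(size=n)
    trust = 0.5 * sysq + rng.normal(scale=0.8, size=n)
    use = 0.4 * sysq + 0.3 * infq + 0.3 * trust + rng.normal(scale=0.7, size=n)
    data = pd.DataFrame({"sysq": sysq, "infq": infq, "trust": trust, "use": use})

    model_desc = """
    trust ~ sysq + infq
    use ~ sysq + infq + trust
    """
    model = semopy.Model(model_desc)
    model.fit(data)
    print(model.inspect())   # path coefficients, standard errors and p-values
    ```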

  7. Conceptual Models, Choices, and Benchmarks for Building Quality Work Cultures.

    Science.gov (United States)

    Acker-Hocevar, Michele

    1996-01-01

    The two models in Florida's Educational Quality Benchmark System represent a new way of thinking about developing schools' work culture. The Quality Performance System Model identifies nine dimensions of work within a quality system. The Change Process Model provides a theoretical framework for changing existing beliefs, attitudes, and behaviors…

  8. Hanford Sampling Quality Management Plan (HSQMP)

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1995-01-01

    This document provides a management tool for evaluating and designing the appropriate elements of a field sampling program. This document provides discussion of the elements of a program and is to be used as a guidance document during the preparation of project and/or function specific documentation. This document does not specify how a sampling program shall be organized. The HSQMP is to be used as a companion document to the Hanford Analytical Services Quality Assurance Plan (HASQAP) DOE/RL-94-55. The generation of this document was enhanced by conducting baseline evaluations of current sampling organizations. Valuable input was received from members of field and Quality Assurance organizations. The HSQMP is expected to be a living document. Revisions will be made as regulations and or Hanford Site conditions warrant changes in the best management practices. Appendices included are: summary of the sampling and analysis work flow process, a user's guide to the Data Quality Objective process, and a self-assessment checklist

  9. DockQ: A Quality Measure for Protein-Protein Docking Models

    Science.gov (United States)

    Basu, Sankar

    2016-01-01

    The state-of-the-art to assess the structural quality of docking models is currently based on three related yet independent quality measures: Fnat, LRMS, and iRMS as proposed and standardized by CAPRI. These quality measures quantify different aspects of the quality of a particular docking model and need to be viewed together to reveal the true quality, e.g. a model with relatively poor LRMS (>10Å) might still qualify as 'acceptable' with a decent Fnat (>0.50) and iRMS (<3.0Å). Here, we present DockQ, a continuous protein-protein docking model quality measure derived by combining Fnat, LRMS, and iRMS into a single score in the range [0, 1] that can be used to assess the quality of protein docking models. By using DockQ on CAPRI models it is possible to almost completely reproduce the original CAPRI classification into Incorrect, Acceptable, Medium and High quality. An average PPV of 94% at 90% recall demonstrates that there is no need to apply predefined ad-hoc cutoffs to classify docking models. Since DockQ recapitulates the CAPRI classification almost perfectly, it can be viewed as a higher resolution version of the CAPRI classification, making it possible to estimate model quality in a more quantitative way using Z-scores or sums of top ranked models, which has been so valuable for the CASP community. The possibility to directly correlate a quality measure to a scoring function has been crucial for the development of scoring functions for protein structure prediction, and DockQ should be useful in a similar development in the protein docking field. DockQ is available at http://github.com/bjornwallner/DockQ/ PMID:27560519

  10. Implementation of quality management in early stages of research and development projects at a university.

    Science.gov (United States)

    Fiehe, Sandra; Wagner, Georg; Schlanstein, Peter; Rosefort, Christiane; Kopp, Rüdger; Bensberg, Ralf; Knipp, Peter; Schmitz-Rode, Thomas; Steinseifer, Ulrich; Arens, Jutta

    2014-04-01

    The ultimate objective of university research and development projects is usually to create knowledge, but also to successfully transfer results to industry for subsequent marketing. We hypothesized that the university technology transfer requires efficient measures to improve this important step. Besides good scientific practice, foresighted and industry-specific adapted documentation of research processes in terms of a quality management system might improve the technology transfer. In order to bridge the gap between research institute and cooperating industry, a model project has been accompanied by a project specific amount of quality management. However, such a system had to remain manageable and must not constrain the researchers' creativity. Moreover, topics and research team are strongly interdisciplinary, which entails difficulties regarding communication because of different perspectives and terminology. In parallel to the technical work of the model project, an adaptable quality management system with a quality manual, defined procedures, and forms and documents accompanying the research, development and validation was implemented. After process acquisition and analysis the appropriate amount of management for the model project was identified by a self-developed rating system considering project characteristics like size, innovation, stakeholders, interdisciplinarity, etc. Employees were trained according to their needs. The management was supported and the technical documentation was optimized. Finally, the quality management system has been transferred successfully to further projects.

  11. Offshore Wind Guidance Document: Oceanography and Sediment Stability (Version 1) Development of a Conceptual Site Model.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Jesse D.; Jason Magalen; Craig Jones

    2014-06-01

    This guidance document provides the reader with an overview of the key environmental considerations for a typical offshore wind coastal location and the tools to help guide the reader through a thorough planning process. It will enable readers to identify the key coastal processes relevant to their offshore wind site and perform pertinent analysis to guide siting and layout design, with the goal of minimizing costs associated with planning, permitting, and long-term maintenance. The document highlights site characterization and assessment techniques for evaluating spatial patterns of sediment dynamics in the vicinity of a wind farm under typical, extreme, and storm conditions. Finally, the document describes the assimilation of all of this information into the conceptual site model (CSM) to aid the decision-making processes.

  12. Quality assurance of qualitative analysis

    DEFF Research Database (Denmark)

    Ríos, Ángel; Barceló, Damiá; Buydens, Lutgarde

    2003-01-01

    The European Commission has supported the G6MA-CT-2000-01012 project on "Metrology of Qualitative Chemical Analysis" (MEQUALAN), which was developed during 2000-2002. The final result is a document produced by a group of scientists with expertise in different areas of chemical analysis, metrology...... and quality assurance. One important part of this document deals, therefore, with aspects involved in analytical quality assurance of qualitative analysis. This article shows the main conclusions reported in the document referring to the implementation of quality principles in qualitative analysis...

  13. MEASURING THE DATA MODEL QUALITY IN THE ESUPPLY CHAINS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2012-03-01

    Full Text Available The implementation of Internet technology in business has enabled the development of e-business supply chains with large-scale information integration among all partners. The development of information systems (IS) is based on established business objectives whose achievement, among other things, directly depends on the quality of the development and design of the IS. In the process of analysing the key elements of company operations in the supply chain, a process model and a corresponding data model are designed, which should enable the selection of an appropriate information system architecture. The quality of the implemented information system, which supports the e-supply chain, directly depends on the level of data model quality. One of the serious limitations of a data model is its complexity. With a large number of entities, a data model is difficult to analyse, monitor and maintain. The problem gets bigger when looking at an integrated data model at the level of participating partners in the supply chain, where the data model usually consists of hundreds or even thousands of entities. The paper will analyse the key elements affecting the quality of data models and show their interactions and factors of significance. In addition, the paper presents various measures for assessing the quality of the data model, which make it possible to easily locate problems and focus efforts on specific parts of a complex data model where it is not economically feasible to review every detail of the model.
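
    The paper's specific metrics are not reproduced in the record, so the sketch below only illustrates the general idea of simple size and connectedness measures for a data model; the entity definitions and the choice of measures are assumptions.

    ```python
    # Illustrative sketch: simple size/complexity measures for a data model.
    from dataclasses import dataclass, field

    @dataclass
    class Entity:
        name: str
        attributes: list = field(default_factory=list)
        relationships: list = field(default_factory=list)   # names of related entities

    def complexity_report(entities: list) -> dict:
        """Return counts that grow with the size and connectedness of the data model."""
        n_entities = len(entities)
        n_attributes = sum(len(e.attributes) for e in entities)
        n_relationships = sum(len(e.relationships) for e in entities) // 2  # count each link once
        return {
            "entities": n_entities,
            "attributes": n_attributes,
            "relationships": n_relationships,
            "avg_relationships_per_entity": round(2 * n_relationships / n_entities, 2),
        }

    order = Entity("Order", ["id", "date", "total"], ["Customer", "OrderLine"])
    customer = Entity("Customer", ["id", "name"], ["Order"])
    line = Entity("OrderLine", ["id", "qty", "price"], ["Order"])
    print(complexity_report([order, customer, line]))
    ```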

  14. Pilot production system cost/benefit analysis: Digital document storage project

    Science.gov (United States)

    1989-01-01

    The Digital Document Storage (DDS)/Pilot Production System (PPS) will provide cost effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The DDS/PPS will result in major benefits, such as improved document reproduction quality within a shorter time frame than is currently possible. In addition, the DDS/PPS will provide an important strategic value through the construction of a digital document archive. It is highly recommended that NASA proceed with the DDS Prototype System and a rapid prototyping development methodology in order to validate recent working assumptions upon which the success of the DDS/PPS is dependent.

  15. Quality control of nuclear medicine instruments

    International Nuclear Information System (INIS)

    1984-11-01

    This document, which gives detailed guidance on the quality control of the various electronic instruments used for radiation detection and measurement in nuclear medicine, stems from the work of two Advisory Groups convened by the International Atomic Energy Agency (IAEA). A preliminary document, including recommended test schedules but lacking actual protocols for the tests, was drawn up by the first of these groups, meeting at the IAEA Headquarters in Vienna in 1979. A revised and extended version, incorporating recommended test protocols, was prepared by the second Group, meeting likewise in Vienna in 1982. This version is the model for the present text. The document should be of value to all nuclear medicine units, and especially to those in developing countries, in the initiation or revision of schemes for the quality control of their instruments. Its recommendations have provided the basis for instruction in two IAEA regional technical co-operation projects in the subject field, one initiated in 1981 for countries of Latin America and one initiated in 1982 for countries of Asia and the Pacific

  16. Creation of structured documentation templates using Natural Language Processing techniques.

    Science.gov (United States)

    Kashyap, Vipul; Turchin, Alexander; Morin, Laura; Chang, Frank; Li, Qi; Hongsermeier, Tonya

    2006-01-01

    Structured Clinical Documentation is a fundamental component of the healthcare enterprise, linking both clinical (e.g., electronic health record, clinical decision support) and administrative functions (e.g., evaluation and management coding, billing). One of the challenges in creating good quality documentation templates has been the inability to address specialized clinical disciplines and adapt to local clinical practices. A one-size-fits-all approach leads to poor adoption and inefficiencies in the documentation process. On the other hand, the cost associated with manual generation of documentation templates is significant. Consequently there is a need for at least partial automation of the template generation process. We propose an approach and methodology for the creation of structured documentation templates for diabetes using Natural Language Processing (NLP).

  17. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  18. Data Acquisition for Quality Loss Function Modelling

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard; Howard, Thomas J.

    2016-01-01

    Quality loss functions can be a valuable tool when assessing the impact of variation on product quality. Typically, the input for the quality loss function would be a measure of the varying product performance and the output would be a measure of quality. While the unit of the input is given by the product function in focus, the quality output can be measured and quantified in a number of ways. In this article a structured approach for acquiring stakeholder satisfaction data for use in quality loss function modelling is introduced.
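
    The record does not specify which loss-function form the authors use, so the sketch below shows the classic Taguchi "nominal-the-best" form, L(y) = k(y - m)^2, as one common way to turn performance variation into a quality loss; the target, tolerance and cost figures are assumptions.

    ```python
    # Hedged sketch: Taguchi-style nominal-the-best quality loss function.
    def quality_loss(y: float, target: float, cost_at_tolerance: float, tolerance: float) -> float:
        """Loss (currency units) for a unit performing at y; zero loss exactly at the target."""
        k = cost_at_tolerance / tolerance ** 2     # calibrate k from the loss at the tolerance limit
        return k * (y - target) ** 2

    # Example: target 10.0 mm, a unit at the 0.5 mm tolerance limit costs 4.0 to rework
    for y in (10.0, 10.2, 10.5, 11.0):
        print(y, round(quality_loss(y, target=10.0, cost_at_tolerance=4.0, tolerance=0.5), 2))
    ```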

  19. Quality management using model-driven engineering: an overview

    OpenAIRE

    Ruiz-Rube, Iván; Escalona, María José

    2014-01-01

    Quality Management (QM) is one of the critical points of any software development process. In recent years, several proposals have emerged on this issue, mainly with regard to maturity models, quality standards and best practices collections. Besides, Model Driven Engineering (MDE) aims to build software systems through the construction and transformation of models. However, MDE might be used for other different tasks. In this poster, we summarize the main contributions abou...

  20. Design and Establishment of Quality Model of Fundamental Geographic Information Database

    Science.gov (United States)

    Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.

    2018-04-01

    In order to make the quality evaluation of Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective and accurate, this paper studies and establishes a quality model of FGIDB, formed by the standardization of database construction and quality control, the conformity of data set quality, and the functionality of the database management system. It also designs the overall principles, contents and methods of the quality evaluation for FGIDB, providing the basis and reference for carrying out quality control and quality evaluation of FGIDB. This paper designs the quality elements, evaluation items and properties of the Fundamental Geographic Information Database step by step based on the quality model framework. Connected organically, these quality elements and evaluation items constitute the quality model of the Fundamental Geographic Information Database. This model is the foundation for the quality requirement specification and quality evaluation of the Fundamental Geographic Information Database, and is of great significance for quality assurance in the design and development stage, requirement formulation in the testing and evaluation stage, and the construction of a standard system for quality evaluation technology of the Fundamental Geographic Information Database.

  1. Peer review of RELAP5/MOD3 documentation

    International Nuclear Information System (INIS)

    Craddick, W.G.

    1993-01-01

    A peer review was performed on a portion of the documentation of the RELAP5/MOD3 computer code. The review was performed in two phases. The first phase was a review of Volume 3, Developmental Assessment problems, and Volume 4, Models and Correlations. The reviewers for this phase were Dr. Peter Griffith, Dr. Yassin Hassan, Dr. Gerald S. Lellouche, Dr. Marino di Marzo and Mr. Mark Wendel. The reviewers recommended a number of improvements, including using a frozen version of the code for assessment guided by a validation plan, better justification for flow regime maps and extension of models beyond their data base. The second phase was a review of Volume 6, Quality Assurance of Numerical Techniques in RELAP5/MOD3. The reviewers for the second phase were Mr. Mark Wendel and Dr. Paul T. Williams. Recommendations included correction of numerous grammatical and typographical errors and better justification for the use of Lax's Equivalence Theorem

  2. Earth System Documentation (ES-DOC) Preparation for CMIP6

    Science.gov (United States)

    Denvil, S.; Murphy, S.; Greenslade, M. A.; Lawrence, B.; Guilyardi, E.; Pascoe, C.; Treshanksy, A.; Elkington, M.; Hibling, E.; Hassell, D.

    2015-12-01

    During the course of 2015 the Earth System Documentation (ES-DOC) project began its preparations for CMIP6 (Coupled Model Inter-comparison Project 6) by further extending the ES-DOC tooling ecosystem in support of Earth System Model (ESM) documentation creation, search, viewing and comparison. The ES-DOC online questionnaire, the ES-DOC desktop notebook, and the ES-DOC python toolkit will serve as multiple complementary pathways to generating CMIP6 documentation. It is envisaged that institutes will leverage these tools at different points of the CMIP6 lifecycle. Institutes will be particularly interested to know that the documentation burden will be either streamlined or completely automated. As all the tools are tightly integrated with the ES-DOC web service, institutes can be confident that the latency between documentation creation and publishing will be reduced to a minimum. Published documents will be viewable with the online ES-DOC Viewer (accessible via citable URLs). Model inter-comparison scenarios will be supported using the ES-DOC online Comparator tool. The Comparator is being extended to: support comparison of both Model descriptions and Simulation runs; and greatly streamline the effort involved in compiling official tables. The entire ES-DOC ecosystem is open source and built upon open standards such as the Common Information Model (CIM) (versions 1 and 2).

  3. Documenting Penicillin Allergy: The Impact of Inconsistency

    Science.gov (United States)

    Shah, Nirav S.; Ridgway, Jessica P.; Pettit, Natasha; Fahrenbach, John; Robicsek, Ari

    2016-01-01

    Background Allergy documentation is frequently inconsistent and incomplete. The impact of this variability on subsequent treatment is not well described. Objective To determine how allergy documentation affects subsequent antibiotic choice. Design Retrospective cohort study. Participants 232,616 adult patients seen by 199 primary care providers (PCPs) between January 1, 2009 and January 1, 2014 at an academic medical system. Main Measures Inter-physician variation in beta-lactam allergy documentation; antibiotic treatment following beta-lactam allergy documentation. Key Results 15.6% of patients had a reported beta-lactam allergy. Of those patients, 39.8% had a specific allergen identified and 22.7% had allergic reaction characteristics documented. Variation between PCPs was greater than would be expected by chance, both in identification of a specific allergen rather than the nonspecific "penicillins" (24.0% to 58.2%) and in documentation of the reaction characteristics (5.4% to 51.9%). After beta-lactam allergy documentation, patients were less likely to receive penicillins (Relative Risk [RR] 0.16 [95% Confidence Interval: 0.15–0.17]) and cephalosporins (RR 0.28 [95% CI 0.27–0.30]) and more likely to receive fluoroquinolones (RR 1.5 [95% CI 1.5–1.6]), clindamycin (RR 3.8 [95% CI 3.6–4.0]) and vancomycin (RR 5.0 [95% CI 4.3–5.8]). Among patients with beta-lactam allergy, rechallenge was more likely when a specific allergen was identified (RR 1.6 [95% CI 1.5–1.8]) and when reaction characteristics were documented (RR 2.0 [95% CI 1.8–2.2]). Conclusions Provider documentation of beta-lactam allergy is highly variable, and details of the allergy are infrequently documented. Classification of a patient as beta-lactam allergic and incomplete documentation regarding the details of the allergy lead to beta-lactam avoidance and use of other antimicrobial agents, behaviors that may adversely impact care quality and cost. PMID:26981866

  4. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA), which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  5. Development of an Instructional Quality Assurance Model in Nursing Science

    Science.gov (United States)

    Ajpru, Haruthai; Pasiphol, Shotiga; Wongwanich, Suwimon

    2011-01-01

    The purpose of this study was to develop an instructional quality assurance model in nursing science. The study was divided into three phases: (1) studying the information needed for instructional quality assurance model development; (2) developing an instructional quality assurance model in nursing science; and (3) auditing and assessing the developed…

  6. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, however are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right as it is a time consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third-party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  7. Documentation for the MODFLOW 6 Groundwater Flow Model

    Science.gov (United States)

    Langevin, Christian D.; Hughes, Joseph D.; Banta, Edward R.; Niswonger, Richard G.; Panday, Sorab; Provost, Alden M.

    2017-08-10

    This report documents the Groundwater Flow (GWF) Model for a new version of MODFLOW called MODFLOW 6. The GWF Model for MODFLOW 6 is based on a generalized control-volume finite-difference approach in which a cell can be hydraulically connected to any number of surrounding cells. Users can define the model grid using one of three discretization packages, including (1) a structured discretization package for defining regular MODFLOW grids consisting of layers, rows, and columns, (2) a discretization by vertices package for defining layered unstructured grids consisting of layers and cells, and (3) a general unstructured discretization package for defining flexible grids comprised of cells and their connection properties. For layered grids, a new capability is available for removing thin cells and vertically connecting cells overlying and underlying the thin cells. For complex problems involving water-table conditions, an optional Newton-Raphson formulation, based on the formulations in MODFLOW-NWT and MODFLOW-USG, can be activated. Use of the Newton-Raphson formulation will often improve model convergence and allow solutions to be obtained for difficult problems that cannot be solved using the traditional wetting and drying approach. The GWF Model is divided into “packages,” as was done in previous MODFLOW versions. A package is the part of the model that deals with a single aspect of simulation. Packages included with the GWF Model include those related to internal calculations of groundwater flow (discretization, initial conditions, hydraulic conductance, and storage), stress packages (constant heads, wells, recharge, rivers, general head boundaries, drains, and evapotranspiration), and advanced stress packages (streamflow routing, lakes, multi-aquifer wells, and unsaturated zone flow). An additional package is also available for moving water available in one package into the individual features of the advanced stress packages. The GWF Model
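    As a minimal illustration of the control-volume finite-difference balance that MODFLOW-style models generalise to unstructured grids (and not MODFLOW 6 code or its input format), the following Python sketch solves steady-state heads on a one-dimensional row of cells with constant-head boundaries and uniform recharge; all parameter values are hypothetical.

      import numpy as np

      n = 5                                      # number of cells in the row
      K, dx, dz, dy = 10.0, 100.0, 20.0, 1.0     # conductivity (m/d) and cell dimensions (m)
      cond = K * dz * dy / dx                    # conductance between adjacent cells
      h_left, h_right = 25.0, 20.0               # constant-head boundary values (m)
      recharge = 0.001 * dx * dy                 # volumetric recharge per cell (m^3/d)

      # Assemble the cell-by-cell water balance: sum of cond*(h_j - h_i) + Q_i = 0
      A = np.zeros((n, n))
      b = np.full(n, -recharge)                  # sources move to the right-hand side
      for i in range(n):
          for j in (i - 1, i + 1):               # hydraulically connected neighbours
              if 0 <= j < n:
                  A[i, i] -= cond
                  A[i, j] += cond
      # Constant-head boundaries treated as connections to fixed external heads
      A[0, 0] -= cond
      b[0] -= cond * h_left
      A[-1, -1] -= cond
      b[-1] -= cond * h_right

      heads = np.linalg.solve(A, b)
      print(heads)                               # heads grade from 25 m toward 20 m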

  8. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph theoretic approach for quantification of quality-enabled…

  9. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  10. Forensic intelligence applied to questioned document analysis: A model and its application against organized crime.

    Science.gov (United States)

    De Alcaraz-Fossoul, Josep; Roberts, Katherine A

    2017-07-01

    The capability of forensic sciences to fight crime, especially against organized criminal groups, becomes relevant in the recent economic downturn and the war on terrorism. In view of these societal challenges, the methods of combating crime should experience critical changes in order to improve the effectiveness and efficiency of the current resources available. It is obvious that authorities have serious difficulties combating criminal groups of transnational nature. These are characterized as well structured organizations with international connections, abundant financial resources and comprised of members with significant and diverse expertise. One common practice among organized criminal groups is the use of forged documents that allow for the commission of illegal cross-border activities. Law enforcement can target these movements to identify counterfeits and establish links between these groups. Information on document falsification can become relevant to generate forensic intelligence and to design new strategies against criminal activities of this nature and magnitude. This article discusses a methodology for improving the development of forensic intelligence in the discipline of questioned document analysis. More specifically, it focuses on document forgeries and falsification types used by criminal groups. It also describes the structure of international criminal organizations that use document counterfeits as means to conduct unlawful activities. The model presented is partially based on practical applications of the system that have resulted in satisfactory outcomes in our laboratory. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  11. A quality management model for radiation oncology physics

    International Nuclear Information System (INIS)

    Sternick, E.S.

    1991-01-01

    State-of-the-art radiation physics quality programs operate in a data-rich environment. Given the abundance of recordable events, any formalism that serves to identify and monitor a set of attributes correlated with quality is to be regarded as an important management tool. The hierarchical tree structure model describes one such useful planning method. Of the several different types of tree structures, one of the most appropriate for quality management is the pyramid model. In this model, the associations between an overall program objective and the intermediate steps leading to its attainment are indicated by both horizontal and vertical connectors. The overall objective of the system under study occupies the vertex of the pyramid, while the level immediately below contains its principal components. Further subdivisions of each component occur in successively lower levels. The tree finally terminates at a base level consisting of actions or requirements that must be fulfilled in order to satisfy the overall objective. A pyramid model for a radiation oncology physics quality program is discussed in detail. (author). 21 refs., 4 figs., 6 tabs

  12. Quality management in SNSA

    International Nuclear Information System (INIS)

    Levstek, M.F.; Slokan Dusic, D.

    2002-01-01

    The Slovenian Nuclear Safety Administration (SNSA) within the Ministry of Environment, Spatial Planning and Energy acts as the national regulatory authority for nuclear safety and radiation protection of workers in nuclear installations and of the population in the vicinity of nuclear facilities. The SNSA has decided to document its own quality management system for two basic reasons. Firstly, as a regulatory body for nuclear and radiological safety the SNSA should have an adequate quality management system. Secondly, the Slovenian Government stimulates the initiation of a quality system in all public authorities, as is evident from its strategic directives and aims. In order to develop the quality management system, the Quality Board and the Project Team have been established. The quality management system is being developed in accordance with the International Standard ISO 9001:2000, IAEA Safety Series No. 50-C/SG-Q (January 2001) and IAEA-TECDOC-1090: Quality Assurance within Regulatory Bodies (June 1999), considering all other adequate documents referring to nuclear quality. The quality manual together with subordinate level documents are the means of conveying the elements and operation of the quality system to all staff involved, ensuring that the system is effectively implemented and achieves its goals. (author)

  13. Predicting Document Retrieval System Performance: An Expected Precision Measure.

    Science.gov (United States)

    Losee, Robert M., Jr.

    1987-01-01

    Describes an expected precision (EP) measure designed to predict document retrieval performance. Highlights include decision theoretic models; precision and recall as measures of system performance; EP graphs; relevance feedback; and computing the retrieval status value of a document for two models, the Binary Independence Model and the Two Poisson…
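    As an illustration only (not Losee's exact formulation), the sketch below computes the standard Binary Independence Model retrieval status value from assumed per-term probabilities, and a simple expected precision at a rank cutoff as the mean probability of relevance of the top-ranked documents; all probability values are hypothetical.

      import math

      def bim_rsv(doc_terms, p_rel, p_nonrel):
          # Sum of log-odds weights over query terms present in the document.
          # p_rel[t]    : assumed P(term t occurs | document relevant)
          # p_nonrel[t] : assumed P(term t occurs | document not relevant)
          rsv = 0.0
          for t in doc_terms:
              if t in p_rel:
                  rsv += math.log((p_rel[t] * (1 - p_nonrel[t])) /
                                  (p_nonrel[t] * (1 - p_rel[t])))
          return rsv

      def expected_precision(prob_relevant, cutoff):
          # Mean probability of relevance over the top-ranked documents.
          ranked = sorted(prob_relevant, reverse=True)[:cutoff]
          return sum(ranked) / len(ranked)

      # Hypothetical term statistics and document relevance probabilities.
      p_rel = {"quality": 0.8, "model": 0.6}
      p_nonrel = {"quality": 0.3, "model": 0.4}
      print(bim_rsv({"quality", "model", "river"}, p_rel, p_nonrel))   # ~3.05
      print(expected_precision([0.9, 0.7, 0.6, 0.2, 0.1], cutoff=3))   # ~0.733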

  14. HDTS 2017.0 Testing and verification document

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, Tad S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-08-01

    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents (Foley and Powell, 2010; Dixon, 2012). In this report we have created a suite of automated test cases and a system to analyze the results of those tests, and have documented the methodology used to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect. These tests confirm that HDTS version 2017.0 performs according to its specifications and documentation and that its performance meets the needs of its users at the Savannah River Site.

  15. Air Quality Modelling and the National Emission Database

    DEFF Research Database (Denmark)

    Jensen, S. S.

    The project focuses on development of institutional strengthening to be able to carry out national air emission inventories based on the CORINAIR methodology. The present report describes the link between emission inventories and air quality modelling to ensure that the new national air emission inventory is able to take into account the data requirements of air quality models...

  16. Global modelling of river water quality under climate change

    Science.gov (United States)

    van Vliet, Michelle T. H.; Franssen, Wietse H. P.; Yearsley, John R.

    2017-04-01

    Climate change will pose challenges on the quality of freshwater resources for human use and ecosystems for instance by changing the dilution capacity and by affecting the rate of chemical processes in rivers. Here we assess the impacts of climate change and induced streamflow changes on a selection of water quality parameters for river basins globally. We used the Variable Infiltration Capacity (VIC) model and a newly developed global water quality module for salinity, temperature, dissolved oxygen and biochemical oxygen demand. The modelling framework was validated using observed records of streamflow, water temperature, chloride, electrical conductivity, dissolved oxygen and biochemical oxygen demand for 1981-2010. VIC and the water quality module were then forced with an ensemble of bias-corrected General Circulation Model (GCM) output for the representative concentration pathways RCP2.6 and RCP8.5 to study water quality trends and identify critical regions (hotspots) of water quality deterioration for the 21st century.

  17. Temperature documentation - instrument for quality assurance; Temperaturdokumentation - Instrument der Qualitaetssicherung

    Energy Technology Data Exchange (ETDEWEB)

    Hegglin, A [Wurm AG, Winterthur (Switzerland)

    2000-10-01

    Important inspection points of a HACCP concept are the temperatures. On the basis of the requirements for systematic temperature documentation, the use of suitable control and monitoring devices is described with several examples. (orig.) [German] Wichtige Kontrollpunkte eines HACCP-Konzepts sind die Temperaturen. Ausgehend von den Anforderungen, die an eine systematische Temperaturdokumentation gestellt werden, wird der Einsatz geeigneter Regel- und Ueberwachungsgeraete an mehreren Beispielen erlaeutert. (orig.)

  18. Klang River water quality modelling using music

    Science.gov (United States)

    Zahari, Nazirul Mubin; Zawawi, Mohd Hafiz; Muda, Zakaria Che; Sidek, Lariyah Mohd; Fauzi, Nurfazila Mohd; Othman, Mohd Edzham Fareez; Ahmad, Zulkepply

    2017-09-01

    Water is an essential resource that sustains life on earth; changes in the natural quality and distribution of water have ecological impacts that can sometimes be devastating. Malaysia is currently facing many environmental issues regarding water pollution. The main causes of river pollution are rapid urbanization arising from the development of residential, commercial and industrial sites, infrastructural facilities and others. The purpose of the study was to predict the water quality of the Connaught Bridge Power Station (CBPS) reach of the Klang River, to examine the effects of low and high tide, and to forecast the pollutant concentrations of Biochemical Oxygen Demand (BOD) and Total Suspended Solids (TSS) for the existing land use of the catchment area through water quality modelling using the MUSIC software. A further objective was to identify an integrated urban stormwater treatment system (Best Management Practices, or BMPs) that achieves optimal performance in improving the water quality of the catchment, using the MUSIC software for catchment areas with tropical climates. The MUSIC model results indicate that the BOD5 concentration at Station 1 can be reduced from Class IV to Class III, and the TSS concentration at Station 1 from Class III to Class II. The model predicted a mean TSS reduction of 0.17%, TP reduction of 0.14%, TN reduction of 0.48% and BOD5 reduction of 0.31% for Station 1. Thus, with the proposed BMPs in place, the water quality is predicted to be safe to use; such monitoring is important because polluting activities are harmful to aquatic organisms and public health.

  19. A global conformance quality model. A new strategic tool for minimizing defects caused by variation, error, and complexity

    Energy Technology Data Exchange (ETDEWEB)

    Hinckley, C. Martin [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    1994-01-01

    The performance of Japanese products in the marketplace points to the dominant role of quality in product competition. Our focus is motivated by the tremendous pressure to improve conformance quality by reducing defects to previously unimaginable limits in the range of 1 to 10 parts per million. Toward this end, we have developed a new model of conformance quality that addresses each of the three principal defect sources: (1) Variation, (2) Human Error, and (3) Complexity. Although the role of variation in conformance quality is well documented, errors occur so infrequently that their significance is not well known. We have shown that statistical methods are not useful in characterizing and controlling errors, the most common source of defects. Excessive complexity is also a root source of defects, since it increases errors and variation defects. A missing link in defining a global model has been the lack of a sound correlation between complexity and defects. We have used Design for Assembly (DFA) methods to quantify assembly complexity and have shown that assembly times can be described in terms of the Pareto distribution, in a clear exception to the Central Limit Theorem. Within individual companies we have found defects to be highly correlated with DFA measures of complexity in broad studies covering tens of millions of assembly operations. Applying the global concepts, we predicted that Motorola's Six Sigma method would only reduce defects by roughly a factor of two rather than by orders of magnitude, a prediction confirmed by Motorola's data. We have also shown that the potential defect rates of product concepts can be compared in the earliest stages of development. The global Conformance Quality Model has demonstrated that the best strategy for improvement depends upon the quality control strengths and weaknesses.

  20. On the typography of flight-deck documentation

    Science.gov (United States)

    Degani, Asaf

    1992-01-01

    Many types of paper documentation are employed on the flight-deck. They range from a simple checklist card to a bulky Aircraft Flight Manual (AFM). Some of these documents have typographical and graphical deficiencies; yet many cockpit tasks, such as conducting checklists, way-point entry, limitations and performance calculations, and many more, require the use of these documents. Moreover, during emergency and abnormal situations, the flight crews' effectiveness in combating the situation is highly dependent on such documentation; accessing and reading procedures has a significant impact on flight safety. Although flight-deck documentation is an important (and sometimes critical) form of display in the modern cockpit, there is a dearth of information on how to effectively design these displays. The object of this report is to provide a summary of the available literature regarding the design and typographical aspects of printed matter. The report attempts to bridge the gap between basic research about typography and the kind of information needed by designers of flight-deck documentation. The report focuses on typographical factors such as typefaces, character height, use of lower- and upper-case characters, line length, and spacing. Some graphical aspects such as layout, color coding, fonts, and character contrast are also discussed. In addition, several aspects of cockpit reading conditions such as glare, angular alignment, and paper quality are addressed. Finally, a list of recommendations for the graphical design of flight-deck documentation is provided.

  1. Predictive model for determining the quality of a call

    Science.gov (United States)

    Voznak, M.; Rozhon, J.; Partila, P.; Safarik, J.; Mikulec, M.; Mehic, M.

    2014-05-01

    In this paper, a predictive model for speech quality estimation is described. This model allows its user to gain information about the speech quality in VoIP networks without the need to perform an actual call and the consecutive, time-consuming sound file evaluation. This greatly increases the usability of speech quality measurement, especially in high-load networks, where the actual processing of all calls is rendered difficult or even impossible. The model produces results that are highly conformant with the PESQ algorithm based only on network-state parameters that are easily obtainable with commonly used software tools. Experiments were carried out to investigate whether different languages (English, Czech) have an effect on perceived voice quality for the same network conditions, and the language factor was incorporated directly into the model.
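    The abstract does not give the authors' model form, so the following sketch simply fits a generic least-squares regression from easily measured network-state parameters (packet loss, jitter, delay) to a MOS-like score; the feature set and training values are hypothetical.

      import numpy as np

      # columns: packet loss (%), jitter (ms), one-way delay (ms) -- hypothetical samples
      X = np.array([
          [0.0,  5,  50],
          [0.5, 10,  80],
          [1.0, 20, 120],
          [3.0, 40, 200],
          [5.0, 60, 300],
      ])
      # MOS-like scores, e.g. obtained offline from a PESQ-style reference measurement
      y = np.array([4.3, 4.1, 3.7, 2.9, 2.2])

      # Ordinary least squares with an intercept term
      A = np.hstack([X, np.ones((X.shape[0], 1))])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      def predict_mos(loss, jitter, delay):
          # Estimate call quality from network-state parameters, without running PESQ.
          return float(np.dot(coef, [loss, jitter, delay, 1.0]))

      print(predict_mos(2.0, 30, 150))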

  2. Streamlining of the Decontamination and Demolition Document Preparation Process

    International Nuclear Information System (INIS)

    Durand, Nick; Meincke, Carol; Peek, Georgianne

    1999-01-01

    During the past five years, the Sandia National Laboratories Decontamination, Decommissioning, Demolition, and Reuse (D3R) Program has evolved and become more focused and efficient. Historical approaches to project documentation, requirements, and drivers are discussed detailing key assumptions, oversight authority, and project approvals. Discussion of efforts to streamline the D3R project planning and preparation process include the incorporation of the principles of graded approach, Total Quality Management, and the Observational Method (CH2M HILL, April 1989). Process improvements were realized by clearly defining regulatory requirements for each phase of a project, establishing general guidance for the program and combining project-specific documents to eliminate redundant and unneeded information. Process improvements to cost, schedule, and quality are discussed in detail for several projects

  3. Quality assessment in higher education using the SERVQUAL model

    Directory of Open Access Journals (Sweden)

    Sabina Đonlagić

    2015-01-01

    The economy of Bosnia and Herzegovina is striving towards growth and increased employment, and empirical studies worldwide have shown that higher education contributes to the socio-economic development of a country. Universities are important for the generation, preservation and dissemination of knowledge in order to contribute to the socio-economic benefits of a country. Higher education institutions are being pressured to improve the value of their activities, and providing quality higher education services to students should be taken seriously. In this paper we address the emerging demand for quality in higher education. Higher education institutions should assess the quality of their services and establish methods for improving quality. Activities of quality assurance should be integrated into the management process at higher education institutions. This paper addresses the issue of service quality measurement in higher education institutions. The most frequently used model in this context is the SERVQUAL model. This model measures quality from the students' point of view, since students are considered to be one of the most important stakeholders of a higher education institution. The main objective of this research is to provide empirical evidence that the adapted SERVQUAL model can be used in higher education and to identify the service quality gap based on its application at one institution of higher education (the Faculty of Economics in Bosnia and Herzegovina). Furthermore, the results of the gap analysis using the SERVQUAL methodology provide relevant information on the areas in which improvement is necessary in order to enhance service quality.

  4. FOSSIL2 energy policy model documentation: FOSSIL2 documentation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This report discusses the structure, derivations, assumptions, and mathematical formulation of the FOSSIL2 model. Each major facet of the model - supply/demand interactions, industry financing, and production - has been designed to parallel closely the actual cause/effect relationships determining the behavior of the United States energy system. The data base for the FOSSIL2 program is large, as is appropriate for a system dynamics simulation model. When possible, all data were obtained from sources well known to experts in the energy field. Cost and resource estimates are based on DOE data whenever possible. This report presents the FOSSIL2 model at several levels. Volumes II and III of this report list the equations that comprise the FOSSIL2 model, along with variable definitions and a cross-reference list of the model variables. Volume II provides the model equations with each of their variables defined, while Volume III lists the equations, with a one-line definition for each, in a shorter, more readable format.

  5. Development of Water Quality Modeling in the United States

    Science.gov (United States)

    This presentation describes historical trends in water quality model development in the United States, reviews current efforts, and projects promising future directions. Water quality modeling has a relatively long history in the United States. While its origins lie in the work...

  6. Sticky Dots and Lion Adventures Playing a Part in Preschool Documentation Practices

    Science.gov (United States)

    Elfström Pettersson, Katarina

    2015-01-01

    This article examines how material objects such as photographs, papers and computers influence documentation practices in a Swedish preschool. The importance of teacher documentation is emphasized in the 2010 revised Swedish preschool curriculum as a means of evaluating preschool quality. However, the curriculum gives no specific guidelines about…

  7. Mathematical models for atmospheric pollutants. Appendix D. Available air quality models. Final report

    International Nuclear Information System (INIS)

    Drake, R.L.; McNaughton, D.J.; Huang, C.

    1979-08-01

    Models that are available for the analysis of airborne pollutants are summarized. In addition, recommendations are given concerning the use of particular models to aid in particular air quality decision making processes. The air quality models are characterized in terms of time and space scales, steady state or time dependent processes, reference frames, reaction mechanisms, treatment of turbulence and topography, and model uncertainty. Using these characteristics, the models are classified in the following manner: simple deterministic models, such as air pollution indices, simple area source models and rollback models; statistical models, such as averaging time models, time series analysis and multivariate analysis; local plume and puff models; box and multibox models; finite difference or grid models; particle models; physical models, such as wind tunnels and liquid flumes; regional models; and global models

  8. Hydrologic and Water Quality Model Development Using Simulink

    Directory of Open Access Journals (Sweden)

    James D. Bowen

    2014-11-01

    A stormwater runoff model based on the Soil Conservation Service (SCS) method and a finite-volume based water quality model have been developed to investigate the use of Simulink for teaching and research. Simulink, a MATLAB extension, is a graphically based model development environment for system modeling and simulation. Widely used for mechanical and electrical systems, Simulink has had less use for modeling of hydrologic systems. The watershed model is being considered for use in teaching graduate-level courses in hydrology and/or stormwater modeling. Simulink’s block (data process) and arrow (data transfer) object model, the copy-and-paste user interface, the large number of existing blocks, and the absence of computer code allow students to become model developers almost immediately. The visual depiction of systems, their component subsystems, and the flow of data through the systems are ideal attributes for hands-on teaching of hydrologic and mass balance processes to today’s computer-savvy visual learners. Model development with Simulink for research purposes is also investigated. A finite volume, multi-layer pond model using the water quality kinetics present in CE-QUAL-W2 has been developed using Simulink. The model is one of the first uses of Simulink for modeling eutrophication dynamics in stratified natural systems. The model structure and a test case are presented. One use of the model for teaching a graduate-level water quality modeling class is also described.
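    Because the Simulink model itself is a graphical block diagram, a minimal text sketch of the underlying SCS curve-number runoff relation it builds on may help; depths are in inches and the conventional 0.2S initial abstraction is assumed.

      def scs_runoff(precip_in, curve_number):
          # Direct runoff depth (inches) from rainfall depth and SCS curve number.
          s = 1000.0 / curve_number - 10.0   # potential maximum retention
          ia = 0.2 * s                       # initial abstraction
          if precip_in <= ia:
              return 0.0
          return (precip_in - ia) ** 2 / (precip_in - ia + s)

      # Example: a 3-inch storm on a moderately developed catchment (CN = 80)
      print(scs_runoff(3.0, 80))  # 1.25 inches of direct runoff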

  9. Quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Gillespie, B.M.; Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  10. Quality assurance

    International Nuclear Information System (INIS)

    Gillespie, B.M.; Gleckler, B.P.

    1995-01-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results

  11. The influence of education on chosen typographical characteristics perception in documents

    Directory of Open Access Journals (Sweden)

    Petra Talandová

    2008-01-01

    This paper concentrates on the problems of document quality evaluation and perception. It presents the results of our own research, conducted at our faculty in recent months. The research was aimed at marketing materials, their quality and their evaluation by customers. After a further comparative study we concentrated on the question of typography and its evaluation by customers. It has been shown that typography, although not very visible to many readers, does influence readers' evaluation of the whole document. Quality evaluation is influenced by what readers usually see around them, but also by what they are taught at school. For that reason we should influence readers and teach them how documents should be designed. Future specialists should know about this problem. The aim of the research was to find out how typography influences readers and whether the evaluation of typographic quality is influenced by teaching at school. We compared two groups of respondents, one of which had been exposed to typography education and one of which had not. Respondents were asked to fill in a questionnaire that concentrated on correct and incorrect characteristics of the text. Analysis of the results showed that the group exposed to typography lessons had better results. These conclusions will be applied in a new education plan for the innovation of education courses.

  12. Quality assurance during site construction. Pt. 4

    International Nuclear Information System (INIS)

    Potrz, R.; Dilling, H.

    1980-01-01

    Quality control during the assembly of a pressurized water reactor containment: 1.) Fundamental principles of quality control: a short explanation of the specifications and job instructions. 2.) Quality control during assembly: welding control, non-destructive material testing. 3.) Quality deviations: explanation of a repair plan. 4.) Documentation: combining the workshop and site documentation. (orig.)

  13. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    International Nuclear Information System (INIS)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet the LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab

  14. Waste Management facilities cost information: System Cost Model Software Quality Assurance Plan. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, B.L.; Lundeen, A.S.

    1996-02-01

    In May of 1994, Lockheed Idaho Technologies Company (LITCO) in Idaho Falls, Idaho and subcontractors developed the System Cost Model (SCM) application. The SCM estimates life-cycle costs of the entire US Department of Energy (DOE) complex for designing; constructing; operating; and decommissioning treatment, storage, and disposal (TSD) facilities for mixed low-level, low-level, transuranic, and mixed transuranic waste. The SCM uses parametric cost functions to estimate life-cycle costs for various treatment, storage, and disposal modules which reflect planned and existing facilities at DOE installations. In addition, SCM can model new facilities based on capacity needs over the program life cycle. The SCM also provides transportation costs for truck and rail, which include transport of contact-handled, remote-handled, and alpha (transuranic) wastes. The user can provide input data (default data is included in the SCM) including the volume and nature of waste to be managed, the time period over which the waste is to be managed, and the configuration of the waste management complex (i.e., where each installation's generated waste will be treated, stored, and disposed). Then the SCM uses parametric cost equations to estimate the costs of pre-operations (designing), construction costs, operation management, and decommissioning these waste management facilities. For the product to be effective and useful the SCM users must have a high level of confidence in the data generated by the software model. The SCM Software Quality Assurance Plan is part of the overall SCM project management effort to ensure that the SCM is maintained as a quality product and can be relied on to produce viable planning data. This document defines tasks and deliverables to ensure continued product integrity, provide increased confidence in the accuracy of the data generated, and meet the LITCO's quality standards during the software maintenance phase. 8 refs., 1 tab.

  15. Integrated Management System, Configuration and Document Control for Research Reactors

    International Nuclear Information System (INIS)

    Steynberg, B.J.; Bruyn, J.F. du

    2017-01-01

    An integrated management system is a single management framework establishing all the processes necessary for the organisation to address all its goals and objectives. Very often only quality, environment and health & safety goals are included when referring to an integrated management system. However, within the research reactor environment such a system should include goals pertinent to economic, environmental, health, operational, quality, safeguards, safety, security, and social considerations. One of the important objectives of an integrated management system is to create the environment for a healthy safety culture. Configuration management is a disciplined process that involves both management and technical direction to establish and document the design requirements and the physical configuration of the research reactor and to ensure that they remain consistent with each other and the documentation. Configuration is the combination of the physical, functional, and operational characteristics of the structures, systems, and components (SSCs) or parts of the research reactor, operation, or activity. The basic objectives and general principles of configuration management are the same for all research reactors. The objectives of configuration management are to: a) Establish consistency among design requirements, physical configuration, and documentation (including analyses, drawings, and procedures) for the research reactor; b) Maintain this consistency throughout the life of the research reactor, particularly as changes are being made; and c) Retain confidence in the safety of the research reactor. The key elements needed to manage the configuration of research reactors are design requirements, work control, change control, document control, and configuration management assessments. The objective of document control is to ensure that only the most recently approved versions of documents are used in the process of operating, maintaining, and modifying the research reactor

  16. Biological Water Quality Criteria

    Science.gov (United States)

    Page contains links to Technical Documents pertaining to Biological Water Quality Criteria, including, technical assistance documents for states, tribes and territories, program overviews, and case studies.

  17. Extractive Summarisation of Medical Documents

    Directory of Open Access Journals (Sweden)

    Abeed Sarker

    2012-09-01

    Background: Evidence Based Medicine (EBM) practice requires practitioners to extract evidence from published medical research when answering clinical queries. Due to the time-consuming nature of this practice, there is a strong motivation for systems that can automatically summarise medical documents and help practitioners find relevant information. Aim: The aim of this work is to propose an automatic query-focused, extractive summarisation approach that selects informative sentences from medical documents. Method: We use a corpus that is specifically designed for summarisation in the EBM domain. We use approximately half the corpus for deriving important statistics associated with the best possible extractive summaries. We take into account factors such as sentence position, length, sentence content, and the type of the query posed. Using the statistics from the first set, we evaluate our approach on a separate set. Evaluation of the quality of the generated summaries is performed automatically using ROUGE, which is a popular tool for evaluating automatic summaries. Results: Our summarisation approach outperforms all baselines (best baseline score: 0.1594; our score: 0.1653). Further improvements are achieved when query types are taken into account. Conclusion: The quality of extractive summarisation in the medical domain can be significantly improved by incorporating domain knowledge and statistics derived from a specialised corpus. Such techniques can therefore be applied for content selection in end-to-end summarisation systems.
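    A minimal sketch of query-focused extractive selection in the spirit described above (position, length and content features) follows; the feature weights and scoring function are hypothetical, not those derived from the authors' corpus.

      def score_sentence(sentence, position, n_sentences, query_terms):
          # Combine query overlap, position and length into a single score (hypothetical weights).
          words = sentence.lower().split()
          overlap = len(set(words) & {t.lower() for t in query_terms}) / max(len(query_terms), 1)
          position_score = 1.0 - position / max(n_sentences - 1, 1)   # earlier sentences favoured
          length_score = min(len(words) / 25.0, 1.0)                  # prefer reasonably long sentences
          return 0.6 * overlap + 0.25 * position_score + 0.15 * length_score

      def summarise(document_sentences, query_terms, k=3):
          # Select the k highest-scoring sentences, then restore document order.
          scored = [(score_sentence(s, i, len(document_sentences), query_terms), i, s)
                    for i, s in enumerate(document_sentences)]
          top = sorted(scored, reverse=True)[:k]
          return [s for _, _, s in sorted(top, key=lambda x: x[1])]

      sentences = ["Beta blockers reduce mortality after myocardial infarction.",
                   "The trial enrolled 1200 patients.",
                   "Adverse effects were mild and transient."]
      print(summarise(sentences, ["beta", "blockers", "mortality"], k=2))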

  18. Quality in Preschool

    DEFF Research Database (Denmark)

    Næsby, Torben

    The Danish Day Care Facilities Act, which provides the curriculum on which day care education is based, does not stipulate very clearly what children should learn and therefore how educational processes should be organized. This means that we must accept that there are large local differences in ...... in a Mixed Methods design. Quality assessment of learning environments: ERS -line (quantitative baseline) Children’s outcome, measured by LearnLab and observations of learning environments identifying high quality and inclusion (qualitative)...... and dialectically influence and are influenced by each other (Bronfenbrenner & Morris, 2006; Sheridan, 2009). The research project takes on a Critical Realistic approach, Building on the bio-ecological model; theory on development (proximal processes); an interactionist perspective on quality and inclusion...... in the quality of the learning environment and the impact on children’s well-being, learning and development. A pedagogical perspective on quality has to originate from research and documented evidence but must also inherit the children’s perspectives (Sommer et al, 2013). It has to be viewed interactively...

  19. Document retrieval on repetitive string collections.

    Science.gov (United States)

    Gagie, Travis; Hartikainen, Aleksi; Karhu, Kalle; Kärkkäinen, Juha; Navarro, Gonzalo; Puglisi, Simon J; Sirén, Jouni

    2017-01-01

    Most of the fastest-growing string collections today are repetitive, that is, most of the constituent documents are similar to many others. As these collections keep growing, a key approach to handling them is to exploit their repetitiveness, which can reduce their space usage by orders of magnitude. We study the problem of indexing repetitive string collections in order to perform efficient document retrieval operations on them. Document retrieval problems are routinely solved by search engines on large natural language collections, but the techniques are less developed on generic string collections. The case of repetitive string collections is even less understood, and there are very few existing solutions. We develop two novel ideas, interleaved LCPs and precomputed document lists, that yield highly compressed indexes solving the problem of document listing (find all the documents where a string appears), top-k document retrieval (find the k documents where a string appears most often), and document counting (count the number of documents where a string appears). We also show that a classical data structure supporting the latter query becomes highly compressible on repetitive data. Finally, we show how the tools we developed can be combined to solve ranked conjunctive and disjunctive multi-term queries under the simple tf-idf model of relevance. We thoroughly evaluate the resulting techniques in various real-life repetitiveness scenarios, and recommend the best choices for each case.
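    For readers unfamiliar with the three query types, the plain, uncompressed sketch below illustrates document listing, top-k retrieval and document counting on a toy collection; it does not reproduce the paper's interleaved-LCP or precomputed-document-list structures.

      from collections import Counter

      docs = ["abracadabra", "banana banana", "cadabra banana"]

      def document_listing(pattern):
          # All document ids in which the pattern occurs at least once.
          return [i for i, d in enumerate(docs) if pattern in d]

      def top_k(pattern, k):
          # The k documents in which the pattern occurs most often, as (doc id, count) pairs.
          counts = Counter({i: d.count(pattern) for i, d in enumerate(docs) if pattern in d})
          return counts.most_common(k)

      def document_counting(pattern):
          # Number of documents in which the pattern occurs.
          return sum(1 for d in docs if pattern in d)

      print(document_listing("ban"))    # [1, 2]
      print(top_k("ban", 1))            # [(1, 2)]
      print(document_counting("abra"))  # 2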

  20. Water quality modelling of an impacted semi-arid catchment using flow data from the WEAP model

    Science.gov (United States)

    Slaughter, Andrew R.; Mantel, Sukhmani K.

    2018-04-01

    The continuous decline in water quality in many regions is forcing a shift from quantity-based water resources management to a greater emphasis on water quality management. Water quality models can act as invaluable tools as they facilitate a conceptual understanding of processes affecting water quality and can be used to investigate the water quality consequences of management scenarios. In South Africa, the Water Quality Systems Assessment Model (WQSAM) was developed as a management-focussed water quality model that is relatively simple, so that it can utilise the small amount of observed data available. Importantly, WQSAM explicitly links to the systems (yield) models routinely used in water resources management in South Africa by using their flow output to drive water quality simulations. Although WQSAM has been shown to be able to represent the variability of water quality in South African rivers, its focus on management from a South African perspective limits its use to southern African regions for which specific systems model setups exist. Facilitating the use of WQSAM within catchments outside of southern Africa, and within catchments for which these systems model setups do not exist, would require WQSAM to be able to link to a simple-to-use and internationally applied systems model. One such systems model is the Water Evaluation and Planning (WEAP) model, which incorporates a rainfall-runoff component (natural hydrology), and reservoir storage, return flows and abstractions (systems modelling), but within which water quality modelling facilities are rudimentary. The aims of the current study were therefore to: (1) adapt the WQSAM model to be able to use as input the flow outputs of the WEAP model; and (2) provide an initial assessment of how successful this linkage was by applying the WEAP and WQSAM models under historical conditions to the Buffalo River, a small, semi-arid and impacted catchment in the Eastern Cape of South Africa. The simulations of

  1. Hybrid Model for e-Learning Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Suzana M. Savic

    2012-02-01

    E-learning is becoming increasingly important for the competitive advantage of economic organizations and higher education institutions. Therefore, it is becoming a significant aspect of quality which has to be integrated into the management system of every organization or institution. The paper examines e-learning quality characteristics, standards, criteria and indicators and presents a multi-criteria hybrid model for e-learning quality evaluation based on the method of Analytic Hierarchy Process, trend analysis, and data comparison.
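    As a brief illustration of the Analytic Hierarchy Process step mentioned above, the sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector and reports a consistency ratio; the criteria and judgement values are hypothetical.

      import numpy as np

      criteria = ["content quality", "usability", "tutor support"]
      # Saaty-scale pairwise judgements: A[i, j] = importance of criterion i over j
      A = np.array([
          [1.0, 3.0, 5.0],
          [1/3, 1.0, 2.0],
          [1/5, 1/2, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = principal / principal.sum()

      # Consistency index and ratio (random index 0.58 for a 3x3 matrix)
      ci = (np.max(np.real(eigvals)) - len(A)) / (len(A) - 1)
      print(dict(zip(criteria, weights.round(3))), "CR =", round(ci / 0.58, 3))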

  2. Quality control and assurance in building the Dukovany nuclear power plant - check on and filing of quality certificates

    International Nuclear Information System (INIS)

    Sokola, J.

    1986-01-01

    The following documents were used for determining the range of documentation required for the construction of the Dukovany nuclear power plant: the decree valid in Czechoslovakia for all industrial structures, the respective Czechoslovak State and branch standards and several special Soviet regulations. For the central recording of all documents on the quality of deliveries and assemblies a special quality assurance unit was set up on the site of the nuclear power plant. In the system of the flow of documents on the quality of the structure of the Dukovany nuclear power plant there are 16 addressees, from outside subcontractors to on-site managerial staff, work safety inspectors, etc., to the enterprise archives and the department of scientific and technical information. A brief description is presented of the different types of documents on the quality of deliveries and assemblies, and the method of inspection of their content and completeness is described. (Z.M.)

  3. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  4. [Consensus document on overactive bladder in older patients].

    Science.gov (United States)

    Verdejo-Bravo, Carlos; Brenes-Bermúdez, Francisco; Valverde-Moyar, Maria Victoria; Alcántara-Montero, Antonio; Pérez-León, Noemí

    2015-01-01

    Overactive bladder (OAB) is a clinical entity with a high prevalence in the population, having a high impact on quality of life, especially when it occurs with urge urinary incontinence. It is very important to highlight the low rate of consultation for this condition among the older population. This appears to depend on several factors (educational, cultural, professional), and thus leads to the low percentage of older patients who receive appropriate treatment and, on the other hand, a large percentage of older patients with a significant deterioration in their quality of life. Therefore, scientific societies and working groups propose the early detection of OAB in their documents and clinical guidelines. Its etiology is not well known, but is influenced by cerebrovascular processes and other neurological problems, abnormalities of the detrusor muscle of bladder receptors, and obstructive and inflammatory processes of the lower urinary tract. Its diagnosis is clinical, and in the great majority of cases it is possible to establish the diagnosis and etiopathogenic orientation without the need for complex diagnostic procedures. Currently, there are effective treatments for OAB, and we should choose the most appropriate one for each elderly patient, based on their individual characteristics. Based on the main clinical practice guidelines, a progressive treatment is proposed, with the antimuscarinics being the most recommended drug treatment. Therefore, a group of professionals highly involved in clinical practice for the elderly, representing two scientific societies (the Spanish Society of Geriatrics and Gerontology [SEGG] and the Spanish Society of Primary Care Physicians [SEMERGEN]), developed this consensus document with the main objective of establishing practices and valid strategies, focused on simplifying the management of this clinical entity in the elderly population, and especially on improving their quality of life. The recommendations presented in this

  5. Process air quality data

    Science.gov (United States)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
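    The original software was purpose-built, but a rough modern equivalent of its first steps (daily measures of location, simple correlations, and a time-history plot) can be sketched with pandas; the pollutant column names and example values below are hypothetical.

      import pandas as pd

      # Hypothetical hourly records; in practice these would be read from the archived files.
      idx = pd.date_range("1973-06-01", periods=72, freq="h")
      df = pd.DataFrame({"ozone": 0.03 + 0.02 * (idx.hour / 24),
                         "no2": 0.02 + 0.01 * (idx.hour / 24),
                         "co": 1.0 + 0.5 * (idx.hour / 24)}, index=idx)

      # Daily statistical measures of location for each pollutant
      daily = df.resample("D").agg(["mean", "median", "max"])

      # Simple correlation coefficients between pollutants
      correlations = df.corr()

      # daily[("ozone", "mean")].plot(title="Daily mean ozone")  # time-history plot (needs matplotlib)
      print(daily)
      print(correlations)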

  6. Evaluation of a mHealth Data Quality Intervention to Improve Documentation of Pregnancy Outcomes by Health Surveillance Assistants in Malawi: A Cluster Randomized Trial.

    Directory of Open Access Journals (Sweden)

    Olga Joos

    While community health workers are being recognized as an integral work force with growing responsibilities, increased demands can potentially affect motivation and performance. The ubiquity of mobile phones, even in hard-to-reach communities, has facilitated the pursuit of novel approaches to support community health workers beyond traditional modes of supervision, job aids, in-service training, and material compensation. We tested whether supportive short message services (SMS) could improve reporting of pregnancies and pregnancy outcomes among community health workers (Health Surveillance Assistants, or HSAs) in Malawi. We designed a set of one-way SMS that were sent to HSAs on a regular basis during a 12-month period. We tested the effectiveness of the cluster-randomized intervention in improving the complete documentation of a pregnancy. We defined complete documentation as a pregnancy for which a specific outcome was recorded. HSAs in the treatment group received motivational and data quality SMS. HSAs in the control group received only motivational SMS. During baseline and intervention periods, we matched reported pregnancies to reported outcomes to determine if reporting of matched pregnancies differed between groups and by period. The trial is registered as ISCTRN24785657. Study results show that the mHealth intervention improved the documentation of matched pregnancies in both the treatment (OR 1.31, 95% CI: 1.10-1.55, p<0.01) and control (OR 1.46, 95% CI: 1.11-1.91, p = 0.01) groups relative to the baseline period, despite differences in SMS content between groups. The results should be interpreted with caution given that the study was underpowered. We did not find a statistically significant difference in matched pregnancy documentation between groups during the intervention period (OR 0.94, 95% CI: 0.63-1.38, p = 0.74). mHealth applications have the potential to improve the tracking and data quality of pregnancies and pregnancy outcomes

  7. Ethnobotanical Knowledge Is Vastly Under-Documented in Northwestern South America

    Science.gov (United States)

    Cámara-Leret, Rodrigo; Paniagua-Zambrana, Narel; Balslev, Henrik; Macía, Manuel J.

    2014-01-01

    A main objective of ethnobotany is to document traditional knowledge about plants before it disappears. However, little is known about the coverage of past ethnobotanical studies and thus about how well the existing literature covers the overall traditional knowledge of different human groups. To bridge this gap, we investigated ethnobotanical data-collecting efforts across four countries (Colombia, Ecuador, Peru, Bolivia), three ecoregions (Amazon, Andes, Chocó), and several human groups (including Amerindians, mestizos, and Afro-Americans). We used palms (Arecaceae) as our model group because of their usefulness and pervasiveness in the ethnobotanical literature. We carried out a large number of field interviews (n = 2201) to determine the coverage and quality of palm ethnobotanical data in the existing ethnobotanical literature (n = 255) published over the past 60 years. In our fieldwork in 68 communities, we collected 87,886 use reports and documented 2262 different palm uses and 140 useful palm species. We demonstrate that traditional knowledge on palm uses is vastly under-documented across ecoregions, countries, and human groups. We suggest that the use of standardized data-collecting protocols in wide-ranging ethnobotanical fieldwork is a promising approach for filling critical information gaps. Our work contributes to the Aichi Biodiversity Targets and emphasizes the need for signatory nations to the Convention on Biological Diversity to respond to these information gaps. Given our findings, we hope to stimulate the formulation of clear plans to systematically document ethnobotanical knowledge in northwestern South America and elsewhere before it vanishes. PMID:24416449

  8. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  9. A-Cell equipment removal quality process plan

    International Nuclear Information System (INIS)

    TAKASUMI, D.S.

    1999-01-01

    This document establishes the quality assuring activities used to manage the 324 building A-Cell equipment removal activity. This activity will package, remove, transport and dispose of the equipment in A-Cell. This document is provided to ensure that appropriate and effective quality assuring activities have been incorporated into the work controlling documentation and procedures

  10. Quality Assurance Guidance for the Collection of Meteorological Data Using Passive Radiometers

    Science.gov (United States)

    This document augments the February 2000 guidance entitled Meteorological Monitoring Guidance for Regulatory Modeling Applications and the March 2008 guidance entitled Quality Assurance Handbook for Air Pollution Measurement Systems Volume IV: Meteorological Measurements Version ...

  11. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  12. Spent Nuclear Fuel Project document control and Records Management Program Description

    International Nuclear Information System (INIS)

    MARTIN, B.M.

    2000-01-01

    The Spent Nuclear Fuel (SNF) Project document control and records management program, as defined within this document, is based on a broad spectrum of regulatory requirements, Department of Energy (DOE) and Project Hanford and SNF Project-specific direction and guidance. The SNF Project Execution Plan, HNF-3552, requires the control of documents and management of records under the auspices of configuration control, conduct of operations, training, quality assurance, work control, records management, data management, engineering and design control, operational readiness review, and project management and turnover. Implementation of the controls, systems, and processes necessary to ensure compliance with applicable requirements is facilitated through plans, directives, and procedures within the Project Hanford Management System (PHMS) and the SNF Project internal technical and administrative procedures systems. The documents cited within this document are those which directly establish or define the SNF Project document control and records management program. There are many peripheral documents that establish requirements and provide direction pertinent to managing specific types of documents that, for the sake of brevity and clarity, are not cited within this document

  13. SBKF Modeling and Analysis Plan: Buckling Analysis of Compression-Loaded Orthogrid and Isogrid Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Hilburger, Mark W.

    2013-01-01

    This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.

  14. Quality-assurance plan for water-quality activities in the U.S. Geological Survey Washington Water Science Center

    Science.gov (United States)

    Conn, Kathleen E.; Huffman, Raegan L.; Barton, Cynthia

    2017-05-08

    In accordance with guidelines set forth by the Office of Water Quality in the Water Mission Area of the U.S. Geological Survey, a quality-assurance plan has been created for use by the Washington Water Science Center (WAWSC) in conducting water-quality activities. This quality-assurance plan documents the standards, policies, and procedures used by the WAWSC for activities related to the collection, processing, storage, analysis, and publication of water-quality data. The policies and procedures documented in this quality-assurance plan for water-quality activities complement the quality-assurance plans for surface-water and groundwater activities at the WAWSC.

  15. Mathematical modelling of the process of quality control of construction products

    Directory of Open Access Journals (Sweden)

    Pogorelov Vadim

    2017-01-01

    Full Text Available The study presents the results of several years of research on quality management in the industrial production of construction products, based on mathematical modelling techniques, and describes the process and results of implementing a programme of monitoring and quality control in the production process of an enterprise. The aim of this work is to present to the scientific community the practical results of applying mathematical modelling in application programs. The research covers the description of the applied mathematical models and the practical results of their application to quality control assessment. The authors have used this mathematical model in practice, and the article presents the results of applying it. The authors developed experimental software for quality management and assessment using mathematical modelling methods, and they continue research in this direction to improve diagnostic and quality management systems based on mathematical modelling of prognostic and diagnostic processes.

  16. Quality education as quality system support

    International Nuclear Information System (INIS)

    Crnoshia, L.; Gavriloska, M.; Denkovska, J.; Dimitrovski, A.

    1999-01-01

    Over the last ten years we have witnessed a transformation of the political and economic system that has imposed the need to change the way of thinking and of planning work. Quality has become an imperative of doing business and a precondition for survival in the market. Solving quality problems requires a planned and systematic approach, which presupposes appropriate personnel with adequate knowledge of quality management and quality system implementation. Bearing in mind the need for a documented quality system and quality management, OKTA has already started an educational process in quality for its personnel as a precondition for the successful establishment of a quality system. In this paper we present the quality education approach and the manner of its realization at the OKTA Crude Oil Refinery - Skopje, Macedonia. (Original)

  17. Improving the Quality of Nursing Documentation in Home Health Care Setting

    Science.gov (United States)

    Obioma, Chidiadi

    2017-01-01

    Poor nursing documentation of patient care was identified in daily nurse visit notes in a health care setting. This problem affects effective communication of patient status with other clinicians, thereby jeopardizing clinical decision-making. The purpose of this evidence-based project was to determine the impact of a retraining program on the…

  18. River water quality model no. 1 (RWQM1): I. Modelling approach

    DEFF Research Database (Denmark)

    Shanahan, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    Successful river water quality modelling requires the specification of an appropriate model structure and process formulation. Both must be related to the compartment structure of running water ecosystems including their longitudinal, vertical, and lateral zonation patterns. Furthermore...

  19. Document flow segmentation for business applications

    Science.gov (United States)

    Daher, Hani; Belaïd, Abdel

    2013-12-01

    The aim of this paper is to propose a supervised document flow segmentation approach applied to real-world heterogeneous documents. Our algorithm treats the flow of documents as couples of consecutive pages and studies the relationship that exists between them. First, sets of features are extracted from the pages, and we propose an approach that models the couple of pages as a single feature-vector representation. This representation is provided to a binary classifier, which classifies the relationship as either segmentation or continuity. In the case of segmentation, we consider that we have a complete document and the analysis of the flow continues by starting a new document. In the case of continuity, the couple of pages is assimilated into the same document and the analysis continues along the flow. If it is uncertain whether the relationship between the couple of pages should be classified as continuity or segmentation, a rejection is decided and the pages analyzed up to this point are considered a "fragment". The first classification already provides good results, approaching 90% on certain documents, which is high at this level of the system.
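
    To make the pairwise-classification idea above concrete, the following Python sketch (illustrative only, not the authors' code) builds a feature vector for each couple of consecutive pages and applies a binary classifier with a rejection band for uncertain cases; the feature construction and thresholds are assumptions made for the example.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def pair_features(page_a, page_b):
        # Hypothetical representation of a couple of pages as a single vector:
        # each page is assumed to already be a numeric descriptor (layout/text features).
        return np.concatenate([page_a, page_b, np.abs(page_a - page_b)])

    def segment_flow(pages, clf, low=0.4, high=0.6):
        # Walk the flow page by page; cut when P(segmentation) is high, keep the
        # document growing when it is low, and reject (produce a "fragment") in between.
        decisions = []
        for a, b in zip(pages[:-1], pages[1:]):
            p_seg = clf.predict_proba([pair_features(a, b)])[0, 1]
            if p_seg >= high:
                decisions.append("segmentation")
            elif p_seg <= low:
                decisions.append("continuity")
            else:
                decisions.append("reject")
        return decisions

    # Training labels consecutive page pairs the same way, e.g.:
    # clf = RandomForestClassifier().fit([pair_features(a, b) for a, b in train_pairs], labels)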

  20. Quality control of geological voxel models using experts' gaze

    NARCIS (Netherlands)

    Maanen, P.P. van; Busschers, F.S.; Brouwer, A.M.; Meulen, M.J. van der; Erp, J.B.F. van

    2015-01-01

    Due to an expected increase in geological voxel model data-flow and user demands, the development of improved quality control for such models is crucial. This study explores the potential of a new type of quality control that improves the detection of errors by just using gaze behavior of 12

  1. Quality Control of Geological Voxel Models using Experts' Gaze

    NARCIS (Netherlands)

    van Maanen, Peter-Paul; Busschers, Freek S.; Brouwer, Anne-Marie; van der Meulendijk, Michiel J.; van Erp, Johannes Bernardus Fransiscus

    Due to an expected increase in geological voxel model data-flow and user demands, the development of improved quality control for such models is crucial. This study explores the potential of a new type of quality control that improves the detection of errors by just using gaze behavior of 12

  2. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    Science.gov (United States)

    Khamidullin, R. I.

    2018-05-01

    The paper presents milestones of an optimal mathematical model for a business process related to the cost estimate documentation compiled during the construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and the deterioration of business strategy. Business process management is presented as business process modelling aimed at improving the studied business process, namely the main optimization criteria and recommendations for improving the above-mentioned business model.

  3. FACSIM/MRS (Monitored Retrievable Storage)-2: Storage and shipping model documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Chockie, A.D.; Hostick, C.J.; Otis, P.T.; Sovers, R.A.

    1987-06-01

    The Pacific Northwest Laboratory (PNL) has developed a stochastic computer model, FACSIM/MRS, to assist in assessing the operational performance of the Monitored Retrievable Storage (MRS) waste-handling facility. This report provides the documentation and user's guide for FACSIM/MRS-2, which is also referred to as the back-end model. The FACSIM/MRS-2 model simulates the MRS storage and shipping operations, which include handling canistered spent fuel and secondary waste in the shielded canyon cells, in onsite yard storage, and in repository shipping cask loading areas.

  4. HDTS 2017.1 Testing and Verification Document

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)]

    2017-12-01

    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012; Whiteside, 2017b). In this report we have created a suite of automated test cases and a system to analyze the results of those tests as well as documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect.
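
    As a loose illustration of the defect-to-test-case workflow described above (the function and values below are hypothetical and are not taken from the HDTS code base), a regression suite can pin baseline behaviour and add a dedicated test for each discovered defect:

    import unittest

    def dose_at_point(source_strength, distance_m):
        # Hypothetical stand-in for a field-system calculation (inverse-square falloff).
        if distance_m <= 0:
            raise ValueError("distance must be positive")
        return source_strength / (distance_m ** 2)

    class DoseRegressionTests(unittest.TestCase):
        def test_known_value(self):
            # Baseline behaviour pinned to a reference value within tolerance.
            self.assertAlmostEqual(dose_at_point(100.0, 10.0), 1.0, places=6)

        def test_defect_zero_distance(self):
            # Added when a defect (division by zero) was found; guards against regression.
            with self.assertRaises(ValueError):
                dose_at_point(100.0, 0.0)

    if __name__ == "__main__":
        unittest.main()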

  5. Managing Quality

    CERN Document Server

    Kelemen, Mihaela L

    2002-01-01

    Managing Quality provides a comprehensive review and critical analysis of quality management discourses and techniques by drawing on a number of management disciplines such as operations management, HRM, organizational behaviour, strategy, marketing and organization theory. The book: - introduces readers to key concepts and issues in quality management - provides an overview of both managerial and critical perspectives on quality management - presents the 'wisdom' of quality management gurus - documents the way quality is pursued in manufacturing, service and public sector organizations - comp

  6. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  7. Embryo quality predictive models based on cumulus cells gene expression

    Directory of Open Access Journals (Sweden)

    Devjak R

    2016-06-01

    Full Text Available Since the introduction of in vitro fertilization (IVF) into the clinical practice of infertility treatment, indicators of high-quality embryos have been investigated. Cumulus cells (CC) have a specific gene expression profile according to the developmental potential of the oocyte they surround, and therefore specific gene expression could be used as a biomarker. The aim of our study was to combine more than one biomarker to observe improvement in the prediction of embryo development. In this study, 58 CC samples from 17 IVF patients were analyzed. The study was approved by the Republic of Slovenia National Medical Ethics Committee. Gene expression analysis [quantitative real-time polymerase chain reaction (qPCR)] for five genes, analyzed according to embryo quality level, was performed. Two prediction models were tested for embryo quality prediction: a binary logistic model and a decision tree model. As the main outcome, gene expression levels for the five genes were taken and the area under the curve (AUC) for the two prediction models was calculated. Among the tested genes, AMHR2 and LIF showed a significant expression difference between high-quality and low-quality embryos. These two genes were used for the construction of the two prediction models: the binary logistic model yielded an AUC of 0.72 ± 0.08 and the decision tree model yielded an AUC of 0.73 ± 0.03. The two prediction models yielded similar predictive power for differentiating high- and low-quality embryos. In terms of eventual clinical decision making, the decision tree model resulted in easy-to-interpret rules that are highly applicable in clinical practice.
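
    A minimal sketch of the two modelling approaches named above (binary logistic regression and a decision tree, evaluated by AUC) is given below; the expression values are synthetic stand-ins, since the study's qPCR data are not reproduced here.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(58, 5))        # 58 cumulus-cell samples x 5 gene expression levels
    y = rng.integers(0, 2, size=58)     # 1 = high-quality embryo, 0 = low-quality

    models = {
        "binary logistic": LogisticRegression(),
        "decision tree": DecisionTreeClassifier(max_depth=3),
    }
    for name, model in models.items():
        # Cross-validated AUC, analogous to the AUC values reported in the abstract.
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: AUC = {auc.mean():.2f} +/- {auc.std():.2f}")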

  8. Unsupervised Document Embedding With CNNs

    OpenAIRE

    Liu, Chundi; Zhao, Shunan; Volkovs, Maksims

    2017-01-01

    We propose a new model for unsupervised document embedding. Leading existing approaches either require complex inference or use recurrent neural networks (RNN) that are difficult to parallelize. We take a different route and develop a convolutional neural network (CNN) embedding model. Our CNN architecture is fully parallelizable, resulting in over 10x speedup in inference time over RNN models. The parallelizable architecture enables training deeper models where each successive layer has increasin...
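
    The core idea can be sketched as follows (an illustrative PyTorch toy, not the paper's architecture or hyperparameters): a 1-D convolution over word embeddings processes all token positions in parallel, and a global max-pool yields a fixed-size document embedding.

    import torch
    import torch.nn as nn

    class CNNDocEncoder(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, out_dim=256, kernel=5):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.conv = nn.Conv1d(embed_dim, out_dim, kernel_size=kernel, padding=kernel // 2)

        def forward(self, token_ids):                  # (batch, seq_len)
            x = self.embed(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
            x = torch.relu(self.conv(x))               # all positions computed in parallel
            return x.max(dim=2).values                 # global max-pool -> document embedding

    docs = torch.randint(0, 10000, (4, 200))           # 4 documents, 200 tokens each
    print(CNNDocEncoder(vocab_size=10000)(docs).shape) # torch.Size([4, 256])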

  9. Implementation of a documentation model comprising nursing terminologies--theoretical and methodological issues.

    Science.gov (United States)

    von Krogh, Gunn; Nåden, Dagfinn

    2008-04-01

    To describe and discuss theoretical and methodological issues in the implementation of a nursing services documentation model comprising NANDA nursing diagnoses, Nursing Intervention Classification, and Nursing Outcome Classification terminologies. The model was developed for the electronic patient record and was implemented in a psychiatric hospital at the organizational level and on five test wards in 2001-2005. Rogers' theory guided the process of innovation, whereas the implementation procedure of McCloskey and Bulechek, combined with adult learning principles, guided the test-site implementation. The test wards adopted the model to differing degrees. Two wards succeeded fully, including a ward with a high percentage of staff with an interdisciplinary background. Better planning regarding the organization's innovative aptitude, the innovation strategies, and the use of methods differentiated according to clinicians' individual premises for learning nursing terminologies might have enhanced adoption of the model. The findings help in understanding the nature of barriers and the importance of careful planning when implementing electronic patient record elements, particularly nursing terminologies, in nursing care services, and indicate how a theory and a specific procedure can be used to guide the implementation process across different levels of management.

  10. New Challenges of the Documentation in Media

    Directory of Open Access Journals (Sweden)

    Antonio García Jiménez

    2015-07-01

    Full Text Available This special issue, presented by index.comunicación, is focused on media-related information & documentation. This field is undergoing constant and profound changes, especially visible in documentation processes: a situation characterized by tablets, smartphones, and applications, by the nearly complete digitization of traditional documents, and by the crisis of the press business model, which involves changes in journalists' tasks and in their relationship with documentation. Papers included in this special issue focus on some of the concerns in this domain: the progressive autonomy of the journalist in accessing information sources, the role of press offices as documentation sources, the search for information on the web, the situation of media blogs, the viability of elements of information architecture in smart TV, and the development of social TV and its connection to documentation.

  11. Designing Documents for People to Use

    Directory of Open Access Journals (Sweden)

    David Sless

    Full Text Available This article reports on the work of Communication Research Institute (CRI, an international research center specializing in communication and information design. With the support of government, regulators, industry bodies, and business—and with the participation of people and their advocates—CRI has worked on over 200 public document design projects since it began as a small unit in 1985. CRI investigates practical methods and achievable standards for designing digital and paper public documents, including forms; workplace procedural notices; bills, letters, and emails sent by organizations; labels and instructions that accompany products and services; and legal and financial documents and contracts. CRI has written model grammars for the document types it designs, and the cumulative data from CRI projects has led to a set of systematic methods for designing public-use documents to a high standard. Through research, design, publishing, and advocacy, CRI works to measurably improve the ordinary documents we all have to use. Keywords: Information design, Design methods, Design standards, Communication design, Design diagnostic testing, Design research

  12. A linked hydrodynamic and water quality model for the Salton Sea

    Science.gov (United States)

    Chung, E.G.; Schladow, S.G.; Perez-Losada, J.; Robertson, Dale M.

    2008-01-01

    A linked hydrodynamic and water quality model was developed and applied to the Salton Sea. The hydrodynamic component is based on the one-dimensional numerical model, DLM. The water quality model is based on a new conceptual model for nutrient cycling in the Sea, and simulates temperature, total suspended sediment concentration, nutrient concentrations (including PO₄³⁻, NO₃⁻, and NH₄⁺), DO concentration, and chlorophyll a concentration as functions of depth and time. Existing water temperature data from 1997 were used to verify that the model could accurately represent the onset and breakup of thermal stratification. 1999 is the only year with a near-complete dataset for water quality variables for the Salton Sea. The linked hydrodynamic and water quality model was run for 1999, and by adjustment of rate coefficients and other water quality parameters, a good match with the data was obtained. In this article, the model is fully described and the model results for reductions in external phosphorus load on chlorophyll a distribution are presented. © 2008 Springer Science+Business Media B.V.

  13. Emergency Response Capability Baseline Needs Assessment - Requirements Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2016-10-04

    This document was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by LLNL Emergency Management Department Head James Colson. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response.

  14. Peer review of RELAP5/MOD3 documentation

    International Nuclear Information System (INIS)

    Craddick, W.G.

    1994-01-01

    A peer review was performed on a portion of the documentation of the RELAP5/MOD3 computer code. The review was performed in two phases. The first phase was a review of Vol. III, Developmental Assessment Problems, and Vol. IV, Models and Correlations. The reviewers for this phase were Dr. Peter Griffith, Dr. Yassin Hassan, Dr. Gerald S. Lellouche, Dr. Marino di Marzo and Mr. Mark Wendel. The reviewers recommended a number of improvements, including using a frozen version of the code for assessment guided by a validation plan, better discussion of discrepancies between the code and experimental data, and better justification for flow regime maps and extension of models beyond their data base. The second phase was a review of Vol. VI, Quality Assurance of Numerical Techniques in RELAP5/MOD3. The reviewers for the second phase were Mr. Mark Wendel and Dr. Paul T. Williams. Recommendations included correction of numerous grammatical and typographical errors and better justification for the use of Lax's Equivalence Theorem

  15. Automatic classification of journalistic documents on the Internet

    Directory of Open Access Journals (Sweden)

    Elias OLIVEIRA

    Full Text Available Online journalism is increasing every day. There are many news agencies, newspapers, and magazines using digital publication in the global network. Documents published online are available to users, who use search engines to find them. In order to deliver documents that are relevant to the search, they must be indexed and classified. Due to the vast number of documents published online every day, a lot of research has been carried out to find ways to facilitate automatic document classification. The objective of the present study is to describe an experimental approach for the automatic classification of journalistic documents published on the Internet using the Vector Space Model for document representation. The model was tested on a real journalism database, using algorithms that have been widely reported in the literature. This article also describes the metrics used to assess the performance of these algorithms and their required configurations. The results obtained show the efficiency of the method used and justify further research to find ways to facilitate the automatic classification of documents.
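
    A minimal Vector Space Model pipeline of the kind described (illustrative only; the study used its own journalism database and algorithms, and the texts below are invented) can be expressed as TF-IDF document vectors fed to a standard classifier:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    train_texts = ["stocks fall on inflation fears", "team wins the national final"]
    train_labels = ["economy", "sports"]

    # TF-IDF turns each document into a vector in term space (the Vector Space Model);
    # any standard classifier can then be trained and evaluated on those vectors.
    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(train_texts, train_labels)
    print(model.predict(["central bank raises interest rates"]))  # e.g. ['economy']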

  16. Technical documentation and user's guide for City-County Allocation Model (CCAM). Version 1.0

    International Nuclear Information System (INIS)

    Clark, L.T. Jr.; Scott, M.J.; Hammer, P.

    1986-05-01

    The City-County Allocation Model (CCAM) was developed as part of the Monitored Retrievable Storage (MRS) Program. The CCAM model was designed to allocate population changes forecasted by the MASTER model to specific local communities within commuting distance of the MRS facility. The CCAM model was designed to then forecast the potential changes in demand for key community services, such as housing, police protection, and utilities, for these communities. The CCAM model uses a flexible on-line database on demand for community services that is based on a combination of local service levels and state and national service standards. The CCAM model can be used to quickly forecast the potential community service consequences of economic development for local communities anywhere in the country. The remainder of this document is organized as follows. The purpose of this manual is to assist the user in understanding and operating the City-County Allocation Model (CCAM). The manual explains the data sources for the model and code modifications as well as the operational procedures.

  17. Private healthcare quality: applying a SERVQUAL model.

    Science.gov (United States)

    Butt, Mohsin Muhammad; de Run, Ernest Cyril

    2010-01-01

    This paper seeks to develop and test the SERVQUAL model scale for measuring Malaysian private health service quality. The study consists of 340 randomly selected participants visiting a private healthcare facility during a three-month data collection period. Data were analyzed using means, correlations, principal component and confirmatory factor analysis to establish the modified SERVQUAL scale's reliability, underlying dimensionality and convergent, discriminant validity. Results indicate a moderate negative quality gap for overall Malaysian private healthcare service quality. Results also indicate a moderate negative quality gap on each service quality scale dimension. However, scale development analysis yielded excellent results, which can be used in wider healthcare policy and practice. Respondents were skewed towards a younger population, causing concern that the results might not represent all Malaysian age groups. The study's major contribution is that it offers a way to assess private healthcare service quality. Second, it successfully develops a scale that can be used to measure health service quality in Malaysian contexts.

  18. How to set up and manage quality control and quality assurance

    NARCIS (Netherlands)

    Visschedijk, M.; Hendriks, R.; Nuyts, K.

    2005-01-01

    This document provides a general introduction to clarify the differences between quality control (QC) and quality assurance (QA). In addition it serves as a starting point for implementing a quality system approach within an organization. The paper offers practical guidance to the implementation of

  19. THE INFLUENCE OF SERVICE QUALITY, PRODUCT QUALITY, AND PERCEIVED VALUE ON CUSTOMER SATISFACTION AND LOYALTY IN THE PERHUTANI WOOD INDUSTRY

    Directory of Open Access Journals (Sweden)

    Winarni Saftarya Timedina Gultom

    2016-07-01

    Full Text Available The purpose of this research is to analyze the influence of service quality, product quality, and perceived value on the satisfaction and loyalty of the customers of the Perhutani wood industry, the level of customer satisfaction and loyalty, and the managerial implications. Four data analysis techniques were used: structural equation modeling with partial least squares (SEMPLS), with 24 exogenous latent variable attributes and 6 endogenous latent variable attributes, together with CSI, CLI, and IPA. The SEMPLS results showed that the variables of service quality (T-statistic=2.79* > T-table=1.96) and product quality (T-statistic=6.45* > T-table=1.96) significantly influence the satisfaction of the customers of the Perhutani wood industry, whereas the perceived value variable (T-statistic=0.65 < T-table=1.96) did not have a significant influence. Based on the CSI and CLI, customers were satisfied with and loyal to the service quality, product quality, and perceived value provided by the Perhutani wood industry. The main priorities for performance improvement based on the IPA include the attributes of accuracy of product sizes (length, width, thickness), complaint handling, product delivery, service employees, comprehension of needs, officer awareness, information on documents and prevailing prices, service completion time, purchase document accuracy, price certainty, allocation certainty, suitability of prices with quality, and affordable product prices. Keywords: perhutani, wood industry, satisfaction, loyalty, SEMPLS

  20. Video-documentation: 'The Pannonic ozon project'

    International Nuclear Information System (INIS)

    Loibl, W.; Cabela, E.; Mayer, H. F.; Schmidt, M.

    1998-07-01

    The goal of the project was the production of a video film as documentation of the Pannonian Ozone Project (POP). The main part of the video describes the POP model, consisting of the modules meteorology, emissions, and chemistry, developed during the POP project. The model considers the European emission patterns of ozone precursors and the actual wind fields. It calculates ozone build-up and depletion within air parcels due to emissions and the weather situation along trajectory routes. Actual ozone concentrations are calculated during model runs simulating the photochemical processes within air parcels moving along 4-day trajectories before reaching the Vienna region. The model computations were validated during extensive ground-based and aircraft-based measurements of ozone precursors and ozone concentrations within the POP study area. Scenario computations were used to determine how much ozone can be reduced in north-eastern Austria by emission control measures. The video lasts 12:20 minutes and consists of computer animations and live video scenes, presenting the ozone problem in general, the POP model, and the model results. The video was produced in co-operation by the Austrian Research Center Seibersdorf - Department of Environmental Planning (ARCS) and Joanneum Research - Institute of Information Systems (JR). ARCS was responsible for the idea, concept, storyboard, and text, while JR was responsible for computer animation and general video production. The speaker text was written with scientific advice from the POP project partners: Institute of Meteorology and Physics, University of Agricultural Sciences - Vienna; Environment Agency Austria - Air Quality Department; Austrian Research Center Seibersdorf - Environmental Planning Department/System Research Division. The film was produced in German and English versions. (author)

  1. Classification of e-government documents based on cooperative expression of word vectors

    Science.gov (United States)

    Fu, Qianqian; Liu, Hao; Wei, Zhiqiang

    2017-03-01

    Effective document classification is a powerful technique for handling the huge volume of e-government documents automatically rather than processing them manually. The word-to-vector (word2vec) model, which converts words into low-dimensional semantic vectors, can be successfully employed to classify e-government documents. In this paper, we propose the cooperative expression of word vectors (Co-word-vector), whose multi-granularity integration explores the possibility of modeling documents in the semantic space. We also aim to improve the weighted continuous bag-of-words model based on the word2vec model and the distributed representation of topic-words based on the LDA model. Combining the two levels of word representation, results show that our proposed method outperforms the traditional method on e-government document classification.
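
    One ingredient of such an approach can be sketched as follows (a rough illustration under assumed data, not the paper's Co-word-vector method): represent each document as an average of pre-trained word vectors and feed it to a linear classifier; the paper additionally weights the words and mixes in LDA topic-word representations.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Assume `word_vectors` maps tokens to pre-trained embeddings (toy random stand-ins here).
    rng = np.random.default_rng(0)
    vocab = ["tax", "form", "road", "permit", "license"]
    word_vectors = {w: rng.normal(size=50) for w in vocab}

    def doc_vector(tokens):
        # Document vector = mean of the word vectors of its known tokens.
        vecs = [word_vectors[t] for t in tokens if t in word_vectors]
        return np.mean(vecs, axis=0) if vecs else np.zeros(50)

    docs, labels = [["tax", "form"], ["road", "permit"]], [0, 1]
    clf = LogisticRegression().fit(np.vstack([doc_vector(d) for d in docs]), labels)
    print(clf.predict([doc_vector(["license", "form"])]))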

  2. Integrating model of the Project Independence Evaluation System. Volume VI. Data documentation. Part I

    Energy Technology Data Exchange (ETDEWEB)

    Allen, B J

    1979-02-01

    This documentation describes the PIES Integrating Model as it existed on January 1, 1978. This volume contains two chapters. In Chapter I, Overview, the following subjects are briefly described: supply data, EIA projection series and scenarios, demand data and assumptions, and supply assumptions - oil and gas availabilities. Chapter II contains supply and demand data tables and sources used by the PIES Integrating Model for the mid-range scenario target years 1985 and 1990. Tabulated information is presented for demand, price, and elasticity data; coal data; imports data; oil and gas data; refineries data; synthetics, shale, and solar/geothermal data; transportation data; and utilities data.

  3. Variability in the quality of overdose advice in Summary of Product Characteristics (SPC) documents: gut decontamination recommendations for CNS drugs.

    Science.gov (United States)

    Wall, Andrew J B; Bateman, D N; Waring, W S

    2009-01-01

    Deliberate self-poisoning is a major cause of morbidity and mortality. The Summary of Product Characteristics (SPC) document is a legal requirement for all drugs, and Section 4.9 addresses the features of toxicity and clinical advice on management of overdose. The quality and appropriateness of this advice have received comparatively little attention. Section 4.9 of the SPC was examined for all drugs in the central nervous system (CNS) category of the British National Formulary. Advice concerning gut decontamination was examined with respect to specific interventions: induced vomiting, oral activated charcoal, gastric lavage, and other interventions. Data were compared with standard reference sources for clinical management advice in poisoning. These were graded 'A' if no important differences existed, 'B' if differences were noted but not thought clinically important, and 'C' if differences were thought to be clinically significant. SPC documents were examined for 258 medications from 67 manufacturers. The overall agreement was 'A' in 23 (8.9%), 'B' in 28 (10.9%) and 'C' in 207 (80.2%). Discrepancies were due to inappropriate recommendation of induced emesis in 21.7% (95% confidence interval 17.1, 27.1), gastric lavage in 38.4% (32.7, 44.4), other gut decontamination in 5.8% (3.6, 9.4) and failure to recommend oral activated charcoal in 57.4% (51.1, 63.4). Gut decontamination advice in SPC documents with respect to CNS drugs was inadequate. Possible reasons for the observed discrepancies and ways of improving the consistency of advice are proposed.

  4. The Lived Environment Life Quality Model for institutionalized people with dementia.

    Science.gov (United States)

    Wood, Wendy; Lampe, Jenna L; Logan, Christina A; Metcalfe, Amy R; Hoesly, Beth E

    2017-02-01

    There is a need for a conceptual practice model that explicates ecological complexities involved in using occupation to optimize the quality of life of institutionalized people with dementia. This study aimed to prepare the Lived Environment Life Quality Model, a dementia-specific conceptual practice model of occupational therapy in institutional facilities, for publication and application to practice. Interviews and focus groups with six expert occupational therapists were subjected to qualitative content analysis to confirm, disconfirm, and further develop the model. The model's lived-environment domain as the focus of assessment and intervention was extensively confirmed, and its quality-of-life domain as the focus of intervention goals and outcomes was both confirmed and further developed. As confirmed in this study, the Lived Environment Life Quality Model is a client-centred, ecologically valid, and occupation-focused guide to optimizing quality of life of institutionalized adults with dementia in present moments and progressively over time.

  5. THE RELATIONSHIP BETWEEN MODELS OF QUALITY MANAGEMENT AND CSR

    Directory of Open Access Journals (Sweden)

    CĂTĂLINA SITNIKOV

    2015-03-01

    Full Text Available Lately, quality management has increasingly integrated Corporate Social Responsibility (CSR) among its components. With strong roots in sustainable development, environmental protection, social justice, and economic growth, CSR raises numerous issues related to profit, business performance, and firms whose activities are based on the quality of management. From this point of view, quality management models built on the fundamental principles of quality become the foundation and catalyst for the effective implementation of CSR in organizations. This is why it is necessary to investigate the extent to which quality management models provide frameworks and guidelines for integrating CSR into quality management and, moreover, into the management of the organization, with a clear focus on the extent to which the concept can be institutionalized and operationalized by the organization.

  6. A pilot modeling technique for handling-qualities research

    Science.gov (United States)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  7. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities such as easy maintenance, reuse, and portability, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through 'manual' management of quality. There are a number of approaches to software quality assurance, typically based on software quality models (e.g. ISO 9126, McCall's, Boehm's...
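
    The kind of model-based aggregation such tools rely on can be illustrated with a toy hierarchy (the weights, characteristics, and metric scores below are invented for the example and are not taken from the tool or from ISO 9126): normalized metric values roll up into weighted characteristic scores and an overall quality score.

    quality_model = {
        "maintainability": {"weight": 0.4, "metrics": {"comment_ratio": 0.7, "low_complexity": 0.5}},
        "portability":     {"weight": 0.3, "metrics": {"platform_tests": 0.9}},
        "reusability":     {"weight": 0.3, "metrics": {"low_coupling": 0.6, "cohesion": 0.8}},
    }

    def characteristic_score(metrics):
        # Simple mean of normalized metric scores in [0, 1].
        return sum(metrics.values()) / len(metrics)

    def overall_score(model):
        # Weighted sum of characteristic scores.
        return sum(c["weight"] * characteristic_score(c["metrics"]) for c in model.values())

    print(f"overall quality score: {overall_score(quality_model):.2f}")  # 0.72 for these values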

  8. Large Hospital 50% Energy Savings: Technical Support Document

    Energy Technology Data Exchange (ETDEWEB)

    Bonnema, E.; Studer, D.; Parker, A.; Pless, S.; Torcellini, P.

    2010-09-01

    This Technical Support Document documents the technical analysis and design guidance for large hospitals to achieve whole-building energy savings of at least 50% over ANSI/ASHRAE/IESNA Standard 90.1-2004 and represents a step toward determining how to provide design guidance for aggressive energy savings targets. This report documents the modeling methods used to demonstrate that the design recommendations meet or exceed the 50% goal. EnergyPlus was used to model the predicted energy performance of the baseline and low-energy buildings to verify that 50% energy savings are achievable. Percent energy savings are based on a nominal minimally code-compliant building and whole-building, net site energy use intensity. The report defines architectural-program characteristics for typical large hospitals, thereby defining a prototype model; creates baseline energy models for each climate zone that are elaborations of the prototype models and are minimally compliant with Standard 90.1-2004; creates a list of energy design measures that can be applied to the prototype model to create low-energy models; uses industry feedback to strengthen inputs for baseline energy models and energy design measures; and simulates low-energy models for each climate zone to show that when the energy design measures are applied to the prototype model, 50% energy savings (or more) are achieved.

  9. Optimum profit model considering production, quality and sale problem

    Science.gov (United States)

    Chen, Chung-Ho; Lu, Chih-Lun

    2011-12-01

    Chen and Liu ['Procurement Strategies in the Presence of the Spot Market-an Analytical Framework', Production Planning and Control, 18, 297-309] presented an optimum profit model between producers and purchasers for a supply chain system with a pure procurement policy. However, their model, which uses a simple manufacturing cost, did not consider the cost incurred by the customer during use. In this study, a modified Chen and Liu model is addressed for determining the optimum product and process parameters. The authors propose a modified Chen and Liu model under a two-stage screening procedure. The surrogate variable, having a high correlation with the measurable quality characteristic, is directly measured in the first stage. The measurable quality characteristic is directly measured in the second stage when the product decision cannot be made in the first stage. The customer's cost of use is measured by adopting Taguchi's quadratic quality loss function. The optimum purchaser's order quantity, the producer's product price, and the process quality level are jointly determined by maximising the expected profit between them.
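
    The customer's cost of use enters such a model through Taguchi's quadratic loss, L(y) = k(y - m)^2 for a measured value y and target m; for a process with mean mu and standard deviation sigma the expected loss is k[(mu - m)^2 + sigma^2]. A small sketch with illustrative numbers (not taken from the paper):

    def expected_taguchi_loss(k, target, mu, sigma):
        # E[L] = k * ((mu - m)^2 + sigma^2): bias and variance both add to the customer's loss.
        return k * ((mu - target) ** 2 + sigma ** 2)

    # Example: loss coefficient k = 2.5 (cost per squared unit of deviation),
    # target 10.0, process running at mean 10.2 with standard deviation 0.3.
    print(expected_taguchi_loss(k=2.5, target=10.0, mu=10.2, sigma=0.3))  # 0.325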

  10. Using simulation training to improve shoulder dystocia documentation.

    Science.gov (United States)

    Goffman, Dena; Heo, Hye; Chazotte, Cynthia; Merkatz, Irwin R; Bernstein, Peter S

    2008-12-01

    To estimate whether shoulder dystocia documentation could be improved with a simulation-based educational experience. Obstetricians at our institution (n=71) participated in an unanticipated simulated shoulder dystocia followed by an educational debriefing session. A second shoulder dystocia simulation was completed at a later date. Delivery notes were a required component of each simulation. Notes were evaluated using a standardized checklist for 16 key components. One point was awarded for each element present. Wilcoxon signed rank tests were used to compare documentation between simulations. Participants consisted of 43 (61%) attending and 28 (39%) resident physicians. Ages ranged from 25-63 years (mean+/-standard deviation 37.0+/-9.0), and 75% were female. Years of obstetric experience for our attendings ranged from 4 to 31 years (14.5+/-8.1). Documentation scores were significantly improved after training. Attendings' baseline documentation scores were 8.5+/-2.2 and improved to 9.4+/-2.3, P=.03. Residents' documentation scores also improved (9.0+/-2.1 compared with 10.6+/-2.2, P=.001). In particular, improvement was seen in two components of documentation: 1) providers present for shoulder dystocia (P=.007) and 2) which shoulder was anterior (P<.001). No improvement was seen in standard delivery note components (eg, date, time) or infant characteristics (eg, weight, Apgar scores). Although we showed a significant improvement in the quality of documentation through this simulation program, notes were still suboptimal. Use of standardized forms for shoulder dystocia delivery notes may provide the best solution to ensure appropriate documentation. II.

  11. Quality-assurance and data-management plan for water-quality activities in the Kansas Water Science Center, 2014

    Science.gov (United States)

    Rasmussen, Teresa J.; Bennett, Trudy J.; Foster, Guy M.; Graham, Jennifer L.; Putnam, James E.

    2014-01-01

    As the Nation’s largest water, earth, and biological science and civilian mapping information agency, the U.S. Geological Survey is relied on to collect high-quality data, and produce factual and impartial interpretive reports. This quality-assurance and data-management plan provides guidance for water-quality activities conducted by the Kansas Water Science Center. Policies and procedures are documented for activities related to planning, collecting, storing, documenting, tracking, verifying, approving, archiving, and disseminating water-quality data. The policies and procedures described in this plan complement quality-assurance plans for continuous water-quality monitoring, surface-water, and groundwater activities in Kansas.

  12. Development of an event-driven parser for active document and web-based nuclear design system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Soo

    2005-02-15

    Nuclear design work consists of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code runs, output analysis, and a quality assurance process. The task of safety evaluation for the reload core is especially manpower-intensive and time-consuming because of the large number of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called the Innovative Design Processor (IDP) in order to minimize human effort, maximize design quality and productivity, and ultimately achieve an optimized core loading pattern. Two new basic principles of IDP are document-oriented design and web-based design. Contrary to conventional code-oriented or procedure-oriented design, document-oriented design is human-oriented in that the final document is automatically prepared with complete analyses, tables, and plots if the designer writes a design document, called an active document, and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is the other framework of IDP. Using an appropriate mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time savings were confirmed, and IDP can complete jobs that take several months in a few days. A more optimized core loading pattern can therefore be obtained, since it takes little time to perform the reload safety evaluation tasks with several core loading pattern candidates. Since the technology is also applicable to

  13. Development of an event-driven parser for active document and web-based nuclear design system

    International Nuclear Information System (INIS)

    Park, Yong Soo

    2005-02-01

    Nuclear design work consists of extensive unit job modules in which many computer codes are used. Each unit module requires time-consuming and error-prone input preparation, code runs, output analysis, and a quality assurance process. The task of safety evaluation for the reload core is especially manpower-intensive and time-consuming because of the large number of calculations and data exchanges. The purpose of this study is to develop a new nuclear design system called the Innovative Design Processor (IDP) in order to minimize human effort, maximize design quality and productivity, and ultimately achieve an optimized core loading pattern. Two new basic principles of IDP are document-oriented design and web-based design. Contrary to conventional code-oriented or procedure-oriented design, document-oriented design is human-oriented in that the final document is automatically prepared with complete analyses, tables, and plots if the designer writes a design document, called an active document, and feeds it to a parser. This study defined a number of active components and developed an event-driven parser for the active document in HTML (Hypertext Markup Language) or XML (Extensible Markup Language). The active documents can be created on the web, which is the other framework of IDP. Using an appropriate mix of server-side and client-side programming under the HAMP (HP-UX/Apache/MySQL/PHP) environment, the document-oriented design process on the web is modeled as a design wizard for the designer's convenience and platform independence. This automation using IDP was tested for the reload safety evaluation of Korea Standard Nuclear Power Plant (KSNP) type PWRs. Great time savings were confirmed, and IDP can complete jobs that take several months in a few days. A more optimized core loading pattern can therefore be obtained, since it takes little time to perform the reload safety evaluation tasks with several core loading pattern candidates. Since the technology is also applicable to the
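
    The event-driven parsing idea described in the two records above can be sketched with Python's standard SAX interface (the XML element names and actions below are invented for illustration and are not the IDP's actual active-document vocabulary): the handler reacts to elements as they are encountered rather than building the whole document tree first.

    import xml.sax

    ACTIVE_DOC = b"""<design>
      <calc code="depletion" input="cycle17.inp"/>
      <table source="power_history"/>
    </design>"""

    class ActiveDocHandler(xml.sax.ContentHandler):
        def startElement(self, name, attrs):
            # Each start-element event triggers the corresponding design action.
            if name == "calc":
                print(f"run code {attrs['code']} with input {attrs['input']}")
            elif name == "table":
                print(f"insert table built from {attrs['source']}")

    xml.sax.parseString(ACTIVE_DOC, ActiveDocHandler())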

  14. Underground Test Area Activity Quality Assurance Plan Nevada National Security Site, Nevada. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Krenzien, Susan [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States); Farnham, Irene [Navarro-Intera, LLC (N-I), Las Vegas, NV (United States)

    2015-06-01

    This Quality Assurance Plan (QAP) provides the overall quality assurance (QA) requirements and general quality practices to be applied to the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Field Office (NNSA/NFO) Underground Test Area (UGTA) activities. The requirements in this QAP are consistent with DOE Order 414.1D, Change 1, Quality Assurance (DOE, 2013a); U.S. Environmental Protection Agency (EPA) Guidance for Quality Assurance Project Plans for Modeling (EPA, 2002); and EPA Guidance on the Development, Evaluation, and Application of Environmental Models (EPA, 2009). If a participant’s requirement document differs from this QAP, the stricter requirement will take precedence. NNSA/NFO, or designee, must review this QAP every two years. Changes that do not affect the overall scope or requirements will not require an immediate QAP revision but will be incorporated into the next revision cycle after identification. Section 1.0 describes UGTA objectives, participant responsibilities, and administrative and management quality requirements (i.e., training, records, procurement). Section 1.0 also details data management and computer software requirements. Section 2.0 establishes the requirements to ensure newly collected data are valid, existing data uses are appropriate, and environmental-modeling methods are reliable. Section 3.0 provides feedback loops through assessments and reports to management. Section 4.0 provides the framework for corrective actions. Section 5.0 provides references for this document.

  15. Model documentation coal market module of the National Energy Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This report documents the approaches used in developing the Annual Energy Outlook 1995 (AEO95). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of the coal market module's three submodules. These are the Coal Production Submodule (CPS), the Coal Export Submodule (CES), and the Coal Distribution Submodule (CDS).

  16. Model documentation coal market module of the National Energy Modeling System

    International Nuclear Information System (INIS)

    1995-03-01

    This report documents the approaches used in developing the Annual Energy Outlook 1995 (AEO95). This report catalogues and describes the assumptions, methodology, estimation techniques, and source code of the coal market module's three submodules. These are the Coal Production Submodule (CPS), the Coal Export Submodule (CES), and the Coal Distribution Submodule (CDS).

  17. Data Model and Relational Database Design for Highway Runoff Water-Quality Metadata

    Science.gov (United States)

    Granato, Gregory E.; Tessler, Steven

    2001-01-01

    A National highway and urban runoff water-quality metadatabase was developed by the U.S. Geological Survey in cooperation with the Federal Highway Administration as part of the National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS). The database was designed to catalog available literature and to document results of the synthesis in a format that would facilitate current and future research on highway and urban runoff. This report documents the design and implementation of the NDAMS relational database, which was designed to provide a catalog of available information and the results of an assessment of the available data. All the citations and the metadata collected during the review process are presented in a stratified metadatabase that contains citations for relevant publications, abstracts (or previa), and report-review metadata for a sample of selected reports that document results of runoff quality investigations. The database is referred to as a metadatabase because it contains information about available data sets rather than a record of the original data. The database contains the metadata needed to evaluate and characterize how valid, current, complete, comparable, and technically defensible published and available information may be when evaluated for application to the different data-quality objectives as defined by decision makers. This database is a relational database, in that all information is ultimately linked to a given citation in the catalog of available reports. The main database file contains 86 tables consisting of 29 data tables, 11 association tables, and 46 domain tables. The data tables all link to a particular citation, and each data table is focused on one aspect of the information collected in the literature search and the evaluation of available information. This database is implemented in the Microsoft (MS) Access database software because it is widely used within and outside of government and is familiar to many
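
    The "everything links back to a citation" design can be sketched in a few lines of SQL (run here through Python's sqlite3 for self-containment; the table and column names are hypothetical, not the actual NDAMS schema):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE citation (
        citation_id INTEGER PRIMARY KEY,
        authors     TEXT,
        title       TEXT,
        year        INTEGER
    );
    CREATE TABLE report_review (                    -- one of several data tables
        review_id   INTEGER PRIMARY KEY,
        citation_id INTEGER NOT NULL REFERENCES citation(citation_id),
        constituent TEXT,                           -- e.g. 'total phosphorus'
        qa_rating   TEXT                            -- documentation/QA grading from the review
    );
    """)
    con.execute("INSERT INTO citation VALUES (1, 'Smith, J.', 'Runoff study', 1998)")
    con.execute("INSERT INTO report_review VALUES (1, 1, 'total phosphorus', 'documented')")
    for row in con.execute("SELECT c.title, r.constituent, r.qa_rating FROM report_review r "
                           "JOIN citation c USING (citation_id)"):
        print(row)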

  18. Innovations in projecting emissions for air quality modeling ...

    Science.gov (United States)

    Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality management strategy has climate change implications is encouraging longer modeling time horizons. However, for multi-decadal time horizons, many questions about future conditions arise. For example, will current population, economic, and land use trends continue, or will we see shifts that may alter the spatial and temporal pattern of emissions? Similarly, will technologies such as building-integrated solar photovoltaics, battery storage, electric vehicles, and CO2 capture emerge as disruptive technologies - shifting how we produce and use energy - or will these technologies achieve only niche markets and have little impact? These are some of the questions that are being evaluated by researchers within the U.S. EPA’s Office of Research and Development. In this presentation, Dr. Loughlin will describe a range of analytical approaches that are being explored. These include: (i) the development of alternative scenarios of the future that can be used to evaluate candidate management strategies over wide-ranging conditions, (ii) the application of energy system models to project emissions decades into the future and to assess the environmental implications of new technologies, (iii) and methodo

  19. Development and application of air quality models at the US ...

    Science.gov (United States)

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  20. Non-intrusive speech quality assessment in simplified e-model

    OpenAIRE

    Vozňák, Miroslav

    2012-01-01

    The E-model brings a modern approach to the computation of estimated quality, allowing for easy implementation. One of its advantages is that it can be applied in real time. The method is based on a mathematical computation model that evaluates transmission path impairments influencing the speech signal, especially delays and packet losses. These parameters, common in an IP network, can affect speech quality dramatically. The paper deals with a proposal for a simplified E-model and its pr...
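
    A simplified E-model computation of this kind can be sketched as follows (hedged: the coefficients follow widely used simplifications of ITU-T G.107, such as the Cole-Rosenbluth delay term, and are not necessarily the exact parameterization proposed in the paper):

    def r_factor(delay_ms, loss_pct, ie=0.0, bpl=10.0, r0=93.2):
        # Delay impairment Id: a common simplification with an extra penalty above ~177 ms.
        id_ = 0.024 * delay_ms + (0.11 * (delay_ms - 177.3) if delay_ms > 177.3 else 0.0)
        # Effective equipment impairment Ie,eff grows with the packet-loss percentage.
        ie_eff = ie + (95.0 - ie) * loss_pct / (loss_pct + bpl)
        return r0 - id_ - ie_eff

    def mos(r):
        # Standard mapping from the R factor to a mean opinion score (valid for 0 < R < 100).
        r = max(0.0, min(100.0, r))
        return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6

    r = r_factor(delay_ms=150.0, loss_pct=1.0)
    print(f"R = {r:.1f}, estimated MOS = {mos(r):.2f}")   # roughly R = 81, MOS = 4.1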