WorldWideScience

Sample records for source documents analyze

  1. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.

    Science.gov (United States)

    MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J

    2010-04-01

    Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.

  2. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
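The notebook tool chain described above (free Python analysis packages run inside a Jupyter notebook) can be sketched in a few lines. Below is a minimal threshold-crossing spike detector of the kind used as a first pass before spike sorting; the sampling rate, threshold, and synthetic trace are illustrative assumptions, not values from the paper.

```python
import numpy as np

def detect_spikes(trace, fs, threshold):
    """Return spike times (in seconds) where the signal crosses threshold upward."""
    above = trace > threshold
    # Rising edges: False -> True transitions between consecutive samples
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return crossings / fs

fs = 10_000                              # sampling rate in Hz (illustrative)
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, fs)         # 1 s of baseline noise
for t in (0.2, 0.5, 0.8):
    trace[int(t * fs)] += 2.0            # inject three artificial "spikes"

spike_times = detect_spikes(trace, fs, threshold=1.0)
```

In a notebook, a cell like this would be followed by plotting and statistical validation cells, and the whole record exported to HTML or PDF for sharing, as the authors describe.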

  3. Using Primary Source Documents.

    Science.gov (United States)

    Mintz, Steven

    2003-01-01

    Explores the use of primary sources when teaching about U.S. slavery. Includes primary sources from the Gilder Lehrman Documents Collection (New York Historical Society) to teach about the role of slaves in the Revolutionary War, such as a proclamation from Lord Dunmore offering freedom to slaves who joined his army. (CMK)

  4. Analyzing and Interpreting Historical Sources

    DEFF Research Database (Denmark)

    Kipping, Matthias; Wadhwani, Dan; Bucheli, Marcelo

    2014-01-01

    This chapter outlines a methodology for the interpretation of historical sources, helping to realize their full potential for the study of organization, while overcoming their challenges in terms of distortions created by time, changes in context, and selective production or preservation. Drawing... The chapter contributes to the creation of a language for describing the use of historical sources in management research...

  5. Gaz de France. Source document

    International Nuclear Information System (INIS)

    2005-01-01

    This document was issued by Gaz de France, the French gas utility, on the occasion of the opening of the company's capital. It is intended for shareholders and presents information on the shares admitted to Euronext's Eurolist, general information about the company and its capital, the activities of the Gaz de France group, its financial situation and results, its management, and its recent evolution and future perspectives. (J.S.)

  6. Supplemental Information Source Document Waste Management

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Craig [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Halpern, Jonathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wrons, Ralph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reiser, Anita [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mond, Michael du [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shain, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    This Supplemental Information Source Document for Waste Management was prepared in support of future analyses including those that may be performed as part of the Sandia National Laboratories, New Mexico (SNL/NM) Site-Wide Environmental Impact Statement. This document presents information about waste management practices at SNL/NM, including definitions, inventory data, and an overview of current activities.

  7. TWRS configuration management requirement source document

    International Nuclear Information System (INIS)

    Vann, J.M.

    1997-01-01

    The TWRS Configuration Management (CM) Requirement Source document prescribes CM as a basic product life-cycle function by which work and activities are conducted or accomplished. This document serves as the requirements basis for the TWRS CM program. The objective of the TWRS CM program is to establish consistency among requirements, physical/functional configuration, information, and documentation for TWRS and TWRS products, and to maintain this consistency throughout the life cycle of TWRS and the product, particularly as changes are made.

  8. Analyzing Red Tape: The Performative vs Informative Roles of Bureaucratic Documents

    OpenAIRE

    Lee, R.M.

    1980-01-01

    The preparation and transfer of documents in bureaucratic procedures are generally viewed solely as a means of transferring information within the organization. When taken as the basis for analyzing and improving bureaucratic systems, this view is too narrow. Another, performative aspect of these documents also needs to be considered in the analysis. This paper elaborates on this additional function of organizational documents and points out the need for a broader framework for analyzing bu...

  9. Radiometric analyzer with plural radiation sources and detectors

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring characteristics of a material by radiation comprises a plurality of systems, each consisting of a radiation source and a radiation detector, equal in number to the number of elements in the molecule of the material, together with a linear calibration circuit having the inverse response characteristics (calibration curves) of the respective detector systems, whereby the measurement is carried out by the four fundamental arithmetic operations on the mutual outputs of the detector systems obtained through the linear calibration circuit. One typical embodiment is a radiometric analyzer for hydrocarbons which measures the density of heavy oil, the sulfur content, and the calorific value using three detector systems: a γ-ray source (Eγ > 50 keV), a soft x-ray source (Ex ≈ 20 keV), and a neutron source. 2 claims, 6 figures
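The analyzer's scheme of recovering as many material properties as there are detector systems, via inverse calibration characteristics, amounts to inverting a response model. Here is a sketch under the simplifying assumption of a linear response; the sensitivity matrix and readings below are invented for illustration and do not come from the patent.

```python
import numpy as np

# Hypothetical linear sensitivities of 3 detectors (gamma, soft x-ray, neutron)
# to 3 properties (density, sulfur content, calorific value).
A = np.array([
    [1.0, 0.2, 0.1],   # gamma-ray attenuation response
    [0.3, 1.5, 0.0],   # soft x-ray response (most sensitive to sulfur)
    [0.1, 0.0, 0.8],   # neutron response
])
readings = np.array([1.05, 1.95, 0.85])   # hypothetical detector outputs

# "Inverse response characteristics": solve the calibration model for the
# material properties given the detector readings.
properties = np.linalg.solve(A, readings)
```

In the patented device the calibration curves are generally nonlinear, so the linear solve stands in for the analog calibration circuits described in the claims.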

  10. Health physics source document for codes of practice

    International Nuclear Information System (INIS)

    Pearson, G.W.; Meggitt, G.C.

    1989-05-01

    Personnel preparing codes of practice often require basic Health Physics information or advice relating to radiological protection problems, and this document is written primarily to supply such information. Certain technical terms used in the text are explained in the extensive glossary. Due to the pace of change in the field of radiological protection it is difficult to produce an up-to-date document. This document was compiled during 1988, however, and therefore contains the principal changes brought about by the introduction of the Ionising Radiations Regulations (1985). The paper covers the nature of ionising radiation, its biological effects and the principles of control. It is hoped that the document will provide a useful source of information for both codes of practice and wider areas and stimulate readers to study radiological protection issues in greater depth. (author)

  11. Mass analyzer "MASHA" high temperature target and plasma ion source

    Science.gov (United States)

    Semchenkov, A. G.; Rassadov, D. N.; Bekhterev, V. V.; Bystrov, V. A.; Chizov, A. Yu.; Dmitriev, S. N.; Efremov, A. A.; Guljaev, A. V.; Kozulin, E. M.; Oganessian, Yu. Ts.; Starodub, G. Ya.; Voskresensky, V. M.; Bogomolov, S. L.; Paschenko, S. V.; Zelenak, A.; Tikhonov, V. I.

    2004-05-01

    A new separator and mass analyzer of superheavy atoms (MASHA) has been created at the FLNR JINR, Dubna, to separate and measure masses of nuclei and molecules with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed, giving an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We plan to optimize the magnetic field, use an additional electrode (einzel-lens type) in the extraction system, and improve the vacuum conditions in order to increase the ion source efficiency.

  12. Mass analyzer 'MASHA' high temperature target and plasma ion source

    International Nuclear Information System (INIS)

    Semchenkov, A.G.; Rassadov, D.N.; Bekhterev, V.V.; Bystrov, V.A.; Chizov, A.Yu.; Dmitriev, S.N.; Efremov, A.A.; Guljaev, A.V.; Kozulin, E.M.; Oganessian, Yu.Ts.; Starodub, G.Ya.; Voskresensky, V.M.; Bogomolov, S.L.; Paschenko, S.V.; Zelenak, A.; Tikhonov, V.I.

    2004-01-01

    A new separator and mass analyzer of superheavy atoms (MASHA) has been created at the FLNR JINR, Dubna, to separate and measure masses of nuclei and molecules with a precision better than 10⁻³. First experiments with the FEBIAD plasma ion source have been performed, giving an ionization efficiency of up to 20% for Kr with a low-flow test leak (6 particle μA). We plan to optimize the magnetic field, use an additional electrode (einzel-lens type) in the extraction system, and improve the vacuum conditions in order to increase the ion source efficiency.

  13. Advanced Photon Source experimental beamline Safety Assessment Document: Addendum to the Advanced Photon Source Accelerator Systems Safety Assessment Document (APS-3.2.2.1.0)

    International Nuclear Information System (INIS)

    1995-01-01

    This Safety Assessment Document (SAD) addresses commissioning and operation of the experimental beamlines at the Advanced Photon Source (APS). The purpose of this document is to identify and describe the hazards associated with commissioning and operation of these beamlines and to document the measures taken to minimize these hazards and mitigate their consequences. The potential hazards associated with the commissioning and operation of the APS facility have been identified and analyzed. Physical and administrative controls mitigate identified hazards. No hazard exists in this facility that has not been previously encountered and successfully mitigated in other accelerator and synchrotron radiation research facilities. This document is an updated version of the APS Preliminary Safety Analysis Report (PSAR). During the review of the PSAR in February 1990, the APS was determined to be a Low Hazard Facility. On June 14, 1993, the Acting Director of the Office of Energy Research endorsed the designation of the APS as a Low Hazard Facility, and this Safety Assessment Document supports that designation.

  14. Writing in the workplace: Constructing documents using multiple digital sources

    Directory of Open Access Journals (Sweden)

    Mariëlle Leijten

    2014-02-01

    Full Text Available. In today's workplaces professional communication often involves constructing documents from multiple digital sources—integrating one's own texts/graphics with ideas based on others' text/graphics. This article presents a case study of a professional communication designer as he constructs a proposal over several days. Drawing on keystroke and interview data, we map the professional's overall process, plot the time course of his writing/design, illustrate how he searches for content and switches among optional digital sources, and show how he modifies and reuses others' content. The case study reveals not only that the professional (1) searches extensively through multiple sources for content and ideas but that he also (2) constructs visual content (charts, graphs, photographs) as well as verbal content, and (3) manages his attention and motivation over this extended task. Since these three activities are not represented in current models of writing, we propose their addition not just to models of communication design, but also to models of writing in general.

  15. FINANCIAL REPORTING AND SOURCE DOCUMENTS OF UKRAINIAN ENTERPRISES WHEN APPLYING THE IFRS

    Directory of Open Access Journals (Sweden)

    G. Golubnicha

    2013-08-01

    Full Text Available. The theoretical, methodological and practical aspects of changes in financial reporting and source documents of Ukrainian enterprises under the new conditions resulting from the application of International Financial Reporting Standards have been analyzed. A conceptual approach to defining the patterns of change in financial reporting and in the elements of the accounting method is also proposed. The issue of internal quality control of analytical accounting information at various stages of its formation has been researched.

  16. ANALYZED FACTORS THAT LEADS TO THE BALANCED SCORECARD NURSING CARE DOCUMENTATION AT RUMAH SAKIT JIWA MENUR SURABAYA

    Directory of Open Access Journals (Sweden)

    Yuli Anggraini

    2017-04-01

    Full Text Available. Introduction: Nursing documentation is an important aspect of nursing practice and should therefore be assessed comprehensively. The objective of the study was to analyze the causal factors of nursing care documentation at Rumah Sakit Jiwa Menur Surabaya through the balanced scorecard. Method: This descriptive analytical study was conducted in January 2010 at Rumah Sakit Jiwa Menur Surabaya and measured nursing care documentation through the four perspectives of the balanced scorecard by distributing questionnaires to 55 nurses and 69 customers (patient families) selected using inclusion criteria, and by holding personal interviews with 3 structural officials, 2 functional officials, and 6 ward supervisors. Data on nurse education and the percentage of trained nurses were gathered by checklist. Data were analyzed using content analysis to find the causal factors of nursing documentation within the balanced scorecard. Result: The results showed that the financial, internal business process, and learning and growth perspectives had a causal relationship with nursing care documentation at Rumah Sakit Jiwa Menur Surabaya, but the customer perspective did not have a direct causal relationship with it. Discussion: It can be concluded that impractical nursing documentation forms (especially regarding time spent on assessment, implementation, and evaluation, and comprehensiveness of assessment), the absence of physical nursing standards, limited knowledge of nursing documentation owing to the absence of in-house training, and ineffective supervision and audit were the factors affecting nursing documentation at Rumah Sakit Jiwa Menur Surabaya. The researcher recommends that the hospital manager modify the nursing documentation form using the NIC and NOC of NANDA and a computerized system, compose physical nursing standards, carry out advanced nursing education and in-house training on nursing care documentation, and improve the supervision program and the nursing documentation audit.

  17. Documenting open source migration processes for re-use

    CSIR Research Space (South Africa)

    Gerber, A

    2010-10-01

    Full Text Available. There are several sources that indicate a remarkable increase in the adoption of open source software (OSS) into the technology infrastructure of organizations. In fact, the number of medium to large organizations without some OSS installations...

  18. Gaz de France. Source document; Gaz de France. Document de base

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This document was issued by Gaz de France, the French gas utility, on the occasion of the opening of the company's capital. It is intended for shareholders and presents information on the shares admitted to Euronext's Eurolist, general information about the company and its capital, the activities of the Gaz de France group, its financial situation and results, its management, and its recent evolution and future perspectives. (J.S.)

  19. RECON: a computer program for analyzing repository economics. Documentation and user's manual

    International Nuclear Information System (INIS)

    Clark, L.L.; Cole, B.M.; McNair, G.W.; Schutz, M.E.

    1983-05-01

    From 1981 through 1983 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through March of 1983. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input either using card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained.

  20. RECON: a computer program for analyzing repository economics. Documentation and user's manual. Revision 1

    International Nuclear Information System (INIS)

    Clark, L.L.; Schutz, M.E.; Luksic, A.T.

    1985-07-01

    From 1981 through 1984 the Pacific Northwest Laboratory has been developing a computer model named RECON to calculate repository costs from parametric data input. The objective of the program has been to develop the capability to evaluate the effect on costs of changes in repository design parameters and operating scenario assumptions. This report documents the development of the model through September of 1984. Included in the report are: (1) descriptions of model development and the underlying equations, assumptions and definitions; (2) descriptions of data input using either card images or an interactive data input program; and (3) detailed listings of the program and definitions of program variables. Cost estimates generated using the model have been verified against independent estimates and good agreement has been obtained. 2 refs

  1. Preparation of computer codes for analyzing sensitivity coefficients of burnup characteristics (2) (Contract research, translated document)

    International Nuclear Information System (INIS)

    Hanaki, Hiroshi; Sanda, Toshio; Ohashi, Masahisa

    2008-10-01

    For the nuclear design of LMFBR cores, improving the accuracy of nuclear design for large LMFBR cores and designing highly efficient cores more rationally are important subjects of research and development. An adjusted nuclear cross-section library has been produced, reflecting the results of critical experiments such as JUPITER as fully as possible, and a distinct improvement in the accuracy of nuclear design for large LMFBR cores has been achieved. In the design of large LMFBR cores, however, it is important to accurately estimate not only nuclear characteristics, such as reaction rate distribution and control rod worth, but also burnup characteristics, such as burnup reactivity loss and breeding ratio. It is therefore expected that the prediction accuracy for burnup characteristics can be improved by making effective use of the extensive burnup data from 'Joyo'. The best way to utilize the 'Joyo' burnup data is considered to be cross-section adjustment using sensitivity coefficients of burnup characteristics; by analyzing these sensitivity coefficients, the accuracy of burnup characteristics for large LMFBRs can be estimated quantitatively. Computer codes for analyzing sensitivity coefficients of burnup characteristics have therefore been prepared since 1992. In 1992, cross-section adjustment was performed using the 'Joyo' data and its effect was studied. This year, the adequacy of the codes was studied with a view to applying them to the design of large LMFBR cores. The results are as follows: (1) Computer codes were prepared that can analyze sensitivity coefficients of burnup characteristics taking plural cycles and refueling into consideration, making it possible to adjust cross sections using burnup data and to estimate the accuracy for the design of large LMFBR cores. The characteristics include not only burnup reactivity loss and breeding ratio but also number density, criticality, reactivity worth, reaction rate ratio, and reaction rate
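A sensitivity coefficient of the kind the codes compute expresses the relative change of a burnup characteristic R per relative change of a cross section σ, i.e. S = (σ/R)·dR/dσ. A minimal numerical sketch follows; the power-law response function is an arbitrary stand-in for illustration, not an actual core characteristic.

```python
def sensitivity(response, sigma, rel_step=1e-4):
    """Relative sensitivity S = (sigma/R) * dR/dsigma, by central differences."""
    h = sigma * rel_step
    dR = (response(sigma + h) - response(sigma - h)) / (2.0 * h)
    return sigma * dR / response(sigma)

# Toy response: a characteristic varying as sigma**0.7 (illustrative only).
# For a pure power law R = sigma**k, the relative sensitivity is exactly k.
R = lambda s: s ** 0.7
S = sensitivity(R, 2.5)
```

Cross-section adjustment then weights measured-minus-calculated discrepancies (e.g. from 'Joyo' burnup data) by such coefficients to update the library.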

  2. 2011 Addendum to the SNL/NM SWEIS Supplemental Information Source Documents

    Energy Technology Data Exchange (ETDEWEB)

    Dimmick, Ross [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-12-01

    This document contains updates to the Supplemental Information Sandia National Laboratories/New Mexico Site-Wide Environmental Impact Statement Source Documents that were developed in 2010. In general, this addendum provides calendar year 2010 data, along with changes or additions to text in the original documents.

  3. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    Science.gov (United States)

    1989-06-01

    M: Reference Manual. PRICE H: Training Course Workbook. 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical... Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  4. SCATTER: Source and Transport of Emplaced Radionuclides: Code documentation

    International Nuclear Information System (INIS)

    Longsine, D.E.

    1987-03-01

    SCATTER simulates several processes leading to the release of radionuclides to the site subsystem and then simulates transport of the released radionuclides via groundwater to the biosphere. The processes accounted for in quantifying release rates to a groundwater migration path include radioactive decay and production, leaching, solubilities, and the mixing of particles with incoming uncontaminated fluid. Several decay chains of arbitrary length can be considered simultaneously. The release rates then serve as source rates to a numerical technique that solves convective-dispersive transport for each decay chain. The decay chains are allowed to have branches, and each member can have a different retardation factor. Results are cast as radionuclide discharge rates to the accessible environment
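The decay-and-production bookkeeping that SCATTER performs for each chain member can be illustrated with the analytic two-member Bateman solution (parent decaying into a daughter). The half-lives below are arbitrary illustrations, not SCATTER inputs.

```python
import numpy as np

def bateman_two_member(N1_0, lam1, lam2, t):
    """Analytic Bateman solution for a parent -> daughter decay chain.

    N1_0: initial parent inventory; lam1, lam2: decay constants (lam1 != lam2).
    Returns parent and daughter inventories at times t (daughter starts at 0).
    """
    N1 = N1_0 * np.exp(-lam1 * t)
    N2 = N1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
    return N1, N2

lam1 = np.log(2) / 100.0    # parent half-life: 100 time units (illustrative)
lam2 = np.log(2) / 10.0     # daughter half-life: 10 time units (illustrative)
t = np.linspace(0.0, 50.0, 6)
N1, N2 = bateman_two_member(1.0, lam1, lam2, t)
```

A code like SCATTER couples such chain inventories with leaching, solubility limits, and convective-dispersive transport rather than treating decay in isolation.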

  5. Source document for waste area groupings at Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Osborne, P.L.; Kuhaida, A.J., Jr.

    1996-09-01

    This document serves as a source document for Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and other types of documents developed for and pertaining to Environmental Restoration (ER) Program activities at Oak Ridge National Laboratory (ORNL). It contains descriptions of the (1) regulatory requirements for the ORR ER Program, (2) Oak Ridge Reservation (ORR) ER Program, (3) ORNL site history and characterization, and (4) history and characterization of Waste Area Groupings (WAGS) 1-20. This document was created to save time, effort, and money for persons and organizations drafting documents for the ER Program and to improve consistency in the documents prepared for the program. By eliminating the repetitious use of selected information about the program, this document will help reduce the time and costs associated with producing program documents. By serving as a benchmark for selected information about the ER Program, this reference will help ensure that information presented in future documents is accurate and complete

  6. Pinon Pine Tree Study, Los Alamos National Laboratory: Source document

    International Nuclear Information System (INIS)

    Gonzales, G.J.; Fresquez, P.R.; Mullen, M.A.; Naranjo, L. Jr.

    2000-01-01

    One of the dominant tree species growing within and around Los Alamos National Laboratory (LANL), Los Alamos, NM, lands is the pinon pine (Pinus edulis) tree. Pinon pine is used for firewood, fence posts, and building materials and is a source of nuts for food--the seeds are consumed by a wide variety of animals and are also gathered by people in the area and eaten raw or roasted. This study investigated the (1) concentrations of ³H, ¹³⁷Cs, ⁹⁰Sr, total U, ²³⁸Pu, ²³⁹,²⁴⁰Pu, and ²⁴¹Am in soils (0- to 12-in. [31-cm] depth underneath the tree), pinon pine shoots (PPS), and pinon pine nuts (PPN) collected from LANL lands and regional background (BG) locations, (2) concentrations of radionuclides in PPN collected from 1977 to the present, (3) committed effective dose equivalent (CEDE) from the ingestion of nuts, and (4) soil-to-PPS-to-PPN concentration ratios (CRs). Most radionuclides, with the exception of ³H in soils, were not significantly higher (p < 0.10) in soils, PPS, and PPN collected from LANL as compared to BG locations, and concentrations of most radionuclides in PPN from LANL have decreased over time. The maximum net CEDE (the CEDE plus two sigma minus BG) at the most conservative ingestion rate (10 lb [4.5 kg]) was 0.0018 mrem (0.018 μSv). Soil-to-nut CRs for most radionuclides were within the range of default values in the literature for common fruits and vegetables

  7. How Do Open Source Communities Document Software Architecture: An Exploratory Survey

    NARCIS (Netherlands)

    Ding, W.; Liang, P.; Tang, A.; Van Vliet, H.; Shahin, M.

    2014-01-01

    Software architecture (SA) documentation provides a blueprint of a software-intensive system for the communication between stakeholders about the high-level design of the system. In open source software (OSS) development, a lack of SA documentation may hinder the use and further development of OSS,

  8. A methodology for improving the SIS-RT in analyzing the traceability of the documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Kim, Man Cheol; Seong, Poong Hyun

    2002-01-01

    Inspection is widely believed to be an effective software verification and validation (V&V) method. However, software inspection is labor-intensive. This labor-intensive nature is compounded by a view that, since software inspections use little technology, they do not fit in well with a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The researchers of the KAIST I&C laboratory developed a software tool for managing and supporting inspection tasks, named SIS-RT. SIS-RT is designed to partially automate the software inspection process. SIS-RT supports the analysis of traceability between spec documents. To prepare SIS-RT for spec documents written in the Korean language, certain techniques in natural language processing have been reviewed. Among those, case grammar is the most suitable for the analysis of the Korean language. In this paper, a methodology for analyzing the traceability between spec documents written in the Korean language is proposed based on case grammar.
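Tools like SIS-RT score candidate trace links between statements in different spec documents. The case-grammar analysis itself is language-specific, but the kind of lexical overlap scoring such traceability tools build on can be sketched simply; the requirement and design sentences below are invented examples, not material from the paper.

```python
def jaccard(a: str, b: str) -> float:
    """Lexical overlap (Jaccard index) between two specification statements."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

req = "the pump shall stop when pressure exceeds the limit"
design = "stop the pump if measured pressure exceeds the configured limit"
score = jaccard(req, design)   # higher score suggests a candidate trace link
```

A case-grammar approach goes further by matching the semantic roles (agent, object, condition) of each clause, which matters for Korean, where word order alone is a weak signal.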

  9. Sources of patient uncertainty when reviewing medical disclosure and consent documentation.

    Science.gov (United States)

    Donovan-Kicken, Erin; Mackert, Michael; Guinn, Trey D; Tollison, Andrew C; Breckinridge, Barbara

    2013-02-01

    Despite evidence that medical disclosure and consent forms are ineffective at communicating the risks and hazards of treatment and diagnostic procedures, little is known about exactly why they are difficult for patients to understand. The objective of this research was to examine what features of the forms increase people's uncertainty. Interviews were conducted with 254 individuals. After reading a sample consent form, participants described what they found confusing in the document. With uncertainty management as a theoretical framework, interview responses were analyzed for prominent themes. Four distinct sources of uncertainty emerged from participants' responses: (a) language, (b) risks and hazards, (c) the nature of the procedure, and (d) document composition and format. Findings indicate the value of simplifying medico-legal jargon, signposting definitions of terms, removing language that addresses multiple readers simultaneously, reorganizing bulleted lists of risks, and adding section breaks or negative space. These findings offer suggestions for providing more straightforward details about risks and hazards to patients, not necessarily through greater amounts of information but rather through more clear and sufficient material and better formatting. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. Analyzing Capabilities of Commercial and Open-Source Routers to Implement Atomic BGP

    Directory of Open Access Journals (Sweden)

    A. Cvjetić

    2010-06-01

    Full Text Available. The paper analyzes implementations of the BGP protocol on commercial and open-source routers and presents how some existing BGP extensions and routing table isolation mechanisms may be used to solve issues found in standard BGP implementations.

  11. Time-domain single-source integral equations for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe; Andriulli, Francesco P.; Bagci, Hakan; Michielssen, Eric

    2013-01-01

    Single-source time-domain electric- and magnetic-field integral equations for analyzing scattering from homogeneous penetrable objects are presented. Their temporal discretization is effected by using shifted piecewise polynomial temporal basis

  12. Advisory Committee on human radiation experiments. Supplemental Volume 2a, Sources and documentation appendices. Final report

    International Nuclear Information System (INIS)

    1995-01-01

    This large document provides a catalog of the location of large numbers of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Research and is arranged as a series of appendices. Titles of the appendices are Appendix A- Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B- Brief Descriptions of Records Accessions in the Advisory Committee on Human Radiation Experiments (ACHRE) Research Document Collection; Appendix C- Bibliography of Secondary Sources Used by ACHRE; Appendix D- Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E- Documents Cited in the ACHRE Final Report and other Separately Described Materials from the ACHRE Document Collection; Appendix F- Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G- Technology Note

  13. Advisory Committee on human radiation experiments. Supplemental Volume 2a, Sources and documentation appendices. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    This large document provides a catalog of the location of large numbers of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Research and is arranged as a series of appendices. Titles of the appendices are Appendix A- Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B- Brief Descriptions of Records Accessions in the Advisory Committee on Human Radiation Experiments (ACHRE) Research Document Collection; Appendix C- Bibliography of Secondary Sources Used by ACHRE; Appendix D- Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E- Documents Cited in the ACHRE Final Report and other Separately Described Materials from the ACHRE Document Collection; Appendix F- Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G- Technology Note.

  14. 76 FR 1173 - Draft Guidance for Industry on Electronic Source Documentation in Clinical Investigations...

    Science.gov (United States)

    2011-01-07

    ... Web page at http://www.fda.gov/RegulatoryInformation/Guidances/default.htm . FDA guidances are issued and updated regularly. We recommend you check the Web site to ensure that you have the most up-to-date... electronic diaries provided by study subjects. When paper source documents are available for review, tracing...

  15. Analyzing of economic growth based on electricity consumption from different sources

    Science.gov (United States)

    Maksimović, Goran; Milosavljević, Valentina; Ćirković, Bratislav; Milošević, Božidar; Jović, Srđan; Alizamir, Meysam

    2017-10-01

    Economic growth can be influenced by many different factors. In this study, economic growth was analyzed based on electricity consumption from different sources. Gross domestic product (GDP) was used as the economic growth indicator. ANFIS (adaptive neuro-fuzzy inference system) methodology was applied to determine, from the given set, the most important factors for GDP growth prediction. Six inputs were used: electricity production from coal, hydroelectric, natural gas, nuclear, oil and renewable sources. Results showed that electricity consumption from renewable sources has the highest impact on economic (GDP) growth prediction.
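
    The input-ranking step can be illustrated with a much simpler stand-in for ANFIS: fit a one-variable least-squares model per candidate input and rank the inputs by fit error. This is only a sketch of the selection idea, not the paper's ANFIS method, and the data below are made up for illustration.

```python
import math

def rmse_single_input(x, y):
    """Fit y = a + b*x by least squares and return the RMSE of the fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return math.sqrt(sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / n)

def rank_inputs(inputs, y):
    """Rank candidate inputs by how well each alone predicts y (lower RMSE = better)."""
    scores = {name: rmse_single_input(x, y) for name, x in inputs.items()}
    return sorted(scores, key=scores.get)

# Illustrative (made-up) series: GDP vs. electricity production by source.
gdp = [1.0, 1.3, 1.7, 2.2, 2.8]
inputs = {
    "renewables": [0.5, 0.7, 0.9, 1.2, 1.5],  # tracks GDP closely
    "coal": [2.0, 1.9, 2.1, 1.8, 2.0],        # roughly flat
}
print(rank_inputs(inputs, gdp))  # 'renewables' ranks first
```

    ANFIS additionally models nonlinear interactions with fuzzy rules; the single-input ranking shown here only captures the spirit of the importance screening.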

  16. The resolution of point sources of light as analyzed by quantum detection theory

    Science.gov (United States)

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  17. Resolution of point sources of light as analyzed by quantum detection theory.

    Science.gov (United States)

    Helstrom, C. W.

    1973-01-01

    The resolvability of point sources of incoherent thermal light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  18. A Novel Airborne Carbon Isotope Analyzer for Methane and Carbon Dioxide Source Fingerprinting

    Science.gov (United States)

    Berman, E. S.; Huang, Y. W.; Owano, T. G.; Leifer, I.

    2014-12-01

    Recent field studies on major sources of the important greenhouse gas methane (CH4) indicate significant underestimation of methane release from fossil fuel industrial (FFI) and animal husbandry sources, among others. In addition, uncertainties still exist with respect to carbon dioxide (CO2) measurements, especially source fingerprinting. CO2 isotopic analysis provides a valuable in situ measurement approach to fingerprint CH4 and CO2 as associated with combustion sources, leakage from geologic reservoirs, or biogenic sources. As a result, these measurements can characterize strong combustion source plumes, such as power plant emissions, and discriminate these emissions from other sources. As part of the COMEX (CO2 and MEthane eXperiment) campaign, a novel CO2 isotopic analyzer was installed and collected data aboard the CIRPAS Twin Otter aircraft. Developing methods to derive CH4 and CO2 budgets from remote sensing data is the goal of the summer 2014 COMEX campaign, which combines hyperspectral imaging (HSI) and non-imaging spectroscopy (NIS) with in situ airborne and surface data. COMEX leverages the synergy between high spatial resolution HSI and moderate spatial resolution NIS. The carbon dioxide isotope analyzer developed by Los Gatos Research (LGR) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology and incorporates proprietary internal thermal control for high sensitivity and optimal instrument stability. This analyzer measures CO2 concentration as well as δ13C, δ18O, and δ17O from CO2 at natural abundance (100-3000 ppm). The laboratory accuracy is ±1.2 ppm (1σ) in CO2 from 370-1000 ppm, with a long-term (1000 s) precision of ±0.012 ppm. The long-term precision for both δ13C and δ18O is 0.04 ‰, and for δ17O is 0.06 ‰. The analyzer was field-tested as part of the COWGAS campaign, a pre-cursor campaign to COMEX in March 2014, where it successfully discriminated plumes related to combustion processes associated with

  19. Renewable energy sources. European Commission papers; Energies renouvelables. Documents de la Commission Europeenne

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The "Directive on the Promotion of Electricity from Renewable Sources of Energy in the Internal Electricity Market" was adopted in September 2001. Its purpose is to promote an increase in the contribution of renewable energy sources to electricity production in the internal market for electricity and to create a basis for a future Community framework. Energie-Cites provides in this document a summary of its opinion on the Green Paper and on Alterner II and gives a proposal for an Action Plan concerning the White Paper. (A.L.B.)

  20. Human Rights Texts: Converting Human Rights Primary Source Documents into Data.

    Science.gov (United States)

    Fariss, Christopher J; Linder, Fridolin J; Jones, Zachary M; Crabtree, Charles D; Biek, Megan A; Ross, Ana-Sophia M; Kaur, Taranamol; Tsai, Michael

    2015-01-01

    We introduce and make publicly available a large corpus of digitized primary source human rights documents which are published annually by monitoring agencies that include Amnesty International, Human Rights Watch, the Lawyers Committee for Human Rights, and the United States Department of State. In addition to the digitized text, we also make available and describe document-term matrices, which are datasets that systematically organize the word counts from each unique document by each unique term within the corpus of human rights documents. To contextualize the importance of this corpus, we describe the development of coding procedures in the human rights community and several existing categorical indicators that have been created by human coding of the human rights documents contained in the corpus. We then discuss how the new human rights corpus and the existing human rights datasets can be used with a variety of statistical analyses and machine learning algorithms to help scholars understand how human rights practices and reporting have evolved over time. We close with a discussion of our plans for dataset maintenance, updating, and availability.
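
    A document-term matrix of the kind the authors distribute can be built with a few lines of standard-library Python; the toy documents below are illustrative, not part of the corpus.

```python
from collections import Counter

def document_term_matrix(docs):
    """Build a document-term matrix: rows = documents, columns = sorted vocabulary."""
    tokenized = [doc.lower().split() for doc in docs]
    vocab = sorted(set(term for tokens in tokenized for term in tokens))
    matrix = [[Counter(tokens)[term] for term in vocab] for tokens in tokenized]
    return vocab, matrix

docs = [
    "rights violations reported",
    "rights rights upheld",
]
vocab, dtm = document_term_matrix(docs)
print(vocab)  # ['reported', 'rights', 'upheld', 'violations']
print(dtm)    # [[1, 1, 0, 1], [0, 2, 1, 0]]
```

    Real pipelines add tokenization rules, stop-word removal and sparse storage, but the row-per-document, column-per-term structure is exactly what the distributed datasets organize.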

  1. A multi-analyzer crystal spectrometer (MAX) for pulsed neutron sources

    International Nuclear Information System (INIS)

    Tajima, K.; Ishikawa, Y.; Kanai, K.; Windsor, C.G.; Tomiyoshi, S.

    1982-03-01

    The paper describes the principle and initial performance of a multi-analyzer crystal spectrometer (MAX) recently installed at the KENS spallation neutron source at Tsukuba. The spectrometer is able to make time of flight scans along a desired direction in reciprocal space, covering a wide range of the energy transfers corresponding to the fifteen analyzer crystals. The constant Q or constant E modes of operation can be performed. The spectrometer is particularly suited for studying collective excitations such as phonons and magnons to high energy transfers using single crystal samples. (author)

  2. Simulation of an IXS imaging analyzer with an extended scattering source

    Energy Technology Data Exchange (ETDEWEB)

    Suvorov, Alexey [Brookhaven National Lab. (BNL), Upton, NY (United States). National Synchrotron Light Source II; Cai, Yong Q. [Brookhaven National Lab. (BNL), Upton, NY (United States). National Synchrotron Light Source II

    2016-09-15

    A concept of an inelastic x-ray scattering (IXS) spectrograph with an imaging analyzer was proposed recently and discussed in a number of publications (see, e.g., Ref. 1). The imaging analyzer as proposed combines x-ray lenses with highly dispersive crystal optics. It allows conversion of the x-ray energy spectrum into a spatial image with very high energy resolution. However, the presented theoretical analysis of the spectrograph did not take into account details of the scattered radiation source, i.e., the sample, and its impact on the spectrograph performance. Using numerical simulations we investigated the influence of the finite sample thickness, the scattering angle and the incident energy detuning on the analyzer image and the ultimate resolution.

  3. Identification, Attribution, and Quantification of Highly Heterogeneous Methane Sources Using a Mobile Stable Isotope Analyzer

    Science.gov (United States)

    Crosson, E.; Rella, C.; Cunningham, K.

    2012-04-01

    Despite methane's importance as a potent greenhouse gas second only to carbon dioxide in the magnitude of its contribution to global warming, natural contributions to the overall methane budget are only poorly understood. A big contributor to this gap in knowledge is the highly spatially and temporally heterogeneous nature of most natural (and for that matter anthropogenic) methane sources. This high degree of heterogeneity, where the methane emission rates can vary over many orders of magnitude on a spatial scale of meters or even centimeters, and over a temporal scale of minutes or even seconds, means that traditional methods of emissions flux estimation, such as flux chambers or eddy-covariance, are difficult or impossible to apply. In this paper we present new measurement methods that are capable of detecting, attributing, and quantifying emissions from highly heterogeneous sources. These methods take full advantage of the new class of methane concentration and stable isotope analyzers that are capable of laboratory-quality analysis from a mobile field platform in real time. In this paper we present field measurements demonstrating the real-time detection of methane 'hot spots,' attribution of the methane to a source process via real-time stable isotope analysis, and quantification of the emissions flux using mobile concentration measurements of the horizontal and vertical atmospheric dispersion, combined with atmospheric transport calculations. Although these techniques are applicable to both anthropogenic and natural methane sources, in this initial work we focus primarily on landfills and fugitive emissions from natural gas distribution, as these sources are better characterized, and because they provide a more reliable and stable source of methane for quantifying the measurement uncertainty inherent in the different methods. Implications of these new technologies and techniques are explored for the quantification of natural methane sources in a variety of
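
    A standard way to attribute a plume to a source process from paired concentration and δ13C measurements of the kind described above is a Keeling plot: regress the measured δ13C against 1/[CH4]; the intercept estimates the source's isotopic signature. The sketch below uses made-up numbers (a 1.9 ppm background at -47 ‰ mixed with a biogenic-like source at -60 ‰) and illustrates the generic technique, not the authors' processing code.

```python
def keeling_intercept(conc_ppm, delta13c):
    """Least-squares fit of delta13C vs 1/concentration; the intercept is the
    isotopic signature of the source under two-endmember mixing."""
    x = [1.0 / c for c in conc_ppm]
    n = len(x)
    mx, my = sum(x) / n, sum(delta13c) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, delta13c)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx  # intercept = delta13C of the source

# Synthetic two-endmember mixing: background 1.9 ppm at -47 per mil,
# source at -60 per mil (illustrative values only).
bg_c, bg_d, src_d = 1.9, -47.0, -60.0
excess = (0.5, 1.0, 2.0, 4.0)  # ppm of source-derived CH4 added to background
conc = [bg_c + dc for dc in excess]
delta = [(bg_c * bg_d + dc * src_d) / c for dc, c in zip(excess, conc)]
print(round(keeling_intercept(conc, delta), 1))  # recovers -60.0
```

    Under exact two-endmember mixing, δ_obs = δ_source + C_bg(δ_bg - δ_source)·(1/C), so the regression is exactly linear in 1/C and the intercept recovers the source signature.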

  4. Time-domain single-source integral equations for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe

    2013-03-01

    Single-source time-domain electric- and magnetic-field integral equations for analyzing scattering from homogeneous penetrable objects are presented. Their temporal discretization is effected by using shifted piecewise polynomial temporal basis functions and a collocation testing procedure, thus allowing for a marching-on-in-time (MOT) solution scheme. Unlike dual-source formulations, single-source equations involve space-time domain operator products, for which spatial discretization techniques developed for standalone operators do not apply. Here, the spatial discretization of the single-source time-domain integral equations is achieved by using the high-order divergence-conforming basis functions developed by Graglia alongside the high-order divergence- and quasi-curl-conforming (DQCC) basis functions of Valdés. The combination of these two sets allows for a well-conditioned mapping from div- to curl-conforming function spaces that fully respects the space-mapping properties of the space-time operators involved. Numerical results corroborate the fact that the proposed procedure guarantees accuracy and stability of the MOT scheme. © 2012 IEEE.

  5. Advisory Committee on human radiation experiments. Final report, Supplemental Volume 2. Sources and documentation

    International Nuclear Information System (INIS)

    1995-01-01

    This volume and its appendixes supplement the Advisory Committee's final report by reporting how we went about looking for information concerning human radiation experiments and intentional releases, a description of what we found and where we found it, and a finding aid for the information that we collected. This volume begins with an overview of federal records, including general descriptions of the types of records that have been useful and how the federal government handles these records. This is followed by an agency-by-agency account of the discovery process and descriptions of the records reviewed, together with instructions on how to obtain further information from those agencies. There is also a description of other sources of information that have been important, including institutional records, print resources, and nonprint media and interviews. The third part contains brief accounts of ACHRE's two major contemporary survey projects (these are described in greater detail in the final report and another supplemental volume) and other research activities. The final section describes how the ACHRE information collections were managed and the records that ACHRE created in the course of its work; this constitutes a general finding aid for the materials deposited with the National Archives. The appendices provide brief references to federal records reviewed, descriptions of the accessions that comprise the ACHRE Research Document Collection, and descriptions of the documents selected for individual treatment. Also included are an account of the documentation available for ACHRE meetings, brief abstracts of the almost 4,000 experiments individually described by ACHRE staff, a full bibliography of secondary sources used, and other information.

  6. Advisory Committee on human radiation experiments. Final report, Supplemental Volume 2. Sources and documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    This volume and its appendixes supplement the Advisory Committee's final report by reporting how we went about looking for information concerning human radiation experiments and intentional releases, a description of what we found and where we found it, and a finding aid for the information that we collected. This volume begins with an overview of federal records, including general descriptions of the types of records that have been useful and how the federal government handles these records. This is followed by an agency-by-agency account of the discovery process and descriptions of the records reviewed, together with instructions on how to obtain further information from those agencies. There is also a description of other sources of information that have been important, including institutional records, print resources, and nonprint media and interviews. The third part contains brief accounts of ACHRE's two major contemporary survey projects (these are described in greater detail in the final report and another supplemental volume) and other research activities. The final section describes how the ACHRE information collections were managed and the records that ACHRE created in the course of its work; this constitutes a general finding aid for the materials deposited with the National Archives. The appendices provide brief references to federal records reviewed, descriptions of the accessions that comprise the ACHRE Research Document Collection, and descriptions of the documents selected for individual treatment. Also included are an account of the documentation available for ACHRE meetings, brief abstracts of the almost 4,000 experiments individually described by ACHRE staff, a full bibliography of secondary sources used, and other information.

  7. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as excellent resolution in both time and space is required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes or/and time courses of reconstructed sources, and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
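
    The graph-theoretical measures such a tool computes (node degree, network density, and the like) reduce to simple operations on a thresholded connectivity matrix. A minimal standard-library sketch of two such measures, not EEGNET's MATLAB code:

```python
def degrees(adj):
    """Node degrees of an undirected binary graph given as an adjacency matrix."""
    return [sum(row) for row in adj]

def density(adj):
    """Fraction of possible undirected edges that are present."""
    n = len(adj)
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    return 2.0 * edges / (n * (n - 1))

# 4-node example: a triangle (nodes 0-1-2) plus an isolated node 3.
adj = [
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
]
print(degrees(adj))  # [2, 2, 2, 0]
print(density(adj))  # 0.5 (3 of 6 possible edges)
```

    In practice the adjacency matrix comes from thresholding a functional-connectivity matrix (e.g., coherence or phase synchrony between channels or sources).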

  8. A Calderón-preconditioned single source combined field integral equation for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe; Andriulli, Francesco P.; Bagci, Hakan; Michielssen, Eric

    2011-01-01

    A new regularized single source equation for analyzing scattering from homogeneous penetrable objects is presented. The proposed equation is a linear combination of a Calderón-preconditioned single source electric field integral equation and a

  9. Development of a primary diffusion source of organic vapors for gas analyzer calibration

    Science.gov (United States)

    Lecuna, M.; Demichelis, A.; Sassi, G.; Sassi, M. P.

    2018-03-01

    The generation of reference mixtures of volatile organic compounds (VOCs) at trace levels (10 ppt-10 ppb) is a challenge for both environmental and clinical measurements. The calibration of gas analyzers for trace VOC measurements requires a stable and accurate source of the compound of interest. The dynamic preparation of gas mixtures by diffusion is a suitable method for fulfilling these requirements. The estimation of the uncertainty of the molar fraction of the VOC in the mixture is a key step in the metrological characterization of a dynamic generator. The performance of a dynamic generator was monitored over a wide range of operating conditions. The generation system was simulated by a model developed with computational fluid dynamics and validated against experimental data. The vapor pressure of the VOC was found to be one of the main contributors to the uncertainty of the diffusion rate, and its influence at 10-70 kPa was analyzed and discussed. The air buoyancy effect and perturbations due to the weighing duration were studied. The gas carrier flow rate and the amount of liquid in the vial were found to play a role in limiting the diffusion rate. The results of sensitivity analyses were reported through an uncertainty budget for the diffusion rate. The role of each influence quantity was discussed. A set of criteria to minimize the uncertainty contribution to the primary diffusion source (25 µg min-1) was established: carrier gas flow rate higher than 37.7 sml min-1, a maximum VOC liquid mass decrease in the vial of 4.8 g, a minimum residual mass of 1 g and vial weighing times of 1-3 min. With this procedure a limit uncertainty of 0.5% in the diffusion rate can be obtained for VOC mixtures at trace levels (10 ppt-10 ppb), making the developed diffusion vials a primary diffusion source with potential to become a new reference material for trace VOC analysis.
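
    The molar fraction delivered by such a diffusion source follows directly from the diffusion rate and the carrier flow. A back-of-the-envelope sketch, assuming benzene as the VOC and the ideal-gas molar volume for the standard-condition carrier flow (the compound choice and the resulting number are illustrative, not taken from the paper; the primary stream is then diluted downstream to reach the ppt-ppb range):

```python
MOLAR_VOLUME_L = 22.414  # L/mol of an ideal gas at 0 degC, 101.325 kPa

def molar_fraction(diff_rate_ug_min, molar_mass_g_mol, carrier_sml_min):
    """Molar fraction of VOC in the carrier stream leaving the generator."""
    voc_mol_min = diff_rate_ug_min * 1e-6 / molar_mass_g_mol      # g/min -> mol/min
    carrier_mol_min = carrier_sml_min * 1e-3 / MOLAR_VOLUME_L     # sml/min -> mol/min
    return voc_mol_min / (voc_mol_min + carrier_mol_min)

# 25 ug/min of benzene (78.11 g/mol) into 37.7 sml/min of carrier gas:
x = molar_fraction(25.0, 78.11, 37.7)
print(f"{x * 1e6:.0f} ppm")  # ~190 ppm before further dilution
```

    The same arithmetic, run in reverse, shows why small uncertainties in the diffusion rate propagate directly into the final trace-level molar fraction.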

  10. The Einstein Observatory catalog of IPC x ray sources. Volume 1E: Documentation

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics, which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and it discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  11. Market Analysis and Consumer Impacts Source Document. Part III. Consumer Behavior and Attitudes Toward Fuel Efficient Vehicles

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...

  12. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations.

  13. Influence of physiological sources on the impedance cardiogram analyzed using 4D FEM simulations

    International Nuclear Information System (INIS)

    Ulbrich, Mark; Leonhardt, Steffen; Walter, Marian; Mühlsteff, Jens

    2014-01-01

    Impedance cardiography is a simple and inexpensive method to acquire data on hemodynamic parameters. This study analyzes the influence of four dynamic physiological sources (aortic expansion, heart contraction, lung perfusion and erythrocyte orientation) on the impedance signal using a model of the human thorax with a high temporal resolution (125 Hz) based on human MRI data. Simulations of electromagnetic fields were conducted using the finite element method. The ICG signal caused by these sources shows very good agreement with the measured signals (r = 0.89). Standard algorithms can be used to extract characteristic points to calculate left ventricular ejection time and stroke volume (SV). In the presented model, the calculated SV equals the implemented left ventricular volume change of the heart. It is shown that impedance changes due to lung perfusion and heart contraction compensate each other, and that erythrocyte orientation together with the aortic impedance basically form the ICG signal while taking its characteristic morphology from the aortic signal. The model is robust to conductivity changes of tissues and organ displacements. In addition, it reflects the multi-frequency behavior of the thoracic impedance. (paper)
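
    The stroke-volume calculation the abstract refers to is commonly done with the classical Kubicek formula, SV = ρ·(L/Z0)²·LVET·(dZ/dt)max, where ρ is blood resistivity, L the distance between the sensing electrodes, Z0 the base thoracic impedance, and LVET the left ventricular ejection time. A hedged sketch with typical textbook values (not the paper's model parameters):

```python
def stroke_volume_kubicek(rho_ohm_cm, l_cm, z0_ohm, lvet_s, dzdt_max_ohm_s):
    """Kubicek estimate of stroke volume (ml) from ICG characteristic points."""
    return rho_ohm_cm * (l_cm / z0_ohm) ** 2 * lvet_s * dzdt_max_ohm_s

# Typical adult values: rho = 135 ohm*cm, L = 30 cm, Z0 = 25 ohm,
# LVET = 0.3 s, (dZ/dt)max = 1.5 ohm/s  ->  a plausible resting stroke volume
sv = stroke_volume_kubicek(135.0, 30.0, 25.0, 0.3, 1.5)
print(f"{sv:.0f} ml")
```

    LVET and (dZ/dt)max are the characteristic points extracted from the ICG waveform by the "standard algorithms" mentioned above; later variants (e.g., Sramek-Bernstein) replace the (L/Z0)² geometry factor.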

  14. Analysis and classification of oncology activities on the way to workflow based single source documentation in clinical information systems.

    Science.gov (United States)

    Wagner, Stefan; Beckmann, Matthias W; Wullich, Bernd; Seggewies, Christof; Ries, Markus; Bürkle, Thomas; Prokosch, Hans-Ulrich

    2015-12-22

    Today, cancer documentation is still a tedious task involving many different information systems even within a single institution, and it is rarely supported by appropriate documentation workflows. In a comprehensive 14-step analysis we compiled diagnostic and therapeutic pathways for 13 cancer entities using a mixed approach of document analysis, workflow analysis, expert interviews, workflow modelling and feedback loops. These pathways were stepwise classified and categorized to create a final set of grouped pathways and workflows including electronic documentation forms. A total of 73 workflows for the 13 entities, based on 82 paper documentation forms in addition to computer-based documentation systems, were compiled in a 724-page document comprising 130 figures, 94 tables and 23 tumour classifications as well as 12 follow-up tables. Stepwise classification made it possible to derive grouped diagnostic and therapeutic pathways for three major classes: solid entities with surgical therapy, solid entities with surgical and additional therapeutic activities, and non-solid entities. For these classes it was possible to deduce common documentation workflows to support workflow-guided single-source documentation. Clinical documentation activities within a Comprehensive Cancer Center can likely be realized in a set of three documentation workflows with conditional branching in a modern workflow-supporting clinical information system.

  15. Analyzer-based phase-contrast imaging system using a micro focus x-ray source

    International Nuclear Information System (INIS)

    Zhou, Wei; Majidi, Keivan; Brankov, Jovan G.

    2014-01-01

    Here we describe a new in-laboratory analyzer-based phase-contrast imaging (ABI) instrument using a conventional X-ray tube source (CXS) aimed at bio-medical imaging applications. Phase-contrast imaging allows visualization of soft tissue details usually obscured in conventional X-ray imaging. The ABI system design and major features are described in detail. The key advantage of the presented system, over the few existing CXS ABI systems, is that it does not require high-precision components, i.e., CXS, X-ray detector, and electro-mechanical components. To overcome a main problem introduced by these components, identified as temperature stability, the system components are kept at a constant temperature inside three enclosures, thus minimizing the electrical and mechanical thermal drifts. This is achieved by using thermoelectric (Peltier) cooling/heating modules that are easy to control precisely. For the CXS we utilized a microfocus X-ray source with a tungsten (W) anode. In addition, the proposed system eliminates tungsten's multiple spectral lines by selecting the monochromator crystal size appropriately, therefore eliminating the need for the costly mismatched two-crystal monochromator. The system imaging was fine-tuned for the tungsten Kα1 line with an energy of 59.3 keV, since it has been shown to be of great clinical significance by a number of researchers at synchrotron facilities. In this way a laboratory system that can be used for evaluating and quantifying tissue properties, initially explored at synchrotron facilities, would be of great interest to a larger research community. To demonstrate the imaging capability of our instrument we use a chicken thigh tissue sample.

  18. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Quality software always demands a compromise between users' needs and hardware resources. Making software faster means either expensive hardware, such as powerful processors and virtually unlimited amounts of RAM, or re-engineering the code to adapt the software to the client's hardware architecture. This is the purpose of code optimization: getting the utmost performance from a program under given conditions. There are tools for designing and writing code, but the ultimate tool for optimization remains the modest compiler, an often neglected software jewel that is the result of hundreds of working hours by some of the best specialists in the world. Even so, only two compilers fulfil the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we developed the AGCC lexical analyzer, which helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the authors, who are proud to present them below.

  19. Getting more out of biomedical documents with GATE's full lifecycle open source text analytics.

    Science.gov (United States)

    Cunningham, Hamish; Tablan, Valentin; Roberts, Angus; Bontcheva, Kalina

    2013-01-01

    This software article describes the GATE family of open source text analysis tools and processes. GATE is one of the most widely used systems of its type with yearly download rates of tens of thousands and many active users in both academic and industrial contexts. In this paper we report three examples of GATE-based systems operating in the life sciences and in medicine. First, in genome-wide association studies which have contributed to discovery of a head and neck cancer mutation association. Second, medical records analysis which has significantly increased the statistical power of treatment/outcome models in the UK's largest psychiatric patient cohort. Third, richer constructs in drug-related searching. We also explore the ways in which the GATE family supports the various stages of the lifecycle present in our examples. We conclude that the deployment of text mining for document abstraction or rich search and navigation is best thought of as a process, and that with the right computational tools and data collection strategies this process can be made defined and repeatable. The GATE research programme is now 20 years old and has grown from its roots as a specialist development tool for text processing to become a rather comprehensive ecosystem, bringing together software developers, language engineers and research staff from diverse fields. GATE now has a strong claim to cover a uniquely wide range of the lifecycle of text analysis systems. It forms a focal point for the integration and reuse of advances that have been made by many people (the majority outside of the authors' own group) who work in text processing for biomedicine and other areas. GATE is available online under GNU open source licences and runs on all major operating systems. Support is available from an active user and developer community and also on a commercial basis.

  1. A method to analyze "source-sink" structure of non-point source pollution based on remote sensing technology.

    Science.gov (United States)

    Jiang, Mengzhen; Chen, Haiying; Chen, Qinghui

    2013-11-01

    With the purpose of providing a scientific basis for environmental planning for non-point source pollution prevention and control, and of improving pollution regulation efficiency, this paper established a Grid Landscape Contrast Index based on the Location-Weighted Landscape Contrast Index, according to "source-sink" theory. The spatial distribution of non-point source pollution in the Jiulongjiang Estuary could be mapped by utilizing high-resolution remote sensing images. The results showed that the "source" area for nitrogen and phosphorus in the Jiulongjiang Estuary was 534.42 km² in 2008, and the "sink" area was 172.06 km². The "source" of non-point source pollution was distributed mainly over Xiamen Island, most of Haicang, the east of Jiaomei and the river banks of Gangwei and Shima; the "sink" was distributed over the southwest of Xiamen Island and the west of Shima. Generally speaking, the intensity of the "source" weakens as the distance from the sea boundary increases, while the "sink" strengthens. Copyright © 2013 Elsevier Ltd. All rights reserved.
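    The abstract does not give the formula of the Grid Landscape Contrast Index, so the sketch below is only a toy distance-weighted "source-sink" balance in the spirit of location-weighted landscape indices: source cells near the receiving water count more heavily, and sink cells near the water offset them more strongly. All cell data are invented.

```python
def grid_contrast_index(cells):
    """
    Simplified distance-weighted source/sink balance for a landscape grid.

    cells: list of (kind, area_km2, distance_km) tuples, where kind is
    'source' or 'sink' and distance is measured from the water body.
    Returns a ratio; values > 1 indicate a source-dominated landscape
    (higher non-point pollution export risk).
    """
    source = sum(a / d for kind, a, d in cells if kind == 'source')
    sink = sum(a / d for kind, a, d in cells if kind == 'sink')
    return source / sink if sink else float('inf')

# Hypothetical grid cells around an estuary
landscape = [
    ('source', 10.0, 1.0),   # cropland close to the estuary
    ('source', 5.0, 4.0),    # cropland farther inland
    ('sink', 3.0, 2.0),      # wetland buffer
    ('sink', 2.0, 8.0),      # distant forest patch
]
risk = grid_contrast_index(landscape)   # > 1: source-dominated
```

    The published index additionally accounts for relative elevation and accumulated area along Lorenz curves; this sketch only illustrates why proximity weighting makes nearby "source" land so influential.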

  2. Using ISO/IEC 12207 to analyze open source software development processes: an E-learning case study

    OpenAIRE

    Krishnamurthy, Aarthy; O'Connor, Rory

    2013-01-01

    To date, there has been no comprehensive study of the open source software development process (OSSDP) carried out for open source (OS) e-learning systems. This paper presents work that objectively analyzes the open source software development (OSSD) practices carried out by e-learning systems development communities; the results are represented using DEMO models. These results are compared using ISO/IEC 12207:2008. The comparison of DEMO models with ISO/IEC...

  3. What's your favorite blend? Analyzing source and channel choices in business-to-government service interactions

    NARCIS (Netherlands)

    van den Boer, Yvon

    2014-01-01

    In the Netherlands, over a million businesses regularly have to deal with complex matters imposed by the government (e.g., managing tax problems). To solve their problems, businesses have various potential sources to consult (e.g., Tax Office, advisor, friends/family). The myriad sources can be

  4. Analyzing traffic source impact on returning visitors ratio in information provider website

    Science.gov (United States)

    Prasetio, A.; Sari, P. K.; Sharif, O. O.; Sofyan, E.

    2016-04-01

    Website performance, especially the returning-visitor ratio, is an important metric for an information provider website. Since a high returning-visitor ratio is a good indication of visitor loyalty, it is important to find ways to improve this metric. This research investigated whether there is any difference in the returning-visitor metric among three web traffic sources, namely direct, referral and search. Monthly returning visitors and total visitors from each source were retrieved from the Google Analytics tool and then used to calculate the returning-visitor ratio. The period of data observation was July 2012 to June 2015, resulting in a total of 108 samples. These data were then analysed using one-way analysis of variance (ANOVA) to address our research question. The results showed that different traffic sources have significantly different returning-visitor ratios, especially between the referral traffic source and the other two traffic sources. On the other hand, this research did not find any significant difference between the returning-visitor ratios of the direct and search traffic sources. The owner of the website can focus on multiplying referral links from other relevant sites.
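    The analysis step above is a standard one-way ANOVA across the three traffic sources. A stdlib-only sketch of the F statistic follows; the monthly ratios below are invented for illustration and are not the study's data.

```python
def one_way_anova(groups):
    """Return the F statistic and degrees of freedom for a one-way ANOVA."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group means vs. the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: observations vs. their group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    f = (ss_between / (k - 1)) / (ss_within / (n - k))
    return f, (k - 1, n - k)

# Hypothetical monthly returning-visitor ratios per traffic source
direct   = [0.31, 0.28, 0.33, 0.30]
referral = [0.45, 0.49, 0.47, 0.44]
search   = [0.30, 0.27, 0.32, 0.29]
f_stat, (df1, df2) = one_way_anova([direct, referral, search])
```

    A large F here reflects the pattern the abstract reports: the referral group's mean ratio sits well away from the direct and search groups, which are close to each other.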

  5. Market Analysis and Consumer Impacts Source Document. Part I. The Motor Vehicle Market in the Late 1970's

    Science.gov (United States)

    1980-12-01

    The source document on motor vehicle market analysis and consumer impact consists of three parts. Part I is an integrated overview of the motor vehicle market in the late 1970's, with sections on the structure of the market, motor vehicle trends, con...

  6. Remote source document verification in two national clinical trials networks: a pilot study.

    Directory of Open Access Journals (Sweden)

    Meredith Mealer

    OBJECTIVE: Barriers to executing large-scale randomized controlled trials include costs, complexity, and regulatory requirements. We hypothesized that source document verification (SDV) via remote electronic monitoring is feasible. METHODS: Five hospitals from two NIH-sponsored networks provided remote electronic access to study monitors. We evaluated pre-visit remote SDV compared to traditional on-site SDV using a randomized convenience sample of all study subjects due for a monitoring visit. The number of data values verified and the time to perform remote and on-site SDV were collected. RESULTS: Thirty-two study subjects were randomized to either remote SDV (N=16) or traditional on-site SDV (N=16). Technical capabilities, remote access policies and regulatory requirements varied widely across sites. In the adult network, only 14 of 2965 data values (0.47%) could not be located remotely. In the traditional on-site SDV arm, 3 of 2608 data values (0.12%) required coordinator help. In the pediatric network, all 198 data values in the remote SDV arm and all 183 data values in the on-site SDV arm were located. Although not statistically significant, there was a consistent trend for more time consumed per data value (minutes +/- SD: Adult 0.50 +/- 0.17 min vs. 0.39 +/- 0.10 min (two-tailed t-test p=0.11); Pediatric 0.99 +/- 1.07 min vs. 0.56 +/- 0.61 min (p=0.37)) and per case report form (Adult: 4.60 +/- 1.42 min vs. 3.60 +/- 0.96 min (p=0.10); Pediatric: 11.64 +/- 7.54 min vs. 6.07 +/- 3.18 min (p=0.10)) using remote SDV. CONCLUSIONS: Because each site had different policies, requirements, and technologies, a common approach to assimilating monitors into the access management system could not be implemented. Despite substantial technology differences, more than 99% of data values were successfully monitored remotely. This pilot study demonstrates the feasibility of remote monitoring and the need to develop consistent access policies for remote study

  7. An inexpensive way to analyze the optics of electrostatic, surface-ionization ion-source configurations

    International Nuclear Information System (INIS)

    Balestrini, S.J.

    1986-01-01

    The optical characteristics of surface-ionization sources can often be studied in detail with the aid of a home computer. Sources with two-dimensional symmetry are considered. Ions are created on the surface of a hot filament. An accelerating voltage, V, is applied to the source and filament. The ions are accelerated and focused into a beam by a series of electrodes containing narrow axial slits. The ordered series of elementary acceleration stages that the electrodes form is the optical stack. The focusing parameters are the fractions of the source voltage applied to the electrodes. A portion of the ions leaves the source through a beam-defining collimating slit in the final electrode. An ion trajectory at any point along the symmetry axis is described by a vector with two phase-space components, which are treated as small quantities. The components at the filament are ω, the displacement from the symmetry axis, and ν, the velocity component of the ion parallel to the filament surface divided by its speed when it leaves the first stage. Elsewhere, the trajectory components are the displacement from the symmetry axis and the slope.

  8. Brachytherapy Partial Breast Irradiation: Analyzing Effect of Source Configurations on Dose Metrics Relevant to Toxicity

    International Nuclear Information System (INIS)

    Cormack, Robert A.; Devlin, Phillip M.

    2008-01-01

    Purpose: Recently, the use of partial breast irradiation (PBI) for patients with early-stage breast cancer with low-risk factors has increased. The volume of the high-dose regions has been correlated with toxicity in interstitial treatment. Although no such associations have been made in applicator-based experience, new applicators are being developed that use complex noncentered source configurations. This work studied the effect of noncentered source placements on the volume of the high-dose regions around a spherical applicator. Methods and Materials: Many applicator configurations were numerically simulated for a range of inflation radii. For each configuration, a dose homogeneity index was used as a dose metric to measure the volume of the high-dose region. Results: All multisource configurations examined resulted in an increase of the high-dose region compared with a single centered source. The resulting decrease in the prescription-dose homogeneity index was more pronounced for sources farther from the center of the applicator, and the effect was reduced as the number of dwell locations was increased. Conclusion: The geometries of particular applicators were not considered, in order to achieve a more general result. On the basis of the calculations in this work, treatment using noncentered dwell locations appears likely to increase the volume of the high-dose regions.
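    The abstract does not state which dose homogeneity index (DHI) variant the authors used; a common definition in interstitial brachytherapy is DHI = (V100 − V150) / V100, i.e. the fraction of the prescription-dose volume that does not receive 150% or more of the prescription. A minimal sketch on toy voxel doses:

```python
def dose_homogeneity_index(voxel_doses, prescription):
    """
    DHI = (V100 - V150) / V100.  Higher values mean a smaller
    high-dose region relative to the treated volume.
    """
    v100 = sum(1 for d in voxel_doses if d >= prescription)
    v150 = sum(1 for d in voxel_doses if d >= 1.5 * prescription)
    return (v100 - v150) / v100 if v100 else 0.0

# Invented dose samples (Gy) around a 34 Gy prescription
doses = [30, 34, 36, 40, 45, 52, 60, 70]
dhi = dose_homogeneity_index(doses, 34.0)
```

    Under this definition, moving sources off-center enlarges the ≥150% region near the sources, so V150 rises and the DHI falls, which is the trend the Results section describes.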

  9. Documentation of particle-size analyzer time series, and discrete suspended-sediment and bed-sediment sample data collection, Niobrara River near Spencer, Nebraska, October 2014

    Science.gov (United States)

    Schaepe, Nathaniel J.; Coleman, Anthony M.; Zelt, Ronald B.

    2018-04-06

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers, monitored a sediment release by Nebraska Public Power District from Spencer Dam located on the Niobrara River near Spencer, Nebraska, during the fall of 2014. The accumulated sediment behind Spencer Dam ordinarily is released semiannually; however, the spring 2014 release was postponed until the fall. Because of the postponement, the scheduled fall sediment release would consist of a larger volume of sediment. The larger than normal sediment release expected in fall 2014 provided an opportunity for the USGS and U.S. Army Corps of Engineers to improve the understanding of sediment transport during reservoir sediment releases. A primary objective was to collect continuous suspended-sediment data during the first days of the sediment release to document rapid changes in sediment concentrations. For this purpose, the USGS installed a laser-diffraction particle-size analyzer at a site near the outflow of the dam to collect continuous suspended-sediment data. The laser-diffraction particle-size analyzer measured volumetric particle concentration and particle-size distribution from October 1 to 2 (pre-sediment release) and October 5 to 9 (during sediment release). Additionally, the USGS manually collected discrete suspended-sediment and bed-sediment samples before, during, and after the sediment release. Samples were collected at two sites upstream from Spencer Dam and at three bridges downstream from Spencer Dam. The resulting datasets and basic metadata associated with the datasets were published as a data release; this report provides additional documentation about the data collection methods and the quality of the data.

  10. Experimental determination of chosen document elements parameters from raster graphics sources

    Directory of Open Access Journals (Sweden)

    Jiří Rybička

    2010-01-01

    The visual appearance of documents and their formal quality are considered to be as important as content quality. The formal and typographical quality of documents can be evaluated by an automated system that processes raster images of documents. A document is described by a formal model that treats a page both as an object and as a set of elements, where page elements include text and graphic objects. All elements are described by parameters depending on the element type. For evaluation, text objects are the most important. This paper describes the experimental determination of chosen document element parameters from raster images. Image-processing techniques are used, in which an image is represented as a matrix of dots and parameter values are extracted. Algorithms for parameter extraction from raster images were designed, aimed mainly at typographical parameters such as indentation, alignment, font size and spacing. The algorithms were tested on a set of 100 images of paragraphs or pages and provide very good results. The extracted parameters can be used directly for typographical quality evaluation.
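    The paper's own algorithms are not reproduced in the abstract; as a minimal sketch of one such extraction, the snippet below estimates the left indentation of each text line from a binary pixel matrix (1 = ink, 0 = background), grouping consecutive ink-bearing rows into lines. The toy bitmap is invented.

```python
def line_indents(bitmap):
    """
    Estimate the left indentation (in pixels) of each text line in a
    binary raster.  Consecutive rows containing ink form one text line;
    the leftmost ink column within the line is its indentation.
    """
    indents, current = [], None
    for row in bitmap:
        ink_cols = [c for c, px in enumerate(row) if px]
        if ink_cols:
            left = min(ink_cols)
            current = left if current is None else min(current, left)
        elif current is not None:   # blank row ends the current text line
            indents.append(current)
            current = None
    if current is not None:         # flush the last line
        indents.append(current)
    return indents

page = [
    [0, 0, 1, 1, 1, 0],   # first line, indented 2 px
    [0, 0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],   # inter-line gap
    [1, 1, 1, 1, 0, 0],   # second line, flush left
]
indents = line_indents(page)   # [2, 0]
```

    Alignment and spacing can be estimated the same way, from the rightmost ink columns and the heights of the blank row runs respectively.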

  11. Historical Website Ecology : Analyzing Past States of the Web Using Archived Source Code

    NARCIS (Netherlands)

    Helmond, A.; Brügger, N.

    2017-01-01

    In this chapter I offer a historical perspective on the changing composition of a website over time. I propose to see the website as an ecosystem through which we can analyze the larger techno-commercial configurations that websites are embedded in. In doing so, I reconceptualize the study of

  12. Simple, sensitive nitrogen analyzer based on pulsed miniplasma source emission spectrometry

    International Nuclear Information System (INIS)

    Jin Zhe; Duan Yixiang

    2003-01-01

    The development of pulsed miniplasma source emission spectrometry for trace nitrogen determination in inert gases is described in this article. The instrument consists of a pulsed miniplasma source generated by an in-house fabricated portable high-voltage supply, an optical beam collection system, an integrated small spectrometer with a charge-coupled-device detector, an interface card, and a notebook computer for controlling spectrometer parameters and signal processing. Trace nitrogen in the inert gases, such as helium and argon, was determined by monitoring the emission intensities from nitrogen molecules at 357 and 337 nm. The analytical performance was examined under various experimental conditions. The system has a detection limit of about 15 ppb (v/v) for nitrogen in helium with a relative standard deviation of 1.5%. The newly developed instrument offers a simple, low-cost, and sensitive method for continuously monitoring trace nitrogen in high-purity inert gases.
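    The abstract reports a 15 ppb detection limit but not how it was derived; the conventional 3σ approach computes it from the blank-signal scatter and the calibration slope. The sketch below uses invented counts and standards purely to illustrate the calculation.

```python
def detection_limit(blank_signals, conc_std, sig_std):
    """
    3-sigma detection limit: DL = 3 * sd(blank) / calibration slope.
    """
    n = len(blank_signals)
    mean_b = sum(blank_signals) / n
    sd_b = (sum((s - mean_b) ** 2 for s in blank_signals) / (n - 1)) ** 0.5
    # Least-squares slope of emission signal vs. concentration
    mx = sum(conc_std) / len(conc_std)
    my = sum(sig_std) / len(sig_std)
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc_std, sig_std))
             / sum((x - mx) ** 2 for x in conc_std))
    return 3.0 * sd_b / slope

# Invented example: emission counts for blanks and N2 standards (ppb)
blanks = [100.0, 102.0, 101.0, 99.0, 98.0]
conc = [0.0, 50.0, 100.0, 200.0]          # ppb N2 in helium
signal = [100.0, 600.0, 1100.0, 2100.0]   # counts; slope = 10 counts/ppb
dl = detection_limit(blanks, conc, signal)
```

    With these numbers the limit works out to roughly 0.5 ppb; the real instrument's 15 ppb figure reflects its actual blank noise and sensitivity at the 357/337 nm bands.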

  13. Understanding Richard Wright's "Black Boy": A Student Casebook to Issues, Sources, and Historical Documents.

    Science.gov (United States)

    Felgar, Robert

    In "Black Boy," Richard Wright triumphs over an ugly, racist world by fashioning an inspiring, powerful, beautiful, and fictionalized autobiography. To help students understand and appreciate his story in the cultural, political, racial, social, and literary contexts of its time, this casebook provides primary historical documents,…

  14. Understanding "Animal Farm": A Student Casebook to Issues, Sources, and Historical Documents.

    Science.gov (United States)

    Rodden, John

    "Animal Farm" is a political allegory of the USSR written in the form of a fable. Its stinging moral warning against the abuse of power is demonstrated in this casebook through a wide variety of historical, political, and literary documents that are directly applicable to George Orwell's novel. Included in the casebook are passages from…

  15. Understanding "The Catcher in the Rye": A Student Casebook to Issues, Sources, and Historical Documents.

    Science.gov (United States)

    Pinsker, Sanford; Pinsker, Ann

    The social, cultural, and historical documents and commentary in this casebook illuminate the reading of "The Catcher in the Rye," a novel that has become an important rite of passage for many young adults. In addition to a literary analysis, the casebook acquaints students with the larger world in which Holden Caulfield, the…

  16. A Calderón-preconditioned single source combined field integral equation for analyzing scattering from homogeneous penetrable objects

    KAUST Repository

    Valdés, Felipe

    2011-06-01

    A new regularized single-source equation for analyzing scattering from homogeneous penetrable objects is presented. The proposed equation is a linear combination of a Calderón-preconditioned single-source electric field integral equation and a single-source magnetic field integral equation. The equation is immune to low-frequency and dense-mesh breakdown, and free from spurious resonances. Unlike dual-source formulations, this equation involves operator products that cannot be discretized using standard procedures for discretizing standalone electric, magnetic, and combined field operators. Instead, the single-source equation proposed here is discretized using a recently developed technique that achieves a well-conditioned mapping from div- to curl-conforming function spaces, thereby fully respecting the space-mapping properties of the operators involved and guaranteeing accuracy and stability. Numerical results show that the proposed equation and discretization technique give rise to rapidly convergent solutions. They also validate the equation's resonance-free character. © 2006 IEEE.

  17. Giving Women the Vote: Using Primary Source Documents to Teach about the Fight for Women's Suffrage.

    Science.gov (United States)

    Jacobsen, Margaret

    1988-01-01

    Presents a lesson in which students use primary sources to learn about the organizing strategies used in the fight for women's suffrage. These sources will provide insights into the past and help students develop appreciation for the hardships suffragists endured. Includes objectives, procedures, and suggestions for activities. (LS)

  18. Analyzing the Stock Markets Role as a Source of Capital Formation in Pakistan

    Directory of Open Access Journals (Sweden)

    Hakim Ali Kanasro

    2011-12-01

    This paper examines the stock markets' role in capital formation in Pakistan over the period 1 January 2001 to 31 December 2008. The analytical study is based on data collected from secondary sources such as the State Bank of Pakistan and the three stock exchanges: the Karachi, Lahore and Islamabad stock exchanges. The stock markets' capital size, number of listed companies and liquidity positions are examined in the study. The study reveals that the Karachi Stock Exchange is the oldest and largest stock exchange in Pakistan; it has been the first mover in adopting institutional developments, new policies and procedures in the securities exchange business, and it plays a large role in capital formation in Pakistan. In recent years all the stock exchanges have implemented advanced technology and fully automated trading systems. This has changed the stock markets' role in capital formation, as a great boom was observed during the study period.

  19. Detecting and analyzing soil phosphorus loss associated with critical source areas using a remote sensing approach.

    Science.gov (United States)

    Lou, Hezhen; Yang, Shengtian; Zhao, Changsen; Shi, Liuhua; Wu, Linna; Wang, Yue; Wang, Zhiwei

    2016-12-15

    The detection of critical source areas (CSAs) is a key step in managing soil phosphorus (P) loss and preventing the long-term eutrophication of water bodies at regional scale. Most related studies, however, focus on a local scale, which prevents a clear understanding of the spatial distribution of CSAs for soil P loss at regional scale. Moreover, continual, long-term variation in CSAs has scarcely been reported. It is impossible to identify the factors driving the variation in CSAs, or to collect the land surface information essential for CSA detection, merely by using conventional methodologies at regional scale. This study proposes a new regional-scale approach, based on three satellite sensors (ASTER, TM/ETM and MODIS), which was implemented successfully to detect CSAs at regional scale over 15 years (2000-2014). The approach incorporated five factors (precipitation, slope, soil erosion, land use, and soil total phosphorus) that drive soil P loss from CSAs. The results show that the average area of critical phosphorus source areas (CPSAs) was 15,056 km² over the 15-year period, occupying 13.8% of the total area, with a range varying from 1.2% to 23.0%, in a representative intensive agricultural area of China. In contrast to previous studies, we found that the locations of CSAs with P loss are spatially variable and more dispersed in their distribution over the long term. We also found that precipitation acts as a key driving factor in the variation of CSAs at regional scale. The regional-scale method can provide scientific guidance for managing soil phosphorus loss and preventing the long-term eutrophication of water bodies at regional scale, and shows great potential for exploring factors that drive the variation in CSAs at global scale. Copyright © 2016 Elsevier B.V. All rights reserved.
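    The abstract names five driving factors but not how they are combined; a common way to turn such factor layers into a CSA map is a weighted overlay of normalized factor values per grid cell. The weights, threshold and cell values below are illustrative assumptions, not the study's parameters.

```python
def is_critical_cell(factors, weights, threshold=0.6):
    """
    Classify one grid cell as a critical phosphorus source area by a
    weighted overlay of normalized driving factors (each in [0, 1]).
    """
    score = sum(weights[name] * value for name, value in factors.items())
    return score >= threshold

# The five drivers named in the abstract; weights are illustrative only
weights = {'precipitation': 0.25, 'slope': 0.20, 'soil_erosion': 0.25,
           'land_use': 0.15, 'soil_tp': 0.15}

steep_cropland = {'precipitation': 0.8, 'slope': 0.9, 'soil_erosion': 0.7,
                  'land_use': 0.9, 'soil_tp': 0.6}
flat_forest = {'precipitation': 0.8, 'slope': 0.1, 'soil_erosion': 0.1,
               'land_use': 0.1, 'soil_tp': 0.3}
```

    Because precipitation enters every cell's score, a wet year raises scores basin-wide, which is consistent with the abstract's finding that precipitation is a key driver of year-to-year CSA variation.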

  20. The Application Of Open-Source And Free Photogrammetric Software For The Purposes Of Cultural Heritage Documentation

    Directory of Open Access Journals (Sweden)

    Bartoš Karol

    2014-07-01

    The documentation of cultural heritage is an essential part of the appropriate care of historical monuments, which represent a part of our history. At present it is a topical issue for which considerable funds are being spent, including for the documentation of immovable historical monuments such as castle ruins. Non-contact surveying technologies, terrestrial laser scanning and digital photogrammetry, are among the most commonly used technologies by which suitable documentation can be obtained; however, their use may be very costly. In recent years, various types of software products and web services based on the SfM (or MVS) method, developed as open-source software or provided as free services and relying on the basic principles of photogrammetry and computer vision, have started to come into the spotlight. Using these services and software, acquired digital images of a given object can be processed into a point cloud, serving directly as a final output or as a basis for further processing. The aim of this paper, based on images of various objects of the Slanec castle ruins obtained with a Pentax K5 DSLR, is to assess the suitability of different types of open-source and free software and free web services, and their reliability in terms of surface reconstruction and photo-texture quality, for the purposes of castle ruins documentation.

  1. Assessment of self-organizing maps to analyze sole-carbon source utilization profiles.

    Science.gov (United States)

    Leflaive, Joséphine; Céréghino, Régis; Danger, Michaël; Lacroix, Gérard; Ten-Hage, Loïc

    2005-07-01

    The use of community-level physiological profiles obtained with Biolog microplates is widely employed to assess the functional diversity of bacterial communities. Biolog produces a great amount of data, and its analysis has been the subject of many studies. In most cases, after some transformations, these data are investigated with classical multivariate analyses. Here we provide an alternative to this method: an artificial intelligence technique, the Self-Organizing Map (SOM, an unsupervised neural network). We used data from a microcosm study of algae-associated bacterial communities placed under various nutritive conditions. Analyses were carried out on the net absorbances at two incubation times for each substrate and on the chemical guild categorization of total bacterial activity. Compared with Principal Components Analysis and cluster analysis, SOMs proved a valuable tool for community classification and for establishing clear relationships between clusters of bacterial communities and sole-carbon-source utilization. Specifically, SOMs offered a clear two-dimensional projection of a relatively large volume of data and were easier to interpret than the plots commonly obtained with multivariate analyses. They can be recommended for tracking the temporal evolution of communities' functional diversity.
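
    A bare-bones SOM can be written in a few lines of NumPy. The sketch below maps hypothetical Biolog-style profiles (one absorbance vector per community) onto a small grid of prototype vectors; grid size, learning schedule, and the random "profiles" are all illustrative assumptions, not the study's settings.

```python
import numpy as np

# Synthetic stand-in data: 60 communities x 31 carbon-source absorbances.
rng = np.random.default_rng(1)
profiles = rng.random((60, 31))
grid = rng.random((5, 5, 31))            # 5x5 map of prototype vectors

def bmu(x, grid):
    """Grid index of the best-matching unit for sample x."""
    d = np.linalg.norm(grid - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)           # decaying learning rate
    sigma = 2.0 * (1 - epoch / 50) + 0.5  # decaying neighborhood radius
    for x in profiles:
        bi, bj = bmu(x, grid)
        ii, jj = np.meshgrid(np.arange(5), np.arange(5), indexing="ij")
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        grid += lr * h[:, :, None] * (x - grid)  # pull neighborhood toward x

# Each community is assigned to its best-matching map cell; nearby cells
# hold communities with similar carbon-source utilization profiles.
cells = [bmu(x, grid) for x in profiles]
```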

  2. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  3. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  4. Analyzing the contribution of climate change to long-term variations in sediment nitrogen sources for reservoirs/lakes

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Xinghui, E-mail: xiaxh@bnu.edu.cn [School of Environment, Beijing Normal University, State Key Laboratory of Water Environment Simulation/Key Laboratory of Water and Sediment Sciences of Ministry of Education, Beijing 100875 (China); Wu, Qiong; Zhu, Baotong; Zhao, Pujun [School of Environment, Beijing Normal University, State Key Laboratory of Water Environment Simulation/Key Laboratory of Water and Sediment Sciences of Ministry of Education, Beijing 100875 (China); Zhang, Shangwei [Department of Isotope Biogeochemistry, Helmholtz Centre for Environmental Research — UFZ, Permoserstraße 15, Leipzig 04318 (Germany); Yang, Lingyan [Beijing Municipal Environmental Monitoring Center, Beijing 100048 (China)

    2015-08-01

    We applied a mixing model based on stable isotopic δ¹³C, δ¹⁵N, and C:N ratios to estimate the contributions of multiple sources to sediment nitrogen. We also developed a conceptual model describing and analyzing the impacts of climate change on nitrogen enrichment. These two models were applied in Miyun Reservoir to analyze the contribution of climate change to the variations in sediment nitrogen sources, based on two ²¹⁰Pb- and ¹³⁷Cs-dated sediment cores. The results showed that during the past 50 years, the average contributions of soil and fertilizer, submerged macrophytes, N₂-fixing phytoplankton, and non-N₂-fixing phytoplankton were 40.7%, 40.3%, 11.8%, and 7.2%, respectively. In addition, total nitrogen (TN) contents in sediment showed significant increasing trends from 1960 to 2010, and sediment nitrogen from both submerged macrophyte and phytoplankton sources exhibited significant increasing trends during the past 50 years. In contrast, soil and fertilizer sources showed a significant decreasing trend from 1990 to 2010. According to the changing trend of N₂-fixing phytoplankton, changes in temperature and sunshine duration accounted for at least 43% of the trend in sediment nitrogen enrichment over the past 50 years. Regression analysis of the climatic factors on nitrogen sources showed that the contributions of precipitation, temperature, and sunshine duration to the variations in sediment nitrogen sources ranged from 18.5% to 60.3%. The study demonstrates that the mixing model provides a robust method for calculating the contributions of multiple nitrogen sources in sediment, and it also suggests that N₂-fixing phytoplankton could be regarded as an important response factor for assessing the impacts of climate change on nitrogen enrichment. - Highlights: • A mixing model was built to analyze sediment N sources of lakes/reservoirs. • Fertilizer/soil and macrophytes showed decreasing trends during the past two decades.

  5. Analyzing the contribution of climate change to long-term variations in sediment nitrogen sources for reservoirs/lakes

    International Nuclear Information System (INIS)

    Xia, Xinghui; Wu, Qiong; Zhu, Baotong; Zhao, Pujun; Zhang, Shangwei; Yang, Lingyan

    2015-01-01

    We applied a mixing model based on stable isotopic δ¹³C, δ¹⁵N, and C:N ratios to estimate the contributions of multiple sources to sediment nitrogen. We also developed a conceptual model describing and analyzing the impacts of climate change on nitrogen enrichment. These two models were conducted in Miyun Reservoir to analyze the contribution of climate change to the variations in sediment nitrogen sources based on two ²¹⁰Pb- and ¹³⁷Cs-dated sediment cores. The results showed that during the past 50 years, average contributions of soil and fertilizer, submerged macrophytes, N₂-fixing phytoplankton, and non-N₂-fixing phytoplankton were 40.7%, 40.3%, 11.8%, and 7.2%, respectively. In addition, total nitrogen (TN) contents in sediment showed significant increasing trends from 1960 to 2010, and sediment nitrogen of both submerged macrophytes and phytoplankton sources exhibited significant increasing trends during the past 50 years. In contrast, soil and fertilizer sources showed a significant decreasing trend from 1990 to 2010. According to the changing trend of N₂-fixing phytoplankton, changes of temperature and sunshine duration accounted for at least 43% of the trend in the sediment nitrogen enrichment over the past 50 years. Regression analysis of the climatic factors on nitrogen sources showed that the contributions of precipitation, temperature, and sunshine duration to the variations in sediment nitrogen sources ranged from 18.5% to 60.3%. The study demonstrates that the mixing model provides a robust method for calculating the contribution of multiple nitrogen sources in sediment, and this study also suggests that N₂-fixing phytoplankton could be regarded as an important response factor for assessing the impacts of climate change on nitrogen enrichment. - Highlights: • A mixing model was built to analyze sediment N sources of lakes/reservoirs. • Fertilizer/soil and macrophytes showed decreasing trends during the past two decades.
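
    The core of such an isotope mixing model is a small linear system: the fractional contributions of the sources must reproduce the sediment's tracer signatures and sum to one. The sketch below solves a four-source, three-tracer system; all end-member and mixture values are hypothetical placeholders, not the paper's measured signatures, and real studies additionally constrain fractions to be non-negative (e.g. with Bayesian mixing models).

```python
import numpy as np

# Hypothetical end-member signatures: (d13C, d15N, C:N) per source.
sources = {
    "soil_fertilizer": (-25.0, 4.0, 12.0),
    "macrophytes":     (-14.0, 8.0, 18.0),
    "n2_fixers":       (-22.0, 0.5,  7.0),
    "non_n2_fixers":   (-30.0, 9.0,  6.5),
}
mixture = (-22.0, 5.5, 12.5)  # hypothetical bulk sediment signature

# Three tracer-balance rows plus one mass-balance row (fractions sum to 1).
A = np.vstack([np.array(list(sources.values())).T,
               np.ones(len(sources))])
b = np.array([*mixture, 1.0])

# Least-squares solve; for a square, well-conditioned system this is exact.
f, *_ = np.linalg.lstsq(A, b, rcond=None)
contributions = dict(zip(sources, f.round(3)))
```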

  6. BIM Open Source Software (OSS) for the documentation of cultural heritage

    Directory of Open Access Journals (Sweden)

    Sotiris Logothetis

    2016-11-01

    Full Text Available This paper presents a review of recent research on the topic, focusing on the open-source software (OSS) that can be used at various stages of the BIM process in the digital documentation of cultural heritage. The results show that commercial software is still preferred, because the available OSS is not yet mature and does not cover all stages of the BIM process. Recently, however, tools such as Edificius for architectural BIM design and "BIM Vision", an Industry Foundation Classes (IFC) model viewer, have tried to attract as many users as possible. These tools are free and could well be used for the digital reconstruction of cultural heritage.

  7. Analyzing Korean consumers’ latent preferences for electricity generation sources with a hierarchical Bayesian logit model in a discrete choice experiment

    International Nuclear Information System (INIS)

    Byun, Hyunsuk; Lee, Chul-Yong

    2017-01-01

    Generally, consumers use electricity without considering the source from which it was generated. Since different energy sources exert varying effects on society, it is necessary to analyze consumers’ latent preferences for electricity generation sources. The present study estimates Korean consumers’ marginal utility, and an appropriate generation mix is derived, using a hierarchical Bayesian logit model in a discrete choice experiment. The results show that consumers consider the danger posed by the source of electricity to be the most important factor among the effects of electricity generation sources. Additionally, Korean consumers wish to reduce the contribution of nuclear power from the existing 32% to 11%, and to increase that of renewable energy from the existing 4% to 32%. - Highlights: • We derive an electricity mix reflecting Korean consumers’ latent preferences. • We use the discrete choice experiment and hierarchical Bayesian logit model. • The danger posed by the generation source is the most important attribute. • The consumers wish to increase the renewable energy proportion from 4.3% to 32.8%. • Korea's cost-oriented energy supply policy and consumers’ preference differ markedly.

  8. Analyzing Source Apportioned Methane in Northern California During DISCOVER-AQ-CA Using Airborne Measurements and Model Simulations

    Science.gov (United States)

    Johnson, Matthew S.

    2014-01-01

    This study analyzes source apportioned methane (CH4) emissions and atmospheric concentrations in northern California during the Discover-AQ-CA field campaign using airborne measurement data and model simulations. Source apportioned CH4 emissions from the Emissions Database for Global Atmospheric Research (EDGAR) version 4.2 were applied in the 3-D chemical transport model GEOS-Chem and analyzed using airborne measurements taken as part of the Alpha Jet Atmospheric eXperiment over the San Francisco Bay Area (SFBA) and northern San Joaquin Valley (SJV). During the time period of the Discover-AQ-CA field campaign EDGAR inventory CH4 emissions were ∼5.30 Gg/day (Gg = 1.0 × 10⁹ grams) (equating to ∼1.9 × 10³ Gg/yr) for all of California. According to EDGAR, the SFBA and northern SJV region contributes ∼30% of total emissions from California. Source apportionment analysis during this study shows that CH4 concentrations over this area of northern California are largely influenced by global emissions from wetlands and local/global emissions from gas and oil production and distribution, waste treatment processes, and livestock management. Model simulations, using EDGAR emissions, suggest that the model under-estimates CH4 concentrations in northern California (average normalized mean bias (NMB) = -5% and linear regression slope = 0.25). The largest negative biases in the model were calculated on days when hot spots of local emission sources were measured and atmospheric CH4 concentrations reached values >3.0 parts per million (model NMB = -10%). Sensitivity emission studies conducted during this research suggest that local emissions of CH4 from livestock management processes are likely the primary source of the negative model bias. These results indicate that a variety, and larger quantity, of measurement data needs to be obtained and additional research is necessary to better quantify source apportioned CH4 emissions in California and further the understanding of the physical processes

  9. Analyzing source apportioned methane in northern California during Discover-AQ-CA using airborne measurements and model simulations

    Science.gov (United States)

    Johnson, Matthew S.; Yates, Emma L.; Iraci, Laura T.; Loewenstein, Max; Tadić, Jovan M.; Wecht, Kevin J.; Jeong, Seongeun; Fischer, Marc L.

    2014-12-01

    This study analyzes source apportioned methane (CH4) emissions and atmospheric mixing ratios in northern California during the Discover-AQ-CA field campaign using airborne measurement data and model simulations. Source apportioned CH4 emissions from the Emissions Database for Global Atmospheric Research (EDGAR) version 4.2 were applied in the 3-D chemical transport model GEOS-Chem and analyzed using airborne measurements taken as part of the Alpha Jet Atmospheric eXperiment over the San Francisco Bay Area (SFBA) and northern San Joaquin Valley (SJV). During the time period of the Discover-AQ-CA field campaign EDGAR inventory CH4 emissions were ∼5.30 Gg day⁻¹ (Gg = 1.0 × 10⁹ g) (equating to ∼1.90 × 10³ Gg yr⁻¹) for all of California. According to EDGAR, the SFBA and northern SJV region contributes ∼30% of total CH4 emissions from California. Source apportionment analysis during this study shows that CH4 mixing ratios over this area of northern California are largely influenced by global emissions from wetlands and local/global emissions from gas and oil production and distribution, waste treatment processes, and livestock management. Model simulations, using EDGAR emissions, suggest that the model under-estimates CH4 mixing ratios in northern California (average normalized mean bias (NMB) = -5.2% and linear regression slope = 0.20). The largest negative biases in the model were calculated on days when large amounts of CH4 were measured over local emission sources and atmospheric CH4 mixing ratios reached values >2.5 parts per million. Sensitivity emission studies conducted during this research suggest that local emissions of CH4 from livestock management processes are likely the primary source of the negative model bias. These results indicate that a variety, and larger quantity, of measurement data needs to be obtained and additional research is necessary to better quantify source apportioned CH4 emissions in California.
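
    The normalized mean bias quoted above has a standard definition, NMB = Σ(model − obs) / Σ(obs), usually reported in percent alongside a linear regression slope. A small sketch with made-up CH4 mixing ratios (not the campaign data):

```python
import numpy as np

# Illustrative model-vs-observation pairs (ppm); values are synthetic.
model = np.array([1.80, 1.85, 1.90, 2.00])
obs   = np.array([1.90, 1.95, 2.05, 2.10])

# NMB = sum(model - obs) / sum(obs), expressed in percent.
nmb = (model - obs).sum() / obs.sum() * 100

# Slope of the model-vs-observation regression line.
slope = np.polyfit(obs, model, 1)[0]
print(f"NMB = {nmb:.1f}%")  # prints: NMB = -5.6%
```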

  10. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  11. Using ¹³⁷Cs and ²¹⁰Pbex and other sediment source fingerprints to document suspended sediment sources in small forested catchments in south-central Chile

    International Nuclear Information System (INIS)

    Schuller, P.; Walling, D.E.; Iroumé, A.; Quilodrán, C.; Castillo, A.; Navas, A.

    2013-01-01

    A study of the impact of forest harvesting operations on sediment mobilization from forested catchments has been undertaken in south-central Chile. The study focused on two sets of small paired catchments (treatment and control), with similar soil type, but contrasting mean annual rainfall, located about 400 km apart at Nacimiento (1200 mm yr⁻¹) and Los Ulmos (2500 mm yr⁻¹). The objective was to study the changes in the relative contribution of the primary sources of fine sediment caused by forestry operations. Attention focused on the pre-harvest and post-harvest periods and the post-replanting period was included for the Nacimiento treatment catchment. The sediment source fingerprinting technique was used to document the contributions of the potential sources. Emphasis was placed on discriminating between the forest slopes, forest roads and channel erosion as potential sources of fine sediment and on assessing the relative contributions of these three sources to the sediment yield from the catchments. The fallout radionuclides (FRNs) ¹³⁷Cs and excess lead-210, the environmental radionuclides ²²⁶Ra and ⁴⁰K and soil organic matter (SOM) were tested as possible fingerprints for discriminating between potential sediment sources. The Kruskal–Wallis test and discriminant function analysis were used to guide the selection of the optimum fingerprint set for each catchment and observation period. Either one or both of the FRNs were selected for inclusion in the optimum fingerprint for all datasets. The relative contribution of each sediment source to the target sediment load was estimated using the selected fingerprint properties, and a mixing model coupled with a Monte Carlo simulation technique that takes account of uncertainty in characterizing sediment source properties. The goodness of fit of the mixing model was tested by comparing the measured and simulated fingerprint properties for the target sediment samples. In the Nacimiento treatment catchment

  12. Implementation of inter-unit analysis for C and C++ languages in a source-based static code analyzer

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available The proliferation of automated testing capabilities creates a need for thorough testing of large software systems, including their inter-component interfaces. The objective of this research is to build a method for inter-procedural, inter-unit analysis that allows us to analyze large and complex software systems, including multi-architecture projects (like Android OS), and to support projects with complex build systems. Since the chosen Clang Static Analyzer uses source code directly as input, we needed to develop a special technique to enable inter-unit analysis for such an analyzer. The problem is peculiar to C and C++, whose language features assume and encourage the separate compilation of project files. We describe the build and analysis system implemented around Clang Static Analyzer to enable inter-unit analysis, and we consider the problems related to supporting complex projects. We also consider the task of merging the abstract syntax trees of translation units and its related problems, such as handling conflicting definitions and supporting complex build systems and complex projects, including multi-architecture projects, with examples. We consider both issues related to language design and human mistakes (which may be intentional). We describe some heuristics used in this work to make the merging process faster. The developed system was tested on Android OS to show that it is applicable even to such complicated projects. The system does not depend on the inter-procedural analysis method and allows its algorithm to be changed arbitrarily.

  13. Market Analysis and Consumer Impacts Source Document. Part II. Review of Motor Vehicle Market and Consumer Expenditures on Motor Vehicle Transportation

    Science.gov (United States)

    1980-12-01

    This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part II consists of studies and review on: motor vehicle sales trends; motor vehicle fleet life and fleet composition; car buying patterns of the busi...

  14. Documentation of Source Code.

    Science.gov (United States)

    1988-05-12

    the "load IC" menu option. A prompt will appear in the typescript window requesting the name of the knowledge base to be loaded. Enter...highlighted and then a prompt will appear in the typescript window. The prompt will be requesting the name of the file containing the message to be read in...the file name, the system will begin reading in the message. The listified message is echoed back in the typescript window. After that, the screen

  15. LOW COST ANALYZER FOR THE DETERMINATION OF PHOSPHORUS BASED ON OPEN-SOURCE HARDWARE AND PULSED FLOWS

    Directory of Open Access Journals (Sweden)

    Pablo González

    2016-04-01

    Full Text Available The need for automated analyzers for industrial and environmental samples has triggered research into new and cost-effective strategies for the automation and control of analytical systems. The widespread availability of open-source hardware, together with novel analytical methods based on pulsed flows, has opened the possibility of implementing standalone automated analytical systems at low cost. Among the areas that can benefit from this approach are the analysis of industrial products and effluents and environmental analysis. In this work, a multi-pumping flow system is proposed for the determination of phosphorus in effluents and polluted water samples. The system employs photometric detection based on the formation of molybdovanadophosphoric acid, and the fluidic circuit is built using three solenoid micropumps. The detection is implemented with a low-cost LED-photodiode photometric detection system, and the whole system is controlled by an open-source Arduino Uno microcontroller board. The optimization of the timing to ensure color development and of the pumping cycle is discussed for the proposed implementation. Experimental results evaluating the system's behavior verify a linear relationship between the relative absorbance and phosphorus concentration for levels as high as 50 mg L⁻¹.
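
    The linear absorbance-concentration relationship the authors exploit is the Beer-Lambert law: A = log10(I0/I) is proportional to concentration over the working range. The sketch below fits a calibration line from synthetic detector readings (the intensities, blank value, and slope are invented, not the paper's data) and inverts it for an unknown sample.

```python
import numpy as np

# Synthetic calibration standards and LED-photodiode readings.
conc = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])  # mg/L
i0 = 1000.0                                            # blank intensity
intensity = i0 * 10 ** (-0.012 * conc)                 # assumed detector response

# Beer-Lambert: absorbance is linear in concentration.
absorbance = np.log10(i0 / intensity)
slope, intercept = np.polyfit(conc, absorbance, 1)

def to_concentration(a):
    """Convert a measured absorbance back to mg/L via the calibration line."""
    return (a - intercept) / slope

# An unknown sample reading of A = 0.30 maps back to 25 mg/L here.
print(round(to_concentration(0.30), 1))
```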

  16. An open-source framework for analyzing N-electron dynamics. II. Hybrid density functional theory/configuration interaction methodology.

    Science.gov (United States)

    Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe

    2017-10-30

    In this contribution, we extend our framework for analyzing and visualizing correlated many-electron dynamics to non-variational, highly scalable electronic structure methods. Specifically, an explicitly time-dependent electronic wave packet is written as a linear combination of N-electron wave functions at the configuration interaction singles (CIS) level, which are obtained from a reference time-dependent density functional theory (TDDFT) calculation. The procedure is implemented in the open-source Python program detCI@ORBKIT, which extends the capabilities of our recently published post-processing toolbox (Hermann et al., J. Comput. Chem. 2016, 37, 1511). From the output of standard quantum chemistry packages using atom-centered Gaussian-type basis functions, the framework exploits the multideterminantal structure of the hybrid TDDFT/CIS wave packet to compute fundamental one-electron quantities such as difference electronic densities, transient electronic flux densities, and transition dipole moments. The hybrid scheme is benchmarked against wave function data for the laser-driven state-selective excitation in LiH. It is shown that all features of the electron dynamics are in good quantitative agreement with the higher-level method, provided a judicious choice of functional is made. Broadband excitation of a medium-sized organic chromophore further demonstrates the scalability of the method. In addition, the time-dependent flux densities unravel the mechanistic details of the simulated charge migration process at a glance. © 2017 Wiley Periodicals, Inc.

  17. The systematic profiling of false identity documents: method validation and performance evaluation using seizures known to originate from common and different sources.

    Science.gov (United States)

    Baechler, Simon; Terrasse, Vincent; Pujol, Jean-Philippe; Fritz, Thibaud; Ribaux, Olivier; Margot, Pierre

    2013-10-10

    False identity documents constitute a potential powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and dismantling of counterfeit factories). Intra-source and inter-sources variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
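
    The two evaluation approaches described (binary classification error rates and likelihood-ratio style comparison) can be illustrated on similarity scores. Below, synthetic scores stand in for the 7500+ computed scores: pairs of documents known to share a source ("linked") score high, pairs from different sources ("unlinked") score low, and a decision threshold yields the type I and type II error rates. Distributions and the threshold are illustrative assumptions.

```python
import numpy as np

# Synthetic similarity scores for known same-source and different-source
# document pairs (placeholders for the study's intra-/inter-source scores).
rng = np.random.default_rng(7)
linked = rng.normal(0.85, 0.05, 500)     # same-source pairs score high
unlinked = rng.normal(0.40, 0.10, 500)   # different-source pairs score low

threshold = 0.65
false_negatives = (linked < threshold).mean()     # missed links (type II)
false_positives = (unlinked >= threshold).mean()  # spurious links (type I)
```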

  18. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 2 (Redacted)

    National Research Council Canada - National Science Library

    Woods, Kevin M

    2007-01-01

    Captured Iraqi documents have uncovered evidence that links the regime of Saddam Hussein to regional and global terrorism, including a variety of revolutionary, liberation, nationalist, and Islamic...

  19. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 5 (Redacted)

    National Research Council Canada - National Science Library

    Woods, Kevin M

    2007-01-01

    Captured Iraqi documents have uncovered evidence that links the regime of Saddam Hussein to regional and global terrorism, including a variety of revolutionary, liberation, nationalist, and Islamic...

  20. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 3 (Redacted)

    National Research Council Canada - National Science Library

    Woods, Kevin M

    2007-01-01

    Captured Iraqi documents have uncovered evidence that links the regime of Saddam Hussein to regional and global terrorism, including a variety of revolutionary, liberation, nationalist, and Islamic...

  1. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 4 (Redacted)

    National Research Council Canada - National Science Library

    Woods, Kevin M

    2007-01-01

    Captured Iraqi documents have uncovered evidence that links the regime of Saddam Hussein to regional and global terrorism, including a variety of revolutionary, liberation, nationalist, and Islamic...

  2. Generalizing Source Geometry of Site Contamination by Simulating and Analyzing Analytical Solution of Three-Dimensional Solute Transport Model

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2014-01-01

    Full Text Available Owing to the uneven distribution of pollutants and the blurred edges of polluted areas, there is uncertainty in the source-term shape in advection-diffusion models of contaminant transport. How to generalize such irregular source terms and deal with the associated uncertainty is critical, but has rarely been studied. In this study, the fate and transport of contaminant from rectangular and elliptic source geometries were simulated with a three-dimensional analytical solute transport model, and a source geometry generalization guideline was developed by comparing the resulting contaminant migration. The results indicate that the size of the source area has no effect on plume migration once the plume has migrated as far as five times the source side length. Plume migration becomes slower as aquifer thickness increases. The contaminant concentration decreases as the scale factor rises, and the differences among various scale factors become smaller with increasing distance from the source area.
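
    The analytical machinery such models build on can be illustrated with the classical instantaneous point-source solution of the 3-D advection-dispersion equation in a uniform flow, a simpler building block than the paper's rectangular and elliptic source solutions. All parameter values below are illustrative, not those of the cited study.

```python
import numpy as np

def point_source_conc(x, y, z, t, M=1.0, v=0.5, Dx=1.0, Dy=0.1, Dz=0.01):
    """Concentration from an instantaneous point source of mass M released
    at the origin at t=0, advected along x at velocity v with anisotropic
    dispersion coefficients Dx, Dy, Dz (infinite-domain solution)."""
    coef = M / (8 * (np.pi * t) ** 1.5 * np.sqrt(Dx * Dy * Dz))
    return coef * np.exp(-(x - v * t) ** 2 / (4 * Dx * t)
                         - y ** 2 / (4 * Dy * t)
                         - z ** 2 / (4 * Dz * t))

# The plume center travels with the flow: at t = 10 it sits at x = v*t = 5,
# where the concentration exceeds that at the release point.
center = point_source_conc(5.0, 0.0, 0.0, 10.0)
origin = point_source_conc(0.0, 0.0, 0.0, 10.0)
```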

  3. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

Full Text Available In this paper, we study simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on this model, performance measures are analyzed under different output service schemes.
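
A multiplexer fed by Bernoulli sources can be mimicked with a slotted-time simulation (a minimal sketch under assumed parameters; the abstract does not specify the actual service schemes analyzed):

```python
import random

def simulate_mux(n_sources=10, p=0.08, buffer_size=20, slots=10000, seed=42):
    """Slotted-time simulation of an ATM-like multiplexer: each slot,
    every Bernoulli source emits a cell with probability p; one cell is
    served per slot; arrivals beyond the buffer are lost. Returns the
    fraction of arriving cells that were lost."""
    rng = random.Random(seed)
    queue = lost = total = 0
    for _ in range(slots):
        arrivals = sum(rng.random() < p for _ in range(n_sources))
        total += arrivals
        queue += arrivals
        if queue > buffer_size:          # overflow: drop excess cells
            lost += queue - buffer_size
            queue = buffer_size
        if queue:
            queue -= 1                   # serve one cell per slot
    return lost / total if total else 0.0
```

With offered load n_sources*p below one cell per slot, losses stay rare; pushing the load past the service rate drives the loss fraction up, the basic tradeoff such performance models quantify.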

  4. An Open-Source Strategy for Documenting Events: The Case Study of the 42nd Canadian Federal Election on Twitter

    Directory of Open Access Journals (Sweden)

    Nick Ruest

    2016-04-01

Full Text Available This article examines the tools, approaches, collaboration, and findings of the Web Archives for Historical Research Group around the capture and analysis of about 4 million tweets during the 2015 Canadian Federal Election. We hope that national libraries and other heritage institutions will find our model useful as they consider how to capture, preserve, and analyze ongoing events using Twitter. While Twitter is not a representative sample of broader society - Pew research shows in their study of US users that it skews young, college-educated, and affluent (above $50,000 household income) - Twitter still represents an exponential increase in the amount of information generated, retained, and preserved from 'everyday' people. Therefore, when historians study the 2015 federal election, Twitter will be a prime source. On August 3, 2015, the team initiated both a Search API and Stream API collection with twarc, a tool developed by Ed Summers, using the hashtag #elxn42. The hashtag referred to the election being Canada's 42nd general federal election (hence 'election 42' or elxn42). Data collection ceased on November 5, 2015, the day after Justin Trudeau was sworn in as the 42nd Prime Minister of Canada. We collected for a total of 102 days, 13 hours and 50 minutes. To analyze the data set, we took advantage of a number of command line tools, utilities that are available within twarc, twarc-report, and jq. In accordance with the Twitter Developer Agreement & Policy, and after ethical deliberations discussed below, we made the tweet IDs and other derivative data available in a data repository. This allows other people to use our dataset, cite our dataset, and enhance their own research projects by drawing on #elxn42 tweets. Our analytics included: breaking tweet text down by day to track change over time; client analysis, allowing us to see how the scale of mobile devices affected medium interactions; URL analysis, comparing both to Archive
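
The day-by-day breakdown mentioned above can be reproduced from a twarc-style JSONL dump using only the standard library (a sketch; it assumes the classic Twitter v1.1 `created_at` timestamp format, not the group's actual scripts):

```python
import json
from collections import Counter
from datetime import datetime

def tweets_per_day(jsonl_lines):
    """Count tweets per calendar day (UTC) from an iterable of JSONL
    records, each carrying a v1.1-style 'created_at' field such as
    'Mon Aug 03 10:00:00 +0000 2015'."""
    counts = Counter()
    for line in jsonl_lines:
        tweet = json.loads(line)
        ts = datetime.strptime(tweet["created_at"], "%a %b %d %H:%M:%S %z %Y")
        counts[ts.date().isoformat()] += 1
    return counts
```

In practice one would stream the #elxn42 file line by line rather than loading millions of tweets into memory, which is why the function accepts any iterable of lines.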

  5. Exploring information-seeking processes by business: analyzing source and channel choices in business-to-government service interactions

    NARCIS (Netherlands)

    van den Boer, Yvon; Pieterson, Willem Jan; van Dijk, Johannes A.G.M.; Arendsen, R.

    2016-01-01

With the rise of electronic channels it has become easier for businesses to consult various types of information sources in information-seeking processes. Governments are urged to rethink their role as a reliable information source and the roles of their (electronic) service channels to provide

  6. Student-Centered Pedagogy and Real-World Research: Using Documents as Sources of Data in Teaching Social Science Skills and Methods

    Science.gov (United States)

    Peyrefitte, Magali; Lazar, Gillian

    2018-01-01

    This teaching note describes the design and implementation of an activity in a 90-minute teaching session that was developed to introduce a diverse cohort of first-year criminology and sociology students to the use of documents as sources of data. This approach was contextualized in real-world research through scaffolded, student-centered tasks…

  7. Effects of Federal Regulation on the Financial Structure and Performance of the Domestic Motor Vehicle Manufacturers (Source Document)

    Science.gov (United States)

    1978-11-01

PURPOSE OF THE STUDY: The increasing government regulation of automotive transportation industries in the United States has produced the need for financial and economic studies of the effects of such policies. The purpose of this document is t...

  8. Modeling and analyzing flow of third grade nanofluid due to rotating stretchable disk with chemical reaction and heat source

    Science.gov (United States)

    Hayat, T.; Ahmad, Salman; Khan, M. Ijaz; Alsaedi, A.

    2018-05-01

This article addresses the flow of third grade nanofluid due to a stretchable rotating disk. Mass and heat transport are analyzed through thermophoresis and Brownian-movement effects. The effects of heat generation and chemical reaction are also accounted for. The resulting ODEs are solved computationally by means of the homotopy analysis method. Graphical outcomes are analyzed for the effects of different variables. The results show that velocity is reduced by the Reynolds number and the material parameters. Temperature and concentration increase with Brownian motion and decrease with the Reynolds number.

  9. A Dual Source Ion Trap Mass Spectrometer for the Mars Organic Molecule Analyzer of ExoMars 2018

    Science.gov (United States)

Brinckerhoff, William B.; van Amerom, F. H. W.; Danell, R. M.; Arevalo, R.; Atanassova, M.; Hovmand, L.; Mahaffy, P. R.; Cotter, R. J.

    2011-01-01

    We present details on the objectives, requirements, design and operational approach of the core mass spectrometer of the Mars Organic Molecule Analyzer (MOMA) investigation on the 2018 ExoMars mission. The MOMA mass spectrometer enables the investigation to fulfill its objective of analyzing the chemical composition of organic compounds in solid samples obtained from the near surface of Mars. Two methods of ionization are realized, associated with different modes of MOMA operation, in a single compact ion trap mass spectrometer. The stringent mass and power constraints of the mission have led to features such as low voltage and low frequency RF operation [1] and pulse counting detection.

  10. A method to analyze “source–sink” structure of non-point source pollution based on remote sensing technology

    International Nuclear Information System (INIS)

    Jiang, Mengzhen; Chen, Haiying; Chen, Qinghui

    2013-01-01

With the purpose of providing a scientific basis for environmental planning for non-point source pollution prevention and control, and of improving pollution regulation efficiency, this paper established the Grid Landscape Contrast Index based on the Location-weighted Landscape Contrast Index, following the “source–sink” theory. The spatial distribution of non-point source pollution in the Jiulongjiang Estuary was worked out using high-resolution remote sensing images. The results showed that the “source” area for nitrogen and phosphorus in the Jiulongjiang Estuary was 534.42 km² in 2008, and the “sink” area was 172.06 km². The “source” of non-point source pollution was distributed mainly over Xiamen island, most of Haicang, the east of Jiaomei and the river banks of Gangwei and Shima; the “sink” was distributed over the southwest of Xiamen island and the west of Shima. Generally speaking, the intensity of the “source” weakens as the distance from the sea boundary increases, while the “sink” strengthens. -- Highlights: •We built an index to study the “source–sink” structure of NSP at a spatial scale. •The index was applied in the Jiulongjiang estuary and gave good results. •The study helps discern high-load areas of non-point source pollution. -- The “source–sink” structure of non-point source nitrogen and phosphorus pollution in the Jiulongjiang estuary in China was worked out with the Grid Landscape Contrast Index
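
The abstract does not give the formula for the Grid Landscape Contrast Index; a toy distance-weighted variant nevertheless conveys the "source-sink" idea (the landscape-type weights and the attenuation form below are illustrative assumptions, not the paper's index):

```python
import math

def grid_contrast_index(grid, weights, outlet, cell_size=1.0):
    """Toy location-weighted 'source-sink' index on a raster grid.
    grid maps (row, col) cells to landscape types; weights maps types
    to a signed contribution (positive = source, negative = sink);
    contributions are attenuated by distance to the receiving water
    body ('outlet'), mirroring the idea that nearer sources load more."""
    ox, oy = outlet
    index = {}
    for (i, j), land_type in grid.items():
        d = math.hypot(i - ox, j - oy) * cell_size
        index[(i, j)] = weights[land_type] / (1.0 + d)
    return index
```

Summing positive and negative cell values separately would give grid-level "source" and "sink" totals analogous to the areas reported for the estuary.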

  11. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    Science.gov (United States)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application that was developed by the Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application made use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damages to infrastructures. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or the "Geo-SAFER Mindanao" Program.

  12. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
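
The G-theory idea of reliability as a ratio of variance components can be sketched in its simplest single-facet form (a textbook simplification for illustration; the ERA Toolbox itself is Matlab software that handles far richer, unbalanced designs):

```python
def dependability(var_person, var_error, n_trials):
    """Single-facet G-theory dependability coefficient: true-score
    (person) variance over observed-score variance when n_trials
    observations are averaged per person. Error variance shrinks as
    1/n_trials, so retaining more trials raises score reliability."""
    if n_trials <= 0:
        raise ValueError("n_trials must be positive")
    return var_person / (var_person + var_error / n_trials)
```

This captures the toolbox's central message: the number of trials retained for ERP averaging directly controls how much error variance dilutes the score.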

  13. Investigations of the inductively coupled plasma source for analyzing NURE water samples at the Los Alamos Scientific Laboratory

    International Nuclear Information System (INIS)

    Apel, C.T.; Bieniewski, T.M.; Cox, L.E.; Steinhaus, D.W.

    1977-03-01

    A 3.4-meter direct-reading spectrograph is being used with an inductively coupled plasma source for the simultaneous determination of Ag, Bi, Cd, Cu, Nb, Ni, Pb, Sn, and W in water samples. We have attached a small digital computer to the system in order to obtain intensity data on each element once a second. After the intensities during a run on a sample have stabilized, the computer records the intensity data and outputs the average concentration for each element. To approach the published detection limits, a peristaltic pump must be used to force the water sample into the usual cross-flow nebulizer. We have studied several different nebulizer designs with the goal of improving efficiency and hence sensitivity. One design, the fritted-disk nebulizer, has an efficiency over 60 percent, as compared with the 5 percent efficiency of the original nebulizer

  14. Directed cortical information flow during human object recognition: analyzing induced EEG gamma-band responses in brain's source space.

    Directory of Open Access Journals (Sweden)

    Gernot G Supp

Full Text Available The increase of induced gamma-band responses (iGBRs; oscillations >30 Hz) elicited by familiar (meaningful) objects is well established in electroencephalogram (EEG) research. This frequency-specific change at distinct locations is thought to indicate the dynamic formation of local neuronal assemblies during the activation of cortical object representations. Since a power increase is, analytically, a property of a single location, phase synchrony was introduced to investigate the formation of large-scale networks between spatially distant brain sites. However, classical phase synchrony reveals symmetric, pair-wise correlations and is not suited to uncovering the directionality of interactions. Here, we investigated the neural mechanism of visual object processing by means of directional coupling analysis going beyond recording sites, assessing the directionality of oscillatory interactions between brain areas directly. This study is the first to identify the directionality of oscillatory brain interactions in source space during human object recognition and suggests that familiar, but not unfamiliar, objects engage widespread reciprocal information flow. Directionality of cortical information flow was calculated with an established Granger-causality coupling measure (partial directed coherence; PDC) using autoregressive modeling. To enable comparison with previous coupling studies lacking directional information, phase-locking analysis was applied, using wavelet-based signal decompositions. Both autoregressive modeling and wavelet analysis revealed an augmentation of iGBRs during the presentation of familiar objects relative to unfamiliar controls, which was localized to inferior-temporal, superior-parietal and frontal brain areas by means of distributed source reconstruction. The multivariate analysis of PDC evaluated each possible direction of brain interaction and revealed widespread reciprocal information-transfer during familiar
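
Partial directed coherence can be computed directly from the coefficient matrices of a fitted multivariate autoregressive (MVAR) model; a stdlib sketch of that step (illustrative of the measure, not the study's actual pipeline):

```python
import cmath

def pdc(ar_coeffs, f):
    """Partial directed coherence at normalized frequency f (0..0.5)
    given MVAR coefficient matrices A_1..A_K (nested lists, m x m).
    Returns an m x m matrix where out[i][j] quantifies the directed
    influence of channel j on channel i, column-normalized so that
    sum_i out[i][j]**2 == 1."""
    m = len(ar_coeffs[0])
    # A_bar(f) = I - sum_k A_k * exp(-2*pi*1j*f*k)
    abar = [[(1.0 if i == j else 0.0) + 0j for j in range(m)] for i in range(m)]
    for k, A in enumerate(ar_coeffs, start=1):
        phase = cmath.exp(-2j * cmath.pi * f * k)
        for i in range(m):
            for j in range(m):
                abar[i][j] -= A[i][j] * phase
    out = [[0.0] * m for _ in range(m)]
    for j in range(m):
        norm = sum(abs(abar[k][j]) ** 2 for k in range(m)) ** 0.5
        for i in range(m):
            out[i][j] = abs(abar[i][j]) / norm
    return out
```

Unlike phase synchrony, the resulting matrix is asymmetric: out[1][0] and out[0][1] differ whenever the coupling is directional, which is exactly what the study exploits.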

  15. pROC: an open-source package for R and S+ to analyze and compare ROC curves.

    Science.gov (United States)

    Robin, Xavier; Turck, Natacha; Hainard, Alexandre; Tiberti, Natalia; Lisacek, Frédérique; Sanchez, Jean-Charles; Müller, Markus

    2011-03-17

    Receiver operating characteristic (ROC) curves are useful tools to evaluate classifiers in biomedical and bioinformatics applications. However, conclusions are often reached through inconsistent use or insufficient statistical analysis. To support researchers in their ROC curves analysis we developed pROC, a package for R and S+ that contains a set of tools displaying, analyzing, smoothing and comparing ROC curves in a user-friendly, object-oriented and flexible interface. With data previously imported into the R or S+ environment, the pROC package builds ROC curves and includes functions for computing confidence intervals, statistical tests for comparing total or partial area under the curve or the operating points of different classifiers, and methods for smoothing ROC curves. Intermediary and final results are visualised in user-friendly interfaces. A case study based on published clinical and biomarker data shows how to perform a typical ROC analysis with pROC. pROC is a package for R and S+ specifically dedicated to ROC analysis. It proposes multiple statistical tests to compare ROC curves, and in particular partial areas under the curve, allowing proper ROC interpretation. pROC is available in two versions: in the R programming language or with a graphical user interface in the S+ statistical software. It is accessible at http://expasy.org/tools/pROC/ under the GNU General Public License. It is also distributed through the CRAN and CSAN public repositories, facilitating its installation.
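
The central quantity pROC manipulates, the area under the ROC curve, can be computed from first principles via the Mann-Whitney statistic; a stdlib Python sketch (pROC itself is an R/S+ package, so this illustrates the concept rather than its API):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve as the probability that a randomly
    chosen positive case scores above a randomly chosen negative case,
    counting ties as one half (the Mann-Whitney U formulation)."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

Comparison tests such as those in pROC then ask whether the AUCs (or partial AUCs) of two classifiers differ by more than sampling noise allows.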

  16. Validation of chlorine and oxygen isotope ratio analysis to differentiate perchlorate sources and to document perchlorate biodegradation

    Science.gov (United States)

    Paul B. Hatzinger,; Böhlke, John Karl; Sturchio, Neil C.; Gu, Baohua

    2013-01-01

    Increased health concerns about perchlorate (ClO4-) during the past decade and subsequent regulatory considerations have generated appreciable interest in source identification. The key objective of the isotopic techniques described in this guidance manual is to provide evidence concerning the origin of ClO4- in soils and groundwater and, more specifically, whether that ClO4- is synthetic or natural. Chlorine and oxygen isotopic analyses of ClO4- provide the primary direct approach whereby different sources of ClO4- can be distinguished from each other. These techniques measure the relative abundances of the stable isotopes of chlorine (37Cl and 35Cl) and oxygen (18O, 17O, and 16O) in ClO4- using isotope-ratio mass spectrometry (IRMS). In addition, the relative abundance of the radioactive chlorine isotope 36Cl is measured using accelerator mass spectrometry (AMS). Taken together, these measurements provide four independent quantities that can be used to distinguish natural and synthetic ClO4- sources, to discriminate different types of natural ClO4-, and to detect ClO4- biodegradation in the environment. Other isotopic, chemical, and geochemical techniques that can be applied in conjunction with isotopic analyses of ClO4- to provide supporting data in forensic studies are also described.
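
Isotope-ratio measurements like these are conventionally reported in delta notation relative to a standard; a minimal helper showing the generic formula (not tied to the manual's specific standards or procedures):

```python
def delta_permil(r_sample, r_standard):
    """Stable-isotope delta value in per mil (parts per thousand):
    the relative deviation of a sample isotope ratio (e.g. 37Cl/35Cl
    or 18O/16O) from a reference-standard ratio, scaled by 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

Plotting delta-37Cl against delta-18O (plus 17O excess and 36Cl abundance) is what lets natural and synthetic perchlorate sources separate into distinct fields.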

  17. Applying Satellite Data Sources in the Documentation and Landscape Modelling for Graeco-Roman Fortified Sites in the TŪR Abdin Area, Eastern Turkey

    Science.gov (United States)

    Silver, K.; Silver, M.; Törmä, M.; Okkonen, J.; Okkonen, T.

    2017-08-01

In 2015-2016 the Finnish-Swedish Archaeological Project in Mesopotamia (FSAPM) initiated a pilot study of an unexplored area in the Tūr Abdin region in Northern Mesopotamia (present-day Mardin Province in southeastern Turkey). FSAPM is reliant on satellite image data sources for prospecting, identifying, recording, and mapping largely unknown archaeological sites as well as studying their landscapes in the region. The purpose is to record and document sites in this endangered area for saving its cultural heritage. The sites in question consist of fortified architectural remains in an ancient border zone between the Graeco-Roman/Byzantine world and Parthia/Persia. The location of the archaeological sites in the terrain and the visible archaeological remains, as well as their dimensions and sizes, were determined from the orthorectified satellite images, which also provided coordinates. In addition, field documentation was carried out in situ with photographs and notes. The applicability of various satellite data sources for the archaeological documentation of the project was evaluated. Satellite photographs from three 1968 CORONA missions, i.e. the declassified US government satellite photograph archives, were acquired. Furthermore, satellite images included a recent GeoEye-1 Satellite Sensor Image from 2010 with a resolution of 0.5 m. Its applicability for prospecting archaeological sites, studying the terrain and producing landscape models in 3D was confirmed. The GeoEye-1 revealed the ruins of a fortified town and a fortress for their documentation and study. Landscape models for the area of these sites were constructed fusing GeoEye-1 with EU-DEM (European Digital Elevation Model data using SRTM and ASTER GDEM data) in order to understand their locations in the terrain.

  18. APPLYING SATELLITE DATA SOURCES IN THE DOCUMENTATION AND LANDSCAPE MODELLING FOR GRAECO-ROMAN/BYZANTINE FORTIFIED SITES IN THE TŪR ABDIN AREA, EASTERN TURKEY

    Directory of Open Access Journals (Sweden)

    K. Silver

    2017-08-01

Full Text Available In 2015-2016 the Finnish-Swedish Archaeological Project in Mesopotamia (FSAPM) initiated a pilot study of an unexplored area in the Tūr Abdin region in Northern Mesopotamia (present-day Mardin Province in southeastern Turkey). FSAPM is reliant on satellite image data sources for prospecting, identifying, recording, and mapping largely unknown archaeological sites as well as studying their landscapes in the region. The purpose is to record and document sites in this endangered area for saving its cultural heritage. The sites in question consist of fortified architectural remains in an ancient border zone between the Graeco-Roman/Byzantine world and Parthia/Persia. The location of the archaeological sites in the terrain and the visible archaeological remains, as well as their dimensions and sizes, were determined from the orthorectified satellite images, which also provided coordinates. In addition, field documentation was carried out in situ with photographs and notes. The applicability of various satellite data sources for the archaeological documentation of the project was evaluated. Satellite photographs from three 1968 CORONA missions, i.e. the declassified US government satellite photograph archives, were acquired. Furthermore, satellite images included a recent GeoEye-1 Satellite Sensor Image from 2010 with a resolution of 0.5 m. Its applicability for prospecting archaeological sites, studying the terrain and producing landscape models in 3D was confirmed. The GeoEye-1 revealed the ruins of a fortified town and a fortress for their documentation and study. Landscape models for the area of these sites were constructed fusing GeoEye-1 with EU-DEM (European Digital Elevation Model data using SRTM and ASTER GDEM data) in order to understand their locations in the terrain.

  19. Following the Ions through a Mass Spectrometer with Atmospheric Pressure Interface: Simulation of Complete Ion Trajectories from Ion Source to Mass Analyzer.

    Science.gov (United States)

    Zhou, Xiaoyu; Ouyang, Zheng

    2016-07-19

Ion trajectory simulation is an important and useful tool in instrumentation development for mass spectrometry. Accurate simulation of ion motion through a mass spectrometer with an atmospheric pressure ionization source has been extremely challenging, due to the complexity of the gas hydrodynamic flow field across a wide pressure range as well as the computational burden. In this study, we developed a method of generating the gas flow field for an entire mass spectrometer with an atmospheric pressure interface. In combination with the electric force, this enabled, for the first time, simulation of ion trajectories from an atmospheric pressure ion source to a mass analyzer in vacuum. A stage-by-stage ion repopulation method has also been implemented for the simulation, which avoided an intolerable computational burden in high-pressure regions while allowing statistically meaningful results to be obtained for the mass analyzer. The method also proved suitable for identifying a joint point at which to combine the high- and low-pressure fields solved individually. Experimental characterization has also been done to validate the new simulation method. Good agreement was obtained between simulated and experimental results for ion transfer through an atmospheric pressure interface with a curtain gas.

  20. MSiReader: an open-source interface to view and analyze high resolving power MS imaging files on Matlab platform.

    Science.gov (United States)

    Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C

    2013-05-01

During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to the point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature, and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there is an important variety of data file formats used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open-source application to read and analyze high-resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.

  1. Digital photography as source documentation of skin toxicity: an analysis from the Trans Tasman Radiation Oncology Group (TROG) 04.01 Post-Mastectomy Radiation Skin Care trial

    International Nuclear Information System (INIS)

    Graham, Peter H.; Plant, Natalie A.; Graham, Jennifer L.

    2012-01-01

    This study evaluated digital photographs as a method of providing auditable source documentation for radiotherapy-induced skin toxicity and the possibility therefore of centralised, blinded scoring for a multicentre randomised controlled trial. Digital photograph sets from the first five patients from each of 12 participating centres were audited. Minimum camera specifications and photograph requirements were protocol specified. Three readers rated photographs for four key quality items. They also scored skin reactions according to National Cancer Institute Common Terminology Criteria (CTC) v3.0 acute skin score and also for the presence of any moist desquamation. Five hundred fifty-two images were available. Field of view was scored as inadequate in 1–10%, focus inadequate in 0.4–4%, lighting inadequate in 0.2–3% and dividing line marking inadequate for scoring of skin reactions within sectors in 18–23% of photographs by three readers. Reader pairwise inter-observer agreement was 83–88% for CTC acute skin scores, but the kappa value ranged from 0.58 to 0.73. The percentage of image sectors not scored by readers due to difficulty in assessing was 1–10%. Moist desquamation was scored by clinicians in 8 (medial)–13% (lateral) of patients compared with 3–5% and 5–11% by readers. Photo reader inter-observer agreement is only moderate. Photo readers tended to underscore the frequency of moist desquamation, but the trend by sector parallels the clinical scorers. Photographs are useful source documents for auditing and monitoring, but not a replacement for clinical scoring.

  2. Digital photography as source documentation of skin toxicity: an analysis from the Trans Tasman Radiation Oncology Group (TROG) 04.01 post-mastectomy radiation skin care trial.

    Science.gov (United States)

    Graham, Peter H; Plant, Natalie A; Graham, Jennifer Louise; Browne, Lois H; Borg, Martin; Capp, Anne; Delaney, Geoff P; Harvey, Jennifer; Kenny, Lizbeth; Francis, Michael; Zissiadis, Yvonne

    2012-08-01

    This study evaluated digital photographs as a method of providing auditable source documentation for radiotherapy-induced skin toxicity and the possibility therefore of centralised, blinded scoring for a multicentre randomised controlled trial. Digital photograph sets from the first five patients from each of 12 participating centres were audited. Minimum camera specifications and photograph requirements were protocol specified. Three readers rated photographs for four key quality items. They also scored skin reactions according to National Cancer Institute Common Terminology Criteria (CTC) v3.0 acute skin score and also for the presence of any moist desquamation. Five hundred fifty-two images were available. Field of view was scored as inadequate in 1-10%, focus inadequate in 0.4-4%, lighting inadequate in 0.2-3% and dividing line marking inadequate for scoring of skin reactions within sectors in 18-23% of photographs by three readers. Reader pairwise inter-observer agreement was 83-88% for CTC acute skin scores, but the kappa value ranged from 0.58 to 0.73. The percentage of image sectors not scored by readers due to difficulty in assessing was 1-10%. Moist desquamation was scored by clinicians in 8 (medial)-13% (lateral) of patients compared with 3-5% and 5-11% by readers. Photo reader inter-observer agreement is only moderate. Photo readers tended to underscore the frequency of moist desquamation, but the trend by sector parallels the clinical scorers. Photographs are useful source documents for auditing and monitoring, but not a replacement for clinical scoring. © 2012 The Authors. Journal of Medical Imaging and Radiation Oncology © 2012 The Royal Australian and New Zealand College of Radiologists.
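
The pairwise percent agreement and kappa values reported in the trial can be computed as follows (a generic stdlib sketch of Cohen's kappa for two raters, not the trial's actual analysis code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement between two raters corrected
    for the agreement expected by chance from each rater's marginal
    category frequencies. 1 = perfect, 0 = chance-level agreement."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    if p_exp == 1.0:
        return 1.0
    return (p_obs - p_exp) / (1.0 - p_exp)
```

The gap the trial observed between raw agreement (83-88%) and kappa (0.58-0.73) is typical: when one score dominates, chance agreement is high and kappa is pulled down.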

  3. Toward Documentation of Program Evolution

    DEFF Research Database (Denmark)

    Vestdam, Thomas; Nørmark, Kurt

    2005-01-01

The documentation of a program often falls behind the evolution of the program source files. When this happens, it may be attractive to shift the documentation mode from updating the documentation to documenting the evolution of the program. This paper describes tools that support the documentation.... It is concluded that our approach can help revitalize older documentation, and that discovery of the fine-grained program evolution steps helps the programmer in documenting the evolution of the program.

  4. ITER task title - source term data, modelling, and analysis. ITER subtask no. S81TT05/5 (SEP 1-1). Global tritium source term analysis basis document. Subtask 1: operational tritium effluents and releases. Final report (1995 TASK)

    International Nuclear Information System (INIS)

    Kalyanam, K.M.

    1996-06-01

    This document represents the final report for the global tritium source term analysis task initiated in 1995. The report presents a room-by-room map/table at the subsystem level for the ITER tritium systems, identifying the major equipment, secondary containments, tritium release sources, duration/frequency of tritium releases and the release pathways. The chronic tritium releases during normal operation, as well as tritium releases due to routine maintenance of the Water Distillation Unit, Isotope Separation System and Primary and Secondary Heat Transport Systems, have been estimated for most of the subsystems, based on the IDR design, the Design Description Documents (April - Jun 1995 issues) and the design updates up to December 1995. The report also outlines the methodology and the key assumptions that are adopted in preparing the tritium release estimates. The design parameters for the ITER Basic Performance Phase (BPP) have been used in estimating the tritium releases shown in the room-by-room map/table. The tritium release calculations and the room-by-room map/table have been prepared in EXCEL, so that the estimates can be refined easily as the design evolves and more detailed information becomes available. (author). 23 refs., tabs

  5. Near Earth Inner-Source and Interstellar Pickup Ions Observed with the Hot Plasma Composition Analyzer of the Magnetospheric Multiscale Mission Mms-Hpca

    Science.gov (United States)

    Gomez, R. G.; Fuselier, S. A.; Mukherjee, J.; Gonzalez, C. A.

    2017-12-01

Pickup ions found near the Earth are generally picked up in the rest frame of the solar wind and propagate radially outward from their point of origin, gyrating about the magnetic field as they go. Pickup ions come in two general populations: interstellar and inner-source ions. Interstellar ions originate in the interstellar medium, enter the solar system in a neutral charge state, are gravitationally focused on the side of the sun opposite their arrival direction, and are ionized when they travel near the sun. Inner-source ions originate at a location within the solar system, between the sun and the observation point. Both pickup ion populations share similarities in composition and charge states, so measuring their dynamics through their velocity distribution functions, f(v)'s, is essential to distinguishing them and to determining their spatial and temporal origins. Presented here are the results of studies conducted with the four Hot Plasma Composition Analyzers of the Magnetospheric Multiscale Mission (MMS-HPCA). These instruments measure the full-sky (4π steradian) distribution functions of near-Earth plasmas at a 10-second cadence over an energy-to-charge range of 0.001-40 keV/e. The instruments can also parse this combined energy-solid-angle phase space with 22.5° resolution in polar angle and 11.25° in azimuthal angle, allowing clear measurement of the pitch-angle scattering of the ions.

  6. Fugitive Methane Emission Identification and Source Attribution: Ethane-to-Methane Analysis Using a Portable Cavity Ring-Down Spectroscopy Analyzer

    Science.gov (United States)

    Kim-Hak, D.; Fleck, D.

    2017-12-01

    Natural gas analysis, and methane analysis specifically, has become increasingly important by virtue of methane's 28-36x greenhouse warming potential compared to CO2, and because methane accounts for 10% of total greenhouse gas emissions in the US alone. Additionally, large uncontrolled leaks, such as the recent one at Aliso Canyon in Southern California, originating from uncapped wells, storage facilities and coal mines, have increased the total global contribution of methane emissions even further. Determining the specific fingerprint of methane sources by quantifying the ethane-to-methane (C2:C1) ratio provides a means of understanding the processes yielding methane and allows sources of methane to be mapped and classified through these processes, i.e. biogenic or thermogenic, oil- vs. gas- vs. coal gas-related. Here we present data obtained using a portable cavity ring-down spectroscopy analyzer weighing less than 25 lbs and consuming less than 35 W that simultaneously measures methane and ethane in real time with a raw 1-σ precision of plane gas propagation.
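The C2:C1 source-attribution idea described above can be sketched as a simple thresholding step. The ratio cut-offs below are illustrative assumptions, not values from this record; biogenic methane typically carries near-zero ethane, while thermogenic gas carries a few percent:

```python
# Illustrative classifier for methane source attribution from the
# ethane-to-methane (C2:C1) ratio. Thresholds are assumptions for the
# sketch, not calibrated values.

def classify_methane_source(c2_ppb: float, c1_ppb: float) -> str:
    """Return a coarse source label from the C2:C1 mole ratio."""
    if c1_ppb <= 0:
        raise ValueError("methane concentration must be positive")
    ratio = c2_ppb / c1_ppb
    if ratio < 0.01:      # <1% ethane: consistent with biogenic sources
        return "biogenic"
    if ratio < 0.06:      # 1-6% ethane: consistent with thermogenic gas
        return "thermogenic (gas)"
    return "thermogenic (oil-associated)"

print(classify_methane_source(5.0, 2000.0))   # ratio 0.0025
print(classify_methane_source(80.0, 2000.0))  # ratio 0.04
```

In practice the ratio is computed from correlated excursions above background rather than single concentration pairs, but the classification logic is the same.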

  7. Proof of Concept Coded Aperture Miniature Mass Spectrometer Using a Cycloidal Sector Mass Analyzer, a Carbon Nanotube (CNT) Field Emission Electron Ionization Source, and an Array Detector

    Science.gov (United States)

    Amsden, Jason J.; Herr, Philip J.; Landry, David M. W.; Kim, William; Vyas, Raul; Parker, Charles B.; Kirley, Matthew P.; Keil, Adam D.; Gilchrist, Kristin H.; Radauscher, Erich J.; Hall, Stephen D.; Carlson, James B.; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T.; Russell, Zachary E.; Grego, Sonia; Edwards, Steven J.; Sperline, Roger P.; Denton, M. Bonner; Stoner, Brian R.; Gehm, Michael E.; Glass, Jeffrey T.

    2018-02-01

    Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.

  8. Omega documentation

    Energy Technology Data Exchange (ETDEWEB)

    Howerton, R.J.; Dye, R.E.; Giles, P.C.; Kimlinger, J.R.; Perkins, S.T.; Plechaty, E.F.

    1983-08-01

    OMEGA is a CRAY I computer program that controls nine codes used by LLNL Physical Data Group for: 1) updating the libraries of evaluated data maintained by the group (UPDATE); 2) calculating average values of energy deposited in secondary particles and residual nuclei (ENDEP); 3) checking the libraries for internal consistency, especially for energy conservation (GAMCHK); 4) producing listings, indexes and plots of the library data (UTILITY); 5) producing calculational constants such as group averaged cross sections and transfer matrices for diffusion and Sn transport codes (CLYDE); 6) producing and updating standard files of the calculational constants used by LLNL Sn and diffusion transport codes (NDFL); 7) producing calculational constants for Monte Carlo transport codes that use group-averaged cross sections and continuous energy for particles (CTART); 8) producing and updating standard files used by the LLNL Monte Carlo transport codes (TRTL); and 9) producing standard files used by the LANL pointwise Monte Carlo transport code MCNP (MCPOINT). The first four of these functions and codes deal with the libraries of evaluated data and the last five with various aspects of producing calculational constants for use by transport codes. In 1970 a series, called PD memos, of internal and informal memoranda was begun. These were intended to be circulated among the group for comment and then to provide documentation for later reference whenever questions arose about the subject matter of the memos. They have served this purpose and now will be drawn upon as source material for this more comprehensive report that deals with most of the matters covered in those memos.

  9. Omega documentation

    International Nuclear Information System (INIS)

    Howerton, R.J.; Dye, R.E.; Giles, P.C.; Kimlinger, J.R.; Perkins, S.T.; Plechaty, E.F.

    1983-08-01

    OMEGA is a CRAY I computer program that controls nine codes used by LLNL Physical Data Group for: 1) updating the libraries of evaluated data maintained by the group (UPDATE); 2) calculating average values of energy deposited in secondary particles and residual nuclei (ENDEP); 3) checking the libraries for internal consistency, especially for energy conservation (GAMCHK); 4) producing listings, indexes and plots of the library data (UTILITY); 5) producing calculational constants such as group averaged cross sections and transfer matrices for diffusion and Sn transport codes (CLYDE); 6) producing and updating standard files of the calculational constants used by LLNL Sn and diffusion transport codes (NDFL); 7) producing calculational constants for Monte Carlo transport codes that use group-averaged cross sections and continuous energy for particles (CTART); 8) producing and updating standard files used by the LLNL Monte Carlo transport codes (TRTL); and 9) producing standard files used by the LANL pointwise Monte Carlo transport code MCNP (MCPOINT). The first four of these functions and codes deal with the libraries of evaluated data and the last five with various aspects of producing calculational constants for use by transport codes. In 1970 a series, called PD memos, of internal and informal memoranda was begun. These were intended to be circulated among the group for comment and then to provide documentation for later reference whenever questions arose about the subject matter of the memos. They have served this purpose and now will be drawn upon as source material for this more comprehensive report that deals with most of the matters covered in those memos

  10. The HAW project. Test storage of high-level radiation sources in the Asse salt mine. Documentation and assessment of the storage system

    International Nuclear Information System (INIS)

    Mueller, K.; Rothfuchs, T.

    1994-01-01

    The HAW project aimed primarily at studying the interaction between high-level radioactive waste moulds and rock salt as the repository medium. Another priority was the prototype development and testing of a technical system for the emplacement of high-level radioactive moulds in deep storage boreholes. To simulate real high-level radioactive wastes, special high-level radiation sources (Cs-137, Sr-90) were produced in the United States of America under a German-American cooperation contract for carrying out the tests at the Asse salt mine. The components of the storage system are described, and their position and task within the entire handling procedure explained. Questions of radiation protection and accident protection, of functional and operating reliability, of quality assurance, and of the examination of documents, materials, manufacture and functioning, as well as of documentation, are dealt with in detail. With a view to the planning of storage techniques for a mine repository, the experience of development and operation is recorded, and recommendations for further developments are given. Problems which arose during work on the HAW project were partly due to test-specific reasons and will not occur, or not in this form, in a mine repository. It was planned to start the test emplacement in 1987, and it could have been executed in 1993 after appropriate preparation and approval of the storage system by the mining authority and the Hanover TUEV in 1991. In December 1992, however, the Federal Government decided to abandon the project due to the uncertain licensing situation and to immediately stop all preparatory work. (orig./HP) [de]

  11. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high performance, extremely versatile transient analyzer is described. This sub-system was designed to be controlled through the data acquisition computer system which allows hands off operation. Thus it may be placed on the experiment side of the high voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample rate changing, which may be intermixed with multiple post trigger operations with variable length blocks using normal, peak to peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation both at PPPL and in general

  12. Standardization Documents

    Science.gov (United States)

    2011-08-01

    Specifications and Standards; Guide Specifications; CIDs; and NGSs. Federal Specifications; Commercial ... national or international standardization document developed by a private sector association, organization, or technical society that plans ... Maintain lessons learned. Examples: guidance for application of a technology; lists of options ... DEFENSE HANDBOOK

  13. Maury Documentation

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Supporting documentation for the Maury Collection of marine observations. Includes explanations from Maury himself, as well as guides and descriptions by the U.S....

  14. Documentation Service

    International Nuclear Information System (INIS)

    Charnay, J.; Chosson, L.; Croize, M.; Ducloux, A.; Flores, S.; Jarroux, D.; Melka, J.; Morgue, D.; Mottin, C.

    1998-01-01

    This service ensures the processing and dissemination of scientific information and the management of the scientific production of the institute, as well as secretariat support for the groups and services of the institute. The report on the documentation-library section mentions: the management of the documentation holdings; searches in international databases (INIS, Current Contents, Inspec); and the Pret-Inter service, which allows accessing documents through the DEMOCRITE network of IN2P3. Also mentioned as achievements are: the setup of a video and photo database, the Web home page of the institute's library, the continued digitization of the document holdings by integrating CD-ROMs and diskettes, the electronic archiving of the scientific production, etc

  15. Computerising documentation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The nuclear power generation industry is faced with public concern and government pressures over safety, efficiency and risk. Operators throughout the industry are addressing these issues with the aid of a new technology - technical document management systems (TDMS). Used for strategic and tactical advantage, the systems enable users to scan, archive, retrieve, store, edit, distribute worldwide and manage the huge volume of documentation (paper drawings, CAD data and film-based information) generated in building, maintaining and ensuring safety in the UK's power plants. The power generation industry has recognized that the management and modification of operation-critical information is vital to the safety and efficiency of its power plants. Regulatory pressure from the Nuclear Installations Inspectorate (NII) to operate within strict safety margins or lose Site Licences has prompted the need for accurate, up-to-date documentation. A document capture, management and retrieval system provides a powerful, cost-effective solution, giving rapid access to documentation in a tightly controlled environment. The computerisation of documents and plans is discussed in this article. (Author)

  16. DensToolKit: A comprehensive open-source package for analyzing the electron density and its derivative scalar and vector fields

    Science.gov (United States)

    Solano-Altamirano, J. M.; Hernández-Pérez, Julio M.

    2015-11-01

    DensToolKit is a suite of cross-platform, optionally parallelized, programs for analyzing the molecular electron density (ρ) and several fields derived from it. Scalar and vector fields, such as the gradient of the electron density (∇ρ), electron localization function (ELF) and its gradient, localized orbital locator (LOL), region of slow electrons (RoSE), reduced density gradient, localized electrons detector (LED), information entropy, molecular electrostatic potential, and kinetic energy densities K and G, among others, can be evaluated on zero-, one-, two-, and three-dimensional grids. The suite includes a program for searching critical points and bond paths of the electron density, under the framework of the Quantum Theory of Atoms in Molecules. DensToolKit also evaluates the momentum-space electron density on spatial grids, and the reduced density matrix of order one along lines joining two arbitrary atoms of a molecule. The source code is distributed under the GNU-GPLv3 license, and we release the code with the intent of establishing an open-source collaborative project. The style of DensToolKit's code follows some of the guidelines of object-oriented programming. This supplies the user with a simple means of implementing new scalar or vector fields, provided they are derived from any of the fields already implemented in the code. In this paper, we present some of the most salient features of the programs contained in the suite, some examples of how to run them, and the mathematical definitions of the implemented fields, along with hints of how we optimized their evaluation. We benchmarked our suite against both a freely available program and a commercial package. Speed-ups of ∼2×, and up to 12×, were obtained using a non-parallel compilation of DensToolKit for the evaluation of fields. DensToolKit takes similar times for finding critical points, compared to the commercial package. Finally, we present some perspectives for future development and ...
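One of the scalar fields listed in this record, the reduced density gradient s = |∇ρ| / (2 (3π²)^(1/3) ρ^(4/3)), can be evaluated numerically from any density. A minimal sketch (not DensToolKit code) using the analytic hydrogen 1s density in atomic units:

```python
import math

# Numerical sketch of the reduced density gradient for the hydrogen 1s
# density rho(r) = exp(-2r)/pi (atomic units). For this density
# |grad rho| = 2*rho, so s has the closed form 2 / (C * rho**(1/3)).

C = 2.0 * (3.0 * math.pi ** 2) ** (1.0 / 3.0)

def rho(r):
    return math.exp(-2.0 * r) / math.pi

def s(r, h=1e-6):
    """Reduced density gradient via a central-difference radial gradient."""
    grad = abs((rho(r + h) - rho(r - h)) / (2.0 * h))
    return grad / (C * rho(r) ** (4.0 / 3.0))

r = 1.0
analytic = 2.0 / (C * rho(r) ** (1.0 / 3.0))
print(abs(s(r) - analytic) < 1e-5)  # numerical value matches the closed form
```

Production codes like the one described evaluate such fields on full 3D grids from Gaussian-basis wavefunctions; the definition being applied per grid point is the same.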

  17. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions corresponding to the radiation measuring subsystems. A weighting adder computes a desired linear combination of the outputs of the linearizing circuits. Operators for selecting between two or more different linear combinations are included
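The signal path this record describes (linearizing stages realizing the inverse calibration functions, followed by a weighting adder) can be sketched in software. The calibration curves and weights below are invented for illustration:

```python
# Hedged sketch: each subsystem's raw signal passes through the inverse of
# its (monotonic) calibration function, then a weighting adder forms a
# linear combination of the linearized outputs.

def make_linearizer(calibration):
    """Return the inverse of a monotonically increasing calibration
    function, evaluated by bisection on a bracketing interval."""
    def inverse(signal, lo=0.0, hi=1e3, tol=1e-9):
        while hi - lo > tol:
            mid = (lo + hi) / 2.0
            if calibration(mid) < signal:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0
    return inverse

# Two subsystems with different illustrative calibration curves.
lin1 = make_linearizer(lambda x: 2.0 * x)   # linear response
lin2 = make_linearizer(lambda x: x ** 2)    # quadratic response

def weighting_adder(signals, weights):
    linearized = [lin(sig) for lin, sig in zip((lin1, lin2), signals)]
    return sum(w * v for w, v in zip(weights, linearized))

# Raw signals 10.0 and 25.0 both linearize to 5.0; weights (0.5, 0.5) -> 5.0
print(round(weighting_adder((10.0, 25.0), (0.5, 0.5)), 6))
```

The choice of weights is what selects a particular characteristic value; switching weight sets corresponds to the "operators" the abstract mentions.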

  18. Clustering document fragments using background color and texture information

    Science.gov (United States)

    Chanda, Sukalpa; Franke, Katrin; Pal, Umapada

    2012-01-01

    Forensic analysis of questioned documents can sometimes be extremely data-intensive. A forensic expert might need to analyze a heap of document fragments, and in such cases, to ensure reliability, he/she should focus only on the relevant evidence hidden in those document fragments. Retrieving relevant documents requires finding similar document fragments. One way of obtaining such similar documents is to use the document fragments' physical characteristics, such as color, texture, etc. In this article we propose an automatic scheme to retrieve similar document fragments based on the visual appearance of the document paper and its texture. Multispectral color characteristics are captured using biologically inspired color differentiation techniques, by projecting the document color characteristics into the Lab color space. Gabor filter-based texture analysis is used to identify document texture. It is expected that document fragments from the same source will have similar color and texture. For clustering similar document fragments in our test dataset we use a Self-Organizing Map (SOM) of dimension 5×5, where the document color and texture information are used as features. We obtained an encouraging accuracy of 97.17% on 1063 test images.
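The clustering step can be sketched with a small 5×5 self-organizing map over synthetic feature vectors standing in for the Lab-color and Gabor-texture features; this is an illustration under assumed data, not the authors' implementation:

```python
import numpy as np

# Minimal SOM sketch: a 5x5 grid of weight vectors is trained on synthetic
# "fragment" features from two sources with distinct signatures; fragments
# from different sources should map to different best-matching units.

rng = np.random.default_rng(0)
GRID, DIM = 5, 4  # grid size from the abstract; feature dimension illustrative

source_a = rng.normal(loc=0.2, scale=0.02, size=(20, DIM))  # source 1 fragments
source_b = rng.normal(loc=0.8, scale=0.02, size=(20, DIM))  # source 2 fragments
data = np.vstack([source_a, source_b])

weights = rng.random((GRID, GRID, DIM))
coords = np.dstack(np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij"))

def bmu(x):
    """Best-matching unit: grid cell whose weight vector is nearest x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

EPOCHS = 50
for epoch in range(EPOCHS):
    lr = 0.5 * (1.0 - epoch / EPOCHS)             # decaying learning rate
    radius = 2.0 * (1.0 - epoch / EPOCHS) + 0.5   # shrinking neighborhood
    for x in data:
        b = np.array(bmu(x))
        dist2 = ((coords - b) ** 2).sum(axis=2)
        h = np.exp(-dist2 / (2.0 * radius ** 2))  # Gaussian neighborhood
        weights += lr * h[..., None] * (x - weights)

print(bmu(source_a[0]), bmu(source_b[0]))  # cells for a fragment of each source
```

After training, nearby cells hold similar prototypes, so fragments of one source land in one region of the map, which is the grouping behavior the retrieval scheme relies on.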

  19. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  20. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  1. Analyzing Clickstreams

    DEFF Research Database (Denmark)

    Andersen, Jesper; Giversen, Anders; Jensen, Allan H.

    On-Line Analytical Processing (OLAP) enables analysts to gain insight into data through fast and interactive access to a variety of possible views on information, organized in a dimensional model. The demand for data integration is rapidly becoming larger as more and more information sources appear in modern enterprises. In the data warehousing approach, selected information is extracted in advance and stored in a repository. This approach is used because of its high performance. However, in many situations a logical (rather than physical) integration of data is preferable. Previous web-based data ... Extensible Markup Language (XML) is fast becoming the new standard for data representation and exchange on the World Wide Web. The rapid emergence of XML data on the web, e.g., business-to-business (B2B) e-commerce, is making it necessary for OLAP and other data analysis tools to handle XML data as well ...
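The idea of letting OLAP-style analysis consume XML directly can be sketched as a roll-up of clickstream records along one dimension; the XML schema and dimension names below are invented for illustration:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hedged sketch: aggregate XML clickstream data along a chosen dimension
# (page or country), the way an OLAP roll-up collapses a cube dimension.
# The schema is illustrative, not from the paper.

XML = """
<clickstream>
  <click page="/home"     country="DK" ms="120"/>
  <click page="/products" country="DK" ms="340"/>
  <click page="/home"     country="US" ms="95"/>
  <click page="/home"     country="DK" ms="210"/>
</clickstream>
"""

def rollup(xml_text, dimension):
    """Aggregate click count and total dwell time (ms) per dimension member."""
    cube = defaultdict(lambda: [0, 0])  # member -> [count, total ms]
    for click in ET.fromstring(xml_text).iter("click"):
        cell = cube[click.get(dimension)]
        cell[0] += 1
        cell[1] += int(click.get("ms"))
    return dict(cube)

print(rollup(XML, "page"))     # {'/home': [3, 425], '/products': [1, 340]}
print(rollup(XML, "country"))  # {'DK': [3, 670], 'US': [1, 95]}
```

A logical-integration OLAP tool would perform such aggregations on demand against live XML sources instead of first extracting everything into a warehouse.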

  2. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e.g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (85Kr, 3H, or 63Ni)

  3. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and in the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of the Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  4. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ General - CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. LHC Symposiums Management - CB - MB - FB - FMC Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2006 Annual reviews are posted.   CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat a...

  5. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ General - CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. LHC Symposiums Management - CB - MB - FB - FMC Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2006 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the natu...

  6. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ General - CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. LHC Symposiums Management - CB - MB - FB - FMC Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2006 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the natur...

  7. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ Management- CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. Management - CB - MB - FB Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2007 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the nature of employment and ...

  8. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ Management- CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. Management - CB - MB - FB Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2007 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the nature of empl...

  9. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the iCMS Web site. The following items can be found on: http://cms.cern.ch/iCMS/ General - CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. LHC Symposiums Management - CB - MB - FB - FMC Agendas and minutes are accessible to CMS members through their AFS account (ZH). However some linked documents are restricted to the Board Members. FB documents are only accessible to FB members. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2006 Annual reviews are posted. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses. Therefore it is requested that Ph.D students inform the CMS Secretariat about the na...

  10. Call document

    International Development Research Centre (IDRC) Digital Library (Canada)

    Marie-Isabelle Beyer

    2015-03-31

    Mar 31, 2015 ... It works with researchers as they confront contemporary challenges .... The type of study and method of systematic review of evidence must be ... of sources valuing rigorous qualitative and quantitative research; and should be ...

  11. Documents on Disarmament.

    Science.gov (United States)

    Arms Control and Disarmament Agency, Washington, DC.

    This publication, latest in a series of volumes issued annually since 1960, contains primary source documents on arms control and disarmament developments during 1969. The main chronological arrangement is supplemented by both chronological and topical lists of contents. Other reference aids include a subject/author index, and lists of…

  12. Document Models

    Directory of Open Access Journals (Sweden)

    A.A. Malykh

    2017-08-01

    Full Text Available In this paper, the concept of locally simple models is considered. Locally simple models are arbitrarily complex models built from relatively simple components. A lot of practically important domains of discourse can be described as locally simple models, for example, business models of enterprises and companies. Up to now, research in human reasoning automation has been mainly concentrated around the most intellectually intensive activities, such as automated theorem proving. On the other hand, the retailer business model is formed from ”jobs”, and each ”job” can be modelled and automated more or less easily. At the same time, the whole retailer model as an integrated system is extremely complex. In this paper, we offer a variant of the mathematical definition of a locally simple model. This definition is intended for modelling a wide range of domains; therefore, we must also take into account perceptual and psychological issues. Logic is elitist, and if we want to attract as many people as possible to our models, we need to hide this elitism behind some metaphor to which ’ordinary’ people are accustomed. As such a metaphor, we use the concept of a document, so our locally simple models are called document models. Document models are built in the paradigm of semantic programming. This allows us to achieve another important goal - to make the document models executable. Executable models are models that can act as practical information systems in the described domain of discourse. Thus, if our model is executable, then programming becomes redundant. The direct use of a model, instead of its programming coding, brings important advantages, for example, a drastic cost reduction for development and maintenance. Moreover, since the model is sound, and not dissolved within programming modules, we can directly apply AI tools, in particular, machine learning. This significantly expands the possibilities for automation and

  13. Documentation of spectrom-32

    International Nuclear Information System (INIS)

    Callahan, G.D.; Fossum, A.F.; Svalstad, D.K.

    1989-01-01

    SPECTROM-32 is a finite element program for analyzing two-dimensional and axisymmetric inelastic thermomechanical problems related to the geological disposal of nuclear waste. The code is part of the SPECTROM series of special-purpose computer programs that are being developed by RE/SPEC Inc. to address many unique rock mechanics problems encountered in analyzing radioactive wastes stored in geologic formations. This document presents the theoretical basis for the mathematical models, the finite element formulation and solution procedure of the program, a description of the input data for the program, verification problems, and details about program support and continuing documentation. The computer code documentation is intended to satisfy the requirements and guidelines outlined in the document entitled Final Technical Position on Documentation of Computer Codes for High-Level Waste Management. The principal component models used in the program involve thermoelastic, thermoviscoelastic, thermoelastic-plastic, and thermoviscoplastic types of material behavior. Special material considerations provide for the incorporation of limited-tension material behavior and consideration of jointed material behavior. Numerous program options provide the capabilities for various boundary conditions, sliding interfaces, excavation, backfill, arbitrary initial stresses, multiple material domains, load incrementation, plotting, database storage and access of results, and other features unique to the geologic disposal of radioactive wastes. Numerous verification problems that exercise many of the program options and illustrate the required data input and printed results are included in the documentation

  14. Orbitmpi Documentation

    International Nuclear Information System (INIS)

    Lowe, Lisa L.

    2000-01-01

    Orbitmpi is a parallelized version of Roscoe White's Orbit code. The code has been parallelized using MPI, which makes it portable to many types of machines. The guidelines used for the parallelization were to increase code performance with minimal changes to the code's original structure. This document gives a general description of how the parallel sections of the code run. It discusses the changes made to the original code and comments on the general procedure for future additions to Orbitmpi, as well as describing the effects of a parallelized random number generator on the code's output. Finally, the scaling results from Hecate and from Puffin are presented. Hecate is a 64-processor Origin 2000 machine, with MIPS R12000 processors and 16GB of memory, and Puffin is a PC cluster with 9 dual-processor 450 MHz Pentium III nodes (18 processors max.), with 100 Mbit/s Ethernet communication
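The effect of a parallelized random number generator on the output, noted in the abstract, arises because naive per-process seeding can produce overlapping streams. A minimal sketch (not Orbitmpi's actual implementation, which is Fortran/MPI) of giving each rank an independent, reproducible stream using NumPy's SeedSequence:

```python
import numpy as np

def rank_rng(base_seed: int, rank: int, n_ranks: int) -> np.random.Generator:
    """Return an independent random stream for one rank.

    SeedSequence.spawn derives statistically independent child sequences,
    so draws in a parallel run do not overlap between ranks.
    """
    children = np.random.SeedSequence(base_seed).spawn(n_ranks)
    return np.random.default_rng(children[rank])

# Each rank draws from its own stream; results differ across ranks
# but are reproducible for a fixed base seed.
streams = [rank_rng(42, r, 4) for r in range(4)]
draws = [g.random(3) for g in streams]
```

A serial and a parallel run seeded this way produce per-rank sequences that are reproducible, which is what makes the parallel code's output comparable to the original.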

  15. Tests of the Royce ultrasonic interface level analyzer

    International Nuclear Information System (INIS)

    WITWER, K.S.

    1999-01-01

    This document describes testing carried out in 1995 on the Royce Interface Level Analyzer. The testing was carried out in the 305 Bldg., Engineering Testing Laboratory, 300 Area. The Level Analyzer was shown to effectively locate the solid-liquid interface layer of two different simulants under various conditions, and was able to do so after being irradiated with over 5 million rads of gamma radiation from a cobalt-60 source

  16. CMS DOCUMENTATION

    CERN Multimedia

    CMS TALKS AT MAJOR MEETINGS The agenda and talks from major CMS meetings can now be electronically accessed from the ICMS Web site. The following items can be found on: http://cms.cern.ch/iCMS Management – CMS Weeks (Collaboration Meetings), CMS Weeks Agendas The talks presented at the Plenary Sessions. Management – CB – MB – FB Agendas and minutes are accessible to CMS members through Indico. LHCC The talks presented at the ‘CMS Meetings with LHCC Referees’ are available on request from the PM or MB Country Representative. Annual Reviews The talks presented at the 2008 Annual Reviews are posted in Indico. CMS DOCUMENTS It is considered useful to establish information on the first employment of CMS doctoral students upon completion of their theses.  Therefore it is requested that Ph.D. students inform the CMS Secretariat about the nature of employment and name of their first employer. The Notes, Conference Reports and Theses published si...

  17. Source Reconstruction of Brain Potentials Using Bayesian Model Averaging to Analyze Face Intra-Domain vs. Face-Occupation Cross-Domain Processing.

    Science.gov (United States)

    Olivares, Ela I; Lage-Castellanos, Agustín; Bobes, María A; Iglesias, Jaime

    2018-01-01

    We investigated the neural correlates of the access to and retrieval of face structure information in contrast to those concerning the access to and retrieval of person-related verbal information, triggered by faces. We experimentally induced stimulus familiarity via a systematic learning procedure including faces with and without associated verbal information. Then, we recorded event-related potentials (ERPs) in both intra-domain (face-feature) and cross-domain (face-occupation) matching tasks while N400-like responses were elicited by incorrect eyes-eyebrows completions and occupations, respectively. A novel Bayesian source reconstruction approach plus conjunction analysis of group effects revealed that in both cases the generated N170s were of similar amplitude but had different neural origin. Thus, whereas the N170 of faces was associated predominantly to right fusiform and occipital regions (the so-called "Fusiform Face Area", "FFA" and "Occipital Face Area", "OFA", respectively), the N170 of occupations was associated to a bilateral very posterior activity, suggestive of basic perceptual processes. Importantly, the right-sided perceptual P200 and the face-related N250 were evoked exclusively in the intra-domain task, with sources in OFA and extensively in the fusiform region, respectively. Regarding later latencies, the intra-domain N400 seemed to be generated in right posterior brain regions encompassing mainly OFA and, to some extent, the FFA, likely reflecting neural operations triggered by structural incongruities. In turn, the cross-domain N400 was related to more anterior left-sided fusiform and temporal inferior sources, paralleling those described previously for the classic verbal N400. These results support the existence of differentiated neural streams for face structure and person-related verbal processing triggered by faces, which can be activated differentially according to specific task demands.

  18. Source Reconstruction of Brain Potentials Using Bayesian Model Averaging to Analyze Face Intra-Domain vs. Face-Occupation Cross-Domain Processing

    Directory of Open Access Journals (Sweden)

    Ela I. Olivares

    2018-03-01

    Full Text Available We investigated the neural correlates of the access to and retrieval of face structure information in contrast to those concerning the access to and retrieval of person-related verbal information, triggered by faces. We experimentally induced stimulus familiarity via a systematic learning procedure including faces with and without associated verbal information. Then, we recorded event-related potentials (ERPs) in both intra-domain (face-feature) and cross-domain (face-occupation) matching tasks while N400-like responses were elicited by incorrect eyes-eyebrows completions and occupations, respectively. A novel Bayesian source reconstruction approach plus conjunction analysis of group effects revealed that in both cases the generated N170s were of similar amplitude but had different neural origin. Thus, whereas the N170 of faces was associated predominantly to right fusiform and occipital regions (the so-called “Fusiform Face Area”, “FFA” and “Occipital Face Area”, “OFA”, respectively), the N170 of occupations was associated to a bilateral very posterior activity, suggestive of basic perceptual processes. Importantly, the right-sided perceptual P200 and the face-related N250 were evoked exclusively in the intra-domain task, with sources in OFA and extensively in the fusiform region, respectively. Regarding later latencies, the intra-domain N400 seemed to be generated in right posterior brain regions encompassing mainly OFA and, to some extent, the FFA, likely reflecting neural operations triggered by structural incongruities. In turn, the cross-domain N400 was related to more anterior left-sided fusiform and temporal inferior sources, paralleling those described previously for the classic verbal N400. These results support the existence of differentiated neural streams for face structure and person-related verbal processing triggered by faces, which can be activated differentially according to specific task demands.

  19. Document understanding for a broad class of documents

    NARCIS (Netherlands)

    Aiello, Marco; Monz, Christof; Todoran, Leon; Worring, Marcel

    2002-01-01

    We present a document analysis system able to assign logical labels and extract the reading order in a broad set of documents. All information sources, from geometric features and spatial relations to the textual features and content are employed in the analysis. To deal effectively with these

  20. SRS ecology: Environmental information document

    International Nuclear Information System (INIS)

    Wike, L.D.; Shipley, R.W.; Bowers, J.A.

    1993-09-01

    The purpose of this Document is to provide a source of ecological information based on the existing knowledge gained from research conducted at the Savannah River Site. This document provides a summary and synthesis of ecological research in the three main ecosystem types found at SRS and information on the threatened and endangered species residing there

  1. SRS ecology: Environmental information document

    Energy Technology Data Exchange (ETDEWEB)

    Wike, L.D.; Shipley, R.W.; Bowers, J.A. [and others

    1993-09-01

    The purpose of this Document is to provide a source of ecological information based on the existing knowledge gained from research conducted at the Savannah River Site. This document provides a summary and synthesis of ecological research in the three main ecosystem types found at SRS and information on the threatened and endangered species residing there.

  2. ChimericSeq: An open-source, user-friendly interface for analyzing NGS data to identify and characterize viral-host chimeric sequences

    Science.gov (United States)

    Shieh, Fwu-Shan; Jongeneel, Patrick; Steffen, Jamin D.; Lin, Selena; Jain, Surbhi; Song, Wei

    2017-01-01

    Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods to analyze NGS data of virus-host junction sites have been limited in terms of their accessibility to a broad user base. In this study, we developed a software application (named ChimericSeq) that is the first program of its kind to offer a graphical user interface and compatibility with both Windows and Mac operating systems, optimized for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq’s pipeline implements custom filtering to remove artifacts and detects reads with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI in both Windows and Mac has potential to expand NGS analytical support to a broader spectrum of the scientific community. PMID:28829778

  3. ChimericSeq: An open-source, user-friendly interface for analyzing NGS data to identify and characterize viral-host chimeric sequences.

    Directory of Open Access Journals (Sweden)

    Fwu-Shan Shieh

    Full Text Available Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods to analyze NGS data of virus-host junction sites have been limited in terms of their accessibility to a broad user base. In this study, we developed a software application (named ChimericSeq) that is the first program of its kind to offer a graphical user interface and compatibility with both Windows and Mac operating systems, optimized for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq's pipeline implements custom filtering to remove artifacts and detects reads with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI in both Windows and Mac has potential to expand NGS analytical support to a broader spectrum of the scientific community.
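The core task the two ChimericSeq records describe, detecting reads that span a virus-host junction, can be illustrated with a toy split-match search: a chimeric read has a prefix that maps to the viral reference and a suffix that maps to the host. This is a simplified sketch, not ChimericSeq's actual alignment-based pipeline; the reference sequences and read are invented:

```python
def find_junction(read: str, virus: str, host: str, min_len: int = 8):
    """Toy virus-host junction finder (illustrative only).

    Scans candidate split points of the read: if the prefix occurs in the
    viral reference and the suffix in the host reference (each at least
    min_len bases), report the split position and both mapping positions.
    Real tools use gapped alignment, not exact substring search.
    """
    for i in range(min_len, len(read) - min_len + 1):
        prefix, suffix = read[:i], read[i:]
        v_pos, h_pos = virus.find(prefix), host.find(suffix)
        if v_pos != -1 and h_pos != -1:
            return {"split": i, "virus_pos": v_pos, "host_pos": h_pos}
    return None  # no chimeric structure detected

virus_ref = "ACGTACGTTTGCAAGGCTAG"
host_ref = "TTTTCCCCGGGGAAAATTTTGCGC"
read = "TTGCAAGGCTAG" + "CCCCGGGGAAAA"  # viral prefix + host suffix
hit = find_junction(read, virus_ref, host_ref)
```

The custom filtering the abstract mentions would, in this picture, discard hits whose flanking matches are too short or ambiguous (artifacts), keeping only well-supported junctions.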

  4. Development of expert systems for analyzing electronic documents

    Science.gov (United States)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  5. Document Organization Using Kohonen's Algorithm.

    Science.gov (United States)

    Guerrero Bote, Vicente P.; Moya Anegon, Felix de; Herrero Solana, Victor

    2002-01-01

    Discussion of the classification of documents from bibliographic databases focuses on a method of vectorizing reference documents from LISA (Library and Information Science Abstracts) which permits their topological organization using Kohonen's algorithm. Analyzes possibilities of this type of neural network with respect to the development of…
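The topological organization the abstract refers to comes from Kohonen's self-organizing map, which pulls similar document vectors onto nearby units of a low-dimensional grid. A minimal NumPy sketch of the training loop (the grid size, learning rate and decay schedule are illustrative, not those of the cited study):

```python
import numpy as np

def train_som(docs, grid=(4, 4), epochs=50, lr0=0.5, seed=0):
    """Train a small Kohonen self-organizing map on document vectors.

    docs: (n_docs, dim) array of vectorized documents.
    Returns the (rows*cols, dim) unit weights and each document's
    best-matching unit (BMU) index.
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    dim = docs.shape[1]
    weights = rng.random((rows * cols, dim))
    # Grid coordinates of each unit, used by the neighborhood function.
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    sigma0 = max(rows, cols) / 2.0
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighborhood
        for x in docs[rng.permutation(len(docs))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Gaussian neighborhood around the best-matching unit:
            # nearby units are dragged toward x, distant ones barely move.
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2 * sigma**2))
            weights += lr * h[:, None] * (x - weights)
    bmus = np.argmin(
        np.linalg.norm(weights[None, :, :] - docs[:, None, :], axis=2), axis=1
    )
    return weights, bmus
```

After training, documents assigned to the same or adjacent grid units are topically close, which is what makes the map usable as a browsing interface over a bibliographic database.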

  6. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  7. SIMULATING PROTEIN DIGESTION ON TROUT A RAPID AND INEXPENSIVE METHOD FOR DOCUMENTING FISH MEAL QUALITY AND SCREENING NOVEL PROTEIN SOURCES FOR USE IN AQUAFEEDS

    Directory of Open Access Journals (Sweden)

    M Bassompierre

    1997-10-01

    Full Text Available A novel in vitro digestion system, which simulated rainbow trout gastric and intestinal digestion, was developed. The method was employed to evaluate the impact of the gastric phase of digestion upon degradation of three fish meals of differing quality. Results illustrated that two-phase gastric-intestinal digestion increased the discriminatory powers of the system when compared to one-step intestinal digestion. A comparison of the system with pH-STAT methods demonstrated that the in vitro technique was superior. The presented method provides an ethical and cost-effective means for rapid evaluation of fish meals and potentially, alternative protein sources for aquafeeds.

  8. THE EXPERIENCE OF COMPARISON OF STATIC SECURITY CODE ANALYZERS

    Directory of Open Access Journals (Sweden)

    Alexey Markov

    2015-09-01

    Full Text Available This work presents a methodological approach to the comparison of static security code analyzers. It substantiates comparing static analyzers on the efficiency and functionality indicators stipulated in international regulatory documents. The test data for assessing static analyzer efficiency are synthetic sets of open-source software that contain vulnerabilities. We substantiated criteria for quality assessment of static security code analyzers subject to the standards NIST SP 500-268 and SATEC. We carried out experiments that allowed us to assess a number of Russian proprietary software tools and open-source tools. We came to the conclusion that it is of paramount importance to develop a Russian regulatory framework for testing software security (firstly, for controlling undocumented features and evaluating the quality of static security code analyzers).
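On a synthetic test set with seeded vulnerabilities, the efficiency indicators such a comparison rests on reduce to standard detection metrics computed from an analyzer's findings against the known ground truth. A hedged sketch (the finding identifiers are invented):

```python
def detection_metrics(reported: set, actual: set) -> dict:
    """Precision, recall and F1 of an analyzer's findings vs. seeded flaws.

    reported: locations the analyzer flagged.
    actual: locations of the vulnerabilities seeded into the test set.
    """
    tp = len(reported & actual)   # true positives: real flaws found
    fp = len(reported - actual)   # false positives: spurious warnings
    fn = len(actual - reported)   # false negatives: missed flaws
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}

seeded = {"cwe89_line12", "cwe78_line40", "cwe119_line7"}
flagged = {"cwe89_line12", "cwe119_line7", "style_line3"}
scores = detection_metrics(flagged, seeded)
```

Ranking analyzers by such scores over a common benchmark is what makes the comparison reproducible rather than anecdotal.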

  9. Integrated criteria document mercury

    International Nuclear Information System (INIS)

    Sloof, W.; Beelan, P. van; Annema, J.A.; Janus, J.A.

    1995-01-01

    The document contains a systematic review and a critical evaluation of the most relevant data on the priority substance mercury for the purpose of effect-oriented environmental policy. Chapter headings are: properties and existing standards; production, application, sources and emissions (natural sources, industry, energy, households, agriculture, dental use, waste); distribution and transformation (cinnabar, Hg²⁺, Hg₂²⁺, elemental mercury, methylmercury, behavior in soil, water, air, biota); concentrations and fluxes in the environment and exposure levels (sampling and measuring methods, occurrence in soil, water, air etc.); effects (toxicity to humans and aquatic and terrestrial systems); emissions reduction (from industrial sources, energy, waste processing etc.); and evaluation (risks, standards, emission reduction objectives, measuring strategies). 395 refs

  10. Using tritium to document the mean transit time and sources of water contributing to a chain-of-ponds river system: Implications for resource protection

    International Nuclear Information System (INIS)

    Cartwright, I.; Morgenstern, U.

    2016-01-01

    Documenting the interaction between groundwater and rivers is fundamental to understanding hydrological systems. While many studies have examined the location and magnitude of groundwater inflows to rivers, much less is known about the transit times of water in catchments and from where in the aquifer the groundwater originates. Resolving those questions is vital for protecting riverine ecosystems, assessing the impact of contamination, and understanding the potential consequences of groundwater pumping. This study uses tritium (³H) to evaluate the mean transit times of water contributing to Deep Creek (southeast Australia), which is a chain-of-ponds river system. ³H activities of river water vary between 1.47 and 2.91 TU, with lower ³H activities recorded during cease-to-flow periods when the river comprises isolated groundwater-fed pools. Regional groundwater 1–2.5 km away from Deep Creek at depths of 7.5–46.5 m has ³H activities corresponding to mean transit times in excess of 100 years. Alternatively, the variation in ³H activities can be explained by mixing of a young near-river water component with up to 50% older groundwater. The results of this study reinforce the need to protect shallow near-river groundwater from contamination in order to safeguard riverine ecosystems, and also illustrate the potential pitfalls of using regional bores to characterise the geochemistry of near-river groundwater. - Highlights: • We measured tritium in river water and groundwater from a groundwater-fed river. • Transit times of the river water are years to decades. • Transit times of regional groundwater are decades to centuries. • Regional groundwater is only a minor component of the river water. • Results have implications for protection of the river and its ecosystems.
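The two-component interpretation in the abstract, young near-river water mixed with up to 50% older groundwater, follows from a simple tritium mass balance between the two end-members. A sketch with illustrative TU values (not the study's actual measurements):

```python
def old_water_fraction(c_river: float, c_young: float, c_old: float) -> float:
    """Fraction of old groundwater from a two-component tritium balance.

    Mass balance: c_river = f * c_old + (1 - f) * c_young
    Solved for f:  f = (c_young - c_river) / (c_young - c_old)
    All concentrations in tritium units (TU).
    """
    if c_young == c_old:
        raise ValueError("end-member activities must differ")
    return (c_young - c_river) / (c_young - c_old)

# Illustrative end-members: young water at 2.9 TU, tritium-dead old
# groundwater at 0.1 TU; a river sample at 1.5 TU then implies
# roughly half old groundwater.
f_old = old_water_fraction(c_river=1.5, c_young=2.9, c_old=0.1)
```

The same balance is why low-flow samples, dominated by groundwater-fed pools, show the lower ³H activities reported in the study.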

  11. Documentation of spectrom-41

    International Nuclear Information System (INIS)

    Svalstad, D.K.

    1989-01-01

    SPECTROM-41 is a finite element heat transfer computer program developed to analyze thermal problems related to nuclear waste disposal. The code is part of the SPECTROM (Special Purpose Engineering Codes for Thermal/Rock Mechanics) series of special-purpose finite element programs that are continually being developed by RE/SPEC Inc. (RSI) to address the many unique rock mechanics problems encountered in analyzing radioactive wastes stored in geologic formations. This document presents the theoretical basis for the mathematical model, the finite element formulation of the program, and a description of the input data for the program, along with details about program support and continuing documentation. The documentation is intended to satisfy the requirements and guidelines outlined in NUREG-0856. The principal component model used in the program is based on Fourier's law of heat conduction. Numerous program options provide the capability of considering various boundary conditions, material stratification and anisotropy, and time-dependent heat generation that are characteristic of problems involving the disposal of nuclear waste in geologic formations. Numerous verification problems are included in the documentation in addition to highlights of past and ongoing verification and validation efforts. A typical repository problem is solved using SPECTROM-41 to demonstrate the use of the program in addressing problems related to the disposal of nuclear waste

  12. Development document for the effluent limitations and guidelines for the ore mining and dressing point source category. Volume I. Final report

    International Nuclear Information System (INIS)

    Jarrett, B.M.; Kirby, R.G.

    1978-07-01

    To establish effluent limitation guidelines and standards of performance, the ore mining and dressing industry was divided into 41 separate categories and subcategories for which separate limitations were recommended. This report deals with the entire metal-ore mining and dressing industry and examines the industry by ten major categories: iron ore; copper ore; lead and zinc ores; gold ore; silver ore; bauxite ore; ferroalloy-metal ores; mercury ores; uranium, radium and vanadium ores; and metal ores, not elsewhere classified (ores of antimony, beryllium, platinum, rare earths, tin, titanium, and zirconium). The subcategorization of the ore categories is based primarily upon ore mineralogy and processing or extraction methods employed; however, other factors (such as size, climate or location, and method of mining) are used in some instances. With the best available technology economically achievable, facilities in 21 of the 41 subcategories can be operated with no discharge of process wastewater to navigable waters. No discharge of process wastewater is also achievable as a new source performance standard for facilities in 21 of the 41 subcategories

  13. Record Recommendations for the CERN Document Server

    CERN Document Server

    AUTHOR|(CDS)2096025; Marian, Ludmila

    CERN Document Server (CDS) is the institutional repository of the European Organization for Nuclear Research (CERN). It hosts all the research material produced at CERN, as well as multimedia and administrative documents. It currently has more than 1.5 million records grouped in more than 1000 collections. Its underlying platform is Invenio, an open source digital library system created at CERN. As the size of CDS increases, discovering useful and interesting records becomes more challenging. Therefore, the goal of this work is to create a system that supports the user in the discovery of related interesting records. To achieve this, a set of recommended records are displayed on the record page. These recommended records are based on the analyzed behavior (page views and downloads) of other users. This work will describe the methods and algorithms used for creating, implementing, and the integration with the underlying software platform, Invenio. A very important decision factor when designing a recomme...
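Behavior-based recommendations of the kind described (driven by other users' page views and downloads) can be approximated by item-to-item co-occurrence counting: records often viewed in the same session are likely related. A minimal sketch, not Invenio's or CDS's actual implementation; the session data are invented:

```python
from collections import Counter, defaultdict

def build_cooccurrence(sessions):
    """Count how often two records appear in the same user session."""
    co = defaultdict(Counter)
    for session in sessions:
        unique = set(session)
        for rec in unique:
            for other in unique:
                if other != rec:
                    co[rec][other] += 1
    return co

def recommend(co, record_id, k=3):
    """Top-k records most frequently co-viewed with record_id."""
    return [rec for rec, _ in co[record_id].most_common(k)]

# Each inner list is one user's viewing session of record IDs.
views = [
    ["r1", "r2", "r3"],
    ["r1", "r2"],
    ["r2", "r3"],
    ["r1", "r4"],
]
co = build_cooccurrence(views)
top = recommend(co, "r1")
```

A production system would additionally weight downloads above views and decay old sessions, but the co-occurrence matrix is the common core of such recommenders.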

  14. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built onto large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive licence.

  15. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source documents. Part 1. Open-literature abstracts for low-level radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, M.K.; Rodgers, B.R.; Jolley, R.L.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Volume 3, part 1 presents abstracts of the open literature relating to LLRW treatment methodologies. Some of these references pertain to treatment processes for hazardous wastes that may also be applicable to LLRW management. All abstracts have been limited to 21 lines (for brevity), but each abstract contains sufficient information to enable the reader to determine the potential usefulness of the source document and to locate each article. The abstracts are arranged alphabetically by author or organization, and indexed by keyword.

  16. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source documents. Part 1. Open-literature abstracts for low-level radioactive waste

    International Nuclear Information System (INIS)

    Bowers, M.K.; Rodgers, B.R.; Jolley, R.L.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Volume 3, part 1 presents abstracts of the open literature relating to LLRW treatment methodologies. Some of these references pertain to treatment processes for hazardous wastes that may also be applicable to LLRW management. All abstracts have been limited to 21 lines (for brevity), but each abstract contains sufficient information to enable the reader to determine the potential usefulness of the source document and to locate each article. The abstracts are arranged alphabetically by author or organization, and indexed by keyword

  17. Histomorphometric Parameters of the Growth Plate and Trabecular Bone in Wild-Type and Trefoil Factor Family 3 (Tff3)-Deficient Mice Analyzed by Free and Open-Source Image Processing Software.

    Science.gov (United States)

    Bijelić, Nikola; Belovari, Tatjana; Stolnik, Dunja; Lovrić, Ivana; Baus Lončar, Mirela

    2017-08-01

    Trefoil factor family 3 (Tff3) peptide is present during intrauterine endochondral ossification in mice, and its deficiency affects cancellous bone quality in secondary ossification centers of mouse tibiae. The aim of this study was to quantitatively analyze parameters describing the growth plate and primary ossification centers in tibiae of 1-month-old wild-type and Tff3 knock-out mice (n=5 per genotype) by using free and open-source software. Digital photographs of the growth plates and trabecular bone were processed by open-source computer programs GIMP and FIJI. Histomorphometric parameters were calculated using measurements made with FIJI. Tff3 knock-out mice had significantly smaller trabecular number and significantly larger trabecular separation. Trabecular bone volume, trabecular bone surface, and trabecular thickness showed no significant difference between the two groups. Although such histomorphological differences were found in the cancellous bone structure, no significant differences were found in the epiphyseal plate histomorphology. Tff3 peptide probably has an effect on the formation and quality of the cancellous bone in the primary ossification centers, but not through disrupting the epiphyseal plate morphology. This work emphasizes the benefits of using free and open-source programs for morphological studies in life sciences.
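The parameters reported in the study (trabecular number, separation, thickness, bone volume) are linked by the standard parallel-plate model of cancellous bone, so two measured quantities determine the derived ones. A sketch of those textbook relations (the example values are illustrative, not data from the study):

```python
def plate_model_params(bv_tv: float, tb_th: float) -> dict:
    """Derived trabecular parameters under the parallel-plate model.

    bv_tv: bone volume fraction BV/TV (dimensionless, 0..1)
    tb_th: mean trabecular thickness Tb.Th (mm)
    Tb.N  = (BV/TV) / Tb.Th      trabecular number (1/mm)
    Tb.Sp = 1/Tb.N - Tb.Th       trabecular separation (mm)
    """
    if not 0 < bv_tv < 1 or tb_th <= 0:
        raise ValueError("bv_tv must be in (0, 1) and tb_th positive")
    tb_n = bv_tv / tb_th
    tb_sp = 1.0 / tb_n - tb_th
    return {"Tb.N": tb_n, "Tb.Sp": tb_sp}

# Example: 20% bone volume fraction and 50 µm (0.05 mm) mean thickness.
params = plate_model_params(bv_tv=0.20, tb_th=0.05)
```

These relations explain the study's finding pattern: with unchanged trabecular volume and thickness, a smaller trabecular number necessarily implies a larger trabecular separation.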

  18. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  19. Generic safety documentation model

    International Nuclear Information System (INIS)

    Mahn, J.A.

    1994-04-01

    This document is intended to be a resource for preparers of safety documentation for Sandia National Laboratories, New Mexico facilities. It provides standardized discussions of some topics that are generic to most, if not all, Sandia/NM facilities safety documents. The material provides a ''core'' upon which to develop facility-specific safety documentation. The use of the information in this document will reduce the cost of safety document preparation and improve consistency of information

  20. Multimodal document management in radiotherapy

    International Nuclear Information System (INIS)

    Fahrner, H.; Kirrmann, S.; Roehner, F.; Schmucker, M.; Hall, M.; Heinemann, F.

    2013-01-01

    Background and purpose: After incorporating treatment planning and its organisational model into the operating schedule system (BAS, 'Betriebsablaufsystem'), all document types were embedded in the digital environment. The aim of this project was to integrate all documents regardless of their source (paper-bound or digital) and to make content from the BAS available in a structured manner. As many workflow steps as possible were to be automated, e.g. assigning a document to a patient in the BAS. Additionally, it had to be guaranteed that it could be traced at all times who imported documents into the departmental system, and when, how and from which source. Furthermore, work procedures were to be changed so that documentation conducted either directly in the departmental system or in external systems could be incorporated digitally, and paper documents (e.g. treatment certificates, treatment plans or other documentation) could be completely avoided. A further aim was, where possible, to automate the removal of paper documents from the departmental workflow, or even to make such paper documents superfluous. In this way, patient letters for follow-up appointments should be generated automatically from the BAS. Similarly, patient record extracts in the form of PDF files should be enabled, e.g. for controlling purposes. Method: The available document types were analysed in detail by a multidisciplinary working group (BAS-AG), and after this examination and an assessment of the possibility of modelling them in our departmental workflow (BAS), they were transcribed into a flow diagram. The gathered specifications were implemented in a test environment by the clinical and administrative IT group of the department of radiation oncology and, following a detailed analysis, introduced into clinical routine. Results: The department has succeeded, under the aforementioned criteria, in embedding all relevant documents in the departmental

  1. Synthesis document on the long time behavior of packages: operational document ''bituminous'' 2204

    International Nuclear Information System (INIS)

    Tiffreau, C.

    2004-09-01

    This document was prepared in the framework of the 1991 law on radioactive waste management. The 2004 synthesis document on the long-term behavior of bituminized sludge packages consists of two documents, the reference document and the operational document. This paper presents the operational model describing the alteration of the packages by water and the associated release of radioelements, as well as the gas source term and the swelling associated with self-irradiation and radiolysis of the bitumen. (A.L.B.)

  2. ENDF/B summary documentation

    International Nuclear Information System (INIS)

    Kinsey, R.

    1979-07-01

    This publication provides a localized source of descriptions for the evaluations contained in the ENDF/B Library. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The summary documentations were written by the CSEWB (Cross Section Evaluation Working Group) evaluators and compiled by NNDC (National Nuclear Data Center). This edition includes documentation for materials found on ENDF/B Version V tapes 501 to 516 (General Purpose File) excluding tape 504. ENDF/B-V also includes tapes containing partial evaluations for the Special Purpose Actinide (521, 522), Dosimetry (531), Activation (532), Gas Production (533), and Fission Product (541-546) files. The materials found on these tapes are documented elsewhere. Some of the evaluation descriptions in this report contain cross sections or energy level information

  3. ENDF/B summary documentation

    Energy Technology Data Exchange (ETDEWEB)

    Kinsey, R. (comp.)

    1979-07-01

    This publication provides a localized source of descriptions for the evaluations contained in the ENDF/B Library. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The summary documentations were written by the CSEWB (Cross Section Evaluation Working Group) evaluators and compiled by NNDC (National Nuclear Data Center). This edition includes documentation for materials found on ENDF/B Version V tapes 501 to 516 (General Purpose File) excluding tape 504. ENDF/B-V also includes tapes containing partial evaluations for the Special Purpose Actinide (521, 522), Dosimetry (531), Activation (532), Gas Production (533), and Fission Product (541-546) files. The materials found on these tapes are documented elsewhere. Some of the evaluation descriptions in this report contain cross sections or energy level information. (RWR)

  4. Gaia DR1 documentation

    Science.gov (United States)

    van Leeuwen, F.; de Bruijne, J. H. J.; Arenou, F.; Comoretto, G.; Eyer, L.; Farras Casas, M.; Hambly, N.; Hobbs, D.; Salgado, J.; Utrilla Molina, E.; Vogt, S.; van Leeuwen, M.; Abreu, A.; Altmann, M.; Andrei, A.; Babusiaux, C.; Bastian, U.; Biermann, M.; Blanco-Cuaresma, S.; Bombrun, A.; Borrachero, R.; Brown, A. G. A.; Busonero, D.; Busso, G.; Butkevich, A.; Cantat-Gaudin, T.; Carrasco, J. M.; Castañeda, J.; Charnas, J.; Cheek, N.; Clementini, G.; Crowley, C.; Cuypers, J.; Davidson, M.; De Angeli, F.; De Ridder, J.; Evans, D.; Fabricius, C.; Findeisen, K.; Fleitas, J. M.; Gracia, G.; Guerra, R.; Guy, L.; Helmi, A.; Hernandez, J.; Holl, B.; Hutton, A.; Klioner, S.; Lammers, U.; Lecoeur-Taïbi, I.; Lindegren, L.; Luri, X.; Marinoni, S.; Marrese, P.; Messineo, R.; Michalik, D.; Mignard, F.; Montegriffo, P.; Mora, A.; Mowlavi, N.; Nienartowicz, K.; Pancino, E.; Panem, C.; Portell, J.; Rimoldini, L.; Riva, A.; Robin, A.; Siddiqui, H.; Smart, R.; Sordo, R.; Soria, S.; Turon, C.; Vallenari, A.; Voss, H.

    2017-12-01

    We present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7 in the white-light photometric band G of Gaia. The Gaia Data Processing and Analysis Consortium (DPAC) processed the raw measurements collected with the Gaia instruments during the first 14 months of the mission, and turned these into an astrometric and photometric catalogue. Gaia DR1 consists of three parts: an astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues (the primary astrometric data set) and the positions for an additional 1.1 billion sources (the secondary astrometric data set). The primary set forms the realisation of the Tycho-Gaia Astrometric Solution (TGAS). The second part of Gaia DR1 is the photometric data set, which contains the mean G-band magnitudes for all sources. The third part consists of the G-band light curves and the characteristics of 3000 Cepheid and RR Lyrae stars observed at high cadence around the south ecliptic pole. The positions and proper motions in the astrometric data set are given in a reference frame that is aligned with the International Celestial Reference Frame (ICRF) to better than 0.1 mas at epoch J2015.0, and non-rotating with respect to the ICRF to within 0.03 mas yr^-1. For the primary astrometric data set, the typical standard error for the positions and parallaxes is about 0.3 mas, while for the proper motions the typical standard error is about 1 mas yr^-1. Whereas it has been suggested in Gaia Collaboration et al. (2016a) that a systematic component of ∼0.3 mas should be 'added' (in quadrature) to the parallax uncertainties, Brown (2017) clarifies that reported parallax standard errors already include local systematics as a result of the calibration of the TGAS parallax uncertainties by comparison to Hipparcos parallaxes. For the subset of

  5. Topology of Document Retrieval Systems.

    Science.gov (United States)

    Everett, Daniel M.; Cater, Steven C.

    1992-01-01

    Explains the use of a topological structure to examine the closeness between documents in retrieval systems and analyzes the topological structure of a vector-space model, a fuzzy-set model, an extended Boolean model, a probabilistic model, and a TIRS (Topological Information Retrieval System) model. Proofs for the results are appended. (17…
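In the vector-space model examined above, the closeness between two documents is conventionally measured by the cosine of the angle between their term-weight vectors. A minimal sketch of that measure, with made-up documents and term frequencies (the paper's topological treatment is more general than this single metric):

```python
import math

def cosine_similarity(a, b):
    # Closeness of two documents represented as term-weight vectors
    # (dicts mapping terms to weights, e.g. term frequencies).
    terms = set(a) | set(b)
    dot = sum(a.get(t, 0.0) * b.get(t, 0.0) for t in terms)
    norm_a = math.sqrt(sum(w * w for w in a.values()))
    norm_b = math.sqrt(sum(w * w for w in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

doc1 = {"document": 2, "retrieval": 1}
doc2 = {"document": 1, "retrieval": 1, "model": 1}
print(round(cosine_similarity(doc1, doc2), 3))  # → 0.775
```

Identical documents score 1.0 and documents sharing no terms score 0.0, which is the sense of "closeness" the topological analysis formalizes.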

  6. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, software development is becoming more complex than ever. Based on that, this article presents a methodology for software documentation and analyzes our experience of how this methodology can aid software maintenance.

  7. Foreign patent documentation and information research

    International Nuclear Information System (INIS)

    Wang Tongsheng; Wu Xianfeng; Liu Jia; Cao Jifen; Song Tianbao; Feng Beiyuan; Zhang Baozhu

    2014-01-01

    Patent documentations are important scientific and technical documentations, which gather together legal, technical and economic information. According to WIPO forecasts, making full use of patent documentation can save 40% of research funding and 60% of the study period. Foreign patent documentations are the world's most valuable patent documentations, and many original technologies that have had significant influence were first disclosed in foreign patent documentation. Studying and making use of foreign patent documentations can raise the starting point of our scientific and technological innovation and reduce research investment. This paper analyzes foreign patent documentation and, in combination with the actual development of nuclear technology in our country, makes specific recommendations for patent documentation research. (authors)

  8. Registration document 2005; Document de reference 2005

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This reference document of Gaz de France provides information and data on the Group's activities in 2005: financial information, business, activities, equipment, factories and real estate, trade, capital, organization charts, employment, contracts and research programs. (A.L.B.)

  9. 2002 reference document; Document de reference 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This 2002 reference document of the Areva group provides information on the company. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements; information pertaining to the transaction; general information on the company and share capital; information on company operations, changes and future prospects; assets, financial position and financial performance; information on company management, the executive board and the supervisory board; and recent developments and future prospects. (A.L.B.)

  10. Sources for the study of collective violence in Early Modern Navarre: the value of the judicial documentation Fuentes para el estudio de la violencia colectiva en la Navarra moderna: el valor de la documentación procesal

    Directory of Open Access Journals (Sweden)

    Javier Ruiz Astiz

    2012-03-01

    Full Text Available During the Old Regime, the Kingdom of Navarre witnessed the constant presence of disorders that disturbed public order. The aim of this article is to demonstrate that these phenomena can be studied satisfactorily through judicial court records. Thanks to these archival sources, the basic characteristics of such events can be reconstructed, helping historians to uncover their true background. Nevertheless, this article also seeks to examine this type of documentation, showing not only the advantages implicit in its use but also the drawbacks it entails. Above all, however, what prevails are the numerous benefits that such sources offer researchers who decide to undertake the study of past eras.

  11. Web document engineering

    International Nuclear Information System (INIS)

    White, B.

    1996-05-01

    This tutorial provides an overview of several document engineering techniques which are applicable to the authoring of World Wide Web documents. It illustrates how pre-WWW hypertext research is applicable to the development of WWW information resources

  12. Enterprise Document Management

    Data.gov (United States)

    US Agency for International Development — The function of the operation is to provide e-Signature and document management support for Acquisition and Assistance (A&A) documents including vouchers in...

  13. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
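The three-component determination described above can be viewed as linear spectral unmixing: the measured absorbance at each wavelength is modeled as a fraction-weighted sum of pure-component spectra. A simplified sketch with hypothetical absorbance values at just three wavelengths (the real analyzer uses many NIR channels plus induced fluorescence and a statistical fit, not an exact 3x3 solve):

```python
def det3(m):
    # Determinant of a 3x3 matrix, expanded along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(A, b):
    # Solve a 3x3 linear system A x = b with Cramer's rule.
    d = det3(A)
    x = []
    for c in range(3):
        m = [row[:] for row in A]
        for r in range(3):
            m[r][c] = b[r]
        x.append(det3(m) / d)
    return x

# Hypothetical pure-component absorbances at three wavelengths.
pure = {"oil": [0.9, 0.2, 0.1], "water": [0.1, 0.8, 0.2], "gas": [0.05, 0.1, 0.6]}
A = [[pure["oil"][i], pure["water"][i], pure["gas"][i]] for i in range(3)]
mixed = [0.49, 0.36, 0.23]          # measured spectrum of the mixture
oil, water, gas = solve3(A, mixed)  # recovered volume fractions
print(round(oil, 3), round(water, 3), round(gas, 3))  # → 0.5 0.3 0.2
```

With more wavelengths than components, the same idea becomes a least-squares fit, which is what keeps the standard error of the recovered fractions small in practice.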

  14. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  15. WIPP documentation plan

    International Nuclear Information System (INIS)

    Plung, D.L.; Montgomery, T.T.; Glasstetter, S.R.

    1986-01-01

    In support of the programs at the Waste Isolation Pilot Plant (WIPP), the Publications and Procedures Section developed a documentation plan that provides an integrated document hierarchy; further, this plan affords several unique features: 1) the format for procedures minimizes the writing responsibilities of the technical staff and maximizes use of the writing and editing staff; 2) review cycles have been structured to expedite the processing of documents; and 3) the numbers of documents needed to support the program have been appreciably reduced

  16. Documenting Employee Conduct

    Science.gov (United States)

    Dalton, Jason

    2009-01-01

    One of the best ways for a child care program to lose an employment-related lawsuit is failure to document the performance of its employees. Documentation of an employee's performance can provide evidence of an employment-related decision such as discipline, promotion, or discharge. When properly implemented, documentation of employee performance…

  17. Documents preparation and review

    International Nuclear Information System (INIS)

    1999-01-01

    Ignalina Safety Analysis Group takes an active role in assisting the regulatory body VATESI to prepare various regulatory documents and in reviewing safety reports and other documentation presented by Ignalina NPP in the process of licensing of unit 1. The list of the main documents prepared and reviewed is presented

  18. Airborne radionuclide waste-management reference document

    International Nuclear Information System (INIS)

    Brown, R.A.; Christian, J.D.; Thomas, T.R.

    1983-07-01

    This report provides the detailed data required to develop a strategy for airborne radioactive waste management by the Department of Energy (DOE). The airborne radioactive materials of primary concern are tritium (H-3), carbon-14 (C-14), krypton-85 (Kr-85), iodine-129 (I-129), and radioactive particulate matter. The introductory section of the report describes the nature and broad objectives of airborne waste management. The relationship of airborne waste management to other waste management programs is described. The scope of the strategy is defined by considering all potential sources of airborne radionuclides and technologies available for their management. Responsibilities of the regulatory agencies are discussed. Section 2 of this document deals primarily with projected inventories, potential releases, and dose commitments of the principal airborne wastes from the light water reactor (LWR) fuel cycle. In Section 3, dose commitments, technologies, costs, regulations, and waste management criteria are analyzed. Section 4 defines goals and objectives for airborne waste management

  19. Starlink Document Styles

    Science.gov (United States)

    Lawden, M. D.

    This document describes the various styles which are recommended for Starlink documents. It also explains how to use the templates which are provided by Starlink to help authors create documents in a standard style. This paper is concerned mainly with conveying the "look and feel" of the various styles of Starlink document rather than describing the technical details of how to produce them. Other Starlink papers give recommendations for the detailed aspects of document production, design, layout, and typography. The only style that is likely to be used by most Starlink authors is the Standard style.

  20. Subject (of documents)

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses the concept “subject” or subject matter (of documents) as it has been examined in library and information science (LIS) for more than 100 years. Different theoretical positions are outlined, and it is found that the most important distinction is between document-oriented views versus request-oriented views. The document-oriented view conceives subject as something inherent in documents, whereas the request-oriented view (or the policy-based view) understands subject as an attribution made to documents in order to facilitate certain uses of them. Related concepts...

  1. New Challenges of the Documentation in Media

    Directory of Open Access Journals (Sweden)

    Antonio García Jiménez

    2015-07-01

    Full Text Available This special issue, presented by index.comunicación, focuses on information and documentation related to the media. This field undergoes constant and profound changes, especially visible in documentation processes: a situation characterized by tablets, smartphones and applications, by the nearly complete digitization of traditional documents, and by the crisis of the press business model, which involves changes in journalists' tasks and in their relationship with documentation. The papers in this special issue address some of the main concerns in this domain: the growing autonomy of the journalist in accessing information sources, the role of press offices as documentation sources, searching for information on the web, the situation of media blogs, the viability of information-architecture elements in smart TV, and the development of social TV and its connection to documentation.

  2. A Relevance-Extended Multi-dimensional Model for a Data Warehouse Contextualized with Documents

    DEFF Research Database (Denmark)

    Perez, Juan Manuel; Pedersen, Torben Bach; Berlanga, Rafael

    2005-01-01

    Current data warehouse and OLAP technologies can be applied to analyze the structured data that companies store in their databases. The circumstances that describe the context associated with these data can be found in other internal and external sources of documents. In this paper we propose to combine the traditional corporate data warehouse with a document warehouse, resulting in a contextualized warehouse. Thus, contextualized warehouses keep a historical record of the facts and their contexts as described by the documents. In this framework, the user selects an analysis context which...

  3. Forensic document analysis using scanning microscopy

    Science.gov (United States)

    Shaffer, Douglas K.

    2009-05-01

    The authentication and identification of the source of printed documents can be important in forensic investigations involving a wide range of fraudulent materials, including counterfeit currency, travel and identity documents, business and personal checks, money orders, prescription labels, travelers checks, medical records, financial documents and threatening correspondence. The physical and chemical characterization of document materials, including paper, writing inks and printed media, is becoming increasingly relevant for law enforcement agencies, given the availability of a wide variety of sophisticated commercial printers and copiers capable of producing fraudulent documents of extremely high print quality, rendering them difficult to distinguish from genuine documents. This paper describes various applications and analytical methodologies using scanning electron microscopy/energy dispersive (x-ray) spectroscopy (SEM/EDS) and related technologies for the characterization of fraudulent documents, and illustrates how their morphological and chemical profiles can be compared to (1) authenticate documents and (2) link forensic documents sharing a common source in their production history.

  4. From a Content Delivery Portal to a Knowledge Management System for Standardized Cancer Documentation.

    Science.gov (United States)

    Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard

    2017-01-01

    Heterogeneous tumor documentation and the challenges of interpreting medical terms lead to problems in analyzing data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts from different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It is targeted at user groups such as documentation officers, physicians and patients. Linkage to other information sources like PubMed and MeSH was realized.

  5. Scheme Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2004-01-01

    are separate and intended for different documentation purposes, they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available as named functions in Scheme. Finally, the Scheme Elucidator is able to integrate SchemeDoc resources as part of an internal documentation resource....

  6. Varsity letters documenting modern colleges and universities

    CERN Document Server

    Samuels, Helen Willa

    1998-01-01

    A study of the functions of colleges and universities, Varsity Letters is intended to aid those responsible for the documentation of these institutions. Samuels offers specific advice about the records of modern colleges and universities and proposes a method to ensure their adequate documentation. She also offers a method to analyze and plan the preservation of records for any type of institution.

  7. Health physics documentation

    International Nuclear Information System (INIS)

    Stablein, G.

    1980-01-01

    When dealing with radioactive material, the health physicist receives innumerable papers and documents within the fields of researching, prosecuting, organizing and justifying radiation protection. Some of these papers are requested by the health physicist and some are required by law. The scope, quantity and retention periods of the health physics documentation at the Karlsruhe Nuclear Research Center are presented and rationalization methods discussed. The aim of this documentation should be the application of physics to accident prevention, i.e. documentation should protect those concerned and not the health physicist. (H.K.)

  8. CAED Document Repository

    Data.gov (United States)

    U.S. Environmental Protection Agency — Compliance Assurance and Enforcement Division Document Repository (CAEDDOCRESP) provides internal and external access to Inspection Records, Enforcement Actions, and...

  9. CFO Payment Document Management

    Data.gov (United States)

    US Agency for International Development — Paperless management will enable the CFO to create, store, and access various financial documents electronically. This capability will reduce time looking for...

  10. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
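The signature-based analysis described here can be illustrated with a small sketch. The signatures and log line below are hypothetical examples for illustration, not rules taken from the work itself:

```python
import re
from urllib.parse import unquote

# Hypothetical injection-attack signatures as configurable regular expressions.
SIGNATURES = {
    "sql_injection": re.compile(r"(?i)union\s+select|'\s*or\s+1=1"),
    "path_traversal": re.compile(r"\.\./"),
    "xss": re.compile(r"(?i)<script\b"),
}

def analyze_log_line(line):
    # URL-decode the request first, then report every matching signature.
    decoded = unquote(line)
    return [name for name, sig in SIGNATURES.items() if sig.search(decoded)]

print(analyze_log_line("GET /search?q=1%20UNION%20SELECT%20password HTTP/1.1"))
# → ['sql_injection']
```

Decoding before matching matters: attackers routinely percent-encode payloads, so a signature applied to the raw log line would miss them.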

  11. IDC System Specification Document.

    Energy Technology Data Exchange (ETDEWEB)

    Clifford, David J.

    2014-12-01

    This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revisions: V1.0, 12/2014, IDC Reengineering Project Team, initial delivery, authorized by M. Harris.

  12. INFCE plenary conference documents

    International Nuclear Information System (INIS)

    This document consists of the reports to the First INFCE Plenary Conference (November 1978) by the Working Groups, a Plenary Conference record of its actions and decisions, the Communique of the Final INFCE Plenary Conference (February 1980), and a list of all documents in the IAEA depository for INFCE

  13. Human Document Project

    NARCIS (Netherlands)

    de Vries, Jeroen; Abelmann, Leon; Manz, A; Elwenspoek, Michael Curt

    2012-01-01

    “The Human Document Project” is a project which tries to answer all of the questions related to preserving information about the human race for tens of generations of humans to come or maybe even for a future intelligence which can emerge in the coming thousands of years. This document mainly

  14. Reactive documentation system

    Science.gov (United States)

    Boehnlein, Thomas R.; Kramb, Victoria

    2018-04-01

    Proper formal documentation of computer-acquired NDE experimental data generated during research is critical to the longevity and usefulness of the data. Without documentation describing how and why the data were acquired, NDE research teams lose capabilities such as the ability to generate new information from previously collected data or to provide adequate information so that their work can be replicated by others seeking to validate their research. Despite the critical nature of this issue, NDE data are still being generated in research labs without appropriate documentation. By generating documentation in series with data, equal priority is given to both activities during the research process. One way to achieve this is to use a reactive documentation system (RDS). An RDS prompts an operator to document the data as it is generated rather than relying on the operator to decide when and what to document. This paper discusses how such a system can be implemented in a dynamic environment made up of in-house and third-party NDE data acquisition systems without placing additional burden on the operator. The reactive documentation approach presented here is agnostic enough that its principles can be applied to any operator-controlled, computer-based data acquisition system.
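The reactive idea can be sketched as a wrapper that demands a documentation record in series with each acquisition. The function names and record fields here are hypothetical illustrations, not the interface of the authors' system:

```python
import datetime

def reactive_acquire(acquire_fn, describe_fn):
    # Acquire the data, then immediately solicit its documentation
    # (describe_fn stands in for the operator prompt), so the metadata
    # is generated in series with the data rather than after the fact.
    data = acquire_fn()
    record = {
        "timestamp": datetime.datetime.now().isoformat(),
        "n_samples": len(data),
        "description": describe_fn(data),
    }
    return data, record

data, record = reactive_acquire(
    lambda: [0.1, 0.4, 0.9],                 # stand-in for an NDE scan
    lambda d: "calibration scan, probe #3",  # stand-in for the operator prompt
)
print(record["description"])  # → calibration scan, probe #3
```

Because the description callback runs inside the acquisition path, undocumented data simply cannot be produced, which is the point of the reactive approach.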

  15. Documentation: Records and Reports.

    Science.gov (United States)

    Akers, Michael J

    2017-01-01

    This article deals with documentation, including the origins of documentation, the requirements for Good Manufacturing Practice reports and records, and the steps that can be taken to minimize Good Manufacturing Practice documentation problems. It is important to remember that documentation for 503a compounding involves the Formulation Record, Compounding Record, Standard Operating Procedures, Safety Data Sheets, etc. For 503b outsourcing facilities, compliance with Current Good Manufacturing Practices is required, so this article is applicable to them as well. For 503a pharmacies, one can see the development and modification of Good Manufacturing Practice, observe changes as they occur in 503a documentation requirements, and anticipate that changes will probably continue to occur. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  16. Civilian Radioactive Waste Management System Requirements Document

    International Nuclear Information System (INIS)

    1992-12-01

    This document specifies the top-level requirements for the Civilian Radioactive Waste Management System (CRWMS). The document is referred to herein as the CRD, for CRWMS Requirements Document. The OCRWM System Engineering Management Plan (SEMP) establishes the technical document hierarchy (hierarchy of technical requirements and configuration baseline documents) for the CRWMS program. The CRD is the top-level document in this hierarchy. The immediate subordinate documents are the System Requirements Documents (SRDs) for the four elements of the CRWMS and the Interface Specification (IFS). The four elements of the CRWMS are the Waste Acceptance System, the Transportation System, the Monitored Retrievable Storage (MRS) System and the Mined Geologic Disposal System (MGDS). The Interface Specification describes the six inter-element interfaces between the four elements. This hierarchy establishes the requirements to be addressed by the design of the system elements. Many of the technical requirements for the CRWMS are documented in a variety of Federal regulations, DOE directives and other Government documentation. It is the purpose of the CRD to establish the technical requirements for the entire program. In doing so, the CRD summarizes source documentation for requirements that must be addressed by the program, specifies particular requirements, and documents derived requirements that are not covered in regulatory and other Government documentation but are necessary to accomplish the mission of the CRWMS. The CRD defines the CRWMS by identifying the top-level functions the elements must perform (these top-level functions were derived using functional analysis initially documented in the Physical System Requirements (PSR) documents). The CRD also defines the top-level physical architecture of the system and allocates the functions and requirements to the architectural elements of the system

  17. CNEA's quality system documentation

    International Nuclear Information System (INIS)

    Mazzini, M.M.; Garonis, O.H.

    1998-01-01

    Full text: To obtain an effective and coherent documentation system suitable for CNEA's Quality Management Program, we decided to organize CNEA's quality documentation as follows: a- Level 1: Quality manual. b- Level 2: Procedures. c- Level 3: Quality plans. d- Level 4: Instructions. e- Level 5: Records and other documents. The objective of this work is to present a standardization of the documentation of CNEA's quality system for facilities, laboratories, services, and R and D activities. Considering the diversity of criteria and formats used by different departments in elaborating the documentation, and since each of them ultimately includes the same quality management policy, we proposed the elaboration of a system to improve the documentation, avoiding unnecessary waste of time and costs. This will allow each sector to focus on its specific documentation. The quality manuals of the atomic centers fulfill rule 3.6.1 of the Nuclear Regulatory Authority and Safety Series 50-C/SG-Q of the International Atomic Energy Agency. They are designed by groups of competent and highly trained people from different departments. The normative procedures are elaborated with the same methodology as the quality manuals. The quality plans, which describe the organizational structure of the working groups and the appropriate documentation, will assess the quality manuals of facilities, laboratories, services, and research and development activities of the atomic centers. Responsibility for approval of the normative documentation is assigned to the management in charge of administering economic and human resources in order to fulfill the institutional objectives. Another improvement, aimed at eliminating unnecessary processes, is the inclusion of all the quality system's normative documentation in the CNEA intranet. (author) [es

  18. Manual of Documentation Practices Applicable to Defence-Aerospace Scientific and Technical Information. Volume 1. Section 1 - Acquisition and Sources. Section 2 - Descriptive Cataloguing. Section 3 - Abstracting and Subject Analysis

    Science.gov (United States)

    1978-08-01

    be added for new subcategories. The Dewey Decimal Classification, the Library of Congress Classification of the United States, and the Universal...to be published. A translation duplicate is a translation of a report or an article into another language. DDC (DOD/USA) Defense Documentation Center...controls need not be elaborate. The system described below has proved adequate over a number of years of operation at the Defense Documentation Center ( DDC

  19. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal-hydraulics are described. (author)

  20. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  1. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.
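The report's diagnostic algorithms are not reproduced in this abstract, but a common single-number condition indicator for gearbox vibration is the kurtosis of the signal: healthy gear meshing looks roughly sinusoidal (kurtosis about 1.5), while impacts from a damaged tooth add spikes that drive the value up. A minimal sketch of that idea (illustrative only; the analyzer's actual damage parameter is not specified here, and the signals below are synthetic):

```python
import math

def kurtosis(signal):
    """Fourth standardized moment of a signal; spikes raise it sharply."""
    n = len(signal)
    mu = sum(signal) / n
    var = sum((v - mu) ** 2 for v in signal) / n
    m4 = sum((v - mu) ** 4 for v in signal) / n
    return m4 / var ** 2

# Healthy mesh: a pure tone sampled over whole periods has kurtosis 1.5.
healthy = [math.sin(2 * math.pi * k / 64) for k in range(64)]

# Damaged tooth: one impact per revolution appears as a spike.
damaged = list(healthy)
damaged[10] += 5.0
```

Thresholding such an indicator (for example, flagging values well above 3) is one simple way a monitor can report "a parameter indicating damaged components".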

  2. Portable Fuel Quality Analyzer

    Science.gov (United States)

    2014-01-27

other transportation industries, such as trucking. The PFQA could also be used in fuel blending operations performed at petroleum, ethanol and biodiesel plants.

  3. Software requirements specification document for the AREST code development

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.; Whitney, P.D.; Gray, W.J.; Williford, R.E.; White, M.D.; Eslinger, P.W.; Altenhofen, M.K.

    1993-11-01

    The Analysis of the Repository Source Term (AREST) computer code was selected in 1992 by the U.S. Department of Energy. The AREST code will be used to analyze the performance of an underground high level nuclear waste repository. The AREST code is being modified by the Pacific Northwest Laboratory (PNL) in order to evaluate the engineered barrier and waste package designs, model regulatory compliance, analyze sensitivities, and support total systems performance assessment modeling. The current version of the AREST code was developed to be a very useful tool for analyzing model uncertainties and sensitivities to input parameters. The code has also been used successfully in supplying source-terms that were used in a total systems performance assessment. The current version, however, has been found to be inadequate for the comparison and selection of a design for the waste package. This is due to the assumptions and simplifications made in the selection of the process and system models. Thus, the new version of the AREST code will be designed to focus on the details of the individual processes and implementation of more realistic models. This document describes the requirements of the new models that will be implemented. Included in this document is a section describing the near-field environmental conditions for this waste package modeling, description of the new process models that will be implemented, and a description of the computer requirements for the new version of the AREST code

  4. TRANSPORTATION SYSTEM REQUIREMENTS DOCUMENT

    International Nuclear Information System (INIS)

    2004-01-01

    This document establishes the Transportation system requirements for the U.S. Department of Energy's (DOE's) Civilian Radioactive Waste Management System (CRWMS). These requirements are derived from the Civilian Radioactive Waste Management System Requirements Document (CRD). The Transportation System Requirements Document (TSRD) was developed in accordance with LP-3.1Q-OCRWM, Preparation, Review, and Approval of Office of National Transportation Level-2 Baseline Requirements. As illustrated in Figure 1, the TSRD forms a part of the DOE Office of Civilian Radioactive Waste Management (OCRWM) Technical Baseline

  5. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input

  6. Applications for electronic documents

    International Nuclear Information System (INIS)

    Beitel, G.A.

    1995-01-01

This paper discusses the application of electronic media to documents, specifically Safety Analysis Reports (SARs), prepared for Environmental Restoration and Waste Management (ER&WM) programs being conducted for the Department of Energy (DOE) at the Idaho National Engineering Laboratory (INEL). Efforts are underway to upgrade our document system using an electronic format. To satisfy external requirements (DOE, State, and Federal), ER&WM programs generate a complement of internal requirements documents, including a SAR and Technical Safety Requirements, along with procedures and training materials. Of interest is the volume of information and the difficulty in handling it. A recently prepared ER&WM SAR consists of 1,000 pages of text and graphics; supporting references add 10,000 pages. Other programmatic requirements documents consist of an estimated 5,000 pages plus references

  7. Informational system. Documents management

    Directory of Open Access Journals (Sweden)

    Vladut Iacob

    2009-12-01

Full Text Available Productivity growth, as well as a reduction of operational costs in a company, can be achieved by adopting a document management solution. Such an application will allow structured and efficient management and transmission of information within the organization.

  8. Transportation System Requirements Document

    International Nuclear Information System (INIS)

    1993-09-01

    This Transportation System Requirements Document (Trans-SRD) describes the functions to be performed by and the technical requirements for the Transportation System to transport spent nuclear fuel (SNF) and high-level radioactive waste (HLW) from Purchaser and Producer sites to a Civilian Radioactive Waste Management System (CRWMS) site, and between CRWMS sites. The purpose of this document is to define the system-level requirements for Transportation consistent with the CRWMS Requirement Document (CRD). These requirements include design and operations requirements to the extent they impact on the development of the physical segments of Transportation. The document also presents an overall description of Transportation, its functions, its segments, and the requirements allocated to the segments and the system-level interfaces with Transportation. The interface identification and description are published in the CRWMS Interface Specification

  9. Integrated Criteria Document Chromium

    NARCIS (Netherlands)

    Slooff W; Cleven RFMJ; Janus JA; van der Poel P; van Beelen P; Boumans LJM; Canton JH; Eerens HC; Krajnc EI; de Leeuw FAAM; Matthijsen AJCM; van de Meent D; van der Meulen A; Mohn GR; Wijland GC; de Bruijn PJ; van Keulen A; Verburgh JJ; van der Woerd KF

    1990-01-01

This is the English version of report 758701001.
An appendix with the same number accompanies this report, entitled: "Integrated Criteria Document Chromium: Effects". Authors: Janus JA; Krajnc EI
(appendix: see 710401002A)

  10. NCDC Archive Documentation Manuals

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Climatic Data Center Tape Deck Documentation library is a collection of over 400 manuals describing NCDC's digital holdings (both historic and current)....

  11. Registration document 2005

    International Nuclear Information System (INIS)

    2005-01-01

This reference document of Gaz de France provides information and data on the Group's activities in 2005: financial information, business, activities, equipment, factories and real estate, trade, capital, organization charts, employment, contracts and research programs. (A.L.B.)

  12. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are very well suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water, followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filter and trap, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but we must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  13. Are PDF Documents Accessible?

    Directory of Open Access Journals (Sweden)

    Mireia Ribera Turró

    2008-09-01

    Full Text Available Adobe PDF is one of the most widely used formats in scientific communications and in administrative documents. In its latest versions it has incorporated structural tags and improvements that increase its level of accessibility. This article reviews the concept of accessibility in the reading of digital documents and evaluates the accessibility of PDF according to the most widely established standards.

  14. 2002 reference document

    International Nuclear Information System (INIS)

    2002-01-01

This 2002 reference document of the Areva group provides information on the company. Organized in seven chapters, it presents the persons responsible for the reference document and for auditing the financial statements; information pertaining to the transaction; general information on the company and share capital; information on company operations, changes and future prospects; assets, financial position and financial performance; information on company management and the executive and supervisory boards; and recent developments and future prospects. (A.L.B.)

  15. LCS Content Document Application

    Science.gov (United States)

    Hochstadt, Jake

    2011-01-01

My project at KSC during my spring 2011 internship was to develop a Ruby on Rails application to manage Content Documents. A Content Document is a collection of documents and information that describes what software is installed on a Launch Control System computer. It's important for us to make sure the tools we use every day are secure, up-to-date, and properly licensed. Previously, keeping track of the information was done with Excel and Word files passed between different personnel. The goal of the new application is to be able to manage and access the Content Documents through a single database-backed web application. Our LCS team will benefit greatly from this app. Admins will be able to log in securely to keep track of and update the software installed on each computer in a timely manner. We also included exportability, such as attaching additional documents that can be downloaded from the web application. The finished application will ease the process of managing Content Documents while streamlining the procedure. Ruby on Rails is a very powerful framework and I am grateful to have the opportunity to build this application.

  16. Technical approach document

    International Nuclear Information System (INIS)

    1988-04-01

This document describes the general technical approaches and design criteria adopted by the US Department of Energy (DOE) in order to implement Remedial Action Plans (RAPs) and final designs that comply with EPA standards. This document is a revision of the original document. Major revisions were made to the sections on riprap selection and sizing, and on ground water; only minor revisions were made to the remainder of the document. The US Nuclear Regulatory Commission (NRC) has prepared a Standard Review Plan (NRC-SRP) which describes factors to be considered by the NRC in approving the RAP. Sections 3.0, 4.0, 5.0, and 7.0 of this document are arranged under the same headings as those used in the NRC-SRP. This approach is adopted in order to facilitate joint use of the documents. Section 2.0 (not included in the NRC-SRP) discusses design considerations; Section 3.0 describes surface-water hydrology and erosion control; Section 4.0 describes geotechnical aspects of pile design; Section 5.0 discusses the Alternate Site Selection Process; Section 6.0 deals with radiological issues (in particular, the design of the radon barrier); Section 7.0 discusses protection of groundwater resources; and Section 8.0 discusses site design criteria for the RAC

  17. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

The automation of the extraction spectrophotometric determination of uranium in a solution is discussed. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design of an analyzer consisting of an analytical unit and a control unit is described. The analyzer promises increased labour productivity, improved operating and hygiene conditions, and, above all, more accurate analysis results. (J.C.)

  18. Securing XML Documents

    Directory of Open Access Journals (Sweden)

    Charles Shoniregun

    2004-11-01

Full Text Available XML (extensible markup language is becoming the current standard for establishing interoperability on the Web. XML data are self-descriptive and syntax-extensible; this makes XML very suitable for the representation and exchange of semi-structured data, and allows users to define new elements for their specific applications. As a result, the number of documents incorporating this standard is continuously increasing over the Web. The processing of XML documents may require a traversal of the whole document structure, and the cost can therefore be very high. A strong demand for a means of efficient and effective XML processing has posed a new challenge for the database world. This paper discusses a fast and efficient indexing technique for XML documents, and introduces the XML graph numbering scheme. It can be used for indexing and securing the graph structure of XML documents. This technique provides an efficient method to speed up XML data processing. Furthermore, the paper explores the classification of existing methods, the impact on query processing, and indexing.
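The paper's exact numbering scheme is not given in this abstract, but the general idea behind graph/interval numbering for XML indexing can be sketched in a few lines: assign each node a pre-order and a post-order number in one traversal, after which an ancestor test becomes two integer comparisons instead of a structure traversal. A hypothetical sketch (the function names are mine, not the paper's):

```python
import xml.etree.ElementTree as ET

def number(root):
    """Assign (pre, post) traversal numbers to every node of an XML tree."""
    pre, post, counters = {}, {}, [0, 0]
    def walk(node):
        pre[node] = counters[0]
        counters[0] += 1
        for child in node:
            walk(child)
        post[node] = counters[1]
        counters[1] += 1
    walk(root)
    return pre, post

doc = ET.fromstring("<a><b><c/></b><d/></a>")
pre, post = number(doc)
b, c, d = doc.find("b"), doc.find("b/c"), doc.find("d")

def is_ancestor(x, y):
    # x is an ancestor of y iff it starts before y and ends after it.
    return pre[x] < pre[y] and post[x] > post[y]
```

Because the numbers are computed once and stored, structural queries (ancestor/descendant axes) never re-traverse the document, which is the speed-up such indexing schemes aim for.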

  19. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.

  20. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

In this note we analyze American options, in a discrete-time context and with a finite outcome space, starting with the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  1. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  2. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

The development of the centrifugal fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported, and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with a microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  3. Analyzing MER Uplink Reports

    Science.gov (United States)

    Savin, Stephen C.

    2005-01-01

    The MER project includes two rovers working simultaneously on opposite sides of Mars each receiving commands only once a day. Creating this uplink is critical, since a failed uplink means a lost day and a waste of money. Examining the process of creating this uplink, I tracked the use of the system developed for requesting observations as well as the development, from stage to stage, in forming an activity plan. I found the system for requesting observations was commonly misused, if used at all. There are half a dozen reports to document the creation of the uplink plan and often there are discrepancies among them. Despite this, the uplink process worked very well and MER has been one of the most successful missions for NASA in recent memory. Still it is clear there is room for improvement.

  4. Segmentation of complex document

    Directory of Open Access Journals (Sweden)

    Souad Oudjemia

    2014-06-01

Full Text Available In this paper we present a method for the segmentation of document images with complex structure. This technique, based on the GLCM (Grey Level Co-occurrence Matrix), is used to segment this type of document into three regions, namely 'graphics', 'background' and 'text'. Very briefly, this method divides the document image into blocks of a size chosen after a series of tests, and then applies the co-occurrence matrix to each block in order to extract five textural parameters: energy, entropy, sum entropy, difference entropy and standard deviation. These parameters are then used to classify the image into three regions using the k-means algorithm; the last step of the segmentation is obtained by grouping connected pixels. Two performance measurements are made for both the graphics and text zones; we obtained a classification rate of 98.3% and a misclassification rate of 1.79%.
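As a rough illustration of the per-block texture idea (not the authors' code; the block contents, quantization and the single horizontal offset below are arbitrary choices of mine), one can compute a normalized co-occurrence matrix for a block and derive two of the five parameters, energy and entropy:

```python
import math

def glcm(block, levels, dx=1, dy=0):
    """Normalized Grey Level Co-occurrence Matrix for one pixel offset."""
    h, w = len(block), len(block[0])
    m = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[block[y][x]][block[ny][nx]] += 1
                total += 1
    return [[v / total for v in row] for row in m]

def features(m):
    """Energy and entropy, two of the five textural parameters described."""
    energy = sum(p * p for row in m for p in row)
    entropy = -sum(p * math.log2(p) for row in m for p in row if p > 0)
    return energy, entropy
```

A constant block concentrates all co-occurrence mass in one cell (energy 1, entropy 0), while a textured block spreads it out; feeding such per-block feature vectors to k-means with k = 3 mirrors the classification step the abstract describes.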

  5. La Documentation photographique

    Directory of Open Access Journals (Sweden)

    Magali Hamm

    2009-03-01

Full Text Available La Documentation photographique, a magazine for teachers and students of history and geography, places the image at the heart of its editorial line. In order to follow the current evolutions of geography, the collection presents an increasingly diversified iconography: maps and photographs, but also caricatures, newspaper front pages and advertisements, all of which are considered geographical documents in their own right. An image can serve as a synthesis; conversely, it can show the different facets of a single object; often it makes it possible to embody geographical phenomena. Combined with other documents, images help teachers initiate their students into complex geographical reasoning. But in order to learn how to read them, it is fundamental to contextualize them, comment on them and question their relation to reality.

  6. Managing the consistency of distributed documents

    OpenAIRE

    Nentwich, C.

    2005-01-01

    Many businesses produce documents as part of their daily activities: software engineers produce requirements specifications, design models, source code, build scripts and more; business analysts produce glossaries, use cases, organisation charts, and domain ontology models; service providers and retailers produce catalogues, customer data, purchase orders, invoices and web pages. What these examples have in common is that the content of documents is often semantically relate...

  7. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1990-04-01

    This document is a monthly publication containing descriptions of information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials, and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. The following indexes are included: Personal Author, Corporate Source, Report Number, and Cross Reference to Principal Documents

  8. Rating of transport and radiation source events. Draft additional guidance for the INES national officers for pilot use and feedback; Echelle de classement des incidents de radioprotection: document d'application du systeme international propose par l'AIEA pour les sources radioactives et les transports

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-09-15

The International Nuclear Event Scale (INES) is a means for promptly communicating to the public, in consistent terms, the safety significance of any reported event associated with radioactive material and/or radiation, and of any event occurring during the transport of radioactive material. As described in the 2001 Edition of the INES User's Manual, events are classified on the scale at seven levels: the upper levels (4-7) are termed accidents and the lower levels (1-3) incidents. Events which have no safety significance are classified below scale at Level 0 and termed deviations. An overview of the principles for rating under INES, together with flow charts summarizing the rating process, is provided in Appendix I. The 2001 Edition of the INES User's Manual provides some guidance for the rating of transport and radiation source events. At the technical meeting held in 2002, the INES National Officers requested the IAEA/NEA Secretariat to prepare additional guidance. Progress was reported at the Technical Meeting of the INES National Officers in March 2004, where preparation of this draft additional guidance was requested for pilot use. This note provides additional guidance on the rating of transport and radiation source events. It is for pilot use and feedback and is broadly consistent with the INES User's Manual. It provides more detailed information and an expanded approach for rating based on actual exposure of workers and members of the public. It is designed to be used as a self-standing document with limited need for reference to the INES User's Manual. (author)

  9. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions, such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  10. Customer Communication Document

    Science.gov (United States)

    2009-01-01

This procedure communicates to the customers of the Automation, Robotics and Simulation Division (AR&SD) Dynamics Systems Test Branch (DSTB) how to obtain the services of the Six-Degrees-Of-Freedom Dynamic Test System (SDTS). The scope includes the major communication documents between the SDTS and its customer. It establishes the initial communication and contact points, and provides the initial documentation in electronic media for the customer. Contact the SDTS Manager (SM) for the names and numbers of the current contact points.

  11. Document reconstruction by layout analysis of snippets

    Science.gov (United States)

    Kleber, Florian; Diem, Markus; Sablatnig, Robert

    2010-02-01

Document analysis is done to analyze entire forms (e.g. intelligent form analysis, table detection) or to describe the layout/structure of a document. Skew detection of scanned documents is also performed to support OCR algorithms that are sensitive to skew. In this paper, document analysis is applied to snippets of torn documents to calculate features for their reconstruction. Documents can be destroyed either with the intention of making the printed content unavailable (e.g. tax fraud investigation, business crime) or by time-induced degradation of ancient documents (e.g. bad storage conditions). Current reconstruction methods for manually torn documents deal with shape, inpainting and texture synthesis techniques. In this paper, the possibility of applying document analysis techniques to snippets to support the matching algorithm with additional features is shown. This implies a rotational analysis, a color analysis and a line detection. As future work it is planned to extend the feature set with the paper type (blank, checked, lined), the type of the writing (handwritten vs. machine printed) and the text layout of a snippet (text size, line spacing). Preliminary results show that these pre-processing steps can be performed reliably on a real dataset consisting of 690 snippets.

  12. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
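The abstract's central point, that soft decisions carry information a hard-decision BER discards, can be shown with a toy serial link: every hard decision below is correct (BER = 0), yet the soft outputs expose two symbols with almost no decision margin, exactly the kind of early warning a soft-decision analyzer exploits. A hypothetical sketch (the symbol values are invented for illustration, not taken from the SDA):

```python
import statistics

sent = [1, -1, 1, 1, -1]               # transmitted antipodal symbols
soft = [0.9, -0.2, 1.1, 0.05, -0.8]    # receiver soft-decision outputs

# Traditional test statistic: hard-decision bit error rate.
hard = [1 if s >= 0 else -1 for s in soft]
ber = sum(h != t for h, t in zip(hard, sent)) / len(sent)

# Soft-decision view: decision margins reveal implementation loss
# even when no bit is actually in error.
margins = [abs(s) for s in soft]
mean_margin = statistics.mean(margins)
worst_margin = min(margins)
```

Here the BER test reports a perfect link, while the worst margin (0.05) shows the receiver operating close to failure; tracking such margin statistics in real time is what lets contributors to implementation loss be identified and prioritized before they turn into bit errors.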

  13. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real-time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. The KWU Nuclear Plant Analyzer is thus available at comparably low cost right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) use of the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, with no control room equipment; and (5) coupling of computers by telecommunication via telephone

  14. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguishable with conventional techniques due to close absorption coefficients. This problem becomes more pronounced when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images
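The 'apparent absorption' and 'apparent refraction' images come from the standard DEI decomposition; the following is a hedged sketch of the widely used Chapman-style algebra, not necessarily this author's exact formulation. Two images $I_L$ and $I_H$ are taken with the analyzer set on the low- and high-angle slopes of its rocking curve $R(\theta)$, and the linearized pair

```latex
I_L = I_R\left[R(\theta_L) + \frac{dR}{d\theta}(\theta_L)\,\Delta\theta_z\right],
\qquad
I_H = I_R\left[R(\theta_H) + \frac{dR}{d\theta}(\theta_H)\,\Delta\theta_z\right]
```

is solved pixel by pixel for the apparent absorption image $I_R$ and the refraction-angle image $\Delta\theta_z$:

```latex
I_R = \frac{I_L\,\frac{dR}{d\theta}(\theta_H) - I_H\,\frac{dR}{d\theta}(\theta_L)}
           {R(\theta_L)\,\frac{dR}{d\theta}(\theta_H) - R(\theta_H)\,\frac{dR}{d\theta}(\theta_L)},
\qquad
\Delta\theta_z = \frac{I_H\,R(\theta_L) - I_L\,R(\theta_H)}
                      {I_L\,\frac{dR}{d\theta}(\theta_H) - I_H\,\frac{dR}{d\theta}(\theta_L)}
```

The statistical-moments approach of the abstract extends this two-image system to also separate the scattering contribution.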

  15. ROOT Reference Documentation

    CERN Document Server

    Fuakye, Eric Gyabeng

    2017-01-01

    A ROOT Reference Documentation has been implemented to generate the lists of libraries needed for each ROOT class. Doxygen has no option to generate or add the lists of libraries for each ROOT class. Therefore shell scripting and a basic C++ program were employed to import the lists of libraries needed by each ROOT class.

  16. Client Oriented Management Documents.

    Science.gov (United States)

    Limaye, Mohan R.; Hightower, Rick

    Noting that accounting reports, including management advisory service (MAS) studies, reports on internal control, and tax memoranda, often appear rather dense and heavy in style--partly because of the legal environment's demand for careful expression and partly because such documents convey very complex information--this paper presents four…

  17. QA programme documentation

    International Nuclear Information System (INIS)

    Scheibelt, L.

    1980-01-01

    The present paper deals with the following topics: The need for a documented Q.A. program; Establishing a Q.A. program; Q.A. activities; Fundamental policies; Q.A. policies; Quality objectives Q.A. manual. (orig./RW)

  18. Student Problems with Documentation.

    Science.gov (United States)

    Freimer, Gloria R.; Perry, Margaret M.

    1986-01-01

    Interviews with faculty, a survey of 20 students, and examination of style manuals revealed that students are confused by inconsistencies in and multiplicity of styles when confronted with writing and documenting a research paper. Librarians are urged to teach various citation formats and work for adoption of standardization. (17 references) (EJS)

  19. Text document classification

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana

    č. 62 (2005), s. 53-54 ISSN 0926-4981 R&D Projects: GA AV ČR IAA2075302; GA AV ČR KSK1019101; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : document representation * categorization * classification Subject RIV: BD - Theory of Information

  20. Course documentation report

    DEFF Research Database (Denmark)

    Buus, Lillian; Bygholm, Ann; Walther, Tina Dyngby Lyng

    A documentation report on the three pedagogical courses developed during the MVU project period. The report describes the three processes, taking its point of departure in the structure and material available in the virtual learning environment. The report also describes the way two of the courses developed...

  1. Extremely secure identification documents

    International Nuclear Information System (INIS)

    Tolk, K.M.; Bell, M.

    1997-09-01

    The technology developed in this project uses biometric information printed on the document and public key cryptography to ensure that an adversary cannot issue identification documents to unauthorized individuals or alter existing documents to allow their use by unauthorized individuals. This process can be used to produce many types of identification documents with much higher security than any currently in use. The system is demonstrated using a security badge as an example. This project focused on the technologies requiring development in order to make the approach viable with existing badge printing and laminating technologies. By far the most difficult was the image processing required to verify that the picture on the badge had not been altered. Another area that required considerable work was the high density printed data storage required to get sufficient data on the badge for verification of the picture. The image processing process was successfully tested, and recommendations are included to refine the badge system to ensure high reliability. A two dimensional data array suitable for printing the required data on the badge was proposed, but testing of the readability of the array had to be abandoned due to reallocation of the budgeted funds by the LDRD office

  2. Documents and legal texts

    International Nuclear Information System (INIS)

    2017-01-01

    This section treats of the following documents and legal texts: 1 - Belgium 29 June 2014 - Act amending the Act of 22 July 1985 on Third-Party Liability in the Field of Nuclear Energy; 2 - Belgium, 7 December 2016. - Act amending the Act of 22 July 1985 on Third-Party Liability in the Field of Nuclear Energy

  3. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio [Richland, WA; Calapristi, Augustin J [West Richland, WA; Crow, Vernon L [Richland, WA; Hetzler, Elizabeth G [Kennewick, WA; Turner, Alan E [Kennewick, WA

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.
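The clustering-and-labeling pipeline the abstract describes can be sketched with a greedy TF-IDF leader algorithm. This is a generic illustration of term-based cluster labeling (the threshold and the label rule are invented for the example), not the patented method's actual procedure, and it omits the word-sense disambiguation step applied to the label.

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors for tokenized documents; terms in every doc drop out."""
    df = Counter()
    for d in docs:
        df.update(set(d))
    n = len(docs)
    vecs = []
    for d in docs:
        tf = Counter(d)
        vecs.append({t: tf[t] * math.log(n / df[t]) for t in tf if df[t] < n})
    return vecs

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(docs, threshold=0.2):
    """Greedy leader clustering; the cluster label is the highest-weight
    term of the cluster centroid (indicative of subject matter content)."""
    vecs = tfidf(docs)
    clusters = []  # list of (member_indices, centroid_dict)
    for i, v in enumerate(vecs):
        for members, centroid in clusters:
            if cosine(v, centroid) >= threshold:
                members.append(i)
                for t, w in v.items():
                    centroid[t] = centroid.get(t, 0.0) + w
                break
        else:
            clusters.append(([i], dict(v)))
    return [(max(c, key=c.get) if c else "", m) for m, c in clusters]
```

In the patented method the label is further disambiguated by selecting one of its word senses; the sketch stops at the raw term label.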

  4. Technical approach document

    International Nuclear Information System (INIS)

    1989-12-01

    The Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978, Public Law 95-604 (PL95-604), grants the Secretary of Energy the authority and responsibility to perform such actions as are necessary to minimize radiation health hazards and other environmental hazards caused by inactive uranium mill sites. This Technical Approach Document (TAD) describes the general technical approaches and design criteria adopted by the US Department of Energy (DOE) in order to implement remedial action plans (RAPs) and final designs that comply with EPA standards. It does not address the technical approaches necessary for aquifer restoration at processing sites; a guidance document, currently in preparation, will describe aquifer restoration concerns and technical protocols. This document is a second revision to the original document issued in May 1986; the revision has been made in response to changes to the groundwater standards of 40 CFR 192, Subparts A--C, proposed by EPA as draft standards. New sections were added to define the design approaches and designs necessary to comply with the groundwater standards. These new sections are in addition to changes made throughout the document to reflect current procedures, especially in cover design, water resources protection, and alternate site selection; only minor revisions were made to some of the sections. Section 3.0 is a new section defining the approach taken in the design of disposal cells; Section 4.0 has been revised to include design of vegetated covers; Section 8.0 discusses design approaches necessary for compliance with the groundwater standards; and Section 9.0 is a new section dealing with nonradiological hazardous constituents. 203 refs., 18 figs., 26 tabs

  5. Analyzing Flowgraphs with ATL

    Directory of Open Access Journals (Sweden)

    Valerio Cosentino

    2013-11-01

    Full Text Available This paper presents a solution to the Flowgraphs case study for the Transformation Tool Contest 2013 (TTC 2013). Starting from Java source code, we execute a chain of model transformations to derive a simplified model of the program, its control flow graph and its data flow graph. Finally we develop a model transformation that validates the program flow by comparing it with a set of flow specifications written in a domain specific language. The proposed solution has been implemented using ATL.

  6. An Introduction to Document Imaging in the Financial Aid Office.

    Science.gov (United States)

    Levy, Douglas A.

    2001-01-01

    First describes the components of a document imaging system in general and then addresses this technology specifically in relation to financial aid document management: its uses and benefits, considerations in choosing a document imaging system, and additional sources for information. (EV)

  7. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites. PhosphoSiteAnalyzer provides an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and thereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net.

  8. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), was examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO 2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  9. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets used a harmonized accounting framework, comparisons between two or more sets of accounting standards would be unnecessary. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  10. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  11. Tank Monitoring and Document control System (TMACS) As Built Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    GLASSCOCK, J.A.

    2000-01-27

    This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.

  12. Tank Monitoring and Document control System (TMACS) As Built Software Design Document

    International Nuclear Information System (INIS)

    GLASSCOCK, J.A.

    2000-01-01

    This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions

  13. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  14. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  15. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  16. Areva - 2011 Reference document

    International Nuclear Information System (INIS)

    2011-01-01

    After indicating the person responsible for this document and the statutory auditors, and providing some financial information, this document gives an overview of the risk factors facing the company: legal risks, industrial and environmental risks, operational risks, risks related to large projects, and market and liquidity risks. Then, after recalling the history and evolution of the company and the evolution of its investments over the last five years, it proposes an overview of Areva's activities in the markets of nuclear energy and renewable energies, of its clients and suppliers, of its strategy, and of the activities of its different departments. Other information is provided: the company's organization chart, real estate properties (plants, equipment), an analysis of its financial situation, its research and development policy, the present context, profit forecasts or estimates, and management organization and operation

  17. Detecting people of interest from internet data sources

    Science.gov (United States)

    Cardillo, Raymond A.; Salerno, John J.

    2006-04-01

    In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.
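The entity resolution problem the abstract highlights (the same person appearing as "Salerno, John", "john salerno" or "John J. Salerno" across email headers and web pages) is commonly attacked first with a normalization or blocking key. The sketch below is a minimal hypothetical illustration of that first step, not the authors' system:

```python
import re

def name_key(raw):
    """Blocking key for entity resolution: lowercase, keep alphabetic
    tokens longer than one character (this drops middle initials), and
    sort the tokens so word order does not matter."""
    tokens = re.findall(r"[a-z]+", raw.lower())
    return " ".join(sorted(t for t in tokens if len(t) > 1))
```

Records that share a key become candidate matches and are then passed to a finer pairwise comparison; at the scale of a corpus like ENRON, blocking is what keeps that comparison tractable.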

  18. Documentation of Concurrent programs.

    Science.gov (United States)

    1983-07-01

    preparing the documentation formats, and Tom McDonald for preparing the supplemental materials and statistical analyses.

  19. SANSMIC design document.

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Paula D. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [GRAM, Inc., Albuquerque, NM (United States)

    2015-07-01

    The United States Strategic Petroleum Reserve (SPR) maintains an underground storage system consisting of caverns that were leached or solution mined in four salt domes located near the Gulf of Mexico in Texas and Louisiana. The SPR comprises more than 60 active caverns containing approximately 700 million barrels of crude oil. Sandia National Laboratories (SNL) is the geotechnical advisor to the SPR. As the most pressing need at the inception of the SPR was to create and fill storage volume with oil, the decision was made to leach the caverns and fill them simultaneously (leach-fill). Therefore, A.J. Russo developed SANSMIC in the early 1980s which allows for a transient oil-brine interface (OBI) making it possible to model leach-fill and withdrawal operations. As the majority of caverns are currently filled to storage capacity, the primary uses of SANSMIC at this time are related to the effects of small and large withdrawals, expansion of existing caverns, and projecting future pillar to diameter ratios. SANSMIC was identified by SNL as a priority candidate for qualification. This report continues the quality assurance (QA) process by documenting the "as built" mathematical and numerical models that comprise this document. The program flow is outlined and the models are discussed in detail. Code features that were added later or were not documented previously have been expounded. No changes in the code's physics have occurred since the original documentation (Russo, 1981, 1983) although recent experiments may yield improvements to the temperature and plume methods in the future.

  20. Electronic Braille Document Reader

    OpenAIRE

    Arif, Shahab; Holmes, Violeta

    2013-01-01

    This paper presents an investigation into developing a portable Braille device which would allow visually impaired individuals to read electronic documents by actuating Braille text on a finger. Braille books tend to be bulky in size due to the minimum size requirements for each Braille cell. E-books can be read in Braille using refreshable Braille displays connected to a computer. However, the refreshable Braille displays are expensive, bulky and are not portable. These factors restrict blin...

  1. Electronic Braille Document Reader

    OpenAIRE

    Arif, S.

    2012-01-01

    An investigation was conducted into developing a portable Braille device which would allow visually impaired individuals to read electronic documents by actuating Braille text on a finger. Braille books tend to be bulky in size due to the minimum size requirements for each Braille cell. E-books can be read in Braille using refreshable Braille displays connected to a computer. However, the refreshable Braille displays are expensive, bulky and are not portable. These factors restrict blind and ...

  2. SGHWR - quality assurance documentation

    International Nuclear Information System (INIS)

    Garrard, R.S.; Caulfield, J.

    1976-01-01

    The quality assurance program for a modern power station such as an SGHWR type reactor plant must include a record of quality achievement. The case history record, which is evidence of the actual quality of the plant and a data bank of design, manufacture, and inspection and test results, is described. Documentation distribution, which keeps all key areas informed of plant item quality status, and the retrieval and storage of information, are briefly discussed. (U.K.)

  3. AUDIT plan documenting method

    International Nuclear Information System (INIS)

    Cornecsu, M.

    1995-01-01

    The work describes a method of documenting the AUDIT plan on the basis of two quantitative elements: the degree of implementation of each quality assurance program (QAP) function, as established by the latest AUDIT performed, and the weight of each function in the QAP, appraised by taking into account its significance for the activities to be performed in the period for which the AUDITs are planned. (Author) 3 Figs., 2 Refs

  4. Technical document characterization by data analysis

    International Nuclear Information System (INIS)

    Mauget, A.

    1993-05-01

    Nuclear power plants possess documents analyzing all the plant systems, which represents a vast quantity of paper. Analysis of textual data can enable documents to be classified by grouping texts containing the same words. These methods are applied to system manuals in feasibility studies. A system manual is analyzed by LEXTER and the terms it selects are examined. We first classify according to style (sentences containing general words, technical sentences, etc.), and then according to terms. However, it will not be possible to continue in this fashion for all 100 existing system manuals, for lack of sufficient storage capacity. Another solution is being developed. (author)

  5. AREVA - 2013 Reference document

    International Nuclear Information System (INIS)

    2014-01-01

    This Reference Document contains information on the AREVA group's objectives, prospects and development strategies, as well as estimates of the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the Reference Document; 2 - Statutory auditors; 3 - Selected financial information; 4 - Description of major risks confronting the company; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Situation and activities of the company and its subsidiaries; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts or estimates; 14 - Management and supervisory bodies; 15 - Compensation and benefits; 16 - Functioning of the management and supervisory bodies; 17 - Human resources information; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - Information on holdings; Appendix 1: report of the supervisory board chairman on the preparation and organization of the board's activities and internal control procedures; Appendix 2: statutory auditors' reports; Appendix 3: environmental report; Appendix 4: non-financial reporting methodology and independent third-party report on social, environmental and societal data; Appendix 5: ordinary and extraordinary general shareholders' meeting; Appendix 6: values charter; Appendix 7: table of concordance of the management report; glossaries

  6. Content Documents Management

    Science.gov (United States)

    Muniz, R.; Hochstadt, J.; Boelke, J.; Dalton, A.

    2011-01-01

    The Content Documents are created and managed by the System Software group within the Launch Control System (LCS) project. The System Software product group is led by the NASA Engineering Control and Data Systems branch (NEC3) at Kennedy Space Center. The team is working on creating Operating System Images (OSI) for different platforms (i.e. AIX, Linux, Solaris and Windows). Before an OSI can be created, the team must create a Content Document, which provides the information for a workstation or server, with the list of all the software to be installed on it and the set to which the hardware belongs, for example the LDS, the ADS or the FR-l. The objective of this project is to create a user-interface Web application that can manage the information in the Content Documents, with all the correct validations and filters for administrator purposes. For this project we used Ruby on Rails, one of the most effective tools in agile application development. This tool helps pragmatic programmers develop Web applications with the Rails framework and the Ruby programming language. It is amazing to see how a student can learn about OOP features with the Ruby language, manage the user interface with HTML and CSS, create associations and queries with gems, manage databases and run a server with MySQL, run shell commands with the command prompt and create Web frameworks with Rails. All of this in a real-world project and in just fifteen weeks!

  7. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability of long-term reliability of the large-scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppM. An analyzer that measures trace impurity levels of condensable contaminants at concentrations of less than a ppM in 15 atm of He is described. The instrument uses the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. A LN 2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to cover the full range of contaminant possibilities. The results from this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  8. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  9. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms (SNPs) is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled-DNA data. Results We developed the software, PDA, for the analysis of pooled-DNA data. PDA was originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA offers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
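
    The two core quantities PDA reports can be illustrated with a short sketch. The function names, the peak-height model, and the normal-approximation z test below are illustrative assumptions, not PDA's actual implementation: a pool's allele frequency is estimated from the two allelic signal intensities after correcting by the coefficient of preferential amplification k, and a single-point test compares two pools with a two-sample z statistic.

    ```python
    import math

    def allele_freq(height_a, height_b, k=1.0):
        """Estimate the frequency of allele A from pooled-DNA signal intensities.

        k is the coefficient of preferential amplification: the expected ratio
        of A to B signal in a heterozygous individual (k = 1 means both
        alleles amplify equally)."""
        return height_a / (height_a + k * height_b)

    def two_pool_z_test(p1, n1, p2, n2):
        """Single-point test comparing allele frequency estimates of two DNA
        pools of n1 and n2 diploid individuals (2n chromosomes each)."""
        p = (2 * n1 * p1 + 2 * n2 * p2) / (2 * n1 + 2 * n2)  # pooled estimate
        se = math.sqrt(p * (1 - p) * (1 / (2 * n1) + 1 / (2 * n2)))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
        return z, p_value
    ```

    With equal amplification, intensities 60:40 give a frequency estimate of 0.6; a k of 1.5 shifts the same intensities down to 0.5, which is why the correction matters in pooled designs.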

  10. Security analysis for biometric data in ID documents

    NARCIS (Netherlands)

    Schimke, S.; Kiltz, S.; Vielhauer, C.; Kalker, A.A.C.M.

    2005-01-01

    In this paper we analyze chances and challenges with respect to the security of using biometrics in ID documents. We identify goals for ID documents set by national and international authorities, and discuss the degree of security obtainable with the inclusion of biometrics in such documents.

  11. Building 894 hazards assessment document

    International Nuclear Information System (INIS)

    Banda, Z.; Williams, M.

    1996-07-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with Building 894. The entire inventory was subjected to the screening criteria for potential airborne impact to onsite and offsite individuals, out of which 9 chemicals were kept for further evaluation. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 130 meters. The highest emergency classification is a General Emergency. The Emergency Planning Zone is a nominal 130-meter area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.
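
    For orientation only, the kind of screening calculation behind such threshold distances can be sketched with a textbook Gaussian plume model. This is not ALOHA's actual algorithm (ALOHA additionally handles heavy gases, time-varying releases, and site-specific effects); the Briggs open-country neutral-stability (class D) coefficients and the simple threshold search below are illustrative assumptions.

    ```python
    import math

    def briggs_sigma_d(x):
        """Briggs open-country dispersion coefficients, neutral stability (class D)."""
        sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
        sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
        return sigma_y, sigma_z

    def centerline_concentration(q, u, x):
        """Ground-level centerline concentration (kg/m^3) at downwind distance
        x (m) for a continuous ground-level point source of strength q (kg/s)
        in wind speed u (m/s), with full ground reflection."""
        sigma_y, sigma_z = briggs_sigma_d(x)
        return q / (math.pi * u * sigma_y * sigma_z)

    def threshold_distance(q, u, c_limit, x_max=2000.0, dx=1.0):
        """Greatest downwind distance at which the concentration still
        exceeds c_limit: the same question the hazards assessment answers."""
        worst, x = 0.0, dx
        while x <= x_max:
            if centerline_concentration(q, u, x) >= c_limit:
                worst = x
            x += dx
        return worst
    ```

    Because the centerline concentration falls monotonically with distance for a ground-level release, the greatest exceedance distance is simply the last point above the health-effects threshold.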

  12. Building 6630 hazards assessment document

    International Nuclear Information System (INIS)

    Williams, M.; Banda, Z.

    1996-10-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with Building 6630. The entire inventory was subjected to the screening criteria for potential airborne impact to onsite and offsite individuals, out of which one chemical was kept for further evaluation. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the chemical release site, the atmospheric conditions, and the circumstances of the release. The greatest distance at which a postulated facility event will produce consequences exceeding the Early Severe Health Effects threshold is 76 meters. The highest emergency classification is an Alert. The Emergency Planning Zone is a nominal 100-meter area that conforms to DOE boundaries and physical/jurisdictional boundaries such as fence lines and streets.

  13. SRS ECOLOGY ENVIRONMENTAL INFORMATION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    Wike, L.; Martin, D.; Nelson, E.; Halverson, N.; Mayer, J.; Paller, M.; Riley, R.; Serrato, M.

    2006-03-01

    The SRS Ecology Environmental Information Document (EEID) provides a source of information on the ecology of the Savannah River Site (SRS). The SRS is a U.S. Department of Energy (DOE)-owned property on the upper Atlantic Coastal Plain of South Carolina, centered approximately 40 kilometers (25 miles) southeast of Augusta, Georgia. The entire site was designated a National Environmental Research Park in 1972 by the Atomic Energy Commission, the predecessor of DOE. This document summarizes and synthesizes ecological research and monitoring conducted on the three main types of ecosystems found at SRS: terrestrial, wetland, and aquatic. It also summarizes the available information on the threatened and endangered species found on the Savannah River Site. SRS is located along the Savannah River and encompasses an area of 80,267 hectares (310 square miles) in three South Carolina counties. It contains diverse habitats, flora, and fauna. Habitats include upland terrestrial areas, wetlands, streams, reservoirs, and the adjacent Savannah River. These diverse habitats support a variety of plants and animals, including many commercially or recreationally valuable species and several rare, threatened, or endangered species. Soils are the basic terrestrial resource, influencing the development of terrestrial biological communities. Many different soils exist on the SRS, from hydric to well-drained, and from sand to clay. In general, SRS soils are predominantly well-drained loamy sands.

  14. High level waste storage tanks 242-A evaporator standards/requirement identification document

    International Nuclear Information System (INIS)

    Biebesheimer, E.

    1996-01-01

    This document, the Standards/Requirements Identification Document (S/RIDS) for the subject facility, represents the necessary and sufficient requirements to provide an adequate level of protection of the worker, public health and safety, and the environment. It lists those source documents from which requirements were extracted, and those requirements documents that were considered but from which no requirements were taken. Documents considered as source documents included State and Federal Regulations, DOE Orders, and DOE Standards.

  15. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter, which are controlled by software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification, and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10⁶ cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials at excellent agreement with consensus values. (author)
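
    The "quantitative correction of counting losses" mentioned above generalizes the classical dead-time correction. As a minimal sketch only (the Analyzer's loss-free counting is a virtual-pulse technique, not this formula), the non-paralyzable model recovers the input rate from the measured rate m and the dead time tau via n = m / (1 - m*tau).

    ```python
    def true_rate(measured_cps, dead_time_s):
        """Correct a measured counting rate for non-paralyzable dead-time losses.

        measured_cps : observed rate m (counts per second)
        dead_time_s  : dead time tau spent processing each accepted pulse
        Returns the estimated input rate n = m / (1 - m * tau)."""
        lost_fraction = measured_cps * dead_time_s
        if lost_fraction >= 1.0:
            raise ValueError("measured rate inconsistent with the given dead time")
        return measured_cps / (1.0 - lost_fraction)
    ```

    For example, a measured 5 x 10^5 cps with a 1 microsecond dead time corresponds to an input rate of 10^6 cps, i.e. half of the incoming pulses were lost.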

  16. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
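
    The paper's central idea, co-occurrence statistics computed over surface clusters rather than pixels, can be sketched in miniature. The grid-of-labels input, the horizontal-neighbor pairing, and the two texture features below are simplifying assumptions for illustration; the actual method operates on clusters of triangular surfaces and feeds the feature vectors to a trained classifier.

    ```python
    from collections import Counter

    def cooccurrence(grid, levels=2):
        """Normalized symmetric co-occurrence matrix of a labeled grid,
        counted over horizontal neighbor pairs (0 = occluded, 1 = visible)."""
        counts = Counter()
        for row in grid:
            for a, b in zip(row, row[1:]):
                counts[(a, b)] += 1
                counts[(b, a)] += 1  # make the matrix symmetric
        total = sum(counts.values())
        return [[counts[(i, j)] / total for j in range(levels)]
                for i in range(levels)]

    def features(m):
        """Classic co-occurrence texture features used as a feature vector:
        contrast grows with label alternation, homogeneity with large
        uniform regions."""
        n = len(m)
        contrast = sum(m[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
        homogeneity = sum(m[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))
        return [contrast, homogeneity]
    ```

    A half-visible, half-occluded region produces low contrast and high homogeneity, whereas a finely interleaved visibility pattern (the kind that permits visual masking) pushes contrast up, which is what makes these features separable by a classifier.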

  17. Fiscal year 1999 waste information requirements document

    International Nuclear Information System (INIS)

    Adams, M.R.

    1998-01-01

    The Waste Information Requirements Document (WIRD) has the following purposes: to describe the overall drivers that require characterization information and to document their source; to define how characterization is going to satisfy the drivers, close issues, and measure and report progress; and to describe deliverables and acceptance criteria for characterization. Characterization information is required to maintain regulatory compliance, perform operations and maintenance, resolve safety issues, and prepare for disposal of waste. Commitments addressing these requirements are derived from the Hanford Federal Facility Agreement and Consent Order, also known as the Tri-Party Agreement; the Recommendation 93-5 Implementation Plan (DOE-RL 1996a) to the Defense Nuclear Facilities Safety Board (DNFSB); and other requirement sources listed in Section 2.0. The Waste Information Requirements Document replaces the tank waste analysis plans and the tank characterization plan previously required by the Tri-Party Agreement, Milestone M-44-01 and M-44-02 series.

  18. Interpreting XML documents via an RDF schema

    NARCIS (Netherlands)

    Klein, Michel; Handschuh, Siegfried; Staab, Steffen

    2003-01-01

    One of the major problems in the realization of the vision of the "Semantic Web" is the transformation of existing web data into sources that can be processed and used by machines. This paper presents a procedure that can be used to turn XML documents into knowledge structures, by interpreting

  19. Blood pressure documentation in the emergency department

    Science.gov (United States)

    Daniel, Ana Carolina Queiroz Godoy; Machado, Juliana Pereira; Veiga, Eugenia Velludo

    2017-01-01

    ABSTRACT Objective To analyze the frequency of blood pressure documentation performed by nursing professionals in an emergency department. Methods This is a cross-sectional, observational, descriptive, and analytical study, which included medical records of adult patients admitted to the observation ward of an emergency department, between March and May 2014. Data were obtained through a collection instrument divided into three parts: patient identification, triage data, and blood pressure documentation. For statistical analysis, Pearson’s correlation coefficient was used, with a significance level of α<0.05. Results One hundred fifty-seven records and 430 blood pressure measurements were analyzed with an average of three measurements per patient. Of these measures, 46.5% were abnormal. The mean time from admission to documentation of the first blood pressure measurement was 2.5 minutes, with 42 minutes between subsequent measures. There is no correlation between the systolic blood pressure values and the mean time interval between blood pressure documentations: 0.173 (p=0.031). Conclusion The present study found no correlation between frequency of blood pressure documentation and blood pressure values. The frequency of blood pressure documentation increased according to the severity of the patient and decreased during the length of stay in the emergency department. PMID:28444085
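
    The statistic the study relies on is the ordinary Pearson product-moment correlation coefficient. A minimal sketch of its computation follows; the sample data used in the usage note are made up for illustration and are not the study's measurements.

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient of two samples:
        covariance of the paired values divided by the product of their
        standard deviations, always in [-1, 1]."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)
    ```

    Perfectly proportional samples give r = 1, perfectly inverse samples give r = -1, and mixed orderings fall in between; the study's reported value of 0.173 would sit near the "no association" end of that scale.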

  20. AREVA 2009 reference document

    International Nuclear Information System (INIS)

    2009-01-01

    This Reference Document contains information on the AREVA group's objectives, prospects and development strategies. It contains information on the markets, market shares and competitive position of the AREVA group. This information provides an adequate picture of the size of these markets and of the AREVA group's competitive position. Content: 1 - Person responsible for the Reference Document and Attestation by the person responsible for the Reference Document; 2 - Statutory and Deputy Auditors; 3 - Selected financial information; 4 - Risks: Risk management and coverage, Legal risk, Industrial and environmental risk, Operating risk, Risk related to major projects, Liquidity and market risk, Other risk; 5 - Information about the issuer: History and development, Investments; 6 - Business overview: Markets for nuclear power and renewable energies, AREVA customers and suppliers, Overview and strategy of the group, Business divisions, Discontinued operations: AREVA Transmission and Distribution; 7 - Organizational structure; 8 - Property, plant and equipment: Principal sites of the AREVA group, Environmental issues that may affect the issuer's; 9 - Analysis of and comments on the group's financial position and performance: Overview, Financial position, Cash flow, Statement of financial position, Events subsequent to year-end closing for 2009; 10 - Capital Resources; 11 - Research and development programs, patents and licenses; 12 - Trend information: Current situation, Financial objectives; 13 - Profit forecasts or estimates; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of corporate bodies; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties: French state, CEA, EDF group; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information: Share capital, Certificate of incorporation and by-laws; 22 - Major

  1. Viviendo el documental

    OpenAIRE

    Álvarez Moreno, Víctor

    2017-01-01

    The following work documents the process of producing and editing a 360-degree documentary about the Valladolid cathedral, under the title Reconstruyendo la catedral. The work combines virtual reality with journalistic narrative. Virtual reality is a tool that can transform the viewer into a witness of the story. In this case, it shows what the Valladolid cathedral could have been: a building intended to become the largest cathedral in Europe. ...

  2. Documents and legal texts

    International Nuclear Information System (INIS)

    2016-01-01

    This section treats of the following documents and legal texts: 1 - Brazil: Law No. 13,260 of 16 March 2016 (To regulate the provisions of item XLIII of Article 5 of the Federal Constitution on terrorism, dealing with investigative and procedural provisions and redefining the concept of a terrorist organisation; and amends Laws No. 7,960 of 21 December 1989 and No. 12,850 of 2 August 2013); 2 - India: The Atomic Energy (Amendment) Act, 2015; Department Of Atomic Energy Notification (Civil Liability for Nuclear Damage); 3 - Japan: Act on Subsidisation, etc. for Nuclear Damage Compensation Funds following the implementation of the Convention on Supplementary Compensation for Nuclear Damage

  3. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  4. Approaches to assign security levels for radioactive substances and radiation sources

    International Nuclear Information System (INIS)

    Ivanov, M.V.; Petrovskij, N.P.; Pinchuk, G.N.; Telkov, S.N.; Kuzin, V.V.

    2011-01-01

    The article analyzes provisions on the categorization of radioactive substances and radiation sources according to the extent of their potential danger. These provisions are used in IAEA documents and in Russian regulatory documents to differentiate regulatory requirements for physical security. It is demonstrated that, taking into account possible violator threats, the rules of physical protection of radiation sources and radioactive substances should be amended with regard to the approaches used to assign their categories and security levels.

  5. System Documentation AS Basis For Company Business Process Improvement

    OpenAIRE

    Pelawi, Dewan

    2012-01-01

    A business process is a set of activities performed together to achieve business goals. A good business process supports the achievement of the organization's plan to make a profit for the company. In order to understand the business process, it needs to be documented and analyzed. The purpose of the research is to produce system documentation as a basis for improving or completing the ongoing business process. The research method is system documentation. System documentation is a way to desc...

  6. AGENDA 21 - the basic conceptual document - Agenda of the 21 century which was accepted on the United Nations Conference on Environment and Development (UNCED) in Rio de Janeiro in 1992. Part IV. Sources for realization

    International Nuclear Information System (INIS)

    1996-01-01

    This part of AGENDA 21 contains eight chapters: Financial resources and mechanisms; Transfer of environmentally sound technology, co-operation and capacity-building; Science for sustainable development; Promoting education, public awareness and training; National mechanisms and international co-operation for capacity-building in developing countries; International institutional arrangements; International legal instruments and mechanisms; and Information for decision-making. Resolution No. 1 concerns the acceptance of the text on environment and development.

  7. AREVA - 2012 Reference document

    International Nuclear Information System (INIS)

    2013-03-01

    After a presentation of the person responsible for this Reference Document, of the statutory auditors, and of a summary of financial information, this report addresses the different risk factors: risk management and coverage, legal risk, industrial and environmental risk, operational risk, risk related to major projects, liquidity and market risk, and other risks (related to political and economic conditions, to the Group's structure, and to human resources). The next parts provide information about the issuer, a business overview (markets for nuclear power and renewable energies, customers and suppliers, group strategy, operations), a brief presentation of the organizational structure, a presentation of property, plant and equipment (principal sites, environmental issues which may affect these items), analysis of and comments on the group's financial position and performance, a presentation of capital resources, a presentation of research and development activities (programs, patents and licenses), a brief description of financial objectives and profit forecasts or estimates, a presentation of administrative, management and supervisory bodies, a description of the operation of corporate bodies, an overview of personnel, of principal shareholders, and of transactions with related parties, and a more detailed presentation of financial information concerning assets, financial positions and financial performance. Additional information regarding share capital is given, as well as an indication of major contracts, third-party information, available documents, and information on holdings.

  8. AREVA 2010 Reference document

    International Nuclear Information System (INIS)

    2010-01-01

    After a presentation of the person responsible for this document and of the statutory auditors, this report proposes some selected financial information. It then presents and comments on the different risk factors: risk management and coverage, legal risk, industrial and environmental risk, operational risk, risks related to major projects, liquidity and market risk, and other risks. After a presentation of the issuer, it proposes a business overview (markets for nuclear and renewable energies, AREVA customers and suppliers, strategy, activities), a presentation of the organizational structure, a presentation of AREVA properties, plants and equipment (sites, environmental issues), an analysis of and comments on the group's financial position and performance, a presentation of its capital resources, and an overview of its research and development activities, programs, patents and licenses. It indicates profit forecasts and estimates, presents the administrative, management and supervisory bodies, and compensation and benefit amounts, and reports on the functioning of corporate bodies. It describes the company's human resources policy, indicates the main shareholders, and presents transactions with related parties. It proposes financial information concerning assets, financial positions and financial performance. This document contains its French and English versions.

  9. ExactPack Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Singleton, Robert Jr. [Los Alamos National Laboratory; Israel, Daniel M. [Los Alamos National Laboratory; Doebling, Scott William [Los Alamos National Laboratory; Woods, Charles Nathan [Los Alamos National Laboratory; Kaul, Ann [Los Alamos National Laboratory; Walter, John William Jr [Los Alamos National Laboratory; Rogers, Michael Lloyd [Los Alamos National Laboratory

    2016-05-09

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a python package. ExactPack consists of python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack’s code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.
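
    The solver-as-callable pattern described above can be sketched with a hypothetical mini-solver. The class name and call signature below are illustrative assumptions, not ExactPack's actual API; the physics, however, is the standard planar Noh solution used in code verification.

    ```python
    class NohPlanar:
        """Exact solution of the planar Noh problem (ideal gas, gamma = 5/3):
        cold gas moving at unit speed toward a rigid wall at x = 0 produces a
        shock traveling outward at speed (gamma - 1)/2 * |u0| = 1/3, with
        post-shock density rho0 * (gamma + 1)/(gamma - 1) = 4."""

        def __init__(self, gamma=5.0 / 3.0, rho0=1.0, u0=-1.0):
            self.gamma, self.rho0, self.u0 = gamma, rho0, u0

        def __call__(self, xs, t):
            """Return the density profile at the positions xs at time t,
            mirroring the 'profiles at a user-specified time' idea."""
            shock = 0.5 * (self.gamma - 1.0) * abs(self.u0) * t
            rho_post = self.rho0 * (self.gamma + 1.0) / (self.gamma - 1.0)
            return [rho_post if x < shock else self.rho0 for x in xs]
    ```

    A verification driver would evaluate such a solver on the simulation's grid at the simulation's output time and difference the two profiles, which is exactly the workflow a common API makes routine.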

  10. Regulatory guidance document

    International Nuclear Information System (INIS)

    1994-05-01

    The Office of Civilian Radioactive Waste Management (OCRWM) Program Management System Manual requires preparation of the OCRWM Regulatory Guidance Document (RGD) that addresses licensing, environmental compliance, and safety and health compliance. The document provides: regulatory compliance policy; guidance to OCRWM organizational elements to ensure a consistent approach when complying with regulatory requirements; strategies to achieve policy objectives; organizational responsibilities for regulatory compliance; guidance with regard to Program compliance oversight; and guidance on the contents of a project-level Regulatory Compliance Plan. The scope of the RGD includes site suitability evaluation, licensing, environmental compliance, and safety and health compliance, in accordance with the direction provided by Section 4.6.3 of the PMS Manual. Site suitability evaluation and regulatory compliance during site characterization are significant activities, particularly with regard to the YW MSA. OCRWM's evaluation of whether the Yucca Mountain site is suitable for repository development must precede its submittal of a license application to the Nuclear Regulatory Commission (NRC). Accordingly, site suitability evaluation is discussed in Chapter 4, and the general statements of policy regarding site suitability evaluation are discussed in Section 2.1. Although much of the data and analyses may initially be similar, the licensing process is discussed separately in Chapter 5. Environmental compliance is discussed in Chapter 6. Safety and Health compliance is discussed in Chapter 7

  11. Digital watermarks in electronic document circulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Grigorievich Ivanenko

    2017-07-01

    Full Text Available This paper reviews different protection methods for electronic documents, with their strengths and weaknesses. Common attacks on electronic documents are analyzed. Digital signatures and ways of eliminating their flaws are studied. Different digital watermark embedding methods are described and divided into two types. The proposed solution for protecting electronic documents is based on embedding digital watermarks, and a comparative analysis of these methods is given. As a result, the most convenient method is suggested: reversible data hiding. It is noted that this technique excels at securing the integrity of the container and its digital watermark. A digital watermark embedding system should prevent illegal access to the digital watermark and its container. Digital watermark requirements for electronic document protection are formulated. The legal aspect of copyright protection is reviewed. Advantages of embedding digital watermarks in electronic documents are presented. Modern reversible data hiding techniques are studied, and distinctive features of digital watermark use in Russia are highlighted. A digital watermark serves as an additional layer of defense that is in most cases unknown to the violator. With an embedded digital watermark, it is impossible to misappropriate the authorship of the document, even if the intruder signs his name on it. Therefore, digital watermarks can act as an effective additional tool to protect electronic documents.
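
    As a concrete illustration of reversible data hiding (one classical scheme, not necessarily the one the paper recommends), the sketch below implements Tian's difference expansion on a pair of integer samples: the difference is expanded to carry one bit while the average is preserved, so extraction returns both the bit and the exact original pair. Overflow checks and the location map a real system needs are omitted.

    ```python
    def embed_bit(x, y, bit):
        """Embed one bit into an integer pair by difference expansion:
        h = x - y becomes h' = 2h + bit, the integer average l is kept."""
        h, l = x - y, (x + y) // 2
        h2 = 2 * h + bit
        return l + (h2 + 1) // 2, l - h2 // 2

    def extract_bit(x2, y2):
        """Recover the embedded bit and restore the original pair exactly."""
        h2, l = x2 - y2, (x2 + y2) // 2
        bit, h = h2 & 1, h2 >> 1
        return bit, (l + (h + 1) // 2, l - h // 2)
    ```

    The round trip is lossless for every pair and bit value, which is precisely the "integrity of the container" property the paper values: after extraction, the cover data is bit-for-bit identical to the original.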

  12. Document flow segmentation for business applications

    Science.gov (United States)

    Daher, Hani; Belaïd, Abdel

    2013-12-01

    The aim of this paper is to propose a supervised document flow segmentation approach applied to real-world heterogeneous documents. Our algorithm treats the flow of documents as couples of consecutive pages and studies the relationship that exists between them. First, sets of features are extracted from the pages, and we propose an approach to model the couple of pages as a single feature-vector representation. This representation is provided to a binary classifier, which classifies the relationship as either segmentation or continuity. In the case of segmentation, we consider that we have a complete document, and the analysis of the flow continues by starting a new document. In the case of continuity, the couple of pages is assigned to the same document and the analysis continues along the flow. If there is uncertainty about whether the relationship between the couple of pages should be classified as continuity or segmentation, a rejection is decided and the pages analyzed up to this point are considered a "fragment". The first classification already provides good results, approaching 90% on certain documents, which is high at this level of the system.
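
    The segmentation/continuity/rejection decision described above can be sketched with two deliberate simplifications: the trained binary classifier is replaced by a cosine-similarity score between hypothetical per-page feature vectors, and the thresholds t_cont and t_seg are made-up illustrative values, not the paper's.

    ```python
    import math

    def cosine(u, v):
        """Cosine similarity between two feature vectors."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv) if nu and nv else 0.0

    def segment_flow(pages, t_cont=0.8, t_seg=0.4):
        """Greedy segmentation of a page flow from consecutive-page scores:
        similarity >= t_cont  -> continuity (same document),
        similarity <= t_seg   -> segmentation (close document, start new one),
        anything in between   -> rejection: pages so far become a 'fragment'."""
        docs, current = [], [0]
        for i in range(1, len(pages)):
            s = cosine(pages[i - 1], pages[i])
            if s >= t_cont:
                current.append(i)
            elif s <= t_seg:
                docs.append(("document", current))
                current = [i]
            else:
                docs.append(("fragment", current))
                current = [i]
        docs.append(("document", current))
        return docs
    ```

    The reject band between the two thresholds is what keeps an uncertain couple of pages from silently contaminating either neighboring document, mirroring the paper's "fragment" outcome.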

  13. Annotating Document Changes

    NARCIS (Netherlands)

    Spadini, E.

    2015-01-01

    Textual scholars use collation for creating critical and genetic editions, or for studying textual transmission. Collation tools make it possible to compare the sources and detect the presence of textual variation, but they do not take into account the kind of variation involved. In this paper, we aim at

  14. Areva - 2016 Reference document

    International Nuclear Information System (INIS)

    2017-01-01

    Areva supplies high added-value products and services to support the operation of the global nuclear fleet. The company is present throughout the entire nuclear cycle, from uranium mining to used fuel recycling, including nuclear reactor design and operating services. Areva is recognized by utilities around the world for its expertise, its skills in cutting-edge technologies and its dedication to the highest level of safety. Areva's 36,000 employees are helping build tomorrow's energy model: supplying ever safer, cleaner and more economical energy to the greatest number of people. This Reference Document contains information on Areva's objectives, prospects and development strategies. It contains estimates of the markets, market shares and competitive position of Areva

  15. Working document dispersion models

    International Nuclear Information System (INIS)

    Dop, H. van

    1988-01-01

    This report summarizes the most important results, from June 1985 onward, of the collaboration between the RIVM (Dutch National Institute for Public Health and Environmental Hygiene) and the KNMI (Royal Dutch Meteorological Institute) in the domain of dispersion models. It contains a short description of the current SOx/NOx model. Furthermore, it contains recommendations for modifications of some numerical-mathematical aspects and an impulse toward a more complete description of chemical processes in the atmosphere and the (wet) deposition process. A separate chapter is devoted to the preparation of meteorological data relevant to dispersion as well as to atmospheric chemistry and deposition. This report serves as a working document for the final formulation of an acidification and oxidant model. (H.W.). 69 refs.; 51 figs.; 13 tabs.; 3 schemes

  16. Documents and legal texts

    International Nuclear Information System (INIS)

    2013-01-01

    This section reprints a selection of recently published legislative texts and documents: - Russian Federation: Federal Law No.170 of 21 November 1995 on the use of atomic energy, Adopted by the State Duma on 20 October 1995; - Uruguay: Law No.19.056 On the Radiological Protection and Safety of Persons, Property and the Environment (4 January 2013); - Japan: Third Supplement to Interim Guidelines on Determination of the Scope of Nuclear Damage resulting from the Accident at the Tokyo Electric Power Company Fukushima Daiichi and Daini Nuclear Power Plants (concerning Damages related to Rumour-Related Damage in the Agriculture, Forestry, Fishery and Food Industries), 30 January 2013; - France and the United States: Joint Statement on Liability for Nuclear Damage (Aug 2013); - Franco-Russian Nuclear Power Declaration (1 November 2013)

  17. Wind system documentation

    Energy Technology Data Exchange (ETDEWEB)

    Froggatt, J.R.; Tatum, C.P.

    1993-01-15

    Atmospheric transport and diffusion models have been developed by the Environmental Technology Section (ETS) of the Savannah River Technology Center to calculate the location and concentration of toxic or radioactive materials during an accidental release at the Savannah River Site (SRS). The output from these models has been used to support initial on-site and off-site emergency response activities such as protective action decision making and field monitoring coordination. These atmospheric transport and diffusion models have been incorporated into an automated computer-based system called the WIND (Weather Information and Display) System and linked to real-time meteorological and radiological monitoring instruments to provide timely information for these emergency response activities (Hunter, 1990). This report documents various aspects of the WIND system.

  18. ICRS Recommendation Document

    DEFF Research Database (Denmark)

    Roos, Ewa M.; Engelhart, Luella; Ranstam, Jonas

    2011-01-01

    Abstract Objective: The purpose of this article is to describe and recommend patient-reported outcome instruments for use in patients with articular cartilage lesions undergoing cartilage repair interventions. Methods: Nonsystematic literature search identifying measures addressing pain and function evaluated for validity and psychometric properties in patients with articular cartilage lesions. Results: The knee-specific instruments, titled the International Knee Documentation Committee Subjective Knee Form and the Knee injury and Osteoarthritis Outcome Score, both fulfill the basic constructs at all levels according to the International Classification of Functioning. Conclusions: Because there is no obvious superiority of either instrument at this time, both outcome measures are recommended for use in cartilage repair. Rescaling of the Lysholm Scoring Scale has been suggested...

  19. Areva, reference document 2006

    International Nuclear Information System (INIS)

    2006-01-01

    This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It contains information on the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the reference document and persons responsible for auditing the financial statements; 2 - Information pertaining to the transaction (Not applicable); 3 - General information on the company and its share capital: Information on AREVA, on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; 4 - Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, The principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and development programs, intellectual property and trademarks, Risk and insurance; 5 - Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, 2006 Human Resources Report, Environmental Report, Consolidated financial statements, Notes to the consolidated financial statements, AREVA SA financial statements, Notes to the corporate financial statements; 6 - Corporate Governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Combined General Meeting of Shareholders of May 3, 2007; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2006, Outlook; 8 - Glossary; 9 - Table of concordance

  20. Areva - 2014 Reference document

    International Nuclear Information System (INIS)

    2015-01-01

    Areva supplies high added-value products and services to support the operation of the global nuclear fleet. The company is present throughout the entire nuclear cycle, from uranium mining to used fuel recycling, including nuclear reactor design and operating services. Areva is recognized by utilities around the world for its expertise, its skills in cutting-edge technologies and its dedication to the highest level of safety. Areva's 44,000 employees are helping build tomorrow's energy model: supplying ever safer, cleaner and more economical energy to the greatest number of people. This Reference Document contains information on Areva's objectives, prospects and development strategies. It contains estimates of the markets, market shares and competitive position of Areva. Contents: 1 - Person responsible; 2 - Statutory auditors; 3 - Selected financial information; 4 - Risk factors; 5 - Information about the issuer; 6 - Business overview; 7 - Organizational structure; 8 - Property, plant and equipment; 9 - Analysis of and comments on the group's financial position and performance; 10 - Capital resources; 11 - Research and development programs, patents and licenses; 12 - Trend information; 13 - Profit forecasts; 14 - Administrative, management and supervisory bodies and senior management; 15 - Compensation and benefits; 16 - Functioning of administrative, management and supervisory bodies and senior management; 17 - Employees; 18 - Principal shareholders; 19 - Transactions with related parties; 20 - Financial information concerning assets, financial positions and financial performance; 21 - Additional information; 22 - Major contracts; 23 - Third party information, statements by experts and declarations of interest; 24 - Documents on display; 25 - information on holdings; appendix: Report of the Chairman of the Board of Directors on governance, internal control procedures and risk management, Statutory Auditors' report, Corporate social

  1. Areva reference document 2007

    International Nuclear Information System (INIS)

    2008-01-01

    This reference document contains information on the AREVA group's objectives, prospects and development strategies, particularly in Chapters 4 and 7. It also contains information on the markets, market shares and competitive position of the AREVA group. Content: 1 - Person responsible for the reference document and persons responsible for auditing the financial statements; 2 - Information pertaining to the transaction (not applicable); 3 - General information on the company and its share capital: Information on Areva, Information on share capital and voting rights, Investment certificate trading, Dividends, Organization chart of AREVA group companies, Equity interests, Shareholders' agreements; 4 - Information on company operations, new developments and future prospects: Overview and strategy of the AREVA group, The Nuclear Power and Transmission and Distribution markets, The energy businesses of the AREVA group, Front End division, Reactors and Services division, Back End division, Transmission and Distribution division, Major contracts, Principal sites of the AREVA group, AREVA's customers and suppliers, Sustainable Development and Continuous Improvement, Capital spending programs, Research and Development programs, Intellectual Property and Trademarks, Risk and insurance; 5 - Assets - Financial position - Financial performance: Analysis of and comments on the group's financial position and performance, Human Resources report, Environmental report, Consolidated financial statements 2007, Notes to the consolidated financial statements, Annual financial statements 2007, Notes to the corporate financial statements; 6 - Corporate governance: Composition and functioning of corporate bodies, Executive compensation, Profit-sharing plans, AREVA Values Charter, Annual Ordinary General Meeting of Shareholders of April 17, 2008; 7 - Recent developments and future prospects: Events subsequent to year-end closing for 2007, Outlook; Glossary; table of concordance

  2. The Bern Simple Climate Model (BernSCM) v1.0: an extensible and fully documented open-source re-implementation of the Bern reduced-form model for global carbon cycle-climate simulations

    Science.gov (United States)

    Strassmann, Kuno M.; Joos, Fortunat

    2018-05-01

    The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
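    The IRF substitution at the heart of BernSCM can be illustrated in miniature: the response of a near-linear component is the convolution of its forcing history with an impulse response, here a sum of exponential decays. The weights and timescales below are made-up illustrative values, not BernSCM's calibrated parameters:

```python
import math

def irf(t, weights=(0.2, 0.3, 0.5), timescales=(400.0, 40.0, 4.0)):
    """Impulse response as a weighted sum of exponential decays
    (illustrative parameters, not BernSCM's calibrated values)."""
    return sum(w * math.exp(-t / tau) for w, tau in zip(weights, timescales))

def response(forcing, dt=1.0):
    """Discrete convolution of a forcing series with the IRF:
    the state at step n remembers all earlier inputs."""
    out = []
    for n in range(len(forcing)):
        out.append(dt * sum(forcing[k] * irf((n - k) * dt)
                            for k in range(n + 1)))
    return out
```

    Because the kernel is a short sum of exponentials, such a substitute is cheap to evaluate even over long horizons, which is what makes the IRF approach suitable for coupling with integrated assessment models.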

  3. The Bern Simple Climate Model (BernSCM) v1.0: an extensible and fully documented open-source re-implementation of the Bern reduced-form model for global carbon cycle–climate simulations

    Directory of Open Access Journals (Sweden)

    K. M. Strassmann

    2018-05-01

    Full Text Available The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle–climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate–carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle–climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.

  4. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1994-06-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  5. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1982-03-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  6. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1991-01-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes docketed material associated with civilian nuclear power plants and other uses of radioactive materials and nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index. The docketed information contained in the Title List includes the information formerly issued though the Department of Energy publication Power Reactor Docket Information, last published in January 1979

  7. Title List of documents made publicly available

    International Nuclear Information System (INIS)

    1982-05-01

    This document contains descriptions of the information received and generated by the US NRC. This information includes: (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index

  8. Semantic Similarity between Web Documents Using Ontology

    Science.gov (United States)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-06-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, the procedure of extracting significant information with the assistance of a search engine is extremely critical, because web information is written mainly in natural language intended for human readers. Several efforts have been made in semantic similarity computation between documents using words, concepts and concept relationships, but the available outcomes are still not as per user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed into ontologies using a base ontology and a dictionary of concept records, where each record consists of the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts are explored by capturing author and user intention. The proposed semantic analysis technique provides improved results compared to existing techniques.
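    A minimal sketch of comparing document ontologies on both concepts and concept relationships might look like the following; the Jaccard-style combination, the alpha weighting, and the example ontologies are all assumptions for illustration, as the paper's exact similarity measure is not given here:

```python
def jaccard(s1, s2):
    """Set overlap in [0, 1]; two empty sets count as identical."""
    if not s1 and not s2:
        return 1.0
    return len(s1 & s2) / len(s1 | s2)

def doc_similarity(onto_a, onto_b, alpha=0.5):
    """Combine concept overlap with overlap of concept-to-concept
    relations; alpha weights concepts vs. relations."""
    c = jaccard(onto_a["concepts"], onto_b["concepts"])
    r = jaccard(onto_a["relations"], onto_b["relations"])
    return alpha * c + (1 - alpha) * r

# Hypothetical document ontologies: concepts plus typed relations.
a = {"concepts": {"vehicle", "engine", "fuel"},
     "relations": {("vehicle", "has_part", "engine"),
                   ("engine", "consumes", "fuel")}}
b = {"concepts": {"vehicle", "engine", "wheel"},
     "relations": {("vehicle", "has_part", "engine")}}
```

    Including the relation sets is what distinguishes this from a plain concept-overlap measure: two documents sharing the same concepts but relating them differently score lower.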

  9. Semantic Similarity between Web Documents Using Ontology

    Science.gov (United States)

    Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh

    2018-03-01

    The World Wide Web is a source of information available in the form of interlinked web pages. However, the procedure of extracting significant information with the assistance of a search engine is extremely critical, because web information is written mainly in natural language intended for human readers. Several efforts have been made in semantic similarity computation between documents using words, concepts and concept relationships, but the available outcomes are still not as per user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts available in the documents but also the relationships between those concepts. In our approach, documents are processed into ontologies using a base ontology and a dictionary of concept records, where each record consists of the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking into account the relationships among concepts. Relevant concepts and relations between the concepts are explored by capturing author and user intention. The proposed semantic analysis technique provides improved results compared to existing techniques.

  10. Tangible interactive system for document browsing and visualisation of multimedia data

    Science.gov (United States)

    Rytsar, Yuriy; Voloshynovskiy, Sviatoslav; Koval, Oleksiy; Deguillaume, Frederic; Topak, Emre; Startchik, Sergei; Pun, Thierry

    2006-01-01

    In this paper we introduce and develop a framework for interactive document navigation in multimodal databases. First, we analyze the main open issues of existing multimodal interfaces and then discuss two applications that include interaction with documents in several human environments, i.e., the so-called smart rooms. Second, we propose a system set-up dedicated to efficient navigation in printed documents. This set-up is based on the fusion of data from several modalities, including images and text. Both modalities can be used as cover data for hidden indexes using data-hiding technologies, as well as source data for robust visual hashing. The particularities of the proposed robust visual hashing are described in the paper. Finally, we address two practical applications of smart rooms for tourism and education and demonstrate the advantages of the proposed solution.

  11. LDRD 149045 final report distinguishing documents.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A.

    2010-09-01

    This LDRD 149045 final report describes work that Sandians Scott A. Mitchell, Randall Laviolette, Shawn Martin, Warren Davis, Cindy Philips and Danny Dunlavy performed in 2010. Prof. Afra Zomorodian provided insight. This was a small late-start LDRD. Several other ongoing efforts were leveraged, including the Networks Grand Challenge LDRD and the Computational Topology CSRF project, and some of the leveraged work is described here. We proposed a sentence mining technique that exploited both the distribution and the order of parts-of-speech (POS) in sentences in English-language documents. The ultimate goal was to be able to discover 'call-to-action' framing documents hidden within a corpus of mostly expository documents, even if the documents were all on the same topic and used the same vocabulary. Using POS was novel. We also took a novel approach to analyzing POS. We used the hypothesis that English follows a dynamical system and that POS sequences are trajectories from one state to another. We analyzed the sequences of POS using support vector machines and the cycles of POS using computational homology. We discovered that the POS were a very weak signal and did not support our hypothesis well. Our original goal appeared to be unobtainable with our original approach. We turned our attention to an aspect of a more traditional approach to distinguishing documents. Latent Dirichlet Allocation (LDA) turns documents into bags-of-words and then into mixture-model points. A distance function is used to cluster groups of points to discover relatedness between documents. We performed a geometric and algebraic analysis of the most popular distance functions and made some significant and surprising discoveries, described in a separate technical report.
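    One common distance function over LDA mixture-model points is the Hellinger distance between topic distributions; the sketch below is a generic illustration of that idea, not necessarily one of the specific functions analyzed in the report:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors:
    0 for identical mixtures, 1 for mixtures with disjoint support."""
    return math.sqrt(0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                               for pi, qi in zip(p, q)))
```

    Unlike Euclidean distance, this measure respects the geometry of the probability simplex, which matters when clustering documents by their topic mixtures.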

  12. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1994-01-01

    A tentative survey of positron sources is given. Physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β+ decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used - mainly working with solenoidal fields - are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since, for large machines, a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances, and the methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders. Some new ideas associated with these sources are also presented. (orig.)

  13. Improving the Product Documentation Process of a Small Software Company

    Science.gov (United States)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies' financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized for use as a documentation tool. Use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products that is striving for SPI.

  14. Documents and legal texts

    International Nuclear Information System (INIS)

    2015-01-01

    This section covers the following documents and legal texts: 1 - Canada: Nuclear Liability and Compensation Act (An Act respecting civil liability and compensation for damage in case of a nuclear incident, repealing the Nuclear Liability Act and making consequential amendments to other acts); 2 - Japan: Act on Compensation for Nuclear Damage (The purpose of this act is to protect persons suffering from nuclear damage and to contribute to the sound development of the nuclear industry by establishing a basic system regarding compensation in case of nuclear damage caused by reactor operation etc.); Act on Indemnity Agreements for Compensation of Nuclear Damage; 3 - Slovak Republic: Act on Civil Liability for Nuclear Damage and on its Financial Coverage and on Changes and Amendments to Certain Laws (This Act regulates: a) The civil liability for nuclear damage incurred in the causation of a nuclear incident, b) The scope of powers of the Nuclear Regulatory Authority (hereinafter only as the 'Authority') in relation to the application of this Act, c) The competence of the National Bank of Slovakia in relation to the supervised financial market entities in the financial coverage of liability for nuclear damage; and d) The penalties for violation of this Act)

  15. Documents and legal texts

    International Nuclear Information System (INIS)

    2014-01-01

    This section of the Bulletin presents the recently published documents and legal texts sorted by country: - Brazil: Resolution No. 169 of 30 April 2014. - Japan: Act Concerning Exceptions to Interruption of Prescription Pertaining to Use of Settlement Mediation Procedures by the Dispute Reconciliation Committee for Nuclear Damage Compensation in relation to Nuclear Damage Compensation Disputes Pertaining to the Great East Japan Earthquake (Act No. 32 of 5 June 2013); Act Concerning Measures to Achieve Prompt and Assured Compensation for Nuclear Damage Arising from the Nuclear Plant Accident following the Great East Japan Earthquake and Exceptions to the Extinctive Prescription, etc. of the Right to Claim Compensation for Nuclear Damage (Act No. 97 of 11 December 2013); Fourth Supplement to Interim Guidelines on Determination of the Scope of Nuclear Damage Resulting from the Accident at the Tokyo Electric Power Company Fukushima Daiichi and Daini Nuclear Power Plants (Concerning Damages Associated with the Prolongation of Evacuation Orders, etc.); Outline of 'Fourth Supplement to Interim Guidelines (Concerning Damages Associated with the Prolongation of Evacuation Orders, etc.)'. - OECD Nuclear Energy Agency: Decision and Recommendation of the Steering Committee Concerning the Application of the Paris Convention to Nuclear Installations in the Process of Being Decommissioned; Joint Declaration on the Security of Supply of Medical Radioisotopes. - United Arab Emirates: Federal Decree No. (51) of 2014 Ratifying the Convention on Supplementary Compensation for Nuclear Damage; Ratification of the Federal Supreme Council of Federal Decree No. (51) of 2014 Ratifying the Convention on Supplementary Compensation for Nuclear Damage

  16. Critical issues in an electronic documentation system.

    Science.gov (United States)

    Weir, Charlene R; Nebeker, Jonathan R

    2007-10-11

    The Veterans Health Administration (VHA) of the U.S. Department of Veterans Affairs has instituted an electronic medical record (EMR) that includes electronic documentation of all narrative components of the medical record. To support clinicians using the system, multiple efforts have been instituted to ease the creation of narrative reports. Although electronic documentation is easier to read and improves access to information, it may also create new and additional hazards for users. This study is the first step in a series of studies to evaluate the issues surrounding the creation and use of electronic documentation. Eighty-eight providers across multiple clinical roles were interviewed at 10 primary care sites in the VA system. Interviews were tape-recorded, transcribed and qualitatively analyzed for themes. In addition, specific questions were asked about perceived harm due to electronic documentation practices. Five themes relating to difficulties with electronic documentation were identified: 1) information overload; 2) hidden information; 3) lack of trust; 4) communication; 5) decision-making. Three providers reported that they knew of an incident where current documentation practices had caused patient harm, and over 75% of respondents reported significant mistrust of the system.

  17. Analyzing stakeholders' workshop dialogue for evidence of social learning

    Directory of Open Access Journals (Sweden)

    Amanda L. Bentley Brymer

    2018-03-01

    Full Text Available After much debate and synthesis, social learning scholarship is entering an era of empirical research. Given the range of individual-, network-, and systems-level perspectives and scales, clear documentation of social learning processes is critical for making claims about social learning outcomes and their impacts. Past studies have relied on participant recall and concept maps to document perceptions of social learning processes and outcomes. Using an individual-centric perspective and importing ideas from communication and psychology on question-answer learning through conversational agents, we contribute an expanded conceptual framework and a qualitative analytical strategy for assessing stakeholder dialogue for evidence of social learning. We observed stakeholder dialogue across five workshops coordinated for the Bruneau-Owyhee Sage-Grouse Habitat Project (BOSH) in Owyhee County, Idaho, USA. Participants' dialogue was audio recorded, transcribed, and analyzed for cross-case patterns. Deductive and inductive coding techniques were applied to illuminate cognitive, relational, and epistemic dimensions of learning and topics of learning. A key finding supports our inclusion of the epistemic dimension and highlights a need for future research: although some participants articulated epistemic positions, they did not challenge each other to share sources or justify factual claims. These findings align with previous research suggesting that, in addition to considering diversity and representation (who is at the table), we should pay more attention to how participants talk, perhaps prompting specific patterns of speech as we endeavor to draw causal connections between social learning processes and outcomes.

  18. Environmental restoration value engineering guidance document

    International Nuclear Information System (INIS)

    1995-07-01

    This document provides guidance on Value Engineering (VE). VE is an organized team effort led by a person trained in the methodology to analyze the functions of projects, systems, equipment, facilities, services, and processes for achieving the essential functions at the lowest life cycle cost while maintaining required performance, reliability, availability, quality, and safety. VE has proven to be a superior tool to improve up-front project planning, cut costs, and create a better value for each dollar spent. This document forms the basis for the Environmental Restoration VE Program, describes the VE process, and provides recommendations on when it can be most useful on ER projects

  19. Document image analysis: A primer

    Indian Academy of Sciences (India)

    R. Narasimhan (Krishtel eMaging)

    (1) Typical documents in today's office are computer-generated, but even so, inevitably by different computers and ... different sizes, from a business card to a large engineering drawing. Document analysis ... Whether global or adaptive ...

  20. Document management in engineering construction

    International Nuclear Information System (INIS)

    Liao Bing

    2008-01-01

    Document management is an important part of systematic quality management, which is one of the key factors in ensuring construction quality. In engineering construction, quality management and document management must work together at all times to ensure construction quality. Quality management ensures that documents are correctly generated and adopted, so that their completeness, accuracy, and systematic organization satisfy filing requirements. Document management ensures that documents are correctly transferred during construction, and that testimonies such as files and records are kept for the engineering construction and its quality management. This paper addresses document management in engineering construction based on the interworking of quality management and document management. (author)

  1. Recommended HSE-7 documents hierarchy

    International Nuclear Information System (INIS)

    Klein, R.B.; Jennrich, E.A.; Lund, D.M.; Danna, J.G.; Davis, K.D.; Rutz, A.C.

    1990-01-01

    This report recommends a hierarchy of waste management documents at Los Alamos National Laboratory (LANL or ''Laboratory''). The hierarchy addresses documents that are required to plan, implement, and document waste management programs at Los Alamos. These documents will enable the waste management group and the six sections contained within that group to satisfy requirements that are imposed upon them by the US Department of Energy (DOE), DOE Albuquerque Operations, US Environmental Protection Agency, various State of New Mexico agencies, and Laboratory management.

  2. Sources for charged particles

    International Nuclear Information System (INIS)

    Arianer, J.

    1997-01-01

    This document is a basic course on charged particle sources for post-graduate students and thematic schools on large facilities and accelerator physics. A simple but precise description of the creation and emission of charged particles is presented. The course relies on reference documents that are updated every year. The following relevant topics are considered: electronic emission processes, technological and practical considerations on electron guns, positron sources, production of neutral atoms, ionization, plasma and discharge, different types of positive and negative ion sources, polarized particle sources, materials for the construction of ion sources, and low-energy beam production and transport. (N.T.)

  3. GPC Single Source Letter

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  4. Analysis and Evaluation of the Skills of the Degree in Information and Documentation at the University of Zaragoza

    Directory of Open Access Journals (Sweden)

    María del Carmen AGUSTÍN LACRUZ

    2016-05-01

    The specific skills described in the teaching guides of the Degree in Information and Documentation at the University of Zaragoza are studied. The methodology consists of analyzing the skills of all subjects by entering them into a database and later processing them with SPSS (v. 22.0). The results cover 37 subjects. The average number of skills per subject is 3.1. The skills with the largest presence are: Preparation and dissemination of information; Knowledge of the professional environment of Information and Documentation; Identification and evaluation of information sources and resources; and Organization and storage of information. The skills without presence are: Information technology: telecommunications, and Business skills. In the core subjects, the most common skill is Preparation and dissemination of information. In the optional subjects, the most common skills are Knowledge of the professional environment of information and documentation; Identification and evaluation of sources and resources of information; and Preparation and dissemination of information.

  5. Improving collaborative documentation in CMS

    International Nuclear Information System (INIS)

    Lassila-Perini, Kati; Salmi, Leena

    2010-01-01

    Complete and up-to-date documentation is essential for efficient data analysis in a large and complex collaboration like CMS. Good documentation reduces the time spent in problem solving for users and software developers. The scientists in our research environment do not necessarily have the interests or skills of professional technical writers. This results in inconsistencies in the documentation. To improve the quality, we have started a multidisciplinary project involving CMS user support and expertise in technical communication from the University of Turku, Finland. In this paper, we present possible approaches to study the usability of the documentation, for instance, usability tests conducted recently for the CMS software and computing user documentation.

  6. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  7. Gaia DR2 documentation

    Science.gov (United States)

    van Leeuwen, F.; de Bruijne, J. H. J.; Arenou, F.; Bakker, J.; Blomme, R.; Busso, G.; Cacciari, C.; Castañeda, J.; Cellino, A.; Clotet, M.; Comoretto, G.; Eyer, L.; González-Núñez, J.; Guy, L.; Hambly, N.; Hobbs, D.; van Leeuwen, M.; Luri, X.; Manteiga, M.; Pourbaix, D.; Roegiers, T.; Salgado, J.; Sartoretti, P.; Tanga, P.; Ulla, A.; Utrilla Molina, E.; Abreu, A.; Altmann, M.; Andrae, R.; Antoja, T.; Audard, M.; Babusiaux, C.; Bailer-Jones, C. A. L.; Barache, C.; Bastian, U.; Beck, M.; Berthier, J.; Bianchi, L.; Biermann, M.; Bombrun, A.; Bossini, D.; Breddels, M.; Brown, A. G. A.; Busonero, D.; Butkevich, A.; Cantat-Gaudin, T.; Carrasco, J. M.; Cheek, N.; Clementini, G.; Creevey, O.; Crowley, C.; David, M.; Davidson, M.; De Angeli, F.; De Ridder, J.; Delbò, M.; Dell'Oro, A.; Diakité, S.; Distefano, E.; Drimmel, R.; Durán, J.; Evans, D. W.; Fabricius, C.; Fabrizio, M.; Fernández-Hernández, J.; Findeisen, K.; Fleitas, J.; Fouesneau, M.; Galluccio, L.; Gracia-Abril, G.; Guerra, R.; Gutiérrez-Sánchez, R.; Helmi, A.; Hernandez, J.; Holl, B.; Hutton, A.; Jean-Antoine-Piccolo, A.; Jevardat de Fombelle, G.; Joliet, E.; Jordi, C.; Juhász, Á.; Klioner, S.; Löffler, W.; Lammers, U.; Lanzafame, A.; Lebzelter, T.; Leclerc, N.; Lecoeur-Taïbi, I.; Lindegren, L.; Marinoni, S.; Marrese, P. M.; Mary, N.; Massari, D.; Messineo, R.; Michalik, D.; Mignard, F.; Molinaro, R.; Molnár, L.; Montegriffo, P.; Mora, A.; Mowlavi, N.; Muinonen, K.; Muraveva, T.; Nienartowicz, K.; Ordenovic, C.; Pancino, E.; Panem, C.; Pauwels, T.; Petit, J.; Plachy, E.; Portell, J.; Racero, E.; Regibo, S.; Reylé, C.; Rimoldini, L.; Ripepi, V.; Riva, A.; Robichon, N.; Robin, A.; Roelens, M.; Romero-Gómez, M.; Sarro, L.; Seabroke, G.; Segovia, J. C.; Siddiqui, H.; Smart, R.; Smith, K.; Sordo, R.; Soria, S.; Spoto, F.; Stephenson, C.; Turon, C.; Vallenari, A.; Veljanoski, J.; Voutsinas, S.

    2018-04-01

    The second Gaia data release, Gaia DR2, encompasses astrometry, photometry, radial velocities, astrophysical parameters (stellar effective temperature, extinction, reddening, radius, and luminosity), and variability information plus astrometry and photometry for a sample of pre-selected bodies in the solar system. The data collected during the first 22 months of the nominal, five-year mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC), resulting in this second data release. A summary of the release properties is provided in Gaia Collaboration et al. (2018b). The overall scientific validation of the data is described in Arenou et al. (2018). Background information on the mission and the spacecraft can be found in Gaia Collaboration et al. (2016), with a more detailed presentation of the Radial Velocity Spectrometer (RVS) in Cropper et al. (2018). In addition, Gaia DR2 is accompanied by various dedicated papers that describe the processing and validation of the various data products. Four more Gaia Collaboration papers present a glimpse of the scientific richness of the data. In addition to this set of refereed publications, this documentation provides a detailed, complete overview of the processing and validation of the Gaia DR2 data. Gaia data, from both Gaia DR1 and Gaia DR2, can be retrieved from the Gaia archive, which is accessible from https://archives.esac.esa.int/gaia. The archive also provides various tutorials on data access and data queries plus an integrated data model (i.e., description of the various fields in the data tables). In addition, Luri et al. (2018) provide concrete advice on how to deal with Gaia astrometry, with recommendations on how best to estimate distances from parallaxes. The Gaia archive features an enhanced visualisation service which can be used for quick initial explorations of the entire Gaia DR2 data set. Pre-computed cross matches between Gaia DR2 and a selected set of large surveys are
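As a toy illustration of the parallax-to-distance advice referenced above: the naive inversion d [pc] = 1000 / parallax [mas] is only reasonable when the relative parallax error is small, which is one reason Luri et al. (2018) recommend a proper (e.g. Bayesian) treatment. This sketch and its threshold value are assumptions for illustration, not a Gaia DPAC recipe.

```python
def naive_distance_pc(parallax_mas, parallax_error_mas, max_rel_err=0.1):
    """Return 1000/parallax in parsecs, or None when the measurement is
    non-positive or too noisy (relative error above max_rel_err), where
    the naive inverse is biased or undefined."""
    if parallax_mas <= 0:
        return None
    if parallax_error_mas / parallax_mas > max_rel_err:
        return None
    return 1000.0 / parallax_mas

print(naive_distance_pc(10.0, 0.04))   # nearby, precise star: 100.0 pc
print(naive_distance_pc(0.5, 0.3))     # distant, noisy parallax: None
```

For the rejected cases, a full treatment would infer a posterior distance using a distance prior rather than discard the measurement.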

  8. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, some CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer memory which contains the 32 K analyzer memory and eight AD converters.

  9. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1989-01-01

    A tentative survey of positron sources is given. Physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β+ decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used - mainly working with solenoidal fields - are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since for large machines a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances, and methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders.

  10. Handling of radioactive sources in Ecuador

    International Nuclear Information System (INIS)

    Benitez, Manuel

    2000-01-01

    This document describes the following aspects: sealed and unsealed radioactive sources, radiation detectors, personnel and area monitoring, surface pollution, control of radioactive wastes, and transfer of radioactive sources. (The author)

  11. The MetaLex Document Server : Legal Documents as Versioned Linked Data

    NARCIS (Netherlands)

    Hoekstra, R.; Aroyo, L.; Welty, C.; Alani, H.; Taylor, J.; Bernstein, A.; Kagal, L.; Noy, N.; Blomqvist, E.

    2011-01-01

    This paper introduces the MetaLex Document Server (MDS), an ongoing project to improve access to legal sources (regulations, court rulings) by means of a generic legal XML syntax (CEN MetaLex) and Linked Data. The MDS defines a generic conversion mechanism from legacy legal XML syntaxes to CEN

  12. 31 CFR 501.724 - Documents that may be withheld.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 3 2010-07-01 2010-07-01 false Documents that may be withheld. 501.724 Section 501.724 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued... privileged; (2) The document would disclose the identity of a confidential source; or (3) The Administrative...

  13. Investigating the statistical properties of user-generated documents

    OpenAIRE

    Inches, Giacomo; Carman, Mark J.; Crestani, Fabio

    2011-01-01

    The importance of the Internet as a communication medium is reflected in the large amount of documents being generated every day by users of the different services that take place online. In this work we aim at analyzing the properties of these online user-generated documents for some of the established services over the Internet (Kongregate, Twitter, Myspace and Slashdot) and comparing them with a consolidated collection of standard information retrieval documents (from the Wall Street...

  14. Investigating the Statistical Properties of User-Generated Documents

    OpenAIRE

    Inches Giacomo; Carman Mark James

    2011-01-01

    The importance of the Internet as a communication medium is reflected in the large amount of documents being generated every day by users of the different services that take place online. In this work we aim at analyzing the properties of these online user-generated documents for some of the established services over the Internet (Kongregate, Twitter, Myspace, and Slashdot) and comparing them with a consolidated collection of standard information retrieval documents (from the Wall Street Journal...

  15. Emergency medicine resident physicians' perceptions of electronic documentation and workflow: a mixed methods study.

    Science.gov (United States)

    Neri, P M; Redden, L; Poole, S; Pozner, C N; Horsky, J; Raja, A S; Poon, E; Schiff, G; Landman, A

    2015-01-01

    To understand emergency department (ED) physicians' use of electronic documentation in order to identify usability and workflow considerations for the design of future ED information system (EDIS) physician documentation modules. We invited emergency medicine resident physicians to participate in a mixed methods study using task analysis and qualitative interviews. Participants completed a simulated, standardized patient encounter in a medical simulation center while documenting in the test environment of a currently used EDIS. We recorded the time on task, type and sequence of tasks performed by the participants (including tasks performed in parallel). We then conducted semi-structured interviews with each participant. We analyzed these qualitative data using the constant comparative method to generate themes. Eight resident physicians participated. The simulation session averaged 17 minutes and participants spent 11 minutes on average on tasks that included electronic documentation. Participants performed tasks in parallel, such as history taking and electronic documentation. Five of the 8 participants performed a similar workflow sequence during the first part of the session while the remaining three used different workflows. Three themes characterize electronic documentation: (1) physicians report that location and timing of documentation varies based on patient acuity and workload, (2) physicians report a need for features that support improved efficiency; and (3) physicians like viewing available patient data but struggle with integration of the EDIS with other information sources. We confirmed that physicians spend much of their time on documentation (65%) during an ED patient visit. Further, we found that resident physicians did not all use the same workflow and approach even when presented with an identical standardized patient scenario. Future EHR design should consider these varied workflows while trying to optimize efficiency, such as improving

  16. Documentation of Cultural Heritage Objects

    Directory of Open Access Journals (Sweden)

    Jon Grobovšek

    2013-09-01

    EXTENDED ABSTRACT: The first and important phase of documentation of cultural heritage objects is to understand which objects need to be documented. The entire documentation process is determined by the characteristics and scope of the cultural heritage object. The next question to be considered is the expected outcome of the documentation process and the purpose for which it will be used. These two essential guidelines determine each stage of the documentation workflow: the choice of the most appropriate data capturing technology and data processing method, how detailed the documentation should be, what problems may occur, what the expected outcome is, what it will be used for, and the plan for storing data and results. Cultural heritage objects require diverse data capturing and data processing methods. It is important that even the first stages of raw data capturing are oriented towards the applicability of results. The selection of the appropriate working method can facilitate the data processing and the preparation of final documentation. Documentation of paintings requires a different data capturing method than documentation of buildings or building areas. The purpose of documentation can also be the preservation of contemporary cultural heritage for posterity, or the basis for future projects and activities on threatened objects. Documentation procedures should be adapted to our needs and capabilities. Captured and unprocessed data are lost unless accompanied by additional analyses and interpretations. Information on tools, procedures and outcomes must be included in the documentation. A thorough analysis of unprocessed but accessible documentation, if adequately stored and accompanied by additional information, enables us to gather useful data. In this way it is possible to upgrade the existing documentation and to avoid data duplication or unintentional misleading of users. 
The documentation should be archived safely and in a way to meet

  17. An observational study of the accuracy and completeness of an anesthesia information management system: recommendations for documentation system changes.

    Science.gov (United States)

    Wilbanks, Bryan A; Moss, Jacqueline A; Berner, Eta S

    2013-08-01

    Anesthesia information management systems must often be tailored to fit the environment in which they are implemented. Extensive customization necessitates that systems be analyzed for both accuracy and completeness of documentation design to ensure that the final record is a true representation of practice. The purpose of this study was to determine the accuracy of a recently installed system in the capture of key perianesthesia data. This study used an observational design and was conducted using a convenience sample of nurse anesthetists. Observational data of the nurse anesthetists' delivery of anesthesia care were collected using a touch-screen tablet computer with a customized observational data collection tool built in an Access database. A questionnaire was also administered to these nurse anesthetists to assess perceived accuracy, completeness, and satisfaction with the electronic documentation system. The major sources of data not documented in the system were anesthesiologist presence (20%) and placement of intravenous lines (20%). The major sources of inaccuracies in documentation were gas flow rates (45%), medication administration times (30%), and documentation of neuromuscular function testing (20%); all of the sources of inaccuracies were related to the use of charting templates that were not altered to reflect the actual interventions performed.

  18. DOD Renewable Energy Projects: Improved Guidance Needed for Analyzing and Documenting Costs and Benefits

    Science.gov (United States)

    2016-09-01

    • Consumption. To the extent economically feasible and technically practicable, not less than 7.5 percent of electrical energy ... Policy Act of 2005, to count toward the consumption goal, DOD must possess renewable energy credits for electricity it consumes. Executive Order ... difficulties inherent in predicting electricity prices sometimes decades into the future. However, DOD did not consistently describe the

  19. SECAD-- a Schema-based Environment for Configuring, Analyzing and Documenting Integrated Fusion Simulations. Final report

    International Nuclear Information System (INIS)

    Shasharina, Svetlana

    2012-01-01

    SECAD is a project that developed a GUI for running integrated fusion simulations as implemented in the FACETS and SWIM SciDAC projects. Using the GUI, users can submit simulations locally and remotely and visualize the simulation results.

  20. EDF Group - 2010 Reference Document

    International Nuclear Information System (INIS)

    2011-04-01

    Besides the accounts of EDF for 2008 and 2009, this voluminous document presents the persons in charge and the legal account auditors, and describes how risks are managed within the company. It gives an overview of EDF's activities, organization, and assets. It presents and discusses the company's financial situation and results, indicates the main contracts, and provides other documents concerning the company. Many documents and reports are provided in the appendix.

  1. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operating voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for precise energy measurement of high-energy particles in the MeV range. (author)

  2. pBEAM Documentation: Release 0.1.0

    Energy Technology Data Exchange (ETDEWEB)

    Ning, S. A. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-09-01

    The Polynomial Beam Element Analysis Module (pBEAM) is a finite element code for beam-like structures. It was originally written to analyze tower/monopiles and rotor blades of wind turbines but can be used for any beam-like structure. This document discusses installation, usage, and documentation of the module.

  3. 32 CFR 989.11 - Combining EIAP with other documentation.

    Science.gov (United States)

    2010-07-01

    ... documentation. (a) The EPF combines environmental analysis with other related documentation when practicable (40 CFR 1506.4) following the procedures prescribed by the CEQ regulations and this part. (b) The EPF must... the EIAP. Prior to making a decision to proceed, the EPF must analyze the environmental impacts that...

  4. Nuclear power plants documentation system

    International Nuclear Information System (INIS)

    Schwartz, E.L.

    1991-01-01

    Since the amount of documents (type and quantity) necessary for the entire design of a NPP is very large, an overall and detailed identification, filing, and retrieval system shall be implemented. This is even more applicable to the FINAL QUALITY DOCUMENTATION of the plant, as stipulated by IAEA Safety Codes and related guides. For such a purpose a DOCUMENTATION MANUAL was developed, which describes the aforementioned documentation system in detail. Here we present the expected goals and results which we have to reach for the Angra 2 and 3 Project. (author)

  5. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the time of events, and has prompted efforts to identify events and solve problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  6. Documenting Architectural Heritage in Bahia, Brazil, Using Spherical Photogrammetry

    Science.gov (United States)

    De Amorim, A. L.; Fangi, G.; Malinverni, E. S.

    2013-07-01

    The Cultural Heritage disappears at a rate higher than we are able not only to restore but also to document: human and natural factors, negligence or, worse, deliberate demolitions endanger the collective Architectural Heritage (AH). According to CIPA statements, recording is important and has to follow some guidelines. The Architectural and Urban Heritage data have to be historically related, critically assessed and analyzed, before being organized according to a thematic structure and becoming available for further uses. This paper shows the experiences in documenting architectural heritage developed by the Laboratory of Computer Graphics applied to Architecture and Design (LCAD), at the Architecture School of the Federal University of Bahia (FAUFBA), Brazil, in cooperation with the Università Politecnica delle Marche (UNIVPM, DICEA Department), Italy. The research has so far been carried out in the historical sites of Bahia, such as the Pelourinho neighborhood, a UNESCO World Heritage Site. Other historical sites, like the cities of Lençóis and Mucugê in the Chapada Diamantina region, are planned for this survey. The aim is to build a technological platform based on low cost digital technologies and open source tools, such as Panoramic Spherical Photogrammetry, Spatial Database, Geographic Information Systems, Three-dimensional Geometric Modeling, and CAD technology, for the collection, validation and dissemination of AH.

  7. Basic freight forwarding and transport documentation in freight forwarder’s work

    Directory of Open Access Journals (Sweden)

    Adam Salomon

    2014-09-01

    The purpose of the article is to present the basic documentation in an international freight forwarder’s work, in particular insurance documents and transport documents in the various modes of transport. An additional goal is to identify the sources that can be used to properly complete the individual documents.

  8. Storing XML Documents in Databases

    NARCIS (Netherlands)

    A.R. Schmidt; S. Manegold (Stefan); M.L. Kersten (Martin); L.C. Rivero; J.H. Doorn; V.E. Ferraggine

    2005-01-01

    The authors introduce concepts for loading large amounts of XML documents into databases where the documents are stored and maintained. The goal is to make XML databases as unobtrusive in multi-tier systems as possible and at the same time provide as many services defined by the XML

  9. Magnetic fusion program summary document

    International Nuclear Information System (INIS)

    1979-04-01

    This document outlines the current and planned research, development, and commercialization (RD and C) activities of the Office of Fusion Energy under the Assistant Secretary for Energy Technology, US Department of Energy (DOE). The purpose of this document is to explain the Office of Fusion Energy's activities to Congress and its committees and to interested members of the public.

  10. Documenting the Engineering Design Process

    Science.gov (United States)

    Hollers, Brent

    2017-01-01

    Documentation of ideas and the engineering design process is a critical, daily component of a professional engineer's job. While patent protection is often cited as the primary rationale for documentation, it can also benefit the engineer, the team, company, and stakeholders through creating a more rigorously designed and purposeful solution.…

  11. ITK optical links backup document

    CERN Document Server

    Huffman, B T; The ATLAS collaboration; Flick, T; Ye, J

    2013-01-01

    This document describes the proposed optical links to be used for the ITK in the phase II upgrade. The current R&D for optical links pursued in the Versatile Link group is reviewed. In particular the results demonstrating the radiation tolerance of all the on-detector components are documented. The bandwidth requirements and the resulting numerology are given.

  12. Contextualizing Data Warehouses with Documents

    DEFF Research Database (Denmark)

    Perez, Juan Manuel; Berlanga, Rafael; Aramburu, Maria Jose

    2008-01-01

    warehouse with a document warehouse, resulting in a contextualized warehouse. Thus, the user first selects an analysis context by supplying some keywords. Then, the analysis is performed on a novel type of OLAP cube, called an R-cube, which is materialized by retrieving and ranking the documents...

  13. Chemical Contaminant and Decontaminant Test Methodology Source Document. Second Edition

    Science.gov (United States)

    2012-07-01

    endorsement of any commercial products. This report may not be cited for purposes of advertisement. This report has been approved for public release ... Department of Justice: Washington, DC, 2001. UNCLASSIFIED Report. 6. Stuempfle, A. K.; Howells, D. J.; Armour, S. J.; Boulet, C. A. International Task Force

  14. Sources and Sinks: Elucidating Mechanisms, Documenting Patterns, and Forecasting Impacts

    Science.gov (United States)

    2017-01-18

    diversity, we performed a randomized block ANOVA on allelic richness and expected heterozygosity using study site as treatment and blocking by locus. We ... 100,000 burn-in period and 100,000 MCMC (Markov chain Monte Carlo) repetitions. The value of k with the lowest DIC value was chosen as the appropriate ... Molecular Ecology 17: 3628-3639. Fazio III, V. W., Miles, D. B., & White, M. M. 2004. Genetic differentiation in the endangered Black-capped Vireo

  15. Development of the 2007 Chemical Decontaminant Source Document

    Science.gov (United States)

    2009-03-01

    [Flattened table of 5975 MS acquisition parameters] Number: AB002; tune file: atune.u; acquisition mode: SIM; solvent delay: 3.50 min / 5.00 min; EM absolute: false; EM offset: 400; resulting EM voltage: 2400.0

  16. Triangular clustering in document networks

    Energy Technology Data Exchange (ETDEWEB)

    Cheng Xueqi; Ren Fuxin [Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190 (China); Zhou Shi [Department of Computer Science, University College London, Malet Place, London WC1E 6BT (United Kingdom); Hu Maobin [School of Engineering Science, University of Science and Technology of China, Hefei 230026 (China)], E-mail: cxq@ict.ac.cn, E-mail: renfuxin@software.ict.ac.cn, E-mail: s.zhou@adastral.ucl.ac.uk, E-mail: humaobin@ustc.edu.cn

    2009-03-15

Document networks have the characteristic that a document node, e.g. a webpage or an article, carries meaningful content. Properties of document networks are not only affected by topological connectivity between nodes, but are also strongly influenced by the semantic relation between the content of the nodes. We observed that document networks have a large number of triangles and a high clustering coefficient. There is also a strong correlation between the probability of formation of a triangle and the content similarity among the three nodes involved. We propose the degree-similarity product (DSP) model, which reproduces these properties well. The model achieves this by using a preferential attachment mechanism that favours linkage between nodes that are both popular and similar. This work is a step forward towards a better understanding of the structure and evolution of document networks.
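The attachment rule can be sketched as follows; this is a minimal reading of the DSP mechanism (link probability proportional to degree times similarity), and the published model's initialization and normalization details may differ.

```python
import random

def dsp_network(similarity, m=2, seed=0):
    """Grow a network with degree-similarity product (DSP) attachment.

    Each new node i links to m existing nodes j, picked with probability
    proportional to degree(j) * similarity[i][j].  Assumes strictly
    positive similarities; the first m+1 nodes form a seed clique.
    """
    rng = random.Random(seed)
    n = len(similarity)
    edges = {(j, k) for j in range(m + 1) for k in range(j + 1, m + 1)}
    degree = [0] * n
    for j, k in edges:
        degree[j] += 1
        degree[k] += 1
    for i in range(m + 1, n):
        candidates = list(range(i))
        targets = set()
        while len(targets) < m:
            weights = [degree[j] * similarity[i][j] for j in candidates]
            j = rng.choices(candidates, weights=weights, k=1)[0]
            targets.add(j)
        for j in targets:
            edges.add((j, i))
            degree[j] += 1
            degree[i] += 1
    return edges
```

With uniform similarities the rule reduces to ordinary preferential attachment; content similarity skews the links toward topically related nodes, producing the triangle surplus the abstract describes.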

  17. Engineering Documentation and Data Control

    Science.gov (United States)

    Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica

    2001-01-01

Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.

  18. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps necessary for creating an 'analyzing instrument' based on an open source software called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
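A preparatory step for such an instrument is converting raw log lines into Weka's ARFF input format. The sketch below assumes a simple `LEVEL source message` line layout and an illustrative attribute set; a real system log has more fields.

```python
def log_to_arff(log_lines, arff_path):
    """Convert 'LEVEL source message' log lines into an ARFF file for Weka.

    The chosen attributes (log level, source, message length) are purely
    illustrative.  Lines that do not match the layout are skipped.
    """
    rows = []
    for line in log_lines:
        parts = line.split(maxsplit=2)
        if len(parts) == 3:
            level, source, message = parts
            rows.append((level, source, len(message)))
    levels = sorted({r[0] for r in rows})
    sources = sorted({r[1] for r in rows})
    with open(arff_path, "w") as f:
        f.write("@relation syslog\n\n")
        f.write("@attribute level {%s}\n" % ",".join(levels))
        f.write("@attribute source {%s}\n" % ",".join(sources))
        f.write("@attribute msg_length numeric\n\n")
        f.write("@data\n")
        for level, source, length in rows:
            f.write("%s,%s,%d\n" % (level, source, length))
```

The resulting file can be opened directly in the Weka Explorer for clustering or classification.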

  19. Multiple sclerosis documentation system (MSDS): moving from documentation to management of MS patients.

    Science.gov (United States)

    Ziemssen, Tjalf; Kempcke, Raimar; Eulitz, Marco; Großmann, Lars; Suhrbier, Alexander; Thomas, Katja; Schultheiss, Thorsten

    2013-09-01

The long disease duration of multiple sclerosis and the increasing therapeutic options require an individualized therapeutic approach which should be carefully documented over years of observation. To switch from MS documentation to innovative MS management, new computer- and internet-based tools can be implemented, as we demonstrate with the novel computer-based patient management system "multiple sclerosis management system 3D" (MSDS 3D). MSDS 3D allows documentation and management of visit schedules and mandatory examinations via defined study modules by integrating data input from various sources (patients, attending physicians and MS nurses). It provides forms for the documentation of patient visits as well as clinical and diagnostic findings. Information can be collected via interactive touch screens. Specific modules allow the management of highly efficacious treatments such as natalizumab or fingolimod. MSDS can be used to transfer the documented data to databases such as the registry of the German MS society or REGIMS. MSDS has already been implemented successfully in clinical practice and is currently being evaluated in a multicenter setting. High-quality management and documentation are crucial for improvements in clinical practice and research work.

  20. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The developed system had a severe performance problem: the resulting spectrum lacked smoothness; it was very noisy and full of spikes and surges, making it impossible to use for analyzing the cement substance. This paper describes the work carried out to improve the system performance
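One common remedy for isolated spikes and surges in a pulse-height spectrum is a sliding median filter, sketched below; this is a generic smoothing technique, not necessarily the fix adopted in the cited work.

```python
def despike(spectrum, window=5):
    """Remove isolated spikes from a spectrum with a sliding median filter.

    Each channel is replaced by the median of its neighborhood; a single
    outlier channel cannot dominate the median, so spikes are suppressed
    while genuine peaks spanning several channels survive.
    """
    half = window // 2
    out = []
    for i in range(len(spectrum)):
        lo, hi = max(0, i - half), min(len(spectrum), i + half + 1)
        neighborhood = sorted(spectrum[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out
```

For example, `despike([10, 11, 500, 12, 10])` suppresses the single-channel spike at 500 while leaving the baseline counts essentially unchanged.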

  1. The Use of Speech Technology to Protect the Document Turnover

    Directory of Open Access Journals (Sweden)

    Alexandr M. Alyushin

    2017-06-01

Full Text Available The paper shows how widely paper documents are still used in practical workflows. Two basic aspects of document protection are distinguished: protection of content and protection of the legal component. The content component concerns the semantic information carried by the document; the legal component concerns the facts and conditions of the document's creation, approval and negotiation by specific persons. The importance of document protection is shown in connection with possible terrorist threats, and the time needed to detect a forgery is identified as a key factor in the efficiency of protection. Detection-time requirements are analyzed for documents of different kinds (financial, legal and managerial), and documents used for the operational management of dangerous facilities are singled out as the most sensitive to falsification: their deliberate falsification can lead to accidents, technogenic catastrophes and human casualties. A comparative analysis of currently used document protection methods is presented, biometric and non-biometric methods are distinguished, and their shortcomings are analyzed. The paper concludes that document protection based on voice-signature technology is promising and analyzes the basic steps of voice information processing in its implementation. Software implementing this new technology for protecting documents against counterfeiting is proposed. The technology is based on appending an audio marker to the end of the document, containing general information about it, and is applicable to a wide range of documents such as financial and valuable papers, contracts, etc. One of its most important advantages is that no changes can be made to the document without its author, because the audio marker carries the author's biometric data

  2. Shoulder dystocia documentation: an evaluation of a documentation training intervention.

    Science.gov (United States)

    LeRiche, Tammy; Oppenheimer, Lawrence; Caughey, Sharon; Fell, Deshayne; Walker, Mark

    2015-03-01

    To evaluate the quality and content of nurse and physician shoulder dystocia delivery documentation before and after MORE training in shoulder dystocia management skills and documentation. Approximately 384 charts at the Ottawa Hospital General Campus involving a diagnosis of shoulder dystocia between the years of 2000 and 2006 excluding the training year of 2003 were identified. The charts were evaluated for 14 key components derived from a validated instrument. The delivery notes were then scored based on these components by 2 separate investigators who were blinded to delivery note author, date, and patient identification to further quantify delivery record quality. Approximately 346 charts were reviewed for physician and nurse delivery documentation. The average score for physician notes was 6 (maximum possible score of 14) both before and after the training intervention. The nurses' average score was 5 before and after the training intervention. Negligible improvement was observed in the content and quality of shoulder dystocia documentation before and after nurse and physician training.

  3. Energy sources

    International Nuclear Information System (INIS)

    Vajda, Gy.

    1998-01-01

A comprehensive review of the available sources of energy in the world is presented. About 80 percent of primary energy utilization is based on fossil fuels, and their dominant role is not expected to change in the foreseeable future. Data are given on petroleum, natural gas and coal based power production. The role and economic aspects of nuclear power are analyzed. A brief summary of renewable energy sources is presented. The future prospects of the world's energy resources are discussed, and the special position of Hungary regarding fossil, nuclear and renewable energy and the country's energy potential is evaluated. (R.P.)

  4. Audit of Orthopaedic Surgical Documentation

    Directory of Open Access Journals (Sweden)

    Fionn Coughlan

    2015-01-01

    Full Text Available Introduction. The Royal College of Surgeons in England published guidelines in 2008 outlining the information that should be documented at each surgery. St. James’s Hospital uses a standard operation sheet for all surgical procedures and these were examined to assess documentation standards. Objectives. To retrospectively audit the hand written orthopaedic operative notes according to established guidelines. Methods. A total of 63 operation notes over seven months were audited in terms of date and time of surgery, surgeon, procedure, elective or emergency indication, operative diagnosis, incision details, signature, closure details, tourniquet time, postop instructions, complications, prosthesis, and serial numbers. Results. A consultant performed 71.4% of procedures; however, 85.7% of the operative notes were written by the registrar. The date and time of surgery, name of surgeon, procedure name, and signature were documented in all cases. The operative diagnosis and postoperative instructions were frequently not documented in the designated location. Incision details were included in 81.7% and prosthesis details in only 30% while the tourniquet time was not documented in any. Conclusion. Completion and documentation of operative procedures were excellent in some areas; improvement is needed in documenting tourniquet time, prosthesis and incision details, and the location of operative diagnosis and postoperative instructions.

  5. Title List of documents made publicly available

    International Nuclear Information System (INIS)

    1982-07-01

    The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes: (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index. The docketed information contained in the Title List includes the information formerly issued through the Department of Energy publication Power Reactor Docket Information, last published in January 1979. Microfiche of the docketed information listed in the Title List is available for sale on a subscription basis from the National Technical Information Service (NTIS)

  6. Title List of documents made publicly available

    International Nuclear Information System (INIS)

    1982-06-01

The Title List of Documents Made Publicly Available is a monthly publication. It contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes: (1) docketed material associated with civilian nuclear power plants and other uses of radioactive materials and (2) nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. As used here, docketed does not refer to Court dockets; it refers to the system by which NRC maintains its regulatory records. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index. The docketed information contained in the Title List includes the information formerly issued through the Department of Energy publication Power Reactor Docket Information, last published in January 1979. Microfiche of the docketed information listed in the Title List is available for sale on a subscription basis from the National Technical Information Service (NTIS)

  7. A Framework for the Systematic Collection of Open Source Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Pouchard, Line Catherine [ORNL; Trien, Joseph P [ORNL; Dobson, Jonathan D [ORNL

    2009-01-01

    Following legislative directions, the Intelligence Community has been mandated to make greater use of Open Source Intelligence (OSINT). Efforts are underway to increase the use of OSINT but there are many obstacles. One of these obstacles is the lack of tools helping to manage the volume of available data and ascertain its credibility. We propose a unique system for selecting, collecting and storing Open Source data from the Web and the Open Source Center. Some data management tasks are automated, document source is retained, and metadata containing geographical coordinates are added to the documents. Analysts are thus empowered to search, view, store, and analyze Web data within a single tool. We present ORCAT I and ORCAT II, two implementations of the system.

  8. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  9. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator controlled logical windows. It is implemented in CAMAC standard. A single crate contains all required modules and is controlled by the PDP-11/10 minicomputer. Configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No 958/E-8. (author)

  10. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

Traditionally, analyzing data happens via batch processing and interactive work at the terminal. This project aims to provide another way of analyzing data files: a cloud-based approach, offering a productive and interactive environment through the combination of FCC and SWAN software.

  11. Documenting the Earliest Chinese Journals

    Directory of Open Access Journals (Sweden)

    Jian-zhong (Joe Zhou

    2001-10-01

    Full Text Available

According to various authoritative sources, the English word "journal" was first used in the 16th century, but the existence of the journal in its original meaning as a daily record can be traced back to the Acta Diurna (Daily Events) in ancient Roman cities as early as 59 B.C. This article documents the first appearance of Chinese daily records, which were much earlier than 59 B.C.

The evidence of the earlier Chinese daily records came from some important archaeological discoveries in the 1970's, but they were also documented by Sima Qian (145 B.C. - 85 B.C.), the grand historian of the Han Dynasty imperial court. Sima's lifetime contribution was the publication of Shi Ji (史記) (The Grand Scribe's Records; the Records hereafter). The Records is a book of history of a grand scope. It encompasses all Chinese history from the 30th century B.C. through the end of the second century B.C. in 130 chapters and over 525,000 Chinese

  12. Documents

    International Development Research Centre (IDRC) Digital Library (Canada)

livelihoods in spite of chronic water shortages. Farmers' Association: By the People for the People. Supported by WaDImena, a team from the Desert Development Centre (DDC) at the American University in Cairo helped farmers to found their first association to improve agricultural water management in Abu Minqar.

  13. Mimvec: a deep learning approach for analyzing the human phenome.

    Science.gov (United States)

    Gan, Mingxin; Li, Wenran; Zeng, Wanwen; Wang, Xiaojian; Jiang, Rui

    2017-09-21

The human phenome has been widely used with a variety of genomic data sources in the inference of disease genes. However, most existing methods thus far derive phenotype similarity based on the analysis of biomedical databases by using the traditional term frequency-inverse document frequency (TF-IDF) formulation. This framework, though intuitive, not only ignores semantic relationships between words but also tends to produce high-dimensional vectors, and hence lacks the ability to precisely capture intrinsic semantic characteristics of biomedical documents. To overcome these limitations, we propose a framework called mimvec to analyze the human phenome by making use of the state-of-the-art deep learning technique in natural language processing. We converted 24,061 records in the Online Mendelian Inheritance in Man (OMIM) database to low-dimensional vectors using our method. We demonstrated that the vector representation not only effectively enabled classification of phenotype records against gene ones, but also succeeded in discriminating diseases of different inheritance styles and different mechanisms. We further derived pairwise phenotype similarities between 7988 human inherited diseases using their vector representations. With a joint analysis of this phenome with multiple genomic data, we showed that phenotype overlap indeed implied genotype overlap. We finally used the derived phenotype similarities with genomic data to prioritize candidate genes and demonstrated advantages of this method over existing ones. Our method is capable of not only capturing semantic relationships between words in biomedical records but also alleviating the dimensional disaster accompanying the traditional TF-IDF framework. With the approach of precision medicine, there will be abundant electronic records of medicine and health awaiting deep analysis, and we expect to see a wide spectrum of applications borrowing the idea of our method in the near future.
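For contrast, the traditional TF-IDF formulation criticized above can be sketched in a few lines. Note how each document vector has one dimension per vocabulary word, so dimensionality grows with the corpus, and a word occurring in every record receives zero weight.

```python
import math
from collections import Counter

def tfidf_vectors(documents):
    """Classic TF-IDF vectorization over whitespace-tokenized documents.

    Returns the sorted vocabulary and one weight vector per document,
    with weight tf(word) * log(N / df(word)).
    """
    tokenized = [doc.lower().split() for doc in documents]
    vocab = sorted({w for toks in tokenized for w in toks})
    n = len(documents)
    df = Counter(w for toks in tokenized for w in set(toks))
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append([tf[w] * math.log(n / df[w]) for w in vocab])
    return vocab, vectors
```

On a real corpus like OMIM the vocabulary, and hence the vector dimension, runs into the tens of thousands, which is exactly the "dimensional disaster" the embedding approach avoids.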

  14. Document Management in Local Government.

    Science.gov (United States)

    Williams, Bernard J. S.

    1998-01-01

    The latest in electronic document management in British local government is discussed. Finance, revenues, and benefits systems of leading vendors to local authorities are highlighted. A planning decisions archive management system and other information services are discussed. (AEF)

  15. Vietnamese Document Representation and Classification

    Science.gov (United States)

    Nguyen, Giang-Son; Gao, Xiaoying; Andreae, Peter

    Vietnamese is very different from English and little research has been done on Vietnamese document classification, or indeed, on any kind of Vietnamese language processing, and only a few small corpora are available for research. We created a large Vietnamese text corpus with about 18000 documents, and manually classified them based on different criteria such as topics and styles, giving several classification tasks of different difficulty levels. This paper introduces a new syllable-based document representation at the morphological level of the language for efficient classification. We tested the representation on our corpus with different classification tasks using six classification algorithms and two feature selection techniques. Our experiments show that the new representation is effective for Vietnamese categorization, and suggest that best performance can be achieved using syllable-pair document representation, an SVM with a polynomial kernel as the learning algorithm, and using Information gain and an external dictionary for feature selection.
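The syllable-pair representation can be illustrated as follows; in Vietnamese, whitespace separates syllables rather than words, so adjacent-syllable bigrams approximate multi-syllable words. The paper's exact preprocessing may differ.

```python
from collections import Counter

def syllable_pair_features(text):
    """Represent a Vietnamese sentence by its adjacent syllable pairs.

    Splitting on whitespace yields syllables; joining neighbors with an
    underscore produces syllable-pair features for a bag-of-features
    classifier.
    """
    syllables = text.lower().split()
    pairs = [f"{a}_{b}" for a, b in zip(syllables, syllables[1:])]
    return Counter(pairs)
```

These counts would then feed a learner such as an SVM with a polynomial kernel, as in the experiments described above.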

  16. Virtual Library Design Document; TOPICAL

    International Nuclear Information System (INIS)

    M. A. deLamare

    2001-01-01

    The objective of this document is to establish a design for the virtual library user and administrative layers that complies with the requirements of the virtual library software specification and subordinate module specification

  17. Reactor operation environmental information document

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, L.R.; Hayes, D.W.; Hunter, C.H.; Marter, W.L.; Moyer, R.A.

    1989-12-01

    This volume is a reactor operation environmental information document for the Savannah River Plant. Topics include meteorology, surface hydrology, transport, environmental impacts, and radiation effects. 48 figs., 56 tabs. (KD)

  18. CERN Document Server (CDS): Introduction

    CERN Multimedia

    CERN. Geneva; Costa, Flavio

    2017-01-01

    A short online tutorial introducing the CERN Document Server (CDS). Basic functionality description, the notion of Revisions and the CDS test environment. Links: CDS Production environment CDS Test environment  

  19. Document development, management and transmission

    International Nuclear Information System (INIS)

    Meister, K.

    1998-01-01

    Environmental monitoring can only be carried out by means of cartographic outputs allowing the representation of information in a compressed way and with a local reference. On account of this requirement and the continuously growing importance of international data exchange the development of a universal tool for the combination of data to so-called documents has been started for the management and for the exchange of these documents with other systems. (R.P.)

  20. ENDF/B summary documentation

    International Nuclear Information System (INIS)

    Garber, D.

    1975-10-01

    Descriptions of the evaluations contained in the ENDF/B library are given. The summary documentation presented is intended to be a more detailed description than the (File 1) comments contained in the computer-readable data files, but not so detailed as the formal reports describing each ENDF/B evaluation. The documentations were written by the CSEWG evaluators and compiled by NNCSC. Selected materials which comprise this volume include from 1 H to 244 Cm

  1. Storing XML Documents in Databases

    OpenAIRE

    Schmidt, A.R.; Manegold, Stefan; Kersten, Martin; Rivero, L.C.; Doorn, J.H.; Ferraggine, V.E.

    2005-01-01

The authors introduce concepts for loading large amounts of XML documents into databases where the documents are stored and maintained. The goal is to make XML databases as unobtrusive in multi-tier systems as possible and at the same time provide as many services defined by the XML standards as possible. The ubiquity of XML has sparked great interest in deploying concepts known from Relational Database Management Systems such as declarative query languages, transactions, indexes ...

  2. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski Rahkonen, O [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics; Krause, M O [Oak Ridge National Lab., Tenn. (USA)

    1978-02-01

    Relativistic correction terms up to the second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation of the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer performed using kinetic energies from 600 to 2100 eV are in good agreement with this theory showing our approximation (neglect of fringing fields, and source and detector geometry) is realistic enough for actual calibration purposes.
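The generic origin of such correction terms can be sketched from the standard series expansion of the relativistic kinetic energy; this is the textbook expansion, not the paper's analyzer-specific result, which additionally depends on the toroidal geometry:

```latex
T \;=\; \sqrt{p^{2}c^{2} + m^{2}c^{4}} \;-\; mc^{2}
  \;=\; \frac{p^{2}}{2m} \;-\; \frac{p^{4}}{8 m^{3} c^{2}} \;+\; \mathcal{O}(p^{6})
```

Since $p^{2} \approx 2mT$, the first correction is of relative order $T/(2mc^{2})$; with $mc^{2} = 511$ keV this amounts to about 0.2% at $T = 2$ keV, comparable to the precision required over the 600-2100 eV range probed in the measurements.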

  3. Tritium sources

    International Nuclear Information System (INIS)

    Glodic, S.; Boreli, F.

    1993-01-01

    Tritium is the only radioactive isotope of hydrogen. It directly follows the metabolism of water and it can be bound into genetic material, so it is very important to control levels of contamination. In order to define the state of contamination it is necessary to establish 'zero level', i.e. actual global inventory. The importance of tritium contamination monitoring increases with the development of fusion power installations. Different sources of tritium are analyzed and summarized in this paper. (author)

  4. Patterns for Effectively Documenting Frameworks

    Science.gov (United States)

    Aguiar, Ademar; David, Gabriel

    Good design and implementation are necessary but not sufficient pre-requisites for successfully reusing object-oriented frameworks. Although not always recognized, good documentation is crucial for effective framework reuse, and often hard, costly, and tiresome, coming with many issues, especially when we are not aware of the key problems and respective ways of addressing them. Based on existing literature, case studies and lessons learned, the authors have been mining proven solutions to recurrent problems of documenting object-oriented frameworks, and writing them in pattern form, as patterns are a very effective way of communicating expertise and best practices. This paper presents a small set of patterns addressing problems related to the framework documentation itself, here seen as an autonomous and tangible product independent of the process used to create it. The patterns aim at helping non-experts on cost-effectively documenting object-oriented frameworks. In concrete, these patterns provide guidance on choosing the kinds of documents to produce, how to relate them, and which contents to include. Although the focus is more on the documents themselves, rather than on the process and tools to produce them, some guidelines are also presented in the paper to help on applying the patterns to a specific framework.

  5. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  6. ON EXPERIENCE OF THE ELECTRONIC DOCUMENT MANAGEMENT SYSTEM IMPLEMENTATION IN THE MEDICAL UNIVERSITY

    OpenAIRE

    A. V. Semenets; V. Yu. Kovalok

    2015-01-01

The importance of applying electronic document management in Ukrainian healthcare is shown. An overview of the electronic document management systems market is presented. An example of the usage of an open-source electronic document management system at the I. Ya. Horbachevsky Ternopil State Medical University is shown. The implementation capabilities of an electronic document management system within cloud services are shown. The electronic document management features of the Mi...

  7. Massively Scalable Near Duplicate Detection in Streams of Documents using MDSH

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, Paul Logasa [ORNL; Symons, Christopher T [ORNL; McKenzie, Amber T [ORNL; Patton, Robert M [ORNL; Gillen, Rob [ORNL

    2013-01-01

    In a world where large-scale text collections are not only becoming ubiquitous but also are growing at increasing rates, near duplicate documents are becoming a growing concern that has the potential to hinder many different information filtering tasks. While others have tried to address this problem, prior techniques have only been used on limited collection sizes and static cases. We will briefly describe the problem in the context of Open Source Intelligence (OSINT) along with our additional constraints for performance. In this work we propose two variations on Multi-dimensional Spectral Hash (MDSH) tailored for working on extremely large, growing sets of text documents. We analyze the memory and runtime characteristics of our techniques and provide an informal analysis of the quality of the near-duplicate clusters produced by our techniques.
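The underlying idea can be illustrated with a plain MinHash-over-shingles signature. This is a generic sketch, not the MDSH variants proposed in the paper, but it shows the same property: near-duplicate documents yield near-identical compact signatures, so candidate clusters can be found without pairwise full-text comparison.

```python
import hashlib

def minhash_signature(text, num_hashes=64, shingle_len=4):
    """Compact MinHash signature over character shingles.

    For each of num_hashes seeded hash functions, keep the minimum hash
    over all shingles; documents with high shingle overlap agree on most
    signature positions.
    """
    shingles = {text[i:i + shingle_len]
                for i in range(len(text) - shingle_len + 1)}
    sig = []
    for seed in range(num_hashes):
        best = None
        for s in shingles:
            h = hashlib.md5(f"{seed}:{s}".encode()).hexdigest()
            if best is None or h < best:
                best = h
        sig.append(best)
    return sig

def similarity(sig_a, sig_b):
    """Fraction of matching positions, estimating Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

In a streaming setting, signatures are computed once per arriving document and bucketed, so near-duplicate detection scales with the stream rather than with all pairs.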

  8. Title list of documents made publicly available

    International Nuclear Information System (INIS)

    1979-12-01

This monthly publication contains descriptions of the information received and generated by the US Nuclear Regulatory Commission (NRC). This information includes docketed material associated with civilian nuclear power plants and other uses of radioactive materials, and nondocketed material received and generated by NRC pertinent to its role as a regulatory agency. This series of documents is indexed by a Personal Author Index, a Corporate Source Index, and a Report Number Index. The docketed information includes the information formerly issued through the Department of Energy's Technical Information Center under the title Power Reactor Docket Information (PRDI) and, in addition, information received or generated on other uses of radioactive materials

  9. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  10. THE ACTIVITY OF MIHAIL BEREZOVSCHI REFLECTED IN THE NATIONAL ARCHIVE’S DOCUMENTS DURING THE PERIOD OF BIG ROMANIA

    Directory of Open Access Journals (Sweden)

    BARBANOI HRISTINA

    2016-12-01

    Full Text Available This article comes as a natural continuation of the author's previous scientific publication, "The Activity of Mihail Berezovschi Reflected in the National Archive Documents during Tsarist Bessarabia", this time revealing the information collected from archival sources after the Great Unification of the Romanian Principalities, which took place in 1918. Within this article, I present the results of the analysis of the information drawn from dossiers 205 and 206 of inventory 3, Fund 1135; dossiers 245 and 1135 of inventory 8, Fund 1772; and dossier 631 of inventory 30, Fund 1862, which proved to be truly valuable documents that shed new light on the biography of the great composer, conductor, teacher and priest Mihail Berezovschi. Owing to the historical period to which these documents belong, they are already written in Romanian, unlike the documents analyzed in the article on the activity of M. Berezovschi during Tsarist Bessarabia. At the same time, drawing on some recent sources, the article includes information about the fate of M. Berezovschi's children, which could itself be the subject of separate investigations in archival sources.

  12. Motor Gasoline Market Model documentation report

    International Nuclear Information System (INIS)

    1993-09-01

    The purpose of this report is to define the objectives of the Motor Gasoline Market Model (MGMM), describe its basic approach, and provide detail on model functions. This report is intended as a reference document for model analysts, users, and the general public. The MGMM produces a short-term (6- to 9-month) forecast of demand and price for motor gasoline in the US market; it also calculates end-of-month stock levels. The model is used to analyze certain market behavior assumptions or shocks and to determine their effect on market price, demand, and stock levels.

  13. Nuclear waste issues: a perspectives document

    International Nuclear Information System (INIS)

    Cohen, J.J.; Smith, C.F.; Ciminese, F.J.

    1983-02-01

    This report contains the results of a systematic survey of perspectives on the question of radioactive waste management. Sources of information for this review include the scientific literature, regulatory and government documents, pro-nuclear and anti-nuclear publications, and news media articles. In examining these sources, it has become evident that a major distinction can be made between the optimistic or positive viewpoints and the pessimistic or negative ones; consequently, these form the principal categories for presentation. The perspectives on the radioactive waste management problem have been further classified as relating to the following issue areas: the physical aspects of radiation, longevity, radiotoxicity, the quantity of radioactive wastes, and perceptual factors.

  14. Nuclear waste issues: a perspectives document

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.J.; Smith, C.F.; Ciminese, F.J.

    1983-02-01

    This report contains the results of a systematic survey of perspectives on the question of radioactive waste management. Sources of information for this review include the scientific literature, regulatory and government documents, pro-nuclear and anti-nuclear publications, and news media articles. In examining these sources, it has become evident that a major distinction can be made between the optimistic or positive viewpoints and the pessimistic or negative ones; consequently, these form the principal categories for presentation. The perspectives on the radioactive waste management problem have been further classified as relating to the following issue areas: the physical aspects of radiation, longevity, radiotoxicity, the quantity of radioactive wastes, and perceptual factors.

  15. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Full Text Available Deep learning has recently made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way the brain deals with images and speech. In the field of NLP, topic models are one of the important ways of modeling documents, but they are built on a generative model that clearly does not match the way humans write. In this paper, we propose the Event Model, an unsupervised approach based on the language processing mechanisms studied in neurolinguistics, to model documents. In the Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in those events. The Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words using deep learning. Dimensionality reduction represents a document as a low-dimensional vector through a linear model that is completely different from topic models. The Event Model achieves state-of-the-art results on document retrieval tasks.

  16. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge about the diseases, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at the second level, the presence of rare disease terms in target sources included in the UMLS was analyzed, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.
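
    The kind of coverage and overlap analysis described can be sketched in a few lines. The term sets below are invented stand-ins, not UMLS data; real analyses would pull the terms from UMLS concept mappings:

```python
from itertools import combinations

# Toy term sets standing in for the rare-disease terms each terminology
# contributes (invented data, for illustration only).
terminologies = {
    "MeSH":      {"gaucher disease", "fabry disease", "pompe disease",
                  "niemann-pick disease"},
    "ICD-10":    {"gaucher disease", "fabry disease"},
    "SNOMED CT": {"gaucher disease", "fabry disease", "pompe disease"},
    "OMIM":      {"gaucher disease", "niemann-pick disease"},
}
reference = {"gaucher disease", "fabry disease", "pompe disease",
             "niemann-pick disease", "tay-sachs disease"}

# Coverage: what fraction of the reference list each source represents.
coverage = {name: len(terms & reference) / len(reference)
            for name, terms in terminologies.items()}

# Pairwise overlap between sources (Jaccard index).
overlap = {(a, b): len(terminologies[a] & terminologies[b]) /
                   len(terminologies[a] | terminologies[b])
           for a, b in combinations(terminologies, 2)}

best = max(coverage, key=coverage.get)
print(best, coverage[best])  # with these toy sets, MeSH covers the most terms
```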

  17. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin

    2016-04-06

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  18. Standards for Documenting Finite‐Fault Earthquake Rupture Models

    KAUST Repository

    Mai, Paul Martin; Shearer, Peter; Ampuero, Jean‐Paul; Lay, Thorne

    2016-01-01

    In this article, we propose standards for documenting and disseminating finite‐fault earthquake rupture models, and related data and metadata. A comprehensive documentation of the rupture models, a detailed description of the data processing steps, and facilitating the access to the actual data that went into the earthquake source inversion are required to promote follow‐up research and to ensure interoperability, transparency, and reproducibility of the published slip‐inversion solutions. We suggest a formatting scheme that describes the kinematic rupture process in an unambiguous way to support subsequent research. We also provide guidelines on how to document the data, metadata, and data processing. The proposed standards and formats represent a first step to establishing best practices for comprehensively documenting input and output of finite‐fault earthquake source studies.

  19. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing or multiplexing technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch between the outputs of the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular system embodiment disclosed to illustrate this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends that signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the output from the multiplexer contains counts falling within two distinct segments of the region. By dividing these counts by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
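
    A toy simulation of the ratio principle (not the patented circuit; the pulse-height distribution and window edges are invented): when both analyzers see the same drift through the shared, multiplexed circuitry, the count ratio used for the sulfur determination is preserved:

```python
import random

def window_counts(pulses, lo, hi):
    """Counts a single channel analyzer would report for the window [lo, hi)."""
    return sum(lo <= p < hi for p in pulses)

random.seed(1)
pulses = [random.gauss(5.0, 1.0) for _ in range(10_000)]  # analog pulse heights

# Two SCAs watching different segments of the spectral region.
n1 = window_counts(pulses, 3.0, 5.0)
n2 = window_counts(pulses, 5.0, 7.0)

# A common gain drift scales the pulse heights and both window thresholds by
# the same factor, so each window's counts, and hence the count ratio, are
# unchanged.
g = 1.05
drifted = [g * p for p in pulses]
m1 = window_counts(drifted, 3.0 * g, 5.0 * g)
m2 = window_counts(drifted, 5.0 * g, 7.0 * g)
assert (n1, n2) == (m1, m2)
print(n1 / n2)  # the drift-insensitive ratio
```

    If instead the drift hit only one analyzer's threshold, the ratio would shift, which is exactly the failure mode the multiplexing scheme avoids.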

  20. Terminologie de Base de la Documentation. (Basic Terminology of Documentation).

    Science.gov (United States)

    Commission des Communautes Europeennes (Luxembourg). Bureau de Terminologie.

    This glossary is designed to aid non-specialists whose activities require that they have some familiarity with the terminology of the modern methods of documentation. Definitions have been assembled from various dictionaries, manuals, etc., with particular attention being given to the publications of UNESCO and the International Standards…

  1. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to the development of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could also be used to monitor the onset of diseases such as osteoporosis.

  2. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without running into the problems of separating the antigen-antibody complex from the free antigen. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is shaped so that the solution to be analyzed can reach the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound. [fr]

  3. Reactor operation environmental information document

    Energy Technology Data Exchange (ETDEWEB)

    Haselow, J.S.; Price, V.; Stephenson, D.E.; Bledsoe, H.W.; Looney, B.B.

    1989-12-01

    The Savannah River Site (SRS) produces nuclear materials, primarily plutonium and tritium, to meet the requirements of the Department of Defense. These products have been formed in nuclear reactors that were built during 1950-1955 at the SRS. K, L, and P reactors are three of five reactors that have been used in the past to produce the nuclear materials. All three of these reactors discontinued operation in 1988. Currently, intense efforts are being expended to prepare these three reactors for restart in a manner that protects human health and the environment. To document that restarting the reactors will have minimal impacts on human health and the environment, a three-volume Reactor Operations Environmental Impact Document has been prepared. The document focuses on the impacts of restarting the K, L, and P reactors on both the SRS and surrounding areas. This volume discusses the geology, seismology, and subsurface hydrology. 195 refs., 101 figs., 16 tabs.

  4. Magnetic fusion: Environmental Readiness Document

    International Nuclear Information System (INIS)

    1981-03-01

    Environmental Readiness Documents are prepared periodically to review and evaluate the environmental status of an energy technology during the several phases of development of that technology. Through these documents, the Office of Environment within the Department of Energy provides an independent and objective assessment of the environmental risks and potential impacts associated with the progression of the technology to the next stage of development and with future extensive use of the technology. This Environmental Readiness Document was prepared to assist the Department of Energy in evaluating the readiness of magnetic fusion technology with respect to environmental issues. An effort has been made to identify potential environmental problems that may be encountered based upon current knowledge, proposed and possible new environmental regulations, and the uncertainties inherent in planned environmental research

  5. FLAMMABLE GAS TECHNICAL BASIS DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    KRIPPS, L.J.

    2005-02-18

    This document describes the qualitative evaluation of frequency and consequences for double-shell tank (DST) and single-shell tank (SST) representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant SSCs and/or TSRs were required to prevent or mitigate flammable gas accidents. Discussion of the resulting control decisions is included. This technical basis document was developed to support the Tank Farms Documented Safety Analysis (DSA) and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSCs) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition, based on an evaluation of the event frequency and consequence.

  6. FLAMMABLE GAS TECHNICAL BASIS DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    KRIPPS, L.J.

    2005-03-03

    This document describes the qualitative evaluation of frequency and consequences for DST and SST representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant structures, systems and components (SSCs) and/or technical safety requirements (TSRs) were required to prevent or mitigate flammable gas accidents. Discussion on the resulting control decisions is included. This technical basis document was developed to support WP-13033, Tank Farms Documented Safety Analysis (DSA), and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition based on an evaluation of the event frequency and consequence.

  7. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has many different applications, so it is a very significant and useful tool, which in turn can be dangerous for living beings exposed to uncontrolled doses. However, due to its characteristics, it cannot be perceived by any of the human senses, so radiation detectors and additional devices are required to detect, quantify, and classify it. A multichannel analyzer is responsible for separating the different pulse heights generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of this work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer an application was written in C for the ARM processor. In the second phase, the virtual instrument was developed for the management, control, and visualization of the results. The data obtained from the system were displayed graphically in a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable to those of commercial multichannel analyzers. (Author)
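
    The channel-binning step the abstract describes can be sketched as a software stand-in for the ADC front end and histogram memory (the 10-bit width, voltage range, and peak parameters below are invented for illustration):

```python
import random

def mca_histogram(pulse_heights, n_bits=10, v_max=10.0):
    """An n-bit ADC maps each analog pulse height in [0, v_max) onto one of
    2**n_bits channels; the histogram accumulates counts per channel."""
    n_channels = 2 ** n_bits
    hist = [0] * n_channels
    for v in pulse_heights:
        if 0.0 <= v < v_max:
            hist[int(v / v_max * n_channels)] += 1
    return hist

random.seed(0)
# A toy spectrum: a photopeak at 6.0 V over a flat background.
pulses = [random.uniform(0.0, 10.0) for _ in range(2000)] + \
         [random.gauss(6.0, 0.1) for _ in range(3000)]
spectrum = mca_histogram(pulses)
peak_channel = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak_channel)  # close to int(6.0 / 10.0 * 1024) = 614
```

    The number of ADC bits sets the trade-off: more channels give finer energy resolution but fewer counts per channel for the same acquisition time.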

  8. Drinking water protection plan; a discussion document

    International Nuclear Information System (INIS)

    2001-01-01

    This draft document outlines the plan of action devised by the Government of British Columbia in an effort to safeguard the purity of the drinking water supply in the province, and invites British Columbians to participate in the elaboration of such a plan. This document concentrates on the assessment of the sources of the water supply (watersheds and aquifers) and on measures to ensure the integrity of the system of water treatment and distribution as the principal components of a comprehensive plan to protect drinking water. The proposed plan involves a multi-barrier approach that will use a combination of measures to ensure that water sources are properly managed and waterworks systems provide safe drinking water. New drinking water planning procedures, more effective local influence and authority, enforceable standards, better access to information and public education programs form the essence of the plan. A series of public meetings are scheduled to provide the public at large with opportunities to comment on the government's plan of action and to offer suggestions for additional measures

  9. ELECTRICAL POWER SYSTEM DESCRIPTION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    M. Maniyar

    2004-06-22

    The purpose of this revision of the System Description Document (SDD) is to establish the requirements that drive the design of the electrical power system, and their bases, to allow the design effort to proceed to License Application. This SDD is a living document that will be revised at strategic points as the design matures. This SDD identifies the requirements and describes the system design as they exist at this time, with emphasis on those attributes of the design provided to meet the requirements. This SDD has been developed to be an engineering tool for design control; accordingly, the primary audience is design engineers. This type of SDD both leads and follows the design process. It leads the design process with regard to the flow-down of upper-tier requirements onto the system; knowledge of these requirements is essential to performing the design process. It follows the design with regard to the description of the system: the description provided in the SDD reflects the results of the design process to date. Functional and operational requirements applicable to this system are obtained from "Project Functional and Operational Requirements" (F&OR) (Siddoway 2003). Other requirements to support the design process have been taken from higher-level requirements documents such as the "Project Design Criteria Document" (PDC) (Doraswamy 2004), the fire hazards analyses, and the preclosure safety analysis. The above-mentioned lower-level documents address "Project Requirements Document" (PRD) (Canori and Leitner 2003) requirements. This SDD includes several appendices with supporting information: Appendix B lists key system charts, diagrams, drawings, and lists, and Appendix C is a list of system procedures.

  10. Waste management system requirements document

    International Nuclear Information System (INIS)

    1991-02-01

    This volume defines the top-level requirements for the Mined Geologic Disposal System (MGDS). It is designed to be used in conjunction with Volume 1 of the WMSR, General System Requirements. It provides a functional description expanding the requirements allocated to the MGDS in Volume 1 and elaborates on each requirement by providing associated performance criteria as appropriate. Volumes 1 and 4 of the WMSR provide a minimum set of requirements that must be satisfied by the final MGDS design. This document sets forth specific requirements that must be fulfilled. It is not the intent or purpose of this top-level document to describe how each requirement is to be satisfied in the final MGDS design. Each subsequent level of the technical document hierarchy must provide further guidance and definition as to how each of these requirements is to be implemented in the design; it is expected that each subsequent level of requirements will be significantly more detailed. Section 2 of this volume provides a functional description of the MGDS; each function is addressed in terms of requirements and performance criteria. Section 3 provides a list of controlling documents. Each document cited in a requirement of Section 2 is included in this list and is incorporated into this document as a requirement on the final system. The WMSR addresses only federal requirements (i.e., laws, regulations and DOE orders); state and local requirements are not addressed. However, the potentially affected WMSR requirements specifically note that additional or more stringent requirements could be imposed by a state or local administering agency beyond the cited federal requirements.

  11. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  12. Repository of not readily available documents for project W-320

    Energy Technology Data Exchange (ETDEWEB)

    Conner, J.C.

    1997-04-18

    The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line.

  13. A quantitative reading of competences documents of Law new degrees.

    OpenAIRE

    Leví Orta, Genoveva del Carmen; Ramos Méndez, Eduardo

    2014-01-01

    Documents formulating the competences of degrees are key sources for the analysis, evaluation and comparison of the training profiles currently offered by different university degrees. This work aims to make a quantitative reading of the competence documents of the Law degree from various Spanish universities, based on the ideas of Content Analysis. The methodology has two phases. First, a dictionary of concepts related to the components of competences is identified in the documentary corpus. Next, the corpus...

  14. Repository of not readily available documents for project W-320

    International Nuclear Information System (INIS)

    Conner, J.C.

    1997-01-01

    The purpose of this document is to provide a readily available source of the technical reports needed for the development of the safety documentation provided for the waste retrieval sluicing system (WRSS), designed to remove the radioactive and chemical sludge from tank 241-C-106, and transport that material to double-shell tank 241-AY-102 via a new, temporary, shielded, encased transfer line

  15. Documents of judicial institutions in the 80-90's 18th century in the State Archives of Dnipropetrovsk region

    Directory of Open Access Journals (Sweden)

    Posunko, O. M.

    2016-06-01

    Full Text Available The author analyzes the documents of the judicial instances of the Yekaterinoslav vicegerency in the 1780s-90s. The study is based on the materials of the State Archives of the Dnipropetrovsk region, and their informational value for the history of the region is considered. The data are of interest to historians of Southern and Left-bank Ukraine. For a time, part of the former lands of the Hetmanate belonged to the Yekaterinoslav province; therefore, many cases show the actual application of the norms of Little Russian law, especially in the areas of inheritance and matrimonial law. For the history of Southern Ukraine, the analyzed documents provide information about the development of trade, the formation of landed proprietorship in the region, and the work of various government institutions. They also contain many details of the social history of the region, which is especially valuable given the limited source base.

  16. Document segmentation via oblique cuts

    Science.gov (United States)

    Svendsen, Jeremy; Branzan-Albu, Alexandra

    2013-01-01

    This paper presents a novel solution for the layout segmentation of graphical elements in Business Intelligence documents. We propose a generalization of the recursive X-Y cut algorithm that allows cutting along arbitrary oblique directions. An intermediate processing step consisting of line and solid-region removal is also necessary due to the presence of decorative elements. The output of the proposed segmentation is a hierarchical structure that allows for the identification of primitives in pie and bar charts. The algorithm was tested on a database composed of charts from business documents. Results are very promising.
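
    The classical axis-aligned recursive X-Y cut that the paper generalizes to oblique directions can be sketched with projection profiles (a minimal version; the gap threshold and the synthetic page are illustrative, and real documents need the line/solid-region removal step first):

```python
import numpy as np

def xy_cut(mask, y0=0, x0=0, min_gap=3):
    """Recursive X-Y cut on a binary ink mask: crop to the ink, split at the
    first all-empty horizontal band, else at a vertical one, else emit the
    region as a leaf block (y_top, x_left, y_bottom, x_right)."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return []
    t, b = ys.min(), ys.max() + 1
    l, r = xs.min(), xs.max() + 1
    region = mask[t:b, l:r]
    for axis in (0, 1):                      # 0: horizontal cut, 1: vertical cut
        has_ink = region.any(axis=1 - axis)  # which rows (or columns) hold ink
        gap = first_gap(has_ink, min_gap)
        if gap is not None:
            g0, g1 = gap
            if axis == 0:
                return (xy_cut(region[:g0], y0 + t, x0 + l, min_gap) +
                        xy_cut(region[g1:], y0 + t + g1, x0 + l, min_gap))
            return (xy_cut(region[:, :g0], y0 + t, x0 + l, min_gap) +
                    xy_cut(region[:, g1:], y0 + t, x0 + l + g1, min_gap))
    return [(y0 + t, x0 + l, y0 + b, x0 + r)]

def first_gap(has_ink, min_gap):
    """(start, end) of the first run of >= min_gap consecutive empty entries."""
    run_start, run_len = 0, 0
    for i, ink in enumerate(has_ink):
        if ink:
            run_len = 0
            continue
        if run_len == 0:
            run_start = i
        run_len += 1
        if run_len >= min_gap:
            end = i + 1
            while end < len(has_ink) and not has_ink[end]:
                end += 1
            return run_start, end
    return None

# A synthetic page: a full-width header block above two columns.
page = np.zeros((20, 20), dtype=bool)
page[0:5, :] = True
page[12:20, 0:8] = True
page[12:20, 12:20] = True
print(sorted(xy_cut(page)))  # three leaf blocks
```

    The oblique generalization replaces the axis-aligned projection profiles with profiles taken along arbitrary cut directions, which lets the recursion separate blocks that no horizontal or vertical band can split.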

  17. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique derives the spectrum of a document, in which the document's consideration of quality requirements is represented numerically. We can thus objectively identify whether the quality requirements considered in a requirements document are carried through to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
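The abstract does not give the spectrum computation itself, but the idea can be sketched as a keyword-frequency vector per quality characteristic, compared between the requirements and design documents. The keyword dictionary and the cosine comparison below are illustrative assumptions, not the paper's actual technique.

```python
import math
import re

QUALITY_TERMS = {                      # hypothetical keyword dictionary
    "security":    ["encrypt", "authenticate", "authorize", "audit"],
    "performance": ["latency", "throughput", "response time", "cache"],
    "usability":   ["usable", "intuitive", "accessib", "help"],
}

def spectrum(text):
    """Count occurrences of each quality characteristic's keywords."""
    low = text.lower()
    return {q: sum(len(re.findall(k, low)) for k in kws)
            for q, kws in QUALITY_TERMS.items()}

def cosine(a, b):
    """Similarity of two spectra; 1.0 means the same mix of quality concerns."""
    dot = sum(a[q] * b[q] for q in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

req = spectrum("The system shall encrypt data and keep response time low.")
des = spectrum("The design uses a cache layer; all traffic is encrypted.")
similarity = cosine(req, des)   # high value -> quality concerns carried into design
```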

  18. Biomass Scenario Model Documentation: Data and References

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y.; Newes, E.; Bush, B.; Peterson, S.; Stright, D.

    2013-05-01

    The Biomass Scenario Model (BSM) is a system dynamics model that represents the entire biomass-to-biofuels supply chain, from feedstock to fuel use. The BSM is a complex model that has been used for extensive analyses; the model and its results can be better understood if input data used for initialization and calibration are well-characterized. It has been carefully validated and calibrated against the available data, with data gaps filled in using expert opinion and internally consistent assumed values. Most of the main data sources that feed into the model are recognized as baseline values by the industry. This report documents data sources and references in Version 2 of the BSM (BSM2), which only contains the ethanol pathway, although subsequent versions of the BSM contain multiple conversion pathways. The BSM2 contains over 12,000 total input values, with 506 distinct variables. Many of the variables are opportunities for the user to define scenarios, while others are simply used to initialize a stock, such as the initial number of biorefineries. However, around 35% of the distinct variables are defined by external sources, such as models or reports. The focus of this report is to provide insight into which sources are most influential in each area of the supply chain.

  19. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., "sensors" and "cameras") upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a "search" mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen "intruder detection" probability and elapsed-time criteria (i.e., the program finds the "weakest paths" from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult to duplicate in a numeric and more algorithmically deterministic language such as Fortran. 4 refs
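The "search" function described above can be sketched (in Python rather than Prolog) as an enumeration of routes between two locations, keeping those whose cumulative intruder-detection probability and traversal time satisfy the user-chosen criteria. The facility graph and sensor values below are illustrative, not taken from the report.

```python
# edge: (from, to, detection probability of the sensor on that link, seconds)
EDGES = [
    ("gate", "yard", 0.6, 30),
    ("yard", "vault", 0.9, 60),
    ("gate", "tunnel", 0.1, 120),
    ("tunnel", "vault", 0.2, 90),
]

def neighbors(node):
    return [(b, p, t) for a, b, p, t in EDGES if a == node]

def weak_paths(src, dst, max_p, max_t, path=None, p_nodetect=1.0, t=0):
    """Yield simple paths whose overall detection prob <= max_p and time <= max_t."""
    path = path or [src]
    if src == dst:
        if 1 - p_nodetect <= max_p and t <= max_t:
            yield (path, round(1 - p_nodetect, 3), t)
        return
    for b, p, dt in neighbors(src):
        if b not in path:                          # keep paths simple (no loops)
            yield from weak_paths(b, dst, max_p, max_t,
                                  path + [b], p_nodetect * (1 - p), t + dt)

# the "weakest paths" from gate to vault, from a security point of view
routes = list(weak_paths("gate", "vault", max_p=0.5, max_t=300))
```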

  20. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  1. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  2. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  3. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  4. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film

  5. Integration of clinical research documentation in electronic health records.

    Science.gov (United States)

    Broach, Debra

    2015-04-01

Clinical trials of investigational drugs and devices are often conducted within healthcare facilities concurrently with clinical care. With implementation of electronic health records, new communication methods are required to notify nonresearch clinicians of research participation. This article reviews clinical research source documentation, the electronic health record and the medical record, areas in which the research record and electronic health record overlap, and implications for the research nurse coordinator in documentation of the care of the patient/subject. Incorporation of clinical research documentation in the electronic health record will lead to a more complete patient/subject medical record in compliance with both research and medical records regulations. A literature search provided little information about the inclusion of clinical research documentation within the electronic health record. Although regulations and guidelines define both source documentation and the medical record, integration of research documentation in the electronic health record is not clearly defined. At minimum, the signed informed consent(s), investigational drug or device usage, and research team contact information should be documented within the electronic health record. Institutional policies should define a standardized process for this integration in the absence of federal guidance. Nurses coordinating clinical trials are in an ideal position to define this integration.

  6. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
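The modified Newton-Raphson solution of a nonlinear steady-state problem can be illustrated with a toy example. The sketch below solves a single-node radiative balance, sigma_a*T^4 = Q, reusing a frozen Jacobian across iterations; the problem, the values, and the reading of "modified" as "frozen-slope" are illustrative assumptions, not taken from the SPAR documentation.

```python
def newton_steady_state(q_in=1000.0, sigma_a=5.67e-8, t0=350.0, tol=1e-6):
    """Solve the residual f(T) = sigma_a*T**4 - q_in = 0 for temperature T."""
    t = t0
    dfdt = 4 * sigma_a * t0 ** 3       # Jacobian evaluated once and reused
    for _ in range(500):
        f = sigma_a * t ** 4 - q_in    # residual: radiated minus supplied heat
        if abs(f) < tol:
            break
        t -= f / dfdt                  # Newton update with the frozen slope
    return t

t_eq = newton_steady_state()           # equilibrium temperature, kelvin
```

Freezing the Jacobian trades some convergence speed for not re-factoring the (in the real finite-element case, large) system matrix at every iteration.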

  7. Every document and picture tells a story: using internal corporate document reviews, semiotics, and content analysis to assess tobacco advertising.

    Science.gov (United States)

    Anderson, S J; Dewhirst, T; Ling, P M

    2006-06-01

    In this article we present communication theory as a conceptual framework for conducting documents research on tobacco advertising strategies, and we discuss two methods for analysing advertisements: semiotics and content analysis. We provide concrete examples of how we have used tobacco industry documents archives and tobacco advertisement collections iteratively in our research to yield a synergistic analysis of these two complementary data sources. Tobacco promotion researchers should consider adopting these theoretical and methodological approaches.

  8. Melter Disposal Strategic Planning Document

    Energy Technology Data Exchange (ETDEWEB)

    BURBANK, D.A.

    2000-09-25

    This document describes the proposed strategy for disposal of spent and failed melters from the tank waste treatment plant to be built by the Office of River Protection at the Hanford site in Washington. It describes program management activities, disposal and transportation systems, leachate management, permitting, and safety authorization basis approvals needed to execute the strategy.

  9. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained by combining a PXML-specific technique with a rather simple generic DAG-compression technique.
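The generic DAG-compression idea mentioned above can be sketched in a few lines: identical subtrees of an (X)ML-like tree are stored only once by giving every distinct subtree a canonical id. This illustrates the general technique only, not the paper's implementation.

```python
def compress(tree, table):
    """tree = (label, children); return a shared node id, filling table."""
    label, children = tree
    key = (label, tuple(compress(c, table) for c in children))
    if key not in table:
        table[key] = len(table)        # first time we see this exact subtree
    return table[key]

# a root with two identical <item><text/></item> subtrees
doc = ("root", [("item", [("text", [])]),
                ("item", [("text", [])])])
nodes = {}
root_id = compress(doc, nodes)         # 3 unique DAG nodes replace 5 tree nodes
```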

  10. BASIC Instructional Program: System Documentation.

    Science.gov (United States)

    Dageforde, Mary L.

    This report documents the BASIC Instructional Program (BIP), a "hands-on laboratory" that teaches elementary programming in the BASIC language, as implemented in the MAINSAIL language, a machine-independent revision of SAIL which should facilitate implementation of BIP on other computing systems. Eight instructional modules which make up…

  11. The Darfur Atrocities Documentation Project

    Science.gov (United States)

    Totten, Samuel

    2004-01-01

    One of the many important aspects of the Darfur Atrocities Documentation Project was that it set a precedent for what the U.S. and/or other nations can, and should do, when future cases of potential genocide arise. Far too often in the recent past, the international community (the United Nations, individual governments, many nongovernmental…

  12. Variable & Recode Definitions - SEER Documentation

    Science.gov (United States)

Resources that define variables and provide documentation for reporting using SEER and related datasets. Choose from SEER coding and staging manuals plus instructions for recoding behavior, site, stage, cause of death, insurance, and several additional topics. Also included is guidance on months survived, calculating Hispanic mortality, and site-specific surgery.

  13. Document Delivery: Evaluating the Options.

    Science.gov (United States)

    Ward, Suzanne M.

    1997-01-01

    Discusses options available to libraries for document delivery. Topics include users' needs; cost; copyright compliance; traditional interlibrary loan; types of suppliers; selection criteria, including customer service; new developments in interlibrary loan, including outsourcing arrangements; and the need to evaluate suppliers. (LRW)

  14. The Digital technical documentation handbook

    CERN Document Server

    Schultz, Susan I; Kavanagh, Frank X; Morse, Marjorie J

    1993-01-01

The Digital Technical Documentation Handbook describes the process of developing and producing technical user information at Digital Equipment Corporation. It discusses techniques for making user information more effective; covers the draft and review process, the production and distribution of printed and electronic media, archiving, indexing, testing for usability, and many other topics; and provides quality assurance checklists, a glossary, and a bibliography of resources for technical communicators.

  15. Methodological Aspects of Architectural Documentation

    Directory of Open Access Journals (Sweden)

    Arivaldo Amorim

    2011-12-01

Full Text Available This paper discusses the methodological approach that has been developed in the state of Bahia, Brazil, since 2003 for the documentation of architectural and urban sites using extensive digital technologies. Bahia has a vast territory with important architectural ensembles ranging from the sixteenth century to the present day. As part of this heritage is constructed of raw earth and wood, it is very sensitive to various deleterious agents, and it is therefore critical to document this collection while it is under threat. Diverse digital technologies that could be used in the documentation process are being tried out. The task is being developed as academic research, with few financial resources, by scholarship students and some volunteers. Several technologies are tested, ranging from the simplest to the most sophisticated, across the main stages of the documentation project: overall work planning, data acquisition, processing and management, and ultimately control and evaluation of the work. The activities that motivated this paper are being conducted in the cities of Rio de Contas and Lençóis in the Chapada Diamantina, located 420 km and 750 km from Salvador respectively, in the city of Cachoeira in the Recôncavo Baiano area, 120 km from Salvador, the capital of Bahia state, and in the Pelourinho neighbourhood of the historic capital itself. Part of the material produced can be consulted on the website: www.lcad.ufba.br.

  16. Waste Management System Requirements Document

    International Nuclear Information System (INIS)

    1992-02-01

    This DCP establishes an interim plan for the Office of Civilian Radioactive Waste Management (OCRWM) technical baseline until the results of the OCRWM Document Hierarchy Task Force can be implemented. This plan is needed to maintain continuity in the Program for ongoing work in the areas of Waste Acceptance, Transportation, Monitored Retrievable Storage (MRS) and Yucca Mountain Site Characterization

  17. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  18. Gstruct: a system for extracting schemas from GML documents

    Science.gov (United States)

    Chen, Hui; Zhu, Fubao; Guan, Jihong; Zhou, Shuigeng

    2008-10-01

Geography Markup Language (GML) has become the de facto standard for representing geographic information on the internet. A GML schema provides a way to define the structure, content, and semantics of GML documents. It contains useful structural information about GML documents and plays an important role in storing, querying and analyzing GML data. However, a GML schema is not mandatory, and it is common for a GML document to contain no schema. In this paper, we present Gstruct, a tool for GML schema extraction. Gstruct finds the features in the input GML documents, identifies geometry datatypes as well as simple datatypes, then integrates all these features and eliminates improper components to output an optimal schema. Experiments demonstrate that Gstruct is effective in extracting semantically meaningful schemas from GML documents.
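The core of schema extraction in the spirit described above can be sketched by scanning documents and recording, for each element name, its observed children and inferred text datatypes. Real GML schema inference (geometry types, optionality, element ordering) is far richer; the function and datatype names below are illustrative assumptions.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def infer(doc_xml, schema=None):
    """Record each element's observed child tags and inferred text datatypes."""
    if schema is None:
        schema = defaultdict(lambda: {"children": set(), "types": set()})
    def walk(el):
        for child in el:
            schema[el.tag]["children"].add(child.tag)
            walk(child)
        text = (el.text or "").strip()
        if text:
            try:                                   # crude datatype inference
                float(text)
                schema[el.tag]["types"].add("decimal")
            except ValueError:
                schema[el.tag]["types"].add("string")
    walk(ET.fromstring(doc_xml))
    return schema

s = infer("<City><name>Cachoeira</name><population>32026</population></City>")
```

Feeding several documents through the same `schema` dictionary merges their observations, which mirrors the "integrate all these features" step.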

  19. Sources for charged particles; Les sources de particules chargees

    Energy Technology Data Exchange (ETDEWEB)

    Arianer, J.

    1997-09-01

This document is a basic course on charged particle sources for post-graduate students and for thematic schools on large facilities and accelerator physics. A simple but precise description of the creation and emission of charged particles is presented. The course relies on reference documents that are updated every year. The following relevant topics are considered: electronic emission processes, technological and practical considerations on electron guns, positron sources, production of neutral atoms, ionization, plasma and discharge, different types of positive and negative ion sources, polarized particle sources, materials for the construction of ion sources, and low-energy beam production and transport. (N.T.).

  20. Richland Environmental Restoration Project management action process document

    International Nuclear Information System (INIS)

    1996-04-01

    This document is the prescribed means for providing direct input to the US Department of Energy Headquarters regarding the status, accomplishments, strategy, and issues of the Richland Environmental Restoration Project. The project mission, organizational interfaces, and operational history of the Hanford Site are provided. Remediation strategies are analyzed in detail. The document includes a status of Richland Environmental Restoration project activities and accomplishments, and it presents current cost summaries, schedules, and technical baselines

  1. Richland Environmental Restoration Project management action process document

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This document is the prescribed means for providing direct input to the US Department of Energy Headquarters regarding the status, accomplishments, strategy, and issues of the Richland Environmental Restoration Project. The project mission, organizational interfaces, and operational history of the Hanford Site are provided. Remediation strategies are analyzed in detail. The document includes a status of Richland Environmental Restoration project activities and accomplishments, and it presents current cost summaries, schedules, and technical baselines.

  2. ELECTRICAL SUPPORT SYSTEM DESCRIPTION DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    S. Roy

    2004-06-24

The purpose of this revision of the System Design Description (SDD) is to establish requirements that drive the design of the electrical support system and their bases to allow the design effort to proceed to License Application. This SDD is a living document that will be revised at strategic points as the design matures over time. This SDD identifies the requirements and describes the system design as they exist at this time, with emphasis on those attributes of the design provided to meet the requirements. This SDD has been developed to be an engineering tool for design control. Accordingly, the primary audience/users are design engineers. This type of SDD both "leads" and "trails" the design process. It leads the design process with regard to the flow down of upper tier requirements onto the system. Knowledge of these requirements is essential in performing the design process. The SDD trails the design with regard to the description of the system. The description provided in the SDD is a reflection of the results of the design process to date. Functional and operational requirements applicable to electrical support systems are obtained from the "Project Functional and Operational Requirements" (F&OR) (Siddoway 2003). Other requirements to support the design process have been taken from higher-level requirements documents such as the "Project Design Criteria Document" (PDC) (Doraswamy 2004), and fire hazards analyses. The above-mentioned documents address "Project Requirements Document" (PRD) (Canon and Leitner 2003) requirements. This SDD contains several appendices that include supporting information. Appendix B lists key system charts, diagrams, drawings, and lists, and Appendix C includes a list of system procedures.

  3. SNF AGING SYSTEM DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    L.L. Swanson

    2005-01-01

The purpose of this system description document (SDD) is to establish requirements that drive the design of the spent nuclear fuel (SNF) aging system and associated bases, which will allow the design effort to proceed. This SDD will be revised at strategic points as the design matures. This SDD identifies the requirements and describes the system design, as it currently exists, with emphasis on attributes of the design provided to meet the requirements. This SDD is an engineering tool for design control; accordingly, the primary audience and users are design engineers. This SDD is part of an iterative design process. It leads the design process with regard to the flow down of upper tier requirements onto the system. Knowledge of these requirements is essential in performing the design process. The SDD follows the design with regard to the description of the system. The description provided in the SDD reflects the current results of the design process. Throughout this SDD, the term aging cask applies to vertical site-specific casks and to horizontal aging modules. The term overpack is a vertical site-specific cask that contains a dual-purpose canister (DPC) or a disposable canister. Functional and operational requirements applicable to this system were obtained from "Project Functional and Operational Requirements" (F&OR) (Curry 2004 [DIRS 170557]). Other requirements that support the design process were taken from documents such as "Project Design Criteria Document" (PDC) (BSC 2004 [DES 171599]), "Site Fire Hazards Analyses" (BSC 2005 [DIRS 172174]), and "Nuclear Safety Design Bases for License Application" (BSC 2005 [DIRS 171512]). These documents address requirements in the "Project Requirements Document" (PRD) (Canori and Leitner 2003 [DIRS 166275]). This SDD includes several appendices. Appendix A is a Glossary; Appendix B is a list of key system charts, diagrams, drawings, lists and additional supporting information; and Appendix C is a list of

  4. ELECTRICAL SUPPORT SYSTEM DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    Roy, S.

    2004-01-01

The purpose of this revision of the System Design Description (SDD) is to establish requirements that drive the design of the electrical support system and their bases to allow the design effort to proceed to License Application. This SDD is a living document that will be revised at strategic points as the design matures over time. This SDD identifies the requirements and describes the system design as they exist at this time, with emphasis on those attributes of the design provided to meet the requirements. This SDD has been developed to be an engineering tool for design control. Accordingly, the primary audience/users are design engineers. This type of SDD both "leads" and "trails" the design process. It leads the design process with regard to the flow down of upper tier requirements onto the system. Knowledge of these requirements is essential in performing the design process. The SDD trails the design with regard to the description of the system. The description provided in the SDD is a reflection of the results of the design process to date. Functional and operational requirements applicable to electrical support systems are obtained from the "Project Functional and Operational Requirements" (F&OR) (Siddoway 2003). Other requirements to support the design process have been taken from higher-level requirements documents such as the "Project Design Criteria Document" (PDC) (Doraswamy 2004), and fire hazards analyses. The above-mentioned documents address "Project Requirements Document" (PRD) (Canon and Leitner 2003) requirements. This SDD contains several appendices that include supporting information. Appendix B lists key system charts, diagrams, drawings, and lists, and Appendix C includes a list of system procedures.

  5. Areva - 2011 Reference document; Areva - Document de reference 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

After identifying the person responsible for this document and the statutory auditors, and providing some financial information, this document gives an overview of the risk factors facing the company: legal risks, industrial and environmental risks, operational risks, risks related to large projects, and market and liquidity risks. Then, after recalling the history and evolution of the company and the evolution of its investments over the last five years, it offers an overview of Areva's activities on the markets of nuclear energy and renewable energies, of its clients and suppliers, of its strategy, and of the activities of its different departments. Other information is provided: the company's organization chart, real estate (plants, equipment), an analysis of its financial situation, its research and development policy, the present context, and profit forecasts or estimates, as well as management organization and operation.

  6. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

This research work was carried out to develop an analyzer for gamma camera diagnostics. The analyzer is an electronic system, with hardware and software capabilities, that operates on the four head-position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. The system includes analog processing of the camera's position signals, digitization and subsequent processing of the energy signal in a multichannel analyzer, transmission of data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits comprise an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
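The multichannel-analyzer step described above can be sketched in a few lines: digitized pulse heights, proportional to the deposited energy, are binned into channels to form the energy histogram. The channel count, full-scale voltage, and pulse values below are illustrative, not the instrument's.

```python
def multichannel_histogram(pulse_heights, n_channels=1024, full_scale=10.0):
    """Bin pulse heights (volts) into MCA channels; return counts per channel."""
    counts = [0] * n_channels
    width = full_scale / n_channels
    for v in pulse_heights:
        ch = min(int(v / width), n_channels - 1)   # clamp overflow into last bin
        counts[ch] += 1
    return counts

# three pulses near a photopeak plus one higher-energy event
hist = multichannel_histogram([1.40, 1.41, 1.39, 7.2],
                              n_channels=100, full_scale=10.0)
```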

  7. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method verifies a previous vulnerability assessment and introduces a modeling technique that analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility.

  8. Lifecycle management for nuclear engineering project documents

    International Nuclear Information System (INIS)

    Zhang Li; Zhang Ming; Zhang Ling

    2010-01-01

The documents of a nuclear engineering project are difficult to manage: they are numerous, contain many types of data, have complex interrelationships, and are updated frequently. Moreover, erroneous documents and mistakes seriously threaten project safety and even nuclear safety. To ensure the integrity, accuracy and validity of project documents, document lifecycle theory is applied to build the document center, record center, structure and database of a document lifecycle management system. Lifecycle management is then applied to the documents of nuclear engineering projects, from production to archiving, to satisfy the quality requirements of nuclear engineering projects. (authors)

  9. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

Online Social Networks (OSNs) have during the last years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to building a solid substrate of connections and relationships among people using the Web. In this preliminary work, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  10. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS, described more fully in Section 6.0 below.

  11. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was previously not achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the value of the tremendously useful fundamental engineering data gained. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  12. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also vary the FIA operation modes, giving the instrument the functions of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution with the addition of cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L-1 can be directly determined without any pretreatment. A sample throughput of 30-90 h-1 and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant
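The photometric determination described above rests on a linear calibration over the working range. The sketch below is a minimal illustration (with hypothetical standards and readings, not data from the paper) of fitting such a calibration line by least squares and inverting it to read back an unknown concentration:

```python
# Hypothetical (concentration mg/L, absorbance) pairs for uranium standards.
standards = [(0.0, 0.002), (5.0, 0.101), (10.0, 0.198), (20.0, 0.401)]

# Ordinary least-squares fit of absorbance = slope * conc + intercept.
n = len(standards)
sx = sum(c for c, _ in standards)
sy = sum(a for _, a in standards)
sxx = sum(c * c for c, _ in standards)
sxy = sum(c * a for c, a in standards)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def concentration(absorbance):
    """Invert the calibration line to get mg/L from an absorbance reading."""
    return (absorbance - intercept) / slope

print(round(concentration(0.30), 1))  # roughly 15 mg/L for these made-up standards
```

This assumes a Beer-Lambert (linear) response, which the abstract's wide direct-determination range suggests holds over the instrument's working interval.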

  13. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was previously not achievable, making this technology far superior to the traditional methods used. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the value of the tremendously useful fundamental engineering data gained. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  14. Chandra Source Catalog: User Interface

    Science.gov (United States)

    Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one "master source" entry and one or more "source observation" entries, the details of which are documented on the CSC "Catalog Columns" pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials ("threads") available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.

  15. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
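The split/analyze/meta-analyze workflow the abstract describes can be conveyed in a few lines. The authors demonstrate the procedure in R; the following minimal Python sketch (assuming the per-chunk analysis is a Pearson correlation, pooled with fixed-effect inverse-variance weights on the Fisher's z scale) illustrates the shape of the approach, not their implementation:

```python
import math
import random

def fisher_z(r):
    """Fisher's z-transform of a correlation; var(z) is approximately 1/(n-3)."""
    return 0.5 * math.log((1 + r) / (1 - r))

def split_analyze_meta(pairs, n_chunks=10):
    """Split the data into chunks, analyze each, then meta-analyze the results."""
    size = len(pairs) // n_chunks
    chunks = [pairs[i * size:(i + 1) * size] for i in range(n_chunks)]
    zs, ws = [], []
    for chunk in chunks:
        # "Analyze" step: Pearson correlation within the chunk.
        n = len(chunk)
        mx = sum(x for x, _ in chunk) / n
        my = sum(y for _, y in chunk) / n
        sxy = sum((x - mx) * (y - my) for x, y in chunk)
        sxx = sum((x - mx) ** 2 for x, _ in chunk)
        syy = sum((y - my) ** 2 for _, y in chunk)
        r = sxy / math.sqrt(sxx * syy)
        zs.append(fisher_z(r))
        ws.append(n - 3)  # inverse-variance weight on the z scale
    # "Meta-analyze" step: fixed-effect pooling, then back-transform.
    z_pooled = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    return math.tanh(z_pooled)

# Simulated "big" dataset with a true correlation of about 0.45.
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(5000)]
pairs = [(x, 0.5 * x + random.gauss(0, 1)) for x in xs]
print(round(split_analyze_meta(pairs), 2))
```

Each chunk fits in memory on its own, which is the point of the approach: only the per-chunk summary statistics, not the raw data, are needed at the pooling stage.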

  16. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  17. EIA model documentation: Electricity market module - electricity fuel dispatch

    International Nuclear Information System (INIS)

    1997-01-01

This report documents the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM), as it was used for EIA's Annual Energy Outlook 1997. It replaces previous documentation dated March 1994 and subsequent yearly update revisions. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This document serves four purposes. First, it is a reference document providing a detailed description of the model for reviewers and potential users of the EFD, including energy experts at the Energy Information Administration (EIA), other Federal agencies, state energy agencies, private firms such as utilities and consulting firms, and non-profit groups such as consumer and environmental groups. Second, this report meets the legal requirement of the EIA to provide adequate documentation in support of its statistical and forecast reports. Third, it facilitates continuity in model development by providing documentation which details model enhancements undertaken for AEO97 and since the previous documentation. Last, because the major use of the EFD is to develop forecasts, this documentation explains the calculations, major inputs and assumptions which were used to generate the AEO97 forecasts

  18. EIA model documentation: Electricity market module - electricity fuel dispatch

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-01-01

This report documents the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM), as it was used for EIA's Annual Energy Outlook 1997. It replaces previous documentation dated March 1994 and subsequent yearly update revisions. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components. This document serves four purposes. First, it is a reference document providing a detailed description of the model for reviewers and potential users of the EFD, including energy experts at the Energy Information Administration (EIA), other Federal agencies, state energy agencies, private firms such as utilities and consulting firms, and non-profit groups such as consumer and environmental groups. Second, this report meets the legal requirement of the EIA to provide adequate documentation in support of its statistical and forecast reports. Third, it facilitates continuity in model development by providing documentation which details model enhancements undertaken for AEO97 and since the previous documentation. Last, because the major use of the EFD is to develop forecasts, this documentation explains the calculations, major inputs and assumptions which were used to generate the AEO97 forecasts.

  19. PHOTOGRAPHY AS DOCUMENT: OTLET AND BRIET’S CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Izângela Maria Sansoni Tonello

    2018-04-01

Full Text Available Introduction: The amount and variety of information conveyed in different media and means incite concern, especially in relation to photographic documents, since they are currently a focus of interest in the Information Science field. In this context, this paper emphasizes the role of photographs as sources of information capable of generating knowledge, as well as an important aid for research in different areas. Objective: The main goal of this study was to research the concepts and definitions underpinning the photograph as a document in information units. Methodology: Bibliographic and documentary research. Results: From the meanings of the term document discussed in the literature by the authors studied, it can be affirmed that the photograph meets the assumptions necessary to substantiate it as a document, that is, as a photographic document. Conclusions: It is understood that this study clarifies some issues related to the photograph as a document; however, this proposition raises reflections about the importance of the production context as well as its essential relationship with other documents, so that it is indisputably consolidated as a photographic document.

  20. Analyzing Decision Logs to Understand Decision Making in Serious Crime Investigations.

    Science.gov (United States)

    Dando, Coral J; Ormerod, Thomas C

    2017-12-01

    Objective To study decision making by detectives when investigating serious crime through the examination of decision logs to explore hypothesis generation and evidence selection. Background Decision logs are used to record and justify decisions made during serious crime investigations. The complexity of investigative decision making is well documented, as are the errors associated with miscarriages of justice and inquests. The use of decision logs has not been the subject of an empirical investigation, yet they offer an important window into the nature of investigative decision making in dynamic, time-critical environments. Method A sample of decision logs from British police forces was analyzed qualitatively and quantitatively to explore hypothesis generation and evidence selection by police detectives. Results Analyses revealed diversity in documentation of decisions that did not correlate with case type and identified significant limitations of the decision log approach to supporting investigative decision making. Differences emerged between experienced and less experienced officers' decision log records in exploration of alternative hypotheses, generation of hypotheses, and sources of evidential inquiry opened over phase of investigation. Conclusion The practical use of decision logs is highly constrained by their format and context of use. Despite this, decision log records suggest that experienced detectives display strategic decision making to avoid confirmation and satisficing, which affect less experienced detectives. Application Potential applications of this research include both training in case documentation and the development of new decision log media that encourage detectives, irrespective of experience, to generate multiple hypotheses and optimize the timely selection of evidence to test them.

  1. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff
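The path-probing idea described above is naturally expressed as a graph search. The sketch below uses a hypothetical facility graph and Python rather than the paper's Prolog, but illustrates the same workflow: mark protection elements inactive, then re-enumerate the attack paths that have opened up:

```python
from collections import deque

# Hypothetical facility model (not from the original paper): nodes are
# areas, and each edge is a pathway guarded by a named barrier or sensor.
EDGES = {
    "outside": [("fence", "yard")],
    "yard":    [("door_alarm", "lobby"), ("motion_sensor", "dock")],
    "lobby":   [("badge_reader", "vault")],
    "dock":    [("cage", "vault")],
}

def attack_paths(start, target, inactive):
    """Enumerate loop-free paths that cross only inactive protection
    elements, mimicking the reanalysis done when sensors go down."""
    paths = []
    queue = deque([(start, [start])])
    while queue:
        node, path = queue.popleft()
        for barrier, nxt in EDGES.get(node, []):
            if barrier not in inactive or nxt in path:
                continue  # barrier still active, or we would revisit a node
            if nxt == target:
                paths.append(path + [nxt])
            else:
                queue.append((nxt, path + [nxt]))
    return paths

# With the fence and the dock-side protections down, one path reaches the vault.
print(attack_paths("outside", "vault", {"fence", "motion_sensor", "cage"}))
```

Prolog gets this enumeration essentially for free through backtracking over facts and rules; the breadth-first search here makes the same logic explicit in an imperative setting.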

  2. Wilmar joint market model, Documentation

    International Nuclear Information System (INIS)

    Meibom, P.; Larsen, Helge V.; Barth, R.; Brand, H.; Weber, C.; Voll, O.

    2006-01-01

    The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. The Joint Market Model (JMM) constitutes one of these sub-models. This report documents the Joint Market model (JMM). The documentation describes: 1. The file structure of the JMM. 2. The sets, parameters and variables in the JMM. 3. The equations in the JMM. 4. The looping structure in the JMM. (au)

  3. Consultation document Energy Market Concentrations

    International Nuclear Information System (INIS)

    De Maa, J.; Van Gemert, M.; Karel, A.; La Bastide, G.; Giesbertz, P.; Buijs, F.; Vermeulen, M.; Beusmans, P.

    2006-06-01

This is the second consultation document of the Netherlands Competition Authority (NMa) on the title subject (the first was in 2002). The purpose of the consultation is to involve all the relevant and interested parties in the development of the energy market in the Netherlands and to consult those parties on studies that have been carried out by the NMa so far: (1) defining (possible) relevant markets in the electricity sector, and (2) the vision and opinion of the NMa with respect to mergers and take-overs. Also, the consultation document is a contribution to the response to the letter from the Dutch Minister of Economic Affairs of May 2005 in which the NMa was requested to give an overview of the preconditions with regard to competition and its legal aspects [nl

  4. Endangered Language Documentation and Transmission

    Directory of Open Access Journals (Sweden)

    D. Victoria Rau

    2007-01-01

Full Text Available This paper describes an on-going project on digital archiving Yami language documentation (http://www.hrelp.org/grants/projects/index.php?projid=60. We present a cross-disciplinary approach, involving computer science and applied linguistics, to document the Yami language and prepare teaching materials. Our discussion begins with an introduction to an integrated framework for archiving, processing and developing learning materials for Yami (Yang and Rau 2005), followed by a historical account of Yami language teaching, from a grammatical syllabus (Dong and Rau 2000b) to a communicative syllabus using a multimedia CD as a resource (Rau et al. 2005), to the development of interactive on-line learning based on the digital archiving project. We discuss the methods used and challenges of each stage of preparing Yami teaching materials, and present a proposal for rethinking pedagogical models for e-learning.

  5. Fuel Handling Facility Description Document

    International Nuclear Information System (INIS)

    M.A. LaFountain

    2005-01-01

    The purpose of the facility description document (FDD) is to establish the requirements and their bases that drive the design of the Fuel Handling Facility (FHF) to allow the design effort to proceed to license application. This FDD is a living document that will be revised at strategic points as the design matures. It identifies the requirements and describes the facility design as it currently exists, with emphasis on design attributes provided to meet the requirements. This FDD was developed as an engineering tool for design control. Accordingly, the primary audience and users are design engineers. It leads the design process with regard to the flow down of upper tier requirements onto the facility. Knowledge of these requirements is essential to performing the design process. It trails the design with regard to the description of the facility. This description is a reflection of the results of the design process to date

  6. Documentation requirements for radiation sterilization

    DEFF Research Database (Denmark)

    Miller, A.

    1995-01-01

Several standards have recently been approved or are under development by the standards organizations ISO and CEN in the field of radiation sterilization. Particularly in Europe, these standards define new requirements on some issues, and on other issues they emphasize the necessary documentation for approval of radiation-sterilized products. The impact of these standards on radiation sterilization is discussed, with special attention given to a few special issues, mainly traceability and uncertainty of measurement results.

  7. Wilmar joint market model, Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Meibom, P.; Larsen, Helge V. [Risoe National Lab. (Denmark); Barth, R.; Brand, H. [IER, Univ. of Stuttgart (Germany); Weber, C.; Voll, O. [Univ. of Duisburg-Essen (Germany)

    2006-01-15

    The Wilmar Planning Tool is developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR) supported by EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. The Joint Market Model (JMM) constitutes one of these sub-models. This report documents the Joint Market model (JMM). The documentation describes: 1. The file structure of the JMM. 2. The sets, parameters and variables in the JMM. 3. The equations in the JMM. 4. The looping structure in the JMM. (au)

  8. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$ 3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$ 35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  9. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

The current technology 'COMBUSTIMETRO' aims to examine fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine in a form that is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel intake and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  10. Documentation of Accounting Records in Light of Legislative Innovations

    Directory of Open Access Journals (Sweden)

    K. V. BEZVERKHIY

    2017-05-01

Full Text Available Legislative reforms in accounting aim to simplify accounting records and the compilation of financial reports by business entities, thus improving Ukraine's position in the global Doing Business ranking. This simplification is implied in the changes to the Regulation on Documentation of Accounting Records, entered into force by the Resolution of the Ukrainian Ministry of Finance. The objective of the study is to analyze the legislative innovations involved. A review of the changes in documentation of accounting records is made. A comparative analysis of changes in the Regulation on Documentation of Accounting Records is made by sections: (1) General; (2) Primary documents; (3) Accounting records; (4) Correction of errors in primary documents and accounting records; (5) Organization of document circulation; (6) Storage of documents. Methods of analysis and synthesis are used to separate the differences between the editions of the Regulation on Documentation of Accounting Records. The result of the study has theoretical and practical value for the domestic business enterprise sector.

  11. Extractive Summarisation of Medical Documents

    OpenAIRE

    Abeed Sarker; Diego Molla; Cecile Paris

    2012-01-01

    Background Evidence Based Medicine (EBM) practice requires practitioners to extract evidence from published medical research when answering clinical queries. Due to the time-consuming nature of this practice, there is a strong motivation for systems that can automatically summarise medical documents and help practitioners find relevant information. Aim The aim of this work is to propose an automatic query-focused, extractive summarisation approach that selects informative sentences from medic...

  12. Unsupervised Document Embedding With CNNs

    OpenAIRE

    Liu, Chundi; Zhao, Shunan; Volkovs, Maksims

    2017-01-01

    We propose a new model for unsupervised document embedding. Leading existing approaches either require complex inference or use recurrent neural networks (RNN) that are difficult to parallelize. We take a different route and develop a convolutional neural network (CNN) embedding model. Our CNN architecture is fully parallelizable resulting in over 10x speedup in inference time over RNN models. Parallelizable architecture enables to train deeper models where each successive layer has increasin...

  13. Indian Language Document Analysis and Understanding

    Indian Academy of Sciences (India)

    documents would contain text of more than one script (for example, English, Hindi and the ... O'Gorman and Govindaraju provides a good overview on document image ... word level in bilingual documents containing Roman and Tamil scripts.

  14. Documentation for the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page describes the WARM documentation files and provides links to all documentation files associated with EPA’s Waste Reduction Model (WARM). The page includes a brief summary of the chapters documenting the greenhouse gas emission and energy factors.

  15. EDF group - Reference Document 2007

    International Nuclear Information System (INIS)

    2008-01-01

    The EDF Group is a leading player in the European energy industry, active in all areas of the electricity value chain, from generation to trading and network management. The leader in the French electricity market, the Group also has solid positions in the United Kingdom, Germany and Italy, with a portfolio of 38.5 million European customers and a generation fleet which is unique in the world. It intends to play a major role in the global revival of nuclear and is increasingly active in the gas chain. The Group has a sound business model, evenly balanced between regulated and deregulated activities. Given its R and D capability, its track record and expertise in nuclear, fossil-fired and hydro generation and in renewable energies, together with its energy eco-efficiency offers, EDF is well placed to deliver competitive solutions to reconcile sustainable economic growth and climate preservation. This document is EDF Group's Reference Document and Annual Financial Report for the year 2007. It contains information about Group profile, governance, business, investments, property, plant and equipment, management, financial position, human resources, shareholders, etc. The document includes the 2008 half-year financial report and consolidated financial statements, and the report drafted by the Statutory Auditors

  16. Vision document Energy Market Concentrations

    International Nuclear Information System (INIS)

    De Maa, J.; Van Gemert, M.; Giesbertz, P.; Vermeulen, M.; Beusmans, P.; Te Velthuis, M.; Drahos, M.

    2006-11-01

In June 2006 the second consultation document of the Netherlands Competition Authority (NMa) on the title subject (the first was in 2002) was published. The purpose of the consultation is to involve all the relevant and interested parties in the development of the energy market in the Netherlands and to consult those parties on studies that have been carried out by the NMa so far: (1) defining (possible) relevant markets in the electricity sector, and (2) the vision and opinion of the NMa with respect to mergers and take-overs. Also, the consultation document is a contribution to the response to the letter from the Dutch Minister of Economic Affairs of May 2005 in which the NMa was requested to give an overview of the preconditions with regard to competition and its legal aspects. In this vision document all the relevant parties and stakeholders are informed about the development of energy markets in the Netherlands and abroad. Also an overview is given of the reactions from many stakeholders and involved and interested parties. [nl

  17. Waste Management System Requirement document

    International Nuclear Information System (INIS)

    1990-04-01

    This volume defines the top-level technical requirements for the Monitored Retrievable Storage (MRS) facility. It is designed to be used in conjunction with Volume 1, General System Requirements. Volume 3 provides a functional description expanding the requirements allocated to the MRS facility in Volume 1 and, when appropriate, elaborates on requirements by providing associated performance criteria. Volumes 1 and 3 together convey a minimum set of requirements that must be satisfied by the final MRS facility design without unduly constraining individual design efforts. The requirements are derived from the Nuclear Waste Policy Act of 1982 (NWPA), the Nuclear Waste Policy Amendments Act of 1987 (NWPAA), the Environmental Protection Agency's (EPA) Environmental Standards for the Management and Disposal of Spent Nuclear Fuel (40 CFR 191), the NRC Licensing Requirements for the Independent Storage of Spent Nuclear Fuel and High-Level Radioactive Waste (10 CFR 72), other federal statutory and regulatory requirements, and major program policy decisions. This document sets forth specific requirements that will be fulfilled. Each subsequent level of the technical document hierarchy will be significantly more detailed and provide further guidance and definition as to how each of these requirements will be implemented in the design. Requirements appearing in Volume 3 are traceable into the MRS Design Requirements Document. Section 2 of this volume provides a functional breakdown for the MRS facility. 1 tab.

  18. Review Document: Full Software Trigger

    CERN Document Server

    Albrecht, J; Raven, G

    2014-01-01

    This document presents a trigger system for the upgraded LHCb detector, scheduled to begin operation in 2020. This document serves as input for the internal review towards the "DAQ, online and trigger TDR". The proposed trigger system is implemented entirely in software. In this document we show that track reconstruction of a similar quality to that available in the offline algorithms can be performed on the full inelastic $pp$-collision rate, without prior event selections implemented in custom hardware and without relying upon a partial event reconstruction. A track finding efficiency of 98.8 % relative to offline can be achieved for tracks with $p_T >$ 500 MeV/$c$. The CPU time required for this reconstruction is about 40 % of the available budget. Proof-of-principle selections are presented which demonstrate that excellent performance is achievable using an inclusive beauty trigger, in addition to exclusive beauty and charm triggers. Finally, it is shown that exclusive beauty and charm selections that do not intr...

  19. Mixed waste characterization reference document

    International Nuclear Information System (INIS)

    1997-09-01

    Waste characterization and monitoring are major activities in the management of waste from generation through storage and treatment to disposal. Adequate waste characterization is necessary to ensure safe storage, selection of appropriate and effective treatment, and adherence to disposal standards. For some wastes, characterization objectives can be difficult and costly to achieve. The purpose of this document is to evaluate the costs of characterizing one such waste type, mixed (hazardous and radioactive) waste. For the purpose of this document, waste characterization includes treatment system monitoring, where monitoring is a supplement or substitute for waste characterization. This document establishes a cost baseline for mixed waste characterization and treatment system monitoring requirements from which to evaluate alternatives. The cost baseline established as part of this work includes costs for a thermal treatment technology (i.e., a rotary kiln incinerator), a nonthermal treatment process (i.e., waste sorting, macroencapsulation, and catalytic wet oxidation), and no treatment (i.e., disposal of waste at the Waste Isolation Pilot Plant (WIPP)). The analysis of improvement over the baseline includes an assessment of promising areas for technology development in front-end waste characterization, process equipment, off-gas controls, and monitoring. Based on this assessment, an ideal characterization and monitoring configuration is described that minimizes costs and optimizes the resources required for waste characterization.

  20. Now You See It: Using Documentation to Make Learning Visible in LCs

    Science.gov (United States)

    Mino, Jack J.

    2014-01-01

    The practice of documentation is discussed as a means of making learning visible in the LC classroom. A documentation heuristic consisting of a four-stage cycle was used to capture, analyze, and report what Bass and Eynon (2009) refer to as the "visible evidence of invisible learning" (p. 5). A variety of documentation samples are…