WorldWideScience

Sample records for extracting high-level information

  1. Development of technical information database for high level waste disposal

    International Nuclear Information System (INIS)

    Kudo, Koji; Takada, Susumu; Kawanishi, Motoi

    2005-01-01

A concept design of the high-level waste disposal information database and the disposal technologies information database is explained. The high-level waste disposal information database contains information on technologies, waste, management and rules, R and D, each step of disposal site selection, characteristics of sites, demonstration of disposal technology, design of the disposal site, application for a disposal permit, construction of the disposal site, operation, and closure. Construction of the disposal technologies information system and the geological disposal technologies information system is described. The screen image of the geological disposal technologies information system is shown; from it, users can perform both full-text and attribute-based retrieval. (S.Y.)

  2. Licensing information needs for a high-level waste repository

    International Nuclear Information System (INIS)

    Wright, R.J.; Greeves, J.T.; Logsdon, M.J.

    1985-01-01

The information needs for licensing findings during the development of a repository for high-level waste (HLW) are described. In particular, attention is given to the information needed to demonstrate, for construction authorization purposes: repository constructibility, waste retrievability, waste containment, and waste isolation.

  3. Extraction of transuranic elements from high-level waste

    International Nuclear Information System (INIS)

    Morita, Y.; Kubota, M.; Tani, S.

    1991-01-01

The present study on counter-current continuous extraction and back-extraction offered a promising prospect of separating TRU from HLW by the DIDPA extraction process, which consists of the following three steps: simultaneous extraction of the TRU elements Np, Pu, Am, and Cm (and rare earths) with 0.5 M DIDPA - 0.1 M TBP solvent; back-extraction of the trivalent TRU elements, Am and Cm, with 4 M HNO₃; and back-extraction of the remaining TRU elements, Np and Pu, with oxalic acid. At the extraction step, the temperature should be raised and H₂O₂ should be added several times. The contacting time of the aqueous and organic phases is the most important parameter for Np extraction. Raising the temperature at the first back-extraction step also improves the recovery of Am and Cm. The back-extraction of Np with oxalic acid is a simple process that requires no change of the Np oxidation state. A small part of the Ru remained in the used solvent, but its concentration was low enough to have no influence on recycling the solvent several times. (author)

  4. High Level Information Fusion (HLIF) with nested fusion loops

    Science.gov (United States)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Over the years, some common frameworks have emerged for dealing with information fusion; perhaps the most ubiquitous is the initial four-level data fusion model of the JDL Data Fusion Group. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analysis within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address the challenges of high-level information fusion and to handle bias, ambiguity, and uncertainty (BAU) for situation modeling, threat modeling, and threat prediction. It combines the JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high-level information fusion (HLIF) to affect lower levels without double counting of information or other biasing issues. The initial FURNACE project focused on the underlying algorithms needed to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but will also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  5. Spent nuclear fuel project high-level information management plan

    Energy Technology Data Exchange (ETDEWEB)

    Main, G.C.

    1996-09-13

This document presents the results of the Spent Nuclear Fuel Project (SNFP) Information Management Planning Project (IMPP), a short-term project that identified information management (IM) issues and opportunities within the SNFP and outlined a high-level plan to address them. This high-level plan for SNFP IM focuses on specific examples from within the SNFP. The plan's recommendations can be characterized in several ways. Some recommendations address specific challenges that the SNFP faces. Others form the basis for making smooth transitions in several important IM areas. Still others identify areas where further study and planning are indicated. The team's knowledge of developments in the IM industry and at the Hanford Site was crucial in deciding where to recommend that the SNFP act and where it should wait for Site plans to be made. Because of the fast pace of the SNFP and demands on SNFP staff, input and interaction were primarily between the IMPP team and members of the SNFP Information Management Steering Committee (IMSC). Key input to the IMPP came from a workshop at which IMSC members and their delegates developed a set of draft IM principles. These principles, described in Section 2, became the foundation for the recommendations found in the transition plan outlined in Section 5. Availability of SNFP staff was limited, so project documents were used as a basis for much of the work. The team, realizing that the status of the project and the environment are continually changing, tried to keep abreast of major developments since those documents were generated. To the extent possible, the information contained in this document is current as of the end of fiscal year (FY) 1995. Programs and organizations on the Hanford Site as a whole are trying to maximize their return on IM investments; they are coordinating IM activities and trying to leverage existing capabilities. However, the SNFP cannot rely solely on Sitewide activities to meet its IM requirements.

  6. Solvent extraction in the treatment of acidic high-level liquid waste : where do we stand?

    International Nuclear Information System (INIS)

    Horwitz, E. P.; Schulz, W. W.

    1998-01-01

During the last 15 years, a number of solvent extraction/recovery processes have been developed for the removal of the transuranic elements, ⁹⁰Sr, and ¹³⁷Cs from acidic high-level liquid waste. These processes are based on the use of a variety of both acidic and neutral extractants. This chapter will present an overview and analysis of the various extractants and flowsheets developed to treat acidic high-level liquid waste streams. The advantages and disadvantages of each extractant, along with comparisons of the individual systems, are discussed.

  7. Separation of transuranium elements from high-level waste by extraction with diisodecyl phosphoric acid

    International Nuclear Information System (INIS)

    Morita, Y.; Kubota, M.; Tani, S.

    1991-01-01

Separation of transuranic elements (TRU) by extraction with diisodecyl phosphoric acid (DIDPA) has been studied to develop a partitioning process for high-level waste (HLW). In the present study, experiments on counter-current continuous extraction and back-extraction using a miniature mixer-settler were carried out to find the optimum process conditions for the separation of Np, initially in the pentavalent state, and to examine the extraction behavior of fission and corrosion products. (J.P.N.)

  8. Recent developments in the extraction separation method for treatment of high-level liquid waste

    International Nuclear Information System (INIS)

    Jiao Rongzhou; Song Chongli; Zhu Yongjun

    2000-01-01

A description and review of recent developments in the extraction separation method for partitioning transuranium elements from high-level liquid waste (HLLW) is presented. Extraction separation processes such as the TRUEX, DIAMEX, DIDPA, CTH, and TRPO processes are briefly discussed.

  9. High level cognitive information processing in neural networks

    Science.gov (United States)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized herein, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  10. Demonstration of Caustic-Side Solvent Extraction with Savannah River Site High Level Waste

    International Nuclear Information System (INIS)

    Walker, D.D.

    2001-01-01

Researchers successfully demonstrated the chemistry and process equipment of the Caustic-Side Solvent Extraction (CSSX) flowsheet for the decontamination of high-level waste using a 33-stage, 2-cm centrifugal contactor apparatus at the Savannah River Technology Center. This represents the first CSSX process demonstration using Savannah River Site (SRS) high-level waste. Three tests lasting 6, 12, and 48 hours processed simulated average SRS waste, simulated Tank 37H/44F composite waste, and Tank 37H/44F high-level waste, respectively.

  11. Fundamental study on the extraction of transuranium elements from high-level liquid waste

    International Nuclear Information System (INIS)

    Kubota, Masumitsu; Morita, Yasuji; Tochiyama, Osamu; Inoue, Yasushi.

    1988-01-01

A great many extractants have been studied for the separation of transuranium elements. The present study deals with the survey and classification of the extractants appearing in the literature, bearing in mind the relationship between the molecular structure of extractants and their extractability for the transuranium elements, from the standpoint of their selective separation from high-level liquid waste (HLW) generated from fuel reprocessing. The extractants surveyed were classified into six groups: unidentate neutral organophosphorus compounds, bidentate neutral organophosphorus compounds, acidic organophosphorus compounds, amines and ammonium salts, N,N-disubstituted amides, and other compounds. These extractants are not always applicable to the separation of transuranium elements from HLW because of their limitations in extractability and radiation durability. Only a limited number of extractants belonging to the bidentate neutral organophosphorus compounds and the acidic organophosphorus compounds are considered suitable for the present purpose. (author)

  12. Information Fusion for High Level Situation Assessment and Prediction

    National Research Council Canada - National Science Library

    Ji, Qiang

    2007-01-01

    .... In addition, we developed algorithms for performing active information fusion to improve both fusion accuracy and efficiency so that decision making and situation assessment can be made in a timely and efficient manner...

  13. Optimization of TRPO process parameters for americium extraction from high level waste

    International Nuclear Information System (INIS)

    Chen Jing; Wang Jianchen; Song Chongli

    2001-01-01

The numerical calculations for Am multistage fractional extraction by trialkyl phosphine oxide (TRPO) were verified by a hot test. High-level waste (HLW) at 1750 L/t-U was used as the feed to the TRPO process. The analysis used a simple objective function that minimizes the total waste content in the TRPO process streams. Some process parameters were optimized after other parameters had been selected. The optimal process parameters for Am extraction by TRPO are: 10 stages for extraction and 2 stages for scrubbing; a flow-rate ratio of 0.931 for extraction and 4.42 for scrubbing; and a nitric acid concentration of 1.35 mol/L for the feed and 0.5 mol/L for the scrubbing solution. Finally, the nitric acid and Am concentration profiles in the optimal TRPO extraction process are given.
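The stage-count and flow-ratio trade-offs above can be illustrated with the classical Kremser relation for ideal counter-current extraction. The distribution ratio and flow ratio in this sketch are invented for illustration and are not parameters from the study:

```python
# Kremser-style estimate of the solute fraction left in the aqueous
# raffinate after N counter-current extraction stages. The distribution
# ratio D and the flow ratio are illustrative assumptions, not values
# from the TRPO hot test.

def fraction_remaining(D, org_to_aq_ratio, n_stages):
    """Fraction of solute left in the aqueous phase after n_stages
    of ideal counter-current extraction (Kremser equation)."""
    E = D * org_to_aq_ratio  # extraction factor
    if abs(E - 1.0) < 1e-12:
        return 1.0 / (n_stages + 1)
    return (E - 1.0) / (E ** (n_stages + 1) - 1.0)

# Even a modest distribution ratio gives deep removal over 10 stages,
# which is why stage count is a key design parameter.
phi = fraction_remaining(D=2.0, org_to_aq_ratio=1.0, n_stages=10)
print(f"fraction remaining: {phi:.2e}")
```

Under these assumed values the raffinate retains well under 0.1% of the americium, consistent with the idea that adding stages is far cheaper than raising the distribution ratio.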

  14. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme, based on region segmentation and semantic segmentation, is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one "sees" in a scene depends on the scene itself (region segmentation) as well as on the cognitive task (semantic segmentation) at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties, and their definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to...
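The change-detection step described in this abstract can be sketched as thresholded frame differencing. The noise model and threshold below are assumptions for illustration, not the paper's actual strategy:

```python
import numpy as np

# Minimal frame-difference change detector in the spirit of the
# semantic-partition step described above: pixels whose intensity
# change exceeds a noise-derived threshold are marked as "changed".
# The 3-sigma threshold on an assumed noise level is illustrative.

def change_mask(prev_frame, curr_frame, noise_sigma=2.0):
    """Boolean mask of pixels whose change exceeds 3 * noise_sigma."""
    diff = np.abs(curr_frame.astype(np.int32) - prev_frame.astype(np.int32))
    return diff > 3 * noise_sigma

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1:3, 1:3] = 50          # a moving object brightens a 2x2 patch
mask = change_mask(prev, curr)
print(mask.sum())            # 4 changed pixels
```

Casting to a signed type before differencing avoids unsigned wrap-around; a real system would estimate the noise level per camera rather than assume it.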

  15. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.
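The abstract does not describe the system's combination logic, but one common form of "meta-extraction" can be illustrated as majority voting over the outputs of several hypothetical entity extractors (the extractor outputs below are invented):

```python
from collections import Counter

# Sketch of a meta-extraction step: merge named-entity labels proposed
# by several independent extractors via majority vote. The extractor
# outputs are invented for illustration; the actual system's
# combination logic is not described in the abstract.

def merge_entities(extractor_outputs):
    """extractor_outputs: list of dicts mapping entity span -> label.
    Returns spans whose (span, label) pair won a strict majority."""
    votes = Counter()
    for output in extractor_outputs:
        for span, label in output.items():
            votes[(span, label)] += 1
    majority = len(extractor_outputs) // 2 + 1
    return {span: label for (span, label), n in votes.items() if n >= majority}

outputs = [
    {"Savannah River": "LOCATION", "Walker": "PERSON"},
    {"Savannah River": "LOCATION", "Walker": "ORGANIZATION"},
    {"Savannah River": "LOCATION", "Walker": "PERSON"},
]
print(merge_entities(outputs))
```

Here two of three extractors agree that "Walker" is a person, so that label survives while the minority "ORGANIZATION" vote is dropped.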

  16. Actinide separation of high-level waste using solvent extractants on magnetic microparticles

    International Nuclear Information System (INIS)

    Nunez, L.; Buchholz, B.A.; Kaminski, M.; Aase, S.B.; Brown, N.R.; Vandegrift, G.F.

    1994-01-01

Polymer-coated ferromagnetic particles with an adsorbed layer of octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide (CMPO) diluted with tributyl phosphate (TBP) are being evaluated for application in the separation and recovery of low concentrations of americium and plutonium from nuclear waste solutions. Owing to their chemical nature, these extractants selectively complex americium and plutonium contaminants onto the particles, which can then be recovered from the waste solution using a magnet. The effectiveness of the extractant-coated particles at removing transuranics (TRU) from simulated solutions and various nitric acid solutions was measured by gamma and liquid scintillation counting of plutonium and americium. The HNO₃ concentration range was 0.01 M to 6 M. The partition coefficients (K_d) for various actinides at 2 M HNO₃ were determined to be between 3,000 and 30,000. These values are larger than those projected for TRU recovery by traditional liquid/liquid extraction. Results from transmission electron microscopy indicated a strong dependence of K_d on the location of the magnetite within the polymer and on the polymer surface area. Energy-dispersive spectroscopy demonstrated homogeneous metal complexation on the polymer surface with no metal clustering. The radiolytic stability of the particles was determined using ⁶⁰Co gamma irradiation under various conditions. The results showed that K_d depends more strongly on the nitric acid dissolution rate of the magnetite than on the gamma irradiation dose. Results of actinide separation from simulated high-level waste representative of that at various DOE sites are also discussed.
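The partition coefficients quoted above follow from the standard batch-contact definition: activity sorbed per gram of particles divided by activity remaining per milliliter of solution. The count rates, volume, and particle mass in this sketch are illustrative, not measured values:

```python
# Batch-contact estimate of the partition coefficient K_d (mL/g).
# All numbers below are invented for illustration; only the formula
# is standard.

def partition_coefficient(counts_initial, counts_final, solution_ml, particle_g):
    """K_d = (activity sorbed per g of solid) / (activity per mL of solution)."""
    sorbed = counts_initial - counts_final   # activity taken up by particles
    return (sorbed / particle_g) / (counts_final / solution_ml)

kd = partition_coefficient(counts_initial=10000.0, counts_final=100.0,
                           solution_ml=10.0, particle_g=0.033)
print(f"K_d = {kd:.0f} mL/g")
```

With these assumed numbers, removing 99% of the activity onto 33 mg of particles in 10 mL corresponds to K_d of about 30,000, the top of the range reported in the abstract.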

  17. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance. While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and video...

  18. From GPS tracks to context: Inference of high-level context information through spatial clustering

    OpenAIRE

    Moreira, Adriano; Santos, Maribel Yasmina

    2005-01-01

Location-aware applications use the location of users to adapt their behaviour and to select the information relevant to users in a particular situation. This location information is obtained through a set of location sensors, or from network-based location services, and is often used directly, without any further processing, as a parameter in a selection process. In this paper we propose a method to infer high-level context information from a series of position records obtained from a GPS receiver...

  19. Collaboration, Automation, and Information Management at Hanford High Level Radioactive Waste (HLW) Tank Farms

    International Nuclear Information System (INIS)

    Aurah, Mirwaise Y.; Roberts, Mark A.

    2013-01-01

Washington River Protection Solutions (WRPS), operator of the High-Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is taking a more than 20-year leap in technology: replacing systems that were monitored with clipboards and obsolete computer systems, and solving major operations and maintenance hurdles in the areas of process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.

  20. A novel quantum information hiding protocol based on entanglement swapping of high-level Bell states

    International Nuclear Information System (INIS)

    Xu Shu-Jiang; Wang Lian-Hai; Chen Xiu-Bo; Niu Xin-Xin; Yang Yi-Xian

    2015-01-01

    Using entanglement swapping of high-level Bell states, we first derive a covert layer between the secret message and the possible output results of the entanglement swapping between any two generalized Bell states, and then propose a novel high-efficiency quantum information hiding protocol based on the covert layer. In the proposed scheme, a covert channel can be built up under the cover of a high-level quantum secure direct communication (QSDC) channel for securely transmitting secret messages without consuming any auxiliary quantum state or any extra communication resource. It is shown that this protocol not only has a high embedding efficiency but also achieves a good imperceptibility as well as a high security. (paper)
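For reference, the "high-level Bell states" are the d-level generalized Bell states. The definition below is the standard textbook form; the paper's index and phase conventions may differ:

```latex
% Generalized (d-level) Bell states:
\[
  |\psi_{mn}\rangle \;=\; \frac{1}{\sqrt{d}} \sum_{j=0}^{d-1}
  e^{2\pi i\, jm/d}\, |j\rangle \otimes |(j+n) \bmod d\rangle,
  \qquad m, n \in \{0, \dots, d-1\}.
\]
```

Entanglement swapping of two such pairs, $|\psi_{m_1 n_1}\rangle_{12} \otimes |\psi_{m_2 n_2}\rangle_{34}$, followed by a generalized Bell measurement on particles 2 and 3, projects particles 1 and 4 into another generalized Bell state whose indices are fixed (mod d, up to phases) by $m_1, n_1, m_2, n_2$ and the measurement outcome; the protocol's covert layer exploits this correlation between the chosen input states and the possible swapping outcomes.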

  1. Next Generation Extractants for Cesium Separation from High-Level Waste: From Fundamental Concepts to Site Implementation

    International Nuclear Information System (INIS)

Moyer, Bruce A.; Bazelaire, Eve; Bonnesen, Peter V.; Bryan, Jeffrey C.; Delmau, Laetitia H.; Engle, Nancy L.; Gorbunova, Maryna G.; Keever, Tamara J.; Levitskaia, Tatiana G.; Sachleben, Richard A.; Tomkins, Bruce A.; Bartsch, Richard A.

    2004-01-01

General project objectives. This project seeks a fundamental understanding and major improvement in cesium separation from high-level waste by cesium-selective calixcrown extractants. Systems of particular interest involve novel solvent-extraction systems containing specific members of the calix[4]arene-crown-6 family, alcohol solvating agents, and alkylamines. Questions being addressed pertain to cesium binding strength, extraction selectivity, cesium stripping, and extractant solubility. Enhanced properties in this regard will specifically benefit cleanup projects funded by the USDOE Office of Environmental Management to treat and dispose of high-level radioactive wastes currently stored in underground tanks at the Savannah River Site (SRS), the Hanford Site, and the Idaho National Engineering and Environmental Laboratory [1]. The most direct beneficiary will be the SRS Salt Processing Project, which has recently identified the Caustic-Side Solvent Extraction (CSSX) process employing a calixcrown as its preferred technology for cesium removal from SRS high-level tank waste [2]. This technology owes its development in part to fundamental results obtained in this program.

  2. Next Generation Extractants for Cesium Separation from High-Level Waste: From Fundamental Concepts to Site Implementation

    International Nuclear Information System (INIS)

    Moyer, Bruce A; Bazelaire, Eve; Bonnesen, Peter V.; Bryan, Jeffrey C.; Delmau, Laetitia H.; Engle, Nancy L.; Gorbunova, Maryna G.; Keever, Tamara J.; Levitskaia, Tatiana G.; Sachleben, Richard A.; Tomkins, Bruce A.; Bartsch, Richard A.; Talanov, Vladimir S.; Gibson, Harry W.; Jones, Jason W.; Hay, Benjamin P.

    2003-01-01

This project seeks a fundamental understanding and major improvement in cesium separation from high-level waste by cesium-selective calixcrown extractants. Systems of particular interest involve novel solvent-extraction systems containing specific members of the calix[4]arene-crown-6 family, alcohol solvating agents, and alkylamines. Questions being addressed pertain to cesium binding strength, extraction selectivity, cesium stripping, and extractant solubility. Enhanced properties in this regard will specifically benefit cleanup projects funded by the USDOE Office of Environmental Management to treat and dispose of high-level radioactive wastes currently stored in underground tanks at the Savannah River Site (SRS), the Hanford Site, and the Idaho National Engineering and Environmental Laboratory [1]. The most direct beneficiary will be the SRS Salt Processing Project, which has recently identified the Caustic-Side Solvent Extraction (CSSX) process employing a calixcrown as its preferred technology for cesium removal from SRS high-level tank waste [2]. This technology owes its development in part to fundamental results obtained in this program.

  3. High-Level Antimicrobial Efficacy of Representative Mediterranean Natural Plant Extracts against Oral Microorganisms

    Directory of Open Access Journals (Sweden)

    Lamprini Karygianni

    2014-01-01

Nature is an unexplored reservoir of novel phytopharmaceuticals. Since biofilm-related oral diseases often correlate with antibiotic resistance, plant-derived antimicrobial agents could enhance existing treatment options. Therefore, the rationale of the present report was to examine the antimicrobial impact of Mediterranean natural extracts on oral microorganisms. Five different extracts from Olea europaea, mastic gum, and Inula viscosa were tested against ten bacteria and one Candida albicans strain. The extraction protocols were conducted according to established experimental procedures. Two antimicrobial assays were applied: the minimum inhibitory concentration (MIC) assay and the minimum bactericidal concentration (MBC) assay. The screened extracts were found to be active against each of the tested microorganisms. O. europaea presented MIC and MBC ranges of 0.07–10.00 mg mL⁻¹ and 0.60–10.00 mg mL⁻¹, respectively. The MBC ranges for mastic gum and I. viscosa were 0.07–10.00 mg mL⁻¹ and 0.15–10.00 mg mL⁻¹, respectively. Extracts were less effective against C. albicans and exerted bactericidal effects at a concentration range of 0.07–5.00 mg mL⁻¹ on strict anaerobic bacteria (Porphyromonas gingivalis, Prevotella intermedia, Fusobacterium nucleatum, and Parvimonas micra). Ethyl acetate I. viscosa extract and total mastic extract showed considerable antimicrobial activity against oral microorganisms and could therefore be considered as alternative natural anti-infectious agents.
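The MIC read-out underlying the values above can be illustrated with a two-fold dilution series: the MIC is the lowest tested concentration at which no visible growth occurs. The concentrations and growth flags in this sketch are invented for illustration:

```python
# Reading a minimum inhibitory concentration (MIC) off a two-fold
# dilution series. The series and growth observations below are
# invented, not data from the study.

def mic(concentrations_mg_ml, growth_observed):
    """Lowest concentration with no growth; None if growth at all levels."""
    inhibitory = [c for c, grew in zip(concentrations_mg_ml, growth_observed)
                  if not grew]
    return min(inhibitory) if inhibitory else None

concs = [10.0, 5.0, 2.5, 1.25, 0.625, 0.3125]      # two-fold series, mg/mL
growth = [False, False, False, False, True, True]  # growth at the two lowest
print(mic(concs, growth))  # 1.25
```

The MBC is determined analogously, but by subculturing each clear well and asking at which concentration the organism was actually killed rather than merely inhibited.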

  4. Site characterization information needs for a high-level waste geologic repository

    International Nuclear Information System (INIS)

    Gupta, D.C.; Nataraja, M.S.; Justus, P.S.

    1987-01-01

At each of the three candidate sites recommended for characterization for development of a high-level waste geologic repository, the DOE has proposed to conduct both surface-based testing and in situ exploration and testing at the depths at which wastes would be emplaced. The basic information needs, and consequently the planned surface-based and in situ testing program, will be governed to a large extent by the amount of credit taken for individual components of the geologic repository in meeting the performance objectives and siting criteria. Therefore, the information to be acquired from site characterization activities should be commensurate with DOE's assigned performance goals for the repository system components on a site-specific basis. Because of the uncertainties likely to be associated with the initial assignment of performance goals, the information needs should be identified both reasonably and conservatively.

  5. High-level radioactive waste disposal: Key geochemical issues and information needs for site characterization

    International Nuclear Information System (INIS)

    Brooks, D.J.; Bembia, P.J.; Bradbury, J.W.; Jackson, K.C.; Kelly, W.R.; Kovach, L.A.; Mo, T.; Tesoriero, J.A.

    1986-01-01

Geochemistry plays a key role in determining the potential of a high-level radioactive waste disposal site for long-term radionuclide containment and isolation. The Nuclear Regulatory Commission (NRC) has developed a set of issues and information needs important for characterizing geochemistry at the potential sites being investigated by the Department of Energy's Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigations project, and Salt Repository Project. The NRC site issues and information needs consider (1) the geochemical environment of the repository, (2) changes to the initial geochemical environment caused by construction and waste emplacement, and (3) interactions that affect the transport of waste radionuclides to the accessible environment. The development of these issues and information needs supports the ongoing effort of the NRC to identify and address areas of geochemical data uncertainty during prelicensing interactions.

  6. Determination of Np, Pu and Am in high level radioactive waste with extraction-liquid scintillation counting

    International Nuclear Information System (INIS)

    Yang Dazhu; Zhu Yongjun; Jiao Rongzhou

    1994-01-01

A new method for the determination of the transuranium elements Np, Pu, and Am by extraction-liquid scintillation counting has been studied systematically. Procedures for the separation of Pu and Am by HDEHP-TRPO extraction and for the separation of Np by TTA-TiOA extraction have been developed, by which the recoveries of Np, Pu, and Am are 97%, 99%, and 99%, respectively, and the decontamination factors for the major fission products (⁹⁰Sr, ¹³⁷Cs, etc.) are 10⁴-10⁶. The pulse shape discrimination (PSD) technique has been introduced to liquid scintillation counting, by which the counting efficiency for α-activity is >99% and the rejection of β-counts is >99.95%. This new method, combining extraction and pulse shape discrimination with the liquid scintillation technique, has been successfully applied to the assay of Np, Pu, and Am in high-level radioactive waste. (author) 7 refs.; 7 figs.; 4 tabs.

  7. Caesium extraction from acidic high level liquid wastes with functionalized calixarenes

    International Nuclear Information System (INIS)

    Simon, N.; Eymard, S.; Tournois, B.; Dozol, J.F.

    2000-01-01

In the framework of the French law programme, studies are under way to selectively remove caesium from acidic high-activity wastes. Calix[4]arene crown derivatives exhibit outstanding efficiency and selectivity for caesium. An optimisation of the formulation of a selective extractant system for Cs, based on crown calixarenes and usable in a liquid-liquid extraction process, is presented. A system involving a monoamide as a modifier is proposed. Besides these improvements, a reference solvent based on a standard 1,3-di-(n-octyloxy)-2,4-calix[4]arene crown is studied. Flow-sheets related to this system are calculated and are easily transferable to the optimised new system. (authors)

  8. Polarity-specific high-level information propagation in neural networks.

    Science.gov (United States)

    Lin, Yen-Nan; Chang, Po-Yen; Hsiao, Pao-Yueh; Lo, Chung-Chuan

    2014-01-01

    Analyzing the connectome of a nervous system provides valuable information about the functions of its subsystems. Although much has been learned about the architectures of neural networks in various organisms by applying analytical tools developed for general networks, two distinct and functionally important properties of neural networks are often overlooked. First, neural networks are endowed with polarity at the circuit level: Information enters a neural network at input neurons, propagates through interneurons, and leaves via output neurons. Second, many functions of nervous systems are implemented by signal propagation through high-level pathways involving multiple and often recurrent connections rather than by the shortest paths between nodes. In the present study, we analyzed two neural networks: the somatic nervous system of Caenorhabditis elegans (C. elegans) and the partial central complex network of Drosophila, in light of these properties. Specifically, we quantified high-level propagation in the vertical and horizontal directions: the former characterizes how signals propagate from specific input nodes to specific output nodes and the latter characterizes how a signal from a specific input node is shared by all output nodes. We found that the two neural networks are characterized by very efficient vertical and horizontal propagation. In comparison, classic small-world networks show a trade-off between vertical and horizontal propagation; increasing the rewiring probability improves the efficiency of horizontal propagation but worsens the efficiency of vertical propagation. Our result provides insights into how the complex functions of natural neural networks may arise from a design that allows them to efficiently transform and combine input signals.
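The vertical and horizontal propagation measures can be illustrated on a toy directed network. This sketch uses simplified shortest-path-based stand-ins for the paper's metrics, and the example wiring is invented:

```python
from collections import deque

# Toy illustration of "vertical" vs "horizontal" propagation on a small
# directed network. Vertical efficiency is taken here as the mean of
# 1/distance from each input to each output; horizontal spread as the
# fraction of outputs reachable from one input. These are simplified
# stand-ins for the study's metrics.

def distances_from(graph, src):
    """BFS hop counts from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

def vertical_efficiency(graph, inputs, outputs):
    total = 0.0
    for i in inputs:
        dist = distances_from(graph, i)
        total += sum(1.0 / dist[o] for o in outputs if o in dist and dist[o] > 0)
    return total / (len(inputs) * len(outputs))

def horizontal_spread(graph, src, outputs):
    dist = distances_from(graph, src)
    return sum(1 for o in outputs if o in dist) / len(outputs)

# input -> interneurons -> outputs, with one recurrent edge (b -> a)
g = {"in1": ["a"], "a": ["b", "out1"], "b": ["a", "out2"]}
print(vertical_efficiency(g, ["in1"], ["out1", "out2"]))  # (1/2 + 1/3)/2
print(horizontal_spread(g, "in1", ["out1", "out2"]))      # 1.0
```

Rewiring edges at random, as in a small-world model, tends to raise one of these two quantities at the expense of the other, which is the trade-off the abstract describes.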

  9. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  10. Separation of aromatic precipitates from simulated high level radioactive waste by hydrolysis, evaporation and liquid-liquid extraction

    International Nuclear Information System (INIS)

    Young, S.R.; Shah, H.B.; Carter, J.T.

    1991-01-01

    The Defense Waste Processing Facility (DWPF) at the SRS will be the United States' first facility to process high-level radioactive waste (HLW) into a borosilicate glass matrix. The removal of aromatic precipitates by hydrolysis, evaporation and liquid-liquid extraction will be a key step in the processing of the HLW. This step, titled the Precipitate Hydrolysis Process, has been demonstrated by the Savannah River Laboratory with the Precipitate Hydrolysis Experimental Facility (PHEF). The mission of the PHEF is to demonstrate processing of simulated high-level radioactive waste which contains tetraphenylborate precipitates and nitrite. Reduction of nitrite by hydroxylamine nitrate and hydrolysis of the tetraphenylborate by formic acid are discussed. Gaseous production, which is primarily benzene, nitrous oxide and carbon dioxide, has been quantified. Production of high-boiling organic compounds and the accumulation of these organic compounds within the process are addressed.

  11. INTEC High-Level Waste Studies Universal Solvent Extraction Feasibility Study

    International Nuclear Information System (INIS)

    Banaee, J.; Barnes, C.M.; Battisti, T.; Herrmann, S.; Losinski, S.J.; McBride, S.

    2000-01-01

    This report summarizes a feasibility study that has been conducted on the Universal Solvent Extraction (UNEX) Process for treatment and disposal of 4.3 million liters of INEEL sodium-bearing waste located at the Idaho Nuclear Technology and Engineering Center. This feasibility study covers two scenarios of treatment. The first, the UNEX Process, partitions the Cs/Sr from the SBW and creates remote-handled LLW and contact-handled TRU waste forms. Phase one of this study, covered in the 30% review documents, dealt with defining the processes and defining the major unit operations. The second phase of the project, contained in the 60% review, expanded on the application of the UNEX processes and included facility requirements and definitions. Two facility options were investigated for the UNEX process, resulting in a 2 x 2 matrix of process/facility scenarios as follows: Option A, UNEX at Greenfield Facility; Option B, Modified UNEX at Greenfield Facility; Option C, UNEX at NWCF. This document covers life-cycle costs for all options presented, along with results and conclusions determined from the study

  12. Development of Effective Solvent Modifiers for the Solvent Extraction of Cesium from Alkaline High-Level Tank Waste

    International Nuclear Information System (INIS)

    Bonnesen, Peter V.; Delmau, Laetitia H.; Moyer, Bruce A.; Lumetta, Gregg J.

    2003-01-01

    A series of novel alkylphenoxy fluorinated alcohols were prepared and investigated for their effectiveness as modifiers in solvents containing calix(4)arene-bis-(tert-octylbenzo)-crown-6 for extracting cesium from alkaline nitrate media. A modifier that contained a terminal 1,1,2,2-tetrafluoroethoxy group was found to decompose following long-term exposure to warm alkaline solutions. However, replacement of the tetrafluoroethoxy group with a 2,2,3,3-tetrafluoropropoxy group led to a series of modifiers that possessed the alkaline stability required for a solvent extraction process. Within this series of modifiers, the structure of the alkyl substituent (tert-octyl, tert-butyl, tert-amyl, and sec-butyl) of the alkylphenoxy moiety was found to have a profound impact on the phase behavior of the solvent in liquid-liquid contacting experiments, and hence on the overall suitability of the modifier for a solvent extraction process. The sec-butyl derivative (1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol) (Cs-7SB) was found to possess the best overall balance of properties with respect to third phase and coalescence behavior, cleanup following degradation, resistance to solids formation, and cesium distribution behavior. Accordingly, this modifier was selected for use as a component of the solvent employed in the Caustic-Side Solvent Extraction (CSSX) process for removing cesium from high level nuclear waste (HLW) at the U.S. Department of Energy's (DOE) Savannah River Site. In batch equilibrium experiments, this solvent has also been successfully shown to extract cesium from both simulated and actual solutions generated from caustic leaching of HLW tank sludge stored in tank B-110 at the DOE's Hanford Site.

  13. Selective extraction of actinides from high level liquid wastes. Study of the possibilities offered by the Redox properties of actinides

    International Nuclear Information System (INIS)

    Adnet, J.M.

    1991-07-01

    Partitioning of high-level liquid wastes from nuclear fuel reprocessing by the PUREX process consists of the elimination of minor actinides (Np, Am, and traces of Pu and U). Among the possible processes, the selective extraction of actinides with oxidation states higher than three is studied. The first part of this work deals with a preliminary step: the elimination of ruthenium from fission product solutions using electrovolatilization of the RuO4 compound. The second part concerns the complexation and oxidation reactions of U, Np, Pu and Am in the presence of a compound belonging to the unsaturated polyanion family: potassium phosphotungstate. For actinide ions in oxidation state (IV) complexed with the phosphotungstate anion, the extraction mechanism by dioctylamine was studied, and the use of an extraction chromatography technique permitted successful separations between tetravalent and trivalent actinides. Finally, in accordance with the results obtained, the basis of a separation scheme for the management of fission product solutions is proposed

  14. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  15. Partitioning of actinide from simulated high level wastes arising from reprocessing of PHWR fuels: counter current extraction studies using CMPO

    International Nuclear Information System (INIS)

    Deshingkar, D.S.; Chitnis, R.R.; Wattal, P.K.; Theyyunni, T.K.; Nair, M.K.T.; Ramanujam, A.; Dhami, P.S.; Gopalakrishnan, V.; Rao, M.K.; Mathur, J.N.; Murali, M.S.; Iyer, R.H.; Badheka, L.P.; Banerji, A.

    1994-01-01

    High level wastes (HLW) arising from reprocessing of pressurised heavy water reactor (PHWR) fuels contain actinides like neptunium, americium and cerium which are not extracted in the Purex process. They also contain small quantities of uranium and plutonium in addition to fission products. Removal of these actinides prior to vitrification of HLW can effectively reduce the active surveillance period of final waste form. Counter current studies using indigenously synthesised octyl (phenyl)-N, N-diisobutylcarbamoylmethylphosphine oxide (CMPO) were taken up as a follow-up of successful runs with simulated sulphate bearing low acid HLW solutions. The simulated HLW arising from reprocessing of PHWR fuel was prepared based on presumed burnup of 6500 MWd/Te of uranium, 3 years cooling period and 800 litres of waste generation per tonne of fuel reprocessed. The alpha activity of the HLW raffinate after extraction with the CMPO-TBP mixture could be brought down to near background level. (author). 13 refs., 2 tabs., 12 figs
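
    For a constant distribution ratio D, the behaviour of a counter-current extraction cascade like the one described above can be approximated with the textbook Kremser relation. This is a generic estimate under idealized assumptions (constant D, equal-stage equilibrium), not the model used in the study, and the example numbers are hypothetical:

```python
def fraction_unextracted(D, oa_ratio, n_stages):
    """Kremser estimate of the fraction of a solute remaining in the
    aqueous raffinate after n counter-current equilibrium stages,
    given a constant distribution ratio D and an organic/aqueous
    flow ratio O/A."""
    E = D * oa_ratio  # extraction factor
    if abs(E - 1.0) < 1e-12:
        return 1.0 / (n_stages + 1)
    return (E - 1.0) / (E ** (n_stages + 1) - 1.0)

# Hypothetical example: a strongly extracted actinide with D = 10
# at O/A = 1 is reduced to roughly 1e-8 of its feed concentration
# after 8 counter-current stages -- consistent with bringing alpha
# activity down to near background level.
residual = fraction_unextracted(10, 1.0, 8)
```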

  16. Partitioning of actinide from simulated high level wastes arising from reprocessing of PHWR fuels: counter current extraction studies using CMPO

    Energy Technology Data Exchange (ETDEWEB)

    Deshingkar, D S; Chitnis, R R; Wattal, P K; Theyyunni, T K; Nair, M K.T. [Bhabha Atomic Research Centre, Bombay (India). Process Engineering and Systems Development Div.; Ramanujam, A; Dhami, P S; Gopalakrishnan, V; Rao, M K [Bhabha Atomic Research Centre, Bombay (India). Fuel Reprocessing Group; Mathur, J N; Murali, M S; Iyer, R H [Bhabha Atomic Research Centre, Bombay (India). Radiochemistry Div.; Badheka, L P; Banerji, A [Bhabha Atomic Research Centre, Bombay (India). Bio-organic Div.

    1994-12-31

    High level wastes (HLW) arising from reprocessing of pressurised heavy water reactor (PHWR) fuels contain actinides like neptunium, americium and cerium which are not extracted in the Purex process. They also contain small quantities of uranium and plutonium in addition to fission products. Removal of these actinides prior to vitrification of HLW can effectively reduce the active surveillance period of final waste form. Counter current studies using indigenously synthesised octyl (phenyl)-N, N-diisobutylcarbamoylmethylphosphine oxide (CMPO) were taken up as a follow-up of successful runs with simulated sulphate bearing low acid HLW solutions. The simulated HLW arising from reprocessing of PHWR fuel was prepared based on presumed burnup of 6500 MWd/Te of uranium, 3 years cooling period and 800 litres of waste generation per tonne of fuel reprocessed. The alpha activity of the HLW raffinate after extraction with the CMPO-TBP mixture could be brought down to near background level. (author). 13 refs., 2 tabs., 12 figs.

  17. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequencies content and several others. A few case studies including isotropic and heter…

  18. High-level nuclear-waste disposal: information exchange and conflict resolution

    International Nuclear Information System (INIS)

    Hadden, S.G.; Chiles, J.R.; Anaejionu, P.; Cerny, K.J.

    1981-07-01

    The research presented here was conceived as an exploration of the interactions among parties involved in the resolution of the high-level radioactive waste (HLW) disposal issue. Because of the major differences in the nature of the interactions between levels of government, on the one hand, and between government and the public, on the other hand, this study is divided into two primary areas - public participation and intergovernmental relations. These areas are further divided into theoretical and practical considerations. The format of the paper reflects the divisions explained above as well as the interaction of the various authors. Public participation is addressed from a theoretical perspective in Part 2. In Part 3 an essentially pragmatic approach is taken drawing on experiences from similar exercises. These two aspects of the study are presented in separate parts because the authors worked largely independently. Intergovernmental relations is treated in Part 4. The treatment is organized as two Sections of Part 4 to reflect the authors' close interaction which yielded a more integrated treatment of the theoretical and practical aspects of intergovernmental relations. Detailed recommendations and conclusions appear in the final subsections of Parts 2, 3, and 4. Part 5, Summary and Conclusions, does not reiterate the detailed conclusions and recommendations presented in previous parts but rather expresses some general perceptions with respect to the high-level waste disposal issue. A brief review of the Table of Contents will assist in visualizing the detailed format of this study and in identifying the portions of greatest relevance to specific questions. A detailed Subject Index and an Acronym Index have been included for the reader's convenience

  19. High level organizing principles for display of systems fault information for commercial flight crews

    Science.gov (United States)

    Rogers, William H.; Schutte, Paul C.

    1993-01-01

    Advanced fault management aiding concepts for commercial pilots are being developed in a research program at NASA Langley Research Center. One aim of this program is to re-evaluate current design principles for display of fault information to the flight crew: (1) from a cognitive engineering perspective and (2) in light of the availability of new types of information generated by advanced fault management aids. The study described in this paper specifically addresses principles for organizing fault information for display to pilots based on their mental models of fault management.

  20. Operation environment construction of geological information database for high level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Wang Peng; Gao Min; Huang Shutao; Wang Shuhong; Zhao Yongan

    2014-01-01

    To fulfill the requirements of data storage and management in HLW geological disposal, a targeted construction method for the data operation environment is proposed in this paper. The geological information database operation environment constructed by this method has unique features, and it will provide important support for the HLW geological disposal project and its management. (authors)

  1. Information systems of telemedicine for the regions with high levels of radiation

    International Nuclear Information System (INIS)

    Yanchuk, V.; Svistelnyk, S.

    2002-01-01

    The necessity of creating a telemedicine system for consulting people living on territory contaminated with radionuclides is stipulated by the demand for consultations in such regions and the shortage of well-qualified medical staff there. The number of patients rises year by year. The system is built by creating two nodes: an investigation node, using ultrasound and MRT equipment, which stores the investigation information, and a consulting-centre node, which provides consultations based on analysis of the investigation data. (authors)

  2. Preliminary study on the three-dimensional geoscience information system of high-level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Li Peinan; Zhu Hehua; Li Xiaojun; Wang Ju; Zhong Xia

    2010-01-01

    The 3D geoscience information system of high-level radioactive waste geological disposal is an important research direction in the current high-level radioactive waste disposal project: a platform for information integration and publishing that can serve the relevant research directions through the data and model interfaces it provides. Firstly, this paper introduces the basic features of the HLW disposal project and the function and requirements of the system, which includes an input module, a database management module, a function module, a maintenance module and an output module. Then the framework of the high-level waste disposal project information system is studied, and the overall system architecture is proposed. Finally, based on a summary and analysis of database management, 3D modeling, spatial analysis, numerical integration and visualization of underground projects, the implementations of the key functional modules and the platform are expounded, and the conclusion is drawn that a component-based software development method should be utilized in system development. (authors)

  3. Comparison of solvent extraction and extraction chromatography resin techniques for uranium isotopic characterization in high-level radioactive waste and barrier materials.

    Science.gov (United States)

    Hurtado-Bermúdez, Santiago; Villa-Alfageme, María; Mas, José Luis; Alba, María Dolores

    2018-07-01

    The development of Deep Geological Repositories (DGR) for the storage of high-level radioactive waste (HLRW) is mainly focused on systems of multiple barriers based on the use of clays, and particularly bentonites, as natural and engineered barriers in nuclear waste isolation, owing to their remarkable properties. Because uranium is the major component of HLRW, an in-depth analysis of the chemistry of the reaction of this element with bentonites is required. The determination of uranium under the conditions of HLRW, including the analysis of silicate matrices before and after the uranium-bentonite reaction, was investigated. The performances of a state-of-the-art and widespread radiochemical method based on chromatographic UTEVA resins, and of a well-known traditional method based on solvent extraction with tri-n-butyl phosphate (TBP), for the analysis of uranium and thorium isotopes in solid matrices with high concentrations of uranium were analysed in detail. In this comparison, both radiochemical approaches show overall excellent performance for analysing the uranium concentration in HLRW samples. However, due to the high uranium concentration in the samples, the chromatographic resin is not able to completely avoid uranium contamination of the thorium fraction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. The application of quadtree algorithm for information integration in the high-level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Gao Min; Zhong Xia; Huang Shutao

    2008-01-01

    A multi-source database for high-level radioactive waste geological disposal aims to promote the informatization of the geological disposal of HLW. Addressing multi-dimensional, multi-source data, the integration of information and applications, and the related computer software and hardware, the paper gives a preliminary analysis of the data resources of the Beishan area, Gansu Province. It introduces theory and methods based on GIS technology and the open-source GDAL library, and discusses the technical means by which the quadtree algorithm supports information resource management, full sharing and rapid retrieval in the area. A more detailed description of the characteristics of the existing data resources, the theory of spatial data retrieval algorithms, and the programming design and implementation ideas are given in the paper. (authors)
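
    To make the retrieval idea concrete, a minimal point quadtree supporting insertion and rectangular range queries might look as follows. This is a generic sketch of the data structure named in the record, not the system's actual implementation:

```python
class Quadtree:
    """Point quadtree over a half-open square region [x, x+size)^2;
    a node subdivides into four quadrants once it holds more than
    `capacity` points."""

    def __init__(self, x, y, size, capacity=4):
        self.x, self.y, self.size = x, y, size  # lower-left corner, edge length
        self.capacity = capacity
        self.points = []
        self.children = None

    def insert(self, px, py):
        """Insert a point; returns False if it lies outside this region."""
        if not (self.x <= px < self.x + self.size and
                self.y <= py < self.y + self.size):
            return False
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py))
                return True
            self._subdivide()
        return any(c.insert(px, py) for c in self.children)

    def _subdivide(self):
        h = self.size / 2
        self.children = [Quadtree(self.x + dx, self.y + dy, h, self.capacity)
                         for dx in (0, h) for dy in (0, h)]
        for p in self.points:  # push existing points down into the quadrants
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx, qy, qsize):
        """Return all stored points inside the query square [qx, qx+qsize)^2."""
        if (qx + qsize <= self.x or self.x + self.size <= qx or
                qy + qsize <= self.y or self.y + self.size <= qy):
            return []  # query square does not overlap this node
        found = [p for p in self.points
                 if qx <= p[0] < qx + qsize and qy <= p[1] < qy + qsize]
        if self.children:
            for c in self.children:
                found.extend(c.query(qx, qy, qsize))
        return found
```

    Because each query descends only into quadrants that overlap the query window, lookups over a large spatial dataset touch a small fraction of the stored points, which is the "rapid retrieval" property the record refers to.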

  5. Demonstration of pyropartitioning process by using genuine high-level liquid waste. Reductive-extraction of actinide elements from chlorination product

    International Nuclear Information System (INIS)

    Uozumi, Koichi; Iizuka, Masatoshi; Kurata, Masaki; Ougier, Michel; Malmbeck, Rikard; Winckel, Stefaan van

    2009-01-01

    The pyropartitioning process separates the minor actinide elements (MAs), together with uranium and plutonium, from the high-level liquid waste generated in Purex reprocessing of spent LWR fuel and introduces them into a metallic fuel cycle. For the demonstration of this technology, a series of experiments using 520 g of genuine high-level liquid waste was started, and the conversion of actinide elements to their chlorides was already demonstrated by denitration and chlorination. In the present study, a reductive-extraction experiment in a molten salt/liquid cadmium system was performed to recover actinide elements from the chlorination product of the genuine high-level liquid waste. The results are as follows: 1) With the addition of the cadmium-lithium alloy reductant, almost all of the plutonium and MAs in the initial high-level liquid waste were recovered in the cadmium phase, meaning no mass loss during denitration, chlorination and reductive extraction. 2) The separation factors of plutonium, MAs and rare-earth fission-product elements versus uranium agreed with literature values; therefore, actinide elements can be separated from fission-product elements in the actual system. Hence, the pyropartitioning process was successfully demonstrated. (author)

  6. Progress in evaluation of radionuclide geochemical information developed by DOE high-level nuclear waste repository site projects

    International Nuclear Information System (INIS)

    Meyer, R.E.; Arnold, W.D.; O'Kelley, G.D.; Case, F.I.; Land, J.F.

    1989-08-01

    Information that is being developed by projects within the Department of Energy (DOE) pertinent to the potential geochemical behavior of radionuclides at candidate sites for a high-level radioactive waste repository is being evaluated by Oak Ridge National Laboratory (ORNL) for the Nuclear Regulatory Commission (NRC). During this report period, all experiments were conducted with tuff from the proposed high-level nuclear waste site at Yucca Mountain, Nevada. The principal emphasis in this report period was on column studies of migration of uranium and technetium in water from well J-13 at the Yucca Mountain site. Columns 1 cm in diameter and about 5 cm long were constructed and carefully packed with ground tuff. The characteristics of the columns were tested by determination of elution curves of tritium and TcO 4 - . Elution peaks obtained in past studies with uranium were asymmetrical and the shapes were often complex, observations that suggested irreversibilities in the sorption reaction. To try to understand these observations, the effects of flow rate and temperature on uranium migration were studied in detail. Sorption ratios calculated from the elution peaks became larger as the flow rate decreased and as the temperature increased. These observations support the conclusion that the sorption of uranium is kinetically hindered. To confirm this, batch sorption ratio experiments were completed for uranium as a function of time for a variety of conditions

  7. 76 FR 35137 - Vulnerability and Threat Information for Facilities Storing Spent Nuclear Fuel and High-Level...

    Science.gov (United States)

    2011-06-16

    ... High-Level Radioactive Waste AGENCY: U.S. Nuclear Regulatory Commission. ACTION: Public meeting... Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste,'' and 73... Spent Nuclear Fuel (SNF) and High-Level Radioactive Waste (HLW) storage facilities. The draft regulatory...

  8. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous, since they describe social, financial, engineering, and biological networks alike. Extending our ability to analyze complex networks to multiplex network structures greatly increases the level of information that can be extracted from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks, and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
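
    The core idea behind Multiplex PageRank is that centrality earned in one layer biases the random walk on another layer. The sketch below is a loose illustration of that coupling (a plain power-iteration PageRank whose link weights can be biased by another layer's scores), not the exact algorithm defined in the paper; the two toy layers are hypothetical:

```python
def pagerank(adj, n, damping=0.85, weight=None, iters=100):
    """Power-iteration PageRank on a directed graph given as adjacency
    lists {node: [targets]}. If `weight` is given, node j's rank is
    distributed among its targets in proportion to weight[target] --
    a simple stand-in for the cross-layer coupling."""
    w = weight or [1.0] * n
    r = [1.0 / n] * n
    for _ in range(iters):
        nxt = [(1.0 - damping) / n] * n
        for j, targets in adj.items():
            if targets:
                share = damping * r[j] / sum(w[t] for t in targets)
                for t in targets:
                    nxt[t] += share * w[t]
            else:  # dangling node: spread its rank uniformly
                for t in range(n):
                    nxt[t] += damping * r[j] / n
        r = nxt
    return r

# Hypothetical two-layer multiplex on 4 nodes:
layer_a = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # e.g. friendship layer
x = pagerank(layer_a, 4)                         # centrality in layer A
layer_b = {0: [3], 1: [0], 2: [1, 3], 3: [0]}    # e.g. collaboration layer
y = pagerank(layer_b, 4, weight=x)               # layer-A rank biases layer-B walk
```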

  9. Extractions of High Quality RNA from the Seeds of Jerusalem Artichoke and Other Plant Species with High Levels of Starch and Lipid.

    Science.gov (United States)

    Mornkham, Tanupat; Wangsomnuk, Preeya Puangsomlee; Fu, Yong-Bi; Wangsomnuk, Pinich; Jogloy, Sanun; Patanothai, Aran

    2013-04-29

    Jerusalem artichoke (Helianthus tuberosus L.) is an important tuber crop. However, Jerusalem artichoke seeds contain high levels of starch and lipid, making the extraction of high-quality RNA extremely difficult and the gene expression analysis challenging. This study aimed to improve existing methods for extracting total RNA from Jerusalem artichoke dry seeds and to assess the applicability of the improved method in other plant species. Five RNA extraction methods were evaluated on Jerusalem artichoke seeds and two were modified. One modified method with significant improvement was applied to assay seeds of diverse Jerusalem artichoke accessions, sunflower, rice, maize, peanut and marigold. The effectiveness of the improved method to extract total RNA from seeds was assessed using qPCR analysis of four selected genes. The improved method of Ma and Yang (2011) yielded maximum RNA solubility and removed most interfering substances. The improved protocol generated 29 to 41 µg RNA/30 mg fresh weight. An A260/A280 ratio of 1.79 to 2.22 showed their RNA purity. Extracted RNA was effective for downstream applications such as first-strand cDNA synthesis, cDNA cloning and qPCR. The improved method was also effective to extract total RNA from seeds of sunflower, rice, maize and peanut that are rich in polyphenols, lipids and polysaccharides.
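
    The reported yield and purity figures follow from standard spectrophotometric conversions, assuming the textbook factor of ~40 µg/mL per A260 unit for single-stranded RNA (a general constant, not a value stated in the record); the example readings are hypothetical:

```python
def rna_concentration_ug_per_ml(a260, dilution_factor=1.0):
    """Spectrophotometric RNA concentration: one A260 unit measured in
    a 1 cm path corresponds to roughly 40 ug/mL of single-stranded RNA."""
    return a260 * 40.0 * dilution_factor

def purity_ratio(a260, a280):
    """A260/A280 ratio; values around 1.8-2.2 indicate protein-free RNA,
    matching the 1.79-2.22 range reported above."""
    return a260 / a280

# Hypothetical reading: A260 = 0.25 at a 10x dilution, 30 uL eluate.
conc = rna_concentration_ug_per_ml(0.25, dilution_factor=10)  # ug/mL
total_ug = conc * 30e-3                                       # total ug in 30 uL
```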

  10. Extractions of High Quality RNA from the Seeds of Jerusalem Artichoke and Other Plant Species with High Levels of Starch and Lipid

    Directory of Open Access Journals (Sweden)

    Tanupat Mornkham

    2013-04-01

    Full Text Available Jerusalem artichoke (Helianthus tuberosus L.) is an important tuber crop. However, Jerusalem artichoke seeds contain high levels of starch and lipid, making the extraction of high-quality RNA extremely difficult and the gene expression analysis challenging. This study aimed to improve existing methods for extracting total RNA from Jerusalem artichoke dry seeds and to assess the applicability of the improved method in other plant species. Five RNA extraction methods were evaluated on Jerusalem artichoke seeds and two were modified. One modified method with significant improvement was applied to assay seeds of diverse Jerusalem artichoke accessions, sunflower, rice, maize, peanut and marigold. The effectiveness of the improved method to extract total RNA from seeds was assessed using qPCR analysis of four selected genes. The improved method of Ma and Yang (2011) yielded maximum RNA solubility and removed most interfering substances. The improved protocol generated 29 to 41 µg RNA/30 mg fresh weight. An A260/A280 ratio of 1.79 to 2.22 showed their RNA purity. Extracted RNA was effective for downstream applications such as first-strand cDNA synthesis, cDNA cloning and qPCR. The improved method was also effective to extract total RNA from seeds of sunflower, rice, maize and peanut that are rich in polyphenols, lipids and polysaccharides.

  11. Effects of P-Zn interaction and lime on plant growth in the presence of high levels of extractable zinc

    Energy Technology Data Exchange (ETDEWEB)

    Koukoulakis, P

    1973-01-01

    Six glasshouse experiments were conducted in order to study (a) the effect of P and lime on dry matter yield and mineral composition of tomato, cotton, maize and sudan grass grown on a Zn polluted soil (containing 170 ppm of 2.5% acetic acid extractable Zn), (b) the effect of residual P on dry matter yield and mineral composition of beans, lettuce, and maize grown on a similar soil, and (c) the effect of various Zn treatments on the availability of indigenous and added P of a soil low in Zn (11 ppm). It was found that the yield response to applied P of maize and sudan grass was independent of lime, while cotton, tomato and beans failed almost completely to respond in the absence of lime. The crops responded differently to the excess soil Zn and the dry matter yields were related to the ability to accumulate Zn. High Zn accumulator plants failed to respond to applied P in the absence of lime, while low Zn accumulating plants responded positively. The positive and highly significant effect of P on total Zn uptake of plants masked the depressive effect of P on Zn concentration. However, the results indicated that the P-Zn interrelationship is far more complicated than a dilution effect caused by the promotive effect of applied P. Studies of the effect of applied Zn levels on available soil P, and conversely, indicated that a strong mutual fixation, probably coprecipitation, takes place in the soil, which may account for a considerable part of the depressive effect of P on plant Zn, in addition to effects like coprecipitation in roots and dilution reported in the literature. Finally, the residual effect of P varied with the plant species, and the plant Zn concentration was found to be a determinant factor in controlling dry matter yields. 58 references, 13 figures, 24 tables.

  12. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  13. TRU decontamination of high-level Purex waste by solvent extraction using a mixed octyl(phenyl)-N,N-diisobutyl-carbamoylmethylphosphine oxide/TBP/NPH (TRUEX) solvent

    International Nuclear Information System (INIS)

    Horwitz, E.P.; Kalina, D.G.; Diamond, H.; Kaplan, L.; Vandegrift, G.F.; Leonard, R.A.; Steindler, M.J.; Schulz, W.W.

    1984-01-01

    The TRUEX (transuranium extraction) process was tested on a simulated high-level dissolved sludge waste (DSW). A batch counter-current extraction mode was used for seven extraction and three scrub stages. One additional extraction stage, two scrub stages and all strip stages were performed by batch extraction. The TRUEX solvent consisted of 0.20 M octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide-1.4 M TBP in Conoco (C12-C14). The feed solution was 1.0 M in HNO3, 0.3 M in H2C2O4 and contained mixed (stable) fission products, U, Np, Pu, and Am, and a number of inert constituents, e.g., Fe and Al. The test showed that the process is capable of reducing the TRU concentration in the DSW by a factor of 4 x 10^4 (to <100 nCi/g of disposed form) and reducing the quantity of TRU waste by two orders of magnitude
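
    The quoted factor of 4 x 10^4 is a decontamination factor (DF), the ratio of TRU activity in the feed to that in the treated raffinate. The arithmetic is a straightforward illustration; the feed activity used in the example is hypothetical, chosen only to reproduce the <100 nCi/g figure cited in the abstract:

```python
def decontamination_factor(feed_activity, raffinate_activity):
    """DF = TRU activity in the feed / TRU activity in the treated waste,
    both in the same units (e.g. nCi/g)."""
    return feed_activity / raffinate_activity

# With the quoted DF of 4e4, a hypothetical feed of 4,000,000 nCi/g
# would be brought down to 100 nCi/g in the raffinate:
raffinate = 4.0e6 / 4.0e4  # nCi/g
```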

  14. Risk-informed assessment of radionuclide release from dissolution of spent nuclear fuel and high-level waste glass

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Tae M., E-mail: tae.ahn@nrc.gov

    2017-06-15

    Highlights: • Dissolution of HLW waste form was assessed with long-term risk informed approach. • The radionuclide release rate decreases with time from the initial release rate. • Fast release radionuclides can be dispersed with discrete container failure time. • Fast release radionuclides can be restricted by container opening area. • Dissolved radionuclides may be further sequestered by sorption or others means. - Abstract: This paper aims to detail the different parameters to be considered for use in an assessment of radionuclide release. The dissolution of spent nuclear fuel and high-level nuclear waste glass was considered for risk and performance insights in a generic disposal system for more than 100,000 years. The probabilistic performance assessment includes the waste form, container, geology, and hydrology. Based on the author’s previous extended work and data from the literature, this paper presents more detailed specific cases of (1) the time dependence of radionuclide release, (2) radionuclide release coupled with container failure (rate-limiting process), (3) radionuclide release through the opening area of the container and cladding, and (4) sequestration of radionuclides in the near field after container failure. These cases are better understood for risk and performance insights. The dissolved amount of waste form is not linear with time but is higher at first. The radionuclide release rate from waste form dissolution can be constrained by container failure time. The partial opening area of the container surface may decrease radionuclide release. Radionuclides sequestered by various chemical reactions in the near field of a failed container may become stable with time as the radiation level decreases with time.

  15. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

    The rapid growth in IT in the last two decades has led to a growth in the amount of information available online. A new style of sharing information is social media. Social media is a continuously and instantly updated source of information. In this position paper, we propose a framework for

  16. Information Extraction From Chemical Patents

    Directory of Open Access Journals (Sweden)

    Sandra Bergmann

    2012-01-01

    Full Text Available The development of new chemicals or pharmaceuticals is preceded by an in-depth analysis of published patents in the field. This information retrieval is a costly and time-inefficient step when done by a human reader, yet it is mandatory for the potential success of an investment. The goal of the research project UIMA-HPC is to automate and hence speed up the process of knowledge mining about patents. Multi-threaded analysis engines, developed according to UIMA (Unstructured Information Management Architecture) standards, process texts and images in thousands of documents in parallel. UNICORE (UNiform Interface to COmputing Resources) workflow control structures make it possible to dynamically allocate resources for every given task to gain the best CPU-time/real-time ratios in an HPC environment.

  17. ALICE High Level Trigger

    CERN Multimedia

    Alt, T

    2013-01-01

    The ALICE High Level Trigger (HLT) is a computing farm designed and built for the real-time, online processing of the raw data produced by the ALICE detectors. Events are fully reconstructed from the raw data, analyzed and compressed. The analysis summary, together with the compressed data and a trigger decision, is sent to the DAQ. In addition, the reconstruction of the events allows for online monitoring of physical observables, and this information is provided to the Data Quality Monitor (DQM). The HLT can process event rates of up to 2 kHz for proton-proton collisions and 200 Hz for central Pb-Pb collisions.

  18. High-level specification of a proposed information architecture for support of a bioterrorism early-warning system.

    Science.gov (United States)

    Berkowitz, Murray R

    2013-01-01

    Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.

  19. High-Level Colloquium on Information Literacy and Lifelong Learning Bibliotheca Alexandrina, Alexandria, Egypt - Report of Meeting

    OpenAIRE

    Breivik, Patricia; Byrne, Alex; Forest Horton, Woody; Ferreiro, Soledad; Boekhorst, Albert; Hassan, Helena; Ponjuan, Gloria; Lau, Jesús; Candy, Phil

    2006-01-01

    Report of a Meeting Sponsored by the United Nations Educational, Scientific and Cultural Organization (UNESCO), the National Forum on Information Literacy (NFIL), and the International Federation of Library Associations and Institutions (IFLA). The report is organized according to four primary areas related to Information Literacy: Education and Learning, Health and Human Services, Business and Economic Development, and Governance and Citizenship. It highlights recommendations for empowering cit...

  20. A Value of Information approach to data quality objectives for the Hanford high-level waste tanks

    International Nuclear Information System (INIS)

    Wood, T.W.; Hunter, V.L.; Ulvila, J.W.

    1995-02-01

    This report summarizes a Pacific Northwest Laboratory review of the organic-nitrate reaction safety issue in the Hanford single-shell tanks. The study employed a decision-analytic method known as Value of Information (VOI). VOI analysis is a special form of decision analysis that includes an information-collection alternative among the initial decision choices. This type of decision analysis therefore makes it possible to specify the preferred information-collection alternative, taking into account all information-gathering and other relevant alternatives. For example, the risk-reduction benefit associated with further sampling to quantify total organic carbon inventory or to improve information on energetics can be compared to the risk-reduction benefit of better temperature monitoring, operational restrictions, or mitigation by moisture control. This approach allows freedom from built-in assumptions, e.g., that all tanks must be sampled to some degree or that all tanks must be deemed intrinsically safe by some means or another. It allows each tank management decision to be judged in terms of risk reduction from the current state of affairs, and that state of affairs to be continuously updated to incorporate new information on tank contents, the phenomenology of safety issues, or the effectiveness of mitigation schemes.
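
The VOI logic can be made concrete with a toy two-action decision; all probabilities and costs below are hypothetical illustrations, not figures from the report:

```python
# hypothetical numbers, purely to illustrate the Value of Information logic
p_unsafe = 0.2
cost = {  # (action, true tank state) -> cost in arbitrary units
    ("mitigate", "safe"): 5.0, ("mitigate", "unsafe"): 5.0,
    ("wait", "safe"): 0.0,     ("wait", "unsafe"): 100.0,
}

def expected_cost(action, p):
    return (1 - p) * cost[(action, "safe")] + p * cost[(action, "unsafe")]

best_without_info = min(expected_cost(a, p_unsafe) for a in ("mitigate", "wait"))
# with perfect information we would act optimally in each state:
cost_with_perfect_info = (1 - p_unsafe) * 0.0 + p_unsafe * 5.0
evpi = best_without_info - cost_with_perfect_info
```

Here `evpi` (expected value of perfect information) is an upper bound on what any sampling campaign is worth; a campaign costing more than this should be rejected in favor of the other alternatives.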

  1. Information needs for characterization of high-level waste repository sites in six geologic media. Volume 1. Main report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1985-05-01

    Evaluation of the geologic isolation of radioactive materials from the biosphere requires an intimate knowledge of site geologic conditions, which is gained through precharacterization and site characterization studies. This report presents the results of an intensive literature review, analysis and compilation to delineate the information needs, applicable techniques and evaluation criteria for programs to adequately characterize a site in six geologic media. These media, in order of presentation, are: granite, shale, basalt, tuff, bedded salt and dome salt. Guidelines are presented to assess the efficacy (application, effectiveness, and resolution) of currently used exploratory and testing techniques for precharacterization or characterization of a site. These guidelines include the reliability, accuracy and resolution of techniques deemed acceptable, as well as cost estimates of various field and laboratory techniques used to obtain the necessary information. Guidelines presented do not assess the relative suitability of media. 351 refs., 10 figs., 31 tabs.

  2. Information needs for characterization of high-level waste repository sites in six geologic media. Volume 1. Main report

    International Nuclear Information System (INIS)

    1985-05-01

    Evaluation of the geologic isolation of radioactive materials from the biosphere requires an intimate knowledge of site geologic conditions, which is gained through precharacterization and site characterization studies. This report presents the results of an intensive literature review, analysis and compilation to delineate the information needs, applicable techniques and evaluation criteria for programs to adequately characterize a site in six geologic media. These media, in order of presentation, are: granite, shale, basalt, tuff, bedded salt and dome salt. Guidelines are presented to assess the efficacy (application, effectiveness, and resolution) of currently used exploratory and testing techniques for precharacterization or characterization of a site. These guidelines include the reliability, accuracy and resolution of techniques deemed acceptable, as well as cost estimates of various field and laboratory techniques used to obtain the necessary information. Guidelines presented do not assess the relative suitability of media. 351 refs., 10 figs., 31 tabs

  3. Nuclear fuel reprocessing and high level waste disposal: informational hearings. Volume XII. Public and private roles, Part 2

    International Nuclear Information System (INIS)

    1977-01-01

    Presentations were made on institutional experiences at Nuclear Fuel Services, the framework for an acceptable nuclear future, the Price-Anderson Indemnity Act, Congress and nuclear energy policy, the human dimension, and risk perception. The supplemental testimony and materials submitted for the record included information on the nuclear waste at West Valley, New York, the perception and acceptability of risk from nuclear and alternative energy sources, and psychological determinants of perceived and acceptable risk

  4. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and the challenges that the ext...

  5. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  6. Dissecting empathy: high levels of psychopathic and autistic traits are characterised by difficulties in different social information processing domains

    Directory of Open Access Journals (Sweden)

    Patricia L Lockwood

    2013-11-01

    Full Text Available Individuals with psychopathy or autism spectrum disorder (ASD) can behave in ways that suggest lack of empathy towards others. However, many different cognitive and affective processes may lead to unempathic behavior, and the social processing profiles of individuals with high psychopathic vs. ASD traits are likely different. Whilst psychopathy appears characterized by problems with resonating with others’ emotions, ASD appears characterized by problems with cognitive perspective-taking. In addition, alexithymia has previously been associated with both disorders, but the contribution of alexithymia needs further exploration. In a community sample (N=110) we show for the first time that although affective resonance and cognitive perspective-taking are related, high psychopathic traits relate to problems with resonating with others’ emotions, but not cognitive perspective-taking. Conversely, high ASD traits relate to problems with cognitive perspective-taking but not resonating with others’ emotions. Alexithymia was associated with problems with affective resonance independently of psychopathic traits, suggesting that different component processes (reduced tendency to feel what others feel and reduced ability to identify and describe feelings) comprise affective resonance. Alexithymia was not associated with the reduced cognitive perspective-taking in high ASD traits. Our data suggest that (1) elevated psychopathic and ASD traits are characterized by difficulties in different social information processing domains and (2) reduced affective resonance in individuals with elevated psychopathic traits and the reduced cognitive perspective-taking in individuals with elevated ASD traits are not explained by co-occurring alexithymia. (3) Alexithymia is independently associated with reduced affective resonance. Consequently, our data point to different component processes within the construct of empathy that are suggestive of partially separable cognitive

  7. Progress in evaluation of radionuclide geochemical information developed by DOE high-level nuclear waste repository site projects. Annual report, October 1984-September 1985. Volume 4

    International Nuclear Information System (INIS)

    Meyer, R.E.; Arnold, W.D.; Blencoe, J.G.; Jacobs, G.K.; Kelmers, A.D.; Seeley, F.G.; Whatley, S.K.

    1986-05-01

    Information pertaining to the potential geochemical behavior of radionuclides at candidate sites for a high-level radioactive waste repository, which is being developed by projects within the Department of Energy (DOE), is being evaluated by Oak Ridge National Laboratory for the Nuclear Regulatory Commission (NRC). During this report period, emphasis was placed on the evaluation of information pertinent to the Hanford site in southeastern Washington. Results on the sorption/solubility behavior of technetium, neptunium, and uranium in the basalt/water geochemical system are summarized and compared to the results of DOE. Also, summaries of results are reported from two geochemical modeling studies: (1) an evaluation of the information developed by DOE on the native copper deposits of Michigan as a natural analog for the emplacement of copper canisters in a repository in basalt, and (2) calculation of the solubility and speciation of radionuclides for representative groundwaters from the Yucca Mountain site in Nevada

  8. Studies on Am(III) separation from simulated high-level waste using cobalt bis(dicarbollide) (1(-)) ion derivative covalently bound to N,N'-di-n-octyl diglycol diamide as extractant and DTPA as stripping agent

    Czech Academy of Sciences Publication Activity Database

    Bubeníková, M.; Selucký, P.; Rais, J.; Grüner, Bohumír; Švec, Petr

    2012-01-01

    Roč. 293, č. 1 (2012), s. 403-408 ISSN 0236-5731 R&D Projects: GA ČR GA104/09/0668 Institutional research plan: CEZ:AV0Z40320502 Keywords: Solvent extraction * actinides * high-level liquid waste * dicarbollide derivatives * carboranes * TODGA * DTPA Subject RIV: CA - Inorganic Chemistry Impact factor: 1.467, year: 2012

  9. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  10. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such a "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.
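
A minimal sketch of the link-removal idea on toy data; the cutoffs below are arbitrary illustrations, whereas the paper's actual algorithms are time- and topology-aware scoring rules:

```python
from collections import Counter

# toy user-item links: (user, item, timestamp); thresholds illustrative
links = [("u1", "i1", 1), ("u1", "i2", 5), ("u2", "i1", 2),
         ("u2", "i3", 6), ("u3", "i1", 3), ("u3", "i2", 7)]

def topology_aware(links, max_item_degree=2):
    # drop links to over-popular items: they carry little
    # discriminative information for recommendation
    deg = Counter(item for _, item, _ in links)
    return [l for l in links if deg[l[1]] <= max_item_degree]

def time_aware(links, t_min=3):
    # drop the oldest links, assumed least predictive of current taste
    return [l for l in links if l[2] >= t_min]

backbone = time_aware(topology_aware(links))  # the retained "backbone"
```

Feeding only the backbone to the recommender reduces both the input size and, per the abstract, the misleading redundancy of the full network.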

  11. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was recently proposed as a technique for detection and 3-D imaging of dense high-Z objects. High-energy cosmic-ray muons are deflected in matter by multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative, simple technique based on counting highly scattered muons in voxels appears efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping-power measurement, and detection of muonic atom emission.
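
The deflection being measured follows standard multiple-Coulomb-scattering statistics; a sketch using the PDG/Highland RMS-angle formula (material numbers are approximate, and this is the physics input, not the paper's reconstruction algorithm):

```python
import math

def highland_theta0(p_mev, x_over_X0, beta=1.0, charge=1):
    # PDG/Highland formula for the RMS (plane-projected) multiple
    # Coulomb scattering angle, in radians, for momentum p (MeV/c)
    # and path length expressed in radiation lengths x/X0
    return (13.6 / (beta * p_mev)) * charge * math.sqrt(x_over_X0) * \
           (1.0 + 0.038 * math.log(x_over_X0))

# a 3 GeV/c cosmic muon crossing 10 cm of lead (X0 ~ 0.56 cm) scatters
# by roughly 20 mrad RMS; 10 cm of water (X0 ~ 36 cm) gives far less
theta_pb = highland_theta0(3000.0, 10.0 / 0.56)
theta_h2o = highland_theta0(3000.0, 10.0 / 36.1)
```

This large contrast between high-Z and low-Z materials is what makes voxel-wise counting of highly scattered muons a workable detection statistic.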

  12. Fundamental Chemistry of the Universal Extractant (UNEX) for the Simultaneous Separation of Fission Products and Transurancies from High-Level Waste Streams

    International Nuclear Information System (INIS)

    Herbst, R. Scott

    2004-01-01

    Through collaborative research by the Idaho National Engineering and Environmental Laboratory and the Khlopin Radium Institute (St. Petersburg, Russia) the concept of a Universal Extraction (UNEX) solvent for simultaneously removing radioactive strontium, cesium, lanthanides, and transuranics from acidic aqueous waste streams in a single unit operation was developed and validated. These development efforts focused on the application of the process, where extractants were simply evaluated for extraction efficiency. The objective of this project is to conduct research that combines classical chemical techniques with advanced instrumental methods to elucidate the mechanisms of simultaneous metal extraction and study further the coordination geometries of extracted metal ions. This project is developing a fundamental understanding of the complicated, synergistic extraction chemistry of the multi-component UNEX solvent system. The results will facilitate enhancements to the process chemistry--increasing the efficiency of the UNEX process, minimizing primary and secondary waste streams, and enhancing compatibility of the product streams with the final waste forms. The global objective is implementing the UNEX process at the industrial scale

  13. The influences of scientific information on the growing in opinion for high level waste repository. Focusing on education in civil engineering course

    International Nuclear Information System (INIS)

    Amemiya, Kiyoshi; Chijimatsu, Masakazu

    2002-01-01

    In this research, a survey of awareness of and attitudes toward high-level radioactive waste (HLW) disposal was conducted among 36 students of a postgraduate course. They have been studying civil and rock engineering, so they belong to 'the Group' that has the education, background, and ability to understand the science behind geological disposal of HLW. First, awareness of the danger or safety of HLW disposal was examined. Some 23% regard HLW disposal as safe; by contrast, 60% feel it is dangerous. This is similar to the awareness of the general public. Some 72% think that HLW should be disposed of, but only 6% would accept a repository in their own town. This shows that the highly educated Group tends to calmly accept the necessity of disposal, but also exhibits the attitude known as 'not in my back yard' (NIMBY). The students were then divided into two groups: one group received information from promoters, and the other received information from opponents. The results of a second questionnaire show that the awareness of danger is strongly affected by the given information even in this Group, but the students become more thoughtful and prudent in their opinions and decision-making as information increases. Finally, this paper considers 'what is the role of education in civil engineering?' and 'what are the key issues in R and D of HLW disposal?' from the standpoint of public acceptance. (author)

  14. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Full Text Available Information overload is a serious problem in modern society and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such a "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.

  15. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such a “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency. PMID:23690946

  16. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  17. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasicontinuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs
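
The "Fourier transform a limited region of the spectrum" step can be illustrated on synthetic data: a comb of lines spaced by dE transforms into peaks at multiples of 1/dE, the recurrence signature of a single periodic orbit. The construction below is illustrative only, not the paper's data:

```python
import numpy as np

# synthetic "spectrum": Gaussian lines equally spaced by dE, as would
# arise from quantization along a single periodic orbit (illustrative)
E = np.linspace(0.0, 100.0, 4096)
dE = 2.5
spectrum = sum(np.exp(-0.5 * ((E - e0) / 0.1) ** 2)
               for e0 in np.arange(5.0, 95.0, dE))

# Fourier transforming the spectrum gives peaks at multiples of 1/dE,
# i.e. at the recurrence "time" of the underlying orbit
ft = np.abs(np.fft.rfft(spectrum - spectrum.mean()))
freq = np.fft.rfftfreq(E.size, d=E[1] - E[0])
dominant = freq[np.argmax(ft)]  # ~ 1/dE
```

In a real chaotic spectrum several such combs overlap, and windowing a limited energy region is what isolates the recurrences of one orbit at a time.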

  18. General Algorithm (High level)

    Indian Academy of Sciences (India)

    General Algorithm (High level). Iteratively: use the Tightness Property to remove points of P1, ..., Pi. Use random sampling to get a Random Sample (of enough points) from the next largest cluster, Pi+1. Use the Random Sampling Procedure to approximate ci+1 using the ...

  19. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  20. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
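
A minimal sketch of the multireference-function idea, with an assumed gamma-like hrf and a noise-free pixel; real data would add noise and use a measured hrf shape:

```python
import numpy as np

t = np.arange(0.0, 20.0, 0.1)  # 100 ms sampling grid, illustrative

def hrf(delay):
    # assumed gamma-like haemodynamic response starting at `delay` (s)
    x = np.clip(t - delay, 0.0, None)
    return x ** 5 * np.exp(-x / 0.9)

def best_shift(timecourse, shifts):
    # correlate the pixel time-course with reference functions shifted
    # in 100 ms steps; the best-correlating shift is the time marker
    rs = [np.corrcoef(timecourse, hrf(s))[0, 1] for s in shifts]
    return shifts[int(np.argmax(rs))]

shifts = np.arange(0.0, 1.0, 0.1)
pixel = hrf(0.3)                    # noise-free pixel with true delay 0.3 s
marker = best_shift(pixel, shifts)  # recovers the 0.3 s delay
```

With noise added, the recovered marker scatters around the true delay, which is how the SNR-dependent resolution limit quoted in the abstract arises.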

  1. High Level Radioactive Waste Management

    International Nuclear Information System (INIS)

    1991-01-01

    The proceedings of the second annual international conference on High Level Radioactive Waste Management, held on April 28--May 3, 1991, Las Vegas, Nevada, provide information on the current technical issues related to international high level radioactive waste management activities and how they relate to society as a whole. Besides discussing such technical topics as the best form of the waste, the integrity of storage containers, and design and construction of a repository, the broader social aspects of these issues are explored in papers on such subjects as conformance to regulations, transportation safety, and public education. By providing this wider perspective of high level radioactive waste management, it becomes apparent that the various disciplines involved in this field are interrelated and that they should work to integrate their waste management activities. Individual records are processed separately for the databases

  2. High-level Petri Nets

    DEFF Research Database (Denmark)

    High-level Petri nets are now widely used in both theoretical analysis and practical modelling of concurrent systems. The main reason for the success of this class of net models is that they make it possible to obtain much more succinct and manageable descriptions than can be obtained by means of low-level nets. The relevant papers are, however, scattered across various journals and collections. As a result, much of this knowledge is not readily available to people who may be interested in using high-level nets. Within the Petri net community this problem has been discussed many times, and as an outcome this book has been compiled. The book contains reprints of some of the most important papers on the application and theory of high-level Petri nets. In this way it makes the relevant literature more available. It is our hope that the book will be a useful source of information and that, e.g., it can be used in the organization of Petri net courses.

  3. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising based on Donoho wavelet soft-threshold denoising is investigated to remove pseudo-Gibbs artifacts from the soft-thresholded image. OAS object information extraction based on translation-invariant wavelet denoising is studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting object information from the interferogram, and that translation-invariant wavelet denoising extracts information better than soft-threshold wavelet denoising alone.
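
A compact sketch of the two ingredients, Donoho soft thresholding plus translation invariance via cycle spinning, using a one-level Haar transform for simplicity (the paper's wavelet and threshold choices may differ):

```python
import numpy as np

def haar_soft(x, thr):
    # one-level Haar transform, soft threshold on the detail band
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)          # approximation
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)          # detail
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # Donoho soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

def cycle_spin(x, thr):
    # translation-invariant denoising (cycle spinning): average the
    # result over all circular shifts to suppress pseudo-Gibbs ripples
    n = len(x)
    acc = np.zeros_like(x)
    for s in range(n):
        acc += np.roll(haar_soft(np.roll(x, s), thr), -s)
    return acc / n

# demo: denoise a noisy step signal (even length required by haar_soft)
rng = np.random.default_rng(1)
clean = np.repeat(np.array([0.0, 4.0]), 32)
noisy = clean + rng.normal(0.0, 0.5, size=64)
denoised = cycle_spin(noisy, thr=1.0)
```

Averaging over shifts is what removes the dependence on the arbitrary alignment of the wavelet grid with the signal's edges, the source of pseudo-Gibbs artifacts.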

  4. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for the diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. Respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure respiration. A reduction of the number of sensors connected to patients will increase patients' comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respiratory disorders will increase, since respiratory activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respiratory disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleep laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
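A common way to realize ECG-derived respiration (EDR) of the kind described here is to track the beat-to-beat modulation of R-peak amplitudes, which rises and falls with chest volume. The sketch below is a simplified, hypothetical version of that idea; the naive peak detector, thresholds, and function names are assumptions for illustration, not the thesis's actual algorithms.

```python
import numpy as np

def r_peak_indices(ecg, fs, refractory=0.25):
    """Naive R-peak detector: local maxima above an adaptive threshold,
    separated by at least `refractory` seconds."""
    x = np.asarray(ecg, dtype=float)
    thresh = x.mean() + 2.0 * x.std()
    min_gap = int(refractory * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(x) - 1):
        if x[i] > thresh and x[i] >= x[i - 1] and x[i] > x[i + 1] and i - last >= min_gap:
            peaks.append(i)
            last = i
    return np.array(peaks)

def edr_series(ecg, fs):
    """ECG-derived respiration surrogate: the R-peak amplitude series,
    whose slow modulation tracks breathing.  Returns (times, amplitudes)."""
    peaks = r_peak_indices(ecg, fs)
    return peaks / fs, np.asarray(ecg, dtype=float)[peaks]
```

A real implementation would use a robust QRS detector (e.g. Pan-Tompkins) and interpolate the amplitude series to a uniform sampling grid before spectral analysis.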

  5. Application of Box-Behnken design to optimize multi-sorbent solid phase extraction for trace neonicotinoids in water containing high level of matrix substances.

    Science.gov (United States)

    Zhang, Junjie; Wei, Yanli; Li, Huizhen; Zeng, Eddy Y; You, Jing

    2017-08-01

    Extensive use of neonicotinoid insecticides has raised great concerns about their ecological risk. A reliable method to measure trace neonicotinoids in complicated aquatic environments is a prerequisite for assessing their aquatic risk. To effectively remove interfering matrix substances from field water samples before instrumental analysis by HPLC/MS/MS, a multi-sorbent solid phase extraction method was developed using a Box-Behnken design. The optimized method employed 200 mg of HLB/GCB (w/w, 8/2) as the sorbent and 6 mL of 20% acetone in acetonitrile as the elution solution. The method was applied to measuring neonicotinoids over a wide range of concentrations (0.03-100 μg/L) in water containing various amounts of matrix components. The recoveries of acetamiprid, imidacloprid, thiacloprid, and thiamethoxam from the spiked samples ranged from 76.3% to 107%, while clothianidin and dinotefuran had relatively lower recoveries. The recoveries of neonicotinoids in water with various amounts of matrix interfering substances were comparable, and the matrix removal rates were approximately 50%. The method was sensitive, with method detection limits in the range of 1.8-6.8 ng/L for all target neonicotinoids. Finally, the developed method was validated by measuring trace neonicotinoids in natural water.
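For reference, a Box-Behnken response-surface design like the one used in this record can be generated in coded factor units (-1, 0, +1). The sketch below follows the standard construction; the function and its `center_points` parameter are illustrative, not taken from the paper.

```python
from itertools import combinations, product

def box_behnken(k, center_points=3):
    """Box-Behnken design for k (>= 3) factors in coded units:
    for each pair of factors, a 2x2 factorial at +/-1 with the remaining
    factors held at 0, plus replicated centre runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * k for _ in range(center_points)])
    return runs
```

For three factors (e.g. sorbent mass, elution volume, acetone fraction) this yields the classic 15-run design: 12 edge-midpoint runs plus 3 centre runs.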

  6. Progress in evaluation of radionuclide geochemical information developed by DOE high-level nuclear waste repository site projects: Report for April 1986-September 1987

    International Nuclear Information System (INIS)

    Meyer, R.E.; Arnold, W.D.; Blencoe, J.G.; O'Kelley, G.D.; Land, J.F.

    1988-02-01

    Experiments were conducted with tuff from the proposed high-level nuclear waste site at Yucca Mountain, Nevada. Batch sorption ratio determinations were conducted for strontium, cesium, uranium, and technetium onto samples of tuff using real and synthetic groundwater J-13. There were no significant differences in sorption ratios in experiments with real and synthetic groundwater. Columns 1 cm in diameter and about 5 cm long were constructed, and experiments were conducted with the objective of correlating the results of the batch and the column experiments. The characteristics of the columns were tested by determination of elution curves in J-13 containing tritium and technetium as the TcO4- ion. For strontium and cesium, fairly good correlation between values of the sorption ratio obtained by the two methods was observed. Little or no technetium sorption was observed with either method. The elution peaks obtained with neptunium and uranium were asymmetrical and the shapes were often complex, observations which suggest irreversibilities in the sorption reaction. An experiment was performed to provide information on the compositions of the first groundwaters that will contact waste canisters in a tuff-hosted repository after very-near-field temperatures have cooled to below 100 °C. Synthetic groundwater J-13 was slowly dripped onto a slab of tuff maintained at 95-100 °C, and the result was a thin encrustation of solids on the slab as the water evaporated. Fresh J-13 groundwater was then allowed to contact the encrustation in a vessel maintained at 90 °C. The principal result of the experiment was a significant loss of calcium and magnesium from the fresh J-13 groundwater

  7. High level nuclear wastes

    International Nuclear Information System (INIS)

    Lopez Perez, B.

    1987-01-01

    The transformations undergone by nuclear fuels during burn-up in power reactors, for burn-up levels of 33,000 MWd/t, are considered. Graphs and data on the variation of radioactivity with cooling time and on the heat power of the irradiated fuel are presented. Likewise, the fuel cycle in light water reactors is presented and the alternatives for nuclear waste management are discussed. A brief description of the management of spent fuel as a high-level nuclear waste is given, explaining reprocessing and providing data about the fission products and their radioactivities, which must be considered in the vitrification processes. Both alternatives coincide on the final storage of the nuclear waste in deep geological repositories. The countries supporting reprocessing are indicated, and the Spanish programme defined in the Plan Energetico Nacional (PEN) is briefly reviewed. (author) 8 figs., 4 tabs

  8. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  9. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described to deal with the problems of extracting internet information caused by the non-standard and heterogeneous structure of Chinese websites. The designed agent includes three modules, each responsible for a separate stage of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted natural-language information into structured form is also discussed.

  10. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information such as "zidousya no uriage ga koutyou" (sales of cars were good). Cause information is useful for investors in selecting companies to invest in. Our method extracts cause information in the form of causal expressions by using statistical information and initial clue expressions automatically. It can extract causal expressions without predetermined patterns or complex hand-crafted rules, and is expected to be applicable to other tasks that acquire phrases with a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that the new method outperforms the previous one.

  11. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current information extraction programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  12. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Information extraction (IE); text mining; text repositories; knowledge discovery from text. … Evaluation in terms of precision and recall requires extensive experimentation, due to the lack of public tagged corpora.

  13. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  14. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, the information needs to be kept up to date, which can be a costly task if not performed automatically. On the other hand, it may be worthwhile to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  15. Addressing Information Proliferation: Applications of Information Extraction and Text Mining

    Science.gov (United States)

    Li, Jingjing

    2013-01-01

    The advent of the Internet and the ever-increasing capacity of storage media have made it easy to store, deliver, and share enormous volumes of data, leading to a proliferation of information on the Web, in online libraries, on news wires, and almost everywhere in our daily lives. Since our ability to process and absorb this information remains…

  16. Synthesis and characterization of hybrid silicon based complexing materials: extraction of transuranic elements from high level liquid waste; Synthese et caracterisation de gels hybrides de silice a proprietes complexantes: applications a l'extraction des transuraniens des effluents aqueux

    Energy Technology Data Exchange (ETDEWEB)

    Conocar, O

    1999-07-01

    Hybrid organic/inorganic silica compounds with extractive properties have been developed under an enhanced decontamination program for radioactive aqueous nitric acid waste in nuclear facilities. The materials were obtained by the sol-gel process through hydrolysis and polycondensation of complexing organotrialkoxysilanes with the corresponding tetraalkoxysilane. Hybrid silica compounds were initially synthesized and characterized from mono- and bis-silyl precursors with malonamide or ethylenediamine patterns. Solids with different specific areas and pore diameters were obtained depending on the nature of the precursor, its functionality, and its concentration in the tetraalkoxysilane. These compounds were then assessed for use in plutonium and americium extraction. Excellent results (partitioning coefficients and capacities) have been obtained with malonamide hybrid silica; the comparison with silica compounds impregnated or grafted with the same type of organic group is significant in this respect. Much of the improved performance obtained with hybrid silica may be attributed to the large quantity of complexing groups that can be incorporated in these materials. The effect of the solid texture on the extraction performance was also studied. Although the capacity increased with the specific area, little effect was observed on the distribution coefficients (notably for americium), indicating that the most favorable complexation sites are found on the outer surface. Macroporous malonamide hybrid silica compounds were synthesized to study the effects of the pore diameter, but the results have been inconclusive to date because of the unexpected molecular composition of the materials. (author)

  17. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. 
We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We

  18. Progress in evaluation of radionuclide geochemical information developed by DOE high-level nuclear waste repository site projects: Report for April 1986--September 1987

    International Nuclear Information System (INIS)

    Meyer, R.E.; Arnold, W.D.; Blencoe, J.G.; O'Kelley, G.D.; Land, J.F.

    1988-07-01

    During this report period, all experiments were conducted with tuff from the proposed high-level nuclear waste site at Yucca Mountain, Nevada. Batch sorption ratio determinations were conducted for strontium, cesium, uranium, and technetium onto samples of tuff using real and synthetic groundwater J-13. There were no significant differences in sorption ratios in experiments with real and synthetic groundwater. Columns were tested by determination of elution curves in J-13 containing tritium and technetium as the TcO4- ion. For strontium and cesium, fairly good correlation between values of the sorption ratio obtained by the two methods was observed. Little technetium sorption was observed with either method. The elution peaks obtained with neptunium and uranium were asymmetrical and the shapes were often complex, observations which suggest irreversibilities in the sorption reaction. Synthetic groundwater J-13 was slowly dripped onto a slab of tuff maintained at 95-100 °C, and the result was a thin encrustation of solids on the slab as the water evaporated. Fresh J-13 groundwater was then allowed to contact the encrustation in a vessel maintained at 90 °C. The principal result of the experiment was a significant loss of calcium and magnesium from the fresh J-13 groundwater. 13 refs., 25 figs., 9 tabs

  19. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that produce structured representations from unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide the process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables the recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90% of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with f1 = .989 (micro average) and f1 = .963 (macro average). As a result of keyword matching and restraint concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layouts. The developed terminology and the proposed information extraction system allow the extraction of fine-grained information from German semi-structured transthoracic echocardiography reports.

  20. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    In this article we show how it is possible to use Channel Theory (Barwise and Seligman, 1997) for modeling the process of information extraction realized by audiences of audio-visual content. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can undertake the extraction of information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process is carried out for each channel, we propose a method of representing all the informative values an agent can obtain from a content item using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation allows reflecting the evolution of the informative items through the evolution of the audio-visual content.

  1. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information about lanes is very important. This paper proposes a method for automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method first detects lane edges from the grayscale gradient direction and fits them with an improved probabilistic Hough transform; it then uses the vanishing-point principle to calculate the geometric position of the lanes, and uses lane characteristics to extract lane semantic information through decision-tree classification. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
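The vanishing-point step used in this record can be illustrated with homogeneous coordinates: the line through two image points is their cross product, and two fitted lane lines intersect at the cross product of their line vectors. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line (a, b, c) with ax + by + c = 0 through two image points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(lane1, lane2):
    """Intersection of two fitted lane lines, each given as a pair of
    endpoints; returns (x, y), or None for parallel lines."""
    v = np.cross(line_through(*lane1), line_through(*lane2))
    if abs(v[2]) < 1e-9:        # point at infinity: lines are parallel
        return None
    return v[0] / v[2], v[1] / v[2]
```

In practice the endpoints would come from the Hough-transform line fits, and the vanishing point of several lane lines would be estimated robustly (e.g. by least squares over all pairs) rather than from a single intersection.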

  2. Development of geo-information data management system and application to geological disposal of high-level radioactive waste in China

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2017-01-01

    In this paper, based on information technology, a geo-information database was established and a geo-information data management system (named HLW-GIS) was developed to facilitate the management of the multi-source, multidisciplinary data generated during the site selection process for a geological repository in China. Many important functions, such as basic create, retrieve, update, and delete operations, full-text search, and downloads, are provided by this management system, as well as statistics and analysis functions for certain professional data. Finally, a few hundred gigabytes of data from numerous different disciplines were integrated, stored, and managed successfully. The management system can also serve as a significant reference for data management in related research fields, such as the decommissioning and management of nuclear facilities, resource prospecting, and environmental protection.

  3. Image understanding systems based on the unifying representation of perceptual and conceptual information and the solution of mid-level and high-level vision problems

    Science.gov (United States)

    Kuvychko, Igor

    2001-10-01

    Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. A computer vision system based on such principles requires a unifying representation of perceptual and conceptual information. Computer simulation models are built on the basis of graphs/networks. The ability of the human brain to emulate similar graph/network models has been found, which implies an important paradigm shift in our knowledge about the brain: from neural networks to cortical software. Starting from the primary visual areas, the brain analyzes an image as a graph-type spatial structure. Primary areas provide active fusion of image features on a spatial grid-like structure whose nodes are cortical columns. The spatial combination of different neighboring features cannot be described as a statistical/integral characteristic of the analyzed region, but uniquely characterizes that region itself. Spatial logic and topology are naturally present in such structures. Mid-level vision processes such as clustering, perceptual grouping, multilevel hierarchical compression, and separation of figure from ground are special kinds of graph/network transformations. They convert the low-level image structure into a set of more abstract ones, which represent objects and the visual scene, making them easy to analyze with higher-level knowledge structures. Higher-level vision phenomena such as shape from shading and occlusion are results of such analysis. This approach gives the opportunity not only to explain frequently unexplainable results of cognitive science, but also to create intelligent computer vision systems that simulate perceptual processes in both the 'what' and 'where' visual pathways. Such systems can open new horizons for the robotic and computer vision industries.

  4. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

    Information extraction is an early stage of a process of textual data analysis. Information extraction is required to get information from textual data that can be used for analysis processes such as classification and categorization. Textual data are strongly influenced by language. Arabic is gaining significant attention in many studies because the Arabic language is very different from others; in contrast to other languages, tools for and research on the Arabic language are still lacking. The information extracted using the knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert; this takes a long time and is specific to a single problem. This paper proposes a method for automatically building a knowledge dictionary. The knowledge dictionary is formed by classifying sentences having the same concept, assuming that they will have a high similarity value. The concepts that have been extracted can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested using a decision tree classification engine; the highest precision value obtained was 71.0%, and the highest recall value was 75.0%.
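The grouping step described in this record, clustering sentences that share a concept by their similarity, can be sketched with a simple token-overlap (Jaccard) measure. This greedy single-pass version and its `threshold` parameter are illustrative assumptions, not the paper's actual algorithm.

```python
def jaccard(a, b):
    """Word-overlap similarity between two token sequences."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def group_by_concept(sentences, threshold=0.5):
    """Greedy grouping: a sentence joins the first group whose
    representative it resembles above `threshold`, else it starts a new
    group; each group approximates one dictionary concept."""
    groups = []  # list of (representative_tokens, member_sentences)
    for s in sentences:
        toks = s.lower().split()
        for rep, members in groups:
            if jaccard(toks, rep) >= threshold:
                members.append(s)
                break
        else:
            groups.append((toks, [s]))
    return [members for _, members in groups]
```

For Arabic text, tokenization and stemming would matter far more than they do in this toy English example, since rich morphology makes surface-form overlap a weak similarity signal.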

  5. Metal Sounds Stiffer than Drums for Ears, but Not Always for Hands: Low-Level Auditory Features Affect Multisensory Stiffness Perception More than High-Level Categorical Information

    Science.gov (United States)

    Liu, Juan; Ando, Hiroshi

    2016-01-01

    Most real-world events stimulate multiple sensory modalities simultaneously. Usually, the stiffness of an object is perceived haptically. However, auditory signals also contain stiffness-related information, and people can form impressions of stiffness from the different impact sounds of metal, wood, or glass. To understand whether there is any interaction between auditory and haptic stiffness perception, and if so, whether the inferred material category is the most relevant auditory information, we conducted experiments using a force-feedback device and the modal synthesis method to present haptic stimuli and impact sound in accordance with participants’ actions, and to modulate low-level acoustic parameters, i.e., frequency and damping, without changing the inferred material categories of sound sources. We found that metal sounds consistently induced an impression of stiffer surfaces than did drum sounds in the audio-only condition, but participants haptically perceived surfaces with modulated metal sounds as significantly softer than the same surfaces with modulated drum sounds, which directly opposes the impression induced by these sounds alone. This result indicates that, although the inferred material category is strongly associated with audio-only stiffness perception, low-level acoustic parameters, especially damping, are more tightly integrated with haptic signals than the material category is. Frequency played an important role in both audio-only and audio-haptic conditions. Our study provides evidence that auditory information influences stiffness perception differently in unisensory and multisensory tasks. Furthermore, the data demonstrated that sounds with higher frequency and/or shorter decay time tended to be judged as stiffer, and contact sounds of stiff objects had no effect on the haptic perception of soft surfaces. We argue that the intrinsic physical relationship between object stiffness and acoustic parameters may be applied as prior

  6. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  7. High-level-waste immobilization

    International Nuclear Information System (INIS)

    Crandall, J.L.

    1982-01-01

    Analysis of risks, environmental effects, process feasibility, and costs for disposal of immobilized high-level wastes in geologic repositories indicates that the disposal system safety has a low sensitivity to the choice of the waste disposal form

  8. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Biomedical text mining targets the extraction of significant information from biomedical archives. Bio TM encompasses information retrieval (IR) and information extraction (IE). Information retrieval retrieves the relevant biomedical literature documents from repositories such as PubMed and MedLine, based on a search query; the IR process ends with the generation of a corpus of relevant documents retrieved from the publication databases. The IE task includes preprocessing of the documents, named entity recognition (NER), and relationship extraction, and draws on natural language processing, data mining techniques, and machine learning algorithms. The preprocessing task includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins, or cell lines, and leads to the next phase, the extraction of relationships (IE). The work was based on the Conditional Random Field (CRF) machine learning algorithm.
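The preprocessing stage this record lists (tokenization and stop-word removal, ahead of parsing and tagging) can be sketched in a few lines. The tiny stop-word list below is illustrative only; real pipelines use much larger lists and add shallow parsing and part-of-speech tagging before NER.

```python
import re

# Illustrative stop-word list; production systems use hundreds of entries.
STOP_WORDS = {"the", "of", "and", "in", "a", "is", "to", "for"}

def preprocess(text):
    """Minimal IE preprocessing: tokenize on alphanumeric runs,
    lowercase, and drop stop words, yielding candidate terms for NER."""
    tokens = re.findall(r"[A-Za-z0-9-]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]
```

The surviving tokens (e.g. gene or protein names) are what a downstream CRF tagger would label with entity classes.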

  9. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    With the rapid growth of Internet-based recruiting, there are a great number of personal resumes in recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in practice. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information; besides a word index and a punctuation index, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.
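    The first step, block identification, can be illustrated with a toy rule-based segmenter. The header keywords and block names below are hypothetical stand-ins for the paper's learned Writing-Style classifier:

```python
# Hypothetical header keywords; a real system learns block boundaries
# from Writing-Style features rather than a fixed keyword list.
BLOCK_HEADERS = {
    "education": "education",
    "experience": "work_experience",
    "work experience": "work_experience",
    "skills": "skills",
}

def segment_resume(lines):
    """Step 1 of a two-step pipeline: assign each raw resume line to a
    block. Lines before any recognized header go to a 'profile' block."""
    blocks, current = {}, "profile"
    for line in lines:
        key = line.strip().lower().rstrip(":")
        if key in BLOCK_HEADERS:
            current = BLOCK_HEADERS[key]  # start a new block
            continue
        blocks.setdefault(current, []).append(line.strip())
    return blocks
```

    Step 2 would then run attribute classifiers (e.g. degree, employer, date ranges) over each block's lines.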

  10. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that the detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit

  11. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    The traditional environment maps built by mobile robots include both metric and topological maps. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users, who normally rely on the conceptual knowledge or semantic contents of the environment. The construction of semantic maps therefore becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition, and semantic representation methods.

  12. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
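    The scoring scheme described above can be sketched in a few lines of Python. This is a simplified RAKE; the stop-word list is illustrative only:

```python
import re
from collections import defaultdict

STOP_WORDS = {"a", "an", "and", "are", "for", "in", "is", "of", "on", "or", "the", "to"}

def rake(text, stop_words=STOP_WORDS):
    """Simplified Rapid Automatic Keyword Extraction.

    Candidate phrases are maximal runs of non-stop words between
    punctuation; each word is scored deg(w)/freq(w), and a candidate's
    score is the sum of its member word scores."""
    phrases = []
    for fragment in re.split(r"[^\w\s]+", text.lower()):  # split at punctuation
        current = []
        for w in fragment.split():
            if w in stop_words:                            # stop words end a phrase
                if current:
                    phrases.append(tuple(current))
                current = []
            else:
                current.append(w)
        if current:
            phrases.append(tuple(current))

    freq, deg = defaultdict(int), defaultdict(int)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            deg[w] += len(phrase)          # degree counts co-occurrences incl. self
    word_score = {w: deg[w] / freq[w] for w in freq}

    scored = {p: sum(word_score[w] for w in p) for p in set(phrases)}
    return sorted(scored.items(), key=lambda kv: -kv[1])   # highest-scoring first
```

    Extracting the top fraction of this ranked list yields the final keywords.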

  13. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging for a highway surveillance system to maintain detection robustness at all times. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.

  14. High-level fusion of depth and intensity for pedestrian classification

    NARCIS (Netherlands)

    Rohrbach, M.; Enzweiler, M.; Gavrila, D.M.

    2009-01-01

    This paper presents a novel approach to pedestrian classification which involves a high-level fusion of depth and intensity cues. Instead of utilizing depth information only in a pre-processing step, we propose to extract discriminative spatial features (gradient orientation histograms and local

  15. High-level radioactive wastes. Supplement 1

    International Nuclear Information System (INIS)

    McLaren, L.H.

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations

  16. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses.   ·         Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  17. High-Level Radioactive Waste.

    Science.gov (United States)

    Hayden, Howard C.

    1995-01-01

    Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…

  18. High-level radioactive wastes

    International Nuclear Information System (INIS)

    Grissom, M.C.

    1982-10-01

    This bibliography contains 812 citations on high-level radioactive wastes included in the Department of Energy's Energy Data Base from January 1981 through July 1982. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number

  19. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  20. Technetium Chemistry in High-Level Waste

    International Nuclear Information System (INIS)

    Hess, Nancy J.

    2006-01-01

    Tc contamination is found within the DOE complex at those sites whose mission involved extraction of plutonium from irradiated uranium fuel or isotopic enrichment of uranium. At the Hanford Site, chemical separations and extraction processes generated large amounts of high-level and transuranic wastes, which are currently stored in underground High Level Waste (HLW) tanks. However, the chemistry of the HLW in any given tank is greatly complicated by repeated efforts to reduce volume and recover isotopes; these processes ultimately resulted in mixing of waste streams from different processes. As a result, the chemistry and the fate of Tc in HLW tanks are not well understood. This lack of understanding has been made evident in the failed efforts to leach Tc from sludge and to remove Tc from supernatants prior to immobilization. Although recent interest in Tc chemistry has shifted from pretreatment chemistry to waste residuals, both needs are served by a fundamental understanding of Tc chemistry.

  1. RPython high-level synthesis

    Science.gov (United States)

    Cieszewski, Radoslaw; Linczuk, Maciej

    2016-09-01

    The development of FPGA technology and the increasing complexity of applications in recent decades have forced compilers to move to higher abstraction levels. A compiler interprets an algorithmic description of a desired behavior written in a High-Level Language (HLL) and translates it to a Hardware Description Language (HDL). This paper presents an RPython-based High-Level Synthesis (HLS) compiler. The compiler takes configuration parameters and maps an RPython program to VHDL; the VHDL code can then be used to program FPGA chips. Compared with software implementations, FPGAs have the potential to achieve far greater performance by omitting the fetch-decode-execute overhead of General-Purpose Processors (GPPs) and introducing more parallel computation, exploiting many resources at the same time. Creating parallel algorithms for FPGAs in pure HDL is difficult and time consuming; implementation time can be greatly reduced with a High-Level Synthesis compiler. This article describes the design methodologies and tools, the implementation, and first results of the VHDL backend created for the RPython compiler.

  2. 40 CFR 227.30 - High-level radioactive waste.

    Science.gov (United States)

    2010-07-01

    High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste from...

  3. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction, and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
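    The clustering-and-normalization step can be sketched with plain edit distance. The word-alignment pivot and translation probabilities from the paper are omitted here, and the distance threshold is an assumption:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming (two rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def cluster_variants(names, max_dist=2):
    """Greedy single-link clustering of spelling variants; each name is
    normalized to the first (canonical) member of its cluster."""
    clusters = []
    for name in names:
        for cluster in clusters:
            if any(edit_distance(name, m) <= max_dist for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return {m: c[0] for c in clusters for m in c}
```

    For example, `cluster_variants(["Mohammed", "Mohamed", "Muhammad", "Smith"])` maps the first three variants to the canonical form "Mohammed" and leaves "Smith" alone.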

  4. Removing high-level contaminants

    International Nuclear Information System (INIS)

    Wallace, Paula

    2013-01-01

    Using biomimicry, an Australian cleantech innovation making inroads into China's industrial sector offers multiple benefits to miners and processors in Australia. Stephen Shelley, the executive chairman of Creative Water Technology (CWT), was on hand at a recent trade show to explain how his Melbourne company has developed world-class techniques in zero liquid discharge and fractional crystallization of minerals that apply to a wide range of water treatment and recycling applications. “Most existing technologies operate with high-energy distillation, filters or biological processing. CWT's appliance uses a low-temperature thermal distillation process known as adiabatic recovery to desalinate, dewater and/or recycle highly saline and highly contaminated waste water,” said Shelley. The technology has been specifically designed to handle the high levels of contaminant that alternative technologies struggle to process, with proven water quality results for feed water samples with TDS levels over 300,000 ppm converted to clean water with less than 20 ppm. Comparatively, reverse osmosis struggles to process contaminant levels over 70,000 ppm effectively. “CWT is able to reclaim up to 97% clean usable water and up to 100% of the contaminants contained in the feed water,” said Shelley, adding that soluble and insoluble contaminants are separately extracted and dried for sale or re-use. In industrial applications CWT has successfully processed feed water with contaminant levels over 650,000 mg/L without the use of chemicals. “The technology would be suitable for companies in oil exploration and production, mining, smelting, biofuels, textiles and the agricultural and food production sectors,” said Shelley. When compared to a conventional desalination plant, the CWT system is able to capture the value in the brine that most plants discard, not only from the salt but the additional water it contains. “If you recover those two commodities... then you

  5. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information that can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources, there is a vital need to apply advanced computer technologies to support all-source analysis. Techniques of data warehousing, data mining, and data analysis can provide analysts with tools that will expedite extracting usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  6. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.

  7. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are more and more massively employed, the demand driven also by the new norms sanctioning the legal value of digital documents, provided they are stored on supports that are physically unalterable. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine supported to a large degree with evident advantages both in the organization of the work, and in extracting information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  8. High-level radioactive wastes. Supplement 1

    Energy Technology Data Exchange (ETDEWEB)

    McLaren, L.H. (ed.)

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations.

  9. Research of the cost-benefit evaluation for reprocessing research and development and high-level radioactive waste disposal research and development. Establishing R and D scenarios and extracting their effects

    International Nuclear Information System (INIS)

    Sugihara, K; Miura, N; Arii, Y

    2004-02-01

    This report explains the outline of research in FY 2003 on cost-benefit evaluation for Reprocessing R and D and High-Level Radioactive Waste Disposal R and D. We decided to apply the method of cost-benefit analysis, based on the cost-benefit analysis for Fast Reactor cycle system R and D, to Reprocessing R and D and High-Level Radioactive Waste Disposal R and D, and to compare the results of the cost-benefit analysis for the JNC R and D scenario with those for the other optional scenarios. This year, we first laid out all possible future R and D scenarios for Reprocessing R and D and High-Level Radioactive Waste Disposal R and D, and rejected those that were technically difficult or socially impossible; the reasonable R and D scenarios were thus established. We then identified the effects (merits) of carrying out the R and D and categorized them from economic and environmental viewpoints, the viewpoint of nuclear non-proliferation, and so on. (author)

  10. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often presented as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper provides critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be run independently in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms the other systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracted molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  11. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information Extraction, data integration, and uncertain data management are different areas of research that have received vast focus in the last two decades. Many researchers have tackled these areas individually. However, information extraction systems should be integrated with data integration

  12. Timing of High-level Waste Disposal

    International Nuclear Information System (INIS)

    2008-01-01

    This study identifies key factors influencing the timing of high-level waste (HLW) disposal and examines how social acceptability, technical soundness, environmental responsibility and economic feasibility impact on national strategies for HLW management and disposal. Based on case study analyses, it also presents the strategic approaches adopted in a number of national policies to address public concerns and civil society requirements regarding long-term stewardship of high-level radioactive waste. The findings and conclusions of the study confirm the importance of informing all stakeholders and involving them in the decision-making process in order to implement HLW disposal strategies successfully. This study will be of considerable interest to nuclear energy policy makers and analysts as well as to experts in the area of radioactive waste management and disposal. (author)

  13. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS feature extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input; it parses sentences, detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
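    The concept-finder/negation-detector pair can be sketched as follows. The lexicon entries and negation cues below are illustrative placeholders, not the actual BI-RADS lexicon or the authors' semantic grammar:

```python
import re

# Illustrative placeholders; the real BI-RADS lexicon is far larger.
LEXICON = {"mass", "calcification", "asymmetry", "distortion"}
NEGATIONS = {"no", "without", "absent"}

def extract_concepts(report):
    """Find lexicon concepts sentence by sentence and flag a concept as
    negated when a negation cue precedes it in the same sentence."""
    findings = []
    for sentence in re.split(r"[.!?]", report.lower()):  # crude sentence split
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            word = tok.strip(",;")                        # drop trailing punctuation
            if word in LEXICON:
                negated = any(t in NEGATIONS for t in tokens[:i])
                findings.append((word, negated))
    return findings
```

    A production system would bound the negation scope (e.g. a fixed token window) rather than scanning the whole sentence prefix.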

  14. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Hyperspectral data is characterized by multiple continuous bands, large data volumes, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize images of the bottom of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data is preprocessed to obtain the reflectance of the region of interest; for convenience of computation and storage, the original reflectance value is multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphology to connect the symbols and give fifteen reference images. The results show that information extraction based on hyperspectral data yields a better visual experience, which is beneficial to the study of ancient tombs by researchers and provides references for archaeological research findings.

  15. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data is characterized by multiple continuous bands, large data volumes, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize images of the bottom of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data is preprocessed to obtain the reflectance of the region of interest; for convenience of computation and storage, the original reflectance value is multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphology to connect the symbols and give fifteen reference images. The results show that information extraction based on hyperspectral data yields a better visual experience, which is beneficial to the study of ancient tombs by researchers and provides references for archaeological research findings.

  16. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, Propbank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes.

  17. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.]

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  18. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the use of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both leptons produced through a virtual photon and through a real photon are considered. Measurements of the angular correlation between the (K pi) and (e+ e-) decay planes in B --> K*(--> K pi) gamma(*)(--> e+ e-) decays can be used to determine the helicity amplitudes in the radiative B --> K* gamma decays. A large right-handed helicity amplitude in B-bar decays would be a signal of new physics. The time-dependent CP asymmetry in the B0 decay angular correlation is shown to measure sin(2 beta) and cos(2 beta) with little hadronic uncertainty.

  19. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data, previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping is employed, which provides a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector that comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments, and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
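
    The least-squares unfolding of three group fluences from sphere count rates can be sketched as follows; the 4x3 response matrix and the "true" fluences are invented for illustration and are not the cited response curves.

```python
import numpy as np

# Hypothetical response matrix: rows = spheres, cols = (slow, moderate, fast).
R = np.array([
    [0.9, 0.3, 0.1],   # bare detector: most sensitive to slow neutrons
    [0.4, 0.8, 0.3],   # small moderating sphere
    [0.2, 0.5, 0.7],   # medium sphere
    [0.1, 0.3, 0.9],   # large sphere: most sensitive to fast neutrons
])

true_fluence = np.array([2.0, 5.0, 3.0])
counts = R @ true_fluence          # simulated background-stripped count rates

# Least-squares estimate of the three group fluences from the count rates.
fluence, *_ = np.linalg.lstsq(R, counts, rcond=None)
print(np.round(fluence, 3))  # recovers [2. 5. 3.]
```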

  20. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora.

  1. Optimizing High Level Waste Disposal

    International Nuclear Information System (INIS)

    Dirk Gombert

    2005-01-01

    If society is ever to reap the potential benefits of nuclear energy, technologists must close the fuel cycle completely. A closed cycle equates to a continued supply of fuel and safe reactors, but also reliable and comprehensive closure of waste issues. High level waste (HLW) disposal in borosilicate glass (BSG) is based on 1970s-era evaluations. This host matrix is very adaptable to sequestering a wide variety of radionuclides found in raffinates from spent fuel reprocessing. However, it is now known that the current system is far from optimal for disposal of the diverse HLW streams, and proven alternatives are available to reduce costs by billions of dollars. The basis for HLW disposal should be reassessed to consider extensive waste form and process technology research and development efforts, which have been conducted by the United States Department of Energy (USDOE), international agencies, and the private sector. Matching the waste form to the waste chemistry and using currently available technology could increase the waste content in waste forms to 50% or more and double processing rates. Optimization of the HLW disposal system would accelerate HLW disposition and increase repository capacity. This does not necessarily require developing new waste forms; the emphasis should be on qualifying existing matrices to demonstrate protection equal to or better than the baseline glass performance. Nor does this proposed effort necessarily require developing new technology concepts; the emphasis is on demonstrating existing technology that is clearly better (reliability, productivity, cost) than current technology, and justifying its use in future facilities or retrofitted facilities. Higher waste processing and disposal efficiency can be realized by performing the engineering analyses and trade-studies necessary to select the most efficient methods for processing the full spectrum of wastes across the nuclear complex.
This paper will describe technologies being

  2. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there is limited work on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining generic terms and geology terms from geology dictionaries to train the Chinese word segmentation rules of a Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to obtain a corpus of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and selected chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of the key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
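
    The second and third steps of the workflow (stop-word removal, then bigram links between content-words) can be sketched as follows; the tokens and stop-word list are invented examples, and the CRF segmentation step is assumed to have been done already.

```python
from collections import Counter

# Output of a (hypothetical) word segmentation step, here in English for clarity.
tokens = ["the", "granite", "intrusion", "cuts", "the", "ordovician", "limestone",
          "and", "the", "limestone", "hosts", "skarn", "mineralization"]
stop_words = {"the", "and", "cuts", "hosts"}

# Keep content-words, then count adjacent co-occurrences as weighted graph edges.
content = [t for t in tokens if t not in stop_words]
edges = Counter(zip(content, content[1:]))

# Nodes = content-words, edges = bigram counts (input to a chord/bigram graph).
print(edges.most_common(2))
```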

  3. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP observations. Neural network (NN and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
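
    A minimal sketch of a global (whole-record) bias correction, using mean/variance rescaling of the retrieval to the model climatology as a simplified stand-in for the CDF matching actually used in SMAP Level-4 processing; all series below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical time series: model soil moisture and a biased retrieval (m3/m3).
model = 0.25 + 0.04 * rng.standard_normal(1000)
retrieval = 0.30 + 0.06 * rng.standard_normal(1000)   # wet bias, larger spread

# Global mean/std rescaling of the retrieval to the model climatology; by
# construction the rescaled series matches the model's mean and variance.
rescaled = (retrieval - retrieval.mean()) / retrieval.std() * model.std() + model.mean()

print(f"bias before: {retrieval.mean() - model.mean():+.3f}, "
      f"after: {rescaled.mean() - model.mean():+.3f}")
```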

  4. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  5. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that cover the science domain and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science.
Thus, from a to

  6. Testing the reliability of information extracted from ancient zircon

    Science.gov (United States)

    Kielman, Ross; Whitehouse, Martin; Nemchin, Alexander

    2015-04-01

    Studies combining zircon U-Pb chronology, trace element distribution, and O and Hf isotope systematics are a powerful way to gain understanding of the processes shaping Earth's evolution, especially in detrital populations where constraints from the original host are missing. Such studies of the Hadean detrital zircon population abundant in sedimentary rocks in Western Australia have involved analysis of an unusually large number of individual grains, but have also highlighted potential problems with the approach, only apparent when multiple analyses are obtained from individual grains. A common feature of the Hadean as well as many early Archaean zircon populations is their apparent inhomogeneity, which reduces confidence in conclusions based on studies combining chemistry and isotopic characteristics of zircon. In order to test the reliability of information extracted from early Earth zircon, we report results from one of the first in-depth multi-method studies of zircon from a relatively simple early Archean magmatic rock, used as an analogue to ancient detrital zircon. The approach involves making multiple SIMS analyses in individual grains in order to be comparable to the most advanced studies of detrital zircon populations. The investigated sample is a relatively undeformed, non-migmatitic ca. 3.8 Ga tonalite collected a few km south of the Isua Greenstone Belt, southwest Greenland. Extracted zircon grains can be grouped into three categories based on the behavior of their U-Pb systems: (i) grains that show internally consistent and concordant ages and define an average age of 3805±15 Ma, taken to be the age of the rock, (ii) grains that are distributed close to the concordia line, but with significant variability between multiple analyses, suggesting ancient Pb loss, and (iii) grains that have multiple analyses distributed along a discordia pointing towards a zero intercept, indicating geologically recent Pb loss. This overall behavior has

  7. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
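
    A sketch of walking a dose-report content tree for a named numeric value; the nested structure below is a simplified mock (plain dictionaries) of the DICOM structured-report elements that would in practice be read with a DICOM library such as pydicom, and the field names and values are illustrative.

```python
# Minimal mock of a CT dose report's structured content tree.
report = {
    "ContentSequence": [
        {"CodeMeaning": "CT Acquisition", "ContentSequence": [
            {"CodeMeaning": "Mean CTDIvol", "NumericValue": "12.4"},
            {"CodeMeaning": "DLP", "NumericValue": "432.1"},
        ]},
    ],
}

def find_numeric(node, meaning):
    """Depth-first search of the content tree for a named numeric value."""
    for item in node.get("ContentSequence", []):
        if item.get("CodeMeaning") == meaning:
            return float(item["NumericValue"])
        found = find_numeric(item, meaning)
        if found is not None:
            return found
    return None

print(find_numeric(report, "Mean CTDIvol"))  # → 12.4
```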

  8. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  9. The CMS High Level Trigger System

    CERN Document Server

    Afaq, A; Bauer, G; Biery, K; Boyer, V; Branson, J; Brett, A; Cano, E; Carboni, A; Cheung, H; Ciganek, M; Cittolin, S; Dagenhart, W; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Kowalkowski, J; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sexton-Kennedy, E; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition (DAQ) System relies on a purely software driven High Level Trigger (HLT) to reduce the full Level-1 accept rate of 100 kHz to approximately 100 Hz for archiving and later offline analysis. The HLT operates on the full information of events assembled by an event builder collecting detector data from the CMS front-end systems. The HLT software consists of a sequence of reconstruction and filtering modules executed on a farm of O(1000) CPUs built from commodity hardware. This paper presents the architecture of the CMS HLT, which integrates the CMS reconstruction framework in the online environment. The mechanisms to configure, control, and monitor the Filter Farm and the procedures to validate the filtering code within the DAQ environment are described.

  10. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  11. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    textabstractAs today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  12. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  13. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  14. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and drawing high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery, with its rich texture and geometric information, can effectively improve the accuracy of information extraction. It is therefore feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation-of-scale-parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric, and landform features of the image, we establish extraction rules to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.
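
    A rule-based extraction step of this kind can be sketched as follows; the per-pixel features and thresholds are illustrative assumptions, not the paper's rules.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100 * 100

# Hypothetical per-pixel features for a scene: brightness, an NDVI-like
# vegetation index, and terrain slope in degrees (from DEM data).
brightness = rng.uniform(0, 1, n)
ndvi = rng.uniform(-0.2, 0.8, n)
slope = rng.uniform(0, 60, n)

# Rule set in the spirit of the paper: fresh landslide scars tend to be
# bright, sparsely vegetated, and on steep terrain. Thresholds are invented.
landslide = (brightness > 0.6) & (ndvi < 0.2) & (slope > 25)

print(f"flagged pixels: {landslide.sum()} of {n}")
```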

  15. High-level language computer architecture

    CERN Document Server

    Chu, Yaohan

    1975-01-01

    High-Level Language Computer Architecture offers a tutorial on high-level language computer architecture, including von Neumann architecture and syntax-oriented architecture as well as direct and indirect execution architecture. Design concepts of Japanese-language data processing systems are discussed, along with the architecture of stack machines and the SYMBOL computer system. The conceptual design of a direct high-level language processor is also described.Comprised of seven chapters, this book first presents a classification of high-level language computer architecture according to the pr

  16. National high-level waste systems analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy.

  17. National high-level waste systems analysis report

    International Nuclear Information System (INIS)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy

  18. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Full Text Available The quantity of information is the subject of this paper, which takes the specific domain of nonconforming-product management as the information source. The work is a case study: raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for estimating information quantity is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The result of the entropy analysis points out the information that the organisation needs to acquire, presented as a specific knowledge type.
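
    The Shannon entropy estimate at the core of the method can be sketched as follows; the nonconformity categories and counts are invented examples.

```python
import math
from collections import Counter

# Hypothetical nonconformity categories logged for rejected products.
records = ["dimension", "surface", "dimension", "material",
           "dimension", "surface", "weld", "weld"]

counts = Counter(records)
total = sum(counts.values())

# Shannon entropy H = -sum(p_i * log2(p_i)), in bits per record.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(round(entropy, 3))  # → 1.906
```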

  19. Extracting local information from crowds through betting markets

    Science.gov (United States)

    Weijs, Steven

    2015-04-01

    In this research, a set-up is considered in which users can bet against a forecasting agency to challenge their probabilistic forecasts. From an information theory standpoint, a reward structure is considered that either provides the forecasting agency with better information, paying the successful providers of information for their winning bets, or funds excellent forecasting agencies through users that think they know better. Especially for local forecasts, the approach may help to diagnose model biases and to identify local predictive information that can be incorporated in the models. The challenges and opportunities for implementing such a system in practice are also discussed.

  20. Other-than-high-level waste

    International Nuclear Information System (INIS)

    Bray, G.R.

    1976-01-01

    The main emphasis of the work in the area of partitioning transuranic elements from waste has been in the area of high-level liquid waste. But there are ''other-than-high-level wastes'' generated by the back end of the nuclear fuel cycle that are both large in volume and contaminated with significant quantities of transuranic elements. The combined volume of these other wastes is approximately 50 times that of the solidified high-level waste. These other wastes also contain up to 75% of the transuranic elements associated with waste generated by the back end of the fuel cycle. Therefore, any detailed evaluation of partitioning as a viable waste management option must address both high-level wastes and ''other-than-high-level wastes.''

  1. Spoken Language Understanding Systems for Extracting Semantic Information from Speech

    CERN Document Server

    Tur, Gokhan

    2011-01-01

    Spoken language understanding (SLU) is an emerging field in between speech and language processing, investigating human/ machine and human/ human communication by leveraging technologies from signal processing, pattern recognition, machine learning and artificial intelligence. SLU systems are designed to extract the meaning from speech utterances and its applications are vast, from voice search in mobile devices to meeting summarization, attracting interest from both commercial and academic sectors. Both human/machine and human/human communications can benefit from the application of SLU, usin

  2. Patients subject to high levels of coercion: staff's understanding.

    Science.gov (United States)

    Bowers, Len; Wright, Steve; Stewart, Duncan

    2014-05-01

    Measures to keep staff and patients safe (containment) frequently involve coercion. A small proportion of patients is subject to a large proportion of containment use. To reduce the use of containment, we need a better understanding of the circumstances in which it is used and the understandings of patients and staff. Two sweeps were made of all the wards, spread over four hospital sites, in one large London mental health organization to identify patients who had been subject to high levels of containment in the previous two weeks. Data were then extracted from their case notes about their past history, current problem behaviours, and how they were understood by the patients involved and the staff. Nurses and consultant psychiatrists were interviewed to supplement the information from the case records. Twenty-six heterogeneous patients were identified, with many ages, genders, diagnoses, and psychiatric specialities represented. The main problem behaviours giving rise to containment use were violence and self-harm. The roots of the problem behaviours were to be found in severe psychiatric symptoms, cognitive difficulties, personality traits, and the implementation of the internal structure of the ward by staff. Staff's range and depth of understandings was limited and did not include functional analysis, defence mechanisms, specific cognitive assessment, and other potential frameworks. There is a need for more in-depth assessment and understanding of patients' problems, which may lead to additional ways to reduce containment use.

  3. Ocean disposal of high level radioactive waste

    International Nuclear Information System (INIS)

    1983-01-01

    This study confirms, subject to limitations of current knowledge, the engineering feasibility of free fall penetrators for High Level Radioactive Waste disposal in deep ocean seabed sediments. Restricted sediment property information is presently the principal bar to an unqualified statement of feasibility. A 10m minimum embedment and a 500 year engineered barrier waste containment life are identified as appropriate basic penetrator design criteria at this stage. A range of designs are considered in which the length, weight and cross section of the penetrator are varied. Penetrators from 3m to 20m long and 2t to 100t in weight constructed of material types and thicknesses to give a 500 year containment life are evaluated. The report concludes that the greatest degree of confidence is associated with performance predictions for 75 to 200 mm thick soft iron and welded joints. A range of lengths and capacities from a 3m long single waste canister penetrator to a 20m long 12 canister design are identified as meriting further study. Estimated embedment depths for this range of penetrator designs lie between 12m and 90m. Alternative manufacture, transport and launch operations are assessed and recommendations are made. (author)

  4. Sifting Through Chaos: Extracting Information from Unstructured Legal Opinions.

    Science.gov (United States)

    Oliveira, Bruno Miguel; Guimarães, Rui Vasconcellos; Antunes, Luís; Rodrigues, Pedro Pereira

    2018-01-01

    Abiding by the law is, in some cases, a delicate balance between the rights of different players. Re-using health records is such a case. While the law grants reuse rights to public administration documents, which include health records produced in public health institutions, it also grants privacy to personal records. To safeguard correct usage of data, public hospitals in Portugal employ jurists who are responsible for granting or withholding access rights to health records. To aid decision making, these jurists can consult the legal opinions issued by the national committee on public administration document usage. While these legal opinions are of undeniable value due to their doctrinal contribution, they are only available in a format best suited for printing, forcing individual consultation of each document, with no option whatsoever for clustered search, filtering, or indexing, which are standard operations in today's document management systems. When having to decide on tens of data requests a day, it becomes unfeasible to consult the hundreds of legal opinions already available. With the objective of creating a modern document management system, we devised an open, platform-agnostic system that collects the legal opinions, extracts their contents, and produces metadata, allowing fast searching and filtering of said legal opinions.

  5. Handling and storage of conditioned high-level wastes

    International Nuclear Information System (INIS)

    1983-01-01

    This report deals with certain aspects of the management of one of the most important wastes, i.e. the handling and storage of conditioned (immobilized and packaged) high-level waste from the reprocessing of spent nuclear fuel. Although much of the material presented here is based on information concerning high-level waste from reprocessing LWR fuel, the principles, as well as many of the details involved, are applicable to all fuel types. The report provides illustrative background material on the arising and characteristics of high-level wastes and, qualitatively, their requirements for conditioning. The report introduces the principles important in conditioned high-level waste storage and describes the types of equipment and facilities, used or studied, for handling and storage of such waste. Finally, it discusses the safety and economic aspects that are considered in the design and operation of handling and storage facilities.

  6. Handling and storage of conditioned high-level wastes

    International Nuclear Information System (INIS)

    Heafield, W.

    1984-01-01

    This paper deals with certain aspects of the management of one of the most important radioactive wastes arising from the nuclear fuel cycle, i.e. the handling and storage of conditioned high-level wastes. The paper is based on an IAEA report of the same title published during 1983 in the Technical Reports Series. The paper provides illustrative background material on the characteristics of high-level wastes and, qualitatively, their requirements for conditioning. The principles important in the storage of high-level wastes are reviewed in conjunction with the radiological and socio-political considerations involved. Four fundamentally different storage concepts are described with reference to published information and the safety aspects of particular storage concepts are discussed. Finally, overall conclusions are presented which confirm the availability of technology for constructing and operating conditioned high-level waste storage facilities for periods of at least several decades. (author)

  7. Information extraction from FN plots of tungsten microemitters

    Energy Technology Data Exchange (ETDEWEB)

    Mussa, Khalil O. [Department of Physics, Mu'tah University, Al-Karak (Jordan); Mousa, Marwan S., E-mail: mmousa@mutah.edu.jo [Department of Physics, Mu'tah University, Al-Karak (Jordan); Fischer, Andreas, E-mail: andreas.fischer@physik.tu-chemnitz.de [Institut für Physik, Technische Universität Chemnitz, Chemnitz (Germany)

    2013-09-15

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180 °C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with a FEI scanning electron microscope (SEM). Within this work, the experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analysis of FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in
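The FN-plot analysis referred to above can be illustrated with a minimal sketch: for field emission obeying I ∝ V² exp(−b/V), a plot of ln(I/V²) against 1/V is linear, and its slope carries the physical information (work function, field enhancement, tip radius). The constants and synthetic IV data below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Synthetic field-emission data following the elementary FN form
# I = a * V^2 * exp(-b / V); a and b are assumed constants for illustration.
a_true, b_true = 1e-6, 2000.0
V = np.linspace(800.0, 2000.0, 50)        # applied voltage (V)
I = a_true * V**2 * np.exp(-b_true / V)   # emission current (A)

# FN plot coordinates: ln(I/V^2) versus 1/V is a straight line of slope -b.
x = 1.0 / V
y = np.log(I / V**2)
slope, intercept = np.polyfit(x, y, 1)    # least-squares linear fit

b_est = -slope   # recovered exponential constant from the FN-plot slope
```

In practice the slope of a measured FN plot is what the different extraction methods interpret, via models of varying sophistication, in terms of emitter geometry and work function.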

  8. Information extraction from FN plots of tungsten microemitters

    International Nuclear Information System (INIS)

    Mussa, Khalil O.; Mousa, Marwan S.; Fischer, Andreas

    2013-01-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180 °C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with a FEI scanning electron microscope (SEM). Within this work, the experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analysis of FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in

  9. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to capture the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the perception metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from objects.
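The general shape of such a scale-adaptive reduction, scoring each point's importance and keeping progressively more points at finer scales, can be sketched as follows. The importance score used here (distance from the centroid) is a placeholder assumption, not the authors' radial-basis-function construction or their Just-Noticeable-Difference metric.

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.random((200, 3))   # synthetic point cloud, 200 points in 3D

def reduce_by_importance(pts, keep_fraction):
    # Toy importance score: distance from the centroid, standing in for a
    # proper geometric saliency measure computed per scale.
    centroid = pts.mean(axis=0)
    importance = np.linalg.norm(pts - centroid, axis=1)
    k = max(1, round(len(pts) * keep_fraction))
    idx = np.argsort(importance)[-k:]   # keep the k most "important" points
    return pts[idx]

# Coarse-to-fine scales keep progressively more points.
scales = [0.1, 0.3, 0.6]
pyramid = [reduce_by_importance(points, f) for f in scales]
```

A full implementation would replace the score with a per-scale geometric measure and stop refining a region once the perceptual degradation falls below the chosen threshold.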

  10. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

    Full Text Available 3D laser technology is widely used to capture the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the perception metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from objects.

  11. SIGWX Charts - High Level Significant Weather

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — High level significant weather (SIGWX) forecasts are provided for the en-route portion of international flights. NOAA's National Weather Service Aviation Center...

  12. Evaluation of solidified high-level waste forms

    International Nuclear Information System (INIS)

    1981-01-01

    One of the objectives of the IAEA waste management programme is to coordinate and promote development of improved technology for the safe management of radioactive wastes. The Agency accomplished this objective specifically through sponsoring Coordinated Research Programmes on the "Evaluation of Solidified High Level Waste Products" in 1977. The primary objectives of this programme are to review and disseminate information on the properties of solidified high-level waste forms, to provide a mechanism for analysis and comparison of results from different institutes, and to help coordinate future plans and actions. This report is a summary compilation of the key information disseminated at the second meeting of this programme.

  13. Recovering method for high level radioactive material

    International Nuclear Information System (INIS)

    Fukui, Toshiki

    1998-01-01

    Offgas filters, such as those of nuclear fuel reprocessing facilities and waste control facilities, are burnt, the burnt ash is melted by heating, and the molten ash is then brought into contact with a molten metal having a low boiling point to transfer the high-level radioactive materials in the molten ash to the molten metal. Then, only the molten metal is evaporated off, the residue is dried and solidified, and the residual high-level radioactive materials are recovered. According to this method, the high-level radioactive materials in the molten ash are transferred to the molten metal and separated by the difference in the distribution ratios between the molten ash and the molten metal. Subsequently, the molten metal to which the high-level radioactive materials have been transferred is heated to a temperature higher than its boiling point, so that only the molten metal is evaporated off and removed, and the residual high-level radioactive materials are easily recovered. On the other hand, the molten ash from which the high-level radioactive materials have been removed can be discarded as ordinary industrial waste. (T.M.)

  14. Information extraction from FN plots of tungsten microemitters.

    Science.gov (United States)

    Mussa, Khalil O; Mousa, Marwan S; Fischer, Andreas

    2013-09-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite microemitters considered here consist of a tungsten core coated with different dielectric materials, such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current-voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)-screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180 °C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed with a FEI scanning electron microscope (SEM). Within this work, the experimental results are connected to the theory for analyzing Fowler-Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analysis of FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in particular

  15. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the concept of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. The models for weak information extraction are realized on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. The research results prove that prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  16. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  17. Extracting information of fixational eye movements through pupil tracking

    Science.gov (United States)

    Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng

    2018-01-01

    Human eyes are never completely static, even when fixating on a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades and drifts, prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has been experimentally demonstrated recently. However, the characteristics of fixational eye movements and their roles in the visual process have not been explained clearly, because until now these signals could hardly be extracted completely. In this paper, we developed a new eye movement detection device with a high-speed camera. This device includes a beam splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, experiments on pupil tracking were conducted. By localizing the pupil center and performing spectrum analysis, the envelope frequency spectra of micro-saccades, micro-tremors and drifts are shown clearly. The experimental results show that the device is feasible and effective, so that it can be applied in further characteristic analysis.

  18. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  19. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  20. High-level radioactive waste management

    International Nuclear Information System (INIS)

    Schneider, K.J.; Liikala, R.C.

    1974-01-01

    High-level radioactive waste in the U.S. will be converted to an encapsulated solid and shipped to a Federal repository for retrievable storage for extended periods. Meanwhile the development of concepts for ultimate disposal of the waste which the Federal Government would manage is being actively pursued. A number of promising concepts have been proposed, for which there is high confidence that one or more will be suitable for long-term, ultimate disposal. Initial evaluations of technical (or theoretical) feasibility for the various waste disposal concepts show that in the broad category, (i.e., geologic, seabed, ice sheet, extraterrestrial, and transmutation) all meet the criteria for judging feasibility, though a few alternatives within these categories do not. Preliminary cost estimates show that, although many millions of dollars may be required, the cost for even the most exotic concepts is small relative to the total cost of electric power generation. For example, the cost estimates for terrestrial disposal concepts are less than 1 percent of the total generating costs. The cost for actinide transmutation is estimated at around 1 percent of generation costs, while actinide element disposal in space is less than 5 percent of generating costs. Thus neither technical feasibility nor cost seems to be a no-go factor in selecting a waste management system. The seabed, ice sheet, and space disposal concepts face international policy constraints. The information being developed currently in safety, environmental concern, and public response will be important factors in determining which concepts appear most promising for further development

  1. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing thematic information depends on this extraction. On the basis of WorldView-2 high-resolution data, an optimal segmentation parameter method for object-oriented image segmentation and high-resolution image information extraction was studied through the following processes. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and a combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
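The idea of choosing a segmentation scale by balancing within-segment homogeneity against between-segment heterogeneity can be illustrated on a 1-D toy signal. The score below is a simplified stand-in for the improved weighted mean-variance method, not the authors' actual formula.

```python
import numpy as np

# Toy 1-D "image": three homogeneous regions of 50 samples each,
# with means 0.0, 1.0 and 2.0 plus small noise.
rng = np.random.default_rng(1)
signal = np.concatenate([rng.normal(m, 0.1, 50) for m in (0.0, 1.0, 2.0)])

def score(n_segments):
    # Split the signal into equal segments at the candidate "scale".
    segs = np.array_split(signal, n_segments)
    within = np.mean([s.var() for s in segs])    # homogeneity: low is good
    between = np.var([s.mean() for s in segs])   # heterogeneity: high is good
    return within / (between + 1e-9)             # lower score = better scale

# Pick the candidate scale that minimizes the homogeneity/heterogeneity ratio.
best = min(range(2, 10), key=score)
```

On real imagery the candidate scales parameterize a region-merging segmentation rather than equal splits, and the shape and compactness factors enter the heterogeneity term.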

  2. Production and properties of solidified high-level waste

    International Nuclear Information System (INIS)

    Brodersen, K.

    1980-08-01

    Available information on production and properties of solidified high-level waste are presented. The review includes literature up to the end of 1979. The feasibility of production of various types of solidified high-level wast is investigated. The main emphasis is on borosilicate glass but other options are also mentioned. The expected long-term behaviour of the materials are discussed on the basis of available results from laboratory experiments. Examples of the use of the information in safety analysis of disposal in salt formations are given. The work has been made on behalf of the Danish utilities investigation of the possibilities of disposal of high-level waste in salt domes in Jutland. (author)

  3. Overview of ImageCLEF 2017: information extraction from images

    OpenAIRE

    Ionescu, Bogdan; Müller, Henning; Villegas, Mauricio; Arenas, Helbert; Boato, Giulia; Dang Nguyen, Duc Tien; Dicente Cid, Yashin; Eickhoff, Carsten; Seco de Herrera, Alba G.; Gurrin, Cathal; Islam, Bayzidul; Kovalev, Vassili; Liauchuk, Vitali; Mothe, Josiane; Piras, Luca

    2017-01-01

    This paper presents an overview of the ImageCLEF 2017 evaluation campaign, an event that was organized as part of the CLEF (Conference and Labs of the Evaluation Forum) labs 2017. ImageCLEF is an ongoing initiative (started in 2003) that promotes the evaluation of technologies for annotation, indexing and retrieval for providing information access to collections of images in various usage scenarios and domains. In 2017, the 15th edition of ImageCLEF, three main tasks were proposed and one pil...

  4. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
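The core statistical-retrieval idea, training a network against model soil moisture so that retrievals inherit the model climatology, can be sketched with a toy one-hidden-layer network in plain NumPy. The synthetic Tb/soil-moisture relation and all network settings are assumptions for illustration, not the SMAP Level 4 configuration.

```python
import numpy as np

# Synthetic training data: brightness temperature (Tb, K) and a model-like
# soil moisture target (assumed linear relation plus noise, for illustration).
rng = np.random.default_rng(42)
tb = rng.uniform(220.0, 290.0, size=(500, 1))
sm = 0.5 - 0.004 * (tb - 220.0) + rng.normal(0.0, 0.01, tb.shape)

x = (tb - tb.mean()) / tb.std()   # normalize the input feature

# One hidden layer of 8 tanh units, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

initial_loss = float(np.mean((forward(x)[1] - sm) ** 2))
lr = 0.05
for _ in range(2000):
    h, pred = forward(x)
    err = pred - sm                              # (N, 1) residuals
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)             # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

final_loss = float(np.mean((forward(x)[1] - sm) ** 2))
```

Because the target comes from the model, the trained retrieval produces estimates in the model climatology, which is what reduces the need for bias correction before assimilation.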

  5. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot extract structured emergency event information accurately; general information retrieval tools cannot completely identify emergency geographic information; nor do these approaches provide an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM can extract structured emergency information and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and the shortest path algorithm and allows toponym pieces to be joined into a full address. AEMEI analyzes the results of the emergency event extraction and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of schedule for disaster defense and mitigation. The technology decreases the number of casualties and the property damage in the country and the world. This is of great significance to the state and society.

  6. [Extraction of management information from the national quality assurance program].

    Science.gov (United States)

    Stausberg, Jürgen; Bartels, Claus; Bobrowski, Christoph

    2007-07-15

    Starting from clinically motivated projects, the national quality assurance program has established a legally obligatory framework. Annual feedback of results is an important means of quality control. The annual reports cover quality-related information with high granularity; a synopsis for corporate management is missing, however. Therefore, the results of the University Clinics in Greifswald, Germany, have been analyzed and aggregated to support hospital management. Strengths were identified by the ranking of results within the state for each quality indicator, weaknesses by comparison with national reference values. The assessment was aggregated per clinical discipline and per category (indication, process, and outcome). A composite of quality indicators has been demanded multiple times, but a coherent concept is still missing. The method presented establishes a plausible summary of the strengths and weaknesses of a hospital from the point of view of the national quality assurance program. Nevertheless, further adaptation of the program is needed to better assist corporate management.

  7. EAP high-level product architecture

    DEFF Research Database (Denmark)

    Guðlaugsson, Tómas Vignir; Mortensen, Niels Henrik; Sarban, Rahimullah

    2013-01-01

    EAP technology has the potential to be used in a wide range of applications. This poses the challenge to the EAP component manufacturers to develop components for a wide variety of products. Danfoss Polypower A/S is developing an EAP technology platform, which can form the basis for a variety… of EAP technology products while keeping complexity under control. High level product architecture has been developed for the mechanical part of EAP transducers, as the foundation for platform development. A generic description of an EAP transducer forms the core of the high level product architecture… the function of the EAP transducers to be changed, by basing the EAP transducers on a different combination of organ alternatives. A model providing an overview of the high level product architecture has been developed to support daily development and cooperation across development teams. The platform approach

  8. Extracting of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with the phonetic and lexical-morphological language means participating in the process of extracting implicit information in English-language advertising texts for men and women. The functioning of the phonetic means of the English language is not the basis for the implication of information in advertising texts. Lexical and morphological means act as markers of relevant information, serving as activators of implicit information in the texts of advertising.

  9. Answers to your questions on high-level nuclear waste

    International Nuclear Information System (INIS)

    1987-11-01

    This booklet contains answers to frequently asked questions about high-level nuclear wastes. Written for the layperson, the document contains basic information on the hazards of radiation, the Nuclear Waste Management Program, the proposed geologic repository, the proposed monitored retrievable storage facility, risk assessment, and public participation in the program

  10. High-Level waste process and product data annotated bibliography

    International Nuclear Information System (INIS)

    Stegen, G.E.

    1996-01-01

    The objective of this document is to provide information on available issued documents that will assist interested parties in finding available data on high-level waste and transuranic waste feed compositions, properties, behavior in candidate processing operations, and behavior on candidate product glasses made from those wastes. This initial compilation is only a partial list of available references

  11. High-level manpower movement and Japan's foreign aid.

    Science.gov (United States)

    Furuya, K

    1992-01-01

    "Japan's technical assistance programs to Asian countries are summarized. Movements of high-level manpower accompanying direct foreign investments by private enterprise are also reviewed. Proposals for increased human resources development include education and training of foreigners in Japan as well as the training of Japanese aid experts and the development of networks for information exchange." excerpt

  12. High-Level Application Framework for LCLS

    Energy Technology Data Exchange (ETDEWEB)

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  13. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Full Text Available Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we therefore present an approach for post-processing deep Web query results based on a domain ontology, which can exploit these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks relevant to a specific domain after reducing noisy nodes. A feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which not only removes the dependence on Web page structures when extracting data information, but also makes use of the semantic meanings of the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.
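    The vector space model mentioned in the abstract can be illustrated with a minimal sketch: term-frequency vectors compared by cosine similarity (the function names and toy data here are illustrative, not from the paper).

```python
import math
from collections import Counter

def tf_vector(tokens):
    """Term-frequency vector for a token list (a basic VSM representation)."""
    return Counter(tokens)

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse TF vectors."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

block = tf_vector("domain ontology of books".split())
record = tf_vector("books ontology".split())
print(round(cosine_similarity(block, record), 3))  # 0.707
```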

  14. The management of high-level radioactive wastes

    International Nuclear Information System (INIS)

    Lennemann, Wm.L.

    1979-01-01

    The definition of high-level radioactive wastes is given. The following aspects of high-level radioactive wastes' management are discussed: fuel reprocessing and high-level waste; storage of high-level liquid waste; solidification of high-level waste; interim storage of solidified high-level waste; disposal of high-level waste; disposal of irradiated fuel elements as a waste

  15. PAIRWISE BLENDING OF HIGH LEVEL WASTE

    International Nuclear Information System (INIS)

    CERTA, P.J.

    2006-01-01

    The primary objective of this study is to demonstrate a mission scenario that uses pairwise and incidental blending of high level waste (HLW) to reduce the total mass of HLW glass. Secondary objectives include understanding how recent refinements to the tank waste inventory and solubility assumptions affect the mass of HLW glass and how logistical constraints may affect the efficacy of HLW blending

  16. Materials for high-level waste containment

    International Nuclear Information System (INIS)

    Marsh, G.P.

    1982-01-01

    The function of the high-level radioactive waste container in storage and of a container/overpack combination in disposal is considered. The consequent properties required from potential fabrication materials are discussed. The strategy adopted in selecting containment materials and the experimental programme underway to evaluate them are described. (U.K.)

  17. Current high-level waste solidification technology

    International Nuclear Information System (INIS)

    Bonner, W.F.; Ross, W.A.

    1976-01-01

    Technology has been developed in the U.S. and abroad for solidification of high-level waste from nuclear power production. Several processes have been demonstrated with actual radioactive waste and are now being prepared for use in the commercial nuclear industry. Conversion of the waste to a glass form is favored because of its high degree of nondispersibility and safety

  18. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed-building information, due to its extreme versatility and almost all-weather, day-and-night working capability. In view of the fact that the inherent statistical distribution of speckle in SAR images has not been used to extract collapsed-building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but also can be applied to extract collapsed-building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyse the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for data selection in collapsed-building information extraction.
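    The paper's G0 texture parameter is estimated from image statistics; as a hedged stand-in, the squared coefficient of variation is a classic moment-based SAR texture measure that is low for homogeneous areas and high for heterogeneous (e.g. collapsed) ones. The synthetic data below are illustrative, not from the study.

```python
import numpy as np

def coeff_of_variation(patch):
    """Squared coefficient of variation: a simple moment-based texture
    measure for SAR intensity (about 1.0 for pure single-look speckle,
    larger where the underlying scene adds texture)."""
    m = patch.mean()
    return patch.var() / (m * m) if m > 0 else 0.0

rng = np.random.default_rng(0)
homogeneous = rng.exponential(scale=1.0, size=(64, 64))          # speckle only
heterogeneous = homogeneous * rng.uniform(0.1, 4.0, (64, 64))    # added scene texture
print(coeff_of_variation(homogeneous) < coeff_of_variation(heterogeneous))
```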

  19. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  20. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptology key. Estimations of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. The parameters of the corresponding wavelet transforms were then optimized to obtain the highest possible entropy and lower mutual information in the transform domain, which sets the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics without compromising authentication properties. [Projekat Ministarstva nauke Republike Srbije, br. TR32054 i br. III44006
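    The entropy estimation underlying this kind of analysis can be sketched minimally: the empirical Shannon entropy of a binary (iris-code-like) sequence, with balanced sequences approaching 1 bit per symbol. The toy sequences are illustrative only.

```python
import math
from collections import Counter

def shannon_entropy(bits):
    """Empirical Shannon entropy (bits per symbol) of a symbol sequence."""
    counts = Counter(bits)
    n = len(bits)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(shannon_entropy("0101010101"))            # balanced bits -> 1.0
print(round(shannon_entropy("0001000100"), 3))  # biased bits -> 0.722
```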

  1. FPGA based compute nodes for high level triggering in PANDA

    International Nuclear Information System (INIS)

    Kuehn, W; Gilardi, C; Kirschner, D; Lang, J; Lange, S; Liu, M; Perez, T; Yang, S; Schmitt, L; Jin, D; Li, L; Liu, Z; Lu, Y; Wang, Q; Wei, S; Xu, H; Zhao, D; Korcyl, K; Otwinowski, J T; Salabura, P

    2008-01-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10^7/s and data rates of several 100 Gb/s. FPGA-based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high-level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high-level C-like hardware description languages will be considered to implement the firmware

  2. Disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    Glasby, G.P.

    1977-01-01

    Although controversy surrounding the possible introduction of nuclear power into New Zealand has raised many points including radiation hazards, reactor safety, capital costs, sources of uranium and earthquake risks on the one hand versus energy conservation and alternative sources of energy on the other, one problem remains paramount and is of global significance - the storage and dumping of the high-level radioactive wastes of the reactor core. The generation of abundant supplies of energy now in return for the storage of these long-lived highly radioactive wastes has been dubbed the so-called Faustian bargain. This article discusses the growth of the nuclear industry and its implications to high-level waste disposal particularly in the deep-sea bed. (auth.)

  3. High-Level Waste Melter Study Report

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Joseph M.; Bickford, Dennis F.; Day, Delbert E.; Kim, Dong-Sang; Lambert, Steven L.; Marra, Sharon L.; Peeler, David K.; Strachan, Denis M.; Triplett, Mark B.; Vienna, John D.; Wittman, Richard S.

    2001-07-13

    At the Hanford Site in Richland, Washington, the path to site cleanup involves vitrification of the majority of the wastes that currently reside in large underground tanks. A Joule-heated glass melter is the equipment of choice for vitrifying the high-level fraction of these wastes. Even though this technology has general national and international acceptance, opportunities may exist to improve or change the technology to reduce the enormous cost of accomplishing the mission of site cleanup. Consequently, the U.S. Department of Energy requested the staff of the Tanks Focus Area to review immobilization technologies, waste forms, and modifications to requirements for solidification of the high-level waste fraction at Hanford to determine what aspects could affect cost reductions with reasonable long-term risk. The results of this study are summarized in this report.

  4. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and present our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognition of clinical events and of temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high-performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
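    Normalizing a relative time expression means resolving it against an anchor date (e.g. the admission date). A minimal rule-based sketch of the idea follows; the two rules shown are illustrative, and MedTime's actual rule set is far larger.

```python
import re
from datetime import date, timedelta

def normalize_relative(expr, anchor):
    """Resolve expressions like '3 days ago' or '2 weeks later'
    against an anchor date; returns an ISO date string or None."""
    m = re.fullmatch(r"(\d+) (day|week)s? (ago|later)", expr.lower())
    if not m:
        return None
    n, unit, direction = int(m.group(1)), m.group(2), m.group(3)
    delta = timedelta(days=n * (7 if unit == "week" else 1))
    resolved = anchor - delta if direction == "ago" else anchor + delta
    return resolved.isoformat()

print(normalize_relative("3 days ago", date(2012, 6, 15)))     # 2012-06-12
print(normalize_relative("2 weeks later", date(2012, 6, 15)))  # 2012-06-29
```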

  5. Decommissioning high-level waste surface facilities

    International Nuclear Information System (INIS)

    1978-04-01

    The protective storage, entombment and dismantlement options for decommissioning a High-Level Waste Surface Facility (HLWSF) were investigated. A reference conceptual design for the facility was developed based on the designs of similar facilities. State-of-the-art decommissioning technologies were identified. Program plans and cost estimates for decommissioning the reference conceptual designs were developed. Good engineering design concepts were identified on the basis of this work

  6. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in the application of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as ignoring spectral information and the trade-off between extraction rate and extraction accuracy. The purpose of this research is to develop an effective method to detect building information in Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image, and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to get the candidate building objects. Furthermore, in order to refine building objects and further remove false objects, post-processing (e.g., shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), overall accuracy (OA) and Kappa are used for the final evaluation. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%; at the same time, Kappa increases by 16.09%. In experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection

  7. High-level waste processing and disposal

    International Nuclear Information System (INIS)

    Crandall, J.L.; Krause, H.; Sombret, C.; Uematsu, K.

    1984-01-01

    The national high-level waste disposal plans for France, the Federal Republic of Germany, Japan, and the United States are covered. Three conclusions are reached. The first conclusion is that an excellent technology already exists for high-level waste disposal. With appropriate packaging, spent fuel seems to be an acceptable waste form. Borosilicate glass reprocessing waste forms are well understood, in production in France, and scheduled for production in the next few years in a number of other countries. For final disposal, a number of candidate geological repository sites have been identified and several demonstration sites opened. The second conclusion is that adequate financing and a legal basis for waste disposal are in place in most countries. Costs of high-level waste disposal will probably add about 5 to 10% to the costs of nuclear electric power. The third conclusion is less optimistic. Political problems remain formidable in highly conservative regulations, in qualifying a final disposal site, and in securing acceptable transport routes

  8. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm is applied for edge detection of a remotely sensed image based on this principle. The target image is divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy is created, and a series of features (water bodies, vegetation, roads, residential areas, bare land) and other information are extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.
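    The accuracy figures quoted in such studies come from a confusion matrix; the standard computations of overall accuracy and Cohen's kappa can be sketched as follows (the 2-class matrix is hypothetical, not the study's data).

```python
import numpy as np

def overall_accuracy(cm):
    """Overall accuracy from a confusion matrix (rows: reference, cols: mapped)."""
    return np.trace(cm) / cm.sum()

def kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = cm.sum()
    po = np.trace(cm) / n
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / (n * n)
    return (po - pe) / (1 - pe)

cm = np.array([[50, 2],
               [3, 45]])  # hypothetical 2-class result
print(overall_accuracy(cm))   # 0.95
print(round(kappa(cm), 3))
```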

  9. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  10. Handbook of high-level radioactive waste transportation

    International Nuclear Information System (INIS)

    Sattler, L.R.

    1992-10-01

    The High-Level Radioactive Waste Transportation Handbook serves as a reference to which state officials and members of the general public may turn for information on radioactive waste transportation and on the federal government's system for transporting this waste under the Civilian Radioactive Waste Management Program. The Handbook condenses and updates information contained in the Midwestern High-Level Radioactive Waste Transportation Primer. It is intended primarily to assist legislators who, in the future, may be called upon to enact legislation pertaining to the transportation of radioactive waste through their jurisdictions. The Handbook is divided into two sections. The first section places the federal government's program for transporting radioactive waste in context. It provides background information on nuclear waste production in the United States and traces the emergence of federal policy for disposing of radioactive waste. The second section covers the history of radioactive waste transportation; summarizes major pieces of legislation pertaining to the transportation of radioactive waste; and provides an overview of the radioactive waste transportation program developed by the US Department of Energy (DOE). To supplement this information, a summary of pertinent federal and state legislation and a glossary of terms are included as appendices, as is a list of publications produced by the Midwestern Office of The Council of State Governments (CSG-MW) as part of the Midwestern High-Level Radioactive Waste Transportation Project

  11. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  12. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. 
Although the scope of our annotation is currently limited to a single
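    The conditional-random-field solutions that performed best in this work depend on a rich per-token feature set. A hedged sketch of the kind of feature extractor fed to a linear-chain CRF follows (the feature names and example sentence are illustrative, not from PhenoCHF).

```python
def token_features(tokens, i):
    """Feature dict for token i, of the kind fed to a linear-chain CRF
    (a small illustrative subset of a 'rich feature set')."""
    w = tokens[i]
    return {
        "word.lower": w.lower(),
        "word.isupper": w.isupper(),
        "word.istitle": w.istitle(),
        "word.isdigit": w.isdigit(),
        "suffix3": w[-3:],
        "prev.lower": tokens[i - 1].lower() if i > 0 else "<BOS>",
        "next.lower": tokens[i + 1].lower() if i < len(tokens) - 1 else "<EOS>",
    }

sent = "Patient denies shortness of breath".split()
print(token_features(sent, 2)["prev.lower"])  # denies
```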

  13. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process in producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some Terrestrial Laser Scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape at Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points that fall within the corresponding preset spectral thresholds are identified as belonging to that specific feature class. This terrain extraction process was implemented in Matlab code developed for the study. Results demonstrate that a passive image with higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
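    The core operation, classifying coloured points by preset spectral thresholds, reduces to a boolean mask over the RGB columns of the point array. A minimal sketch (the coordinates, colours and threshold values are invented for illustration; real thresholds come from the sampled training classes):

```python
import numpy as np

# Hypothetical coloured point cloud: columns x, y, z, r, g, b
points = np.array([
    [1.0, 2.0, 0.1, 110, 80, 60],   # brownish soil -> terrain
    [1.2, 2.1, 0.2, 105, 85, 55],   # terrain
    [1.1, 2.3, 1.5,  40, 120, 45],  # green vegetation -> non-terrain
])

# Preset spectral thresholds for the "terrain" class (illustrative values)
r_ok = (points[:, 3] > 90) & (points[:, 3] < 130)
g_ok = (points[:, 4] > 60) & (points[:, 4] < 100)
terrain = points[r_ok & g_ok]
print(len(terrain))  # 2
```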

  14. Cermets for high level waste containment

    International Nuclear Information System (INIS)

    Aaron, W.S.; Quinby, T.C.; Kobisk, E.H.

    1978-01-01

    Cermet materials are currently under investigation as an alternate for the primary containment of high level wastes. The cermet in this study is an iron--nickel base metal matrix containing uniformly dispersed, micron-size fission product oxides, aluminosilicates, and titanates. Cermets possess high thermal conductivity, and typical waste loading of 70 wt % with volume reduction factors of 2 to 200 and low processing volatility losses have been realized. Preliminary leach studies indicate a leach resistance comparable to other candidate waste forms; however, more quantitative data are required. Actual waste studies have begun on NFS Acid Thorex, SRP dried sludge and fresh, unneutralized SRP process wastes

  15. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python-based compiler.
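    The core idea, translating Python syntax into VHDL, can be sketched in miniature: parse a single bitwise assignment with Python's `ast` module and emit the corresponding VHDL signal assignment. This toy fragment is the author's idea only in spirit; the actual compiler handles far more than this.

```python
import ast

def assign_to_vhdl(src):
    """Map a single Python assignment like 'c = a & b' to a VHDL
    signal assignment (a toy fragment of what an HLS compiler does)."""
    ops = {ast.BitAnd: "and", ast.BitOr: "or", ast.BitXor: "xor"}
    stmt = ast.parse(src).body[0]          # ast.Assign node
    target = stmt.targets[0].id
    expr = stmt.value                      # ast.BinOp node
    return f"{target} <= {expr.left.id} {ops[type(expr.op)]} {expr.right.id};"

print(assign_to_vhdl("c = a & b"))  # c <= a and b;
```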

  16. The CMS High-Level Trigger

    International Nuclear Information System (INIS)

    Covarelli, R.

    2009-01-01

    At the startup of the LHC, the CMS data acquisition is expected to be able to sustain an event readout rate of up to 100 kHz from the Level-1 trigger. These events will be read into a large processor farm which will run the 'High-Level Trigger' (HLT) selection algorithms and will output a rate of about 150 Hz for permanent data storage. In this report HLT performances are shown for selections based on muons, electrons, photons, jets, missing transverse energy, τ leptons and b quarks: expected efficiencies, background rates and CPU time consumption are reported as well as relaxation criteria foreseen for a LHC startup instantaneous luminosity.

  17. The CMS High-Level Trigger

    CERN Document Server

    Covarelli, Roberto

    2009-01-01

    At the startup of the LHC, the CMS data acquisition is expected to be able to sustain an event readout rate of up to 100 kHz from the Level-1 trigger. These events will be read into a large processor farm which will run the "High-Level Trigger" (HLT) selection algorithms and will output a rate of about 150 Hz for permanent data storage. In this report HLT performances are shown for selections based on muons, electrons, photons, jets, missing transverse energy, tau leptons and b quarks: expected efficiencies, background rates and CPU time consumption are reported as well as relaxation criteria foreseen for a LHC startup instantaneous luminosity.

  18. The CMS High-Level Trigger

    Science.gov (United States)

    Covarelli, R.

    2009-12-01

    At the startup of the LHC, the CMS data acquisition is expected to be able to sustain an event readout rate of up to 100 kHz from the Level-1 trigger. These events will be read into a large processor farm which will run the "High-Level Trigger" (HLT) selection algorithms and will output a rate of about 150 Hz for permanent data storage. In this report HLT performances are shown for selections based on muons, electrons, photons, jets, missing transverse energy, τ leptons and b quarks: expected efficiencies, background rates and CPU time consumption are reported as well as relaxation criteria foreseen for a LHC startup instantaneous luminosity.

  19. Service Oriented Architecture for High Level Applications

    International Nuclear Information System (INIS)

    Chu, P.

    2012-01-01

    Standalone high level applications often suffer from poor performance and reliability due to lengthy initialization, heavy computation and rapid graphical update. Service-oriented architecture (SOA) is trying to separate the initialization and computation from applications and to distribute such work to various service providers. Heavy computation such as beam tracking will be done periodically on a dedicated server and data will be available to client applications at all time. Industrial standard service architecture can help to improve the performance, reliability and maintainability of the service. Robustness will also be improved by reducing the complexity of individual client applications.

  20. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e., patients with diabetes, need access to various online resources (in a foreign and/or native language) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or document indexing. Specific terminology lists represent an intermediate step between free-text search and a controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, is conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a medical professional, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods on English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
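    Statistically-based terminology extraction of the kind evaluated here can be sketched minimally as frequent multiword candidates filtered by a stopword list; this is a hedged stand-in for the log-likelihood or C-value measures typically used, and the text and stopword list are invented for illustration.

```python
import re
from collections import Counter

STOP = {"the", "of", "and", "to", "a", "an", "in", "is", "for", "with"}

def term_candidates(text, min_freq=2):
    """Candidate terms: bigrams of non-stopwords occurring at least
    min_freq times (a minimal statistical extraction criterion)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    bigrams = Counter(
        (a, b) for a, b in zip(tokens, tokens[1:])
        if a not in STOP and b not in STOP
    )
    return [" ".join(bg) for bg, c in bigrams.most_common() if c >= min_freq]

text = ("Insulin pump settings matter. An insulin pump delivers basal "
        "insulin. Check insulin pump alarms daily.")
print(term_candidates(text))  # ['insulin pump']
```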

  1. The immobilization of High Level Waste Into Glass

    International Nuclear Information System (INIS)

    Aisyah; Martono, H.

    1998-01-01

    High-level liquid waste is generated from the first extraction step in nuclear fuel reprocessing. The waste is immobilized with borosilicate glass. A certain glass composition is needed for a certain type of waste, so that the properties of the waste glass meet the requirements both for further processing and for disposal. The effect of waste loading on density, thermal expansion, softening point and leaching rate has been studied. The composition of the high-level liquid waste was determined with ORIGEN 2, and the result was used to prepare simulated high-level waste. The waste loading in the waste glass was set to 19.48, 22.32, 25.27 and 26.59 weight percent. The results show that increasing the waste loading results in higher density with no significant change in thermal expansion or softening point. Increasing the waste loading also increases the leaching rate. The properties of the waste glass in this research have not shown any deviation from the standard waste glass properties

  2. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    Full Text Available For the established telenanomanipulation system, methods of extracting location information and strategies for probe operation were studied in this paper. First, machine learning algorithms from OpenCV were used to extract location information from SEM images, so that nanowires and the probe in SEM images can be automatically tracked and the region of interest (ROI) marked quickly. The locations of the nanowire and the probe can then be extracted from the ROI. To study the probe operation strategy, the Van der Waals force between the probe and a nanowire was computed, yielding the relevant operating parameters. With these operating parameters, the nanowire can be preoperated in a 3D virtual environment and an optimal path for the probe obtained. The actual probe then runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and the results show that the designed methods achieve the expected effect.
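The location-extraction step can be illustrated with template matching, one common OpenCV-style technique; the pure-NumPy normalized cross-correlation below is a minimal stand-in for OpenCV's matchTemplate, and the toy "SEM frame" and probe template are invented for the sketch:

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) top-left corner of the best
    normalized-cross-correlation match of `template` in `image`."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy "SEM frame": a bright 3x3 blob (the probe tip) on a dark background.
frame = np.zeros((20, 20))
frame[12:15, 7:10] = 1.0
probe = np.zeros((5, 5))
probe[1:4, 1:4] = 1.0            # template: blob centred in a dark margin
print(match_template(frame, probe))   # → (11, 6), top-left of the match
```

In practice one would call the library routine directly rather than the double loop above; the point is only that the ROI location falls out of the correlation peak.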

  3. High-level Component Interfaces for Collaborative Development: A Proposal

    Directory of Open Access Journals (Sweden)

    Thomas Marlowe

    2009-12-01

    Full Text Available Software development has rapidly moved toward collaborative development models where multiple partners collaborate in creating and evolving software-intensive systems or components of sophisticated ubiquitous socio-technical ecosystems. In this paper we extend the concept of a software interface to a flexible high-level interface as a means of accommodating change and of localizing, controlling and managing the exchange of knowledge and of functional, behavioral, quality, project- and business-related information between the partners and between the developed components.

  4. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, it will make it possible to collect not only atomic and molecular data but broader scientific information over a wide range of research fields. (author)
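One simple way to recognize such expressions is a regular expression over electron-configuration fragments; the pattern below covers only a toy subset of the notation the paper targets, and the sample abstract text is invented:

```python
import re

# Matches runs of configuration fragments such as "1s2 2p6 3s1".
# Principal quantum numbers 1-7 and s/p/d/f subshells only; the real
# notation (term symbols, J values, etc.) is far richer than this.
CONFIG = re.compile(r"\b(?:[1-7][spdf]\d{0,2})(?:\s+[1-7][spdf]\d{0,2})*\b")

def find_states(abstract):
    """Return every configuration-like expression found in the text."""
    return [m.group(0) for m in CONFIG.finditer(abstract)]

text = ("The ground state of neon is 1s2 2s2 2p6, while the excited "
        "state 2p5 3s1 decays radiatively.")
print(find_states(text))   # → ['1s2 2s2 2p6', '2p5 3s1']
```

A production extractor would combine many such patterns with the natural-language processing the authors describe, but the pipeline shape is the same: scan, match, collect.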

  5. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  6. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  7. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.

  8. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on a frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The overall system architecture and its modules are briefly introduced, the core of the system is then described in detail, and finally a system prototype is given.
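A minimal sketch of the frequent-subtree idea, assuming pages are represented as (tag, children) trees and support is counted per page; the tag names and support threshold are hypothetical, and real miners enumerate embedded subtrees far more cleverly than this exhaustive rooted-subtree count:

```python
from collections import Counter

def canon(node):
    """Canonical string of a (tag, children) tree."""
    tag, children = node
    return tag + "(" + ",".join(canon(c) for c in children) + ")"

def all_subtrees(node):
    """Every rooted subtree of the tree, as canonical strings."""
    tag, children = node
    out = [canon(node)]
    for c in children:
        out.extend(all_subtrees(c))
    return out

def frequent_subtrees(pages, min_support=2):
    """Subtrees occurring in at least `min_support` pages; a repeated
    comment block shows up as the same canonical subtree on every page."""
    counts = Counter()
    for page in pages:
        counts.update(set(all_subtrees(page)))
    return {s for s, c in counts.items() if c >= min_support}

comment = ("div", [("span", []), ("p", [])])   # hypothetical comment block
page1 = ("body", [comment, ("h1", [])])
page2 = ("body", [("nav", []), comment])
print(frequent_subtrees([page1, page2]))   # the comment block recurs
```

The recurring `div(span(),p())` structure is exactly the kind of template a comment-extraction system would lock onto.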

  9. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, sample manual annotation is a highly labor intensive process and requires familiarity with the terminologies used. We have the...., organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  10. The ALICE Dimuon Spectrometer High Level Trigger

    CERN Document Server

    Becker, B; Cicalo, Corrado; Das, Indranil; de Vaux, Gareth; Fearick, Roger; Lindenstruth, Volker; Marras, Davide; Sanyal, Abhijit; Siddhanta, Sabyasachi; Staley, Florent; Steinbeck, Timm; Szostak, Artur; Usai, Gianluca; Vilakazi, Zeblon

    2009-01-01

    The ALICE Dimuon Spectrometer High Level Trigger (dHLT) is an on-line processing stage whose primary function is to select interesting events that contain distinct physics signals from heavy resonance decays such as J/psi and Gamma particles, amidst unwanted background events. It forms part of the High Level Trigger of the ALICE experiment, whose goal is to reduce the large data rate of about 25 GB/s from the ALICE detectors by an order of magnitude, without losing interesting physics events. The dHLT has been implemented as a software trigger within a high performance and fault tolerant data transportation framework, which is run on a large cluster of commodity compute nodes. To reach the required processing speeds, the system is built as a concurrent system with a hierarchy of processing steps. The main algorithms perform partial event reconstruction, starting with hit reconstruction on the level of the raw data received from the spectrometer. Then a tracking algorithm finds track candidates from the recon...

  11. Processing vessel for high level radioactive wastes

    International Nuclear Information System (INIS)

    Maekawa, Hiromichi

    1998-01-01

    Upon transferring an overpack, with canisters containing high level radioactive wastes sealed inside, and burying it in an underground processing hole, an outer shell vessel comprising a steel plate is formed to fit and be contained in the processing hole. A backfill layer, made of the earth and sand excavated when forming the processing hole, is formed on the inner circumferential wall of the outer shell vessel. A buffer layer of predetermined thickness is formed inside the backfill layer, and the overpack is contained in the hollow portion surrounded by this layer. The open upper portion of the hollow is covered with the buffer layer and the backfill layer. Since the processing vessel, which provides shielding, is formed on the ground beforehand, the state of packing can be observed. In addition, since operators can work directly during transportation and burial of the high level radioactive wastes, remote control is no longer necessary. (T.M.)

  12. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning and the creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, are labour intensive, need well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are formed illegally within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance
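The snake idea, with the external constraint energy term removed, can be caricatured by a greedy active contour in which each control point simply moves toward stronger image response; the attraction field, starting points and neighbourhood rule below are toy assumptions, not the modified model of the thesis:

```python
import numpy as np

def greedy_snake(field, points, iters=20):
    """Greedy active contour: each control point moves to the 8-neighbour
    pixel with the highest field value (a stand-in for edge strength);
    no external shape-constraint energy term is applied."""
    h, w = field.shape
    pts = [list(p) for p in points]
    for _ in range(iters):
        for p in pts:
            best = field[p[0], p[1]]
            best_rc = (p[0], p[1])
            for dr in (0, -1, 1):            # straight moves preferred on ties
                for dc in (0, -1, 1):
                    r, c = p[0] + dr, p[1] + dc
                    if 0 <= r < h and 0 <= c < w and field[r, c] > best:
                        best, best_rc = field[r, c], (r, c)
            p[0], p[1] = best_rc
    return [tuple(p) for p in pts]

# Toy attraction field whose response peaks along row 3 (a building edge).
rows = np.arange(12).reshape(-1, 1)
field = (10.0 - np.abs(rows - 3)) * np.ones((1, 12))
start = [(1, 4), (1, 6), (1, 8)]
print(greedy_snake(field, start))   # → [(3, 4), (3, 6), (3, 8)]
```

Each seeded point slides down the energy gradient until it sits on the strongest response, which is the behaviour the outline-refinement step relies on.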

  13. Hierarchical High Level Information Fusion (H2LIFT)

    Science.gov (United States)

    2008-09-15

    platform increases, human decision makers are being overwhelmed with data. In this research, the CUBRC proposed a cost effective two-year program of a novel approach in the near "real-time

  14. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available The northern Tibet region belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people and the geological working conditions are very poor; however, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using Worldview2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various kinds of geological information are mined. By setting thresholds, based on hierarchical classification, eight kinds of geological information were classified and extracted. Accuracy analysis against existing geological maps shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.
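A hedged sketch of threshold-based hierarchical classification on multispectral bands: the band names, index, thresholds and class labels below are illustrative assumptions, not the paper's actual eight-class rule set:

```python
import numpy as np

def classify(bands):
    """Two-level threshold classification of a multispectral scene.
    `bands` maps band name -> 2-D array; all rules are illustrative."""
    nir, red, swir = bands["nir"], bands["red"], bands["swir"]
    labels = np.full(nir.shape, "unclassified", dtype=object)
    ndvi = (nir - red) / (nir + red + 1e-9)
    # Level 1: vegetation vs. bare surface (hypothetical 0.3 threshold)
    veg = ndvi > 0.3
    labels[veg] = "vegetation"
    # Level 2: split bare surfaces by SWIR response
    bare = ~veg
    labels[bare & (swir > 0.5)] = "bright lithology"
    labels[bare & (swir <= 0.5)] = "dark lithology"
    return labels

# Invented 2x2 scene, one reflectance value per band and pixel.
bands = {
    "nir":  np.array([[0.8, 0.2], [0.3, 0.1]]),
    "red":  np.array([[0.2, 0.2], [0.3, 0.1]]),
    "swir": np.array([[0.4, 0.7], [0.2, 0.9]]),
}
print(classify(bands))
```

A real object-oriented workflow would first segment the image into objects and add shape and topology features; the hierarchy of thresholded decisions, however, has exactly this cascading structure.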

  15. The ARES High-level Intermediate Representation

    Energy Technology Data Exchange (ETDEWEB)

    Moss, Nicholas David [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-03

    The LLVM intermediate representation (IR) lacks semantic constructs for depicting common high-performance operations such as parallel and concurrent execution, communication and synchronization. Currently, representing such semantics in LLVM requires either extending the intermediate form (a significant undertaking) or the use of ad hoc indirect means such as encoding them as intrinsics and/or the use of metadata constructs. In this paper we discuss work in progress to explore the design and implementation of a new compilation stage and associated high-level intermediate form that is placed between the abstract syntax tree and its lowering to LLVM's IR. This high-level representation is a superset of LLVM IR and supports the direct representation of these common parallel computing constructs, along with the infrastructure for supporting analysis and transformation passes on this representation.

  16. CAMAC and high-level-languages

    International Nuclear Information System (INIS)

    Degenhardt, K.H.

    1976-05-01

    A proposal for easy programming of CAMAC systems with high-level-languages (FORTRAN, RTL/2, etc.) and interpreters (BASIC, MUMTI, etc.) using a few subroutines and a LAM driver is presented. The subroutines and the LAM driver are implemented for PDP11/RSX-11M and for the CAMAC controllers DEC CA11A (branch controller), BORER type 1533A (single crate controller) and DEC CA11F (single crate controller). Mixed parallel/serial CAMAC systems employing KINETIC SYSTEMS serial driver mod. 3992 and serial crate controllers mod. 3950 are implemented for all mentioned parallel controllers, too. DMA transfers from or to CAMAC modules using non-processor-request controllers (BORER type 1542, DEC CA11FN) are available. (orig.) [de

  17. National high-level waste systems analysis

    International Nuclear Information System (INIS)

    Kristofferson, K.; O'Holleran, T.P.

    1996-01-01

    Previously, no mechanism existed that provided a systematic, interrelated view or national perspective of all high-level waste treatment and storage systems that the US Department of Energy manages. The impacts of budgetary constraints and repository availability on storage and treatment must be assessed against existing and pending negotiated milestones for their impact on the overall HLW system. This assessment can give DOE a complex-wide view of the availability of waste treatment and help project the time required to prepare HLW for disposal. Facilities, throughputs, schedules, and milestones were modeled to ascertain the treatment and storage systems resource requirements at the Hanford Site, Savannah River Site, Idaho National Engineering Laboratory, and West Valley Demonstration Project. The impacts of various treatment system availabilities on schedule and throughput were compared to repository readiness to determine the prudent application of resources. To assess the various impacts, the model was exercised against a number of plausible scenarios as discussed in this paper

  18. International high-level radioactive waste repositories

    International Nuclear Information System (INIS)

    Lin, W.

    1996-01-01

    Although nuclear technologies benefit everyone, the associated nuclear wastes are a widespread and rapidly growing problem. Nuclear power plants are in operation in 25 countries, and are under construction in others. Developing countries are hungry for electricity to promote economic growth; industrialized countries are eager to export nuclear technologies and equipment. These two ingredients, combined with the rapid shrinkage of worldwide fossil fuel reserves, will increase the utilization of nuclear power. All countries utilizing nuclear power produce at least a few tens of tons of spent fuel per year. That spent fuel (and reprocessing products, if any) constitutes high-level nuclear waste. Toxicity, long half-life, and immunity to chemical degradation make such waste an almost permanent threat to human beings. This report discusses the advantages of utilizing repositories for disposal of nuclear wastes

  19. Intergenerational ethics of high level radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Takeda, Kunihiko [Nagoya Univ., Graduate School of Engineering, Nagoya, Aichi (Japan); Nasu, Akiko; Maruyama, Yoshihiro [Shibaura Inst. of Tech., Tokyo (Japan)

    2003-03-01

    The validity of intergenerational ethics on the geological disposal of high level radioactive waste originating from nuclear power plants was studied. The result of the study on geological disposal technology showed that the current method of disposal can be judged to be scientifically reliable for several hundred years and the radioactivity level will be less than one tenth of the tolerable amount after 1,000 years or more. This implies that the consideration of intergenerational ethics of geological disposal is meaningless. Ethics developed in western society states that the consent of people in the future is necessary if the disposal has influence on them. Moreover, the ethics depends on generally accepted ideas in western society and preconceptions based on racism and sexism. The irrationality becomes clearer by comparing the dangers of the exhaustion of natural resources and pollution from harmful substances in a recycling society. (author)

  20. Management of high level radioactive waste

    International Nuclear Information System (INIS)

    Redon, A.; Mamelle, J.; Chambon, M.

    1977-01-01

    The worldwide need for reprocessing will reach 10,000 t/y of irradiated fuels by the mid-1980s. Several countries have planned, in their nuclear programmes, the construction of reprocessing plants with a 1,500 t/y capacity, corresponding to 50,000 MWe installed. At such a level, solidification of the radioactive waste becomes imperative. For this reason, all efforts in France have been directed towards the realization of industrial plants capable of solidifying the fission products as a glassy material. The advantages of this decision, and the reasons for it, are presented. The continuing development work, the conditions and methods of storing the high-level wastes prior to solidification, the interim storage (for thermal decay) and the ultimate disposal after solidification are described [fr

  1. Intergenerational ethics of high level radioactive waste

    International Nuclear Information System (INIS)

    Takeda, Kunihiko; Nasu, Akiko; Maruyama, Yoshihiro

    2003-01-01

    The validity of intergenerational ethics on the geological disposal of high level radioactive waste originating from nuclear power plants was studied. The result of the study on geological disposal technology showed that the current method of disposal can be judged to be scientifically reliable for several hundred years and the radioactivity level will be less than one tenth of the tolerable amount after 1,000 years or more. This implies that the consideration of intergenerational ethics of geological disposal is meaningless. Ethics developed in western society states that the consent of people in the future is necessary if the disposal has influence on them. Moreover, the ethics depends on generally accepted ideas in western society and preconceptions based on racism and sexism. The irrationality becomes clearer by comparing the dangers of the exhaustion of natural resources and pollution from harmful substances in a recycling society. (author)

  2. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) via the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was proved to be of higher quality.

  3. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Full Text Available Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourcing vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) via the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was proved to be of higher quality.
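The edge-length descriptor on a Delaunay triangulation can be sketched with scipy.spatial: triangles with an over-long edge are discarded, and the edges used by exactly one surviving triangle approximate the corridor boundary (an alpha-shape-style simplification of the paper's model). The synthetic point strip and the 10 m threshold are assumptions for illustration:

```python
import numpy as np
from scipy.spatial import Delaunay

def boundary_edges(points, max_edge):
    """Keep Delaunay triangles whose longest edge is below `max_edge`
    (a proxy for the edge-length descriptor), then return the edges
    belonging to exactly one surviving triangle: the boundary."""
    tri = Delaunay(points)
    edge_count = {}
    for simplex in tri.simplices:
        pts = points[simplex]
        lens = [np.linalg.norm(pts[i] - pts[(i + 1) % 3]) for i in range(3)]
        if max(lens) > max_edge:
            continue                      # discard long, cross-road triangles
        for i in range(3):
            e = tuple(sorted((simplex[i], simplex[(i + 1) % 3])))
            edge_count[e] = edge_count.get(e, 0) + 1
    return [e for e, c in edge_count.items() if c == 1]

# Synthetic "GPS traces": a dense strip of points along a straight road.
rng = np.random.default_rng(0)
xs = rng.uniform(0, 100, 300)
ys = rng.uniform(0, 5, 300)               # 5 m wide road corridor
pts = np.column_stack([xs, ys])
edges = boundary_edges(pts, max_edge=10.0)
print(len(edges) > 0)                      # boundary edges were found
```

The full method additionally weighs Voronoi-cell areas and heading direction; this sketch shows only why triangle-edge length alone already separates the corridor interior from its rim.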

  4. High-level waste processing and disposal

    International Nuclear Information System (INIS)

    Crandall, J.L.; Krause, H.; Sombret, C.; Uematsu, K.

    1984-11-01

    Without reprocessing, spent LWR fuel itself is generally considered an acceptable waste form. With reprocessing, borosilicate glass canisters have now gained general acceptance for waste immobilization. The current first choice for disposal is emplacement in an engineered structure in a mined cavern at a depth of 500-1000 meters. A variety of rock types are being investigated, including basalt, clay, granite, salt, shale, and volcanic tuff. This paper gives specific coverage to the national high level waste disposal plans of France, the Federal Republic of Germany, Japan and the United States. The French nuclear program assumes prompt reprocessing of its spent fuels, and France has already constructed the AVM. Two larger borosilicate glass plants are planned for a new French reprocessing plant at La Hague. France plans to hold the glass canisters in near-surface storage for a forty- to sixty-year cooling period and then to place them into a mined repository. The FRG and Japan also plan reprocessing for their LWR fuels. Both are currently having some fuel reprocessed by France, but both are also planning reprocessing plants which will include waste vitrification facilities. West Germany is now constructing the PAMELA Plant at Mol, Belgium to vitrify high level reprocessing wastes at the shutdown Eurochemic Plant. Japan is now operating a vitrification mockup test facility and plans a pilot plant facility at the Tokai reprocessing plant by 1990. Both countries have active geologic repository programs. The United States program assumes little LWR fuel reprocessing and is thus primarily aimed at direct disposal of spent fuel into mined repositories. However, the US has two borosilicate glass plants under construction to vitrify existing reprocessing wastes

  5. Ramifications of defining high-level waste

    International Nuclear Information System (INIS)

    Wood, D.E.; Campbell, M.H.; Shupe, M.W.

    1987-01-01

    The Nuclear Regulatory Commission (NRC) is considering rule making to provide a concentration-based definition of high-level waste (HLW) under authority derived from the Nuclear Waste Policy Act (NWPA) of 1982 and the Low Level Waste Policy Amendments Act of 1985. The Department of Energy (DOE), which has the responsibility to dispose of certain kinds of commercial waste, is supporting development of a risk-based classification system by the Oak Ridge National Laboratory to assist in developing and implementing the NRC rule. The system is two dimensional, with the axes based on the phrases highly radioactive and requires permanent isolation in the definition of HLW in the NWPA. Defining HLW will reduce the ambiguity in the present source-based definition by providing concentration limits to establish which materials are to be called HLW. The system allows the possibility of greater-confinement disposal for some wastes which do not require the degree of isolation provided by a repository. The definition of HLW will provide a firm basis for waste processing options which involve partitioning of waste into a high-activity stream for repository disposal, and a low-activity stream for disposal elsewhere. Several possible classification systems have been derived and the characteristics of each are discussed. The Defense High Level Waste Technology Lead Office at DOE - Richland Operations Office, supported by Rockwell Hanford Operations, has coordinated reviews of the ORNL work by a technical peer review group and other DOE offices. The reviews produced several recommendations and identified several issues to be addressed in the NRC rule making. 10 references, 3 figures

  6. The high level vibration test program

    International Nuclear Information System (INIS)

    Hofmayer, C.H.; Curreri, J.R.; Park, Y.J.; Kato, W.Y.; Kawakami, S.

    1989-01-01

    As part of cooperative agreements between the US and Japan, tests have been performed on the seismic vibration table at the Tadotsu Engineering Laboratory of Nuclear Power Engineering Test Center (NUPEC) in Japan. The objective of the test program was to use the NUPEC vibration table to drive large diameter nuclear power piping to substantial plastic strain with an earthquake excitation and to compare the results with state-of-the-art analysis of the problem. The test model was subjected to a maximum acceleration well beyond what nuclear power plants are designed to withstand. A modified earthquake excitation was applied and the excitation level was increased carefully to minimize the cumulative fatigue damage due to the intermediate level excitations. Since the piping was pressurized, and the high level earthquake excitation was repeated several times, it was possible to investigate the effects of ratchetting and fatigue as well. Elastic and inelastic seismic response behavior of the test model was measured in a number of test runs with an increasing excitation input level up to the limit of the vibration table. In the maximum input condition, large dynamic plastic strains were obtained in the piping. Crack initiation was detected following the second maximum excitation run. Crack growth was carefully monitored during the next two additional maximum excitation runs. The final test resulted in a maximum crack depth of approximately 94% of the wall thickness. The HLVT (high level vibration test) program has enhanced understanding of the behavior of piping systems under severe earthquake loading. As in other tests to failure of piping components, it has demonstrated significant seismic margin in nuclear power plant piping

  7. The ATLAS high level trigger region of interest builder

    International Nuclear Information System (INIS)

    Blair, R.; Dawson, J.; Drake, G.; Haberichter, W.; Schlereth, J.; Zhang, J.; Ermoline, Y.; Pope, B.; Aboline, M.; High Energy Physics; Michigan State Univ.

    2008-01-01

    This article describes the design, testing and production of the ATLAS Region of Interest Builder (RoIB). This device acts as an interface between the Level 1 trigger and the high level trigger (HLT) farm for the ATLAS LHC detector. It distributes all of the Level 1 data for a subset of events to a small number of (16 or less) individual commodity processors. These processors in turn provide this information to the HLT. This allows the HLT to use the Level 1 information to narrow data requests to areas of the detector where Level 1 has identified interesting objects

  8. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
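A minimal stand-in for the template-driven dump, using Python's stdlib sqlite3 and xml.etree instead of YAdumper's Java/DTD machinery; the table and tag names are invented for the sketch:

```python
import sqlite3
import xml.etree.ElementTree as ET

def dump_table_to_xml(conn, table, root_tag="records", row_tag="record"):
    """Stream every row of `table` into an XML tree, one element per row
    and one child element per column (a simplified stand-in for a
    DTD-driven extraction template)."""
    root = ET.Element(root_tag)
    cur = conn.execute(f"SELECT * FROM {table}")   # sketch only; no user input
    cols = [d[0] for d in cur.description]
    for row in cur:
        rec = ET.SubElement(root, row_tag)
        for col, val in zip(cols, row):
            ET.SubElement(rec, col).text = "" if val is None else str(val)
    return ET.tostring(root, encoding="unicode")

# Toy in-memory database standing in for a bioinformatics schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE proteins (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO proteins VALUES (?, ?)",
                 [(1, "p53"), (2, "BRCA1")])
xml_text = dump_table_to_xml(conn, "proteins")
print(xml_text)
```

Because rows are streamed from the cursor and appended one at a time, memory use stays proportional to the output tree rather than to intermediate copies of the result set, which is the kind of saving the abstract claims.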

  9. High level waste fixation in cermet form

    International Nuclear Information System (INIS)

    Kobisk, E.H.; Aaron, W.S.; Quinby, T.C.; Ramey, D.W.

    1981-01-01

    Commercial and defense high level waste fixation in cermet form is being studied by personnel of the Isotopes Research Materials Laboratory, Solid State Division (ORNL). As a corollary to earlier research and development in forming high density ceramic and cermet rods, disks, and other shapes using separated isotopes, similar chemical and physical processing methods have been applied to synthetic and real waste fixation. Generally, experimental products resulting from this approach have shown physical and chemical characteristics deemed suitable for long-term storage, shipping, corrosive environments, high temperature environments, high waste loading, decay heat dissipation, and radiation damage. Although leach tests are not conclusive, what little comparative data are available show cermet to withstand hydrothermal conditions in water and brine solutions. The Soxhlet leach test, using radioactive cesium as a tracer, showed that leaching of cermet was about 100 times less than that of 78-68 glass. Using essentially uncooled, untreated waste, cermet fixation was found to accommodate up to 75% waste loading and yet, because of its high thermal conductivity, a monolith of 0.6 m diameter and 3.3 m length would have a maximum centerline temperature of only 29 K above the ambient value

  10. Tracking at High Level Trigger in CMS

    CERN Document Server

    Tosi, Mia

    2016-01-01

    The trigger systems of the LHC detectors play a crucial role in determining the physics capabilities of the experiments. A reduction of several orders of magnitude of the event rate is needed to reach values compatible with detector readout, offline storage and analysis capability. The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger (L1T), implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a trade-off between the complexity of the algorithms, the sustainable output rate, and the selection efficiency. With the computing power available during the 2012 data taking the maximum reconstruction time at HLT was about 200 ms per event, at the nominal L1T rate of 100 kHz. Track reconstruction algorithms are widely used in the HLT, for the reconstruction of the physics objects as well as in the identification of b-jets and ...

  11. Performance of the CMS High Level Trigger

    CERN Document Server

    Perrotta, Andrea

    2015-01-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved trac...

  12. Vitrification of high-level liquid wastes

    International Nuclear Information System (INIS)

    Varani, J.L.; Petraitis, E.J.; Vazquez, Antonio.

    1987-01-01

    High-level radioactive liquid wastes produced in fuel element reprocessing require, for their disposal, a preliminary treatment by which, through a series of engineering barriers, dispersion into the biosphere is delayed for 10 000 years. Four groups of compounds are distinguished among a great variety of final products and methods of elaboration. From these, the borosilicate glasses were chosen. Vitrification experiments were performed at laboratory scale with simulated radioactive wastes, employing different compositions of borosilicate glass. The installations are described. A series of tests was carried out on four basic formulae, always using the same methodology: a dry mixture of the vitreous matrix components and a dry simulated waste mixture. Several quality tests of the glasses were made: (1) leaching behaviour, following the DIN 12 111 standard; (2) mechanical resistance, studying parameters related to the ease with which the different glasses increase their surface area; (3) degree of devitrification, where it is shown that devitrification renders the glasses containing radioactive wastes easily leachable. Of all the glasses tested, the composition SiO 2 , Al 2 O 3 , B 2 O 3 , Na 2 O, CaO shows the best retention characteristics. (M.E.L.) [es

  13. Vitrification of high level wastes in France

    International Nuclear Information System (INIS)

    Sombret, C.

    1984-02-01

    A brief historical background of the research and development work conducted in France over 25 years is first presented. The paper then deals with vitrification at (1) the UP1 reprocessing plant (Marcoule) and (2) the UP2 and UP3 reprocessing plants (La Hague). (1) The glass properties required for high-level radioactive waste vitrification are recalled, and the Marcoule vitrification process and facility are presented. (2) The average characteristics (chemical composition, activity) of LWR fission product solutions are given. The glass formulations developed to solidify LWR waste solutions must meet the same requirements as those used in the UP1 facility at Marcoule. Three important aspects must be considered with respect to the glass fabrication process: corrosiveness of the molten glass with regard to metals, viscosity of the molten glass, and volatilization during glass fabrication. The glass properties required in view of interim storage and long-term disposal are then developed at length. Two identical vitrification facilities are planned for the site: R7, to process the UP2 throughput, and T7 for the UP3 plant. A prototype unit was built and operated at Marcoule

  14. High-level nuclear waste disposal

    International Nuclear Information System (INIS)

    Burkholder, H.C.

    1985-01-01

    The meeting was timely because many countries had begun their site selection processes and their engineering designs were becoming well-defined. The technology of nuclear waste disposal was maturing, and the institutional issues arising from the implementation of that technology were being confronted. Accordingly, the program was structured to consider both the technical and institutional aspects of the subject. The meeting started with a review of the status of the disposal programs in eight countries and three international nuclear waste management organizations. These invited presentations allowed listeners to understand the similarities and differences among the various national approaches to solving this very international problem. Then seven invited presentations describing nuclear waste disposal from different perspectives were made. These included: legal and judicial, electric utility, state governor, ethical, and technical perspectives. These invited presentations uncovered several issues that may need to be resolved before high-level nuclear wastes can be emplaced in a geologic repository in the United States. Finally, there were sixty-six contributed technical presentations organized in ten sessions around six general topics: site characterization and selection, repository design and in-situ testing, package design and testing, disposal system performance, disposal and storage system cost, and disposal in the overall waste management system context. These contributed presentations provided listeners with the results of recent applied R&D in each of the subject areas

  15. CMS High Level Trigger Timing Measurements

    International Nuclear Information System (INIS)

    Richardson, Clint

    2015-01-01

    The two-level trigger system employed by CMS consists of the Level 1 (L1) Trigger, which is implemented using custom-built electronics, and the High Level Trigger (HLT), a farm of commercial CPUs running a streamlined version of the offline CMS reconstruction software. The operational L1 output rate of 100 kHz, together with the number of CPUs in the HLT farm, imposes a fundamental constraint on the amount of time available for the HLT to process events. Exceeding this limit impacts the experiment's ability to collect data efficiently. Hence, there is a critical need to characterize the performance of the HLT farm as well as the algorithms run prior to start up in order to ensure optimal data taking. Additional complications arise from the fact that the HLT farm consists of multiple generations of hardware and there can be subtleties in machine performance. We present our methods of measuring the timing performance of the CMS HLT, including the challenges of making such measurements. Results for the performance of various Intel Xeon architectures from 2009-2014 and different data taking scenarios are also presented. (paper)

  16. Decontamination of high-level waste canisters

    International Nuclear Information System (INIS)

    Nesbitt, J.F.; Slate, S.C.; Fetrow, L.K.

    1980-12-01

    This report presents evaluations of several methods for the in-process decontamination of metallic canisters containing any one of a number of solidified high-level waste (HLW) forms. The use of steam-water, steam, abrasive blasting, electropolishing, liquid honing, vibratory finishing and soaking has been tested or evaluated as a set of potential techniques to decontaminate the outer surfaces of HLW canisters. Either these techniques have been tested directly or the available literature has been examined to assess their applicability to the decontamination of HLW canisters. Electropolishing has been found to be the most thorough method to remove radionuclides and other foreign material that may be deposited on or in the outer surface of a canister during any of the HLW processes. Steam or steam-water spraying techniques may be adequate for some applications but fail to remove all contaminated forms that could be present in some of the HLW processes. Liquid honing and abrasive blasting remove contamination and foreign material very quickly and effectively from small areas and components, although these blasting techniques tend to disperse the material removed from the cleaned surfaces. Vibratory finishing is very capable of removing the bulk of contamination and foreign matter from a variety of materials. However, special vibratory finishing equipment would have to be designed and adapted for a remote process. Soaking techniques take long periods of time and may not remove all of the smearable contamination. If soaking involves pickling baths that use corrosive agents, these agents may cause erosion of grain boundaries that results in rough surfaces.

  17. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data, and extraction of relevant information from these data demands a cautious and time-consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary...
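The core of the strategy, PLS regression followed by variable selection, can be illustrated with a toy computation of first-component PLS weights, ranking spots by how strongly they co-vary with the response. The data are hypothetical; a real 2-DE study would use multiple components and cross-validated selection.

```python
# Toy sketch of PLS-based variable (spot) selection: compute the first
# PLS component's weight vector (proportional to X^T y after centering)
# and rank variables by absolute weight.

def center(matrix):
    """Column-center a list-of-rows matrix; return centered matrix and means."""
    n = len(matrix)
    means = [sum(col) / n for col in zip(*matrix)]
    return [[x - m for x, m in zip(row, means)] for row in matrix], means

def pls1_first_weights(X, y):
    """Unit-norm weights of the first PLS component (NIPALS first step)."""
    Xc, _ = center(X)
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(len(Xc)))
         for j in range(len(Xc[0]))]
    norm = sum(v * v for v in w) ** 0.5
    return [v / norm for v in w]

# Hypothetical spot volumes: spot 0 co-varies with the response, 1-2 are noise.
X = [[1.0, 0.2, 5.1], [2.0, 0.1, 5.0], [3.0, 0.3, 4.9], [4.0, 0.2, 5.2]]
y = [1.0, 2.0, 3.0, 4.0]
w = pls1_first_weights(X, y)
ranking = sorted(range(len(w)), key=lambda j: -abs(w[j]))
print(ranking)  # spot 0 should rank first
```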

  18. From remote sensing data via information extraction to 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements aimed at the implementation of environmental or conservation targets, management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency when large volumes of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, with the third dimension considered as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application it is demonstrated how methods to extract and process information as well as to effectively communicate results can be improved and successfully combined within one workflow. It is shown that (1

  19. Addressing Risk Assessment for Patient Safety in Hospitals through Information Extraction in Medical Reports

    Science.gov (United States)

    Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène

    Hospital-acquired infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare cost is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in Computational Intelligence techniques such as Information Extraction, Risk Pattern Detection in documents and Decision Support Systems now make it possible to address this problem.

  20. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important that the National Council of Teachers of Mathematics has created ... guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of ... graphical perception. Human Computer Interaction, 8, 353-388. NCTM: Standards for Mathematics (2003). Peebles, D., & Cheng, P. C.-H. (2002

  1. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by simple time-domain filtering and frequency-domain Fourier analysis.
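The frequency-domain step described above can be sketched as follows. The sample rate, synthetic signal model and respiratory band limits are assumptions for illustration; a real pipeline would first filter the raw sensor waveform.

```python
# Estimate breathing rate as the dominant spectral peak in the respiratory
# band (0.1-0.5 Hz) of a PPG-like signal.
import math

FS = 25.0        # sample rate in Hz (assumed)
DURATION = 60.0  # seconds of signal
N = int(FS * DURATION)

# Synthetic PPG: 1.2 Hz cardiac pulse, amplitude-modulated by 0.25 Hz breathing.
signal = [(1.0 + 0.3 * math.sin(2 * math.pi * 0.25 * t / FS))
          * math.sin(2 * math.pi * 1.2 * t / FS) for t in range(N)]
# Respiration also rides on the baseline:
signal = [s + 0.5 * math.sin(2 * math.pi * 0.25 * t / FS)
          for t, s in enumerate(signal)]

def dft_power(x, freq, fs):
    """Power of the DFT of x probed at a single frequency."""
    re = sum(v * math.cos(2 * math.pi * freq * n / fs) for n, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * freq * n / fs) for n, v in enumerate(x))
    return re * re + im * im

# Scan the respiratory band and pick the strongest component.
candidates = [0.1 + 0.01 * k for k in range(41)]  # 0.10 .. 0.50 Hz
best = max(candidates, key=lambda f: dft_power(signal, f, FS))
print(round(best * 60))  # breaths per minute; expected: 15
```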

  2. Overview of high-level waste management accomplishments

    International Nuclear Information System (INIS)

    Lawroski, H.; Berreth, J.R.; Freeby, W.A.

    1980-01-01

    Storage of power reactor spent fuel is necessary at present because of the lack of reprocessing operations, particularly in the U.S. Given the solidification and storage scenario described above, there is more than reasonable assurance that acceptable, stable solidified waste with a low heat generation rate can be produced and safely disposed of. The public perception that no waste disposal solution exists is being exploited by detractors of nuclear power. The inability to point to even one overall system demonstration lends credibility to the negative assertions. By delaying the gathering of on-line information to qualify repository sites and to implement a demonstration, the actions of the nuclear power detractors are self-serving, in that they can continue to point out that there is no demonstration of satisfactory high-level waste disposal. By maintaining the liquid and solidified high-level waste in secure above-ground storage until acceptable decay heat generation rates are achieved, by producing a compatible, high-integrity solid waste form, by providing a second or even third barrier as a compound container, and by inserting the enclosed waste form in a qualified repository with spacing to assure moderately low-temperature disposal conditions, there appears to be no technical reason for not progressing further with the disposal of high-level wastes and the needed implementation of the complete nuclear power fuel cycle

  3. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to carry out the quantification and spatial characterization of land use/cover changes (LUCC) in the 1990s. To improve the reliability of the database, it is updated as new data become available. However, it is difficult to obtain remote sensing data for extracting land cover change information at large scales. Optical remote sensing data are hard to acquire over the Chengdu plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for applications in extracting land cover change information.
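A common first step for extracting change from multitemporal SAR imagery, shown here as an illustration rather than the paper's exact procedure, is log-ratio differencing of co-registered backscatter images: changed pixels show a large absolute log-ratio between acquisition dates. The intensity values and threshold below are hypothetical.

```python
# Log-ratio change detection on two co-registered SAR intensity images.
import math

def log_ratio_change(img1, img2, threshold=0.7):
    """Return a boolean change mask: True where |log(t2/t1)| > threshold."""
    mask = []
    for row1, row2 in zip(img1, img2):
        mask.append([abs(math.log(b / a)) > threshold
                     for a, b in zip(row1, row2)])
    return mask

# Toy 3x3 backscatter intensities: the center pixel became built-up (brighter).
before = [[0.10, 0.11, 0.10],
          [0.09, 0.10, 0.12],
          [0.10, 0.10, 0.11]]
after  = [[0.10, 0.12, 0.10],
          [0.09, 0.45, 0.11],
          [0.11, 0.10, 0.10]]
mask = log_ratio_change(before, after)
print(mask[1][1], mask[0][0])  # center pixel changed, corner did not
```

The ratio (rather than difference) form is the usual choice for SAR because multiplicative speckle noise largely cancels out of the ratio.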

  4. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  5. DEFENSE HIGH LEVEL WASTE GLASS DEGRADATION

    International Nuclear Information System (INIS)

    Ebert, W.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the analyses that were done to develop models for radionuclide release from high-level waste (HLW) glass dissolution that can be integrated into performance assessment (PA) calculations conducted to support site recommendation and license application for the Yucca Mountain site. This report was developed in accordance with the "Technical Work Plan for Waste Form Degradation Process Model Report for SR" (CRWMS M&O 2000a). It specifically addresses the item "Defense High Level Waste Glass Degradation" of the product technical work plan. The AP-3.15Q Attachment 1 screening criteria determine the importance for its intended use of the HLW glass model derived herein to be in the category "Other Factors for the Postclosure Safety Case - Waste Form Performance", and thus indicate that this factor does not contribute significantly to the postclosure safety strategy. Because the release of radionuclides from the glass will depend on the prior dissolution of the glass, the dissolution rate of the glass imposes an upper bound on the radionuclide release rate. The approach taken to provide a bound for the radionuclide release is to develop models that can be used to calculate the dissolution rate of waste glass when contacted by water in the disposal site. The release rate of a particular radionuclide can then be calculated by multiplying the glass dissolution rate by the mass fraction of that radionuclide in the glass and by the surface area of glass contacted by water. The scope includes consideration of the three modes by which water may contact waste glass in the disposal system: contact by humid air, dripping water, and immersion. The models for glass dissolution under these contact modes are all based on the rate expression for aqueous dissolution of borosilicate glasses. The mechanism and rate expression for aqueous dissolution are adequately understood; the analyses in this AMR were conducted to

  6. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.

  7. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the "highest versus lowest intake" method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR for evaluating the citrus fruit intake and risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more-consistent ES and SES, and narrower confidence intervals than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
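Both the HLM and the ICM ultimately feed per-study effect sizes into a fixed-effect meta-analysis. The pooling step itself can be sketched as follows; the study effect sizes and standard errors below are hypothetical, not taken from the citrus fruit review.

```python
# Fixed-effect (inverse-variance) pooling of per-study log relative risks.
import math

def fixed_effect_summary(log_effects, std_errs):
    """Inverse-variance weighted mean of effect sizes, with its 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * es for w, es in zip(weights, log_effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Three hypothetical studies (log RR < 0 means a protective association).
log_rr = [math.log(0.85), math.log(0.78), math.log(0.95)]
se     = [0.10, 0.15, 0.20]
pooled, (lo, hi) = fixed_effect_summary(log_rr, se)
print(round(math.exp(pooled), 2))  # summary relative risk; expected: 0.84
```

The ICM's narrower confidence intervals follow directly from this formula: collapsing categories contributes more (and better-conditioned) effect sizes, which raises the total inverse-variance weight and shrinks the pooled standard error.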

  8. Immobilization of high-level wastes into sintered glass: 1

    International Nuclear Information System (INIS)

    Russo, D.O.; Messi de Bernasconi, N.; Audero, M.A.

    1987-01-01

    In order to immobilize the high-level radioactive wastes from fuel element reprocessing, borosilicate glass was adopted. Sintering experiments are described with the variety VG 98/12 (SiO 2 , TiO 2 , Al 2 O 3 , B 2 O 3 , MgO, CaO and Na 2 O), which does not present devitrification problems, mixed with simulated calcined wastes. The hot pressing line (sintering under pressure) was explored in two variants: (1) in can; (2) in graphite matrix with extraction of the sintered pellet. Scanning electron microscopy shows that the simulated wastes do not dissolve in the vitreous matrix but remain dispersed in it. The results obtained indicate that the leaching velocities are independent of the density and of the matrix type employed, as well as of the fact that the wastes do not dissolve in the matrix. (M.E.L.) [es

  9. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    ...information. In particular, for feature extraction, we develop a new set of kernel descriptors, Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting ... as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own...
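CSQMI is grounded in the Cauchy-Schwarz divergence between distributions. As a rough illustration of the quantity involved (not the paper's codebook-learning algorithm), a Parzen-window estimate of the CS divergence between two one-dimensional samples can be written as follows; the kernel width and sample values are arbitrary choices for the sketch.

```python
# Parzen-window estimate of the Cauchy-Schwarz divergence
#   D_CS(p, q) = -log( (∫pq)^2 / (∫p^2 ∫q^2) )
# between two samples, using Gaussian kernels.
import math

def gauss(d, sigma):
    return math.exp(-d * d / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def cross_term(xs, ys, sigma):
    """Estimate of the integral of p*q from samples of p and q."""
    s2 = sigma * math.sqrt(2.0)  # width of the convolution of two kernels
    return sum(gauss(x - y, s2) for x in xs for y in ys) / (len(xs) * len(ys))

def cs_divergence(xs, ys, sigma=1.0):
    pq = cross_term(xs, ys, sigma)
    pp = cross_term(xs, xs, sigma)
    qq = cross_term(ys, ys, sigma)
    return -math.log(pq * pq / (pp * qq))

close = cs_divergence([0.0, 0.1, 0.2], [0.1, 0.2, 0.3])
far   = cs_divergence([0.0, 0.1, 0.2], [5.0, 5.1, 5.2])
print(close < far)  # overlapping samples diverge less than distant ones
```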

  10. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)
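The screening idea described above, classifying a component's trouble reports by their significance along a safety axis (CDF) and a reliability axis (APTP), can be sketched as a simple two-criteria classifier. The component names, contribution figures and thresholds below are all hypothetical; the report derives its actual criteria from plant-specific probabilistic analyses.

```python
# Two-axis importance screening for trouble-report components.

# Hypothetical per-component contributions (fractions of plant totals).
components = {
    "emergency diesel generator": {"cdf_fraction": 0.12, "aptp_fraction": 0.01},
    "main feedwater pump":        {"cdf_fraction": 0.01, "aptp_fraction": 0.20},
    "instrument air compressor":  {"cdf_fraction": 0.002, "aptp_fraction": 0.004},
}

def importance_class(cdf_fraction, aptp_fraction,
                     cdf_cut=0.05, aptp_cut=0.05):
    """Classify by safety (CDF) and reliability (APTP) significance."""
    safety = cdf_fraction >= cdf_cut
    reliability = aptp_fraction >= aptp_cut
    if safety and reliability:
        return "high (safety and reliability)"
    if safety:
        return "high (safety)"
    if reliability:
        return "high (reliability)"
    return "low"

for name, c in components.items():
    print(f"{name}: {importance_class(c['cdf_fraction'], c['aptp_fraction'])}")
```

Reports whose root-cause component falls in a "high" class would then be selected for detailed analysis, which is the extraction step the abstract describes.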

  11. The IAEA's high level radioactive waste management programme

    International Nuclear Information System (INIS)

    Saire, D.E.

    1994-01-01

    This paper presents the different activities that are performed under the International Atomic Energy Agency's (IAEA) high level radioactive waste management programme. The Agency's programme is composed of five main activities (information exchange, international safety standards, R&D activities, advisory services and special projects) which are described in the paper. Special emphasis is placed on the RADioactive WAste Safety Standards (RADWASS) programme which was implemented in 1991 to document international consensus that exists on the safe management of radioactive waste. The paper also raises the question about the need for regional repositories to serve certain countries that do not have the resources or infrastructure to construct a national repository

  12. A critically educated public explores high level radioactive waste management

    International Nuclear Information System (INIS)

    Blum, J.E.

    1994-01-01

    It is vital to the citizens of Nevada that they and their children are given an opportunity to explore all sides of the characterization of Yucca Mountain as a potential repository site for spent nuclear fuel. The state-wide, national and international implications demand a reasoned and complete approach to this issue, which has become emotionally and irrationally charged and fueled by incomplete perception and information. The purpose of this paper is to provide curriculum suggestions and recommend concomitant policy developments that will lead to the implementation of a Critical Thinking (CT) approach to High Level Radioactive Waste Management

  13. High-level nuclear waste disposal: Ethical considerations

    International Nuclear Information System (INIS)

    Maxey, M.N.

    1985-01-01

    Popular skepticism about, and moral objections to, recent legislation providing for the management and permanent disposal of high-level radioactive wastes have derived their credibility from two major sources: government procrastination in enacting a waste disposal program, which has reinforced public perceptions of the wastes' unprecedented danger; and the inflated rhetoric and pretensions to professional omnicompetence of influential scientists with nuclear expertise. Ethical considerations not only can but must provide a mediating framework for the resolution of such a polarized political controversy. Implicit in moral objections to proposals for permanent nuclear waste disposal are concerns about three ethical principles: fairness to individuals, equitable protection among diverse social groups, and informed consent through due process and participation

  14. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.

  15. Monitoring of geological repositories for high level radioactive waste

    International Nuclear Information System (INIS)

    2001-04-01

    Geological repositories for disposal of high level radioactive waste are designed to provide isolation of the waste from the human environment for many thousands of years. This report discusses the possible purposes for monitoring geological repositories at the different stages of a repository programme, the use that may be made of the information obtained and the techniques that might be applied. This report focuses on the different objectives that monitoring might have at various stages of a programme, from the initiation of work on a candidate site, to the period after repository closure. Each objective may require somewhat different types of information, or may use the same information in different ways. Having evaluated monitoring requirements, the report concludes with a brief evaluation of available monitoring techniques.

  16. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws’ masks to characterize the emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate cross-corpus performance. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, can provide significant classification for ESS systems. The two-dimensional (2-D) TII feature can discriminate between different emotions in visual expressions beyond what is conveyed by pitch and formant tracks. In addition, de-noising can be completed more easily in 2-D images than in 1-D speech.
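
The Laws'-mask step the abstract describes can be sketched as follows. The kernels are the standard Laws 5-tap vectors; the toy "spectrogram image" and the mean-absolute-energy measure are illustrative simplifications, not the authors' exact pipeline:

```python
import numpy as np

# Laws' 1-D kernels (level, edge, spot); outer products give the 2-D masks.
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)
S5 = np.array([-1, 0, 2, 0, -1], float)

def texture_energy(image, v, w):
    """'Valid' 2-D convolution of image with outer(v, w), then mean |response|."""
    mask = np.outer(v, w)
    h, wd = mask.shape
    rows = image.shape[0] - h + 1
    cols = image.shape[1] - wd + 1
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            out[i, j] = np.sum(image[i:i + h, j:j + wd] * mask)
    return float(np.mean(np.abs(out)))

# A flat toy "spectrogram image": edge-type masks respond with zero energy.
flat = np.ones((8, 8))
```

Because E5 sums to zero, any constant region yields zero edge energy, while the pure level mask (L5 with L5) responds strongly; contrasting such energies across a bank of masks is what characterizes the texture.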

  17. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction was proposed. This method took full advantage of the self-adaptive filter characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. This research merged the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of the four indexes aforementioned were combined into a feature vector. Then, the connotative characteristic components in the vibration signal were accurately extracted by Euclidean distance search, and the desired integral signals were precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively removed. The great cumulative error of the traditional time-domain integral is effectively overcome. Moreover, the large low-frequency error of the traditional frequency-domain integral is successfully avoided. Compared with the traditional integral methods, this method is outstanding at removing noise while retaining useful feature information, and shows higher accuracy and superiority.
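
The feature-vector and Euclidean-distance selection idea described above can be sketched in pure Python. Only kurtosis and energy are kept here for brevity; the paper's full vector also includes mean square error and SVD-based values:

```python
import math

def kurtosis(x):
    """Population kurtosis; returns 0.0 for a constant signal."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    if var == 0:
        return 0.0
    return sum((v - m) ** 4 for v in x) / n / var ** 2

def energy(x):
    return sum(v * v for v in x)

def feature_vector(x):
    # Simplified stand-in for the paper's four-index vector.
    return (kurtosis(x), energy(x))

def closest_component(components, reference):
    """Pick the decomposed component whose feature vector lies nearest
    (Euclidean distance) to a reference feature vector."""
    def dist(c):
        f = feature_vector(c)
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, reference)))
    return min(components, key=dist)
```

In the actual method, the components would be IMFs from ensemble empirical mode decomposition; the same distance search then selects which components to keep before reconstructing the integral signal.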

  18. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Full Text Available Abstract Background Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.
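
The two-stage cascade idea, statistical field detection followed by heuristic linking, can be illustrated with a toy example; the lexicons, labels and linking rule below are invented stand-ins, not the paper's classifiers:

```python
# Stage 1 here is a lexicon lookup standing in for the statistical classifiers;
# stage 2 links labelled entities into medication events, as in the paper.
DRUGS = {"aspirin", "lisinopril"}
DOSES = {"81mg", "10mg"}

def tag(tokens):
    """Label each token as DRUG, DOSE, or O (outside any entity)."""
    out = []
    for t in tokens:
        if t.lower() in DRUGS:
            out.append((t, "DRUG"))
        elif t.lower() in DOSES:
            out.append((t, "DOSE"))
        else:
            out.append((t, "O"))
    return out

def link(tagged):
    """Simple heuristic: a DOSE immediately following a DRUG joins its event."""
    events = []
    for i, (tok, lab) in enumerate(tagged):
        if lab == "DRUG":
            event = {"drug": tok}
            if i + 1 < len(tagged) and tagged[i + 1][1] == "DOSE":
                event["dose"] = tagged[i + 1][0]
            events.append(event)
    return events
```

The separation matters: the tagger can be swapped for a trained classifier cascade without touching the linking heuristics, which is the modularity the hybrid design exploits.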

  19. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To more efficiently use GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are mergers of segmented image objects that have similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87 and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
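
The DEM-differencing step that underlies the area and volume estimates can be sketched as follows; the grids, cell size and threshold are illustrative, not the Wenchuan data:

```python
def landslide_change(pre_dem, post_dem, cell_area, threshold=0.5):
    """Sum elevation loss over cells that dropped by more than `threshold` metres.

    Returns (affected_area, displaced_volume). Numbers are illustrative:
    a real workflow would difference the pre-disaster GDEM V2 grid against
    the post-disaster GF-1 DEM over each slope object.
    """
    area = 0.0
    volume = 0.0
    for pre_row, post_row in zip(pre_dem, post_dem):
        for pre, post in zip(pre_row, post_row):
            drop = pre - post
            if drop > threshold:
                area += cell_area          # this cell is part of the slide
                volume += drop * cell_area # elevation loss times footprint
    return area, volume

# Toy 2x2 grids (elevations in metres), 4 m^2 cells.
pre = [[100.0, 100.0], [100.0, 100.0]]
post = [[98.0, 100.0], [97.0, 100.0]]
```

Here two cells dropped by 2 m and 3 m, giving 8 m² of affected area and 20 m³ of displaced volume; the same accumulation over the full slope objects yields the per-landslide figures quoted above.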

  20. Development of a test system for high level liquid waste partitioning

    OpenAIRE

    Duan Wu H.; Chen Jing; Wang Jian C.; Wang Shu W.; Wang Xing H.

    2015-01-01

    The partitioning and transmutation strategy has increasingly attracted interest for the safe treatment and disposal of high level liquid waste, in which the partitioning of high level liquid waste is one of the critical technical issues. An improved total partitioning process, including a tri-alkylphosphine oxide process for the removal of actinides, a crown ether strontium extraction process for the removal of strontium, and a calixcrown ether cesium extra...

  1. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics.
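
The bag-of-words level of description can be illustrated with a toy classifier; the categories, object lists and overlap rule below are invented simplifications of the linear classifiers used in the study:

```python
from collections import Counter

# Toy training data: scenes described only by the objects they contain.
TRAIN = {
    "kitchen": [["stove", "sink", "cup"], ["sink", "fridge"]],
    "office":  [["desk", "chair", "monitor"], ["desk", "lamp"]],
}

# One object-count "centroid" per category.
CENTROIDS = {cat: Counter(obj for scene in scenes for obj in scene)
             for cat, scenes in TRAIN.items()}

def classify(scene):
    """Assign the category whose training objects overlap the scene most."""
    bag = Counter(scene)
    return max(CENTROIDS, key=lambda c: sum((bag & CENTROIDS[c]).values()))
```

Even this crude overlap rule categorizes unambiguous scenes correctly, which is consistent with the finding that a small set of highly informative objects suffices for ceiling performance.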

  2. Criteria for high-level waste disposal

    International Nuclear Information System (INIS)

    Sousselier, Y.

    1981-01-01

    Disposal of radioactive wastes is storage without the intention of retrieval. But in such storage, it may be useful and in some cases necessary to have the possibility of retrieval, at least for a certain period of time. In order to propose some criteria for HLW disposal, one has to examine how this basic concept is to be applied. HLW is the waste separated as raffinate in the first cycle of solvent extraction in reprocessing. Such waste contains the bulk of the fission products, which have long half-lives; therefore the safety of a disposal site, at least after a certain period of time, must be intrinsic, i.e. not based on human intervention. There is a consensus that such disposal is feasible in a suitable geological formation in which the integrity of the container will be reinforced by several additional barriers. Criteria for disposal can be proposed for all aspects of the question. The author discusses the aims of the safety analysis, particularly the time span of this analysis and the acceptable dose commitments resulting from the release of radionuclides, the number and role of each barrier, and a holistic analysis of safety including external factors. (Auth.)

  3. Stability of High-Level Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Besmann, Theodore M.; Vienna, John D.

    2005-09-30

    The objective of the proposed effort is to use a new approach to develop solution models of complex waste glass systems and spent fuel that are predictive with regard to composition, phase separation, and volatility. The effort will also yield thermodynamic values for waste components that are fundamentally required for corrosion models used to predict the leaching/corrosion behavior for waste glass and spent fuel material. This basic information and understanding of chemical behavior can subsequently be used directly in computational models of leaching and transport in geologic media, in designing and engineering waste forms and barrier systems, and in prediction of chemical interactions.

  4. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Full Text Available Indoor Spatial Data Infrastructure (indoor-SDI) is an important SDI for geospatial analysis and location-based services. Building Information Model (BIM) has a high degree of detail in geometric and semantic information for buildings. This study proposed direct conversion schemes to extract indoor building information from BIM to OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is the major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents each space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions of IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data in the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicated that the 3D indoor model (i.e. the IndoorGML model) can be automatically imported from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor-SDI.
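
The node/edge mapping described above can be sketched as a simple graph construction; the space and door records below are invented stand-ins for IfcSpace/IfcDoor entities, not real IFC parsing:

```python
# Each space becomes a node; each door becomes an edge between the two
# spaces it connects -- the dual-space topology IndoorGML expects.
spaces = ["Room101", "Room102", "Corridor"]
doors = [("Room101", "Corridor"), ("Room102", "Corridor")]

def to_indoor_network(spaces, doors):
    """Build a node/edge structure from space and door records."""
    nodes = {s: {"id": s} for s in spaces}
    edges = [{"from": a, "to": b} for a, b in doors]
    return {"nodes": nodes, "edges": edges}

network = to_indoor_network(spaces, doors)
```

A real converter would read the IfcSpace and IfcDoor entities (and their containment relations) from the IFC file and serialize this graph into the IndoorGML XML schema, but the topological core is exactly this mapping.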

  5. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper would like to give an overview of the state-of-the-art of these techniques, and to present the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on `Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  6. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned into 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant, although nonsignificant, increase in pulse values was monitored on completion of both versions. Completion of an overly detailed informed consent document is associated with changes in physiological parameters. The results suggest that overly detailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  7. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper would like to give an overview of the state-of-the-art of these techniques, and to present the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  8. About increasing informativity of diagnostic system of asynchronous electric motor by extracting additional information from values of consumed current parameter

    Science.gov (United States)

    Zhukovskiy, Y.; Korolev, N.; Koteleva, N.

    2018-05-01

    This article is devoted to expanding the possibilities of assessing the technical state of asynchronous electric drives from their current consumption, as well as increasing the information capacity of diagnostic methods, under conditions of limited access to equipment and incomplete information. The method of spectral analysis of the electric drive current can be supplemented by an analysis of the components of the current of the Park's vector. The evolution of the hodograph at the moment of appearance and during the development of defects was studied using the example of current asymmetry in the phases of an induction motor. The result of the study is a set of new diagnostic parameters for the asynchronous electric drive. During the research, it was shown that the proposed diagnostic parameters allow determining the type and level of the defect. At the same time, there is no need to stop the equipment and take it out of service for repair. Modern digital control and monitoring systems can use the proposed parameters, based on the stator current of an electrical machine, to improve the accuracy and reliability of obtaining diagnostic patterns and predicting their changes, in order to improve equipment maintenance systems. This approach can also be used in systems and objects where there are significant parasitic vibrations and unsteady loads. The extraction of useful information can be carried out in electric drive systems whose structure includes a power electric converter.
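
The Park's-vector analysis described above can be sketched as follows. The power-invariant Clarke transform is standard; the max/min asymmetry index, however, is an illustrative simplification of the authors' diagnostic parameters:

```python
import math

def park_vector(ia, ib, ic):
    """Clarke/Park components of the stator current (power-invariant form)."""
    i_alpha = math.sqrt(2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (ib - ic) / math.sqrt(2.0)
    return i_alpha, i_beta

def asymmetry_index(samples):
    """Max/min modulus of the hodograph: ~1 for a healthy (circular) pattern,
    larger when phase asymmetry turns the circle into an ellipse."""
    mags = [math.hypot(*park_vector(a, b, c)) for a, b, c in samples]
    return max(mags) / min(mags)

# Balanced three-phase currents trace a circle, so the index is ~1.
balanced = [(math.cos(t),
             math.cos(t - 2 * math.pi / 3),
             math.cos(t + 2 * math.pi / 3))
            for t in [k * 0.01 for k in range(1000)]]
```

Reducing the amplitude of one phase deforms the hodograph into an ellipse and pushes the index above 1, which is the kind of deviation the paper tracks to grade defect type and severity.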

  9. Multi-Paradigm and Multi-Lingual Information Extraction as Support for Medical Web Labelling Authorities

    Directory of Open Access Journals (Sweden)

    Martin Labsky

    2010-10-01

    Full Text Available Until recently, quality labelling of medical web content has been a predominantly manual activity. However, the advances in automated text processing opened the way to computerised support of this activity. The core enabling technology is information extraction (IE). However, the heterogeneity of websites offering medical content imposes particular requirements on the IE techniques to be applied. In the paper we discuss these requirements and describe a multi-paradigm approach to IE addressing them. Experiments on multi-lingual data are reported. The research has been carried out within the EU MedIEQ project.

  10. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems are encountered in full-text processing when biomedical text analytics simply reuse existing NLP pipelines which were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6% F-score points when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this shortcut.

  11. High-level waste management technology program plan

    International Nuclear Information System (INIS)

    Harmon, H.D.

    1995-01-01

    The purpose of this plan is to document the integrated technology program plan for the Savannah River Site (SRS) High-Level Waste (HLW) Management System. The mission of the SRS HLW System is to receive and store SRS high-level wastes in a safe and environmentally sound manner, and to convert these wastes into forms suitable for final disposal. These final disposal forms are borosilicate glass to be sent to the Federal Repository, Saltstone grout to be disposed of on site, and treated waste water to be released to the environment via a permitted outfall. Thus, the technology development activities described herein are those activities required to enable successful accomplishment of this mission. The technology program is based on specific needs of the SRS HLW System and organized following the systems engineering level 3 functions. Technology needs for each level 3 function are listed as reference, enhancements, and alternatives. Finally, FY-95 funding, deliverables, and schedules are presented in Chapter IV, with details on the specific tasks that are funded in FY-95 provided in Appendix A. The information in this report represents the vision of activities as defined at the beginning of the fiscal year. Depending on emergent issues, funding changes, and other factors, programs and milestones may be adjusted during the fiscal year. The FY-95 SRS HLW technology program strongly emphasizes startup support for the Defense Waste Processing Facility and In-Tank Precipitation. Closure of technical issues associated with these operations has been given highest priority. Consequently, efforts on longer term enhancements and alternatives are receiving minimal funding. However, High-Level Waste Management is committed to participation in the national Radioactive Waste Tank Remediation Technology Focus Area. 4 refs., 5 figs., 9 tabs

  12. High-level waste management technology program plan

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, H.D.

    1995-01-01

    The purpose of this plan is to document the integrated technology program plan for the Savannah River Site (SRS) High-Level Waste (HLW) Management System. The mission of the SRS HLW System is to receive and store SRS high-level wastes in a safe and environmentally sound manner, and to convert these wastes into forms suitable for final disposal. These final disposal forms are borosilicate glass to be sent to the Federal Repository, Saltstone grout to be disposed of on site, and treated waste water to be released to the environment via a permitted outfall. Thus, the technology development activities described herein are those activities required to enable successful accomplishment of this mission. The technology program is based on specific needs of the SRS HLW System and organized following the systems engineering level 3 functions. Technology needs for each level 3 function are listed as reference, enhancements, and alternatives. Finally, FY-95 funding, deliverables, and schedules are presented in Chapter IV, with details on the specific tasks that are funded in FY-95 provided in Appendix A. The information in this report represents the vision of activities as defined at the beginning of the fiscal year. Depending on emergent issues, funding changes, and other factors, programs and milestones may be adjusted during the fiscal year. The FY-95 SRS HLW technology program strongly emphasizes startup support for the Defense Waste Processing Facility and In-Tank Precipitation. Closure of technical issues associated with these operations has been given highest priority. Consequently, efforts on longer term enhancements and alternatives are receiving minimal funding. However, High-Level Waste Management is committed to participation in the national Radioactive Waste Tank Remediation Technology Focus Area. 4 refs., 5 figs., 9 tabs.

  13. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  14. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    Science.gov (United States)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many US states, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has developed, for its 100 counties, a database of numerous HEC-RAS models which are available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modeled flood extents for different extreme event return periods. In this work, data from over 4700 HEC-RAS models were integrated and upscaled to utilize detailed cross-section information and 100-year modeled flood extent information to enable river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as entrenchment ratio and incision ratio from these models. The entrenchment ratio quantifies the vertical containment of rivers and thereby their vulnerability to flooding, and the incision ratio quantifies the depth per unit width. A map of entrenchment ratio for the whole state was derived by linking these model results to a geodatabase. A ranking of highly entrenched counties, enabling prioritization for flood allowance and mitigation, was obtained. The results were shared through HydroShare, and web maps were developed for their visualization using the Google Maps Engine API.
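
The two geomorphic ratios defined in the abstract can be sketched directly; the cross-section numbers below are illustrative, not taken from the North Carolina FRIS data:

```python
def entrenchment_ratio(flood_prone_width, bankfull_width):
    """Flood-prone width over bankfull width, in the spirit of the abstract's
    definition: lower values mean a more vertically contained channel."""
    return flood_prone_width / bankfull_width

def incision_ratio(depth, width):
    """Depth per unit width, as the record describes it."""
    return depth / width

# Illustrative cross-section measurements in metres.
er = entrenchment_ratio(flood_prone_width=30.0, bankfull_width=15.0)
ir = incision_ratio(depth=3.0, width=15.0)
```

In the study, these ratios would be computed per HEC-RAS cross-section (widths from the surveyed geometry and the 100-year modeled flood extent) and then aggregated by county to produce the statewide ranking.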

  15. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, which can be used for subsequent inversion as an ideal starting model. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages, while the difference is that its ultralow frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency domain inversion is also very convenient. However, because broad frequency bands are often used in the pure time domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to the traditional time domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and to move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result in the presence of a linear initial model and records without low-frequency information.
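
The time-attenuation step described above amounts to damping the recorded residuals before inversion; a minimal sketch (the damping constant and trace values are illustrative):

```python
import math

def damp_trace(trace, dt, sigma):
    """Multiply a sampled residual trace by exp(-sigma * t).

    Exponential damping in time down-weights late arrivals and, in the
    frequency domain, synthesises the low-frequency (Laplace-domain)
    information that real records lack.
    """
    return [x * math.exp(-sigma * k * dt) for k, x in enumerate(trace)]

# With sigma = ln(2) and dt = 1 s, each sample is halved relative to the last.
damped = damp_trace([1.0, 1.0, 1.0], dt=1.0, sigma=math.log(2))
```

Inverting these damped residuals first, then relaxing sigma and moving to higher frequencies, mirrors the frequency-continuation strategy the abstract describes for building a background velocity model.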

  16. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

    Full Text Available This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images, as an attempt to increase noise robustness in mobile environments. Our proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show the effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.

  17. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    Full Text Available This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four packages (Urllib, Selenium, BeautifulSoup, Scrapy) in Python can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
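
In the spirit of the article, here is a small standard-library example (no third-party packages, unlike the four named above) that extracts links from an HTML snippet:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, text) pairs from anchor tags -- the core of a scraper.

    A real project would fetch pages with urllib and might prefer
    BeautifulSoup or Scrapy for robustness; this sketch sticks to the
    standard library so it runs anywhere.
    """
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

html = '<p>See <a href="https://example.org">the example site</a>.</p>'
parser = LinkExtractor()
parser.feed(html)
```

Feeding a fetched page into the parser and collecting `parser.links` is the basic datafication step the article advocates; assembling many such extractions yields the custom datasets it describes.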

  18. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the web sites of national and local governments, enables us to discuss macroscopic financial trends. However, it is still difficult to grasp, nationwide, how each site has been changed by public works. In this research, our purpose is to efficiently collect the road update information that various road managers provide, in order to realize efficient updating of various maps such as car navigation maps. In particular, we develop a system that automatically extracts the relevant public works from the public work order outlooks released by each local government and registers their summaries, including position information, to a database, combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan, and confirm the feasibility of our method.

  19. Vision in high-level football officials.

    Science.gov (United States)

    Baptista, António Manuel Gonçalves; Serra, Pedro M; McAlinden, Colm; Barrett, Brendan T

    2017-01-01

    Officiating in football depends, at least to some extent, upon adequate visual function. However, there is no vision standard for football officiating and the nature of the relationship between officiating performance and level of vision is unknown. As a first step in characterising this relationship, we report on the clinically-measured vision and on the perceived level of vision in elite-level, Portuguese football officials. Seventy-one referees (R) and assistant referees (AR) participated in the study, representing 92% of the total population of elite level football officials in Portugal in the 2013/2014 season. Nine of the 22 Rs (40.9%) and ten of the 49 ARs (20.4%) were international-level. Information about visual history was also gathered. Perceived vision was assessed using the preference-values-assigned-to-global-visual-status (PVVS) and the Quality-of-Vision (QoV) questionnaire. Standard clinical vision measures (including visual acuity, contrast sensitivity and stereopsis) were gathered in a subset (n = 44, 62%) of the participants. Data were analysed according to the type (R/AR) and level (international/national) of official, and Bonferroni corrections were applied to reduce the risk of type I errors. Adopting a criterion for statistical significance of p<0.01, clinically-measured vision in the football officials was similar to published normative values for young, adult populations and similar between R and AR. Clinically-measured vision did not differ according to officiating level. Visual acuity measured with and without a pinhole disc indicated that around one quarter of participants may be capable of better vision when officiating, as evidenced by better acuity (≥1 line of letters) using the pinhole. Amongst the clinical visual tests we used, we did not find evidence for above-average performance in elite-level football officials.
Although the impact of uncorrected mild to moderate refractive error upon officiating performance is unknown, with a greater uptake of eye examinations, visual

  20. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
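
    The registration pipeline described above ends with a Weighted Least Squares adjustment. A minimal one-dimensional sketch of WLS (the function, data and weights below are invented for illustration; the FMV-GTB adjusts full multi-frame metadata, not a line fit) shows how observation weights enter the normal equations:

```python
def weighted_least_squares(xs, ys, ws):
    """Fit y = a + b*x by minimising sum(w * (y - a - b*x)**2).

    Closed-form solution of the 2x2 weighted normal equations.
    """
    S   = sum(ws)
    Sx  = sum(w * x for w, x in zip(ws, xs))
    Sy  = sum(w * y for w, y in zip(ws, ys))
    Sxx = sum(w * x * x for w, x in zip(ws, xs))
    Sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    det = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / det
    b = (S * Sxy - Sx * Sy) / det
    return a, b

# Points on the line y = 2 + 3x, plus one wildly wrong observation that is
# heavily down-weighted (low weight = low confidence, high a priori variance).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0, 5.0, 8.0, 100.0]
ws = [1.0, 1.0, 1.0, 1e-9]
a, b = weighted_least_squares(xs, ys, ws)
print(round(a, 3), round(b, 3))  # 2.0 3.0
```

    In the adjustment described in the paper, the weights come from the a priori accuracy of the GPS positions and estimated attitudes, and rigorous error propagation through the same normal equations yields the predicted accuracy of the extracted feature locations.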

  1. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  2. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other vocations. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery, and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software had the advantages of requiring little specialist expertise, broad applicability, simple operation, and high precision. One-pixel accuracy in point positioning and height determination could be achieved if the digital elevation model (DEM) and sensor orientation model had high precision and the off-nadir view angle was favourable.

  3. High level waste management in Asia: R and D perspectives

    International Nuclear Information System (INIS)

    Deokattey, Sangeeta; Bhanumurthy, K.

    2010-01-01

    The present work is an attempt to provide an overview, about the status of R and D and current trends in high level radioactive waste management, particularly in Asian countries. The INIS database (for the period 1976 to 2010) was selected for this purpose, as this is the most authoritative global source of information, in the area of Nuclear Science and Technology. Appropriate query formulations on the database, resulted in the retrieval of 4322 unique bibliographic records. Using the content analysis method (which is both a qualitative as well as a quantitative research method), all the records were analyzed. Part One of the analysis details Scientometric R and D indicators, such as the countries and the institutions involved in R and D, the types of publications, and programmes and projects related to High Level Waste management. Part Two is a subject-based analysis, grouped under the following broad categories: I. Waste Processing 1. Partitioning and transmutation (including ADS) II. Waste Immobilization 1. Glass waste forms and 2. Crystalline ceramics and other waste forms III. Waste Disposal 1. Performance assessment and safety evaluation studies 2. Geohydrological studies a. Site selection and characterization, b. In situ underground experiments, c. Rock mechanical characterization 3. Deep geological repositories a. Sorption, migration and groundwater chemistry b. Engineered barrier systems and IV. Waste Packaging Materials. The results of this analysis are summarized in the study. (author)

  4. Spent Fuel and High-Level Radioactive Waste Transportation Report

    International Nuclear Information System (INIS)

    1992-03-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by SSEB in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a ''comprehensive overview of the issues.'' This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies, and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  5. Risk communication system for high level radioactive waste disposal

    International Nuclear Information System (INIS)

    Kugo, Akihide; Uda, Akinobu; Shimoda, Hirosi; Yoshikawa, Hidekazu; Ito, Kyoko; Wakabayashi, Yasunaga

    2005-01-01

    In order to gain a better understanding and acceptance of the task of implementing high level radioactive waste disposal, a study of a new communication system for social risk information has been initiated, prompted by the rapid expansion of the Internet in society. First, a text mining method was introduced to identify the core public interest by examining public comments on the technical report on high level radioactive waste disposal. Then we designed dialog-mode contents based on Schwartz's theory of norm activation. Finally, a discussion board was mounted on the web site. By constructing such a web communication system, which includes knowledge-base contents, introspective contents, and an interactive discussion board, we conducted an experiment to verify principles such as that basic technical knowledge, trust, and social ethics are indispensable for closing the perception gap between nuclear specialists and the general public. The participants of the experiment increased their interest in topics with which they were not familiar and actively posted their opinions on the BBS. The dialog-mode contents were significantly more effective than the knowledge-based contents in promoting introspection that brought people into a greater awareness of problems such as social dilemmas. (author)

  6. Spent fuel and high-level radioactive waste transportation report

    Energy Technology Data Exchange (ETDEWEB)

    1989-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a ``comprehensive overview of the issues.`` This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  7. Spent fuel and high-level radioactive waste transportation report

    International Nuclear Information System (INIS)

    1989-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a ''comprehensive overview of the issues.'' This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  8. Spent fuel and high-level radioactive waste transportation report

    International Nuclear Information System (INIS)

    1990-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a ''comprehensive overview of the issues.'' This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list

  9. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  10. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  11. Heat transfer in high-level waste management

    International Nuclear Information System (INIS)

    Dickey, B.R.; Hogg, G.W.

    1979-01-01

    Heat transfer in the storage of high-level liquid wastes, calcining of radioactive wastes, and storage of solidified wastes are discussed. Processing and storage experience at the Idaho Chemical Processing Plant are summarized for defense high-level wastes; heat transfer in power reactor high-level waste processing and storage is also discussed

  12. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  13. Vision in high-level football officials.

    Directory of Open Access Journals (Sweden)

    António Manuel Gonçalves Baptista

    Full Text Available Officiating in football depends, at least to some extent, upon adequate visual function. However, there is no vision standard for football officiating and the nature of the relationship between officiating performance and level of vision is unknown. As a first step in characterising this relationship, we report on the clinically-measured vision and on the perceived level of vision in elite-level, Portuguese football officials. Seventy-one referees (R) and assistant referees (AR) participated in the study, representing 92% of the total population of elite level football officials in Portugal in the 2013/2014 season. Nine of the 22 Rs (40.9%) and ten of the 49 ARs (20.4%) were international-level. Information about visual history was also gathered. Perceived vision was assessed using the preference-values-assigned-to-global-visual-status (PVVS) and the Quality-of-Vision (QoV) questionnaire. Standard clinical vision measures (including visual acuity, contrast sensitivity and stereopsis) were gathered in a subset (n = 44, 62%) of the participants. Data were analysed according to the type (R/AR) and level (international/national) of official, and Bonferroni corrections were applied to reduce the risk of type I errors. Adopting a criterion for statistical significance of p<0.01, PVVS scores did not differ between R and AR (p = 0.88), or between national- and international-level officials (p = 0.66). Similarly, QoV scores did not differ between R and AR in frequency (p = 0.50), severity (p = 0.71) or bothersomeness (p = 0.81) of symptoms, or between international-level vs national-level officials for frequency (p = 0.03) or bothersomeness (p = 0.07) of symptoms. However, international-level officials reported less severe symptoms than their national-level counterparts (p<0.01). Overall, 18.3% of officials had either never had an eye examination or, if they had, it was more than 3 years previously.
Regarding refractive correction, 4.2% had undergone refractive surgery and

  14. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  15. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    Directory of Open Access Journals (Sweden)

    Li Yao

    2016-01-01

    Full Text Available Both static features and motion features have shown promising performance in human activities recognition tasks. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting relational information of static features and motion features for human activities recognition. The videos are represented by a classical Bag-of-Word (BoW) model which is useful in many works. To get a compact and discriminative codebook with small dimension, we employ the divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of different feature sets. Then we use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm’s projective function. We test our work on several datasets and obtain very promising results.
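
    The divisive codebook reconstruction above is driven by KL-divergence between codewords' class-conditional distributions. A toy sketch of that similarity measure (the function, smoothing constant and example distributions are assumptions for illustration, not the paper's actual data):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for discrete distributions given as equal-length lists.

    A small eps guards against log(0) when a bin has zero probability.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical class-conditional activity distributions of three codewords.
word_a = [0.7, 0.2, 0.1]
word_b = [0.6, 0.3, 0.1]   # similar to word_a  -> small divergence
word_c = [0.1, 0.1, 0.8]   # different profile  -> large divergence

d_ab = kl_divergence(word_a, word_b)
d_ac = kl_divergence(word_a, word_c)
print(d_ab < d_ac)  # True: a divisive/merging step would group word_a with word_b first
```

    In a full BoW pipeline this comparison decides which codewords can be merged with the least loss of discriminative information, shrinking the codebook dimension.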

  16. Actinides and fission products partitioning from high level liquid waste

    International Nuclear Information System (INIS)

    Yamaura, Mitiko

    1999-01-01

    The presence of small amounts of mixed actinides and of the long-lived, heat-generating fission products ¹³⁷Cs and ⁹⁰Sr is the major problem for the safe handling and disposal of high level nuclear wastes. In this work, an actinides and fission products partitioning process is proposed as an alternative process for waste treatment. First of all, ammonium phosphotungstate (PWA), a selective inorganic exchanger for cesium separation, was chosen and a new procedure for synthesizing PWA inside an organic resin was developed. A strong anionic resin loaded with tungstate or phosphotungstate anion enables the precipitation of PWA directly in the resinous structure by adding ammonium nitrate in acid medium (R-PWA). Parameters such as the W/P ratio, pH, reactants, temperature and aging were studied. The R-PWA obtained using a phosphotungstate solution prepared with W/P = 9.6, a 9 hour digestion time at 94-106 °C and a 4 to 5 month aging time showed the best capacity for cesium retention. Sr separation, in turn, was performed by extraction chromatography, using DH18C6 impregnated on XAD7 resin as the stationary phase. Sr is selectively extracted from acid solution, and >99% was recovered from the loaded column using distilled water as eluent. Concerning actinide separations, two extraction chromatographic columns were used. In the first, a TBP(XAD7) column, U and Pu were extracted and then separated using HNO₃ and hydroxylamine nitrate + HNO₃ as eluents. In the second, a CMPO-TBP(XAD7) column, the actinides were retained on the column and then separated using (NH₄)₂C₂O₄, DTPA, HNO₃ and HCl as eluents. The behavior of some fission products was also verified in both columns. Based on the obtained data, the actinides and fission products (Cs and Sr) partitioning process, using TBP(XAD7) and CMPO-TBP(XAD7) columns for actinide separation, an R-PWA column for cesium retention and a DH18C6(XAD7) column for Sr isolation, was performed.

  17. Midwestern High-Level Radioactive Waste Transportation Project

    International Nuclear Information System (INIS)

    Dantoin, T.S.

    1990-12-01

    For more than half a century, the Council of State Governments has served as a common ground for the states of the nation. The Council is a nonprofit, state-supported and -directed service organization that provides research and resources, identifies trends, supplies answers and creates a network for legislative, executive and judicial branch representatives. This List of Available Resources was prepared with the support of the US Department of Energy, Cooperative Agreement No. DE-FC02-89CH10402. However, any opinions, findings, conclusions, or recommendations expressed herein are those of the author(s) and do not necessarily reflect the views of DOE. The purpose of the agreement, and reports issued pursuant to it, is to identify and analyze regional issues pertaining to the transportation of high-level radioactive waste and to inform Midwestern state officials with respect to technical issues and regulatory concerns related to waste transportation

  18. High-Level Language Production in Parkinson's Disease: A Review

    Directory of Open Access Journals (Sweden)

    Lori J. P. Altmann

    2011-01-01

    Full Text Available This paper discusses impairments of high-level, complex language production in Parkinson's disease (PD), defined as sentence and discourse production, and situates these impairments within the framework of current psycholinguistic theories of language production. The paper comprises three major sections: an overview of the effects of PD on the brain and cognition, a review of the literature on language production in PD, and a discussion of the stages of the language production process that are impaired in PD. Overall, the literature converges on a few common characteristics of language production in PD: reduced information content, impaired grammaticality, disrupted fluency, and reduced syntactic complexity. Many studies also document the strong impact of differences in cognitive ability on language production. Based on the data, PD affects all stages of language production including conceptualization and functional and positional processing. Furthermore, impairments at all stages appear to be exacerbated by impairments in cognitive abilities.

  19. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation on a data set of 50 discharge summaries showed it performed well in identifying not only drug names (F-measure 93.2%) but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0%, respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly, with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
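
As a toy illustration of the kind of signature extraction MedEx performs, the sketch below pulls a drug name, strength, route, and frequency out of free text with a single regular expression. The pattern and the small route/frequency vocabularies are simplified assumptions for illustration, not MedEx's actual lexicon or semantic parser.

```python
import re

# Illustrative route and frequency vocabularies (assumptions, not MedEx's).
ROUTES = r"(?:po|iv|im|sc|orally|by mouth)"
FREQS = r"(?:qd|bid|tid|qid|daily|twice daily|q\d+h)"

# One signature = drug name, strength, route, frequency, in order.
SIG_RE = re.compile(
    rf"(?P<drug>[A-Za-z]+)\s+"
    rf"(?P<strength>\d+(?:\.\d+)?\s?(?:mg|mcg|g|units))\s+"
    rf"(?P<route>{ROUTES})\s+"
    rf"(?P<freq>{FREQS})",
    re.IGNORECASE,
)

def extract_medications(note):
    """Return one dict per drug-strength-route-frequency signature found."""
    return [m.groupdict() for m in SIG_RE.finditer(note)]

meds = extract_medications(
    "Continue lisinopril 10 mg po daily and start metformin 500 mg po bid.")
```

A real system layers a drug lexicon, context handling, and a parser on top of such patterns; this sketch only shows the signature-matching idea.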

  20. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, complementing cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive of the mechanism of action of compounds, and can thus inform the appropriate selection of the more time-consuming and expensive biological evaluations required to elucidate a mechanism.

  1. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who did what, when, where, why, and how in Indonesian news articles was extracted by combining a Convolutional Neural Network with a Bidirectional Long Short-Term Memory network. The Convolutional Neural Network learns semantically meaningful representations of sentences, while the Bidirectional LSTM analyzes the relations among words in the sequence. We also use word2vec word embeddings for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms shallow methods, namely IBk, C4.5, and Naïve Bayes, whose F-measures were 0.655, 0.645, and 0.595, respectively.
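
The combined architecture can be sketched as a forward pass: a 1-D convolution over word embeddings feeds a bidirectional LSTM, whose final states are classified with a softmax. All sizes, the random weights, and the six 5W1H classes below are illustrative assumptions in NumPy, not the authors' trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def conv1d_relu(E, W, b):
    # E: (T, d) token embeddings; W: (k, d, f) filters; returns (T-k+1, f)
    k, d, f = W.shape
    out = np.stack([
        np.tensordot(E[t:t + k], W, axes=([0, 1], [0, 1])) + b
        for t in range(E.shape[0] - k + 1)])
    return np.maximum(out, 0.0)

def lstm_step(x, h, c, Wx, Wh, b):
    z = x @ Wx + h @ Wh + b                    # gates stacked: i, f, o, g
    H = h.shape[0]
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])
    c = f * c + i * g
    return o * np.tanh(c), c

def bilstm_final(X, Wx, Wh, b, H):
    # run an LSTM forwards and backwards; concatenate the two final states
    states = []
    for seq in (X, X[::-1]):
        h, c = np.zeros(H), np.zeros(H)
        for x in seq:
            h, c = lstm_step(x, h, c, Wx, Wh, b)
        states.append(h)
    return np.concatenate(states)

T, d, k, f, H, n_classes = 6, 8, 3, 4, 5, 6   # 6 classes: who/what/when/where/why/how
E = rng.normal(size=(T, d))                    # stand-in word2vec embeddings
feat = conv1d_relu(E, rng.normal(size=(k, d, f)), np.zeros(f))
vec = bilstm_final(feat, rng.normal(size=(f, 4*H)), rng.normal(size=(H, 4*H)),
                   np.zeros(4*H), H)
logits = vec @ rng.normal(size=(2*H, n_classes))
probs = np.exp(logits - logits.max()); probs /= probs.sum()
```

Training (backpropagation, word2vec lookup, sequence labeling details) is omitted; only the data flow of the CNN-BLSTM combination is shown.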

  2. Metaproteomics: extracting and mining proteome information to characterize metabolic activities in microbial communities.

    Science.gov (United States)

    Abraham, Paul E; Giannone, Richard J; Xiong, Weili; Hettich, Robert L

    2014-06-17

    Contemporary microbial ecology studies usually employ one or more "omics" approaches to investigate the structure and function of microbial communities. Among these, metaproteomics aims to characterize the metabolic activities of the microbial membership, providing a direct link between the genetic potential and functional metabolism. The successful deployment of metaproteomics research depends on the integration of high-quality experimental and bioinformatic techniques for uncovering the metabolic activities of a microbial community in a way that is complementary to other "meta-omic" approaches. The essential, quality-defining informatics steps in metaproteomics investigations are: (1) construction of the metagenome, (2) functional annotation of predicted protein-coding genes, (3) protein database searching, (4) protein inference, and (5) extraction of metabolic information. In this article, we provide an overview of current bioinformatic approaches and software implementations in metaproteome studies in order to highlight the key considerations needed for successful implementation of this powerful community-biology tool. Copyright © 2014 John Wiley & Sons, Inc.
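
Step (4), protein inference, is commonly handled by a parsimony rule: peptides shared among homologous proteins are explained by a minimal set of proteins. The greedy set-cover sketch below, with an invented peptide-to-protein map, illustrates the idea; production tools add scoring, grouping, and FDR control.

```python
def infer_proteins(observed_peptides, protein_peptides):
    """Greedy minimal-set-cover (Occam's-razor) protein inference."""
    remaining = set(observed_peptides)
    inferred = []
    while remaining:
        # pick the protein explaining the most still-unexplained peptides
        best = max(protein_peptides,
                   key=lambda p: len(protein_peptides[p] & remaining))
        covered = protein_peptides[best] & remaining
        if not covered:
            break                              # leftover peptides match nothing
        inferred.append(best)
        remaining -= covered
    return inferred

# Invented example: ProtB is subsumed by ProtA and should not be reported.
protein_peptides = {
    "ProtA": {"PEP1", "PEP2", "PEP3"},
    "ProtB": {"PEP2"},
    "ProtC": {"PEP4"},
}
proteins = infer_proteins({"PEP1", "PEP2", "PEP3", "PEP4"}, protein_peptides)
```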

  3. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  4. An innovative method for extracting isotopic information from low-resolution gamma spectra

    International Nuclear Information System (INIS)

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-01-01

    A method is described for the extraction of isotopic information from attenuated gamma-ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low-burnup Pu, 137Cs, and 133Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied.
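
The one-variable fit described above can be sketched as follows: for a binary blend, the measured spectrum is modeled as a mixture of two basis spectra and the mixing fraction is recovered by golden-section line optimization of a squared-error objective. The Gaussian "spectra" are synthetic stand-ins for NaI responses, and attenuation is ignored here.

```python
import numpy as np

def golden_section_min(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    g = (np.sqrt(5.0) - 1.0) / 2.0
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return 0.5 * (a + b)

ch = np.arange(512)                               # channel axis
b1 = np.exp(-0.5 * ((ch - 180.0) / 12.0) ** 2)    # basis spectrum, isotope 1
b2 = np.exp(-0.5 * ((ch - 320.0) / 15.0) ** 2)    # basis spectrum, isotope 2
measured = 0.3 * b1 + 0.7 * b2                    # noise-free test mixture

# alpha*b1 + (1-alpha)*b2 fit to the measured gross counts
alpha = golden_section_min(
    lambda a: np.sum((a * b1 + (1.0 - a) * b2 - measured) ** 2), 0.0, 1.0)
```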

  5. EnvMine: A text-mining system for the automatic extraction of contextual information

    Directory of Open Access Journals (Sweden)

    de Lorenzo Victor

    2010-06-01

    Background: For ecological studies, it is crucial to have adequate descriptions of the environments and samples being studied. Such a description must be made in terms of their physicochemical characteristics, allowing a direct comparison between different environments that would otherwise be difficult. The characterization must also include the precise geographical location, to make possible the study of geographical distributions and biogeographical patterns. Currently, there is no schema for annotating these environmental features, and these data have to be extracted from textual sources (published articles). So far, this had to be performed by manual inspection of the corresponding documents. To facilitate this task, we have developed EnvMine, a set of text-mining tools devoted to retrieving contextual information (physicochemical variables and geographical locations) from textual sources of any kind. Results: EnvMine is capable of retrieving the physicochemical variables cited in the text by means of the accurate identification of their associated units of measurement. In this task, the system achieves a recall (percentage of items retrieved) of 92% with less than 1% error. A Bayesian classifier was also tested for distinguishing parts of the text describing environmental characteristics from others dealing with, for instance, experimental settings. Regarding the identification of geographical locations, the system takes advantage of existing databases such as GeoNames to achieve 86% recall with 92% precision. The identification of a location also includes the determination of its exact coordinates (latitude and longitude), thus allowing the calculation of distances between individual locations. Conclusion: EnvMine is a very efficient method for extracting contextual information from different text sources, such as published articles or web pages. This tool can help in determining the precise location and physicochemical characteristics of the samples under study.
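
A minimal sketch of the two EnvMine tasks, with invented patterns: physicochemical variables are spotted through their associated units, and geographic coordinates through a degree/hemisphere pattern. The variable list, unit list, and regexes are illustrative assumptions, not EnvMine's actual rules.

```python
import re

# Variables recognized via their units or a known name (illustrative lists).
VAR_RE = re.compile(
    r"(?P<variable>pH|temperature|salinity|depth)\s*(?:of|was|:)?\s*"
    r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>°C|psu|m\b)?",
    re.IGNORECASE)

# Latitude/longitude with hemisphere letters, e.g. "36.7 N, 122.0 W".
COORD_RE = re.compile(
    r"(?P<lat>\d{1,2}(?:\.\d+)?)°?\s*(?P<ns>[NS])[,;]?\s*"
    r"(?P<lon>\d{1,3}(?:\.\d+)?)°?\s*(?P<ew>[EW])")

text = ("Samples were collected at 36.7 N, 122.0 W at a depth of 3200 m; "
        "temperature was 2.1 °C and pH 7.8.")
variables = {m.group("variable").lower(): m.group("value")
             for m in VAR_RE.finditer(text)}
coords = COORD_RE.search(text).groupdict()
```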

  6. Hanford high-level waste melter system evaluation data packages

    International Nuclear Information System (INIS)

    Elliott, M.L.; Shafer, P.J.; Lamar, D.A.; Merrill, R.A.; Grunewald, W.; Roth, G.; Tobie, W.

    1996-03-01

    The Tank Waste Remediation System is selecting a reference melter system for the Hanford High-Level Waste vitrification plant. A melter evaluation was conducted in FY 1994 to narrow the long list of potential melter technologies down to a few for testing. A formal evaluation was performed by a Melter Selection Working Group (MSWG), which met in June and August 1994. At the June meeting, MSWG evaluated 15 technologies and selected six for more thorough evaluation at the August meeting. All six were variations of joule-heated or induction-heated melters. Between the June and August meetings, Hanford site staff and consultants compiled data packages for each of the six melter technologies as well as variants of the baseline technologies. Information was solicited from candidate melter vendors to supplement existing information. This document contains the data packages compiled to provide background information to MSWG in support of the evaluation of the six technologies. (A separate evaluation was performed by Fluor Daniel, Inc. to identify balance-of-plant impacts if a given melter system was selected.)

  7. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    Science.gov (United States)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of `Digital Earth' as a multi-resolution, three-dimensional representation of the planet, built to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data in support of scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections, allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach in which data stored at distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter), desktop GIS tools and command-line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land-surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurances in the agriculture field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  8. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking the Bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are described. QuickBird high-spatial-resolution satellite data are used to extract prospecting information for uranium deposits in the Bashibulake area in the north of the Tarim Basin. Using appropriate image-processing methods, information on the ore-bearing bed, ore-controlling structures, and mineralized alteration has been extracted. The results show high consistency with the field survey. The aim of this study is to explore the practicability of high-spatial-resolution satellite data for mineral prospecting and to broaden prospecting approaches in similar areas. (authors)
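
One common building block of mineralized-alteration mapping on multispectral imagery is a band ratio; the sketch below flags possible iron-oxide staining with a red/green ratio. The threshold and the toy digital numbers are illustrative assumptions; real work would use calibrated reflectance, additional ratios, and field validation, and the study above does not specify this particular ratio.

```python
import numpy as np

def iron_oxide_mask(red, green, threshold=1.3):
    """Red/green band ratio; high values suggest iron-oxide staining."""
    # clip the denominator so zero-valued pixels do not divide by zero
    ratio = red.astype(float) / np.clip(green.astype(float), 1.0, None)
    return ratio, ratio > threshold

# toy 2x2 digital-number arrays (invented values)
red = np.array([[180, 100], [90, 120]])
green = np.array([[90, 100], [90, 100]])
ratio, mask = iron_oxide_mask(red, green)
```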

  9. Report on achievements in fiscal 1999 on research and development of human media under the industrial and scientific technology research and development institution. Research and development of a high-level petroleum plant information system technology; 1999 nendo human media no kenkyu kaihatsu seika hokokusho. Sekiyu plant kodo johoka system gijutsu kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    This paper describes the fiscal 1999 achievements in the development of an advanced petroleum plant information system technology, carried out as part of the research and development of human media. A prototype system was built in two stages: the phase-1 system, of a single-element centralized control type, and the phase-2 system, a multi-agent type aimed at the original goal of integration. For the interface agent, an integration function was developed, a voice-based interactive function was realized, and a personified agent function was operated on a trial basis. In addition, a database server for large-scale cases was developed. Coordination with other agents and interactive operation were realized by making the hypothetical plant display interface autonomous. Dynamic task sharing was realized by means of distributed coordination processing. A server and a message-producing system were completed, allowing processing to proceed cooperatively among multiple agents that share an ontology and the object models based on it. The definition display interface was implemented for the distributed coordination environment. (NEDO)

  10. Development of a partitioning method for the management of high-level liquid waste

    International Nuclear Information System (INIS)

    Kubota, M.; Dojiri, S.; Yamaguchi, I.; Morita, Y.; Yamagishi, I.; Kobayashi, T.; Tani, S.

    1989-01-01

    Fundamental studies, especially focused on the separation of neptunium and technetium, have been carried out to construct an advanced partitioning process that fractionates the elements in a high-level liquid waste into four groups: transuranium elements, technetium-noble metals, strontium-cesium, and other elements. For the separation of neptunium by solvent extraction, DIDPA proved excellent for extracting Np(V), and its extraction rate was accelerated by hydrogen peroxide. Np(V) was also found to separate quantitatively as a precipitate with oxalic acid. For the separation of technetium, denitration with formic acid was effective in precipitating it along with the noble metals, and adsorption on activated carbon was also effective for quantitative separation. Through these fundamental studies, the advanced partitioning process is presented as the candidate to be examined with an actual high-level liquid waste.

  11. Evaluation of radionuclide concentrations in high-level radioactive wastes

    International Nuclear Information System (INIS)

    Fehringer, D.J.

    1985-10-01

    This report describes a possible approach for developing a numerical definition of the term "high-level radioactive waste." Five wastes are identified which are recognized as high-level wastes under current, non-numerical definitions. The constituents of these wastes are examined and the most hazardous component radionuclides are identified. This report suggests that other wastes with similar concentrations of these radionuclides could also be defined as high-level wastes. 15 refs., 9 figs., 4 tabs.

  12. Development of a test system for high level liquid waste partitioning

    Directory of Open Access Journals (Sweden)

    Duan Wu H.

    2015-01-01

    The partitioning and transmutation strategy has attracted increasing interest for the safe treatment and disposal of high-level liquid waste, in which the partitioning of high-level liquid waste is one of the critical technical issues. An improved total partitioning process, including a tri-alkylphosphine oxide process for the removal of actinides, a crown-ether strontium extraction process for the removal of strontium, and a calixcrown-ether cesium extraction process for the removal of cesium, has been developed to treat Chinese high-level liquid waste. A test system containing 72-stage 10-mm-diam annular centrifugal contactors, a remote sampling system, a rotor-speed acquisition-monitoring system, a feeding system, and a video camera-surveillance system was successfully developed to carry out the hot test verifying the improved total partitioning process. The test system has been successfully used in a 160-hour hot test with genuine high-level liquid waste. During the hot test, the test system was stable, demonstrating that it is reliable for hot testing of high-level liquid waste partitioning.

  13. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows the chemical environment of a wide variety of materials to be studied. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of certain spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by treating the spectrum as a probability distribution. Another possibility is to perform multivariate statistical analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as their associated uncertainties, is presented. The methodologies are also applied to Mn oxidation-state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
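
Both quantitative routes mentioned above can be sketched briefly: moment-based shape parameters computed from a spectrum treated as a probability distribution, and PCA scores over a family of spectra. The Gaussian "emission lines" and their positions below are invented for illustration, not real Kβ data.

```python
import numpy as np

def spectral_moments(energy, counts):
    """Mean, variance, skewness, kurtosis of a spectrum seen as a pdf."""
    p = counts / counts.sum()
    mean = np.sum(energy * p)
    var = np.sum((energy - mean) ** 2 * p)
    std = np.sqrt(var)
    skew = np.sum(((energy - mean) / std) ** 3 * p)
    kurt = np.sum(((energy - mean) / std) ** 4 * p)
    return mean, var, skew, kurt

def pca_scores(spectra, n_components=2):
    """Project mean-centered spectra onto their leading principal axes."""
    X = spectra - spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

E = np.linspace(6470.0, 6510.0, 400)          # energy axis in eV (invented)
def line(center):                              # synthetic emission line
    return np.exp(-0.5 * ((E - center) / 2.5) ** 2)

# mixtures sliding between two oxidation-state line positions
spectra = np.stack([(1 - x) * line(6490.0) + x * line(6492.0)
                    for x in np.linspace(0.0, 1.0, 11)])
mean0, var0, skew0, kurt0 = spectral_moments(E, spectra[0])
scores = pca_scores(spectra)
```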

  14. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    Science.gov (United States)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for scenic appreciation, scientific research, and public education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional remote sensing interpretation, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing imagery to extract information on geological heritages. The Skyline software system is applied to fuse 0.36-m aerial images with a 5-m-interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture, and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, covering geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that remote sensing interpretation with this method is highly recognizable, making the interpretation more accurate and comprehensive.
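
One ingredient of making relief interpretable in such 3-D visual models is hillshading of the DEM; the sketch below uses the slope/aspect shading formula common to GIS packages. The cell size and sun position are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def hillshade(dem, cellsize=5.0, azimuth=315.0, altitude=45.0):
    """Shaded relief in [0, 1] for a DEM under an assumed sun position."""
    az, alt = np.radians(azimuth), np.radians(altitude)
    dzdy, dzdx = np.gradient(dem, cellsize)    # terrain slopes per axis
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdx, dzdy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

flat = hillshade(np.zeros((4, 4)))            # flat terrain: uniform shading
```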

  15. Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode

    Directory of Open Access Journals (Sweden)

    Renata Maria Abrantes Baracho

    2016-04-01

    This research addresses the use of information management processes to extract student dropout indicators in distance mode courses. Distance education in Brazil aims to facilitate access to information. The MEC (Ministry of Education) announced, in the second semester of 2013, that the main obstacles faced by institutions offering courses in this mode were student dropout and the resistance of both educators and students to this mode. The research used a mixed methodology, qualitative and quantitative, to obtain student dropout indicators. The factors found and validated in this research were: lack of interest from students, insufficient training of students in the use of the virtual learning environment, structural problems in the schools chosen to offer the course, students without e-mail, incoherent answers to course activities, and lack of knowledge on the part of the student when using the computer tool. The scenario considered was a course offered in distance mode called Aluno Integrado (Integrated Student).

  16. Measuring nuclear reaction cross sections to extract information on neutrinoless double beta decay

    Science.gov (United States)

    Cavallaro, M.; Cappuzzello, F.; Agodi, C.; Acosta, L.; Auerbach, N.; Bellone, J.; Bijker, R.; Bonanno, D.; Bongiovanni, D.; Borello-Lewin, T.; Boztosun, I.; Branchina, V.; Bussa, M. P.; Calabrese, S.; Calabretta, L.; Calanna, A.; Calvo, D.; Carbone, D.; Chávez Lomelí, E. R.; Coban, A.; Colonna, M.; D'Agostino, G.; De Geronimo, G.; Delaunay, F.; Deshmukh, N.; de Faria, P. N.; Ferraresi, C.; Ferreira, J. L.; Finocchiaro, P.; Fisichella, M.; Foti, A.; Gallo, G.; Garcia, U.; Giraudo, G.; Greco, V.; Hacisalihoglu, A.; Kotila, J.; Iazzi, F.; Introzzi, R.; Lanzalone, G.; Lavagno, A.; La Via, F.; Lay, J. A.; Lenske, H.; Linares, R.; Litrico, G.; Longhitano, F.; Lo Presti, D.; Lubian, J.; Medina, N.; Mendes, D. R.; Muoio, A.; Oliveira, J. R. B.; Pakou, A.; Pandola, L.; Petrascu, H.; Pinna, F.; Reito, S.; Rifuggiato, D.; Rodrigues, M. R. D.; Russo, A. D.; Russo, G.; Santagati, G.; Santopinto, E.; Sgouros, O.; Solakci, S. O.; Souliotis, G.; Soukeras, V.; Spatafora, A.; Torresi, D.; Tudisco, S.; Vsevolodovna, R. I. M.; Wheadon, R. J.; Yildirin, A.; Zagatto, V. A. B.

    2018-02-01

    Neutrinoless double beta decay (0vββ) is considered the best potential resource for accessing the absolute neutrino mass scale. Moreover, if observed, it will signal that neutrinos are their own anti-particles (Majorana particles). Presently, this physics case is one of the most important lines of research “beyond the Standard Model” and might guide the way towards a Grand Unified Theory of fundamental interactions. Since the 0vββ decay process involves nuclei, its analysis necessarily implies nuclear structure issues. In the NURE project, supported by a Starting Grant of the European Research Council (ERC), nuclear reactions of double charge-exchange (DCE) are used as a tool to extract information on the 0vββ Nuclear Matrix Elements. Indeed, in DCE reactions and ββ decay the initial and final nuclear states are the same and the transition operators have a similar structure. Thus the measurement of DCE absolute cross-sections can give crucial information on ββ matrix elements. In a wider view, the NUMEN international collaboration plans a major upgrade of the INFN-LNS facilities in the coming years in order to increase the experimental production of nuclei by at least two orders of magnitude, thus making feasible a systematic study of all the cases of interest as candidates for 0vββ.

  17. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert the time series of a digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., with no requirement for labeling of the time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of the time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
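
The three challenges above can be sketched in miniature: partition the signal range, map samples to symbols, and estimate a PFSA-style transition matrix. Quantile (maximum-entropy) partitioning is used here as a simple stand-in for the paper's mutual-information-maximizing partitioning, and the sine signal is a toy input.

```python
import numpy as np

def symbolize(x, n_symbols):
    """Map samples to symbols via quantile (maximum-entropy) partitioning."""
    # interior quantile cut points give near-equal occupation of each cell
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_symbols + 1)[1:-1])
    return np.searchsorted(edges, x)

def transition_matrix(symbols, n_symbols):
    """Row-normalized symbol-to-symbol transition counts (PFSA estimate)."""
    P = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        P[a, b] += 1.0
    rows = P.sum(axis=1, keepdims=True)
    return P / np.where(rows == 0.0, 1.0, rows)

x = np.sin(np.linspace(0.0, 20.0 * np.pi, 2000))  # toy periodic signal
s = symbolize(x, 4)
P = transition_matrix(s, 4)
```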

  18. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These techniques for analysis, modelling and simulation allow optimization of the control and functionality of devices developed using the materials under study, and have been tested using data obtained from experimental samples.

  19. Options for the disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    Mitchell, N.T.; Laughton, A.S.; Webb, G.A.M.

    1977-01-01

    The management of radioactive waste within the fuel cycle, especially the high-level wastes from reprocessing of nuclear fuel, is currently a matter of particular concern. In the short term (meaning a timescale of tens of years), management by engineered storage is considered to provide a satisfactory solution. Beyond this, however, the two main alternative options considered in the paper are: (a) disposal by burial in geologic formations on land; and (b) disposal by emplacement into or onto the seabed. The status of present knowledge on the land and seabed disposal options is reviewed, together with an assessment of the extent to which their reliability and safety can be judged on presently available information. Further information is needed on the environmental behaviour of radioactivity in the form of solidified waste in both situations in order to provide a more complete scientific assessment. Work done so far has clarified the areas where further research is most needed, for instance the modelling of the environmental transfer processes associated with the seabed option. This is discussed together with an indication of the research programmes now being pursued.

  20. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.
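
For background, the conventional single-wavelength four-step phase-shifting recovery that the proposed technique generalizes can be sketched as follows; the wavelength-multiplexing and 2π-ambiguity steps of the paper are not reproduced here, and the phase map and fringe parameters are invented.

```python
import numpy as np

# Four-step phase shifting: with shifts 0, pi/2, pi, 3pi/2 applied to
# I = A + B*cos(phi + delta), the object phase follows from an arctangent
# of intensity differences.
phi = np.linspace(-3.0, 3.0, 101)             # "true" object phase (toy 1-D map)
A, B = 2.0, 1.0                               # background and modulation
I = [A + B * np.cos(phi + delta)
     for delta in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]

# I[3]-I[1] = 2B*sin(phi) and I[0]-I[2] = 2B*cos(phi), so:
phi_rec = np.arctan2(I[3] - I[1], I[0] - I[2])
```

The subtraction-based crosstalk removal in the paper builds on exactly this kind of linear combination of phase-shifted intensities, extended across wavelengths.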

  1. Information Extraction and Dependency on Open Government Data (ogd) for Environmental Monitoring

    Science.gov (United States)

    Abdulmuttalib, Hussein

    2016-06-01

    Environmental monitoring practices support decision makers in government and private institutions, as well as environmentalists and planners among others. This support helps them act towards the sustainability of our environment and take efficient measures for protecting human beings in general, but it is difficult to extract useful information from OGD and to assure its quality for this purpose. Monitoring itself comprises detecting changes as they happen, or within the mitigation period, which means that any data source used for monitoring should reflect the information for the period of environmental monitoring; otherwise it is considered of little use, mere history. In this paper, information extraction and structuring from Open Government Data (OGD) useful for environmental monitoring is assessed, looking into availability, usefulness for a given type of environmental monitoring, repetition period, and dependencies. The assessment is performed on a small sample selected from OGD, bearing in mind the type of environmental change monitored, such as the increase and concentration of built-up areas, the reduction of green areas, or the change of temperature in a specific area. The World Bank noted in its blog that data are open if they are both technically open and legally open. The use of open data is thus regulated by published terms of use, or by an agreement implying certain conditions that do not violate those two requirements. Within the scope of this paper, I share the experience of using some OGD to support environmental monitoring work performed to mitigate carbon dioxide production by regulating energy consumption and by properly designing the test area's landscapes using Geodesign tactics, and I add to the results achieved by many

  2. Discovery of high-level tasks in the operating room

    NARCIS (Netherlands)

    Bouarfa, L.; Jonker, P.P.; Dankelman, J.

    2010-01-01

    Recognizing and understanding surgical high-level tasks from sensor readings is important for surgical workflow analysis. Surgical high-level task recognition is also a challenging task in ubiquitous computing because of the inherent uncertainty of sensor data and the complexity of the operating

  3. Characteristics of solidified high-level waste products

    International Nuclear Information System (INIS)

    1979-01-01

    The object of the report is to contribute to the establishment of a data bank for future preparation of codes of practice and standards for the management of high-level wastes. The work currently in progress on measuring the properties of solidified high-level wastes is being studied

  4. Process for solidifying high-level nuclear waste

    Science.gov (United States)

    Ross, Wayne A.

    1978-01-01

    The addition of a small amount of reducing agent to a mixture of a high-level radioactive waste calcine and glass frit before the mixture is melted will produce a more homogeneous glass which is leach-resistant and suitable for long-term storage of high-level radioactive waste products.

  5. Machine learning classification of surgical pathology reports and chunk recognition for information extraction noise reduction.

    Science.gov (United States)

    Napolitano, Giulio; Marshall, Adele; Hamilton, Peter; Gavin, Anna T

    2016-06-01

    Machine learning techniques for the text mining of cancer-related clinical documents have not been sufficiently explored. Here some techniques are presented for the pre-processing of free-text breast cancer pathology reports, with the aim of facilitating the extraction of information relevant to cancer staging. The first technique was implemented using the freely available software RapidMiner to classify the reports according to their general layout: 'semi-structured' and 'unstructured'. The second technique was developed using the open source language engineering framework GATE and aimed at the prediction of chunks of the report text containing information pertaining to the cancer morphology, the tumour size, its hormone receptor status and the number of positive nodes. The classifiers were trained and tested respectively on sets of 635 and 163 manually classified or annotated reports from the Northern Ireland Cancer Registry. The best result of 99.4% accuracy - which included only one semi-structured report predicted as unstructured - was produced by the layout classifier with the k-nearest-neighbour algorithm, using the binary term occurrence word vector type with stopword filter and pruning. For chunk recognition, the best results were found using the PAUM algorithm with the same parameters for all cases, except for the prediction of chunks containing cancer morphology. For semi-structured reports the performance ranged from 0.97 to 0.94 in precision and from 0.92 to 0.83 in recall, while for unstructured reports it ranged from 0.91 to 0.64 in precision and from 0.68 to 0.41 in recall. Poor results were found when the classifier was trained on semi-structured reports but tested on unstructured ones. These results show that it is possible and beneficial to predict the layout of reports, and that the accuracy of predicting which segments of a report may contain certain information is sensitive to the report layout and the type of information sought.
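As a toy illustration of the layout-classification recipe this record describes (binary term occurrence, stopword filtering, k-nearest neighbours), here is a minimal self-contained sketch. The sample "reports", labels, stopword list and Jaccard distance are invented stand-ins for illustration, not the RapidMiner pipeline from the study.

```python
# Minimal k-NN layout classifier over binary term-occurrence sets.
# Toy reports/labels below are invented, not data from the study.
from collections import Counter

STOPWORDS = {"the", "a", "of", "is", "in", "and"}

def to_terms(text):
    """Lowercase, split, drop stopwords; binary occurrence = a set of terms."""
    return {w for w in text.lower().split() if w not in STOPWORDS}

def knn_predict(train, query, k=3):
    """train: list of (term_set, label); vote among k nearest by Jaccard distance."""
    def dist(a, b):
        union = a | b
        return 1.0 - (len(a & b) / len(union) if union else 0.0)
    neighbours = sorted(train, key=lambda tl: dist(tl[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [
    (to_terms("tumour size: 12 mm morphology: ductal"), "semi-structured"),
    (to_terms("receptor status: positive nodes: 2"), "semi-structured"),
    (to_terms("the specimen shows a ductal carcinoma with clear margins"), "unstructured"),
    (to_terms("sections show invasive carcinoma and scattered positive nodes"), "unstructured"),
]
query = to_terms("tumour size: 8 mm morphology: lobular")
print(knn_predict(query=query, train=train))  # colon-delimited fields vote "semi-structured"
```

A real pipeline would add the pruning and richer tokenisation mentioned in the abstract; the point here is only the shape of the representation and the vote.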

  6. Study of time-frequency characteristics of single snores: extracting new information for sleep apnea diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Castillo Escario, Y.; Blanco Almazan, D.; Camara Vazquez, M.A.; Jane Campos, R.

    2016-07-01

    Obstructive sleep apnea (OSA) is a highly prevalent chronic disease, especially in the elderly and obese population. Despite constituting a huge health and economic problem, most patients remain undiagnosed due to limitations in current strategies. Therefore, it is essential to find cost-effective diagnostic alternatives. One of these novel approaches is the analysis of acoustic snoring signals. Snoring is an early symptom of OSA which carries pathophysiological information of high diagnostic value. For this reason, the main objective of this work is to study the characteristics of single snores of different types, from healthy and OSA subjects. To do that, we analyzed snoring signals from previous databases and developed an experimental protocol to record simulated OSA-related sounds and characterize the response of two commercial tracheal microphones. Automatic programs for filtering, downsampling, event detection and time-frequency analysis were built in MATLAB. We found that time-frequency maps and spectral parameters (central, mean and peak frequency and energy in the 100-500 Hz band) allow distinguishing regular snores of healthy subjects from non-regular snores and snores of OSA subjects. Regarding the two commercial microphones, we found that one of them was a suitable snoring sensor, while the other had too restricted a frequency response. Future work shall include a larger number of episodes and subjects, but our study has contributed to showing how important the differences between regular and non-regular snores can be for OSA diagnosis, and how much clinically relevant information can be extracted from time-frequency maps and spectral parameters of single snores. (Author)
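The spectral parameters this record lists (peak and mean/central frequency, and energy in the 100-500 Hz band) can be sketched with a plain FFT periodogram. The synthetic 180 Hz "snore" tone and the 4 kHz sampling rate below are assumptions for illustration, not data from the study, which used MATLAB time-frequency tools.

```python
# Hedged sketch: windowed periodogram -> peak frequency, spectral
# centroid, and fraction of energy in the 100-500 Hz band.
import numpy as np

def spectral_params(x, fs, band=(100.0, 500.0)):
    X = np.fft.rfft(x * np.hanning(len(x)))     # Hann window reduces leakage
    psd = np.abs(X) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    peak = float(freqs[np.argmax(psd)])                 # peak frequency
    mean = float(np.sum(freqs * psd) / np.sum(psd))     # spectral centroid
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_frac = float(psd[in_band].sum() / psd.sum())   # energy fraction in band
    return peak, mean, band_frac

fs = 4000                            # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic snore: 180 Hz tone plus a little noise.
snore = np.sin(2 * np.pi * 180 * t) + 0.1 * np.random.default_rng(0).standard_normal(len(t))
peak, mean, band_frac = spectral_params(snore, fs)
print(peak, mean, band_frac)
```

For a tone inside the band, nearly all energy falls in 100-500 Hz, which is the kind of contrast the authors use to separate regular from non-regular snores.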

  7. Quantum measurement information as a key to energy extraction from local vacuums

    International Nuclear Information System (INIS)

    Hotta, Masahiro

    2008-01-01

    In this paper, a protocol is proposed in which energy extraction from local vacuum states is possible by using quantum measurement information for the vacuum state of quantum fields. In the protocol, Alice, who stays at a spatial point, excites the ground state of the fields by a local measurement. Consequently, wave packets generated by Alice's measurement propagate through the vacuum to spatial infinity. Let us assume that Bob stays away from Alice and fails to catch the excitation energy when the wave packets pass in front of him. Next Alice announces her local measurement result to Bob by classical communication. Bob performs a local unitary operation depending on the measurement result. In this process, positive energy is released from the fields to Bob's apparatus performing the unitary operation. In the field systems, wave packets are generated with negative energy around Bob's location. Soon afterwards, the negative-energy wave packets begin to chase after the positive-energy wave packets generated by Alice and form loosely bound states.

  8. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
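Of the two column-locating methods this record names, the simpler one, center-of-mass refinement, can be sketched in a few lines. The Gaussian test blob and its "true" centre below are invented for illustration; the actual tool also implements 2D Gaussian fitting for higher precision.

```python
# Centre-of-mass estimate of an atom column position on a synthetic
# intensity patch (a 2D Gaussian centred at row 12.3, column 8.7).
from math import exp

def centre_of_mass(patch):
    """Intensity-weighted mean (row, col) of a 2D list of intensities."""
    total = wr = wc = 0.0
    for i, row in enumerate(patch):
        for j, v in enumerate(row):
            total += v
            wr += i * v
            wc += j * v
    return wr / total, wc / total

sigma = 2.0
patch = [[exp(-(((i - 12.3) ** 2) + ((j - 8.7) ** 2)) / (2 * sigma ** 2))
          for j in range(25)] for i in range(25)]
r0, c0 = centre_of_mass(patch)
print(round(r0, 2), round(c0, 2))  # recovers sub-pixel position ~ (12.3, 8.7)
```

On noisy experimental images a background subtraction step would precede this; that is where Gaussian fitting typically wins in precision.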

  9. Note on difference spectra for fast extraction of global image information.

    CSIR Research Space (South Africa)

    Van Wyk, BJ

    2007-06-01

    Full Text Available NOTE ON DIFFERENCE SPECTRA FOR FAST EXTRACTION OF GLOBAL IMAGE INFORMATION. B.J van Wyk*, M.A. van Wyk* and F. van den Bergh**. * French South African Technical Institute in Electronics (F'SATIE) at the Tshwane University of Technology, Private Bag X680, Pretoria 0001. ** Remote Sensing Research Group, Meraka Institute...

  10. Pyrochemical separation of radioactive components from inert materials in ICPP high-level calcined waste

    International Nuclear Information System (INIS)

    Del Debbio, J.A.; Nelson, L.O.; Todd, T.A.

    1995-05-01

    Since 1963, calcination of aqueous wastes from reprocessing of DOE-owned spent nuclear fuels has resulted in the accumulation of approximately 3800 m³ of high-level waste (HLW) at the Idaho Chemical Processing Plant (ICPP). The waste is in the form of a granular solid called calcine and is stored on site in stainless steel bins which are encased in concrete. Due to the leachability of ¹³⁷Cs and ⁹⁰Sr and possibly other radioactive components, the calcine is not suitable for final disposal. Hence, a process to immobilize calcine in glass is being developed. Since radioactive components represent less than 1 wt % of the calcine, separation of actinides and fission products from inert components is being considered to reduce the volume of HLW requiring final disposal. Current estimates indicate that compared to direct vitrification, a volume reduction factor of 10 could result in significant cost savings. Aqueous processes, which involve calcine dissolution in nitric acid followed by separation of actinide and fission products by solvent extraction and ion exchange methods, are being developed. Pyrochemical separation methods, which generate small volumes of aqueous wastes and do not require calcine dissolution, have been evaluated as alternatives to aqueous processes. This report describes three proposed pyrochemical flowsheets and presents the results of experimental studies conducted to evaluate their feasibility. The information presented is a consolidation of three reports, which should be consulted for experimental details.

  11. Text mining analysis of public comments regarding high-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Kugo, Akihide; Yoshikawa, Hidekazu; Shimoda, Hiroshi; Wakabayashi, Yasunaga

    2005-01-01

    In order to narrow the risk perception gap seen in social investigations between the general public and people who are involved in the nuclear industry, an analysis of public comments on high-level radioactive waste (HLW) disposal has been conducted to find the significant talking points with the general public for constructing an effective risk communication model of social risk information regarding HLW disposal. Text mining was introduced to examine public comments and identify the core public interest underlying them. The text mining method utilized clusters specific groups of words with negative meanings and then analyzes public understanding by employing text structural analysis to extract words from subjective expressions. Using these procedures, it was found that the public does not trust the nuclear fuel cycle promotion policy and shows signs of anxiety about the long-lasting technological reliability of waste storage. To develop effective social risk communication of HLW issues, these findings are expected to help experts in the nuclear industry to communicate with the general public more effectively to obtain their trust. (author)
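A much-simplified sketch of the clustering idea in this record: group comments by which negative-meaning terms they contain, using a small lexicon. Both the lexicon and the sample comments below are invented for illustration; the study applied a full text-mining toolchain with structural analysis of subjective expressions.

```python
# Group comments by the set of negative-meaning lexicon terms they contain.
# NEGATIVE_TERMS and the comments are invented stand-ins.
from collections import defaultdict

NEGATIVE_TERMS = {"distrust", "anxiety", "risk", "unreliable"}

comments = [
    "deep distrust of the fuel cycle policy",
    "anxiety about long-term storage reliability",
    "the storage technology seems unreliable",
    "general question about the disposal schedule",
]

clusters = defaultdict(list)
for comment in comments:
    hits = tuple(sorted(NEGATIVE_TERMS & set(comment.split())))
    clusters[hits].append(comment)

for terms, group in sorted(clusters.items()):
    print(terms, len(group))  # comments sharing no negative term fall under ()
```

Real comment corpora would need tokenisation, stemming and a vetted sentiment lexicon; the sketch only shows the grouping step.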

  12. High level bacterial contamination of secondary school students' mobile phones.

    Science.gov (United States)

    Kõljalg, Siiri; Mändar, Rando; Sõber, Tiina; Rööp, Tiiu; Mändar, Reet

    2017-06-01

    While contamination of mobile phones in the hospital has been found to be common in several studies, little information about bacterial abundance on phones used in the community is available. Our aim was to quantitatively determine the bacterial contamination of secondary school students' mobile phones. Altogether 27 mobile phones were studied. The contact plate method and microbial identification using a MALDI-TOF mass spectrometer were used for culture studies. Quantitative PCR reactions for detection of universal 16S rRNA, Enterococcus faecalis 16S rRNA and Escherichia coli allantoin permease were performed, and the presence of tetracycline (tetA, tetB, tetM), erythromycin (ermB) and sulphonamide (sul1) resistance genes was assessed. We found a high median bacterial count on secondary school students' mobile phones (10.5 CFU/cm²) and a median of 17,032 bacterial 16S rRNA gene copies per phone. Potentially pathogenic microbes (Staphylococcus aureus, Acinetobacter spp., Pseudomonas spp., Bacillus cereus and Neisseria flavescens) were found among dominant microbes more often on phones with a higher percentage of E. faecalis in total bacterial 16S rRNA. No differences in contamination level or dominating bacterial species were found between phone owners' genders or between phone types (touch screen/keypad). No antibiotic resistance genes were detected on mobile phone surfaces. Quantitative study methods revealed high level bacterial contamination of secondary school students' mobile phones.

  13. Multiple Word-Length High-Level Synthesis

    Directory of Open Access Journals (Sweden)

    Coussy Philippe

    2008-01-01

    Full Text Available Abstract Digital signal processing (DSP) applications are nowadays widely used and their complexity is ever growing. The design of dedicated hardware accelerators is thus still needed in system-on-chip and embedded systems. Realistic hardware implementation requires first converting the floating-point data of the initial specification into arbitrary-length (finite-precision) data while keeping an acceptable computation accuracy. Next, an optimized hardware architecture has to be designed. Considering a uniform bit-width specification allows the use of a traditional automated design flow; however, it leads to oversized designs. On the other hand, considering a non-uniform bit-width specification yields a smaller circuit but requires complex design tasks. In this paper, we propose an approach that inputs a C/C++ specification. The design flow, based on high-level synthesis (HLS) techniques, automatically generates a potentially pipelined RTL architecture described in VHDL. Both bit-accurate integer and fixed-point data types can be used in the input specification. The generated architecture uses components (operators, registers, etc.) that have different widths. The design constraints are the clock period and the throughput of the application. The proposed approach considers data word-length information in all the synthesis steps by using dedicated algorithms. We show in this paper the effectiveness of the proposed approach through several design experiments in the DSP domain.

  14. Multiple Word-Length High-Level Synthesis

    Directory of Open Access Journals (Sweden)

    Dominique Heller

    2008-09-01

    Full Text Available Digital signal processing (DSP) applications are nowadays widely used and their complexity is ever growing. The design of dedicated hardware accelerators is thus still needed in system-on-chip and embedded systems. Realistic hardware implementation requires first converting the floating-point data of the initial specification into arbitrary-length (finite-precision) data while keeping an acceptable computation accuracy. Next, an optimized hardware architecture has to be designed. Considering a uniform bit-width specification allows the use of a traditional automated design flow; however, it leads to oversized designs. On the other hand, considering a non-uniform bit-width specification yields a smaller circuit but requires complex design tasks. In this paper, we propose an approach that inputs a C/C++ specification. The design flow, based on high-level synthesis (HLS) techniques, automatically generates a potentially pipelined RTL architecture described in VHDL. Both bit-accurate integer and fixed-point data types can be used in the input specification. The generated architecture uses components (operators, registers, etc.) that have different widths. The design constraints are the clock period and the throughput of the application. The proposed approach considers data word-length information in all the synthesis steps by using dedicated algorithms. We show in this paper the effectiveness of the proposed approach through several design experiments in the DSP domain.
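The finite-precision conversion step these two records describe can be illustrated by quantising floating-point coefficients to a chosen fractional word length and measuring the error. The coefficients and word lengths below are invented for illustration; a real HLS flow selects per-variable widths automatically against accuracy constraints.

```python
# Fixed-point quantisation sketch: round each value to the nearest
# multiple of 2**-frac_bits and check the worst-case error bound.
def to_fixed(x, frac_bits):
    """Round x to the nearest representable value with frac_bits fraction bits."""
    step = 2.0 ** -frac_bits
    return round(x / step) * step

coeffs = [0.7071067, -0.3826834, 0.9238795]   # invented example coefficients
for frac_bits in (4, 8, 12):
    q = [to_fixed(c, frac_bits) for c in coeffs]
    err = max(abs(a - b) for a, b in zip(coeffs, q))
    # Round-to-nearest guarantees err <= step / 2 = 2**-(frac_bits + 1).
    print(frac_bits, err <= 2.0 ** -(frac_bits + 1))  # True for each width
```

The trade-off the papers automate is visible here: fewer fraction bits means cheaper hardware but larger quantisation error.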

  15. PLUTONIUM/HIGH-LEVEL VITRIFIED WASTE BDBE DOSE CALCULATION

    Energy Technology Data Exchange (ETDEWEB)

    J.A. Ziegler

    2000-11-20

    The purpose of this calculation is to provide a dose consequence analysis of high-level waste (HLW) consisting of plutonium immobilized in vitrified HLW to be handled at the proposed Monitored Geologic Repository at Yucca Mountain for a beyond design basis event (BDBE) under expected conditions using best estimate values for each calculation parameter. In addition to the dose calculation, a plutonium respirable particle size for dose calculation use is derived. The current concept for this waste form is plutonium disks enclosed in cans immobilized in canisters of vitrified HLW (i.e., glass). The plutonium inventory at risk used for this calculation is selected from Plutonium Immobilization Project Input for Yucca Mountain Total Systems Performance Assessment (Shaw 1999). The BDBE examined in this calculation is a nonmechanistic initiating event and the sequence of events that follow to cause a radiological release. This analysis will provide the radiological releases and dose consequences for a postulated BDBE. Results may be considered in other analyses to determine or modify the safety classification and quality assurance level of repository structures, systems, and components. This calculation uses best available technical information because the BDBE frequency is very low (i.e., less than 1.0E-6 events/year) and is not required for License Application for the Monitored Geologic Repository. The results of this calculation will not be used as part of a licensing or design basis.

  16. Improvement of Control Infrastructure and High Level Application for KOMAC LINAC

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young-Gi; Kim, Jae-Ha; Ahn, Tae-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Atomic Energy Research Institute, Gyeongju (Korea, Republic of)

    2015-10-15

    The Korea multi-purpose accelerator complex (KOMAC) has two beam extraction points at 20 and 100 MeV for proton beam utilization. There are about 70 control systems for controlling the KOMAC subsystems, such as the ion source, the radio frequency, the diagnostic devices, the magnet power supply, and the cooling system. The infrastructure, which includes the network system, local controllers, and the control system environment, needed to be upgraded to process a growing number of process variables without failure. The Experimental Physics and Industrial Control System (EPICS) based high-level control environment, which includes alarms and data archiving, was changed to support the improved infrastructure of the KOMAC control system. In this paper, we describe the improvement of the infrastructure for the KOMAC control system and the EPICS-based high-level application. We improved the control network environment and the EPICS-based high-level application to enhance the KOMAC control system.

  17. High-level waste immobilization program: an overview

    International Nuclear Information System (INIS)

    Bonner, W.R.

    1979-09-01

    The High-Level Waste Immobilization Program is providing technology to allow safe, affordable immobilization and disposal of nuclear waste. Waste forms and processes are being developed on a schedule consistent with national needs for immobilization of high-level wastes stored at Savannah River, Hanford, Idaho National Engineering Laboratory, and West Valley, New York. This technology is directly applicable to high-level wastes from potential reprocessing of spent nuclear fuel. The program is removing one more obstacle previously seen as a potential restriction on the use and further development of nuclear power, and is thus meeting a critical technological need within the national objective of energy independence

  18. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network-collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...
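One concrete analysis such WiFi traces enable is an occupancy estimate: counting distinct devices seen per access point per time slot. The trace rows below are invented for illustration; real input would come from the network monitoring logs the paper describes.

```python
# Occupancy sketch: distinct devices per (hour, access point) from a
# WiFi trace of (hour, device_id, access_point) observations.
from collections import defaultdict

trace = [  # invented sample trace
    (9, "dev1", "ward-A"), (9, "dev2", "ward-A"), (9, "dev1", "ward-A"),
    (9, "dev3", "ward-B"), (10, "dev1", "ward-B"), (10, "dev2", "ward-A"),
]

occupancy = defaultdict(set)
for hour, device, ap in trace:
    occupancy[(hour, ap)].add(device)   # sets deduplicate repeated sightings

counts = {key: len(devs) for key, devs in sorted(occupancy.items())}
print(counts)  # {(9, 'ward-A'): 2, (9, 'ward-B'): 1, (10, 'ward-A'): 1, (10, 'ward-B'): 1}
```

Aggregates like this feed directly into the spatio-temporal visualizations the authors mention.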

  19. High-level Waste Long-term management technology development

    International Nuclear Information System (INIS)

    Choi, Jong Won; Kang, C. H.; Ko, Y. K.

    2012-02-01

    The purpose of this project is to develop a long-term management system (A-KRS) which deals with spent fuels from domestic nuclear power stations, HLW from the advanced fuel cycle, and other wastes that are not admitted to the LILW disposal site. The project also demonstrates the feasibility and reliability of the key technologies applied in the A-KRS by evaluating them under in-situ conditions, such as in an underground research laboratory, and provides important information to establish the safety assessment and long-term management plan. To develop the technologies for high-level radioactive waste disposal, demonstrate their reliability under in-situ conditions and establish the safety assessment of the disposal system, the major objectives of this project are the following: - An advanced disposal system including waste containers for HLW from the advanced fuel cycle and pyroprocessing has been developed. - Quantitative assessment tools for long-term safety and performance assessment of a radwaste disposal system have been developed. - Hydrological and geochemical investigation and interpretation methods have been developed to evaluate deep geological environments. - The THMC characteristics of the engineered barrier system and near-field have been evaluated by in-situ experiments. - The migration and retardation of radionuclides and colloid materials in a deep geological environment have been investigated. The results from this project will provide important information to show that the HLW disposal plan is safe and reliable. The knowledge from this project can also contribute to environmental conservation by applying it in the oil and gas industries to store their wastes safely.

  20. Extraction as a source of additional information when concentrations in multicomponent systems are simultaneously determined

    International Nuclear Information System (INIS)

    Perkov, I.G.

    1988-01-01

    Using the photometric determination of Nd and Sm in their joint presence as an example, the possibility of using the influence of extraction to increase the analytical signal is considered. It is shown that interligand exchange in extracts, in combination with simultaneous determination of concentrations, can be used as a simple means of increasing the accuracy of determination. 5 refs.; 2 figs.; 3 tabs

  1. R and D Activities on high-level nuclear waste management

    International Nuclear Information System (INIS)

    Watanabe, Shosuke

    1985-01-01

    High-level liquid waste (HLLW) at the Tokai Reprocessing Plant has been generated from reprocessing of spent fuels from light water reactors, and successfully managed since 1977. As of 1984, about 154 m³ of HLLW from 170 tons of spent fuels were stored in three high-integrity stainless steel tanks (90 m³ each) as a nitric acid aqueous solution. The HLLW arises mainly from the first-cycle solvent extraction phase. Alkaline solution used to scrub the extraction solvent is another source of HLLW. The Advisory Committee on Radioactive Waste Management reported the concept for disposal of high-level waste (HLW) in Japan in its 1980 report: that the waste be solidified into borosilicate glass and then disposed in a deep geologic formation so as to minimize the influence of the waste on the human environment, with the aid of a multibarrier system combining natural and engineered barriers

  2. Technical career opportunities in high-level radioactive waste management

    International Nuclear Information System (INIS)

    1993-01-01

    Technical career opportunities in high-level radioactive waste management are briefly described in the areas of: Hydrology; geology; biological sciences; mathematics; engineering; heavy equipment operation; and skilled labor and crafts

  3. Long-term high-level waste technology program

    International Nuclear Information System (INIS)

    1980-04-01

    The Department of Energy (DOE) is conducting a comprehensive program to isolate all US nuclear wastes from the human environment. The DOE Office of Nuclear Energy - Waste (NEW) has full responsibility for managing the high-level wastes resulting from defense activities and additional responsibility for providing the technology to manage existing commercial high-level wastes and any that may be generated in one of several alternative fuel cycles. Responsibilities of the Three Divisions of DOE-NEW are shown. This strategy document presents the research and development plan of the Division of Waste Products for long-term immobilization of the high-level radioactive wastes resulting from chemical processing of nuclear reactor fuels and targets. These high-level wastes contain more than 99% of the residual radionuclides produced in the fuels and targets during reactor operations. They include essentially all the fission products and most of the actinides that were not recovered for use

  4. Glasses used for the high level radioactive wastes storage

    International Nuclear Information System (INIS)

    Sombret, C.

    1983-06-01

    High-level radioactive wastes generated by the reprocessing of spent fuels are an important concern in the conditioning of radioactive wastes. This paper deals with the status of knowledge about glasses used for the treatment of these liquids [fr

  5. Validation and extraction of molecular-geometry information from small-molecule databases.

    Science.gov (United States)

    Long, Fei; Nicholls, Robert A; Emsley, Paul; Gražulis, Saulius; Merkys, Andrius; Vaitkus, Antanas; Murshudov, Garib N

    2017-02-01

    A freely available small-molecule structure database, the Crystallography Open Database (COD), is used for the extraction of molecular-geometry information on small-molecule compounds. The results are used for the generation of new ligand descriptions, which are subsequently used by macromolecular model-building and structure-refinement software. To increase the reliability of the derived data, and therefore the new ligand descriptions, the entries from this database were subjected to very strict validation. The selection criteria made sure that the crystal structures used to derive atom types, bond and angle classes are of sufficiently high quality. Any suspicious entries at a crystal or molecular level were removed from further consideration. The selection criteria included (i) the resolution of the data used for refinement (entries solved at 0.84 Å resolution or higher) and (ii) the structure-solution method (structures must be from a single-crystal experiment and all atoms of generated molecules must have full occupancies), as well as basic sanity checks such as (iii) consistency between the valences and the number of connections between atoms, (iv) acceptable bond-length deviations from the expected values and (v) detection of atomic collisions. The derived atom types and bond classes were then validated using high-order moment-based statistical techniques. The results of the statistical analyses were fed back to fine-tune the atom typing. The developed procedure was repeated four times, resulting in fine-grained atom typing, bond and angle classes. The procedure will be repeated in the future as and when new entries are deposited in the COD. The whole procedure can also be applied to any source of small-molecule structures, including the Cambridge Structural Database and the ZINC database.
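The selection criteria this record enumerates lend themselves to a simple filter sketch. The entry records, field names and thresholds below are invented stand-ins that mirror the stated criteria (resolution cut-off of 0.84 Å or better, full occupancies, acceptable bond-length deviation); the actual validation also checks valence consistency and atomic collisions.

```python
# Hedged sketch of COD-style entry filtering. Field names and the
# bond-deviation threshold are assumptions for illustration.
def passes_filters(entry, max_resolution=0.84, max_bond_dev=0.02):
    """Return True only if the entry meets the toy quality criteria."""
    if entry["resolution"] > max_resolution:            # criterion (i): resolution
        return False
    if any(occ < 1.0 for occ in entry["occupancies"]):  # part of criterion (ii)
        return False
    # criterion (iv): observed bond lengths close to expected values
    devs = [abs(obs - exp) for obs, exp in entry["bonds"]]
    return max(devs) <= max_bond_dev

good = {"resolution": 0.80, "occupancies": [1.0, 1.0], "bonds": [(1.54, 1.53)]}
bad = {"resolution": 0.95, "occupancies": [1.0], "bonds": [(1.54, 1.53)]}
print(passes_filters(good), passes_filters(bad))  # True False
```

In the real pipeline, entries passing such filters feed the atom-typing statistics, which are then refined iteratively as the abstract describes.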

  6. Extracting respiratory information from seismocardiogram signals acquired on the chest using a miniature accelerometer

    International Nuclear Information System (INIS)

    Pandia, Keya; Inan, Omer T; Kovacs, Gregory T A; Giovangrandi, Laurent

    2012-01-01

    Seismocardiography (SCG) is a non-invasive measurement of the vibrations of the chest caused by the heartbeat. SCG signals can be measured using a miniature accelerometer attached to the chest, and are thus well-suited for unobtrusive and long-term patient monitoring. Additionally, SCG contains information relating to both cardiovascular and respiratory systems. In this work, algorithms were developed for extracting three respiration-dependent features of the SCG signal: intensity modulation, timing interval changes within each heartbeat, and timing interval changes between successive heartbeats. Simultaneously with a reference respiration belt, SCG signals were measured from 20 healthy subjects and a respiration rate was estimated using each of the three SCG features and the reference signal. The agreement between each of the three accelerometer-derived respiration rate measurements was computed with respect to the respiration rate derived from the reference respiration belt. The respiration rate obtained from the intensity modulation in the SCG signal was found to be in closest agreement with the respiration rate obtained from the reference respiration belt: the bias was found to be 0.06 breaths per minute with a 95% confidence interval of −0.99 to 1.11 breaths per minute. The limits of agreement between the respiration rates estimated using SCG (intensity modulation) and the reference were within the clinically relevant ranges given in existing literature, demonstrating that SCG could be used for both cardiovascular and respiratory monitoring. Furthermore, phases of each of the three SCG parameters were investigated at four instances of a respiration cycle—start inspiration, peak inspiration, start expiration, and peak expiration—and during breath hold (apnea). The phases of the three SCG parameters observed during the respiration cycle were congruent with existing literature and physiologically expected trends. (paper)
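The first SCG feature this record names, intensity modulation, can be sketched by extracting the beat-amplitude envelope and counting its cycles. The synthetic signal below (a 60 bpm "cardiac" tone amplitude-modulated at 15 breaths per minute) and the envelope method are assumptions for illustration, not the authors' algorithm.

```python
# Respiration-rate sketch from intensity modulation: rectify, smooth
# over ~one beat, then count upward mean-crossings of the envelope.
import numpy as np

fs = 100                                   # Hz, assumed sampling rate
t = np.arange(0, 60, 1.0 / fs)             # one minute of data
carrier = np.sin(2 * np.pi * 1.0 * t)      # 60 bpm cardiac component
envelope_true = 1.0 + 0.4 * np.sin(2 * np.pi * 0.25 * t)  # 15 breaths/min
scg = envelope_true * carrier

# Envelope via rectification + 1 s moving average (spans one beat).
win = fs
env = np.convolve(np.abs(scg), np.ones(win) / win, mode="same")

# Each respiration cycle gives one upward crossing of the envelope mean.
centred = env - env.mean()
crossings = int(np.sum((centred[:-1] < 0) & (centred[1:] >= 0)))
print(crossings)  # cycles counted in one minute ~ respiration rate
```

A belt-referenced evaluation like the paper's would compare this count against the reference rate over many subjects.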

  7. Extracting key information from historical data to quantify the transmission dynamics of smallpox

    Directory of Open Access Journals (Sweden)

    Brockmann Stefan O

    2008-08-01

    Full Text Available Abstract Background Quantification of the transmission dynamics of smallpox is crucial for optimizing intervention strategies in the event of a bioterrorist attack. This article reviews basic methods and findings in mathematical and statistical studies of smallpox which estimate key transmission parameters from historical data. Main findings First, critically important aspects in extracting key information from historical data are briefly summarized. We mention different sources of heterogeneity and potential pitfalls in utilizing historical records. Second, we discuss how smallpox spreads in the absence of interventions and how the optimal timing of quarantine and isolation measures can be determined. Case studies demonstrate the following. (1) The upper confidence limit of the 99th percentile of the incubation period is 22.2 days, suggesting that quarantine should last 23 days. (2) The highest frequency (61.8%) of secondary transmissions occurs 3–5 days after onset of fever, so that infected individuals should be isolated before the appearance of rash. (3) The U-shaped age-specific case fatality implies a vulnerability of infants and the elderly among non-immune individuals. Estimates of the transmission potential are subsequently reviewed, followed by an assessment of vaccination effects and of the expected effectiveness of interventions. Conclusion Current debates on bioterrorism preparedness indicate that public health decision making must account for the complex interplay and balance between vaccination strategies and other public health measures (e.g. case isolation and contact tracing), taking into account the frequency of adverse events to vaccination. In this review, we summarize what has already been clarified and point out the need to analyze previous smallpox outbreaks systematically.
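The quarantine-length reasoning in case study (1), a high percentile of the incubation-period distribution plus an upper confidence limit, can be sketched as follows with synthetic data; the log-normal fit and bootstrap are standard choices for this kind of estimate, not necessarily those of the reviewed studies.

```python
import math
import random

random.seed(1)
# Synthetic stand-in for historical incubation periods (days), NOT the smallpox records
data = [random.lognormvariate(math.log(12), 0.25) for _ in range(100)]

def p99(sample):
    # Parametric 99th percentile from a log-normal fit to the sample
    logs = [math.log(x) for x in sample]
    mu = sum(logs) / len(logs)
    sd = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))
    return math.exp(mu + 2.326 * sd)   # z for the 99th percentile is about 2.326

# Bootstrap the 97.5% upper confidence limit of the 99th percentile
boots = sorted(p99(random.choices(data, k=len(data))) for _ in range(1000))
upper = boots[974]
print(f"quarantine should cover about {math.ceil(upper)} days")
```

Rounding the upper confidence limit up to whole days mirrors the 22.2 days → 23 days step in the case study.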

  8. Development of melt compositions for sulphate bearing high level waste

    International Nuclear Information System (INIS)

    Jahagirdar, P.B.; Wattal, P.K.

    1997-09-01

    The report deals with the development and characterization of vitreous matrices for sulphate bearing high level waste. Studies were conducted in sodium borosilicate and lead borosilicate systems with the introduction of CaO, BaO, MgO etc. The lead borosilicate system was found to be compatible with sulphate bearing high level wastes. Detailed product evaluation carried out on selected formulations is also described. (author)

  9. Properties and characteristics of high-level waste glass

    International Nuclear Information System (INIS)

    Ross, W.A.

    1977-01-01

    This paper has briefly reviewed many of the characteristics and properties of high-level waste glasses. From this review, it can be noted that glass has many desirable properties for solidification of high-level wastes. The most important of these include: (1) its low leach rate; (2) the ability to tolerate large changes in waste composition; (3) the tolerance of anticipated storage temperatures; and (4) its low surface area even after thermal shock or impact.

  10. High-Level Waste System Process Interface Description

    International Nuclear Information System (INIS)

    D'Entremont, P.D.

    1999-01-01

    The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will eventually be disposed of in a Federal Repository; Saltstone, to be buried on site; and treated water effluent, which is released to the environment.

  11. Data base system for research and development of high-level waste conditioning

    International Nuclear Information System (INIS)

    Masaki, Toshio; Igarashi, Hiroshi; Ohuchi, Jin; Miyauchi, Tomoko.

    1992-01-01

    Results of research and development for high-level waste conditioning are accumulated in a large number of documents. The Data Base System for Research and Development of High-Level Waste Conditioning has been developed since 1987 to search for necessary information correctly and rapidly, with the intention of offering and transferring the results to organizations inside and outside of PNC. This data base system has made it possible to search technical information correctly and rapidly. Designing of devices and preparation of reports have become easier, and work has been accomplished efficiently and rationally. (author)

  12. Development of a geoscience database for preselecting China's high level radioactive waste disposal sites

    International Nuclear Information System (INIS)

    Li Jun; Fan Ai; Huang Shutao; Wang Ju

    1998-01-01

    Taking the development of a geoscience database for China's high level waste disposal sites: Yumen Town, Gansu Province, northwest of China, as an example, the author introduces in detail the application of Geographical Information System (GIS) to high level waste disposal and analyses its application prospects in other fields. The development of GIS provides brand-new thinking for administrators and technicians at all levels. At the same time, the author also introduces the administration of maps and materials by using the Geographical Information System.

  13. Development of a geoscience database for preselecting China's high level radioactive waste disposal sites

    International Nuclear Information System (INIS)

    Li Jun; Fan Ai; Huang Shutao; Wang Ju

    2004-01-01

    Taking the development of a geoscience database for China's high level waste disposal sites: Yumen Town, Gansu Province, northwest of China, as an example, this paper introduces in detail the application of Geographical Information System (GIS) to high level waste disposal and analyses its application prospects in other fields. The development of GIS provides brand-new thinking for administrators and technicians at all levels. At the same time, this paper also introduces the administration of maps and materials by using the Geographical Information System. (author)

  14. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    Science.gov (United States)

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to low nmol L⁻¹ concentration ranges, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L⁻¹ by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L⁻¹ (median = 0.030 nmol L⁻¹) for NTME and from 0.001 to 5.684 nmol L⁻¹ (median = 0.043 nmol L⁻¹) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.
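For context on the reported detection limits, a common way to estimate a limit of detection from a calibration series is LOD = 3.3·s/m, where s is the residual standard deviation and m the slope of the calibration line. The sketch below uses invented concentrations and detector responses, not the paper's calibration data.

```python
import statistics

# Hypothetical calibration points spanning the calibrated range (nmol/L vs counts)
conc = [0.05, 0.5, 5.0, 20.0, 62.0]
resp = [110, 1050, 10300, 41000, 127000]

# Ordinary least-squares fit of response = m * conc + b
n = len(conc)
m = (n * sum(c * r for c, r in zip(conc, resp)) - sum(conc) * sum(resp)) / \
    (n * sum(c * c for c in conc) - sum(conc) ** 2)
b = (sum(resp) - m * sum(conc)) / n

# Residual standard deviation of the calibration, then LOD = 3.3 * s / m
resid = [r - (m * c + b) for c, r in zip(conc, resp)]
s = statistics.stdev(resid)
lod = 3.3 * s / m
print(f"slope = {m:.1f} counts per nmol/L, LOD = {lod:.3f} nmol/L")
```

With well-behaved calibration data this yields LODs in the same sub-nmol L⁻¹ range as those reported above.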

  15. The precautionary principle and high-level nuclear waste policy

    International Nuclear Information System (INIS)

    Frishman, S.

    1999-01-01

    The 'Precautionary Principle' has grown from the broadening observation that there is compelling evidence that damage to humans and the world-wide environment is of such a magnitude and seriousness that new principles for conducting human activities are necessary. One of the various statements of the Precautionary Principle is: when an activity raises threats of harm to human health or the environment, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically. The use of a precautionary principle was a significant recommendation emerging from the 1992 United Nations Conference on Environment and Development, held in Rio de Janeiro, Brazil, and it is gaining acceptance in discussions ranging from global warming to activities that affect the marine environment, and far beyond. In the US high-level nuclear waste policy, there is a growing trend on the part of geologic repository proponents and regulators to shift the required safety evaluation from a deterministic analysis of natural and engineered barriers and their interactions to risk assessments and total system waste containment and isolation performance assessment. This is largely a result of the realisation that scientific 'proof' of safety cannot be demonstrated to the level repository proponents have led the American public to expect. Therefore, they are now developing other methods in an attempt to effectively lower the repository safety expectations of the public. Implicit in this shift in demonstration of 'proof' is that levels of uncertainty far larger than those generally taken as scientifically acceptable must be accepted in repository safety, simply because greater certainty is either too costly, in time and money, or impossible to achieve at the potential Yucca Mountain repository site. In the context of the Precautionary Principle, the repository proponent must bear the burden of providing 'Acceptable' proof, established by an open

  16. High-Level Waste Systems Plan. Revision 7 (U)

    International Nuclear Information System (INIS)

    Brooke, J.N.; Gregory, M.V.; Paul, P.; Taylor, G.; Wise, F.E.; Davis, N.R.; Wells, M.N.

    1996-10-01

    This revision of the High-Level Waste (HLW) System Plan aligns SRS HLW program planning with the DOE Savannah River (DOE-SR) Ten Year Plan (QC-96-0005, Draft 8/6), which was issued in July 1996. The objective of the Ten Year Plan is to complete cleanup at most nuclear sites within the next ten years. The two key principles of the Ten Year Plan are to accelerate the reduction of the most urgent risks to human health and the environment and to reduce mortgage costs. Accordingly, this System Plan describes the HLW program that will remove HLW from all 24 old-style tanks, and close 20 of those tanks, by 2006, with vitrification of all HLW by 2018. To achieve these goals, the DWPF canister production rate is projected to climb to 300 canisters per year starting in FY06 and remain at that rate through the end of the program in FY18. (Compare that to past System Plans, in which DWPF production peaked at 200 canisters per year and the program did not complete until 2026.) An additional $247M (FY98 dollars) must be made available as requested over the ten-year planning period, including a one-time $10M to enhance Late Wash attainment. If appropriate resources are made available, facility attainment issues are resolved, and regulatory support is sufficient, then completion of the HLW program in 2018 would achieve a $3.3 billion cost savings to DOE versus the cost of completing the program in 2026. Facility status information is current as of October 31, 1996.

  17. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Full Text Available Web mining research is becoming more important these days because a large amount of information is managed through the web, and web usage is expanding in an uncontrolled way. A dedicated framework is required for handling such a large amount of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology based on Bayesian Networks (BN), learning to extract web data and discover attributes with a Bayesian approach. Motivated by that investigation, we propose a web content mining methodology based on a Deep Learning algorithm. Deep Learning is preferred over BN on the basis that BN does not fit into a learning-architecture design like the proposed system. The main objective of this investigation is web document extraction using different classification algorithms and their analysis. This work extracts the data from web URLs and presents three classification algorithms: Deep Learning, Naive Bayes and BPNN. Deep Learning is a powerful set of techniques for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometrics; it is a simple classification technique, applicable to subsets of large fields, and requires less time for classification. Naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then used for classification. Initially the training and testing dataset contains many URLs. We extract the content from the dataset. The

  18. Outline of facility for studying high level radioactive materials (CPF) and study programmes

    International Nuclear Information System (INIS)

    Sakamoto, Motoi

    1983-01-01

    The Chemical Processing Facility (CPF) for studying high level radioactive materials in the Tokai Works of the Power Reactor and Nuclear Fuel Development Corp. is a facility for fundamental studies, centering around hot cells, necessary for the development of fuel recycle techniques for fast breeder reactors, an important point of the nuclear fuel cycle, and of the techniques for processing and disposing of high level radioactive liquid wastes. The operation of the facility started in 1982, for both system A (tests of fuel recycle for fast breeder reactors) and system B (tests of vitrification of high level liquid wastes). In this report, the outline of the facility, the contents of the tests and the application of the results are described. For the fuel recycle tests, the hot test of the spent fuel pins of the JOYO MK-1 core was started, and the uranium and plutonium extraction test is now underway. The scheduled tests cover fuel solubility, the confirmation of residual properties in fuel melting, the confirmation of extraction conditions, the electrolytic reduction of plutonium, off-gas behaviour and material reliability. For the vitrification tests of high level liquid wastes, the fundamental test on the solidification techniques for the actual high level wastes eluted from the Tokai reprocessing plant has been started, and the following tests are programmed: assessment of the properties of actual liquid wastes, denitration and concentration tests, vitrification tests, off-gas treatment tests, evaluation of solidified wastes, and storage of solidified wastes. These test results are to be reflected in the safety review and the demonstration operation of a vitrification pilot plant. (Wakatsuki, Y.)

  19. Overview: Defense high-level waste technology program

    International Nuclear Information System (INIS)

    Shupe, M.W.; Turner, D.A.

    1987-01-01

    Defense high-level waste generated by atomic energy defense activities is stored on an interim basis at three U.S. Department of Energy (DOE) operating locations: the Savannah River Plant in South Carolina, the Hanford Site in Washington, and the Idaho National Engineering Laboratory in Idaho. Responsibility for the permanent disposal of this waste resides with DOE's Office of Defense Waste and Transportation Management. The objective of the Defense High-Level Waste Technology Program is to develop the technology for ending interim storage and achieving permanent disposal of all U.S. defense high-level waste. New and readily retrievable high-level waste will be immobilized for disposal in a geologic repository. Other high-level waste will be stabilized in place if, after completion of the National Environmental Policy Act (NEPA) process, it is determined, on a site-specific basis, that this option is safe, cost effective and environmentally sound. The immediate program focus is on implementing the waste disposal strategy selected in compliance with the NEPA process at Savannah River, while continuing progress toward development of final waste disposal strategies at Hanford and Idaho. This paper presents an overview of the technology development program which supports these waste management activities and an assessment of the impact that recent and anticipated legal and institutional developments are expected to have on the program.

  20. Information extraction from dynamic PS-InSAR time series using machine learning

    Science.gov (United States)

    van de Kerkhof, B.; Pankratius, V.; Chang, L.; van Swol, R.; Hanssen, R. F.

    2017-12-01

    Due to the increasing number of SAR satellites, with shorter repeat intervals and higher resolutions, SAR data volumes are exploding. Time series analyses of SAR data, i.e. Persistent Scatterer (PS) InSAR, enable the deformation monitoring of the built environment at an unprecedented scale, with hundreds of scatterers per km², updated weekly. Potential hazards, e.g. due to failure of aging infrastructure, can be detected at an early stage. Yet, this requires the operational data processing of billions of measurement points, over hundreds of epochs, updating this data set dynamically as new data come in, and testing whether points (start to) behave in an anomalous way. Moreover, the quality of PS-InSAR measurements is ambiguous and heterogeneous, which will yield false positives and false negatives. Such analyses are numerically challenging. Here we extract relevant information from PS-InSAR time series using machine learning algorithms. We cluster (group together) time series with similar behaviour, even though they may not be spatially close, such that the results can be used for further analysis. First we reduce the dimensionality of the dataset in order to be able to cluster the data, since applying clustering techniques on high-dimensional datasets often yields unsatisfactory results. Our approach is to apply t-distributed Stochastic Neighbor Embedding (t-SNE), a machine learning algorithm for dimensionality reduction of high-dimensional data to a 2D or 3D map, and cluster this result using Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The results show that we are able to detect and cluster time series with similar behaviour, which is the starting point for more extensive analysis into the underlying driving mechanisms. The results of the methods are compared to conventional hypothesis testing as well as a Self-Organising Map (SOM) approach. Hypothesis testing is robust and takes the stochastic nature of the observations into account.
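The t-SNE plus DBSCAN pipeline described above can be sketched as a minimal illustration on synthetic deformation series; this is not the authors' code, and the perplexity and eps values are assumptions chosen for the toy data.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)                            # 50 epochs
stable = rng.normal(0, 0.5, (40, 50))                # noise-only scatterers (mm)
subsiding = -10 * t + rng.normal(0, 0.5, (40, 50))   # linear subsidence trend (mm)
series = np.vstack([stable, subsiding])              # 80 time series

# Reduce each time series to a 2-D point, then density-cluster the embedding
embedded = TSNE(n_components=2, perplexity=20, random_state=0).fit_transform(series)
labels = DBSCAN(eps=5.0, min_samples=5).fit_predict(embedded)
print("clusters found:", len(set(labels) - {-1}))    # -1 marks DBSCAN noise points
```

Series with similar temporal behaviour end up close together in the embedding even when they are not spatially adjacent, which is the property the abstract exploits.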

  1. Synthesis of High-Frequency Ground Motion Using Information Extracted from Low-Frequency Ground Motion

    Science.gov (United States)

    Iwaki, A.; Fujiwara, H.

    2012-12-01

    Broadband ground motion computations of scenario earthquakes are often based on hybrid methods that are combinations of a deterministic approach in the lower frequency band and a stochastic approach in the higher frequency band. Typical computation methods for low-frequency and high-frequency (LF and HF, respectively) ground motions are numerical simulations, such as finite-difference and finite-element methods based on a three-dimensional velocity structure model, and the stochastic Green's function method, respectively. In such hybrid methods, LF and HF wave fields are generated through two different methods that are completely independent of each other, and are combined at the matching frequency. However, LF and HF wave fields are essentially not independent as long as they are from the same event. In this study, we focus on the relation among acceleration envelopes at different frequency bands, and attempt to synthesize HF ground motion using the information extracted from LF ground motion, aiming to propose a new method for broadband strong motion prediction. Our study area is the Kanto area, Japan. We use the K-NET and KiK-net surface acceleration data and compute RMS envelopes in five frequency bands: 0.5-1.0 Hz, 1.0-2.0 Hz, 2.0-4.0 Hz, 4.0-8.0 Hz, and 8.0-16.0 Hz. Taking the ratio of the envelopes of adjacent bands, we find that the envelope ratios have stable shapes at each site. The empirical envelope-ratio characteristics are combined with the low-frequency envelope of the target earthquake to synthesize HF ground motion. We have applied the method to M5-class earthquakes and a M7 target earthquake that occurred in the vicinity of the Kanto area, and successfully reproduced the observed HF ground motion of the target earthquake. The method can be applied to broadband ground motion simulation for a scenario earthquake by combining numerically computed low-frequency (~1 Hz) ground motion with the empirical envelope-ratio characteristics to generate broadband ground motion.
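As a rough illustration of the envelope-ratio idea, the sketch below band-passes a record into the five bands listed above and forms the RMS-envelope ratios of adjacent bands. The synthetic signal, filter order and smoothing window are assumptions, not the authors' processing choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rms_envelope(x, fs, lo, hi, win=1.0):
    # Band-pass to [lo, hi] Hz, then RMS in a sliding window of `win` seconds
    b, a = butter(4, [lo, hi], btype="band", fs=fs)
    y = filtfilt(b, a, x)
    n = int(win * fs)
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(y ** 2, kernel, mode="same"))

fs = 100.0
t = np.arange(0, 40, 1 / fs)
# Toy "record": decaying random ground acceleration
accel = np.random.default_rng(0).normal(size=t.size) * np.exp(-0.1 * t)

bands = [(0.5, 1.0), (1.0, 2.0), (2.0, 4.0), (4.0, 8.0), (8.0, 16.0)]
envs = [rms_envelope(accel, fs, lo, hi) for lo, hi in bands]
# Ratios of adjacent bands (higher band over lower band), as in the abstract
ratios = [e_hi / e_lo for e_lo, e_hi in zip(envs, envs[1:])]
```

In the method above, such ratios estimated empirically at a site would be applied successively to an LF envelope to build the HF envelopes.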

  2. Managing the nation's commercial high-level radioactive waste

    International Nuclear Information System (INIS)

    1985-03-01

    This report presents the findings and conclusions of OTA's analysis of Federal policy for the management of commercial high-level radioactive waste. It represents a major update and expansion of the analysis presented to Congress in our summary report, Managing Commercial High-Level Radioactive Waste, published in April of 1982 (NWPA). This new report is intended to contribute to the implementation of NWPA, and in particular to Congressional review of three major documents that DOE will submit to the 99th Congress: a Mission Plan for the waste management program; a monitored retrievable storage (MRS) proposal; and a report on mechanisms for financing and managing the waste program. The assessment was originally focused on the ocean disposal of nuclear waste. OTA later broadened the study to include all aspects of high-level waste disposal. The major findings of the original analysis were published in OTA's 1982 summary report

  3. Techniques for the solidification of high-level wastes

    International Nuclear Information System (INIS)

    1977-01-01

    The problem of the long-term management of the high-level wastes from the reprocessing of irradiated nuclear fuel is receiving world-wide attention. While the majority of the waste solutions from the reprocessing of commercial fuels are currently being stored in stainless-steel tanks, increasing effort is being devoted to developing technology for the conversion of these wastes into solids. A number of full-scale solidification facilities are expected to come into operation in the next decade. The object of this report is to survey and compare all the work currently in progress on the techniques available for the solidification of high-level wastes. It will examine the high-level liquid wastes arising from the various processes currently under development or in operation, the advantages and disadvantages of each process for different types and quantities of waste solutions, the stages of development, the scale-up potential and flexibility of the processes

  4. Spent fuel and high-level radioactive waste storage

    International Nuclear Information System (INIS)

    Trigerman, S.

    1988-06-01

    The subject of spent fuel and high-level radioactive waste storage, is bibliographically reviewed. The review shows that in the majority of the countries, spent fuels and high-level radioactive wastes are planned to be stored for tens of years. Sites for final disposal of high-level radioactive wastes have not yet been found. A first final disposal facility is expected to come into operation in the United States of America by the year 2010. Other final disposal facilities are expected to come into operation in Germany, Sweden, Switzerland and Japan by the year 2020. Meanwhile, stress is placed upon the 'dry storage' method which is carried out successfully in a number of countries (Britain and France). In the United States of America spent fuels are stored in water pools while the 'dry storage' method is still being investigated. (Author)

  5. The ATLAS High-Level Calorimeter Trigger in Run-2

    CERN Document Server

    Wiglesworth, Craig; The ATLAS collaboration

    2018-01-01

    The ATLAS Experiment uses a two-level triggering system to identify and record collision events containing a wide variety of physics signatures. It reduces the event rate from the bunch-crossing rate of 40 MHz to an average recording rate of 1 kHz, whilst maintaining high efficiency for interesting collision events. It is composed of an initial hardware-based level-1 trigger followed by a software-based high-level trigger. A central component of the high-level trigger is the calorimeter trigger. This is responsible for processing data from the electromagnetic and hadronic calorimeters in order to identify electrons, photons, taus, jets and missing transverse energy. In this talk I will present the performance of the high-level calorimeter trigger in Run-2, noting the improvements that have been made in response to the challenges of operating at high luminosity.

  6. TempoWordNet: a lexical resource for temporal information extraction

    OpenAIRE

    Hasanuzzaman , Mohammed

    2016-01-01

    The ability to capture the time information conveyed in natural language, whether that information is expressed explicitly, implicitly, or connotatively, is essential to many natural language processing applications such as information retrieval, question answering, automatic summarization, targeted marketing, loan repayment forecasting, and understanding economic patterns. Associating word senses with temporal orientation to grasp the temporal information in language is relatively stra...

  7. High-level face shape adaptation depends on visual awareness : Evidence from continuous flash suppression

    NARCIS (Netherlands)

    Stein, T.; Sterzer, P.

    When incompatible images are presented to the two eyes, one image dominates awareness while the other is rendered invisible by interocular suppression. It has remained unclear whether complex visual information can reach high-level processing stages in the ventral visual pathway during such

  8. The Michigan high-level radioactive waste program: Final technical progress report

    International Nuclear Information System (INIS)

    1987-01-01

    This report comprises the state of Michigan's final technical report on the location of a proposed high-level radioactive waste disposal site. Included are a list of Michigan's efforts to review the DOE proposal and a detailed report on the application of geographic information systems analysis techniques to the review process

  9. Radionuclide compositions of spent fuel and high level waste from commercial nuclear reactors

    International Nuclear Information System (INIS)

    Goodill, D.R.; Tymons, B.J.

    1984-10-01

    This report provides information on radionuclide compositions of spent fuel and high level waste produced during reprocessing. The reactor types considered are Magnox, AGR, PWR and CFR. The activities of the radionuclides are calculated using the FISPIN code. The results are presented in a form suitable for radioactive waste management calculations. (author)

  10. Keeping a large-pupilled eye on high-level visual processing.

    Science.gov (United States)

    Binda, Paola; Murray, Scott O

    2015-01-01

    The pupillary light response has long been considered an elementary reflex. However, evidence now shows that it integrates information from such complex phenomena as attention, contextual processing, and imagery. These discoveries make pupillometry a promising tool for an entirely new application: the study of high-level vision. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. High level radioactive waste management facility design criteria

    International Nuclear Information System (INIS)

    Sheikh, N.A.; Salaymeh, S.R.

    1993-01-01

    This paper discusses the engineering systems for the structural design of the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). At the DWPF, high level radioactive liquids will be mixed with glass particles and heated in a melter. This molten glass will then be poured into stainless steel canisters where it will harden. This process will transform the high level waste into a more stable, manageable substance. This paper discusses the structural design requirements for this unique, one-of-a-kind facility. Special emphasis is placed on the design criteria pertaining to earthquake, wind and tornado, and flooding.

  12. High-Level Waste (HLW) Feed Process Control Strategy

    International Nuclear Information System (INIS)

    STAEHR, T.W.

    2000-01-01

    The primary purpose of this document is to describe the overall process control strategy for monitoring and controlling the functions associated with the Phase 1B high-level waste feed delivery. This document provides the basis for process monitoring and control functions and requirements needed throughout the double-shell tank system during Phase 1 high-level waste feed delivery. This document is intended to be used by (1) the developers of the future Process Control Plan and (2) the developers of the monitoring and control system.

  13. Final report on cermet high-level waste forms

    International Nuclear Information System (INIS)

    Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.

    1981-08-01

    Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures

  14. Managing the high level waste nuclear regulatory commission licensing process

    International Nuclear Information System (INIS)

    Baskin, K.P.

    1992-01-01

    This paper reports that the process for obtaining Nuclear Regulatory Commission permits for the high level waste storage facility is basically the same process commercial nuclear power plants followed to obtain construction permits and operating licenses for their facilities. Therefore, the experience from licensing commercial reactors can be applied to the high level waste facility. Proper management of the licensing process will be the key to a successful project. The management of the licensing process is categorized into four areas: responsibility, organization, communication and documentation. Drawing on experience from nuclear power plant licensing and basic management principles, the management requirements for successfully accomplishing the project goals are discussed.

  15. High-level trigger system for the LHC ALICE experiment

    CERN Document Server

    Bramm, R; Lien, J A; Lindenstruth, V; Loizides, C; Röhrich, D; Skaali, B; Steinbeck, T M; Stock, Reinhard; Ullaland, K; Vestbø, A S; Wiebalck, A

    2003-01-01

    The central detectors of the ALICE experiment at LHC will produce a data size of up to 75 MB/event at an event rate of up to approximately 200 Hz, resulting in a data rate of about 15 GB/s. Online processing of the data is necessary in order to select interesting (sub)events ("High Level Trigger"), or to compress data efficiently by modeling techniques. Processing this data requires a massively parallel computing system (High Level Trigger System). The system will consist of a farm of clustered SMP nodes based on off-the-shelf PCs connected with a high-bandwidth, low-latency network.
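    The quoted figures are self-consistent; a one-line check, using only the numbers taken from the abstract above:

```python
# Sanity check of the ALICE HLT figures quoted above:
# 75 MB/event at ~200 Hz gives ~15 GB/s.
event_size_mb = 75.0   # maximum event size, MB
event_rate_hz = 200.0  # approximate event rate, Hz

rate_gb_s = event_size_mb * event_rate_hz / 1000.0  # MB/s -> GB/s
print(rate_gb_s)  # 15.0
```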

  16. Treatment technologies for non-high-level wastes (USA)

    International Nuclear Information System (INIS)

    Cooley, C.R.; Clark, D.E.

    1976-06-01

    Non-high-level waste arising from operations at nuclear reactors, fuel fabrication facilities, and reprocessing facilities can be treated using one of several technical alternatives prior to storage. Each alternative and the associated experience and status of development are summarized. The technology for treating non-high-level wastes is generally available for industrial use. Improved techniques applicable to the commercial nuclear fuel cycle are being developed and demonstrated to reduce the volume of waste and to immobilize it for storage. 36 figures, 59 references

  17. High-level radioactive waste disposal type and theoretical analyses

    International Nuclear Information System (INIS)

    Lu Yingfa; Wu Yanchun; Luo Xianqi; Cui Yujun

    2006-01-01

    Study of high-level radioactive waste disposal is necessary for the development of nuclear power; the determination of the nuclear waste repository type is an important safety issue. Based on the high-level radioactive waste disposal type, the relevant research subjects are proposed. The fundamental research topics of nuclear waste disposal, for instance mechanical and hydraulic properties of rock mass, saturated and unsaturated seepage, chemical behaviour, behaviour of special soils, and gas behaviour, are then introduced; the corresponding coupling equations are suggested; and a one-dimensional result is presented. (authors)

  18. Aeon: Synthesizing Scheduling Algorithms from High-Level Models

    Science.gov (United States)

    Monette, Jean-Noël; Deville, Yves; van Hentenryck, Pascal

    This paper describes the Aeon system, whose aim is to synthesize scheduling algorithms from high-level models. Aeon, which is entirely written in Comet, receives as input a high-level model for a scheduling application, which is then analyzed to generate a dedicated scheduling algorithm exploiting the structure of the model. Aeon provides a variety of synthesizers for generating complete or heuristic algorithms. Moreover, synthesizers are compositional, making it possible to generate complex hybrid algorithms naturally. Preliminary experimental results indicate that this approach may be competitive with state-of-the-art search algorithms.

  19. Sterilization, high-level disinfection, and environmental cleaning.

    Science.gov (United States)

    Rutala, William A; Weber, David J

    2011-03-01

    Failure to perform proper disinfection and sterilization of medical devices may lead to introduction of pathogens, resulting in infection. New techniques have been developed for achieving high-level disinfection and adequate environmental cleanliness. This article examines new technologies for sterilization and high-level disinfection of critical and semicritical items, respectively, and because semicritical items carry the greatest risk of infection, the authors discuss reprocessing semicritical items such as endoscopes and automated endoscope reprocessors, endocavitary probes, prostate biopsy probes, tonometers, laryngoscopes, and infrared coagulation devices. In addition, current issues and practices associated with environmental cleaning are reviewed.

  20. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...
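    The "shot-gun" generation of many simpler versions of a sentence can be sketched as a Cartesian product over variants of each constituent element. This is an illustrative reconstruction, not BioSimplify's actual Java implementation; the constituent lists below are hypothetical:

```python
from itertools import product

def shotgun_variants(constituents):
    """Generate every combination of the variants of a sentence's
    constituent elements ("shot-gun" production of simpler versions)."""
    for combo in product(*constituents):
        # Skip empty strings, which stand for dropped optional elements.
        yield " ".join(part for part in combo if part)

# Hypothetical constituents: each inner list holds the variants of one
# element of the original sentence.
constituents = [
    ["BioSimplify", "The tool"],
    ["simplifies"],
    ["long biomedical sentences", "sentences"],
]
variants = list(shotgun_variants(constituents))
print(len(variants))  # 4 variants (2 x 1 x 2)
```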

  1. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    OpenAIRE

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboa...

  2. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    Science.gov (United States)

    2007-01-01

    more reliable than BLEU and that it is easier to understand in terms familiar to NLP researchers. ... Researchers at Carnegie Mellon ... essential elements of information from output generated by three types of Arabic-English MT engines. The information extraction experiment was one of three ... reviewing the task hierarchy and examining the MT output of several engines. A small, prior pilot experiment to evaluate Arabic-English MT engines for

  3. Comparison of Qinzhou bay wetland landscape information extraction by three methods

    Directory of Open Access Journals (Sweden)

    X. Chang

    2014-04-01

    and OO is 219 km2, 193.70 km2, 217.40 km2 respectively. The result indicates that SC ranks first, followed by the OO approach, with the DT method third, when used to extract Qinzhou Bay coastal wetland.

  4. Extracting topographic structure from digital elevation data for geographic information-system analysis

    Science.gov (United States)

    Jenson, Susan K.; Domingue, Julia O.

    1988-01-01

    Software tools have been developed at the U.S. Geological Survey's EROS Data Center to extract topographic structure and to delineate watersheds and overland flow paths from digital elevation models. The tools are specialpurpose FORTRAN programs interfaced with general-purpose raster and vector spatial analysis and relational data base management packages.
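    The core of such watershed and flow-path delineation is the D8 flow-direction step, in which each cell drains to its steepest-descent neighbour. A minimal Python sketch of that step (the original tools are FORTRAN programs; this is an illustration, not the USGS code):

```python
import numpy as np

def d8_flow_direction(dem):
    """For each interior cell, return the (dr, dc) offset of the
    steepest-descent neighbour (D8 method); None for pits and flats."""
    rows, cols = dem.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    flow = {}
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            best, best_drop = None, 0.0
            for dr, dc in offsets:
                dist = (dr * dr + dc * dc) ** 0.5  # diagonal = sqrt(2)
                drop = (dem[r, c] - dem[r + dr, c + dc]) / dist
                if drop > best_drop:
                    best, best_drop = (dr, dc), drop
            flow[(r, c)] = best
    return flow

# Tiny synthetic DEM: the centre cell drains to the low corner.
dem = np.array([[9, 9, 9],
                [9, 5, 9],
                [9, 9, 1]], dtype=float)
print(d8_flow_direction(dem))  # {(1, 1): (1, 1)}
```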

  5. Site suitability criteria for solidified high level waste repositories

    International Nuclear Information System (INIS)

    Heckman, R.A.; Holdsworth, T.; Towse, D.F.

    1979-01-01

    Activities devoted to development of regulations, criteria, and standards for storage of solidified high-level radioactive wastes are reported. The work is summarized in sections on site suitability regulations, risk calculations, geological models, aquifer models, human usage model, climatology model, and repository characteristics. Proposed additional analytical work is also summarized

  6. The ATLAS Data Acquisition and High Level Trigger system

    International Nuclear Information System (INIS)

    2016-01-01

    This paper describes the data acquisition and high level trigger system of the ATLAS experiment at the Large Hadron Collider at CERN, as deployed during Run 1. Data flow as well as control, configuration and monitoring aspects are addressed. An overview of the functionality of the system and of its performance is presented and design choices are discussed.

  7. Extending Java for High-Level Web Service Construction

    DEFF Research Database (Denmark)

    Christensen, Aske Simon; Møller, Anders; Schwartzbach, Michael Ignatieff

    2003-01-01

    We incorporate innovations from the project into the Java language to provide high-level features for Web service programming. The resulting language, JWIG, contains an advanced session model and a flexible mechanism for dynamic construction of XML documents, in particular XHTML. To support program...

  8. High-Level Overview of Data Needs for RE Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Anthony

    2016-12-22

    This presentation provides a high-level overview of analysis topics and associated data needs. Types of renewable energy analysis fall into two buckets: first, analysis of renewable energy potential; second, analysis for other goals. Data requirements are similar and build upon one another.

  9. High-level radioactive waste repositories site selection plan

    International Nuclear Information System (INIS)

    Castanon, A.; Recreo, F.

    1985-01-01

    A general overview is given of the site selection processes for high-level nuclear waste (HLNW) and/or spent nuclear fuel facilities, according to the guidelines of the main international nuclear safety regulatory organisms and the experience of those countries that have achieved the greatest development of their national nuclear programs. (author)

  10. High-level waste-form-product performance evaluation

    International Nuclear Information System (INIS)

    Bernadzikowski, T.A.; Allender, J.S.; Stone, J.A.; Gordon, D.E.; Gould, T.H. Jr.; Westberry, C.F. III.

    1982-01-01

    Seven candidate waste forms were evaluated for immobilization and geologic disposal of high-level radioactive wastes. The waste forms were compared on the basis of leach resistance, mechanical stability, and waste loading. All forms performed well at leaching temperatures of 40, 90, and 150°C. Ceramic forms ranked highest, followed by glasses, a metal matrix form, and concrete. 11 tables

  11. High level waste canister emplacement and retrieval concepts study

    International Nuclear Information System (INIS)

    1975-09-01

    Several concepts are described for the interim (20 to 30 year) storage of canisters containing high-level waste, cladding waste, and intermediate-level TRU wastes. The report includes requirements, ground rules, and assumptions for the entire storage pilot plant. The concepts are evaluated in general terms, and the most promising are selected for additional work. Follow-on recommendations are made

  12. An emergency management demonstrator using the high level architecture

    International Nuclear Information System (INIS)

    Williams, R.J.

    1996-12-01

    This paper addresses the issues of simulation interoperability within the emergency management training context. A prototype implementation in Java of a subset of the High Level Architecture (HLA) is described. The use of Web Browsers to provide graphical user interfaces to HLA is also investigated. (au)

  13. Reachability Trees for High-level Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt; Jensen, Arne M.; Jepsen, Leif Obel

    1986-01-01

    the necessary analysis methods. In other papers it is shown how to generalize the concept of place- and transition invariants from place/transition nets to high-level Petri nets. Our present paper contributes to this with a generalization of reachability trees, which is one of the other important analysis...
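    For an ordinary place/transition net, the node labels of a reachability tree are just the reachable markings; a minimal breadth-first enumeration can sketch the idea (a hypothetical two-place net, ignoring the ω-abstraction that full reachability trees use for unbounded nets):

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Enumerate the reachability set of a place/transition net by BFS.
    A marking is a tuple of token counts; each transition is a pair
    (consume, produce) of equal-length tuples."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            # A transition is enabled if every place holds enough tokens.
            if all(m[i] >= consume[i] for i in range(len(m))):
                new = tuple(m[i] - consume[i] + produce[i]
                            for i in range(len(m)))
                if new not in seen:
                    seen.add(new)
                    queue.append(new)
    return seen

# Hypothetical net: t1 moves a token from p0 to p1, t2 moves it back.
transitions = [((1, 0), (0, 1)), ((0, 1), (1, 0))]
print(sorted(reachable_markings((1, 0), transitions)))  # [(0, 1), (1, 0)]
```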

  14. High-level lipase production by Aspergillus candidus URM 5611 ...

    African Journals Online (AJOL)

    The current study evaluated lipase production by Aspergillus candidus URM 5611 through solid state fermentation (SSF) by using almond bran licuri as a new substrate. The microorganism produced high levels of the enzyme (395.105 U gds-1), thus surpassing those previously reported in the literature. The variable ...

  15. High level of CA 125 due to large endometrioma.

    Science.gov (United States)

    Phupong, Vorapong; Chen, Orawan; Ultchaswadi, Pornthip

    2004-09-01

    CA 125 is a tumor-associated antigen. High levels are usually associated with ovarian malignancies, whereas smaller increases are associated with benign gynecologic conditions. The authors report a high level of CA 125 in a case of large ovarian endometrioma. A 45-year-old nulliparous Thai woman presented with an increase in her abdominal girth over 7 months. Transabdominal ultrasonogram demonstrated a large ovarian cyst and multiple small leiomyoma uteri, and the serum CA 125 level was 1,006 U/ml. The preoperative diagnosis was ovarian cancer with leiomyoma uteri. Exploratory laparotomy was performed. There were a large right ovarian endometrioma, a small left ovarian endometrioma, and multiple small leiomyomas. Total abdominal hysterectomy and bilateral salpingo-oophorectomy were performed, and histopathology confirmed the diagnosis of endometrioma and leiomyoma. The serum CA 125 level declined to non-detectable by the 4th week. She was well at discharge and throughout her 4-week follow-up period. Although a very high level of CA 125 is associated with a malignant process, it can also be found in benign conditions such as a large endometrioma. The case emphasizes the association of high levels of CA 125 with benign gynecologic conditions.

  16. The 2011 United Nations High-Level Meeting on Non ...

    African Journals Online (AJOL)

    The 2011 United Nations High-Level Meeting on Non-Communicable Diseases: The Africa agenda calls for a 5-by-5 approach. ... The Political Declaration issued at the meeting focused the attention of world leaders and the global health community on the prevention and control of noncommunicable diseases (NCDs).

  17. High-Level Waste Vitrification Facility Feasibility Study

    International Nuclear Information System (INIS)

    D. A. Lopez

    1999-01-01

    A "Settlement Agreement" between the Department of Energy and the State of Idaho mandates that all radioactive high-level waste now stored at the Idaho Nuclear Technology and Engineering Center will be treated so that it is ready to be moved out of Idaho for disposal by a compliance date of 2035. This report investigates vitrification treatment of the high-level waste in a High-Level Waste Vitrification Facility based on the assumption that no more New Waste Calcining Facility campaigns will be conducted after June 2000. Under this option, the sodium-bearing waste remaining in the Idaho Nuclear Technology and Engineering Center Tank Farm, and newly generated liquid waste produced between now and the start of 2013, will be processed using a different option, such as a Cesium Ion Exchange Facility. The cesium-saturated waste from this other option will be sent to the Calcine Solids Storage Facilities to be mixed with existing calcine. The calcine and cesium-saturated waste will be processed in the High-Level Waste Vitrification Facility by the end of calendar year 2035. In addition, the High-Level Waste Vitrification Facility will process all newly generated liquid waste produced between 2013 and the end of 2035. Vitrification of this waste is an acceptable treatment method for complying with the Settlement Agreement. This method involves vitrifying the waste and pouring it into stainless-steel canisters that will be ready for shipment out of Idaho to a disposal facility by 2035. These canisters will be stored at the Idaho National Engineering and Environmental Laboratory until they are sent to a national geologic repository. The operating period for vitrification treatment will be from the end of 2015 through 2035

  18. High-Level Waste Vitrification Facility Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    D. A. Lopez

    1999-08-01

    A "Settlement Agreement" between the Department of Energy and the State of Idaho mandates that all radioactive high-level waste now stored at the Idaho Nuclear Technology and Engineering Center will be treated so that it is ready to be moved out of Idaho for disposal by a compliance date of 2035. This report investigates vitrification treatment of the high-level waste in a High-Level Waste Vitrification Facility based on the assumption that no more New Waste Calcining Facility campaigns will be conducted after June 2000. Under this option, the sodium-bearing waste remaining in the Idaho Nuclear Technology and Engineering Center Tank Farm, and newly generated liquid waste produced between now and the start of 2013, will be processed using a different option, such as a Cesium Ion Exchange Facility. The cesium-saturated waste from this other option will be sent to the Calcine Solids Storage Facilities to be mixed with existing calcine. The calcine and cesium-saturated waste will be processed in the High-Level Waste Vitrification Facility by the end of calendar year 2035. In addition, the High-Level Waste Vitrification Facility will process all newly generated liquid waste produced between 2013 and the end of 2035. Vitrification of this waste is an acceptable treatment method for complying with the Settlement Agreement. This method involves vitrifying the waste and pouring it into stainless-steel canisters that will be ready for shipment out of Idaho to a disposal facility by 2035. These canisters will be stored at the Idaho National Engineering and Environmental Laboratory until they are sent to a national geologic repository. The operating period for vitrification treatment will be from the end of 2015 through 2035.

  19. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying
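    The macro's core operation, matching a priori keyword lists against free-text task/tool/chemical responses to flag exposure scenarios, can be sketched in Python. The keyword lists and job text below are hypothetical illustrations, not the study's actual lists (which are not given in the abstract):

```python
# Hypothetical keyword lists standing in for the study's a priori lists
# of solvent- and metal-related terms.
SCENARIO_KEYWORDS = {
    "TCE": ["trichloroethylene", "tce", "vapor degreas"],
    "lead": ["lead paint", "solder"],
}

def flag_exposure_scenarios(free_text):
    """Return the scenario labels whose key words or phrases occur in a
    job's free-text responses (case-insensitive substring match)."""
    text = free_text.lower()
    return sorted(
        label for label, words in SCENARIO_KEYWORDS.items()
        if any(w in text for w in words)
    )

job = "Used a vapor degreaser and soldered circuit boards"
print(flag_exposure_scenarios(job))  # ['TCE', 'lead']
```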

  20. MIDAS. An algorithm for the extraction of modal information from experimentally determined transfer functions

    International Nuclear Information System (INIS)

    Durrans, R.F.

    1978-12-01

    In order to design reactor structures to withstand the large flow and acoustic forces present it is necessary to know something of their dynamic properties. In many cases these properties cannot be predicted theoretically and it is necessary to determine them experimentally. The algorithm MIDAS (Modal Identification for the Dynamic Analysis of Structures) which has been developed at B.N.L. for extracting these structural properties from experimental data is described. (author)

  1. Isolation of transplutonium elements from high-level radioactive wastes using diphenyl(dibutylcarbamoylmethyl)phosphine oxide

    International Nuclear Information System (INIS)

    Chmutova, M.K.; Litvina, M.N.; Pribylova, G.A.; Ivanova, L.A.; Myasoedov, B.F.; Smirnov, I.V.; Shadrin, A.Yu.

    1999-01-01

    Successive stages in the development of a principal technological scheme for the extractive separation of transplutonium elements from high-level radioactive wastes of spent fuel reprocessing are presented. The approach to reagent selection from the series of carbamoylmethylphosphine oxides is substantiated. The distribution of transplutonium elements and accompanying elements between a model solution of high-level radioactive wastes and a solution of the reagent in an organic solvent is investigated. Methods for the separation of transplutonium elements, and for the re-extraction of transplutonium elements together with rare earth elements, are developed. A principal technological scheme for transplutonium element separation from non-evaporated raffinates of spent fuel of WWER-type reactors is proposed, together with a method for separating transplutonium and rare earth elements in weakly acid re-extract using liquid chromatography with a free immobile phase

  2. Removal of actinide elements from high level radioactive waste by trialkylphosphine oxide (TRPO)

    International Nuclear Information System (INIS)

    Song Chongli; Yang Dazhu; He Longhai; Xu Jingming; Zhu Yongjun

    1992-03-01

    The modified TRPO process for removing actinide elements from a synthetic solution, representative of power reactor fuel reprocessing waste, was verified by a cascade experiment. The neptunium valence was adjusted in the process to improve neptunium removal efficiency. At a feed solution HNO3 concentration of 1 mol/L, and after a few stages of extraction with 30% TRPO in kerosene, over 99.9% of Am, Pu, Np and U could be removed from the HAW (high-level radioactive waste) solution. The actinides loaded in TRPO are stripped by high-concentration nitric acid, oxalic acid and sodium carbonate instead of the amino carboxylic complexing agents used in the previous process. The stripped actinides are divided into three groups, Am + RE, Np + Pu, and U, and the cross contamination between them is small. The behaviours of fission product (F.P.) elements fall into three types: not extracted, slightly extracted, and extracted. The extracted elements are the rare earths and Pd, Zr and Mo, which are co-extracted with the actinides. The separation factor between actinides and the other two types of F.P. elements will increase if more scrubbing sections are added to the process. The relative concentration profiles of actinide elements and Tc in the various stages, as well as the distribution of actinides and F.P. elements in the process stream solutions, are also presented

  3. Extracting Information about the Initial State from the Black Hole Radiation.

    Science.gov (United States)

    Lochan, Kinjalk; Padmanabhan, T

    2016-02-05

    The crux of the black hole information paradox is related to the fact that the complete information about the initial state of a quantum field in a collapsing spacetime is not available to future asymptotic observers, belying the expectations from a unitary quantum theory. We study the imprints of the initial quantum state contained in a specific class of distortions of the black hole radiation and identify the classes of in states that can be partially or fully reconstructed from the information contained within. Even for the general in state, we can uncover some specific information. These results suggest that a classical collapse scenario ignores this richness of information in the resulting spectrum and a consistent quantum treatment of the entire collapse process might allow us to retrieve much more information from the spectrum of the final radiation.

  4. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including the sensor's biased data, each tessera in the high-density point cloud from the 3D-captured complex mosaics of Germigny-des-prés (France) is segmented via a colour-based multi-scale abstraction that extracts connectivity. A 2D surface and outline polygon of each tessera are generated by RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
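    The RANSAC plane extraction and convex-hull outline fitting mentioned above can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' pipeline:

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

def ransac_plane(points, n_iter=200, tol=0.01):
    """RANSAC plane fit: repeatedly fit a plane through 3 random points
    and keep the inlier set of the plane with the most inliers."""
    best_inliers = None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)  # point-to-plane distance
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers

# Synthetic "tessera": 100 points on the z=0 plane plus 5 outliers.
flat = np.c_[rng.random((100, 2)), np.zeros(100)]
outliers = rng.random((5, 3)) + np.array([0.0, 0.0, 1.0])
cloud = np.vstack([flat, outliers])

inliers = ransac_plane(cloud)
outline = ConvexHull(cloud[inliers][:, :2])  # 2D outline polygon
print(inliers.sum() >= 100, outline.volume > 0)  # True True
```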

  5. Study on high-level waste geological disposal metadata model

    International Nuclear Information System (INIS)

    Ding Xiaobin; Wang Changhong; Zhu Hehua; Li Xiaojun

    2008-01-01

    This paper expatiates on the concept of metadata and related research in China and abroad, then explains why the study on the metadata model of the high-level nuclear waste deep geological disposal project was started. With reference to GML, the authors first set up DML under the framework of digital underground space engineering. Based on DML, a standardized metadata model for the high-level nuclear waste deep geological disposal project is presented. Then, an internet-based metadata model is put forward. With standardized data and CSW services, this model may solve the problems of sharing and exchanging data of different forms. A metadata editor is built to search and maintain metadata based on this model. (authors)

  6. In-situ nitrite analysis in high level waste tanks

    International Nuclear Information System (INIS)

    O'Rourke, P.E.; Prather, W.S.; Livingston, R.R.

    1992-01-01

    The Savannah River Site produces special nuclear materials used in the defense of the United States. Most of the processes at SRS are primarily chemical separations and purifications. In-situ chemical analyses help improve the safety, efficiency and quality of these operations. One area where in-situ fiberoptic spectroscopy can have a great impact is the management of high-level radioactive waste. High-level radioactive waste at SRS is stored in more than 50 large waste tanks. The waste exists as a slurry of nitrate salts and metal hydroxides at pH values higher than 10. Sodium nitrite is added to the tanks as a corrosion inhibitor. In-situ fiberoptic probes are being developed to measure the nitrate, nitrite and hydroxide concentrations in both liquid and solid fractions. Nitrite levels can be measured between 0.01 M and 1 M in a 1 mm pathlength optical cell

  7. Glass-solidification method for high level radioactive liquid waste

    International Nuclear Information System (INIS)

    Kawamura, Kazuhiro; Kometani, Masayuki; Sasage, Ken-ichi.

    1996-01-01

    High-level liquid wastes are first freed of precipitates mainly comprising Mo and Zr; thereafter, the high-level liquid wastes are mixed with a glass raw material having a B2O3/SiO2 ratio of not less than 0.41, a ZnO/Li2O ratio of not less than 1.00, and an Al2O3/Li2O ratio of not less than 2.58, and the mixture is melted and solidified into glass. The liquid waste content of the glass products can be increased up to about 45% by using a glass raw material of such a composition. In addition, no yellow phase is deposited, and a leaching rate identical to that of the conventional case can be maintained. (T.M.)
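    The composition limits quoted above reduce to simple ratio checks on the glass raw material; a minimal sketch (the example frit composition is hypothetical, not from the patent):

```python
# Ratio limits quoted in the abstract: B2O3/SiO2 >= 0.41,
# ZnO/Li2O >= 1.00, Al2O3/Li2O >= 2.58.
LIMITS = [
    ("B2O3", "SiO2", 0.41),
    ("ZnO", "Li2O", 1.00),
    ("Al2O3", "Li2O", 2.58),
]

def meets_ratio_limits(composition):
    """Check a composition (component -> wt%) against every ratio limit."""
    return all(composition[a] / composition[b] >= lim
               for a, b, lim in LIMITS)

# Hypothetical frit: 18/40 = 0.45, 4/3.5 = 1.14, 9.5/3.5 = 2.71 -- all pass.
frit = {"SiO2": 40.0, "B2O3": 18.0, "ZnO": 4.0, "Li2O": 3.5, "Al2O3": 9.5}
print(meets_ratio_limits(frit))  # True
```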

  8. Nondestructive examination of DOE high-level waste storage tanks

    International Nuclear Information System (INIS)

    Bush, S.; Bandyopadhyay, K.; Kassir, M.; Mather, B.; Shewmon, P.; Streicher, M.; Thompson, B.; van Rooyen, D.; Weeks, J.

    1995-01-01

    A number of DOE sites have buried tanks containing high-level waste. The tanks of particular interest are double-shell tanks inside concrete cylinders. A program has been developed for the inservice inspection of the primary tank containing high-level waste (HLW), for the testing of transfer lines, and for the inspection of the concrete containment where possible. Emphasis is placed on the ultrasonic examination of selected areas of the primary tank, coupled with a leak-detection system capable of detecting small leaks through the wall of the primary tank. The NDE program is modelled after ASME Section XI in many respects, particularly with respect to the sampling protocol. Selected testing of concrete is planned to determine whether there has been any significant degradation. The most probable failure mechanisms are corrosion-related, so the examination program gives major emphasis to possible locations of corrosion attack

  9. Evaluation and selection of candidate high-level waste forms

    International Nuclear Information System (INIS)

    1982-03-01

    Seven candidate waste forms being developed under the direction of the Department of Energy's National High-Level Waste (HLW) Technology Program were evaluated as potential media for the immobilization and geologic disposal of high-level nuclear wastes. The evaluation combined preliminary waste form evaluations conducted at DOE defense waste sites and independent laboratories, peer review assessments, a product performance evaluation, and a processability analysis. Based on the combined results of these four inputs, two of the seven forms, borosilicate glass and a titanate-based ceramic, SYNROC, were selected as the reference and alternative forms for continued development and evaluation in the National HLW Program. Both the glass and ceramic forms are viable candidates for use at each of the DOE defense waste sites; they are also potential candidates for immobilization of commercial reprocessing wastes. This report describes the waste form screening process and discusses each of the four major inputs considered in the selection of the two forms

  10. Storage of High Level Nuclear Waste in Germany

    Directory of Open Access Journals (Sweden)

    Dietmar P. F. Möller

    2007-01-01

    Nuclear energy is often used to generate electricity. But first the energy must be released from atoms, which can be done in two ways: nuclear fusion and nuclear fission. Nuclear power plants use nuclear fission to produce electrical energy. The electrical energy generated in nuclear power plants does not produce polluting combustion gases, an important fact that could play a key role in helping to reduce global greenhouse gas emissions and tackle global warming, especially as electricity demand rises in the years ahead. This could be seen as an ideal win-win situation, but the reverse side of the medal is that the production of high-level nuclear waste outweighs this advantage. Hence the paper attempts to highlight possible state-of-the-art concepts for the safe and sustained storage of high-level nuclear waste in Germany.

  11. Radiation transport in high-level waste form

    International Nuclear Information System (INIS)

    Arakali, V.S.; Barnes, S.M.

    1992-01-01

    The waste form selected for vitrifying high-level nuclear waste stored in underground tanks at West Valley, NY, is borosilicate glass. The maximum radiation level at the surface of a canister filled with the high-level waste form is prescribed by repository design criteria for handling and disposition of the vitrified waste. This paper presents an evaluation of the radiation transport characteristics for the vitreous waste form expected to be produced at West Valley and the resulting neutron and gamma dose rates. The maximum gamma and neutron dose rates are estimated to be less than 7500 R/h and 10 mRem/h, respectively, at the surface of a West Valley canister filled with borosilicate waste glass.

  12. Multipurpose optimization models for high level waste vitrification

    International Nuclear Information System (INIS)

    Hoza, M.

    1994-08-01

    Optimal Waste Loading (OWL) models have been developed as multipurpose tools for high-level waste studies for the Tank Waste Remediation Program at Hanford. Using nonlinear programming techniques, these models maximize the waste loading of the vitrified waste and optimize the glass formers composition such that the glass produced has the appropriate properties within the melter, and the resultant vitrified waste form meets the requirements for disposal. The OWL model can be used for a single waste stream or for blended streams. The models can determine optimal continuous blends or optimal discrete blends of a number of different wastes. The OWL models have been used to identify the most restrictive constraints, to evaluate prospective waste pretreatment methods, to formulate and evaluate blending strategies, and to determine the impacts of variability in the wastes. The OWL models will be used to aid in the design of frits and to maximize the waste loading of the glass for High-Level Waste (HLW) vitrification.
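
    The kind of nonlinear program the OWL models solve can be illustrated with a toy sketch. This is not the actual OWL formulation: the variables, property constraints, and coefficients below are invented. It maximizes the waste fraction of a glass subject to a hypothetical melter-viscosity constraint and a hypothetical durability constraint, using SciPy's SLSQP solver.

```python
import numpy as np
from scipy.optimize import minimize

# x = [waste fraction, SiO2 former fraction, B2O3 former fraction]
def objective(x):
    return -x[0]  # maximize waste loading -> minimize its negative

# Invented stand-ins for melter viscosity and product durability limits.
def viscosity_ok(x):
    return 0.8 - 0.9 * x[0]   # >= 0: viscosity limit tightens with loading

def durability_ok(x):
    return x[1] - 0.5 * x[0]  # >= 0: need enough SiO2 relative to waste

cons = [
    {"type": "eq",   "fun": lambda x: x.sum() - 1.0},  # fractions sum to 1
    {"type": "ineq", "fun": viscosity_ok},
    {"type": "ineq", "fun": durability_ok},
]
x0 = np.array([0.3, 0.5, 0.2])
res = minimize(objective, x0, bounds=[(0.0, 1.0)] * 3,
               constraints=cons, method="SLSQP")
# Here the durability constraint binds first: optimal loading is 2/3.
```

    In a real OWL-style model the constraint functions would be fitted glass-property models (viscosity, electrical conductivity, leach resistance) rather than these toy linear expressions.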

  13. Chromosome Aberration on High Level Background Natural Radiation Areas

    International Nuclear Information System (INIS)

    Yanti-Lusiyanti; Zubaidah-Alatas

    2001-01-01

    When the body is irradiated, all cells can suffer cytogenetic damage that can be seen as structural damage of chromosomes in the lymphocytes. People, no matter where they live in the world, are exposed to background radiation from natural sources, both internal and external, such as cosmic radiation, terrestrial radiation, cosmogenic radiation, radon, and thoron. The level of natural ionizing radiation in an area varies depending on the altitude, the soil or rock conditions, particular food chains, and the building materials and construction features. In normal background areas the annual effective dose is 2.4 mSv, while in high level background areas it can reach 20 mSv. This paper discusses the frequency of chromosome aberrations, especially dicentrics, in several countries having a high level radiation background. In general, the frequency of chromosome aberrations increases with the age of the people and the accumulated dose received. (author)

  14. Safety of geologic disposal of high level radioactive waste

    International Nuclear Information System (INIS)

    Zaitsu, Tomohisa; Ishiguro, Katsuhiko; Masuda, Sumio

    1992-01-01

    This article introduces current concepts of geologic disposal of high level radioactive waste and its safety. High level radioactive waste is physically stabilized by solidifying it in a glass form. Characteristics of deep geologic layers are presented from the viewpoint of geologic disposal. The construction of a multi-barrier system receives much attention to secure the safety of geologic disposal. It is important to assess the performance of the multi-barrier system in preventing dissolution and transfer of radionuclides into the ground water. Physical and chemical modeling for the performance assessment is outlined in the following terms: (1) chemical properties of deep ground water, (2) geochemical modeling of the artificial barrier pore water, (3) hydrology of deep ground water, (4) hydrology of the inside of the artificial barrier, and (5) modeling of radionuclide transfer from the artificial barrier. (N.K.)

  15. Development of high-level waste solidification technology 1

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Hyung; Kim, Hwan Young; Kim, In Tae [and others

    1999-02-01

    Spent nuclear fuel contains useful nuclides as valuable resource materials for energy, heat, and catalysis. High-level wastes (HLW) are expected to be generated from the R and D activities and reuse processes. It is necessary to develop vitrification or advanced solidification technologies for the safe long-term management of high level wastes. As a first step to establish HLW vitrification technology, characterization of HLWs that would arise at the KAERI site, glass melting experiments with a lab-scale high frequency induction melter, and fabrication and property evaluation of base-glass made of used HEPA filter media and additives were performed. A basic study on the fabrication and characterization of a candidate ceramic waste form (Synroc) was also carried out. These HLW solidification technologies would be directly useful for carrying out the R and Ds on the nuclear fuel cycle and waste management. (author). 70 refs., 29 tabs., 35 figs.

  16. QSPIN: A High Level Java API for Quantum Computing Experimentation

    Science.gov (United States)

    Barth, Tim

    2017-01-01

    QSPIN is a high level Java language API for experimentation in QC models used in the calculation of Ising spin glass ground states and related quadratic unconstrained binary optimization (QUBO) problems. The Java API is intended to facilitate research in advanced QC algorithms such as hybrid quantum-classical solvers, automatic selection of constraint and optimization parameters, and techniques for the correction and mitigation of model and solution errors. QSPIN includes high level solver objects tailored to the D-Wave quantum annealing architecture that implement hybrid quantum-classical algorithms [Booth et al.] for solving large problems on small quantum devices, elimination of variables via roof duality, and classical computing optimization methods such as GPU accelerated simulated annealing and tabu search for comparison. A test suite of documented NP-complete applications ranging from graph coloring, covering, and partitioning to integer programming and scheduling is provided to demonstrate current capabilities.
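
    QSPIN itself is a Java API; as a language-neutral illustration of the QUBO problems such solvers target (minimize xᵀQx over binary x), the sketch below runs a tiny classical single-bit-flip simulated annealer in Python. The matrix Q and cooling schedule are invented for illustration and are not taken from QSPIN.

```python
import math
import random

def qubo_energy(x, Q):
    """Energy x^T Q x of a binary configuration x under QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def simulated_annealing(Q, steps=2000, t0=2.0, seed=0):
    """Classical annealer; returns the best configuration seen."""
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = qubo_energy(x, Q)
    best_x, best_e = list(x), e
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-3   # linear cooling schedule
        i = rng.randrange(n)                  # propose a single bit flip
        x[i] ^= 1
        e_new = qubo_energy(x, Q)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new                         # accept the move
            if e < best_e:
                best_x, best_e = list(x), e
        else:
            x[i] ^= 1                         # reject: undo the flip
    return best_x, best_e

# Toy upper-triangular QUBO: rewards x0 = x1 = 1, penalizes x2 = 1.
Q = [[-1, -1, 0],
     [0,  -1, 0],
     [0,   0, 3]]
best_x, best_e = simulated_annealing(Q)   # ground state (1, 1, 0), energy -3
```

    Hybrid quantum-classical solvers of the kind the abstract mentions typically use such a classical loop as the outer refinement stage around calls to the quantum annealer.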

  17. Leaching behavior of simulated high-level waste glass

    International Nuclear Information System (INIS)

    Kamizono, Hiroshi

    1987-03-01

    The author's work in the study of the leaching behavior of simulated high-level waste (HLW) glass is summarized. The subjects described are (1) leach rates at high temperatures, (2) effects of cracks on leach rates, (3) effects of flow rate on leach rates, and (4) an in-situ burial test in natural groundwater. In the following section, the leach rates obtained by various experiments are summarized and discussed. (author)

  18. The tracking of high level waste shipments-TRANSCOM system

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.; Pope, R.B.

    1995-01-01

    The TRANSCOM (transportation tracking and communication) system is the U.S. Department of Energy's (DOE's) real-time system for tracking shipments of spent fuel, high-level wastes, and other high-visibility shipments of radioactive material. The TRANSCOM system has been operational since 1988. The system was used during FY 1993 to track almost 100 shipments within the U.S. DOE complex, and it is accessed weekly by 10 to 20 users.

  19. Mixing Processes in High-Level Waste Tanks - Final Report

    International Nuclear Information System (INIS)

    Peterson, P.F.

    1999-01-01

    Mixing processes in large, complex enclosures are modeled using one-dimensional differential equations, with transport in free and wall jets described by standard integral techniques. With this goal in mind, we have constructed a simple, computationally efficient numerical tool, the Berkeley Mechanistic Mixing Model, which can be used to predict the transient evolution of fuel and oxygen concentrations in DOE high-level waste tanks following loss of ventilation, and have validated the model against a series of experiments.
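
    As a much-simplified illustration of the kind of transient balance such a model integrates (this is not the Berkeley Mechanistic Mixing Model itself; the tank treated as a single well-mixed volume, and every parameter value, are invented), the sketch below steps a fuel (hydrogen) concentration balance forward in time after loss of ventilation.

```python
# Well-mixed headspace balance: d(c)/dt = (source - purge * c) / V,
# integrated with forward-Euler time stepping. All values are invented.
V = 100.0          # headspace volume, m^3
q_h2 = 1e-6        # hydrogen generation rate from the waste, m^3/s
purge = 1e-4       # residual in-leakage flow after ventilation loss, m^3/s
dt = 60.0          # 1-minute time step, s
t_end = 30 * 24 * 3600.0   # simulate 30 days

c_h2, t = 0.0, 0.0         # hydrogen volume fraction, elapsed time
while t < t_end:
    c_h2 += dt * (q_h2 - purge * c_h2) / V   # source minus dilution
    t += dt
# c_h2 approaches the steady-state value q_h2 / purge = 0.01 (1 vol%).
```

    A mechanistic model would replace the single well-mixed volume with one-dimensional stratified layers fed by the free and wall jets mentioned in the abstract.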

  20. Status of the French nuclear high level waste disposal

    International Nuclear Information System (INIS)

    Sombret, C.

    1985-09-01

    French research on high level waste processing has led to the development of industrial vitrification facilities. Borosilicate glass is still being investigated for its long-term storage properties, since it is itself a component of the containment system. The other constituents of this system, the engineered barriers, are also being actively investigated. The geological barrier is now being assessed using a methodology applicable to various types of geological formations, and final site qualification should be possible before the end of 1992.

  1. Soil-structure interaction effects on high level waste tanks

    International Nuclear Information System (INIS)

    Miller, C.A.; Costantino, C.J.; Heymsfeld, E.

    1991-01-01

    High Level Waste Tanks consist of steel tanks located in concrete vaults which are usually completely embedded in the soil. Many of these tanks are old and were designed to seismic standards that are not compatible with current requirements. The objective of this paper is to develop simple methods of modeling soil-structure interaction (SSI) effects for such structures and to obtain solutions for a range of parameters that can be used to identify significant aspects of the problem.

  2. High-level neutron coincidence counter (HLNCC): users' manual

    International Nuclear Information System (INIS)

    Krick, M.S.; Menlove, H.O.

    1979-06-01

    This manual describes the portable High-Level Neutron Coincidence Counter (HLNCC) developed at the Los Alamos Scientific Laboratory (LASL) for the assay of plutonium, particularly by inspectors of the International Atomic Energy Agency (IAEA). The counter is designed for the measurement of the effective 240Pu mass in plutonium samples which may have a high plutonium content. The following topics are discussed: principle of operation, description of the system, operating procedures, and applications.

  3. Development of cermets for high-level radioactive waste fixation

    International Nuclear Information System (INIS)

    Aaron, W.S.; Quinby, T.C.; Kobisk, E.H.

    1979-01-01

    A method is currently under development for the solidification and fixation of commercial and defense high-level radioactive wastes in the form of ceramic particles encapsulated by metal, i.e., a cermet. The chemical and physical processing techniques which have been developed and the properties of the resulting cermet bodies are described in this paper. These cermets have the advantages of high thermal conductivity and low leach rates.

  4. Research on high level radioactive waste repository seismic design criteria

    International Nuclear Information System (INIS)

    Jing Xu

    2012-01-01

    This paper reviews the principles and methods of seismic hazard analysis used in the site suitability assessment of the Yucca Mountain Project, as well as the seismic design criteria and seismic design bases applied in its preliminary design. The spatial character of seismic hazard is demonstrated with a calculated regional seismic hazard map. Seismic design bases at different levels are contrasted to show their differences and relations. Seismic design criteria for the preclosure phase of a high level waste repository, and performance goals under beyond-design-basis ground motion, are discussed. (author)

  5. The tracking of high level waste shipments - TRANSCOM system

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.; Pope, R.B.; Thomas, T.M.; Lester, P.B.

    1994-01-01

    The TRANSCOM (transportation tracking and communication) system is the US Department of Energy's (DOE's) real-time system for tracking shipments of spent fuel, high-level wastes, and other high-visibility shipments of radioactive material. The TRANSCOM system has been operational since 1988. The system was used during FY 1993 to track almost 100 shipments within the US DOE complex, and it is accessed weekly by 10 to 20 users.

  6. A High Level Model of a Conscious Embodied Agent

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    2010-01-01

    Roč. 2, č. 3 (2010), s. 62-78 ISSN 1942-9045 R&D Projects: GA ČR GAP202/10/1333 Institutional research plan: CEZ:AV0Z10300504 Keywords : embodied agent * internal world models * higher cognitive function Subject RIV: IN - Informatics, Computer Science http://www.igi-global.com/article/high-level-model-conscious-embodied/46147

  7. Apparatus for Crossflow Filtration Testing of High Level Waste Samples

    International Nuclear Information System (INIS)

    Nash, C.

    1998-05-01

    Remotely operated experimental apparatuses for verifying crossflow filtration of high level nuclear waste have been constructed at the Savannah River Site (SRS). These units have been used to demonstrate filtration processes for the Savannah River Site, Oak Ridge National Laboratory, the Idaho National Engineering and Environmental Laboratory, and Pacific Northwest National Laboratory. The current work covers the design considerations for experimentation as well as results from testing at SRS.

  8. Production and utilization of high level and long duration shocks

    International Nuclear Information System (INIS)

    Labrot, R.

    1978-01-01

    In order to verify the behaviour of equipment under extreme environmental conditions (propulsion, falls, impacts...), it is necessary to create 'high level and long duration shocks'. For these shocks, the velocity variation ΔV, which is equal to the area under the accelerogram γ(t), can reach several hundred meters per second. Such velocity variations cannot be produced with a classical free-fall shock machine.

  9. Lung region extraction based on the model information and the inversed MIP method by using chest CT images

    International Nuclear Information System (INIS)

    Tomita, Toshihiro; Miguchi, Ryosuke; Okumura, Toshiaki; Yamamoto, Shinji; Matsumoto, Mitsuomi; Tateno, Yukio; Iinuma, Takeshi; Matsumoto, Toru.

    1997-01-01

    We developed a lung region extraction method based on model information and the inversed MIP method for Lung Cancer Screening CT (LSCT). The original model is composed of typical 3-D lung contour lines, a body axis, an apical point, and a convex hull. First, the body axis, the apical point, and the convex hull are automatically extracted from the input image. Next, the model is transformed to fit the input image by an affine transformation. Using the same affine transformation coefficients, the typical lung contour lines are also transferred, yielding rough contour lines for the input image. Experimental results for 68 samples showed this method to be quite promising. (author)
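
    The model-fitting step described above, estimating an affine transform from corresponding landmarks and then mapping the model contour with the same coefficients, can be sketched with a least-squares fit. The landmark and contour coordinates below are hypothetical, not the authors' data.

```python
import numpy as np

# Corresponding 2-D landmarks (e.g. apical point, body-axis endpoints):
model_pts = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 20.0]])   # in the model
image_pts = np.array([[1.0, 2.0], [21.0, 2.0], [11.0, 42.0]])  # in the image

# Solve [x y 1] @ A = [x' y'] for the 3x2 affine matrix A by least squares.
M = np.hstack([model_pts, np.ones((len(model_pts), 1))])
A, *_ = np.linalg.lstsq(M, image_pts, rcond=None)

# Transfer the model contour with the same affine coefficients.
contour = np.array([[2.0, 5.0], [8.0, 15.0]])   # model contour points
mapped = np.hstack([contour, np.ones((len(contour), 1))]) @ A
```

    With three non-collinear landmark pairs the fit is exact; with more landmarks the least-squares solution averages out localization error.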

  10. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested the tool's impact on the task of protein-protein interaction (PPI) extraction, where it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
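
    The "shot-gun" idea can be illustrated with a toy sketch. This is not the BioSimplify algorithm, which combines variants of parsed constituent elements; here variants are generated naively by splitting at a few coordinating conjunctions, so a downstream extractor can try every simpler version.

```python
import itertools
import re

def shotgun_variants(sentence):
    """Emit every combination of clauses split at simple conjunctions."""
    clauses = re.split(r",?\s+(?:and|but|while)\s+", sentence.rstrip("."))
    variants = set()
    for r in range(1, len(clauses) + 1):
        for combo in itertools.combinations(clauses, r):
            variants.add(". ".join(combo) + ".")
    return sorted(variants)

sent = "ProteinA binds ProteinB and ProteinC phosphorylates ProteinD."
variants = shotgun_variants(sent)
# Yields each clause alone plus the two-clause combination: 3 variants.
```

    Running a PPI extractor over all variants, and pooling the hits, is what drives the recall gain the abstract reports.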

  11. Unsupervised improvement of named entity extraction in short informal context using disambiguation clues

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2012-01-01

    Short context messages (like tweets and SMS’s) are a potentially rich source of continuously and instantly updated information. Shortness and informality of such messages are challenges for Natural Language Processing tasks. Most efforts done in this direction rely on machine learning techniques

  12. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    Science.gov (United States)

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  13. Extracting principles for information management adaptability during crisis response : A dynamic capability view

    NARCIS (Netherlands)

    Bharosa, N.; Janssen, M.F.W.H.A.

    2010-01-01

    During crises, relief agency commanders have to make decisions in a complex and uncertain environment, requiring them to continuously adapt to unforeseen environmental changes. In the process of adaptation, the commanders depend on information management systems for information. Yet there are still

  14. Solidification of Savannah River Plant high level waste

    International Nuclear Information System (INIS)

    Maher, R.; Shafranek, L.F.; Kelley, J.A.; Zeyfang, R.W.

    1981-11-01

    Authorization for construction of the Defense Waste Processing Facility (DWPF) is expected in FY 83. The optimum time for stage 2 authorization is about three years later. Detailed design and construction will require approximately five years for stage 1, with stage 2 construction completed about two to three years later. Production of canisters of waste glass would begin in 1988, and the existing backlog of high level waste sludge stored at SRP would be worked off by about the year 2000. Stage 2 operation could begin in 1990. The technology and engineering are ready for construction and eventual operation of the DWPF for immobilizing high level radioactive waste at the Savannah River Plant (SRP). Proceeding with this project will provide the public, and the leadership of this country, with a crucial demonstration that a major quantity of existing high level nuclear wastes can be safely and permanently immobilized. Early demonstration will both expedite and facilitate rational decision making on this aspect of the nuclear program. Delay in providing these facilities will result in significant DOE expenditures at SRP for new tanks just for continued temporary storage of wastes, and would probably result in dissipation of the intellectual and planning momentum that has built up in developing the project.

  15. Evaluation of conditioned high-level waste forms

    International Nuclear Information System (INIS)

    Mendel, J.E.; Turcotte, R.P.; Chikalla, T.D.; Hench, L.L.

    1983-01-01

    The evaluation of conditioned high-level waste forms requires an understanding of radiation and thermal effects, mechanical properties, volatility, and chemical durability. As a result of nuclear waste research and development programs in many countries, a good understanding of these factors is available for borosilicate glass containing high-level waste. The IAEA through its coordinated research program has contributed to this understanding. Methods used in the evaluation of conditioned high-level waste forms are reviewed. In the US, this evaluation has been facilitated by the definition of standard test methods by the Materials Characterization Center (MCC), which was established by the Department of Energy (DOE) in 1979. The DOE has also established a 20-member Materials Review Board to peer-review the activities of the MCC. In addition to comparing waste forms, testing must be done to evaluate the behavior of waste forms in geologic repositories. Such testing is complex; accelerated tests are required to predict expected behavior for thousands of years. The tests must be multicomponent tests to ensure that all potential interactions between waste form, canister/overpack and corrosion products, backfill, intruding ground water, and the repository rock are accounted for. An overview of the status of such multicomponent testing is presented.

  16. Final disposal of high levels waste and spent nuclear fuel

    International Nuclear Information System (INIS)

    Gelin, R.

    1984-05-01

    Foreign and international activities on the final disposal of high-level waste and spent nuclear fuel have been reviewed. A considerable research effort is devoted to development of acceptable disposal options. The different technical concepts presently under study are described in the report. Numerous studies have been made in many countries of the potential risks to future generations from radioactive wastes in underground disposal repositories. In the report the safety assessment studies and existing performance criteria for geological disposal are briefly discussed. The studies that are being made in Canada, the United States, France, and Switzerland are the most interesting for Sweden, as these countries are also considering disposal into crystalline rocks. The overall timetables in different countries for realisation of the final disposal are rather similar. Normally, actual large-scale disposal operations for high-level wastes are not foreseen until after the year 2000. In the United States the Congress recently passed the important Nuclear Waste Policy Act. It gives a rather firm timetable for site selection and construction of nuclear waste disposal facilities. According to this act the first repository for disposal of commercial high-level waste must be in operation no later than January 1998. (Author)

  17. High-Level Development of Multiserver Online Games

    Directory of Open Access Journals (Sweden)

    Frank Glinka

    2008-01-01

    Full Text Available Multiplayer online games with support for high user numbers must provide mechanisms to support an increasing amount of players by using additional resources. This paper provides a comprehensive analysis of the practically proven multiserver distribution mechanisms, zoning, instancing, and replication, and the tasks they imply for the game developer. We propose a novel, high-level development approach which integrates the three distribution mechanisms seamlessly in today's online games. As a possible base for this high-level approach, we describe the real-time framework (RTF) middleware system, which liberates the developer from low-level tasks and allows him to stay at a high level of design abstraction. We explain how RTF supports the implementation of single-server online games and how it allows the developer to incorporate the three multiserver distribution mechanisms during the development process. Finally, we briefly describe how RTF provides manageability and maintenance functionality for online games in a grid context with dynamic resource allocation scenarios.

  18. High level radioactive wastes: Considerations on final disposal

    International Nuclear Information System (INIS)

    Ciallella, Norberto R.

    2000-01-01

    When, at the beginning of the 1980s, the National Commission on Atomic Energy (CNEA) in Argentina decided to study the final destination of high level radioactive wastes, it began many investigations, analyses, and multidisciplinary evaluations that gave rise to a study of a kind never before carried out in Argentina. For the first time the country faced the study of a potential environmental problem several decades before the problem would actually arise. In its technological aspects, the disposal of high level radioactive wastes was addressed in advance, avoiding transferring the problem to future generations. The decision was based not only on technical evaluations but also on ethical premises, since it was considered that future generations should be able to enjoy the benefits of nuclear energy without having to solve the problem. In 1980 CNEA decided to begin a feasibility study and preliminary engineering project for the construction of a final disposal facility for high level radioactive wastes.

  19. Standards for high level waste disposal: A sustainability perspective

    International Nuclear Information System (INIS)

    Dougherty, W.W.; Powers, V.; Johnson, F.X.; Cornland, D.

    1999-01-01

    Spent reactor fuel from commercial power stations contains high levels of plutonium, other fissionable actinides, and fission products, all of which pose serious challenges for permanent disposal because of the very long half-lives of some isotopes. The 'nuclear nations' have agreed on the use of permanent geologic repositories for the ultimate disposal of high-level nuclear waste. However, it is premature to claim that a geologic repository offers permanent isolation from the biosphere, given high levels of uncertainty, nascent risk assessment frameworks for the time periods considered, and serious intergenerational equity issues. Many have argued for a broader consideration of disposal options that include extended monitored retrievable storage and accelerator-driven transmutation of wastes. In this paper we discuss and compare these three options relative to standards that emerge from the application of sustainable development principles, namely long-lasting technical viability, intergenerational equity, rational resource allocation, and rights of future intervention. We conclude that in order to maximise the autonomy of future generations, it is imperative to leave future options more open than does permanent disposal

  20. Materials Science of High-Level Nuclear Waste Immobilization

    International Nuclear Information System (INIS)

    Weber, William J.; Navrotsky, Alexandra; Stefanovsky, S. V.; Vance, E. R.; Vernaz, Etienne Y.

    2009-01-01

    With the increasing demand for the development of more nuclear power comes the responsibility to address the technical challenges of immobilizing high-level nuclear wastes in stable solid forms for interim storage or disposition in geologic repositories. The immobilization of high-level nuclear wastes has been an active area of research and development for over 50 years. Borosilicate glasses and complex ceramic composites have been developed to meet many technical challenges and current needs, although regulatory issues, which vary widely from country to country, have yet to be resolved. Cooperative international programs to develop advanced proliferation-resistant nuclear technologies to close the nuclear fuel cycle and increase the efficiency of nuclear energy production might create new separation waste streams that could demand new concepts and materials for nuclear waste immobilization. This article reviews the current state-of-the-art understanding regarding the materials science of glasses and ceramics for the immobilization of high-level nuclear waste and excess nuclear materials and discusses approaches to address new waste streams

  1. Partitioning and recovery of neptunium from high level waste streams of PUREX origin using 30% TBP

    International Nuclear Information System (INIS)

    Mathur, J.N.; Murali, M.S.; Balarama Krishna, M.V.; Iyer, R.H.; Chitnis, R.R.; Wattal, P.K.; Theyyunni, T.K.; Ramanujam, A.; Dhami, P.S.; Gopalakrishnan, V.

    1995-01-01

    237Np is one of the longest-lived nuclides among the actinides present in high level waste solutions of reprocessing origin. Its separation, recovery, and transmutation can reduce the problem of long term storage of the vitrified waste to a great extent. With this objective, the present work was initiated to study the extraction of neptunium into TBP under conditions relevant to high level waste, along with uranium and plutonium, by oxidising it to the hexavalent state using potassium dichromate and subsequently recovering it by selective stripping. Three types of simulated HLW solutions were employed in these studies: sulphate bearing (SB), with an acidity of ∼ 0.3 M, and non-sulphate wastes originating from the reprocessing of fuels from pressurised heavy water reactors (PHWR) and fast breeder reactors (FBR), with acidities of 3.0 M HNO3. The extraction of U(VI), Np(VI), and Pu(VI) was very high for the PHWR- and FBR-HLW solutions, whereas for the SB-HLW solution these values were lower but still reasonably high. Quantitative recovery of neptunium and plutonium was achieved using a stripping solution containing 0.1 M H2O2 and 0.01 M ascorbic acid at an acidity of 2.0 M. Since cerium present in the waste solutions is expected to undergo oxidation in the presence of K2Cr2O7, its extraction behaviour was also studied under similar conditions. Based on the results, a scheme was formulated for the recovery of neptunium along with plutonium and was successfully applied to an actual high level waste solution originating from the reprocessing of research reactor fuels. (author). 19 refs., 2 figs., 17 tabs

  2. Extracting protein dynamics information from overlapped NMR signals using relaxation dispersion difference NMR spectroscopy.

    Science.gov (United States)

    Konuma, Tsuyoshi; Harada, Erisa; Sugase, Kenji

    2015-12-01

    Protein dynamics plays important roles in many biological events, such as ligand binding and enzyme reactions. NMR is mostly used for investigating such protein dynamics in a site-specific manner. Recently, NMR has been actively applied to large proteins and intrinsically disordered proteins, which are attractive research targets. However, signal overlap, which is often observed for such proteins, hampers accurate analysis of NMR data. In this study, we have developed a new methodology called relaxation dispersion difference that can extract conformational exchange parameters from overlapped NMR signals measured using relaxation dispersion spectroscopy. In relaxation dispersion measurements, the signal intensities of fluctuating residues vary according to the Carr-Purcell-Meiboom-Gill pulsing interval, whereas those of non-fluctuating residues are constant. Therefore, subtraction of each relaxation dispersion spectrum from that with the highest signal intensities, measured at the shortest pulsing interval, leaves only the signals of the fluctuating residues. This is the principle of the relaxation dispersion difference method. This new method enabled us to extract exchange parameters from overlapped signals of heme oxygenase-1, which is a relatively large protein. The results indicate that the structural flexibility of a kink in the heme-binding site is important for efficient heme binding. Relaxation dispersion difference requires neither selectively labeled samples nor modification of pulse programs; thus it will have wide applications in protein dynamics analysis.
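
    The subtraction principle described in this abstract can be sketched schematically: take the spectrum recorded at the shortest pulsing interval as the reference and subtract every other spectrum from it, so that non-fluctuating peaks cancel. The peak intensities below are invented for illustration.

```python
import numpy as np

# Rows: spectra at increasing CPMG pulsing intervals.
# Columns: intensities of two peaks (one static, one exchanging).
spectra = np.array([
    [100.0, 80.0],   # shortest interval (highest intensities) = reference
    [100.0, 65.0],   # static peak unchanged, exchanging peak attenuated
    [100.0, 50.0],
])
reference = spectra[0]
difference = reference - spectra   # static peaks cancel to zero everywhere

# A peak fluctuates if any difference spectrum retains signal for it.
fluctuating = np.any(difference > 1e-6, axis=0)   # [False, True]
```

    In practice the retained difference intensities, as a function of pulsing interval, are what get fitted to exchange models to obtain the conformational exchange parameters.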

  3. Extracting protein dynamics information from overlapped NMR signals using relaxation dispersion difference NMR spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Konuma, Tsuyoshi [Icahn School of Medicine at Mount Sinai, Department of Structural and Chemical Biology (United States); Harada, Erisa [Suntory Foundation for Life Sciences, Bioorganic Research Institute (Japan); Sugase, Kenji, E-mail: sugase@sunbor.or.jp, E-mail: sugase@moleng.kyoto-u.ac.jp [Kyoto University, Department of Molecular Engineering, Graduate School of Engineering (Japan)

    2015-12-15

    Protein dynamics plays important roles in many biological events, such as ligand binding and enzyme reactions. NMR is mostly used for investigating such protein dynamics in a site-specific manner. Recently, NMR has been actively applied to large proteins and intrinsically disordered proteins, which are attractive research targets. However, signal overlap, which is often observed for such proteins, hampers accurate analysis of NMR data. In this study, we have developed a new methodology called relaxation dispersion difference that can extract conformational exchange parameters from overlapped NMR signals measured using relaxation dispersion spectroscopy. In relaxation dispersion measurements, the signal intensities of fluctuating residues vary according to the Carr-Purcell-Meiboom-Gill pulsing interval, whereas those of non-fluctuating residues are constant. Therefore, subtraction of each relaxation dispersion spectrum from that with the highest signal intensities, measured at the shortest pulsing interval, leaves only the signals of the fluctuating residues. This is the principle of the relaxation dispersion difference method. This new method enabled us to extract exchange parameters from overlapped signals of heme oxygenase-1, which is a relatively large protein. The results indicate that the structural flexibility of a kink in the heme-binding site is important for efficient heme binding. Relaxation dispersion difference requires neither selectively labeled samples nor modification of pulse programs; thus it will have wide applications in protein dynamics analysis.
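The subtraction principle described in this abstract can be illustrated numerically. The sketch below uses synthetic Lorentzian peaks and made-up decay factors (not the authors' data): a rigid peak has the same intensity in every spectrum and cancels in the difference, while a fluctuating peak survives.

```python
import numpy as np

def lorentzian(x, x0, width):
    """1-D Lorentzian line shape standing in for an NMR peak."""
    return width**2 / ((x - x0)**2 + width**2)

# Two overlapped peaks: one from a fluctuating residue (its intensity
# decays as the CPMG pulsing interval grows) and one from a rigid
# residue (intensity constant across pulsing intervals).
ppm = np.linspace(0.0, 10.0, 2001)
rigid = lorentzian(ppm, 5.0, 0.2)
fluct = lorentzian(ppm, 5.3, 0.2)

# Reference spectrum: shortest pulsing interval, highest intensities.
reference = rigid + 1.0 * fluct
# Spectrum at a longer pulsing interval: only the fluctuating peak decays.
longer_interval = rigid + 0.6 * fluct

# The difference cancels the rigid contribution exactly, leaving only
# the signal of the fluctuating residue, whose decay can then be fitted.
difference = reference - longer_interval
peak_position = ppm[np.argmax(difference)]
```

With these invented numbers the difference spectrum peaks at 5.3 ppm, the position of the hypothetical fluctuating residue, while the rigid peak at 5.0 ppm subtracts away.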

  4. A method for assay of special nuclear material in high level liquid waste streams

    International Nuclear Information System (INIS)

    Venkata Subramani, C.R.; Swaminathan, K.; Asuvathraman, R.; Kutty, K.V.G.

    2003-01-01

    The assay of special nuclear material in high level liquid waste streams assumes importance as this is the first stage in the extraction cycle, where considerable losses of plutonium could occur. This stream contains all the fission products as well as the minor actinides, and hence normal nuclear techniques cannot be used without prior separation of the special nuclear material. This paper presents preliminary results obtained using wavelength dispersive x-ray fluorescence as part of the developmental efforts to assay SNM in these streams by instrumental techniques. (author)

  5. Performance of high level waste forms and engineered barriers under repository conditions

    International Nuclear Information System (INIS)

    1991-02-01

    The IAEA initiated in 1977 a co-ordinated research programme on the "Evaluation of Solidified High-Level Waste Forms" which was terminated in 1983. As there was a continuing need for international collaboration in research on solidified high-level waste forms and spent fuel, the IAEA initiated a new programme in 1984. The new programme, besides including spent fuel and SYNROC, also placed greater emphasis on the effect of the engineered barriers of future repositories on the properties of the waste form. These engineered barriers include containers, overpacks, buffer and backfill materials, etc., as components of the "near-field" of the repository. The Co-ordinated Research Programme on the Performance of High-Level Waste Forms and Engineered Barriers Under Repository Conditions had the objectives of promoting the exchange of information on the experience gained by different Member States in experimental performance data and technical model evaluation of solidified high-level waste forms, components of the waste package, and the complete waste management system under conditions relevant to final repository disposal. The programme includes studies on both irradiated spent fuel and glass and ceramic forms as the final solidified waste forms. The following topics were discussed: leaching of vitrified high-level wastes, modelling of glass behaviour in clay, salt and granite repositories, environmental impacts of radionuclide release, SYNROC use for high-level waste solidification, leachate-rock interactions, spent fuel disposal in deep geologic repositories and radionuclide release mechanisms from various fuel types, radiolysis and selective leaching correlated with matrix alteration. Refs, figs and tabs

  6. Corrosion issues in high-level nuclear waste containers

    Science.gov (United States)

    Asl, Samin Sharifi

    In this dissertation, different aspects of the corrosion and electrochemistry of copper, the candidate canister material in the Scandinavian high-level nuclear waste disposal program, including the thermodynamics and kinetics of the reactions that are predicted to occur in the practical system, have been studied. A comprehensive thermodynamic study of copper in contact with granitic groundwater of the type and composition expected in the Forsmark repository in Sweden has been performed. Our primary objective was to ascertain whether copper would exist in the thermodynamically immune state in the repository, in which case corrosion could not occur and the issue of corrosion in the assessment of the storage technology would be moot. In spite of the fact that metallic copper has been found to exist for geological times in granitic geological formations, copper is well known to be activated from the immune state to corrode by specific species that may exist in the environment. The principal activator of copper is known to be sulfur in its various forms, including sulfide (H2S, HS-, S2-), polysulfide (H2Sx, HSx-, Sx2-), thiosulfate (SxO32-), and polythionates (SxO62-). A comprehensive study of this aspect of copper chemistry has never been reported, and yet an understanding of this issue is vital for assessing whether copper is a suitable material for fabricating canisters for the disposal of HLNW. Our study identifies and explores those species that activate copper; these species include sulfur-containing entities as well as other, non-sulfur species that may be present in the repository. The effects of temperature, solution pH, and hydrogen pressure on the kinetics of the hydrogen electrode reaction (HER) on copper in borate buffer solution have been studied by means of steady-state polarization measurements, including electrochemical impedance spectroscopy (EIS). In order to obtain electrokinetic parameters, such as the exchange current density and the

  7. Wavelet analysis of molecular dynamics: Efficient extraction of time-frequency information in ultrafast optical processes

    International Nuclear Information System (INIS)

    Prior, Javier; Castro, Enrique; Chin, Alex W.; Almeida, Javier; Huelga, Susana F.; Plenio, Martin B.

    2013-01-01

    New experimental techniques based on nonlinear ultrafast spectroscopies have been developed over the last few years, and have been demonstrated to provide powerful probes of quantum dynamics in different types of molecular aggregates, including both natural and artificial light harvesting complexes. Fourier transform-based spectroscopies have been particularly successful, yet “complete” spectral information normally necessitates the loss of all information on the temporal sequence of events in a signal. This information, though, is particularly important in transient or multi-stage processes, in which the spectral decomposition of the data evolves in time. By going through several examples of ultrafast quantum dynamics, we demonstrate that the use of wavelets provides an efficient and accurate way to simultaneously acquire both temporal and frequency information about a signal, and argue that this greatly aids the elucidation and interpretation of the physical processes responsible for non-stationary spectroscopic features, such as those encountered in coherent excitonic energy transport.
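As a generic illustration of the time-frequency extraction this abstract describes, the sketch below implements a minimal continuous wavelet transform with a Morlet wavelet in plain NumPy. The signal and all parameters are invented for the demo and are not taken from the paper.

```python
import numpy as np

def morlet(t, scale, w0=6.0):
    """Complex Morlet wavelet at times t for a given scale (demo normalization)."""
    x = t / scale
    return np.exp(1j * w0 * x) * np.exp(-0.5 * x**2) / np.sqrt(scale)

def cwt(signal, dt, scales):
    """Continuous wavelet transform by direct convolution; O(N^2) but fine for demos."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    coeffs = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        kernel = np.conj(morlet(t, s))[::-1]
        coeffs[i] = np.convolve(signal, kernel, mode="same") * dt
    return coeffs

# A two-stage signal: 5 Hz in the first second, 20 Hz in the second one.
dt = 1e-3
t = np.arange(0.0, 2.0, dt)
signal = np.where(t < 1.0, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 20 * t))

w0 = 6.0
freqs = np.array([5.0, 20.0])
scales = w0 / (2 * np.pi * freqs)      # Morlet scale <-> frequency relation
power = np.abs(cwt(signal, dt, scales)) ** 2

# Unlike a plain Fourier spectrum, the scalogram keeps the time axis:
# the 5 Hz row dominates early in the signal, the 20 Hz row late.
early = power[:, 250]    # t = 0.25 s
late = power[:, 1750]    # t = 1.75 s
```

This is exactly the property the abstract emphasizes: the transform localizes which frequency is present at which time, something a single Fourier transform of the whole record cannot do.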

  8. Extracting information from an ensemble of GCMs to reliably assess future global runoff change

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Weerts, A.H.; Bierkens, M.F.P.

    2011-01-01

    Future runoff projections derived from different global climate models (GCMs) show large differences. Therefore, within this study, information from multiple GCMs has been combined to better assess hydrological changes. For projections of precipitation and temperature the Reliability ensemble

  9. Investigation of the Impact of Extracting and Exchanging Health Information by Using Internet and Social Networks.

    Science.gov (United States)

    Pistolis, John; Zimeras, Stelios; Chardalias, Kostas; Roupa, Zoe; Fildisis, George; Diomidous, Marianna

    2016-06-01

    Social networks (1) have been embedded in our daily life for a long time. They constitute a powerful tool used nowadays for both searching and exchanging information on different issues by using Internet search engines (Google, Bing, etc.) and social networks (Facebook, Twitter, etc.). In this paper, the results are presented of a research study on the frequency and type of usage of the Internet and social networks by the general public and health professionals. The objectives of the research were focused on the investigation of the frequency of seeking and meticulously searching for health information in the social media by both individuals and health practitioners. The exchange of information is a procedure that involves issues of reliability and quality of information. In this research, an effort is made, using advanced statistical techniques, to investigate the participants' profile in using social networks for searching and exchanging information on health issues. Based on the answers, 93% of the people use the Internet to find information on health subjects. Considering principal component analysis, the most important health subjects were nutrition (0.719%), respiratory issues (0.79%), cardiological issues (0.777%), psychological issues (0.667%) and total (73.8%). The research results, based on different statistical techniques, revealed that 61.2% of the males and 56.4% of the females intended to use the social networks for searching medical information. Based on the principal component analysis, the most important sources that the participants mentioned were the use of the Internet and social networks for exchanging information on health issues. These sources proved to be of paramount importance to the participants of the study. The same holds for nursing, medical and administrative staff in hospitals.
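The principal component analysis used in this study can be sketched generically. The following is a minimal NumPy implementation on made-up survey-like data (not the study's dataset): two correlated items load together on the first component, which then explains most of the variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey matrix: 200 respondents x 4 questionnaire items.
# Items 1 and 2 share a common factor, so they should load together on
# the first principal component. All numbers here are invented.
factor = 1.5 * rng.normal(size=(200, 1))
data = np.hstack([
    factor + 0.1 * rng.normal(size=(200, 1)),   # item 1
    factor + 0.1 * rng.normal(size=(200, 1)),   # item 2
    rng.normal(size=(200, 1)),                  # item 3
    rng.normal(size=(200, 1)),                  # item 4
])

# PCA via eigendecomposition of the covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()             # proportion of variance explained
pc1_loadings = eigvecs[:, 0]                    # loadings on the first component
```

Loadings such as the 0.719 reported for nutrition in the abstract are entries of a component vector like `pc1_loadings`; the "total" figure corresponds to the cumulative `explained` variance of the retained components.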

  10. Amplitude extraction in pseudoscalar-meson photoproduction: towards a situation of complete information

    International Nuclear Information System (INIS)

    Nys, Jannes; Vrancx, Tom; Ryckebusch, Jan

    2015-01-01

    A complete set for pseudoscalar-meson photoproduction is a minimum set of observables from which one can determine the underlying reaction amplitudes unambiguously. The complete sets considered in this work involve single- and double-polarization observables. It is argued that for extracting amplitudes from data, the transversity representation of the reaction amplitudes offers advantages over alternate representations. It is shown that with the available single-polarization data for the p(γ,K + )Λ reaction, the energy and angular dependence of the moduli of the normalized transversity amplitudes in the resonance region can be determined to a fair accuracy. Determining the relative phases of the amplitudes from double-polarization observables is far less evident. (paper)

  11. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest land vegetation type and plays an irreplaceable role with its unique value. At the landscape scale, research on forest landscape patterns has become a current hot spot, in which the study of forest canopy structure is especially important: canopy structure determines the process and strength of forest energy flow, and influences, to some extent, how the ecosystem adjusts to climate and species diversity. The extraction of the factors influencing canopy structure and the analysis of the vegetation distribution pattern are therefore particularly important. To address these problems, remote sensing technology, which is superior to other technical means because of its fine timeliness and large-scale monitoring, is applied to the study. Taking Lingkong Mountain as the study area, the paper uses remote sensing images to analyze the forest distribution pattern and obtain the spatial characteristics of the canopy structure distribution, with DEM data as the basic data for extracting the factors that influence canopy structure. The pattern of tree distribution is further analyzed by using terrain parameters, spatial analysis tools, and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and corresponding algorithms are applied to determine surface water flow paths, the river network, and basin boundaries. Results show that the forest distribution of dominant tree species presents a patchy pattern at the landscape scale, and that their distribution has spatial heterogeneity closely related to terrain factors. After overlay analysis of aspect, slope, and the forest distribution pattern, the most suitable areas for stand growth and the better living conditions are identified.
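The terrain-factor extraction this abstract relies on (slope and aspect derived from DEM data) can be sketched with plain NumPy. This is a generic illustration on a synthetic DEM, not the paper's processing chain; the grid size, resolution, and gradient are invented.

```python
import numpy as np

# Hypothetical 5 m resolution DEM: a plane dipping uniformly eastward
# (elevation falls 0.1 m per metre of easting), so every cell should
# show the same slope and an eastward direction of steepest descent.
cell = 5.0                                   # grid spacing in metres
x = np.arange(0.0, 100.0, cell)
y = np.arange(0.0, 100.0, cell)
xx, _ = np.meshgrid(x, y)
dem = 500.0 - 0.1 * xx

# np.gradient returns the derivative along rows (y) first, then columns (x).
dz_dy, dz_dx = np.gradient(dem, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Aspect as the compass bearing of steepest descent (0 = north, 90 = east).
aspect_deg = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
```

For this synthetic eastward-dipping plane every cell has slope arctan(0.1), about 5.7 degrees, and aspect 90 degrees; overlaying such rasters with a vegetation map is the kind of aspect/slope analysis the abstract describes.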

  12. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information, and aid in organizing and integrating it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  13. The Application of Chinese High-Spatial Remote Sensing Satellite Image in Land Law Enforcement Information Extraction

    Science.gov (United States)

    Wang, N.; Yang, R.

    2018-04-01

    Chinese high-resolution (HR) remote sensing satellites have made a huge leap in the past decade. Commercial satellite datasets, such as GF-1, GF-2 and ZY-3 images, whose panchromatic (PAN) resolutions are 2 m, 1 m and 2.1 m and whose multispectral (MS) resolutions are 8 m, 4 m and 5.8 m respectively, have emerged in recent years. Chinese HR satellite imagery can be downloaded free of charge for public welfare purposes. Local governments have begun to employ more professional technicians to improve traditional land management technology. This paper focuses on analysing the actual requirements of the applications in government land law enforcement in Guangxi Autonomous Region. 66 counties in Guangxi Autonomous Region were selected for illegal land utilization spot extraction with fused Chinese HR images. The procedure comprises: A. Defining the illegal land utilization spot types. B. Data collection: GF-1, GF-2, and ZY-3 datasets were acquired in the first half of 2016 and other auxiliary data were collected in 2015. C. Batch processing: HR images were preprocessed in batches through an ENVI/IDL tool. D. Illegal land utilization spot extraction by visual interpretation. E. Obtaining attribute data with an ArcGIS Geoprocessor (GP) model. F. Thematic mapping and surveying. Through analysing the results for 42 counties, law enforcement officials found 1092 illegal land use spots and 16 suspected illegal mining spots. The results show that Chinese HR satellite images have great potential for feature information extraction and that the processing procedure is robust.

  14. Defense High Level Waste Disposal Container System Description Document

    International Nuclear Information System (INIS)

    Pettit, N. E.

    2001-01-01

    The Defense High Level Waste Disposal Container System supports the confinement and isolation of waste within the Engineered Barrier System of the Monitored Geologic Repository (MGR). Disposal containers are loaded and sealed in the surface waste handling facilities, transferred to the underground through the accesses using a rail mounted transporter, and emplaced in emplacement drifts. The defense high level waste (HLW) disposal container provides long-term confinement of the commercial HLW and defense HLW (including immobilized plutonium waste forms [IPWF]) placed within disposable canisters, and withstands the loading, transfer, emplacement, and retrieval loads and environments. US Department of Energy (DOE)-owned spent nuclear fuel (SNF) in disposable canisters may also be placed in a defense HLW disposal container along with commercial HLW waste forms, which is known as co-disposal. The Defense High Level Waste Disposal Container System provides containment of waste for a designated period of time, and limits radionuclide release. The disposal container/waste package maintains the waste in a designated configuration, withstands maximum handling and rockfall loads, limits the individual canister temperatures after emplacement, resists corrosion in the expected handling and repository environments, and provides containment of waste in the event of an accident. Defense HLW disposal containers for HLW disposal will hold up to five HLW canisters. Defense HLW disposal containers for co-disposal will hold up to five HLW canisters arranged in a ring and one DOE SNF canister inserted in the center and/or one or more DOE SNF canisters displacing a HLW canister in the ring. Defense HLW disposal containers also will hold two Multi-Canister Overpacks (MCOs) and two HLW canisters in one disposal container. The disposal container will include outer and inner cylinders, outer and inner cylinder lids, and may include a canister guide. An exterior label will provide a means by

  15. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    Full Text Available FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development abstracted from software designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using the popular embedded benchmark kernels. Then, to evaluate the suitability of HLS on real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common for image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design), with a fivefold reduction in design effort versus manual RTL design.

  16. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    International Nuclear Information System (INIS)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun; Sasaki, Masahide

    2004-01-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques

  17. Transferring knowledge about high-level waste repositories: An ethical consideration

    International Nuclear Information System (INIS)

    Berndes, S.; Kornwachs, K.

    1996-01-01

    The purpose of this paper is to present requirements for Information and Documentation Systems for high-level waste repositories from an ethical point of view. A structured synopsis of ethical arguments used by experts from Europe and America is presented. On the one hand, the review suggests reinforcing the obligation to transfer knowledge about high-level waste repositories. On the other hand, this obligation is weakened by the objection that ethical obligations depend on the differences between our civilization and future ones. This reflection results in a list of well-balanced ethical arguments. A method is then presented which shows how scenarios of possible future civilizations, for different time horizons, and the related ethical arguments are used to justify requirements for the Information and Documentation System

  18. Managing the nation's commercial high-level radioactive waste

    International Nuclear Information System (INIS)

    1985-03-01

    This report presents the findings and conclusions of OTA's analysis of Federal policy for the management of commercial high-level radioactive waste. It is intended to contribute to the implementation of the Nuclear Waste Policy Act of 1982 (NWPA). The major conclusion of that review is that NWPA provides sufficient authority for developing and operating a waste management system based on disposal in geologic repositories. Substantial new authority for other facilities will not be required unless major unexpected problems with geologic disposal are encountered. OTA also concludes that DOE's Draft Mission Plan published in 1984 falls short of its potential for enhancing the credibility and acceptability of the waste management program

  19. Corrosion and failure processes in high-level waste tanks

    International Nuclear Information System (INIS)

    Mahidhara, R.K.; Elleman, T.S.; Murty, K.L.

    1992-11-01

    A large amount of radioactive waste has been stored safely at the Savannah River and Hanford sites over the past 46 years. The aim of this report is to review the experimental corrosion studies at Savannah River and Hanford with the intention of identifying the types and rates of corrosion encountered, and to indicate how these data contribute to tank failure predictions. The compositions of the high-level wastes, the mild steels used in the construction of the waste tanks, and the degradation modes, particularly stress corrosion cracking and pitting, are discussed. Current concerns at the Hanford Site are highlighted.

  20. Global tracker for the ALICE high level trigger

    International Nuclear Information System (INIS)

    Vik, Thomas

    2006-01-01

    This thesis deals with two main topics. The first is the implementation and testing of a Kalman filter algorithm in the HLT (High Level Trigger) reconstruction code. This will perform the global tracking in the HLT, that is merging tracklets and hits from the different sub-detectors in the central barrel detector. The second topic is a trigger mode of the HLT which uses the global tracking of particles through the TRD (Transition Radiation Detector), TPC (Time Projection Chamber) and the ITS (Inner Tracking System): The dielectron trigger. Global tracking: The Kalman filter algorithm has been introduced to the HLT tracking scheme. (Author)

  1. Market Designs for High Levels of Variable Generation: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Milligan, M.; Holttinen, H.; Kiviluoma, J.; Orths, A.; Lynch, M.; Soder, L.

    2014-10-01

    Variable renewable generation is increasing in penetration in modern power systems, leading to higher variability in the supply and price of electricity as well as lower average spot prices. This raises new challenges, particularly in ensuring sufficient capacity and flexibility from conventional technologies. Because the fixed costs and lifetimes of electricity generation investments are significant, designing markets and regulations that ensure the efficient integration of renewable generation is a significant challenge. This paper reviews the state of play of market designs for high levels of variable generation in the United States and Europe and considers new developments in both regions.

  2. The high level and long lived radioactive wastes

    International Nuclear Information System (INIS)

    2005-01-01

    This report presents the main conclusions of 15 years of research managed by the CEA. This report is the preliminary version of the 2005 final report. It presents the main conclusions of the actions on axes 1 and 3 of the law of 30 December 1991. The synthesis report on axis 1 concerns results obtained on the separation and transmutation of long-lived radionuclides in high-level and long-lived radioactive wastes. The synthesis report on axis 3 presents results obtained on conditioning processes and on long-term surface and underground storage. (A.L.B.)

  3. High level waste at Hanford: Potential for waste loading maximization

    International Nuclear Information System (INIS)

    Hrma, P.R.; Bailey, A.W.

    1995-09-01

    The loading of Hanford nuclear waste in borosilicate glass is limited by phase-related phenomena, such as crystallization or formation of immiscible liquids, and by breakdown of the glass structure because of an excessive concentration of modifiers. The phase-related phenomena cause both processing and product quality problems. The deterioration of product durability determines the ultimate waste loading limit if all processing problems are resolved. Concrete examples and mass-balance based calculations show that a substantial potential exists for increasing waste loading of high-level wastes that contain a large fraction of refractory components

  4. Solidification of Savannah River Plant high-level waste

    International Nuclear Information System (INIS)

    Maher, R.; Shafranek, L.F.; Stevens, W.R. III.

    1983-01-01

    The Department of Energy, in accord with recommendations from the Du Pont Company, has started construction of a Defense Waste Processing Facility (DWPF) at the Savannah River Plant. The facility should be completed by the end of 1988, and full-scale operation should begin in 1990. This facility will immobilize in borosilicate glass the large quantity of high-level radioactive waste now stored at the plant plus the waste to be generated from continued chemical reprocessing operations. The existing wastes at the Savannah River Plant will be completely converted by about 2010. 21 figures

  5. Treatment of High-Level Waste Arising from Pyrochemical Processes

    International Nuclear Information System (INIS)

    Lizin, A.A.; Kormilitsyn, M.V.; Osipenko, A.G.; Tomilin, S.V.; Lavrinovich, Yu.G.

    2013-01-01

    JSC “SSC RIAR” has been performing research and development activities in support of the closed fuel cycle of fast reactors since the middle of the 1960s. The fuel cycle involves fabrication and reprocessing of spent nuclear fuel (SNF) using pyrochemical methods of reprocessing in molten alkali metal chlorides. At present, pyrochemical methods of SNF reprocessing in molten chlorides have reached a level of development that makes it possible to compare their competitiveness with classic aqueous methods. Their comparative advantages lie in high safety, compactness, strong resistance to the proliferation of nuclear materials, and reduction of high-level waste volume.

  6. Development and evaluation of candidate high-level waste forms

    International Nuclear Information System (INIS)

    Bernadzikowski, T.A.

    1981-01-01

    Some seventeen candidate waste forms have been investigated under US Department of Energy programs as potential media for the immobilization and geologic disposal of the high-level radioactive wastes (HLW) resulting from chemical processing of nuclear reactor fuels and targets. Two of these HLW forms were selected at the end of fiscal year (FY) 1981 for intensive development in FY 1982 and 1983. Borosilicate glass was continued as the reference form. A crystalline ceramic waste form, SYNROC, was selected for further product formulation and process development as the alternative to borosilicate glass. This paper describes the bases on which this decision was made

  7. High-level waste canister envelope study: structural analysis

    International Nuclear Information System (INIS)

    1977-11-01

    The structural integrity of waste canisters, fabricated from standard weight Type 304L stainless steel pipe, was analyzed for sizes ranging from 8 to 24 in. in diameter and 10 to 16 feet long under normal, abnormal, and improbable life cycle loading conditions. The canisters are assumed to be filled with vitrified high-level nuclear waste, stored temporarily at a fuel reprocessing plant, and then transported for storage in an underground salt bed or other geologic storage. In each of the three impact conditions studied, the resulting impact force is far greater than the elastic limit capacity of the material. Recommendations are made for further study

  8. Cermet high level waste forms: a progress report

    International Nuclear Information System (INIS)

    Aaron, W.S.; Quinby, T.C.; Kobisk, E.H.

    1978-06-01

    The fixation of high level radioactive waste from both commercial and DOE defense sources as cermets is currently under study. This waste form consists of a continuous iron-nickel base metal matrix containing small particles of fission product oxides. Preliminary evaluations of cermets fabricated from a variety of simulated wastes indicate that they possess advantages over other waste forms presently being considered, namely in thermal conductivity, waste loading levels, and leach resistance. This report describes the progress of this effort since its initiation in 1977

  9. Characterizing speed-independence of high-level designs

    DEFF Research Database (Denmark)

    Kishinevsky, Michael; Staunstrup, Jørgen

    1994-01-01

    This paper characterizes the speed-independence of high-level designs. The characterization is a condition on the design description ensuring that the behavior of the design is independent of the speeds of its components. The behavior of a circuit is modeled as a transition system that allows data types, and internal as well as external non-determinism. This makes it possible to verify the speed-independence of a design without providing an explicit realization of the environment. The verification can be done mechanically. A number of experimental designs have been verified, including a speed-independent...

  10. High-level neutron coincidence counter maintenance manual

    International Nuclear Information System (INIS)

    Swansen, J.; Collinsworth, P.

    1983-05-01

    High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included

  11. High level trigger system for the ALICE experiment

    International Nuclear Information System (INIS)

    Frankenfeld, U.; Roehrich, D.; Ullaland, K.; Vestabo, A.; Helstrup, H.; Lien, J.; Lindenstruth, V.; Schulz, M.; Steinbeck, T.; Wiebalck, A.; Skaali, B.

    2001-01-01

    The ALICE experiment at the Large Hadron Collider (LHC) at CERN will detect up to 20,000 particles in a single Pb-Pb event resulting in a data rate of ∼75 MByte/event. The event rate is limited by the bandwidth of the data storage system. Higher rates are possible by selecting interesting events and subevents (High Level trigger) or compressing the data efficiently with modeling techniques. Both require a fast parallel pattern recognition. One possible solution to process the detector data at such rates is a farm of clustered SMP nodes, based on off-the-shelf PCs, and connected by a high bandwidth, low latency network
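The bandwidth argument in the abstract is easy to make concrete. The sketch below is a back-of-envelope estimate of the data-reduction factor such a High Level Trigger must deliver; only the ~75 MByte/event figure comes from the abstract, while the collision rate and storage bandwidth are illustrative assumptions, not ALICE specifications.

```python
# Rough estimate of the reduction the High Level Trigger must provide.
# Only EVENT_SIZE_MB is taken from the abstract; the other figures are
# assumed for illustration.

EVENT_SIZE_MB = 75          # Pb-Pb event size quoted in the abstract
COLLISION_RATE_HZ = 200     # assumed accepted central-collision rate
STORAGE_BW_MB_S = 1250      # assumed mass-storage bandwidth (~1.25 GB/s)

raw_rate_mb_s = EVENT_SIZE_MB * COLLISION_RATE_HZ
reduction_factor = raw_rate_mb_s / STORAGE_BW_MB_S

print(f"raw data rate:    {raw_rate_mb_s} MB/s")
print(f"needed reduction: {reduction_factor:.0f}x via event selection/compression")
```

Under these assumed numbers the raw rate exceeds storage bandwidth by an order of magnitude, which is exactly the gap the trigger's event selection and model-based compression are meant to close.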

  12. FPGA Co-processor for the ALICE High Level Trigger

    CERN Document Server

    Grastveit, G.; Lindenstruth, V.; Loizides, C.; Roehrich, D.; Skaali, B.; Steinbeck, T.; Stock, R.; Tilsner, H.; Ullaland, K.; Vestbo, A.; Vik, T.

    2003-01-01

    The High Level Trigger (HLT) of the ALICE experiment requires massive parallel computing. One of the main tasks of the HLT system is two-dimensional cluster finding on raw data of the Time Projection Chamber (TPC), which is the main data source of ALICE. To reduce the number of computing nodes needed in the HLT farm, FPGAs, which are an intrinsic part of the system, will be utilized for this task. VHDL code implementing the Fast Cluster Finder algorithm has been written, a testbed for functional verification of the code has been developed, and the code has been synthesized
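To illustrate the kind of two-dimensional cluster finding the abstract describes (in Python rather than VHDL), the sketch below groups adjacent above-threshold charge samples on a pad/time grid and computes each cluster's total charge and charge-weighted centroid. The grid layout and threshold are invented for the example; the actual Fast Cluster Finder is a hardware pipeline, not a flood fill.

```python
# Toy 2D cluster finder: flood-fill adjacent above-threshold cells and
# report (total_charge, centroid) per cluster. Purely illustrative.
from collections import deque

def find_clusters(grid, threshold=1):
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] < threshold or seen[r][c]:
                continue
            # breadth-first fill over 4-connected above-threshold cells
            queue, cells = deque([(r, c)]), []
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                cells.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and grid[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            charge = sum(grid[y][x] for y, x in cells)
            cy = sum(y * grid[y][x] for y, x in cells) / charge
            cx = sum(x * grid[y][x] for y, x in cells) / charge
            clusters.append((charge, (cy, cx)))
    return clusters

pads = [
    [0, 2, 3, 0, 0],
    [0, 1, 0, 0, 4],
    [0, 0, 0, 0, 5],
]
print(find_clusters(pads))   # two clusters, charges 6 and 9
```

The appeal of doing this in an FPGA is that each incoming charge sample can be merged into a running cluster sum as it streams off the detector, rather than buffering the whole grid as this software version does.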

  13. Spanish high level radioactive waste management system issues

    International Nuclear Information System (INIS)

    Espejo, J.M.; Beceiro, A.R.

    1992-01-01

    The Empresa Nacional de Residuos Radiactivos, S.A. (ENRESA) is the state-owned limited liability company responsible for the management of all kinds of radioactive wastes in Spain. This paper provides an overview of the strategy and main lines of action stated in the third General Radioactive Waste Plan, currently in force, for the management of spent nuclear fuel and high-level wastes, as well as an outline of the main related projects, either being developed or foreseen. Aspects concerning the organizational structure, the economic and financing system and the international cooperation are also included

  14. Spanish high level radioactive waste management system issues

    International Nuclear Information System (INIS)

    Ulibarri, A.; Veganzones, A.

    1993-01-01

    The Empresa Nacional de Residuos Radiactivos, S.A. (ENRESA) was set up in 1984 as a state-owned limited liability company to be responsible for the management of all kinds of radioactive wastes in Spain. This paper provides an overview of the strategy and main lines of action stated in the third General Radioactive Waste Plan, currently in force, for the management of spent nuclear fuel and high-level wastes, as well as an outline of the main related projects, either being developed or foreseen. Aspects concerning the organizational structure, the economic and financing system and the international co-operation are also included

  15. A high-level product representation for automatic design reasoning

    Energy Technology Data Exchange (ETDEWEB)

    Kroll, E.; Qamar, Z.; Mohammad, R. [Texas A and M Univ., College Station, TX (United States). Mechanical Engineering Dept.

    1994-12-31

    A high-level product representation has been developed and implemented, using features for part description and mating conditions between features for the relationships among parts. The underlying ideas are that features are necessary for effective design representation; that spatial and functional relationships among parts of an assembly are best expressed through mating conditions; that assembly features of a part may, at times, be different from its manufacturing features; and that a good representation should be natural, intelligent, comprehensive, and integrated with a visual display. Some new mating conditions have been defined and classified. Several problems concerning the use of features with mating conditions are discussed.

  16. Very-high-level neutral-beam control system

    International Nuclear Information System (INIS)

    Elischer, V.; Jacobson, V.; Theil, E.

    1981-10-01

    As increasing numbers of neutral beams are added to fusion machines, their operation can consume a significant fraction of a facility's total resources. LBL has developed a very high level control system that allows a neutral beam injector to be treated as a black box with just two controls: one to set the beam power and one to set the pulse duration. This two-knob view allows simple operation and provides a natural base for implementing even higher level controls such as automatic source conditioning
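The "black box with two knobs" abstraction can be sketched as a thin software facade. Everything below is invented for illustration (the class name, the power and pulse limits, and the idea that the internals map the two settings onto source supplies); the abstract only establishes the two-control interface itself.

```python
# Hypothetical two-knob injector facade: the only externally visible
# controls are beam power and pulse duration. Limits are made up.

class NeutralBeamInjector:
    MAX_POWER_MW = 5.0
    MAX_PULSE_S = 30.0

    def __init__(self):
        self.power_mw = 0.0
        self.pulse_s = 0.0

    def set_beam_power(self, megawatts):
        if not 0.0 <= megawatts <= self.MAX_POWER_MW:
            raise ValueError("power outside injector envelope")
        # internally this would configure arc/filament/accel supplies
        self.power_mw = megawatts

    def set_pulse_duration(self, seconds):
        if not 0.0 <= seconds <= self.MAX_PULSE_S:
            raise ValueError("pulse too long for this source")
        self.pulse_s = seconds

    def shot_energy_mj(self):
        # delivered energy follows directly from the two settings
        return self.power_mw * self.pulse_s

beam = NeutralBeamInjector()
beam.set_beam_power(2.5)
beam.set_pulse_duration(10.0)
print(beam.shot_energy_mj())   # 2.5 MW x 10 s = 25 MJ
```

Higher-level functions such as automatic source conditioning would then be loops written entirely in terms of these two calls, which is the layering the abstract advocates.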

  17. High-level wastes: DOE names three sites for characterization

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    DOE announced in May 1986 that three site characterization studies would be made to determine suitability for a high-level radioactive waste repository. The studies will include several test drillings to the proposed disposal depths. Yucca Mountain, Nevada; Deaf Smith County, Texas; and Hanford, Washington were identified as the study sites, and further studies for a second repository site in the East were postponed. The affected states all filed suits in federal circuit courts because they were given no advance warning of the announcement of their selection or the decision to suspend work on a second repository. Criticisms of the selection process include the narrowing of DOE options

  18. High-level neutron coincidence counter maintenance manual

    Energy Technology Data Exchange (ETDEWEB)

    Swansen, J.; Collinsworth, P.

    1983-05-01

    High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included.

  19. Ionization chamber for measurements of high-level tritium gas

    International Nuclear Information System (INIS)

    Carstens, D.H.W.; David, W.R.

    1980-01-01

    The construction and calibration of a simple ionization-chamber apparatus for measurement of high-level tritium gas is described. The apparatus uses an easily constructed but rugged chamber containing the unknown gas and an inexpensive digital multimeter for measuring the ion current. The equipment, after calibration, is suitable for measuring 0.01 to 100% tritium gas in hydrogen-helium mixtures with an accuracy of a few percent. At both the high and low limits of measurement, deviations from the predicted theoretical current are observed. These are briefly discussed
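The measurement principle implied by the abstract is a linear one: within the chamber's working range the ion current scales with the tritium fraction of the gas, so a single calibration constant converts a multimeter reading back to percent tritium. The sketch below illustrates that conversion; the calibration constant and the reading are made-up values, not figures from the paper.

```python
# Illustrative linear calibration: ion current -> percent tritium.
# CAL_AMPS_PER_PERCENT is an assumed chamber response, not measured data.

CAL_AMPS_PER_PERCENT = 2.0e-9   # assumed response (amps per % tritium)

def tritium_percent(ion_current_amps):
    """Convert a measured ion current to tritium fraction in percent."""
    pct = ion_current_amps / CAL_AMPS_PER_PERCENT
    if not 0.01 <= pct <= 100.0:
        # outside the 0.01-100% span where the abstract reports
        # few-percent accuracy, the linear model is not trusted
        raise ValueError("reading outside calibrated range")
    return pct

print(tritium_percent(5.0e-8))   # 25.0 (% tritium)
```

The guard clause mirrors the abstract's caveat that deviations from the theoretical current appear at both ends of the range, where a straight-line calibration stops being reliable.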

  20. Safe disposal of high-level radioactive wastes

    Energy Technology Data Exchange (ETDEWEB)

    Ringwood, A E [Australian National Univ., Canberra. Research School of Earth Sciences

    1980-10-01

    Current strategies in most countries favour the immobilisation of high-level radioactive wastes in borosilicate glasses, and their burial in large, centralised, mined repositories. Strong public opposition has been encountered because of concerns over safety and socio-political issues. The author develops a new disposal strategy, based on immobilisation of wastes in an extremely resistant ceramic, SYNROC, combined with burial in an array of widely dispersed, very deep drill holes. It is demonstrated that the difficulties encountered by conventional disposal strategies can be overcome by this new approach.