WorldWideScience

Sample records for extracting bottom information

  1. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Full Text Available Hyperspectral data is characterized by many continuous spectral bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphology to connect the symbols and give fifteen reference images. The results show that information extraction based on hyperspectral data yields a better visual experience, which benefits researchers studying ancient tombs and provides references for archaeological research.

  2. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data is characterized by many continuous spectral bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphology to connect the symbols and give fifteen reference images. The results show that information extraction based on hyperspectral data yields a better visual experience, which benefits researchers studying ancient tombs and provides references for archaeological research.
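The fixed-point scaling step mentioned in the abstract (reflectance × 10000 for compact storage) can be sketched as follows. This is an illustration only; the function names are not from the paper, and the paper does not specify its storage format.

```python
# Illustrative sketch: reflectance values in [0, 1] are multiplied by
# 10000 and rounded to integers, so they can be stored compactly
# (e.g. as 16-bit integers) instead of floats. Names are hypothetical.

def scale_reflectance(reflectance, factor=10000):
    """Convert float reflectance values to scaled integers."""
    return [round(r * factor) for r in reflectance]

def unscale_reflectance(scaled, factor=10000):
    """Recover approximate reflectance from scaled integers."""
    return [s / factor for s in scaled]

pixels = [0.0, 0.1234, 0.5, 0.9999]
scaled = scale_reflectance(pixels)
print(scaled)  # [0, 1234, 5000, 9999]
```

The round trip loses at most half of one scaling unit (here 0.00005) per value, which is negligible for visualization purposes.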

  3. An efficient and not polluting bottom ash extraction system

    International Nuclear Information System (INIS)

    Carrea, A.

    1992-01-01

    This paper reports that boiler waste water effluent must meet increasingly tight requirements to comply with environmental regulations; sluice water from bottom ash handling is one of the main problems in this context, and many utilities are working to maximize the reuse of sluice water and, if possible, to achieve zero water discharge from the bottom ash handling system. At the same time, ash reuse efforts are gaining strength in order to minimize waste production. One solution to these problems is an innovative Bottom Ash Extraction System (MAC System), distinguished by continuous dry ash removal; the system has been developed over the last four years by MAGALDI INDUSTRIE SRL in collaboration with ANSALDO Ricerche, the R and D department of ANSALDO, the main Italian boiler manufacturer, and is now installed in six ENEL boilers. Eliminating water as the separation element between the bottom of the furnace and the outside atmosphere is advantageous mainly from the environmental viewpoint, but a certain improvement in boiler efficiency has also been demonstrated by application of the system.

  4. Design of a tool for extracting a plexiglass falls to the bottom of the reactor pool TRIGA MKI

    International Nuclear Information System (INIS)

    Kankunku, P.K.; Lukanda, M.V.

    2011-01-01

    This paper presents a particular problem: extracting a plexiglass from the bottom of the reactor swimming pool. Two attempts using rudimentary extraction techniques were unsuccessful, so we proceeded to design a steel tool which solved the problem of plexiglass extraction.

  5. Bottom-Up Technologies for Reuse: Automated Extractive Adoption of Software Product Lines

    OpenAIRE

    Martinez , Jabier ,; Ziadi , Tewfik; Bissyandé , Tegawendé; Klein , Jacques ,; Le Traon , Yves ,

    2017-01-01

    International audience; Adopting Software Product Line (SPL) engineering principles demands a high up-front investment. Bottom-Up Technologies for Reuse (BUT4Reuse) is a generic and extensible tool aimed to leverage existing similar software products in order to help in extractive SPL adoption. The envisioned users are 1) SPL adopters and 2) Integrators of techniques and algorithms to provide automation in SPL adoption activities. We present the methodology it implies for both types of users ...

  6. Ecological and Economic Prerequisites for the Extraction of Solid Minerals from the Bottom of the Arctic Seas

    Directory of Open Access Journals (Sweden)

    Myaskov Alexander

    2017-01-01

    Full Text Available The World Ocean has huge reserves of minerals contained directly in the water, as well as on the surface of its bottom and in its subsoil. The deposits of solid minerals on the surface of the ocean bottom are considered the most promising for industrial extraction. Deposits of ferromanganese nodules, cobalt-manganese crusts, and polymetallic sulphides are most often considered for extraction. The largest deposits of ferromanganese nodules lie in the central and southern parts of the Pacific Ocean, in the central part of the Indian Ocean, and in the seas of the Arctic Ocean near Russia. Ferromanganese nodule deposits are a serious alternative to manganese ore deposits on land. However, many factors influence the efficiency of developing ferromanganese deposits; the most significant are the content of the useful component in the ore, the depth of the bottom, and the distance from seaports. The possible environmental consequences of underwater mining must also be taken into account.

  7. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations as well as relationships and events from text documents are described herein.

  8. Selective spatial attention modulates bottom-up informational masking of speech.

    Science.gov (United States)

    Carlile, Simon; Corkhill, Caitlin

    2015-03-02

    To hear out a conversation against other talkers listeners overcome energetic and informational masking. Largely attributed to top-down processes, information masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in information masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speech reception threshold; 40% of which was attributed to a release from informational masking. When across frequency temporal modulations in the masker talkers are decorrelated the speech is unintelligible, although the within frequency modulation characteristics remains identical. Used as a masker as above, the information masking accounted for 37% of the spatial unmasking seen with this masker. This unintelligible and highly differentiable masker is unlikely to involve top-down processes. These data provides strong evidence of bottom-up masking involving speech-like, within-frequency modulations and that this, presumably low level process, can be modulated by selective spatial attention.

  9. Selective spatial attention modulates bottom-up informational masking of speech

    OpenAIRE

    Carlile, Simon; Corkhill, Caitlin

    2015-01-01

    To hear out a conversation against other talkers listeners overcome energetic and informational masking. Largely attributed to top-down processes, information masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in information masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speec...

  10. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance.  While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and vid

  11. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  12. The Bottom Boundary Layer.

    Science.gov (United States)

    Trowbridge, John H; Lentz, Steven J

    2018-01-03

    The oceanic bottom boundary layer extracts energy and momentum from the overlying flow, mediates the fate of near-bottom substances, and generates bedforms that retard the flow and affect benthic processes. The bottom boundary layer is forced by winds, waves, tides, and buoyancy and is influenced by surface waves, internal waves, and stratification by heat, salt, and suspended sediments. This review focuses on the coastal ocean. The main points are that (a) classical turbulence concepts and modern turbulence parameterizations provide accurate representations of the structure and turbulent fluxes under conditions in which the underlying assumptions hold, (b) modern sensors and analyses enable high-quality direct or near-direct measurements of the turbulent fluxes and dissipation rates, and (c) the remaining challenges include the interaction of waves and currents with the erodible seabed, the impact of layer-scale two- and three-dimensional instabilities, and the role of the bottom boundary layer in shelf-slope exchange.

  13. The Bottom Boundary Layer

    Science.gov (United States)

    Trowbridge, John H.; Lentz, Steven J.

    2018-01-01

    The oceanic bottom boundary layer extracts energy and momentum from the overlying flow, mediates the fate of near-bottom substances, and generates bedforms that retard the flow and affect benthic processes. The bottom boundary layer is forced by winds, waves, tides, and buoyancy and is influenced by surface waves, internal waves, and stratification by heat, salt, and suspended sediments. This review focuses on the coastal ocean. The main points are that (a) classical turbulence concepts and modern turbulence parameterizations provide accurate representations of the structure and turbulent fluxes under conditions in which the underlying assumptions hold, (b) modern sensors and analyses enable high-quality direct or near-direct measurements of the turbulent fluxes and dissipation rates, and (c) the remaining challenges include the interaction of waves and currents with the erodible seabed, the impact of layer-scale two- and three-dimensional instabilities, and the role of the bottom boundary layer in shelf-slope exchange.

  14. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. 
We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We
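The abstract above describes discriminative sequence classifiers for named-entity recognition. Such classifiers commonly emit per-token BIO tags, which are then decoded into labeled phrases; the sketch below shows that decoding step. It is a generic illustration under that assumption, not the authors' actual system, and the entity label is hypothetical.

```python
# Hedged sketch: sequence classifiers for NER typically emit per-token
# BIO tags (B-egin, I-nside, O-utside). Decoding those tags into
# (label, phrase) spans is the step that organizes clinically
# significant terms and phrases. Generic illustration only.

def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (label, phrase) entity spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append(current)
            current = (tag[2:], [token])       # start a new entity
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)           # continue the entity
        else:
            if current:
                entities.append(current)
            current = None                     # outside any entity
    if current:
        entities.append(current)
    return [(label, " ".join(words)) for label, words in entities]

tokens = ["No", "acute", "intracranial", "hemorrhage", "is", "seen"]
tags   = ["O", "B-FINDING", "I-FINDING", "I-FINDING", "O", "O"]
print(decode_bio(tokens, tags))
# [('FINDING', 'acute intracranial hemorrhage')]
```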

  15. Content and the forms of heavy metals in bottom sediments in the zone of industrial pollution sources

    Directory of Open Access Journals (Sweden)

    Voytyuk Y.Y.

    2014-12-01

    Full Text Available Regularities in the distribution of heavy metals in bottom sediments in the zone of influence of the steel industry in Mariupol are established. Results on the forms of occurrence of Zn, Pb, Cu, Cr, and Ni are presented, and an ecological and geochemical assessment of sediment contamination by heavy metals is performed. The main sources of pollution of bottom sediments are airborne emissions from industrial plants, hydrogenous pollution from industrial sewage entering the water, sewage sludge, ash dumps, slag, ore, sludge, oil spills, and salt solutions. Hydrogenous pollution of sediments may be significant; contaminated sediments are a source of long-term contamination of water even after discharges of untreated wastewater into rivers cease. The gross content of heavy metals says little about the environmental condition of bottom sediments, because it does not reflect their transformation and further migration to adjacent environments; studying the forms of occurrence provides objective information for ecological and geochemical evaluation. The forms of heavy metals in the sediments were studied by sequential extraction, and the concentrations of heavy metals in the extracts were determined by atomic absorption spectrometry (CAS-115). It was established that a number of elements exceed their background values in the bottom sediments, likely due to their technogenic origin. Man-made pollution of the bottom sediments of Mariupol has disrupted the natural ratio of the forms of heavy metals: in the studied sediments, the ion-exchange form shows an increased content of heavy metals, which promotes their migration in the aquatic environment.

  16. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequencies content and several others. A few case studies including isotropic and heter...

  17. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    Koopmans, C.C.; Te Velde, D.W.; Groot, W.; Hendriks, J.H.A.

    1999-06-01

    The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure, in which new investments, adaptations to existing capital goods (retrofit) and 'good-housekeeping' are discerned. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based on either econometric models or on 'bottom-up information', i.e. disaggregated lists of technical possibilities for and costs of saving energy. Typically, one predicts more energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in capital stock takes place only gradually. Parameter estimates for 19 sectors point at a long-term technological energy efficiency improvement trend in Netherlands final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time series data. Simulations of the effects of the oil price shocks in the seventies and the subsequent fall of oil prices show that the NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. 
This suggests that it
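The abstract's point that a 15% discount rate narrows the set of profitable energy-efficiency measures can be illustrated with a net-present-value calculation. The investment figures below are invented for the example; only the 15% rate comes from the abstract.

```python
# Hedged illustration: at a high (15%) discount rate, an efficiency
# measure can look unprofitable even though it is profitable at a
# lower rate, which helps explain the "energy-efficiency gap".
# The cost and saving figures are made up for this example.

def npv(investment, annual_saving, years, rate):
    """Net present value of an efficiency measure."""
    return -investment + sum(annual_saving / (1 + rate) ** t
                             for t in range(1, years + 1))

# A measure costing 1000 that saves 180 per year for 10 years:
for rate in (0.05, 0.15):
    print(f"rate={rate:.0%}  NPV={npv(1000, 180, 10, rate):8.1f}")
```

At 5% the measure has a positive NPV; at 15% it is negative, so a bottom-up inventory assuming low discount rates predicts more adoption than firms actually undertake.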

  18. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    Full Text Available In this article we show how it is possible to use Channel Theory (Barwise and Seligman, 1997) for modeling the process of information extraction performed by audiences of audio-visual contents. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can undertake the extraction of information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process is carried out for each channel, we propose a method of representing all the informative values an agent can obtain from a content using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation allows reflecting the evolution of the informative items through the evolution of audio-visual content.

  19. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  20. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.Database URL. © The Author(s) 2016. Published by Oxford University Press.

  1. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  2. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising based on Donoho's wavelet soft-threshold denoising is investigated to remove pseudo-Gibbs artifacts from soft-thresholded images. OAS object information extraction based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting object information from an interferogram, and that information extraction with translation-invariant wavelet denoising is better than with plain soft-threshold wavelet denoising.
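The soft-threshold operation the abstract builds on can be sketched in a few lines: coefficients with magnitude below the threshold are zeroed, and larger ones are shrunk toward zero. This is the generic Donoho rule only; the paper applies it to wavelet coefficients and adds translation invariance (cycle spinning), neither of which is shown here.

```python
# Sketch of Donoho-style soft thresholding, the core shrinkage rule
# in wavelet soft-threshold denoising: zero small coefficients,
# shrink large ones toward zero by the threshold t.

def soft_threshold(coeffs, t):
    """Apply sign(c) * max(|c| - t, 0) to each coefficient."""
    out = []
    for c in coeffs:
        if c > t:
            out.append(c - t)
        elif c < -t:
            out.append(c + t)
        else:
            out.append(0.0)
    return out

print(soft_threshold([3.0, 0.4, -2.5, -0.1], 1.0))
# [2.0, 0.0, -1.5, 0.0]
```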

  3. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    Science.gov (United States)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo; Gregersen, Niels; Bellet-Amalric, Edith; Nogues, Gilles; Kheng, Kuntheak

    2017-11-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach to fabricate a photonic fiberlike structure around such nanowire quantum dots by depositing an oxide shell using atomic-layer deposition. Simulations suggest that the intensity collected in our NA = 0.6 microscope objective can be increased by a factor of 7 with respect to the bare nanowire case. Combining microphotoluminescence, decay time measurements, and numerical simulations, we obtain a fourfold increase in the collected photoluminescence from the quantum dot. We show that this improvement is due to an increase of the quantum-dot emission rate and a redirection of the emitted light. Our ex situ fabrication technique allows precise and reproducible fabrication on a large scale. Its improved extraction efficiency is compared to state-of-the-art top-down devices.

  4. Cellular Mutagenicity and Heavy Metal Concentrations of Leachates Extracted from the Fly and Bottom Ash Derived from Municipal Solid Waste Incineration

    Science.gov (United States)

    Chen, Po-Wen; Liu, Zhen-Shu; Wun, Min-Jie; Kuo, Tai-Chen

    2016-01-01

    Two incinerators in Taiwan have recently attempted to reuse the fly and bottom ash that they produce, but the mutagenicity of these types of ash has not yet been assessed. Therefore, we evaluated the mutagenicity of the ash with the Ames mutagenicity assay using the TA98, TA100, and TA1535 bacterial strains. We obtained three leachates from three leachants of varying pH values using the toxicity characteristic leaching procedure test recommended by the Taiwan Environmental Protection Agency (Taiwan EPA). We then performed the Ames assay on the harvested leachates. To evaluate the possible relationship between the presence of heavy metals and mutagenicity, the concentrations of five heavy metals (Cd, Cr, Cu, Pb, and Zn) in the leachates were also determined. The concentrations of Cd and Cr in the most acidic leachate from the precipitator fly ash and the Cd concentration in the most acidic leachate from the boiler fly ash exceeded the recommended limits. Notably, none of the nine leachates extracted from the boiler, precipitator, or bottom ashes displayed mutagenic activity. This data partially affirms the safety of the fly and bottom ash produced by certain incinerators. Therefore, the biotoxicity of leachates from recycled ash should be routinely monitored before reusing the ash. PMID:27827867

  5. Top-Down and Bottom-Up Identification of Proteins by Liquid Extraction Surface Analysis Mass Spectrometry of Healthy and Diseased Human Liver Tissue

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J.; Lalor, Patricia F.; Bunch, Josephine; Cooper, Helen J.

    2014-09-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  6. Nanoelectronics «bottom – up»: thermodynamics of electric conductor, information-driven battery and quantum entropy

    Directory of Open Access Journals (Sweden)

    Юрий Алексеевич Кругляк

    2015-11-01

    Full Text Available Within the «bottom – up» approach of nanoelectronics, the equilibrium thermodynamics of a current-carrying conductor is presented, and the accumulation of information in a non-equilibrium state is discussed through an analysis of an information-driven battery model, in connection with the Landauer principle on the minimum energy needed to erase one bit of information. The concept of quantum entropy is introduced, and the importance of integrating spintronics and magnetronics in connection with the upcoming development of spin architectures for computing devices is discussed.

  7. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information such as "zidousya no uriage ga koutyou" ("Sales of cars were good"). Cause information is useful for investors in selecting companies to invest in. Our method extracts cause information in the form of causal expressions by using statistical information and initial clue expressions automatically. It can extract causal expressions without predetermined patterns or complex hand-crafted rules, and is expected to be applicable to other tasks of acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that the new method outperforms the previous one.

  8. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  9. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  10. Bottom Dissolved Oxygen Maps From SEAMAP Summer Groundfish/Shrimp Surveys

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Bottom dissolved oxygen (DO) data was extracted from environmental profiles acquired during the Southeast Fisheries Science Center Mississippi Laboratories summer...

  11. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes among recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in practice. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information; besides word and punctuation indexes, it includes word lexical attributes and the prediction results of classifiers. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.
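The two-step pipeline can be sketched in miniature: first label each resume line with a block type from simple writing-style cues, then run attribute extractors inside the identified blocks. The block keywords, the regular expression, and the sample resume below are illustrative stand-ins; the paper's actual Writing Style feature and trained classifiers are far richer.

```python
import re

# Step 1: label each resume line with a block type using simple
# keyword cues (a crude stand-in for the paper's Writing Style feature).
BLOCK_KEYWORDS = {
    "education": ["university", "bachelor", "master", "gpa"],
    "experience": ["engineer", "manager", "company", "intern"],
    "contact": ["@", "phone", "tel", "email"],
}

def label_line(line):
    lower = line.lower()
    for block, keywords in BLOCK_KEYWORDS.items():
        if any(k in lower for k in keywords):
            return block
    return "other"

# Step 2: attribute extractors applied inside the blocks found in step 1.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def extract_facts(resume_text):
    facts = {"emails": [], "education_lines": []}
    for line in resume_text.splitlines():
        block = label_line(line)
        if block == "contact":
            facts["emails"] += EMAIL_RE.findall(line)
        elif block == "education":
            facts["education_lines"].append(line.strip())
    return facts

resume = """Jane Doe
Email: jane.doe@example.com
Acme Company, Software Engineer
B.Sc., Example University, GPA 3.8"""
print(extract_facts(resume))
```

The point of the two-step design survives even in this toy form: step 2 extractors only fire inside blocks that step 1 has already typed, so a phone-like number in an experience block is never mistaken for contact data.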

  12. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. We describe an agent that deals with the problems of extracting internet information caused by the non-standard and inconsistent structure of Chinese websites. The agent comprises three modules, each responsible for one stage of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to structure the extracted information using natural language is also discussed.

  13. Intelligent Evaluation Method of Tank Bottom Corrosion Status Based on Improved BP Artificial Neural Network

    Science.gov (United States)

    Qiu, Feng; Dai, Guang; Zhang, Ying

According to the acoustic emission information and the appearance inspection information from online testing of tank bottoms, the external factors associated with tank bottom corrosion status are identified. Applying an artificial neural network intelligent evaluation method, three tank bottom corrosion status evaluation models are established, based on appearance inspection information, acoustic emission information, and online testing information respectively. Compared with the results of acoustic emission online testing on an evaluation test sample, the accuracy of the evaluation model based on online testing information is 94 %. The model can evaluate tank bottom corrosion accurately and realizes intelligent evaluation of acoustic emission online testing of tank bottoms.

  14. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

Information extraction techniques that derive structured representations from unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide the process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90 % of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with f1 = .989 (micro average) and f1 = .963 (macro average). As a result of keyword matching and restraint concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layouts. The developed terminology and the proposed information extraction system allow fine-grained information to be extracted from German semi-structured transthoracic echocardiography reports.

  15. Uncertainty quantification for radiation measurements: Bottom-up error variance estimation using calibration information

    International Nuclear Information System (INIS)

    Burr, T.; Croft, S.; Krieger, T.; Martin, K.; Norman, C.; Walsh, S.

    2016-01-01

One example of top-down uncertainty quantification (UQ) involves comparing two or more measurements on each of multiple items. One example of bottom-up UQ expresses a measurement result as a function of one or more input variables that have associated errors, such as a measured count rate, which individually (or collectively) can be evaluated for impact on the uncertainty in the resulting measured value. In practice, it is often found that top-down UQ exhibits larger error variances than bottom-up UQ, because some error sources are present in the fielded assay methods used in top-down UQ that are not present (or not recognized) in the assay studies used in bottom-up UQ. One would like better consistency between the two approaches in order to claim understanding of the measurement process. The purpose of this paper is to refine bottom-up uncertainty estimation by using calibration information so that if there are no unknown error sources, the refined bottom-up uncertainty estimate will agree with the top-down uncertainty estimate to within a specified tolerance. Then, in practice, if the top-down uncertainty estimate is larger than the refined bottom-up uncertainty estimate by more than the specified tolerance, there must be omitted sources of error beyond those predicted from calibration uncertainty. The paper develops a refined bottom-up uncertainty approach for four cases of simple linear calibration: (1) inverse regression with negligible error in predictors, (2) inverse regression with non-negligible error in predictors, (3) classical regression followed by inversion with negligible error in predictors, and (4) classical regression followed by inversion with non-negligible errors in predictors. Our illustrations are of general interest, but are drawn from our experience with nuclear material assay by non-destructive assay. The main example we use is gamma spectroscopy that applies the enrichment meter principle. Previous papers that ignore error in predictors
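A minimal sketch of case (3), classical straight-line calibration followed by inversion with negligible error in predictors, illustrates the bottom-up propagation: fit y = a + bx, invert a new response y0, and apply the standard first-order (delta-method) variance approximation. The data and values below are synthetic; the paper's refined approach handles further cases and calibration error sources.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; also returns the
    residual variance and summaries needed for inversion uncertainty."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    a = ybar - b * xbar
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    s2 = sum(r * r for r in resid) / (n - 2)   # residual variance
    return a, b, s2, xbar, sxx, n

def invert_with_uncertainty(a, b, s2, xbar, sxx, n, y0, m=1):
    """Point estimate and approximate standard error of x0 given a new
    response y0 (the mean of m replicate responses)."""
    x0 = (y0 - a) / b
    # Standard first-order calibration-inversion variance approximation
    var = (s2 / b**2) * (1.0 / m + 1.0 / n + (x0 - xbar) ** 2 / sxx)
    return x0, math.sqrt(var)

# Synthetic calibration data near the true line y = 2 + 3x
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.9, 8.1, 10.9, 14.1]
a, b, s2, xbar, sxx, n = fit_line(xs, ys)
x0, se = invert_with_uncertainty(a, b, s2, xbar, sxx, n, y0=9.5)
print(round(x0, 3), round(se, 4))
```

The three terms inside the variance correspond to replicate scatter in the new response, uncertainty in the fitted intercept, and uncertainty in the fitted slope amplified away from the calibration mean, which is the bottom-up decomposition the abstract refers to.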

  16. Status analysis of keyhole bottom in laser-MAG hybrid welding process.

    Science.gov (United States)

    Wang, Lin; Gao, Xiangdong; Chen, Ziqin

    2018-01-08

The keyhole status is a determining factor of weld quality in the laser-metal active gas arc (MAG) hybrid welding process. For a better evaluation of the hybrid welding process, three welding experiments with different penetration states were conducted in this work: partial penetration, normal (full) penetration, and excessive penetration. The instantaneous visual phenomena, including metallic vapor, spatters, and the keyhole on the bottom surface, were used to evaluate the keyhole status with a double high-speed camera system. The Fourier transform was applied to the bottom weld pool image to remove image noise around the keyhole, and the image was then reconstructed through the inverse Fourier transform. Finally, the keyhole bottom was extracted from the de-noised bottom weld pool image. By analyzing the visual features of the laser-MAG hybrid welding process, the mechanisms of the closed and opened keyhole bottom were revealed. The results show that a stable opened or closed keyhole bottom is directly affected by the MAG droplet transition in normal penetration welding, whereas an unstable opened or closed keyhole bottom appears in excessive penetration and partial penetration welding. The analysis method proposed in this paper could be used to monitor keyhole stability in the laser-MAG hybrid welding process.
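The Fourier de-noising step can be illustrated on a one-dimensional intensity profile: take the DFT, zero the high-frequency bins, and invert. The naive O(n^2) transform and the synthetic signal below are only a sketch of the principle; the paper applies the 2D transform to bottom weld pool images.

```python
import cmath
import math

def dft(signal):
    """Naive O(n^2) discrete Fourier transform."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(spectrum):
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def lowpass(signal, keep):
    """Zero every DFT bin except the `keep` lowest frequencies
    (and their symmetric counterparts), then invert."""
    spec = dft(signal)
    n = len(spec)
    filtered = [c if (k <= keep or k >= n - keep) else 0.0
                for k, c in enumerate(spec)]
    return idft(filtered)

# Smooth profile plus alternating "noise" at the highest frequency
n = 32
clean = [math.cos(2 * math.pi * t / n) for t in range(n)]
noisy = [clean[t] + 0.3 * (-1) ** t for t in range(n)]
denoised = lowpass(noisy, keep=2)
err = max(abs(d - c) for d, c in zip(denoised, clean))
print(err < 1e-6)
```

Because the alternating noise lives entirely in the highest-frequency bin, the low-pass mask removes it exactly, which is the same reasoning the paper uses to suppress noise around the keyhole before extracting it.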

  17. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of semivolatile organic compounds in bottom sediment by solvent extraction, gel permeation chromatographic fractionation, and capillary-column gas chromatography/mass spectrometry

    Science.gov (United States)

    Furlong, E.T.; Vaught, D.G.; Merten, L.M.; Foreman, W.T.; Gates, Paul M.

    1996-01-01

A method for the determination of 79 semivolatile organic compounds (SOCs) and 4 surrogate compounds in soils and bottom sediment is described. The SOCs are extracted from bottom sediment by solvent extraction, followed by partial isolation using high-performance gel permeation chromatography (GPC). The SOCs then are qualitatively identified and quantitative concentrations determined by capillary-column gas chromatography/mass spectrometry (GC/MS). This method also is designed for an optional simultaneous isolation of polychlorinated biphenyls (PCBs) and organochlorine (OC) insecticides, including toxaphene. When OCs and PCBs are determined, an additional alumina-over-silica column chromatography step follows GPC cleanup, and quantitation is by dual capillary-column gas chromatography with electron-capture detection (GC/ECD). Bottom-sediment samples are centrifuged to remove excess water and extracted overnight with dichloromethane. The extract is concentrated, centrifuged, and then filtered through a 0.2-micrometer polytetrafluoroethylene syringe filter. Two aliquots of the sample extract then are quantitatively injected onto two polystyrene-divinylbenzene GPC columns connected in series. The SOCs are eluted with dichloromethane, a fraction containing the SOCs is collected, and some coextracted interferences, including elemental sulfur, are separated and discarded. The SOC-containing GPC fraction then is analyzed by GC/MS. When desired, a second aliquot from GPC is further processed for OCs and PCBs by combined alumina-over-silica column chromatography. The two fractions produced in this cleanup then are analyzed by GC/ECD. This report fully describes and is limited to the determination of SOCs by GC/MS.

  18. Remediation Performance and Mechanism of Heavy Metals by a Bottom Up Activation and Extraction System Using Multiple Biochemical Materials.

    Science.gov (United States)

    Xiao, Kemeng; Li, Yunzhen; Sun, Yang; Liu, Ruyue; Li, Junjie; Zhao, Yun; Xu, Heng

    2017-09-13

Soil contamination with heavy metals has caused serious environmental problems and increased the risks to humans and biota. Herein, we developed an effective bottom-up metal removal system based on the synergy between the activation of immobilized metal-resistant bacteria and extraction by a bioaccumulator material (Stropharia rugosoannulata). In this system, the advantages of biochar produced at 400 °C and sodium alginate were combined to immobilize the bacteria. Optimized by response surface methodology, the biochar and bacterial suspension were mixed at a ratio of 1:20 (w:v) for 12 h, with 2.5% sodium alginate added to the mixture. Results demonstrated that the system significantly increased the proportion of acid-soluble Cd and Cu and improved the soil microecology (microbial counts, soil respiration, and enzyme activities). The maximum extractions of Cd and Cu were 8.79 and 77.92 mg kg⁻¹, respectively. Moreover, details of the possible mechanism of metal removal are discussed, which indicate a positive correlation with the acetic acid extractable metals and soil microecology. Meanwhile, the "dilution effect" in S. rugosoannulata probably plays an important role in the metal removal process. Furthermore, the metal-resistant bacteria in this system were successfully colonized, and the soil bacterial community was evaluated to understand the microbial diversity in metal-contaminated soil after remediation.

  19. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and the challenges that the ext...

  20. Contact effects analyzed by a parameter extraction method based on a single bottom-gate/top-contact organic thin-film transistor

    Science.gov (United States)

    Takagaki, Shunsuke; Yamada, Hirofumi; Noda, Kei

    2018-03-01

    Contact effects in organic thin-film transistors (OTFTs) were examined by using our previously proposed parameter extraction method from the electrical characteristics of a single staggered-type device. Gate-voltage-dependent contact resistance and channel mobility in the linear regime were evaluated for bottom-gate/top-contact (BGTC) pentacene TFTs with active layers of different thicknesses, and for pentacene TFTs with contact-doped layers prepared by coevaporation of pentacene and tetrafluorotetracyanoquinodimethane (F4TCNQ). The extracted parameters suggested that the influence of the contact resistance becomes more prominent with the larger active-layer thickness, and that contact-doping experiments give rise to a drastic decrease in the contact resistance and a concurrent considerable improvement in the channel mobility. Additionally, the estimated energy distributions of trap density in the transistor channel probably reflect the trap filling with charge carriers injected into the channel regions. The analysis results in this study confirm the effectiveness of our proposed method, with which we can investigate contact effects and circumvent the influences of characteristic variations in OTFT fabrication.

  1. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

Full Text Available Biomedical text mining targets the extraction of significant information from biomedical archives. It encompasses Information Retrieval (IR) and Information Extraction (IE). Information retrieval retrieves the relevant biomedical literature documents from repositories such as PubMed and MedLine, based on a search query; the IR process ends with the generation of a corpus of relevant documents retrieved from the publication databases. The IE task includes preprocessing of the documents, Named Entity Recognition (NER), and relationship extraction, using natural language processing, data mining techniques, and machine learning algorithms. Preprocessing includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins, or cell lines. This leads to the next phase, the extraction of relationships (IE). The work was based on the machine learning algorithm Conditional Random Fields (CRF).
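The NER phase can be approximated in miniature by gazetteer lookup over tokens. The entity names below are illustrative, and a real system like the one described would use a trained CRF over much richer features rather than exact dictionary matching.

```python
import re

# Tiny illustrative gazetteer; real biomedical NER uses curated
# resources and a trained CRF rather than exact lookup.
GAZETTEER = {
    "brca1": "GENE",
    "tp53": "GENE",
    "hela": "CELL_LINE",
}

def tokenize(text):
    """Split into word tokens and single punctuation marks."""
    return re.findall(r"\w+|[^\w\s]", text)

def tag_entities(text):
    """Return (token, label) pairs; 'O' marks non-entity tokens."""
    return [(tok, GAZETTEER.get(tok.lower(), "O")) for tok in tokenize(text)]

sentence = "Mutations in BRCA1 and TP53 were studied in HeLa cells."
print(tag_entities(sentence))
```

Even this toy tagger shows why preprocessing (tokenization, case normalization) precedes recognition: "HeLa" only matches the gazetteer after lowercasing.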

  2. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
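The line-fitting step can be sketched with a minimal Hough transform: every edge point votes for all (rho, theta) cells consistent with it, and the strongest cell gives a dominant line. The synthetic edge points below stand in for real lane edges; the paper additionally uses gradient direction, the probabilistic variant, and vanishing-point geometry.

```python
import math
from collections import Counter

def hough_lines(points, theta_steps=180, rho_res=1.0):
    """Each point votes for every (rho, theta) cell it is consistent
    with, where rho = x*cos(theta) + y*sin(theta); the cell with the
    most votes is returned as the dominant line."""
    votes = Counter()
    for x, y in points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(round(rho / rho_res), i)] += 1
    (rho_bin, theta_bin), _ = votes.most_common(1)[0]
    return rho_bin * rho_res, math.pi * theta_bin / theta_steps

# Synthetic "edge pixels" along the vertical line x = 5
points = [(5.0, float(t)) for t in range(20)]
rho, theta = hough_lines(points)
# every edge point should lie close to the recovered line
max_dev = max(abs(x * math.cos(theta) + y * math.sin(theta) - rho)
              for x, y in points)
print(max_dev < 1.0)
```

The check at the end verifies a property rather than an exact bin, since neighbouring accumulator cells can tie; production detectors handle this with coarser bins and peak non-maximum suppression.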

  3. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be interesting to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques to automatically extract and classify information from the Web. Its goal is to keep the system updated and obtain information about third-party services that are not offered by service providers inside the system.

  4. Bottom head assembly

    International Nuclear Information System (INIS)

    Fife, A.B.

    1998-01-01

A bottom head dome assembly is described which includes, in one embodiment, a bottom head dome and a liner configured to be positioned proximate the bottom head dome. The bottom head dome has a plurality of openings extending therethrough. The liner also has a plurality of openings extending therethrough, and each liner opening aligns with a respective bottom head dome opening. A seal is formed, such as by welding, between the liner and the bottom head dome to resist entry of water between the liner and the bottom head dome at the edge of the liner. In the one embodiment, a plurality of stub tubes are secured to the liner. Each stub tube has a bore extending therethrough, and each stub tube bore is coaxially aligned with a respective liner opening. A seat portion is formed by each liner opening for receiving a portion of the respective stub tube. The assembly also includes a plurality of support shims positioned between the bottom head dome and the liner for supporting the liner. In one embodiment, each support shim includes a support stub having a bore therethrough, and each support stub bore aligns with a respective bottom head dome opening. 2 figs

  5. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply a perception metric, Just-Noticeable-Difference, to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.

  6. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

Full Text Available 3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply a perception metric, Just-Noticeable-Difference, to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.

  7. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

Full Text Available Information extraction is an early stage of textual data analysis. It is required to obtain information from textual data that can be used in analysis processes such as classification and categorization. Textual data are strongly influenced by language. Arabic is gaining significant attention in many studies because the Arabic language is very different from others, and, in contrast to other languages, tools and research on Arabic are still lacking. The information extracted using a knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert, which takes a long time and is specific to one problem only. This paper proposes a method for automatically building a knowledge dictionary. The dictionary is formed by classifying sentences having the same concept, assuming that they will have a high similarity value. The extracted concepts can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested using a decision tree classification engine; the highest precision value obtained was 71.0% and the highest recall value was 75.0%.
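The core grouping idea, clustering sentences whose similarity is high, can be sketched with bag-of-words cosine similarity and greedy single-link grouping. The threshold and the English example sentences are illustrative only; the paper operates on Arabic text with its own similarity settings.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity of two token lists via word-count vectors."""
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(sentences, threshold=0.5):
    """Greedy grouping: put each sentence into the first cluster whose
    representative (first member) is similar enough, else start a new one."""
    clusters = []
    for sent in sentences:
        words = sent.lower().split()
        for group in clusters:
            if cosine(words, group[0].lower().split()) >= threshold:
                group.append(sent)
                break
        else:
            clusters.append([sent])
    return clusters

sentences = [
    "the price of oil rose sharply",
    "oil price rose sharply this week",
    "the committee met on tuesday",
]
print(cluster(sentences))
```

Sentences sharing a concept end up in one group, which is exactly the state from which a concept entry of the knowledge dictionary would be derived.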

  8. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  9. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, designed to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM), and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured way and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm, allowing toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that occurred in the past and make arrangements ahead of schedule for defense and disaster reduction. The technology can thus help reduce casualties and property damage, which is of great significance to the state and society.
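The address-joining idea behind CARM can be sketched as ordering recognized toponym pieces by a fixed administrative hierarchy. The hierarchy levels and place names below are illustrative, and simple ordered concatenation stands in for the paper's hierarchical model with shortest-path matching.

```python
# Administrative levels from coarse to fine; illustrative, not CARM's
# actual hierarchy data.
HIERARCHY = ["province", "city", "district", "street"]

def join_address(pieces):
    """pieces: dict mapping hierarchy level -> recognized toponym.
    Emit the pieces in administrative order, skipping missing levels."""
    return " ".join(pieces[level] for level in HIERARCHY if level in pieces)

# Recognized pieces arrive in arbitrary order from the extractor
pieces = {"city": "Chengdu City", "province": "Sichuan Province",
          "street": "Renmin Road"}
print(join_address(pieces))
```

The hierarchy imposes a canonical order regardless of the order in which the extractor found the pieces, and missing levels (here, the district) simply leave a gap rather than breaking the address.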

  10. Novel extraction induced by microemulsion breaking: a model study for Hg extraction from Brazilian gasoline.

    Science.gov (United States)

    Vicentino, Priscila O; Cassella, Ricardo J

    2017-01-01

This paper proposes a novel approach for the extraction of Hg from Brazilian gasoline samples: extraction induced by microemulsion breaking (EIMB). In this approach, a microemulsion is formed by mixing the sample with n-propanol and HCl. Afterwards, the microemulsion is destabilized by the addition of water and the two phases are separated: (i) the top phase, containing the residual gasoline, and (ii) the bottom phase, containing the extracted analyte in a medium of water, n-propanol, and the ethanol originally present in the gasoline sample. The bottom phase is then collected and the Hg is measured by cold vapor atomic absorption spectrometry (CV-AAS). This model study used Brazilian gasoline samples spiked with Hg (as an organometallic compound) to optimize the process. Under the optimum extraction conditions, the microemulsion was prepared by mixing 8.7 mL of sample with 1.2 mL of n-propanol and 0.1 mL of a 10 mol L⁻¹ HCl solution. Emulsion breaking was induced by adding 300 µL of deionized water, and the bottom phase was collected for the measurement of Hg. Six samples of Brazilian gasoline were spiked with Hg in the organometallic form, and recovery percentages in the range of 88-109% were observed.

  11. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and drawing high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing images can effectively improve the accuracy of information extraction with their rich texture and geometry information. Therefore, it is feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the Estimation of Scale Parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution images and selecting spectral, texture, geometric, and landform features, extraction rules are established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, providing important technical support for post-disaster emergency investigation and disaster assessment.

  12. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  13. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  14. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
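The extraction logic can be sketched with nested dictionaries standing in for the DICOM structured-report content tree: walk the tree and collect values under dose-relevant concept names. The field names and numbers below are illustrative; the study's program read real DICOM dose report files in Matlab.

```python
# Stand-in for a parsed DICOM dose report: nested dicts play the role
# of the structured-report content tree (names and values are
# illustrative; a real reader would walk actual DICOM elements).
dose_report = {
    "ConceptName": "X-Ray Radiation Dose Report",
    "Content": [
        {"ConceptName": "CT Acquisition", "Content": [
            {"ConceptName": "Mean CTDIvol", "Value": 12.4, "Units": "mGy"},
            {"ConceptName": "DLP", "Value": 430.1, "Units": "mGy.cm"},
        ]},
        {"ConceptName": "CT Acquisition", "Content": [
            {"ConceptName": "Mean CTDIvol", "Value": 8.2, "Units": "mGy"},
            {"ConceptName": "DLP", "Value": 150.5, "Units": "mGy.cm"},
        ]},
    ],
}

def collect(node, concept_name, found=None):
    """Recursively gather every value stored under a concept name."""
    if found is None:
        found = []
    if node.get("ConceptName") == concept_name and "Value" in node:
        found.append(node["Value"])
    for child in node.get("Content", []):
        collect(child, concept_name, found)
    return found

total_dlp = sum(collect(dose_report, "DLP"))
print(collect(dose_report, "Mean CTDIvol"), total_dlp)
```

Because the values are read from structured elements rather than recognized from a dose-screen image, there is no character-recognition step to introduce errors, which is the advantage the abstract reports.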

  15. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses.   ·         Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  16. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Many methods are utilized to extract and process query results in the deep Web, relying on the different structures of Web pages and various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we therefore present an approach for post-processing deep Web query results based on a domain ontology, which can utilize these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks that are relevant to a specific domain after reducing noisy nodes. The feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which can not only remove the limit of Web page structures when extracting data information, but also make use of the semantic meanings of the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.
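
    The node-similarity and vector space model (VSM) machinery that models like BIM and RSEM build on can be illustrated by cosine similarity between term-frequency vectors; the actual models are richer than this hedged sketch.

```python
# Toy VSM cosine similarity between two text fragments, as used when
# comparing DOM nodes or result records by their term content.
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    va, vb = Counter(text_a.split()), Counter(text_b.split())
    common = set(va) & set(vb)
    dot = sum(va[t] * vb[t] for t in common)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

print(round(cosine_similarity("deep web data extraction",
                              "web data extraction method"), 3))  # → 0.75
```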

  17. Reducing Heavy Metal Element from Coal Bottom Ash by Using Citric Acid Leaching Treatment

    Directory of Open Access Journals (Sweden)

    Yahya Ahmad Asyari

    2017-01-01

    Coal ash is the residue produced during coal combustion, for instance fly ash, bottom ash or boiler slag. With the growth in coal-burning power stations, the huge amount of coal bottom ash (CBA) is considered a hazardous material and is normally disposed of in an on-site disposal system without any commercialization purpose. Previous researchers have studied the extraction of silica from agricultural wastes such as palm ash, rice husk ash (RHA) and CBA by using the leaching treatment method. In this study, a weaker acid, citric acid solution, was used to replace the strong acid in the leaching treatment process. Results showed that the content of heavy metals such as copper (Cu), zinc (Zn) and lead (Pb) can be decreased. Meanwhile, silica can be extracted up to 44% from coal bottom ash using citric acid leaching treatment under the optimum reaction time of 60 minutes, with a solution temperature of 60°C and a citric acid concentration of more than 2%.

  18. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  19. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognition of clinical events and the recognition of temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high-performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
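
    In spirit, a rule-based strategy for normalizing relative time expressions can look like the hedged sketch below, which resolves phrases such as "3 days ago" against an anchor date; MedTime's actual rule set is far more elaborate.

```python
# Toy relative-time normalization: map "<n> days/weeks ago" onto a
# calendar date relative to an anchor date (e.g. the admission date).
import re
from datetime import date, timedelta

UNITS = {"day": 1, "week": 7}

def normalize_relative(expr, anchor):
    match = re.fullmatch(r"(\d+)\s+(day|week)s?\s+ago", expr.strip())
    if not match:
        return None  # expression not covered by this toy rule set
    count, unit = int(match.group(1)), match.group(2)
    return anchor - timedelta(days=count * UNITS[unit])

anchor = date(2012, 6, 15)
print(normalize_relative("3 days ago", anchor))   # → 2012-06-12
print(normalize_relative("2 weeks ago", anchor))  # → 2012-06-01
```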

  20. RELATING BOTTOM QUARK MASS IN DR-BAR AND MS-BAR REGULARIZATION SCHEMES

    International Nuclear Information System (INIS)

    2002-01-01

    The value of the bottom quark mass at Q = M_Z in the DR-bar scheme is an important input for the analysis of supersymmetric models with a large value of tan β. Conventionally, however, the running bottom quark mass extracted from experimental data is quoted in the MS-bar scheme at the scale Q = m_b. We describe a two-loop procedure for the conversion of the bottom quark mass from the MS-bar to the DR-bar scheme. The Particle Data Group value m_b^MS-bar(m_b^MS-bar) = 4.2 ± 0.2 GeV corresponds to a range of 2.65-3.03 GeV for m_b^DR-bar(M_Z)

  1. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed-building information, due to its extreme versatility and almost all-weather, day-and-night working capability. In view of the fact that the inherent statistical distribution of speckle in SAR images is not normally used to extract collapsed-building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution from SAR images is used to reflect the uniformity of the target. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but can also be applied to extract collapsed-building information from single-, dual- or full-polarization SAR data. The RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyse the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for data selection in collapsed-building information extraction.

  2. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  3. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    The exchange of patient health information across different organizations involved in healthcare delivery has potential benefits for a wide range of stakeholders. However, many governments in Europe and in the U.S. have, despite both top-down and bottom-up initiatives, experienced major barriers in achieving sustainable models for implementing health information exchange (HIE) throughout their healthcare systems. In the case of the U.S., three years after stimulus funding was allocated as part of the 2009 HITECH Act, the extent to which government funding will be needed to sustain the health information organizations (HIOs) that facilitate HIE across regional stakeholders remains an unanswered question. This research investigates the impacts of top-down and bottom-up initiatives on the evolutionary paths of HIOs in two contiguous states in the U.S. (New Jersey and New York) which had different starting...

  4. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
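
    The edit-distance clustering step can be sketched as follows; the word-alignment probabilities the work combines with it are omitted here, and the canonical form is chosen simply as the most frequent variant within an edit-distance radius.

```python
# Hedged sketch: group name spelling variants by Levenshtein distance
# and normalize each to the most frequent nearby variant.
from collections import Counter

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

def normalize_variants(names, max_dist=2):
    """Map each name to the most frequent variant within max_dist edits."""
    counts = Counter(names)
    canon = {}
    for name in names:
        candidates = [c for c in counts if edit_distance(name, c) <= max_dist]
        canon[name] = max(candidates, key=lambda c: counts[c])
    return canon

variants = ["Qaddafi", "Gaddafi", "Gadhafi", "Gaddafi"]
print(normalize_variants(variants))
```

    In the paper's setting, the clusters are additionally constrained by alignment to the same pivot name in the source language, which this sketch does not model.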

  5. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    Science.gov (United States)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

    The distribution law of materials in smelting products is key to cost accounting and contaminant control. However, the distribution law is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraints in copper bottom-blowing smelting. Simulation of the entire process of bottom-blowing copper smelting was implemented using the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can basically reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.

  6. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels … and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  7. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Information extraction (IE); text mining; text repositories; knowledge discovery from .... general purpose English words. However ... of precision and recall, as extensive experimentation is required due to lack of public tagged corpora. 4. Mining ...

  8. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    The subject of information quantity is approached in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study. Raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The results of the entropy analysis point out the information that needs to be acquired by the involved organisation, presented as a specific knowledge type.
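
    The underlying Shannon entropy formula, H = -Σ p_i · log2(p_i), can be computed directly from observed category frequencies; the nonconformity categories below are invented for illustration.

```python
# Shannon entropy (in bits) of the empirical distribution of observed
# categories, e.g. hypothetical nonconformity types on a shop floor.
import math
from collections import Counter

def shannon_entropy(observations):
    counts = Counter(observations)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

defects = ["scratch", "scratch", "dent", "crack"]
print(round(shannon_entropy(defects), 3))  # → 1.5 (bits)
```

    With probabilities 0.5, 0.25 and 0.25, H = 0.5·1 + 0.25·2 + 0.25·2 = 1.5 bits; a single repeated category gives H = 0.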

  9. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  10. 76 FR 19125 - Bottom Mount Combination Refrigerator-Freezers From Korea and Mexico

    Science.gov (United States)

    2011-04-06

    ...)] Bottom Mount Combination Refrigerator-Freezers From Korea and Mexico AGENCY: United States International... bottom mount combination refrigerator-freezers from Korea and Mexico, provided for in subheadings 8418.10... five business days thereafter, or by May 23, 2011. For further information concerning the conduct of...

  11. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  12. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that the detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit

  13. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    The weak information extraction is one of the important research contents in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. Models for weak information extraction are realized on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  14. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    DEFF Research Database (Denmark)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo

    2017-01-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach...

  15. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. 
Although the scope of our annotation is currently limited to a single

  16. Multi-angle backscatter classification and sub-bottom profiling for improved seafloor characterization

    Science.gov (United States)

    Alevizos, Evangelos; Snellen, Mirjam; Simons, Dick; Siemes, Kerstin; Greinert, Jens

    2018-06-01

    This study applies three classification methods exploiting the angular dependence of acoustic seafloor backscatter along with high resolution sub-bottom profiling for seafloor sediment characterization in the Eckernförde Bay, Baltic Sea Germany. This area is well suited for acoustic backscatter studies due to its shallowness, its smooth bathymetry and the presence of a wide range of sediment types. Backscatter data were acquired using a Seabeam1180 (180 kHz) multibeam echosounder and sub-bottom profiler data were recorded using a SES-2000 parametric sonar transmitting 6 and 12 kHz. The high density of seafloor soundings allowed extracting backscatter layers for five beam angles over a large part of the surveyed area. A Bayesian probability method was employed for sediment classification based on the backscatter variability at a single incidence angle, whereas Maximum Likelihood Classification (MLC) and Principal Components Analysis (PCA) were applied to the multi-angle layers. The Bayesian approach was used for identifying the optimum number of acoustic classes because cluster validation is carried out prior to class assignment and class outputs are ordinal categorical values. The method is based on the principle that backscatter values from a single incidence angle express a normal distribution for a particular sediment type. The resulting Bayesian classes were well correlated to median grain sizes and the percentage of coarse material. The MLC method uses angular response information from five layers of training areas extracted from the Bayesian classification map. The subsequent PCA analysis is based on the transformation of these five layers into two principal components that comprise most of the data variability. These principal components were clustered in five classes after running an external cluster validation test. 
    In general, both methods, MLC and PCA, separated the various sediment types effectively, showing good agreement (kappa > 0.7) with the Bayesian
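
    The single-angle Bayesian principle described above (one normal distribution of backscatter values per sediment type) reduces, in a toy setting, to maximum-likelihood class assignment; the class means and standard deviations below are invented, not taken from the study.

```python
# Toy sketch: assign a backscatter sample (dB) to the sediment class
# whose assumed normal distribution gives it the highest likelihood.
import math

CLASSES = {  # hypothetical mean/std of backscatter (dB) per sediment type
    "mud":    (-32.0, 1.5),
    "sand":   (-24.0, 2.0),
    "gravel": (-17.0, 2.5),
}

def normal_pdf(x, mean, std):
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def classify(backscatter_db):
    return max(CLASSES, key=lambda c: normal_pdf(backscatter_db, *CLASSES[c]))

print(classify(-31.0))  # → mud
print(classify(-18.5))  # → gravel
```

    The study additionally validates the number of classes before assignment and exploits the full angular response; this sketch shows only the per-angle Gaussian idea.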

  17. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received vast focus in the last two decades. Many researchers have tackled these areas individually. However, information extraction systems should be integrated with data integration

  18. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for detection and 3-d imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to a specific muon radiography information source. An alternative simple technique based on counting highly scattered muons in the voxels seems to be efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping power measurement, and detection of muonic atom emission.

  19. Method of extracting iodine from liquid mixtures of iodine, water and hydrogen iodide

    Science.gov (United States)

    Mysels, Karol J.

    1979-01-01

    The components of a liquid mixture consisting essentially of HI, water and at least about 50 w/o iodine are separated in a countercurrent extraction zone by treating with phosphoric acid containing at least about 90 w/o H3PO4. The bottom stream from the extraction zone is substantially completely molten iodine, and the overhead stream contains water, HI, H3PO4 and a small fraction of the amount of original iodine. When the water and HI are present in near-azeotropic proportions, there is particular advantage in feeding the overhead stream to an extractive distillation zone wherein it is treated with additional concentrated phosphoric acid to create an anhydrous HI vapor stream and bottoms which contain at least about 85 w/o H3PO4. Concentration of these bottoms provides phosphoric acid infeed for both the countercurrent extraction zone and for the extractive distillation zone.

  20. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked together because only information that is attended to can be incorporated in the decision process. Little is known, however, about the extent to which bottom-up processes of attention affect stimulus selection and therefore the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue, and what does that mean for the choice outcome? To address this, we conducted a combined eye-tracking and choice experiment in a consumer choice setting with visual shelf simulations … salient. The observed effect on attention also carries over into increased choice likelihood. From these results, we conclude that even small changes in the choice set capture attention based on bottom-up processes. Also for eye-tracking studies in other domains (e.g. search tasks) this means that stimulus

  1. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are more and more massively employed, the demand driven also by the new norms sanctioning the legal value of digital documents, provided they are stored on supports that are physically unalterable. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine supported to a large degree with evident advantages both in the organization of the work, and in extracting information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.
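
    A hedged sketch of the field-indexing idea: simple patterns pull sender, addressee, subject and date out of letter-like text. The real system's recognition techniques are more sophisticated, and the field labels below are hypothetical.

```python
# Toy rule-based indexing of semi-structured correspondence into
# database fields (sender, addressee, subject, date).
import re

FIELD_PATTERNS = {
    "sender":    r"^From:\s*(.+)$",
    "addressee": r"^To:\s*(.+)$",
    "subject":   r"^Subject:\s*(.+)$",
    "date":      r"^Date:\s*(.+)$",
}

def index_document(text):
    record = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, text, flags=re.MULTILINE)
        if match:
            record[field] = match.group(1).strip()
    return record

letter = """From: Registry Office
To: Public Works Department
Date: 12 March 1999
Subject: Road maintenance request
Body of the letter follows here."""
print(index_document(letter))
```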

  2. Evaluation of the bottom water reservoir VAPEX process

    Energy Technology Data Exchange (ETDEWEB)

    Frauenfeld, T.W.J.; Jossy, C.; Kissel, G.A. [Alberta Research Council, Devon, AB (Canada); Rispler, K. [Saskatchewan Research Council, Saskatoon, SK (Canada)

    2004-07-01

    The mobilization of viscous heavy oil requires the dissolution of solvent vapour into the oil as well as the diffusion of the dissolved solvent into the virgin oil. Vapour extraction (VAPEX) is an enhanced oil recovery (EOR) process which involves injecting a solvent into the reservoir to reduce the viscosity of hydrocarbons. This paper describes the contribution of the Alberta Research Council to solvent-assisted oil recovery technology. The bottom water process was also modelled to determine its feasibility for a field-scale oil recovery scheme. Several experiments were conducted in an acrylic visual model in which Pujol and Boberg scaling were used to produce a lab model scaling a field process. The model simulated a slice of a 30 metre thick reservoir, with a 10 metre thick bottom water zone, containing two horizontal wells (25 metres apart) at the oil water interface. The experimental rates were found to be negatively affected by continuous low permeability layers and by oil with an initial gas content. In order to achieve commercial oil recovery rates, the bottom water process must be used to increase the surface area exposed to solvents. A large oil water interface between the wells provides contact for solvent when injecting gas at the interface. High production rates are therefore possible with appropriate well spacing. 11 refs., 4 tabs., 16 figs.

  3. Cylinder-type bottom reflector

    International Nuclear Information System (INIS)

    Elter, C.; Fritz, R.; Kissel, K.F.; Schoening, J.

    1982-01-01

    Proposal of a bottom reflector for gas-cooled nuclear reactor plants with a pebble bed of spherical fuel elements, where the horizontal forces acting from the core and the bottom reflector upon the side reflector are equally distributed. This is attained by the upper edge of the bottom reflector being placed levelly and by the angle of inclination of the recesses varying. (orig.) [de

  4. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Northern Tibet belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people, and the geological working conditions are very poor. However, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using Worldview2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various kinds of geological information are excavated. By setting thresholds, based on hierarchical classification, eight kinds of geological information were classified and extracted. Comparison with existing geological maps shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  5. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process in producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some Terrestrial Laser Scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape in Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes is extracted for spectral analysis. Coloured points that fall within the corresponding preset spectral threshold are identified as points of that specific feature in the dataset. This terrain extraction process is implemented in purpose-written Matlab code. Results demonstrate that a higher-spectral-resolution passive image is required in order to improve the output, because the low quality of the colour images captured by the sensor contributes to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
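
    The spectral-threshold step can be sketched as follows (in Python rather than the Matlab used in the study): coloured points are kept only if their RGB values fall inside a preset band assumed to correspond to terrain. The thresholds are invented for illustration.

```python
# Toy spectral filter: keep (x, y, z, r, g, b) points whose colour lies
# inside a hypothetical "bare terrain" RGB band.
TERRAIN_RGB_RANGE = {"r": (90, 160), "g": (60, 130), "b": (40, 110)}

def is_terrain(point):
    x, y, z, r, g, b = point
    return (TERRAIN_RGB_RANGE["r"][0] <= r <= TERRAIN_RGB_RANGE["r"][1]
            and TERRAIN_RGB_RANGE["g"][0] <= g <= TERRAIN_RGB_RANGE["g"][1]
            and TERRAIN_RGB_RANGE["b"][0] <= b <= TERRAIN_RGB_RANGE["b"][1])

cloud = [
    (1.0, 2.0, 0.1, 120, 90, 70),   # brownish point: terrain candidate
    (1.2, 2.1, 1.5, 40, 180, 60),   # green point: likely vegetation
]
terrain = [p for p in cloud if is_terrain(p)]
print(len(terrain))  # → 1
```

    In practice such a purely spectral rule would be combined with geometric filtering, which is the study's point about low spectral separability.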

  6. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  7. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  8. River bottom sediment from the Vistula as matrix of candidate for a new reference material.

    Science.gov (United States)

    Kiełbasa, Anna; Buszewski, Bogusław

    2017-08-01

    Bottom sediments are very important in aquatic ecosystems: they accumulate heavy metals and compounds belonging to the group of persistent organic pollutants. Accelerated solvent extraction (ASE) was used to extract 16 compounds of the PAH group from bottom sediment of the Vistula. For the matrix of the candidate for a new reference material, moisture content, particle size, loss on ignition, pH, and total organic carbon were determined. A gas chromatograph with a selective mass detector (GC/MS) was used for the final analysis. The obtained recoveries ranged from 86% (SD=6.9) for anthracene to 119% (SD=5.4) for dibenzo(ah)anthracene. For the candidate for a new reference material, homogeneity and analyte content were determined using a validated method. The results are a very important part of the development and certification of a new reference material. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Prospects of obtaining samples of bottom sediments from subglacial lake Vostok

    Directory of Open Access Journals (Sweden)

    Н. И. Васильев

    2017-04-01

    Full Text Available The paper proves the timeliness of obtaining and examining bottom sediments from subglacial Lake Vostok. The predictive geological section of Lake Vostok and the information value of its bottom sediments are examined. Stringent requirements for the environmental security of lake examinations and bottom sediment sampling rule out the use of conventional drilling technologies, as they would pollute the lake with injection liquid from the borehole. In order to sample bottom sediments from the subglacial lake, it is proposed to use a dynamically balanced tool string, which enables rotary drilling without any external support on the borehole walls to transmit counter-torque. A theoretical analysis has been carried out to assess the operation of the tool string, which is a two-mass oscillatory electromechanical system of reciprocating and rotating motion (RRM) with two degrees of freedom.

  10. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Full Text Available Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while overlooking the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With this "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.
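A toy sketch of such link-removal backbone extraction is given below. The scoring rule (recency minus a popularity penalty) and the "keep the top fraction" step are illustrative assumptions standing in for the paper's separate time-aware and topology-aware algorithms:

```python
from collections import Counter

def extract_backbone(edges, keep_fraction=0.8, alpha=0.5):
    """edges: (user, object, timestamp) links of a user-object bipartite network.
    Score each link by recency (time-aware) minus alpha times the object's
    popularity (topology-aware), then keep only the best-scoring fraction.
    Both the scoring formula and alpha are illustrative assumptions."""
    degree = Counter(obj for _, obj, _ in edges)
    ranked = sorted(edges, key=lambda e: e[2] - alpha * degree[e[1]], reverse=True)
    return ranked[: int(len(ranked) * keep_fraction)]
```

A recommender would then be trained on the returned backbone instead of the full edge list, reducing both noise and computational cost.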

  11. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while overlooking the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With this "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.

  12. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while overlooking the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With this “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency. PMID:23690946

  13. Accumulation and potential dissolution of Chernobyl-derived radionuclides in river bottom sediment

    International Nuclear Information System (INIS)

    Sanada, Yukihisa; Matsunaga, Takeshi; Yanase, Nobuyuki; Nagao, Seiya; Amano, Hikaru; Takada, Hideshige; Tkachenko, Yuri

    2002-01-01

    Areas contaminated with radionuclides from the Chernobyl nuclear accident have been identified in the Pripyat River near the Chernobyl Nuclear Power Plant. The river bottom sediment cores contained 137Cs (10^5 - 10^6 Bq/m^2) within 0-30 cm depth, a concentration comparable to that in the ground soil in the vicinity of the nuclear power plant (the Exclusion Zone). The sediment cores also accumulated 90Sr (10^5 Bq/m^2), 239,240Pu (10^4 Bq/m^2) and 241Am (10^4 Bq/m^2) derived from the accident. Several nuclear fuel particles have been preserved at 20-25 cm depth, which is where the radionuclide concentrations peak. These inventories in the bottom sediments were compared with those of the radionuclides released during the accident. A selective sequential extraction technique was applied to the radionuclides in the sediments. The results suggest that the possibility of release of 137Cs and 239,240Pu from the bottom sediment is low compared with 90Sr. The potential dissolution and subsequent transport of 90Sr from the river bottom sediment should be taken into account with respect to the long-term radiological influence on the aquatic environment

  14. Process for extracting uranium from phosphoric acid solutions

    International Nuclear Information System (INIS)

    1977-01-01

    The description is given of a method for extracting uranium from phosphoric acid solutions, whereby the previously oxidized acid is treated with an organic solvent consisting of a mixture of dialkylphosphoric acid and trialkylphosphine oxide dissolved in a non-reactive inert diluent, so as to obtain de-uraniated phosphoric acid and an organic extract, constituted by the solvent, containing most of the uranium. The uranium is then separated from the extract as uranyl ammonium tricarbonate by reaction with ammonia and ammonium carbonate, and the extract de-uraniated at the extraction stage is recycled. The extract is treated in a re-extraction apparatus comprising not less than two stages. The extract to be treated is injected at the top of the first stage. At the bottom of the first stage, ammonia is introduced counter-currently as a gas or as an aqueous solution while the pH of the first stage is controlled to keep it at 8.0 or 8.5; at the bottom of the last stage, an ammonium carbonate aqueous solution is injected in a quantity representing 50 to 80% of the stoichiometric quantity required to neutralize the dialkylphosphoric acid contained in the solvent and transform the uranium into uranyl ammonium tricarbonate.

  15. Drycon dry ash conveyor: dry bottom ash handling system with reduced operating costs and improved plant efficiency

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The Drycon dry bottom ash extraction system is designed to remove bottom ash from beneath the furnace, cooling it without any need for water; fresh air in countercurrent flow to the ash is used for cooling. Data presented show how savings in time and costs can be achieved with this system and how boiler efficiency can be increased using this technology. Considerable advantages in operational reliability arising from new design improvements are described. 7 figs.

  16. Extracting central places from the link structure in Wikipedia

    DEFF Research Database (Denmark)

    Kessler, Carsten

    2017-01-01

    … of the German language edition of Wikipedia. The official upper and middle centers declared, based on German spatial laws, are used as a reference dataset. The characteristics of the link structure around their Wikipedia pages, which link to each other or mention each other, and how often, are used to develop a bottom-up method for extracting central places from Wikipedia. The method relies solely on the structure and number of links and mentions between the corresponding Wikipedia pages; no spatial information is used in the extraction process. The output of this method shows significant overlap with the official central place structure, especially for the upper centers. The results indicate that real-world relationships are in fact reflected in the link structure on the web in the case of Wikipedia.

  17. Tsunami Simulation Method Assimilating Ocean Bottom Pressure Data Near a Tsunami Source Region

    Science.gov (United States)

    Tanioka, Yuichiro

    2018-02-01

    A new method was developed to reproduce the tsunami height distribution in and around the source area, at a certain time, from a large number of ocean bottom pressure sensors, without information on an earthquake source. A dense cabled observation network called S-NET, which consists of 150 ocean bottom pressure sensors, was installed recently along a wide portion of the seafloor off Kanto, Tohoku, and Hokkaido in Japan. However, in the source area, the ocean bottom pressure sensors cannot observe directly an initial ocean surface displacement. Therefore, we developed the new method. The method was tested and functioned well for a synthetic tsunami from a simple rectangular fault with an ocean bottom pressure sensor network using 10 arc-min, or 20 km, intervals. For a test case that is more realistic, ocean bottom pressure sensors with 15 arc-min intervals along the north-south direction and sensors with 30 arc-min intervals along the east-west direction were used. In the test case, the method also functioned well enough to reproduce the tsunami height field in general. These results indicated that the method could be used for tsunami early warning by estimating the tsunami height field just after a great earthquake without the need for earthquake source information.
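A drastically simplified sketch of the underlying idea is given below: a hydrostatic conversion of bottom-pressure anomaly to sea-surface displacement (eta = dp / (rho * g)), followed by spatial interpolation between sparse sensors. Both the constants and the inverse-distance scheme are illustrative assumptions, not the paper's assimilation method:

```python
import math

RHO, G = 1025.0, 9.81  # seawater density (kg/m^3) and gravity (m/s^2), assumed values

def surface_height(dp_pa):
    """Sea-surface displacement (m) from a bottom-pressure anomaly (Pa),
    assuming the hydrostatic relation dp = rho * g * eta."""
    return dp_pa / (RHO * G)

def idw(target, sensors, power=2):
    """Inverse-distance-weighted estimate at `target` = (x, y) from
    (x, y, value) sensor triples -- a stand-in for the real assimilation."""
    num = den = 0.0
    for x, y, v in sensors:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0:
            return v  # target coincides with a sensor
        w = d ** -power
        num += w * v
        den += w
    return num / den
```

With a dense network like S-NET, evaluating such an estimate on a grid yields an approximate tsunami height field without any earthquake source information.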

  18. High-frequency internal waves and thick bottom mixed layers observed by gliders in the Gulf Stream

    Science.gov (United States)

    Todd, Robert E.

    2017-06-01

    Autonomous underwater gliders are conducting high-resolution surveys within the Gulf Stream along the U.S. East Coast. Glider surveys reveal two mechanisms by which energy is extracted from the Gulf Stream as it flows over the Blake Plateau, a portion of the outer continental shelf between Florida and North Carolina where bottom depths are less than 1000 m. Internal waves with vertical velocities exceeding 0.1 m s-1 and frequencies just below the local buoyancy frequency are routinely found over the Blake Plateau, particularly near the Charleston Bump, a prominent topographic feature. These waves are likely internal lee waves generated by the subinertial Gulf Stream flow over the irregular bathymetry of the outer continental shelf. Bottom mixed layers with O(100) m thickness are also frequently encountered; these thick bottom mixed layers likely form in the lee of topography due to enhanced turbulence generated by O(1) m s-1 near-bottom flows.

  19. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask layer by layer all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  20. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies … or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown … the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity …

  1. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, the creation of geographical information system (GIS) databases, and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, labour intensive, need well-trained personnel, and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term which is tailored to preserving the convergence properties of the snake model; its application to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar es Salaam, Tanzania, where three sites were tested; this area is characterized by informal settlements formed illegally within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested; the Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance

  2. Bottom Dissolved Oxygen Maps From SEAMAP Summer and Fall Groundfish/Shrimp Surveys from 1982 to 1998 (NCEI Accession 0155488)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Bottom dissolved oxygen (DO) data was extracted from environmental profiles acquired during the Southeast Fisheries Science Center Mississippi Laboratories summer...

  3. Bottom friction models for shallow water equations: Manning’s roughness coefficient and small-scale bottom heterogeneity

    Science.gov (United States)

    Dyakonova, Tatyana; Khoperskov, Alexander

    2018-03-01

    The correct description of surface water dynamics in a shallow water model requires accounting for friction. To simulate channel flow with the Chezy model, a constant Manning roughness coefficient is frequently used. The Manning coefficient n_M is an integral parameter which accounts for a large number of physical factors that retard the flow. We used computational simulations in a shallow water model to determine the relationship between the Manning coefficient and the parameters of small-scale perturbations of the bottom in a long channel. By comparing the transverse water velocity profiles in the channel obtained in models with a perturbed bottom without bottom friction and with bottom friction on a smooth bottom, we constructed the dependence of n_M on the amplitude and spatial scale of the perturbation of the bottom relief.
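The Manning friction term in the depth-averaged momentum equation is commonly written as g * n_M^2 * u * |u| / h^(4/3). A minimal sketch of that term (the numerical values in the test below are illustrative):

```python
G = 9.81  # gravitational acceleration, m/s^2

def manning_friction(u, h, n_m):
    """Friction deceleration g * n_M^2 * u * |u| / h^(4/3) (m/s^2) opposing a
    depth-averaged velocity u (m/s) in water of depth h (m), with Manning
    roughness coefficient n_m. The u*|u| form preserves the sign of u, so the
    term always acts against the flow direction."""
    return G * n_m ** 2 * u * abs(u) / h ** (4.0 / 3.0)
```

In a solver this term would be subtracted from the momentum update each time step; calibrating n_m against resolved small-scale bottom perturbations is exactly the relationship the study investigates.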

  4. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  5. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptographic key. Estimates of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. To optimize the parameters, the corresponding wavelet transforms were tuned to obtain the highest possible entropy and low mutual information in the transform domain, which set the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics without compromising authentication properties. [Projekat Ministarstva nauke Republike Srbije, br. TR32054 i br. III44006]
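The local-entropy measure used above to rank iris segments can be illustrated with plain Shannon entropy over discrete pixel intensities; this is a simplified stand-in for the paper's analysis, not its actual estimator:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of a sequence of discrete values, e.g. the
    grey-level intensities inside one iris segment. Higher entropy means the
    segment carries more usable randomness for key extraction."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

Segments would then be ranked by this score (and, in the paper, additionally by mutual information between enrollment samples) before feeding the key-extraction stage.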

  6. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means of information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in foreign and/or native languages) when searching for information for self-education on basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, was conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates were evaluated by comparison with three types of reference lists: a list created by a medical professional, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods in English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
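A minimal statistically-based extractor of the kind evaluated above can be sketched as n-gram frequency counting with a stopword filter; the frequency threshold and stopword list are illustrative assumptions, and real systems add measures such as C-value or log-likelihood:

```python
from collections import Counter

def extract_terms(tokens, n=2, min_freq=2,
                  stopwords=frozenset({"the", "of", "and", "a", "in", "for"})):
    """Return n-grams occurring at least `min_freq` times whose first and last
    tokens are not stopwords -- a crude frequency-based term candidate list."""
    grams = Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    return [g for g, c in grams.items()
            if c >= min_freq and g[0] not in stopwords and g[-1] not in stopwords]

print(extract_terms("insulin pump settings for insulin pump users".split()))
```

Candidates produced this way would then be scored against reference lists (e.g. MeSH) to compute recall and precision, as in the study.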

  7. Bottom-linked innovation

    DEFF Research Database (Denmark)

    Kristensen, Catharina Juul

    2018-01-01

    Employee-driven innovation is gaining ground as a strategy for developing sustainable organisations in the public and private sector. This type of innovation is characterised by active employee participation, and the bottom-up perspective is often emphasised. This article explores an issue that has hitherto been paid little explicit attention, namely collaboration between middle managers and employees in innovation processes. In contrast to most studies, middle managers and employees are here both subjects of explicit investigation. The collaboration processes explored in this article are termed 'bottom-linked innovation'. The empirical analysis is based on an in-depth qualitative study of bottom-linked innovation in a public frontline institution in Denmark. By combining research on employee-driven innovation and middle management, the article offers new insights into such collaborative …

  8. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information on urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on the Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista has the advantages of a low demand for professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and sensor orientation model were of sufficiently high precision and the off-nadir view angle was favourable.

  9. ICT-ENABLED BOTTOM-UP ARCHITECTURAL DESIGN

    Directory of Open Access Journals (Sweden)

    Burak Pak

    2016-04-01

    Full Text Available This paper aims at discussing the potential of bottom-up design practices in relation to the latest developments in Information and Communication Technologies (ICT) by making an in-depth review of inaugural cases. The first part of the study involves a literature review and the elaboration of basic strategies from the case studies. The second part reframes the existing ICT tools and strategies and elaborates on their potential to support the modes of participation performed in these cases. As a result, by distilling the knowledge created, the study reveals the potential of novel modes of ICT-enabled design participation which exploit a set of collective action tools to support sustainable forms of self-organization and bottom-up design. The final part explains the relevance of these findings with concrete examples and presents a hypothetical case for future implementation. The paper concludes with a brief reflection on the implications of the findings for the future of architectural design education.

  10. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in object-oriented information extraction from high-resolution remote sensing images, and the accuracy of remote sensing thematic information depends on this extraction. On the basis of WorldView-2 high-resolution data, this study developed an optimal segmentation parameter method for object-oriented image segmentation and high-resolution image information extraction, as follows. First, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed using control variables and a combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were segmented at multiple scales with the optimal segmentation parameters, and a hierarchical network structure was established by setting information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment through reproducible quantitative measurements, and the results of this procedure may be incorporated into a classification scheme. PMID:27362762
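The scale-selection idea can be illustrated with an area-weighted within-segment variance and a simple "largest jump" heuristic over candidate scales; this is a common local-variance-style heuristic offered as a sketch, not the paper's improved weighted mean-variance method:

```python
def weighted_mean_variance(segments):
    """segments: list of segments, each a list of pixel values.
    Area-weighted mean of the per-segment variance."""
    total = sum(len(s) for s in segments)

    def var(s):
        m = sum(s) / len(s)
        return sum((v - m) ** 2 for v in s) / len(s)

    return sum(len(s) * var(s) for s in segments) / total

def best_scale(candidates):
    """candidates: {scale: segments}. Pick the scale at which the weighted
    variance jumps most relative to the previous (finer) scale -- an assumed
    heuristic, not necessarily the paper's exact criterion."""
    scales = sorted(candidates)
    wv = [weighted_mean_variance(candidates[s]) for s in scales]
    jumps = [wv[i + 1] - wv[i] for i in range(len(wv) - 1)]
    return scales[jumps.index(max(jumps)) + 1]
```

Intuitively, the weighted variance stays low while segments remain spectrally homogeneous and jumps once over-merging begins, so the largest jump marks a candidate optimal scale.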

  11. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are recognized as a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications to these methods that allow microplastics in bottom sediments, including small fibers, to be analyzed. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported by neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
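The internal-standard quality control mentioned above reduces to a recovery calculation: spike samples with a known particle count and report the mean and spread of the recovered fraction. The counts below are invented for illustration; only the procedure mirrors the abstract.

```python
def extraction_efficiency(recovered_counts, spiked_count):
    """Recovery of an internal standard: each sediment sample is spiked
    with `spiked_count` reference particles before extraction, and the
    survivors are counted afterwards."""
    recoveries = [100.0 * r / spiked_count for r in recovered_counts]
    mean = sum(recoveries) / len(recoveries)
    # Sample standard deviation reported as the spread.
    var = sum((x - mean) ** 2 for x in recoveries) / (len(recoveries) - 1)
    return mean, var ** 0.5

# Hypothetical spike-recovery runs: 50 reference particles added per sample.
mean_eff, sd_eff = extraction_efficiency([46, 48, 44], spiked_count=50)
```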

  12. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourced trajectory data are an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) within the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multiple types of road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
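The role of the triangle-edge-length descriptor can be sketched as follows: within a DT of trace points, unusually long edges bridge the empty space between traces and thus hint at road boundaries. The hand-built triangulation, points, and the mean-length threshold below are illustrative assumptions (a real implementation would compute the DT with a computational-geometry library).

```python
import math

def edge_lengths(points, triangles):
    """Collect unique edge lengths from a triangulation. Unusually long
    DT edges (relative to typical trace spacing) span the gap between
    separate traces, i.e. candidate road boundaries."""
    edges = {}
    for a, b, c in triangles:
        for i, j in ((a, b), (b, c), (a, c)):
            key = (min(i, j), max(i, j))
            (x1, y1), (x2, y2) = points[key[0]], points[key[1]]
            edges[key] = math.hypot(x2 - x1, y2 - y1)
    return edges

def boundary_edges(edges, factor=2.0):
    """Flag edges longer than `factor` times the mean edge length
    (the threshold choice is an illustrative assumption)."""
    mean_len = sum(edges.values()) / len(edges)
    return {e for e, length in edges.items() if length > factor * mean_len}

# Two dense point clusters (two 'roads') joined by two long bridging edges.
pts = [(0, 0), (1, 0), (0, 1), (10, 0), (11, 0), (10, 1)]
tris = [(0, 1, 2), (3, 4, 5), (1, 3, 2)]
E = edge_lengths(pts, tris)
B = boundary_edges(E)
```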

  14. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Full Text Available Abstract Background Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.

  15. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA; Cowley,; E, Wendy [Richland, WA; Crow, Vernon L [Richland, WA; Cramer, Nicholas O [Richland, WA

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
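The scoring described (candidate phrases split at delimiters and stop words, word scores from co-occurrence degree and frequency) can be sketched roughly as below; the stop-word list and sample text are illustrative, and this is a simplified reading of the method, not its exact implementation.

```python
import re
from collections import defaultdict

def rake_keywords(text, stop_words):
    """Minimal RAKE-style sketch: split text into candidate phrases at
    stop words, then score words by degree/frequency and phrases by the
    sum of their member word scores."""
    words = re.findall(r"[a-zA-Z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in stop_words:          # stop words delimit candidate phrases
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)  # co-occurrence degree within the phrase
    word_score = {w: degree[w] / freq[w] for w in freq}
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored.items(), key=lambda kv: -kv[1])

stops = {"a", "the", "of", "and", "for", "is", "are", "in"}
top = rake_keywords("rapid keyword extraction for information retrieval "
                    "and analysis of information", stops)
```

Longer multi-word phrases accumulate higher scores, which is why they surface as keywords.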

  16. Removal of COD and color loads in bleached kraft pulp effluents by bottom ashes from boilers.

    Science.gov (United States)

    Van Tran, A

    2008-07-01

    The effectiveness of bottom ashes from biomass- and coal-fired boilers in removing chemical oxygen demand (COD) and color loads in effluents of a kraft pulp bleachery plant is investigated. The effluents tested are those of the sulfuric acid treatment (A stage) of a hardwood kraft pulp, and of the first acidic (chlorine or chlorine dioxide) and second alkaline (extraction) stages in the chlorine and elemental chlorine-free (ECF) bleaching lines of hardwood and softwood kraft pulps. The coal-fired boiler's bottom ashes are unable to remove either COD or color load from the bleached kraft pulp effluents. However, the bottom ashes of the biomass boiler are effective in removing COD and color loads from the acidic and alkaline effluents irrespective of the bleaching process or wood species. In particular, these ashes increase the pH of all the effluents examined.

  17. The Impact of Bottom-Up Parking Information Provision in a Real-Life Context: The Case of Antwerp

    Directory of Open Access Journals (Sweden)

    Geert Tasseron

    2017-01-01

    Full Text Available A number of studies have analyzed the possible impacts of bottom-up parking information or parking reservation systems on parking dynamics in abstract simulation environments. In this paper, we take these efforts one step further by investigating the impacts of these systems in a real-life context: the center of the city of Antwerp, Belgium. In our simulation, we assume that all on-street and off-street parking places are equipped with technology able to transmit their occupancy status to so-called smart cars, which can receive information and reserve a parking place. We employ PARKAGENT, an agent-based simulation model, to simulate the behavior of smart and regular cars. We obtain detailed data on parking demand from FEATHERS, an activity-based transport model. The simulation results show that parking information and reservation hardly impact search time but do reduce walking distance for smart cars, leading to a reduction in total parking time, that is, the sum of search time and walking time. Reductions in search time occur only in zones with high occupancy rates, while a drop in walking distance is especially observed in low occupancy areas. Societal benefits of parking information and reservation are limited, because of the low impact on search time and the possible negative health effects of reduced walking distance.

  18. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    Full Text Available A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times for a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.
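Background extraction, named above as the usual first step, is often done with a running-average model; the sketch below shows that generic technique (the parameters and toy frames are assumptions, not this paper's algorithm).

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background model: blend each new frame into the
    background estimate with learning rate alpha."""
    return (1 - alpha) * background + alpha * frame

def foreground_mask(background, frame, threshold=25):
    """Pixels differing from the background by more than `threshold`
    are flagged as moving objects (candidate vehicles)."""
    return np.abs(frame.astype(float) - background) > threshold

# Static dark scene with one bright moving "vehicle" patch.
bg = np.zeros((8, 8))
frame = np.zeros((8, 8))
frame[2:4, 2:4] = 200.0
mask = foreground_mask(bg, frame)
bg = update_background(bg, frame)
```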

  19. State of the soft bottoms of the continental shelf

    International Nuclear Information System (INIS)

    Guzman Alvis, Angela I; Solano, Oscar David

    2002-01-01

    The information presented is based on studies carried out on the continental shelf of the Colombian Caribbean, mainly in the Gulf of Morrosquillo and the Magdalena and Guajira departments, over the last ten years. A diagnostic of the soft bottoms of the Colombian continental shelf is given.

  20. Salient region detection by fusing bottom-up and top-down features extracted from a single image.

    Science.gov (United States)

    Tian, Huawei; Fang, Yuming; Zhao, Yao; Lin, Weisi; Ni, Rongrong; Zhu, Zhenfeng

    2014-10-01

    Recently, some global contrast-based salient region detection models have been proposed based on only the low-level feature of color. It is necessary to consider both color and orientation features to overcome their limitations, and thus improve the performance of salient region detection for images with low contrast in color and high contrast in orientation. In addition, the existing fusion methods for different feature maps, like the simple averaging method and the selective method, are not sufficiently effective. To overcome these limitations of existing salient region detection models, we propose a novel salient region model based on the bottom-up and top-down mechanisms: color contrast and orientation contrast are adopted to calculate the bottom-up feature maps, while the top-down cue of depth-from-focus from the same single image is used to guide the generation of the final salient regions, since depth-from-focus reflects the photographer's preference and knowledge of the task. A more general and effective fusion method is designed to combine the bottom-up feature maps. According to the degree-of-scattering and eccentricities of the feature maps, the proposed fusion method can assign adaptive weights to different feature maps to reflect the confidence level of each feature map. The depth-from-focus of the image, as a significant top-down feature for visual attention in the image, is used to guide the salient regions during the fusion process; with its aid, the proposed fusion method can filter out the background and highlight salient regions for the image. Experimental results show that the proposed model outperforms the state-of-the-art models on three publicly available data sets.
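The degree-of-scattering weighting can be caricatured as follows: treat each feature map as a 2D distribution and give compact (low-scatter) maps higher confidence. The specific weighting rule below is an illustrative stand-in for the paper's formulation.

```python
import numpy as np

def fusion_weights(feature_maps):
    """Assign each saliency feature map a weight from its degree of
    scattering: a map whose mass concentrates in a small region gets
    more confidence than a diffuse one."""
    weights = []
    for fmap in feature_maps:
        p = fmap / fmap.sum()                      # treat map as a 2D distribution
        ys, xs = np.indices(fmap.shape)
        cy, cx = (p * ys).sum(), (p * xs).sum()    # centroid of the map
        scatter = (p * ((ys - cy) ** 2 + (xs - cx) ** 2)).sum()
        weights.append(1.0 / (scatter + 1e-9))     # low scatter -> high weight
    w = np.array(weights)
    return w / w.sum()                             # normalize to sum to 1

compact = np.zeros((16, 16)); compact[7:9, 7:9] = 1.0   # concentrated map
diffuse = np.ones((16, 16))                              # scattered map
w_compact, w_diffuse = fusion_weights([compact, diffuse])
```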

  1. The effect of bottom sediment supplement on heavy metals content in plants (Zea mays) and soil

    Directory of Open Access Journals (Sweden)

    Baran A.

    2013-04-01

    Full Text Available An important aspect of bottom sediments is the problem of their management or disposal after their extraction from the bottom of rivers, dam reservoirs, ports, channels or ponds. The research aimed at an assessment of the potential environmental management of bottom sediment used as an admixture to light soil, based on its effect on the contents of heavy metals in plants and soil. The research was conducted on light soil with the granulometric structure of weakly loamy sand. The bottom sediment was added to the light soil in the amounts of 0 (control), 5, 10, 30 and 50%. The test plant was maize (Zea mays, 'Bora' cv.). The sediment applied in the presented research revealed a high share of silt and clay fractions, alkaline pH and low contents of heavy metals; therefore it may be used as an admixture to such soils to improve their productivity. The bottom sediment applied to the soil decreased the Zn, Cd and Pb content in maize in comparison with the treatment without the sediment, whereas it increased the content of Cu, Cr and Ni. No exceedances of the permissible contents of heavy metals, as regards assessment of plants in view of their forage usability, were registered in the maize biomass.

  2. Experimental use of road header (AM-50) as face cutting machine for extraction of coal in longwall panel

    Energy Technology Data Exchange (ETDEWEB)

    Passi, K.K.; Kumar, C.R.; Prasad, P. [DGMS, Dhanbad (India)

    2001-07-01

    The scope of this paper is limited to the use of available machines and techniques for attaining higher and more efficient production in underground coal mines. Under certain strata conditions and a higher degree of gassiness, the longwall method with hydraulic sand stowing is the only appropriate method for extraction of a thick seam. In the Moonidih Jitpur Colliery of M/S IISCO, the No. 14 seam, a Degree III gassy seam, 9.07 m thick, is extracted in a multilift system with hydraulic sand stowing. In general, the bottom lift is extracted by a single-ended ranging arm shearer and the middle and top lifts are extracted by the conventional method. However, in one of the panels a spare road header machine was used as the face cutting machine in the bottom lift, on an experimental basis. This paper presents a successful case study of extraction of bottom lift coal by the longwall method with hydraulic sand stowing using a road header (AM-50) as the face cutting machine. 9 figs.

  3. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in the application of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as the ignorance of spectral information and the contradiction between extraction rate and extraction accuracy. The purpose of this research is to develop an effective method to detect building information in Chinese GF-1 data. Firstly, an image preprocessing technique is used to normalize the image, and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to obtain the candidate building objects. Furthermore, in order to refine the building objects and further remove false objects, post-processing (e.g., the shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission error (OE), commission error (CE), overall accuracy (OA) and Kappa are used. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%; at the same time, the Kappa increases by 16.09%. In the experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.

  4. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  5. Spectroscopy and lifetime of bottom and charm hadrons

    International Nuclear Information System (INIS)

    F. Ukegawa

    2000-01-01

    There are several motivations for studying masses and lifetimes of the hadrons containing a heavy quark, either the bottom or the charm quark. First, the mass and the lifetime are fundamental properties of an elementary particle. Second, the spectroscopy of hadrons gives insights into the QCD potential between quarks. In particular, a symmetry exists for heavy hadrons when the heavy quark mass is taken to be infinite, providing a powerful tool to predict and understand properties of those heavy hadrons. Third, studies of the lifetimes of heavy hadrons probe their decay mechanisms. A measurement of the lifetime, or the total decay width, is necessary when the authors extract magnitudes of elements of the Kobayashi-Maskawa matrix. Again, in the limit of an infinite heavy quark mass things become simple and decay of a heavy hadron should be the decay of the heavy quark Q. This leads to a prediction that all hadrons containing the heavy quark Q should have the same lifetime, that of the quark Q. This is far from reality in the case of charm hadrons, where the D+ meson lifetime is about 2.5 times longer than the D0 meson lifetime. Perhaps the charm quark is not heavy enough. The simple quark decay picture should be a better approximation for the bottom hadrons because of the larger b quark mass. On the experimental side, the measurements and knowledge of the heavy hadrons (in particular bottom hadrons) have significantly improved over the last decade, thanks to high statistics data accumulated by various experiments. The authors shall review recent developments in these studies in the remainder of this manuscript

  6. Where to start? Bottom-up attention improves working memory by determining encoding order.

    Science.gov (United States)

    Ravizza, Susan M; Uitvlugt, Mitchell G; Hazeltine, Eliot

    2016-12-01

    The present study aimed to characterize the mechanism by which working memory is enhanced for items that capture attention because of their novelty or saliency-that is, via bottom-up attention. The first experiment replicated previous research by corroborating that bottom-up attention directed to an item is sufficient for enhancing working memory and, moreover, generalized the effect to the domain of verbal working memory. The subsequent 3 experiments sought to determine how bottom-up attention affects working memory. We considered 2 hypotheses: (1) Bottom-up attention enhances the encoded representation of the stimulus, similar to how voluntary attention functions, or (2) It affects the order of encoding by shifting priority onto the attended stimulus. By manipulating how stimuli were presented (simultaneous/sequential display) and whether the cue predicted the tested items, we found evidence that bottom-up attention improves working memory performance via the order of encoding hypothesis. This finding was observed across change detection and free recall paradigms. In contrast, voluntary attention improved working memory regardless of encoding order and showed greater effects on working memory. We conclude that when multiple information sources compete, bottom-up attention prioritizes the location at which encoding should begin. When encoding order is set, bottom-up attention has little or no benefit to working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation was presented in detail in this study, and the Canny algorithm was applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land) and other information were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.

  8. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous since they describe social, financial, engineering, and biological networks as well. Extending our ability to analyze complex networks to multiplex network structures greatly increases the level of information that can be extracted from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks, and we characterize the utility of the recently introduced indicator function Θ̃_S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
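One common formulation of Multiplex PageRank lets the centrality obtained in one layer bias the teleportation term when ranking in another layer; the sketch below shows that idea with standard power iteration. The adjacency matrices are toy assumptions, and this is only one variant of the algorithm, not the paper's exact definition.

```python
import numpy as np

def pagerank(A, alpha=0.85, personalization=None, n_iter=100):
    """Standard PageRank by power iteration on adjacency matrix A."""
    n = A.shape[0]
    deg = A.sum(axis=1, keepdims=True)
    # Row-normalize; dangling nodes jump uniformly.
    P = np.divide(A, deg, out=np.full_like(A, 1.0 / n), where=deg > 0)
    v = personalization if personalization is not None else np.full(n, 1.0 / n)
    v = v / v.sum()
    x = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        x = alpha * (P.T @ x) + (1 - alpha) * v
    return x

# Two layers over the same four nodes (illustrative adjacency matrices).
layer1 = np.array([[0, 1, 1, 1], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]], float)
layer2 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)

x1 = pagerank(layer1)
# Multiplicative-style multiplex sketch: layer-1 centrality biases the
# teleportation vector when ranking nodes in layer 2.
x2 = pagerank(layer2, personalization=x1)
```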

  9. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    Full Text Available Aimed at the established telenanomanipulation system, the method of extracting location information and the strategies of probe operation were studied in this paper. First, the machine learning algorithm of OpenCV was used to extract location information from SEM images. Thus nanowires and probe in SEM images can be automatically tracked and the region of interest (ROI can be marked quickly. Then the location of nanowire and probe can be extracted from the ROI. To study the probe operation strategy, the Van der Waals force between probe and a nanowire was computed; thus relevant operating parameters can be obtained. With these operating parameters, the nanowire in 3D virtual environment can be preoperated and an optimal path of the probe can be obtained. The actual probe runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and results show the designed methods have achieved the expected effect.

  10. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  11. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    NARCIS (Netherlands)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai; Popescu, Elvira; Rehm, Matthias; Mealha, Oscar

    2017-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method

  12. Phosphorus availability from bottom sediments of lakes using a nuclear technique

    International Nuclear Information System (INIS)

    Flores, F.; Facetti, J.F.

    1991-01-01

    The availability of phosphorus from the bottom sediments of a lake plays an important role in the development of aquatic biota and in the enhancement of the eutrophication process. In this work the ³¹P↔³²P isotopic exchange (E values) technique was applied to assess the potential influence of this phosphorus reservoir on the water quality of the Acaray and Yguazu dams in the Eastern Region of Paraguay. The samples analyzed were taken from the bottom sediments of the water bodies at different sites as well as from the shores. The method is reliable and yields information of ecological significance

  13. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to carry out the quantification and spatial characterization of land use/cover changes (LUCC) in the 1990s. To improve the reliability of the database, it must be updated regularly, but it is difficult to obtain remote sensing data for extracting land cover change information at large scale. Since optical remote sensing data are hard to acquire over the Chengdu Plain, the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu Plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for applications extracting land cover change information.
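A standard change indicator for multitemporal SAR such as ASAR is the log-ratio operator, shown here as a generic illustration rather than the authors' procedure: multiplicative speckle becomes additive in the log domain, and large absolute log-ratios flag changed pixels.

```python
import numpy as np

def log_ratio_change(img_t1, img_t2, threshold=1.0):
    """Log-ratio change detector for SAR backscatter images: pixels whose
    absolute log intensity ratio exceeds `threshold` are marked changed.
    The threshold here is an illustrative assumption."""
    lr = np.abs(np.log((img_t2 + 1e-6) / (img_t1 + 1e-6)))
    return lr > threshold

before = np.full((4, 4), 10.0)
after = before.copy()
after[1:3, 1:3] = 100.0          # backscatter jump, e.g. crop -> buildings
changed = log_ratio_change(before, after)
```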

  14. Pressing technology for large bottoms

    International Nuclear Information System (INIS)

    Jilek, L.

    1986-01-01

    A technology based on a circular plate bent into the shape of a trough has been selected for pressing bottoms of pressure vessels from circular plates of large diameter. The initial sheet is first bent in the middle by heating, with the edges remaining straight. These are then welded longitudinally by electroslag welding, and the circular shape is flame cut. The result is a plate with a straight surface in the middle and raised edges, which may be pressed into the desired shape. In this manner it is also possible to press pressure vessel bottoms with tube couplings from plates which are thickened in the middle and drilled; additional welding is then eliminated. Deformation from heat treatment may be avoided by the use of a fixture in the shape of a ring with a groove into which the edge of the bottom is fixed. During hardening of the bottom it is necessary to provide for the withdrawal of vapours and gases which would hamper uniform cooling. Bottom hardening with the grill and the cupola downwards has been proven. Deformation which occurs during treatment may to a certain extent be removed by calibration, which cannot, however, be done without special fixtures and instruments. (J.B.)

  15. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.
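The evolved solutions take the form of nonlinear expressions over spectral channels; the hypothetical index below (not one actually evolved in the study) shows how such an expression is evaluated over band arrays and thresholded into a feature mask.

```python
import numpy as np

def evolved_index(bands):
    """One hypothetical GP-evolved nonlinear spectral index. The paper
    evolves such expressions automatically; this particular combination
    of NIR, red and green is an illustrative assumption."""
    nir, red, green = bands["nir"], bands["red"], bands["green"]
    return (nir - red) / (nir + red + 1e-9) * np.sqrt(green + 1e-9)

# Toy 2x2 scene: left column vegetation-like, right column rooftop-like.
scene = {
    "nir":   np.array([[0.8, 0.3], [0.8, 0.3]]),
    "red":   np.array([[0.1, 0.3], [0.1, 0.3]]),
    "green": np.array([[0.2, 0.2], [0.2, 0.2]]),
}
index = evolved_index(scene)
veg_mask = index > 0.2          # threshold chosen for the toy example
```

A classifier would then cluster or threshold such index images rather than the raw bands, which is where the reported kappa gains come from.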

  16. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  17. Rita Bottoms: Polyartist Librarian

    OpenAIRE

    Bottoms, Rita; Reti, Irene; Regional History Project, UCSC Library

    2005-01-01

    Project Director Irene Reti conducted fourteen hours of interviews with Rita Bottoms, Head of Special Collections at the University Library, UC Santa Cruz, shortly before her retirement in March 2003. This oral history provides a vivid and intimate look at thirty-seven years behind the scenes in the library's Special Collections. For thirty-seven years Bottoms dedicated herself to collecting work by some of the most eminent writers and photographers of the twentieth century, includin...

  18. Shallow flows with bottom topography

    NARCIS (Netherlands)

    Heijst, van G.J.F.; Kamp, L.P.J.; Theunissen, R.; Rodi, W.; Uhlmann, M.

    2012-01-01

    This paper discusses laboratory experiments and numerical simulations of dipolar vortex flows in a shallow fluid layer with bottom topography. Two cases are considered: a step topography and a linearly sloping bottom. It is found that viscous effects – i.e., no-slip conditions at the non-horizontal

  19. Catalog solvent extraction: anticipate process adjustments

    International Nuclear Information System (INIS)

    Campbell, S.G.; Brass, E.A.; Brown, S.J.; Geeting, M.W.

    2008-01-01

    The Modular Caustic-Side Solvent Extraction Unit (MCU) utilizes commercially available centrifugal contactors to facilitate removal of radioactive cesium from highly alkaline salt solutions. During the fabrication of the contactor assembly, demonstrations revealed a higher propensity for foaming than was initially expected. A task team performed a series of single-phase experiments that revealed that the shape of the bottom vanes and the outer diameter of those vanes are key to the successful deployment of commercial contactors in the Caustic-Side Solvent Extraction Process. (authors)

  20. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction was proposed. This method took full advantage of the self-adaptive filter characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. This research merged the superiorities of kurtosis, mean square error, energy, and singular value decomposition on signal feature extraction. The values of the four indexes aforementioned were combined into a feature vector. Then, the connotative characteristic components in vibration signal were accurately extracted by Euclidean distance search, and the desired integral signals were precisely reconstructed. With this method, the interference problem of invalid signal such as trend item and noise which plague traditional methods is commendably solved. The great cumulative error from the traditional time-domain integral is effectively overcome. Moreover, the large low-frequency error from the traditional frequency-domain integral is successfully avoided. Comparing with the traditional integral methods, this method is outstanding at removing noise and retaining useful feature information and shows higher accuracy and superiority.
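The feature-vector matching step described above can be sketched as follows: each decomposed component is summarized by a small feature vector, and the component closest (in Euclidean distance) to a reference vector is retained, while trend-like components are rejected. The three features and the toy components are simplifications of the paper's kurtosis/mean-square-error/energy/singular-value vector, used here only for illustration.

```python
import math

def features(x):
    """Simplified feature vector: kurtosis, mean-square energy, peak value."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    kurt = (sum((v - mu) ** 4 for v in x) / n) / (var ** 2) if var else 0.0
    energy = sum(v * v for v in x) / n
    return (kurt, energy, max(abs(v) for v in x))

def euclid(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# toy "decomposed" components: a sinusoid (useful) and a linear trend (invalid)
N = 200
sine  = [math.sin(2 * math.pi * 5 * i / N) for i in range(N)]
trend = [0.01 * i for i in range(N)]

ref = features(sine)  # reference built from a known useful component
best = min([sine, trend], key=lambda c: euclid(features(c), ref))
print(best is sine)  # True: the trend item is rejected before integration
```

Only the selected components would then be summed and integrated, which is how the method avoids the cumulative error of integrating trend and noise terms.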

  1. Phosphorus availability from bottom sediments of lakes using a nuclear technique

    International Nuclear Information System (INIS)

    Flores, F.; Facetti, J.F.

    1992-01-01

    Availability of phosphorus from the bottom sediments of a lake plays an important role in the development of aquatic biota and in the enhancement of the eutrophication process. In this work, the 31P-32P isotopic exchange (E values) technique was applied to assess the potential influence of this phosphorus 'reservoir' on the water quality of the Acaray and Yguazu Dams in the Eastern Region of Paraguay. Samples analyzed were taken from the bottom sediments of the water body at different sites as well as from the shores. The method is reliable and yields information of potential ecological significance. (author) 14 refs.; 2 tabs

  2. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools that will expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  3. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
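The multireference approach lends itself to a compact sketch: a pixel's time-course is correlated against a bank of reference functions shifted in 100 ms steps, and the shift with the highest correlation serves as the pixel's time marker. The gamma-like hrf below is a generic stand-in, not the study's measured response.

```python
import math

def corr(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def hrf(t):
    # simple gamma-like shape (illustrative, not the paper's hrf)
    return (t ** 2) * math.exp(-t) if t > 0 else 0.0

dt = 0.1  # 100 ms sampling
times = [i * dt for i in range(100)]
pixel = [hrf(t - 0.3) for t in times]  # this pixel responds 300 ms after stimulus

# bank of reference functions shifted by 100 ms steps; pick the best match
shifts = [i * dt for i in range(10)]
best_shift = max(shifts, key=lambda s: corr(pixel, [hrf(t - s) for t in times]))
print(round(best_shift, 1))  # 0.3 -> the 300 ms onset is recovered
```

In practice the winning shift would be stored per pixel, producing a timing map across the activated regions.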

  4. Acoustic water bottom investigation with a remotely operated watercraft survey system

    Science.gov (United States)

    Yamasaki, Shintaro; Tabusa, Tomonori; Iwasaki, Shunsuke; Hiramatsu, Masahiro

    2017-12-01

    This paper describes a remotely operated investigation system developed by combining a modern leisure-use fish finder and an unmanned watercraft to survey water bottom topography and other data related to bottom materials. Current leisure-use fish finders have strong depth sounding capabilities and can provide precise sonar images and bathymetric information. Because these sonar instruments are lightweight and small, they can be used on unmanned small watercraft. With the developed system, an operator can direct the heading of an unmanned watercraft and monitor a PC display showing real-time positioning information through the use of onboard equipment and long-distance communication devices. Here, we explain how the system was developed and demonstrate the use of the system in an area of submerged woods in a lake. The system is low cost, easy to use, and mobile. It should be useful in surveying areas that have heretofore been hard to investigate, including remote, small, and shallow lakes, for example, volcanic and glacial lakes.

  5. Learning affects top down and bottom up modulation of eye movements in decision making

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Bagger, Martin; Mueller Loose, Simone

    2013-01-01

    Repeated decision making is subject to changes over time such as decreases in decision time and information use and increases in decision accuracy. We show that a traditional strategy selection view of decision making cannot account for these temporal dynamics without relaxing main assumptions about what defines a decision strategy. As an alternative view we suggest that temporal dynamics in decision making are driven by attentional and perceptual processes and that this view has been expressed in the information reduction hypothesis. We test the information reduction hypothesis by integrating it in a broader framework of top down and bottom up processes and derive the predictions that repeated decisions increase top down control of attention capture which in turn leads to a reduction in bottom up attention capture. To test our hypotheses we conducted a repeated discrete choice experiment with three…

  6. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.

  7. Cathodic protection for the bottoms of above ground storage tanks

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, John P. [Tyco Adhesives, Norwood, MA (United States)

    2004-07-01

    Impressed Current Cathodic Protection has been used for many years to protect the external bottoms of above ground storage tanks. The use of a vertical deep ground bed often treated several bare steel tank bottoms by broadcasting current over a wide area. Environmental concerns and, in some countries, government regulations, have introduced the use of dielectric secondary containment liners. The dielectric liner does not allow the protective cathodic protection current to pass and causes corrosion to continue on the newly placed tank bottom. In existing tank bottoms where inadequate protection has been provided, leaks can develop. In one method of remediation, an old bottom is covered with sand and a double bottom is welded above the leaking bottom. The new bottom is welded very close to the old bottom, thus shielding the new bottom from traditional cathodic protection. These double bottoms often employ a dielectric liner as well. Both the liner and the double bottom often minimize the distance from the external tank bottom. The minimized space between the liner, or double bottom, and the bottom to be protected poses a challenge for current distribution in cathodic protection systems. This study examines the practical concerns for application of impressed current cathodic protection and the types of anode materials used in these specific applications. One unique approach for an economical treatment using a conductive polymer cathodic protection method is presented. (author)

  8. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data, and extraction of relevant information from these data demands a cautious and time-consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary…
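The PLS-plus-variable-selection strategy can be sketched with a single-component PLS1 step (NIPALS form): regression coefficients are computed from centered spot volumes, and the spots with the largest absolute coefficients are selected. The synthetic "gel" design below (orthogonal cosine columns, with a response driven by two chosen spots) is an assumption for illustration only, not the authors' data.

```python
import math

n_gels, n_spots = 40, 10
# synthetic design: orthogonal cosine columns stand in for spot volumes
X = [[math.cos(2 * math.pi * (j + 1) * i / n_gels) for j in range(n_spots)]
     for i in range(n_gels)]
y = [2.0 * row[3] - 1.5 * row[7] for row in X]  # response driven by spots 3 and 7

# center predictors and response
col_mean = [sum(row[j] for row in X) / n_gels for j in range(n_spots)]
y_mean = sum(y) / n_gels
Xc = [[row[j] - col_mean[j] for j in range(n_spots)] for row in X]
yc = [v - y_mean for v in y]

# one-component PLS1 (NIPALS form)
w = [sum(Xc[i][j] * yc[i] for i in range(n_gels)) for j in range(n_spots)]
norm = math.sqrt(sum(v * v for v in w))
w = [v / norm for v in w]                        # unit weight vector, ~cov(spot, y)
t = [sum(Xc[i][j] * w[j] for j in range(n_spots)) for i in range(n_gels)]
b = sum(ti * vi for ti, vi in zip(t, yc)) / sum(ti * ti for ti in t)
coef = [b * wj for wj in w]                      # PLS regression coefficients

selected = sorted(sorted(range(n_spots), key=lambda j: -abs(coef[j]))[:2])
print(selected)  # [3, 7] -> the informative spots are recovered
```

Real applications would use more components and a proper variable-selection criterion, but the ranking step is the same idea.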

  9. Using classic methods in a networked manner: seeing volunteered spatial information in a bottom-up fashion

    NARCIS (Netherlands)

    Carton, L.J.; Ache, P.M.

    2014-01-01

    Using new social media and ICT infrastructures for self-organization, more and more citizen networks and business sectors organize themselves voluntarily around sustainability themes. The paper traces and evaluates one emerging innovation in such bottom-up, networked form of sustainable

  10. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasi-continuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs

  11. Properties and Leachability of Self-Compacting Concrete Incorporated with Fly Ash and Bottom Ash

    Science.gov (United States)

    Kadir, Aeslina Abdul; Ikhmal Haqeem Hassan, Mohd; Jamaluddin, Norwati; Bakri Abdullah, Mohd Mustafa Al

    2016-06-01

    The process of combustion in coal-fired power plants generates ashes, namely fly ash and bottom ash. Coal ash produced from coal combustion contains heavy metals within its composition; these metals are toxic to the environment as well as to human health. Fortunately, treatment methods are available for these ashes, and the use of fly ash and bottom ash in concrete mixes is one of them. Therefore, an experimental program was carried out to study the properties and determine the leachability of self-compacting concrete incorporating fly ash and bottom ash. Self-compacting concrete was produced with fly ash as a replacement for Ordinary Portland Cement and bottom ash as a replacement for sand at ratios of 10%, 20%, and 30% respectively. The fresh properties tests conducted were slump flow, t500, sieve segregation, and J-ring; for the hardened properties, density, compressive strength, and water absorption tests were performed. The samples were then crushed and extracted using the Toxicity Characteristic Leaching Procedure, and the heavy metal content of the leachates was identified using Atomic Absorption Spectrometry. The results demonstrated that both the fresh and hardened properties qualified the mixes as self-compacting concrete. Improvements in compressive strength were observed, and the densities of all samples fell within the normal-weight concrete range of 2000 kg/m3 to 2600 kg/m3. It was also found that incorporation of up to 30% of the ashes was safe, as the concentrations of leached heavy metals did not exceed the regulatory levels, except for arsenic. In conclusion, this study serves as a reference suggesting that fly ash and bottom ash are widely applicable in concrete technology, and that their incorporation in self-compacting concrete constitutes a potential means of adding value through appropriate mix design.
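The leachability screening step amounts to comparing measured leachate concentrations against regulatory limits, as in the paper's finding that only arsenic exceeded its level. A minimal sketch: the limits shown are the commonly cited US EPA TCLP regulatory levels (mg/L); the measured values are illustrative, not the study's data.

```python
# US EPA TCLP regulatory levels in mg/L (subset of metals)
TCLP_LIMITS = {"As": 5.0, "Ba": 100.0, "Cd": 1.0, "Cr": 5.0, "Pb": 5.0}

# hypothetical leachate concentrations measured by AAS, in mg/L
measured = {"As": 6.2, "Ba": 3.1, "Cd": 0.02, "Cr": 0.4, "Pb": 0.9}

# flag any metal whose leachate concentration exceeds its regulatory level
exceedances = {m: c for m, c in measured.items() if c > TCLP_LIMITS[m]}
print(exceedances)  # {'As': 6.2} -> only arsenic exceeds its limit
```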

  12. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, Propbank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes.
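The statement-detection step can be illustrated with a much-simplified, regex-based stand-in: the real system uses Stanford Typed Dependencies, PropBank, and the MiPACQ corpus, so the keyword lists and function below are illustrative assumptions only, showing the shape of the task rather than the authors' method.

```python
import re

# crude lexicons per substance class (illustrative, far from the paper's coverage)
PATTERNS = {
    "alcohol": re.compile(r"\b(alcohol|drinks?|beers?|wine|etoh)\b", re.I),
    "drug": re.compile(r"\b(drugs?|cocaine|heroin|marijuana)\b", re.I),
    "nicotine": re.compile(r"\b(smok\w+|tobacco|nicotine|cigarettes?)\b", re.I),
}

def detect(sentence):
    """Return the substance-use classes mentioned in a clinical sentence."""
    return sorted(k for k, pat in PATTERNS.items() if pat.search(sentence))

note = "Patient quit smoking in 2010; drinks 2 beers per week; denies drug use."
print(detect(note))  # ['alcohol', 'drug', 'nicotine']
```

Attribute extraction (temporal, status such as "quit" or "denies") is the harder part, which is where the dependency-based analysis of the actual system comes in.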

  13. INDIVIDUAL TREE OF URBAN FOREST EXTRACTION FROM VERY HIGH DENSITY LIDAR DATA

    Directory of Open Access Journals (Sweden)

    A. Moradi

    2016-06-01

    Full Text Available Airborne LiDAR (Light Detection and Ranging) data have a high potential to provide 3D information on trees. Most proposed methods for extracting individual trees first detect tree-top or tree-bottom points and then use them as starting points in a segmentation algorithm. Hence, in these methods, the number and locations of the detected peak points strongly affect the detection of individual trees. In this study, a new method is presented to extract individual tree segments using LiDAR points with 10 cm point density. In this method, a two-step strategy is performed for the extraction of individual tree LiDAR points: finding deterministic segments of individual tree points and allocating the remaining LiDAR points based on these segments. This research was performed on two study areas in Zeebrugge, Bruges, Belgium (51.33° N, 3.20° E). The accuracy assessment of this method showed that it correctly classified 74.51% of trees, with 21.57% and 3.92% under- and over-segmentation errors respectively.
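The peak-detection step that the paper contrasts with (and whose sensitivity motivates its two-step strategy) can be sketched as finding local maxima on a gridded canopy height model: each candidate tree top is a cell higher than all its neighbors and above a minimum height. The toy grid and threshold are illustrative assumptions.

```python
def local_maxima(chm, min_height=2.0):
    """Return (row, col) cells that are strict local maxima above min_height."""
    rows, cols = len(chm), len(chm[0])
    peaks = []
    for r in range(rows):
        for c in range(cols):
            h = chm[r][c]
            if h < min_height:
                continue  # too low to be a tree top
            neigh = [chm[i][j]
                     for i in range(max(0, r - 1), min(rows, r + 2))
                     for j in range(max(0, c - 1), min(cols, c + 2))
                     if (i, j) != (r, c)]
            if all(h > v for v in neigh):
                peaks.append((r, c))
    return peaks

# toy canopy height model (m) rasterized from LiDAR returns
chm = [
    [0.1, 0.2, 0.1, 0.1, 0.2],
    [0.2, 6.0, 0.4, 0.3, 0.1],
    [0.1, 0.5, 0.2, 7.5, 0.4],
    [0.1, 0.2, 0.3, 0.6, 0.2],
]
print(local_maxima(chm))  # [(1, 1), (2, 3)] -> two candidate tree tops
```

Missed or spurious peaks at this stage propagate directly into under- and over-segmentation, which is exactly the dependence the paper's deterministic-segment strategy is designed to reduce.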

  14. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.

  15. 1S and $\\overline{MS}$ Bottom Quark Masses from $\\Upsilon$ Sum Rules

    CERN Document Server

    Hoang, A.H.

    2000-01-01

    The bottom quark $1S$ mass, $M_b^{1S}$, is determined using sum rules which relate the masses and the electronic decay widths of the $\Upsilon$ mesons to moments of the vacuum polarization function. The $1S$ mass is defined as half the perturbative mass of a fictitious ${}^3S_1$ bottom-antibottom quark bound state, and is free of the ambiguity of order $\Lambda_{QCD}$ which plagues the pole mass definition. Compared to an earlier analysis by the same author, which had been carried out in the pole mass scheme, the $1S$ mass scheme leads to a much better behaved perturbative series of the moments, smaller uncertainties in the mass extraction and to a reduced correlation of the mass and the strong coupling. We arrive at $M_b^{1S}=4.71\pm 0.03$ GeV. The uncertainty in the $\overline{MS}$ mass $\bar m_b(\bar m_b)$ can be reduced if the three-loop corrections to the relation of pole and $\overline{MS}$ mass are known and if the error in the strong coupling is decreased.

  16. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To more efficiently use GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are mergers of segmented image objects which have similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87, and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
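The core of the terrain-change step, computing displaced volume from pre- and post-event DEMs by cell-wise differencing, can be sketched as follows. The 3x3 DEMs and the 10 m cell size are toy assumptions, not the study's data.

```python
CELL_AREA = 10.0 * 10.0  # m^2 per DEM cell (assumed 10 m resolution)

# toy pre- and post-event elevations (m) over the same grid
pre  = [[105.0, 104.0, 103.0],
        [104.0, 103.0, 102.0],
        [103.0, 102.0, 101.0]]
post = [[103.0, 102.0, 103.0],
        [102.0, 101.0, 102.0],
        [103.0, 102.0, 101.0]]

# negative dh = material removed (source area); positive dh = deposition
dh = [[b - a for a, b in zip(ra, rb)] for ra, rb in zip(pre, post)]
removed = -sum(v * CELL_AREA for row in dh for v in row if v < 0)
print(removed)  # 800.0 m^3 of displaced earth in the source cells
```

In the paper this differencing is done per slope object, so that the volume, area, and sliding direction are reported for the merged landslide object rather than for raw cells.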

  17. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  18. Lime application methods, water and bottom soil acidity in fresh water fish ponds

    Directory of Open Access Journals (Sweden)

    Queiroz Julio Ferraz de

    2004-01-01

    Full Text Available Although some methods for determining lime requirement of pond soils are available and commonly used, there is still no consensus on whether it is more effective to apply liming materials to the bottoms of empty ponds or to wait and apply them over the water surface after ponds are filled. There is also little information on how deep lime reacts in pond sediment over time, and whether the depth of reaction is different when liming materials are applied to the water or to the soil. Therefore, three techniques for treating fish ponds with agricultural limestone were evaluated in ponds with clayey soils at a commercial fish farm. Amounts of agricultural limestone equal to the lime requirement of bottom soils were applied to each of three ponds by: direct application over the pond water surface; spread uniformly over the bottom of the empty pond; spread uniformly over the bottom of the empty pond followed by tilling of the bottom. Effectiveness of agricultural limestone applications did not differ among treatment methods. Agricultural limestone also reacted quickly to increase total alkalinity and total hardness of pond water to acceptable concentrations within 2 weeks after application. The reaction of lime to increase soil pH was essentially complete after one to two months, and lime had no effect below a soil depth of 8 cm. Tilling of pond bottoms to incorporate liming materials is unnecessary, and tilling consumes time and is an expensive practice; filled ponds can be limed effectively.

  19. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  20. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  1. Extracting of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with phonetic and lexical-morphological language means participating in the process of extracting implicit information in English-language advertising texts for men and women. The functioning of phonetic means of the English language is not the basis for the implication of information in advertising texts. Lexical and morphological means act as markers of relevant information and as activators of implicit information in advertising texts.

  2. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, and is implemented on several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques, and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on `Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.

  3. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, and is implemented on several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques, and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.

  4. Outcast labour in Asia: circulation and informalization of the workforce at the bottom of the economy

    NARCIS (Netherlands)

    Breman, J.

    2010-01-01

    Written over the last ten years, these essays focus on labor at the bottom of the rural economy, lacking social, economic, and political wherewithal, and their struggles to find a foothold in the urban economy. The author draws on his fieldwork from India, Indonesia, and China. The volume

  5. The Interaction of Top-Down and Bottom-Up Statistics in the Resolution of Syntactic Category Ambiguity

    Science.gov (United States)

    Gibson, Edward

    2006-01-01

    This paper investigates how people resolve syntactic category ambiguities when comprehending sentences. It is proposed that people combine: (a) context-dependent syntactic expectations (top-down statistical information) and (b) context-independent lexical-category frequencies of words (bottom-up statistical information) in order to resolve…

  6. 46 CFR 173.058 - Double bottom requirements.

    Science.gov (United States)

    2010-10-01

    ... PERTAINING TO VESSEL USE School Ships § 173.058 Double bottom requirements. Each new sailing school vessel... service must comply with the double bottom requirements in §§ 171.105 through 171.109, inclusive, of this...

  7. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed. But their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews for these systems and also report our recent development of ChemReader – a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface-and the algorithm parameters can be readily changed-to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy on extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  8. Top-Down Beta Enhances Bottom-Up Gamma.

    Science.gov (United States)

    Richter, Craig G; Thompson, William H; Bosman, Conrado A; Fries, Pascal

    2017-07-12

    Several recent studies have demonstrated that the bottom-up signaling of a visual stimulus is subserved by interareal gamma-band synchronization, whereas top-down influences are mediated by alpha-beta band synchronization. These processes may implement top-down control of stimulus processing if the top-down and bottom-up mediating rhythms are coupled via cross-frequency interaction. To test this possibility, we investigated Granger-causal influences among awake macaque primary visual area V1, higher visual area V4, and parietal control area 7a during attentional task performance. Top-down 7a-to-V1 beta-band influences enhanced visually driven V1-to-V4 gamma-band influences. This enhancement was spatially specific and largest when beta-band activity preceded gamma-band activity by ∼0.1 s, suggesting a causal effect of top-down processes on bottom-up processes. We propose that this cross-frequency interaction mechanistically subserves the attentional control of stimulus selection. SIGNIFICANCE STATEMENT Contemporary research indicates that the alpha-beta frequency band underlies top-down control, whereas the gamma band mediates bottom-up stimulus processing. This arrangement inspires an attractive hypothesis, which posits that top-down beta-band influences directly modulate bottom-up gamma-band influences via cross-frequency interaction. We evaluate this hypothesis by determining that beta-band top-down influences from parietal area 7a to visual area V1 are correlated with bottom-up gamma-frequency influences from V1 to area V4, in a spatially specific manner, and that this correlation is maximal when top-down activity precedes bottom-up activity. These results show that for top-down processes such as spatial attention, elevated top-down beta-band influences directly enhance feedforward stimulus-induced gamma-band processing, leading to enhancement of the selected stimulus. Copyright © 2017 Richter, Thompson et al.
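
The reported ~0.1 s lead of beta over gamma is the kind of quantity one can estimate by scanning candidate lags for the peak cross-correlation between two band-power time series. The study itself used Granger-causal analysis; the lagged-correlation sketch below is a simpler stand-in, run on synthetic data where the delay is known.

```python
import numpy as np

def best_lag(x, y, fs, max_lag_s=0.5):
    """Lag (in seconds) at which x best predicts y.

    Positive lag means x leads y. Scans integer sample shifts up to
    max_lag_s and returns the one maximizing Pearson correlation.
    """
    max_shift = int(max_lag_s * fs)

    def corr_at(lag):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        return np.corrcoef(a, b)[0, 1]

    return max(range(-max_shift, max_shift + 1), key=corr_at) / fs

# Synthetic example: a "gamma" envelope that is a copy of the "beta"
# envelope delayed by 0.1 s (fs = 100 Hz -> 10 samples).
fs = 100
rng = np.random.default_rng(0)
beta = rng.standard_normal(1000)
gamma = np.roll(beta, 10)  # beta leads gamma by 10 samples = 0.1 s
lag = best_lag(beta, gamma, fs)
```

Unlike Granger causality, lagged correlation cannot separate direct influence from a common driver; it only recovers the timing relationship.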

  9. The Fishery Performance Indicators: A Management Tool for Triple Bottom Line Outcomes

    Science.gov (United States)

    Anderson, James L.; Anderson, Christopher M.; Chu, Jingjie; Meredith, Jennifer; Asche, Frank; Sylvia, Gil; Smith, Martin D.; Anggraeni, Dessy; Arthur, Robert; Guttormsen, Atle; McCluney, Jessica K.; Ward, Tim; Akpalu, Wisdom; Eggert, Håkan; Flores, Jimely; Freeman, Matthew A.; Holland, Daniel S.; Knapp, Gunnar; Kobayashi, Mimako; Larkin, Sherry; MacLauchlin, Kari; Schnier, Kurt; Soboil, Mark; Tveteras, Sigbjorn; Uchida, Hirotsugu; Valderrama, Diego

    2015-01-01

    Pursuit of the triple bottom line of economic, community and ecological sustainability has increased the complexity of fishery management; fisheries assessments require new types of data and analysis to guide science-based policy in addition to traditional biological information and modeling. We introduce the Fishery Performance Indicators (FPIs), a broadly applicable and flexible tool for assessing performance in individual fisheries, and for establishing cross-sectional links between enabling conditions, management strategies and triple bottom line outcomes. Conceptually separating measures of performance, the FPIs use 68 individual outcome metrics—coded on a 1 to 5 scale based on expert assessment to facilitate application to data poor fisheries and sectors—that can be partitioned into sector-based or triple-bottom-line sustainability-based interpretative indicators. Variation among outcomes is explained with 54 similarly structured metrics of inputs, management approaches and enabling conditions. Using 61 initial fishery case studies drawn from industrial and developing countries around the world, we demonstrate the inferential importance of tracking economic and community outcomes, in addition to resource status. PMID:25946194

  10. Operating history report for the Peach Bottom HTGR. Volume I. Reactor operating history

    International Nuclear Information System (INIS)

    Scheffel, W.J.; Baldwin, N.L.; Tomlin, R.W.

    1976-01-01

    The operating history of the Peach Bottom-1 reactor is presented for the years 1966 through 1975. Information concerning general chemistry data, general physics data, the location of sensing elements in the primary helium circuit, and the postirradiation examination and testing of reactor components is included.

  11. A Novel Thermal-Mechanical Detection System for Reactor Pressure Vessel Bottom Failure Monitoring in Severe Accidents

    International Nuclear Information System (INIS)

    Bi, Daowei; Bu, Jiangtao; Xu, Dongling

    2013-06-01

    Following the Fukushima Daiichi nuclear accident in Japan, there is an increased need for enhanced severe accident management (SAM) capabilities. Among others, a reliable method for detecting reactor pressure vessel (RPV) bottom failure has been evaluated as imperative by many utility owners. Though radiation and/or temperature measurements are traditional candidate solutions, there are limitations that may prevent them from functioning desirably in severe accidents such as that in Japan. To provide reliable information for assessing accident progression in a SAM program, in this paper we propose a novel thermal-mechanical detection system (TMDS) for RPV bottom failure monitoring in severe accidents. The main components of the TMDS include a thermally sensitive element, metallic cables, a tension-controlled switch and a main control room annunciation device. With the TMDS installed, there is a reliable means of keeping SAM decision-makers informed of whether the RPV bottom has indeed failed. Such assurance guarantees enhanced severe accident management performance, significantly improves nuclear safety, and thus protects society and the public. (authors)

  12. Single electron yields from semileptonic charm and bottom hadron decays in Au + Au collisions at √s_NN = 200 GeV

    Science.gov (United States)

    Adare, A.; Aidala, C.; Ajitanand, N. N.; Akiba, Y.; Akimoto, R.; Alexander, J.; Alfred, M.; Aoki, K.; Apadula, N.; Aramaki, Y.; Asano, H.; Aschenauer, E. C.; Atomssa, E. T.; Awes, T. C.; Azmoun, B.; Babintsev, V.; Bai, M.; Bandara, N. S.; Bannier, B.; Barish, K. N.; Bassalleck, B.; Bathe, S.; Baublis, V.; Baumgart, S.; Bazilevsky, A.; Beaumier, M.; Beckman, S.; Belmont, R.; Berdnikov, A.; Berdnikov, Y.; Black, D.; Blau, D. S.; Bok, J. S.; Boyle, K.; Brooks, M. L.; Bryslawskyj, J.; Buesching, H.; Bumazhnov, V.; Butsyk, S.; Campbell, S.; Chen, C.-H.; Chi, C. Y.; Chiu, M.; Choi, I. J.; Choi, J. B.; Choi, S.; Choudhury, R. K.; Christiansen, P.; Chujo, T.; Chvala, O.; Cianciolo, V.; Citron, Z.; Cole, B. A.; Connors, M.; Cronin, N.; Crossette, N.; Csanád, M.; Csörgő, T.; Dairaku, S.; Danley, T. W.; Datta, A.; Daugherity, M. S.; David, G.; Deblasio, K.; Dehmelt, K.; Denisov, A.; Deshpande, A.; Desmond, E. J.; Dietzsch, O.; Ding, L.; Dion, A.; Diss, P. B.; Do, J. H.; Donadelli, M.; D'Orazio, L.; Drapier, O.; Drees, A.; Drees, K. A.; Durham, J. M.; Durum, A.; Edwards, S.; Efremenko, Y. V.; Engelmore, T.; Enokizono, A.; Esumi, S.; Eyser, K. O.; Fadem, B.; Feege, N.; Fields, D. E.; Finger, M.; Finger, M.; Fleuret, F.; Fokin, S. L.; Frantz, J. E.; Franz, A.; Frawley, A. D.; Fukao, Y.; Fusayasu, T.; Gainey, K.; Gal, C.; Gallus, P.; Garg, P.; Garishvili, A.; Garishvili, I.; Ge, H.; Giordano, F.; Glenn, A.; Gong, X.; Gonin, M.; Goto, Y.; Granier de Cassagnac, R.; Grau, N.; Greene, S. V.; Grosse Perdekamp, M.; Gu, Y.; Gunji, T.; Hachiya, T.; Haggerty, J. S.; Hahn, K. I.; Hamagaki, H.; Hamilton, H. F.; Han, S. Y.; Hanks, J.; Hasegawa, S.; Haseler, T. O. S.; Hashimoto, K.; Hayano, R.; Hayashi, S.; He, X.; Hemmick, T. K.; Hester, T.; Hill, J. C.; Hollis, R. 
S.; Homma, K.; Hong, B.; Horaguchi, T.; Hoshino, T.; Hotvedt, N.; Huang, J.; Huang, S.; Ichihara, T.; Iinuma, H.; Ikeda, Y.; Imai, K.; Imazu, Y.; Imrek, J.; Inaba, M.; Iordanova, A.; Isenhower, D.; Isinhue, A.; Ivanishchev, D.; Jacak, B. V.; Javani, M.; Jezghani, M.; Jia, J.; Jiang, X.; Johnson, B. M.; Joo, K. S.; Jouan, D.; Jumper, D. S.; Kamin, J.; Kanda, S.; Kang, B. H.; Kang, J. H.; Kang, J. S.; Kapustinsky, J.; Karatsu, K.; Kawall, D.; Kazantsev, A. V.; Kempel, T.; Key, J. A.; Khachatryan, V.; Khandai, P. K.; Khanzadeev, A.; Kijima, K. M.; Kim, B. I.; Kim, C.; Kim, D. J.; Kim, E.-J.; Kim, G. W.; Kim, M.; Kim, Y.-J.; Kim, Y. K.; Kimelman, B.; Kinney, E.; Kistenev, E.; Kitamura, R.; Klatsky, J.; Kleinjan, D.; Kline, P.; Koblesky, T.; Komkov, B.; Koster, J.; Kotchetkov, D.; Kotov, D.; Krizek, F.; Kurita, K.; Kurosawa, M.; Kwon, Y.; Kyle, G. S.; Lacey, R.; Lai, Y. S.; Lajoie, J. G.; Lebedev, A.; Lee, D. M.; Lee, J.; Lee, K. B.; Lee, K. S.; Lee, S.; Lee, S. H.; Lee, S. R.; Leitch, M. J.; Leite, M. A. L.; Leitgab, M.; Lewis, B.; Li, X.; Lim, S. H.; Linden Levy, L. A.; Liu, M. X.; Lynch, D.; Maguire, C. F.; Makdisi, Y. I.; Makek, M.; Manion, A.; Manko, V. I.; Mannel, E.; Maruyama, T.; McCumber, M.; McGaughey, P. L.; McGlinchey, D.; McKinney, C.; Meles, A.; Mendoza, M.; Meredith, B.; Miake, Y.; Mibe, T.; Midori, J.; Mignerey, A. C.; Milov, A.; Mishra, D. K.; Mitchell, J. T.; Miyasaka, S.; Mizuno, S.; Mohanty, A. K.; Mohapatra, S.; Montuenga, P.; Moon, H. J.; Moon, T.; Morrison, D. P.; Moskowitz, M.; Moukhanova, T. V.; Murakami, T.; Murata, J.; Mwai, A.; Nagae, T.; Nagamiya, S.; Nagashima, K.; Nagle, J. L.; Nagy, M. I.; Nakagawa, I.; Nakagomi, H.; Nakamiya, Y.; Nakamura, K. R.; Nakamura, T.; Nakano, K.; Nattrass, C.; Netrakanti, P. K.; Nihashi, M.; Niida, T.; Nishimura, S.; Nouicer, R.; Novák, T.; Novitzky, N.; Nukariya, A.; Nyanin, A. S.; Obayashi, H.; O'Brien, E.; Ogilvie, C. A.; Okada, K.; Orjuela Koop, J. D.; Osborn, J. 
D.; Oskarsson, A.; Ozawa, K.; Pak, R.; Pantuev, V.; Papavassiliou, V.; Park, I. H.; Park, J. S.; Park, S.; Park, S. K.; Pate, S. F.; Patel, L.; Patel, M.; Pei, H.; Peng, J.-C.; Perepelitsa, D. V.; Perera, G. D. N.; Peressounko, D. Yu.; Perry, J.; Petti, R.; Pinkenburg, C.; Pinson, R.; Pisani, R. P.; Purschke, M. L.; Qu, H.; Rak, J.; Ramson, B. J.; Ravinovich, I.; Read, K. F.; Reynolds, D.; Riabov, V.; Riabov, Y.; Richardson, E.; Rinn, T.; Riveli, N.; Roach, D.; Roche, G.; Rolnick, S. D.; Rosati, M.; Rowan, Z.; Rubin, J. G.; Ryu, M. S.; Sahlmueller, B.; Saito, N.; Sakaguchi, T.; Sako, H.; Samsonov, V.; Sarsour, M.; Sato, S.; Sawada, S.; Schaefer, B.; Schmoll, B. K.; Sedgwick, K.; Seidl, R.; Sen, A.; Seto, R.; Sett, P.; Sexton, A.; Sharma, D.; Shein, I.; Shibata, T.-A.; Shigaki, K.; Shimomura, M.; Shoji, K.; Shukla, P.; Sickles, A.; Silva, C. L.; Silvermyr, D.; Sim, K. S.; Singh, B. K.; Singh, C. P.; Singh, V.; Skolnik, M.; Slunečka, M.; Snowball, M.; Solano, S.; Soltz, R. A.; Sondheim, W. E.; Sorensen, S. P.; Sourikova, I. V.; Stankus, P. W.; Steinberg, P.; Stenlund, E.; Stepanov, M.; Ster, A.; Stoll, S. P.; Sugitate, T.; Sukhanov, A.; Sumita, T.; Sun, J.; Sziklai, J.; Takagui, E. M.; Takahara, A.; Taketani, A.; Tanaka, Y.; Taneja, S.; Tanida, K.; Tannenbaum, M. J.; Tarafdar, S.; Taranenko, A.; Tennant, E.; Tieulent, R.; Timilsina, A.; Todoroki, T.; Tomášek, M.; Torii, H.; Towell, C. L.; Towell, R.; Towell, R. S.; Tserruya, I.; Tsuchimoto, Y.; Vale, C.; van Hecke, H. W.; Vargyas, M.; Vazquez-Zambrano, E.; Veicht, A.; Velkovska, J.; Vértesi, R.; Virius, M.; Voas, B.; Vrba, V.; Vznuzdaev, E.; Wang, X. R.; Watanabe, D.; Watanabe, K.; Watanabe, Y.; Watanabe, Y. S.; Wei, F.; Whitaker, S.; White, A. S.; White, S. N.; Winter, D.; Wolin, S.; Woody, C. L.; Wysocki, M.; Xia, B.; Xue, L.; Yalcin, S.; Yamaguchi, Y. L.; Yanovich, A.; Ying, J.; Yokkaichi, S.; Yoo, J. H.; Yoon, I.; You, Z.; Younus, I.; Yu, H.; Yushmanov, I. E.; Zajc, W. 
A.; Zelenski, A.; Zhou, S.; Zou, L.; Phenix Collaboration

    2016-03-01

    The PHENIX Collaboration at the Relativistic Heavy Ion Collider has measured open heavy flavor production in minimum bias Au + Au collisions at √s_NN = 200 GeV via the yields of electrons from semileptonic decays of charm and bottom hadrons. Previous heavy flavor electron measurements indicated substantial modification in the momentum distribution of the parent heavy quarks owing to the quark-gluon plasma created in these collisions. For the first time, using the PHENIX silicon vertex detector to measure precision displaced tracking, the relative contributions from charm and bottom hadrons to these electrons as a function of transverse momentum are measured in Au + Au collisions. We compare the fraction of electrons from bottom hadrons to previously published results extracted from electron-hadron correlations in p + p collisions at √s_NN = 200 GeV and find the fractions to be similar within the large uncertainties on both measurements for pT > 4 GeV/c. We use the bottom electron fractions in Au + Au and p + p along with the previously measured heavy flavor electron R_AA to calculate the R_AA for electrons from charm and bottom hadron decays separately. We find that electrons from bottom hadron decays are less suppressed than those from charm for the region 3

  13. Bottom-up guidance in visual search for conjunctions.

    Science.gov (United States)

    Proulx, Michael J

    2007-02-01

    Understanding the relative role of top-down and bottom-up guidance is crucial for models of visual search. Previous studies have addressed the role of top-down and bottom-up processes in search for a conjunction of features but with inconsistent results. Here, the author used an attentional capture method to address the role of top-down and bottom-up processes in conjunction search. The role of bottom-up processing was assayed by inclusion of an irrelevant-size singleton in a search for a conjunction of color and orientation. One object was uniquely larger on each trial, with chance probability of coinciding with the target; thus, the irrelevant feature of size was not predictive of the target's location. Participants searched more efficiently for the target when it was also the size singleton, and they searched less efficiently for the target when a nontarget was the size singleton. Although a conjunction target cannot be detected on the basis of bottom-up processing alone, participants used search strategies that relied significantly on bottom-up guidance in finding the target, resulting in interference from the irrelevant-size singleton.

  14. Summary of core damage frequency from internal initiators: Peach Bottom

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Cathey, N.

    1986-01-01

    Probabilistic risk assessments (PRA) based on internal initiators are being conducted on a number of reference plants in order to provide the Nuclear Regulatory Commission (NRC) with updated information about light water reactor risk. The results of these analyses will be used by the NRC to prepare NUREG-1150 which will examine the NRC's current perception of risk. Peach Bottom has been chosen as one of the reference plants

  15. From remote sensing data via information extraction to 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency if large volumes of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, taking into consideration the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  16. Analysis of Maisotsenko open gas turbine bottoming cycle

    International Nuclear Information System (INIS)

    Saghafifar, Mohammad; Gadalla, Mohamed

    2015-01-01

    Maisotsenko gas turbine cycle (MGTC) is a recently proposed humid air turbine cycle. An air saturator is employed for air heating and humidification purposes in MGTC. In this paper, MGTC is integrated as the bottoming cycle to a topping simple gas turbine as Maisotsenko bottoming cycle (MBC). A thermodynamic optimization is performed to illustrate the advantages and disadvantages of MBC as compared with air bottoming cycle (ABC). Furthermore, detailed sensitivity analysis is reported to present the effect of different operating parameters on the proposed configurations' performance. Efficiency enhancement of 3.7% is reported which results in more than 2600 tonne of natural gas fuel savings per year. - Highlights: • Developed an accurate air saturator model. • Introduced Maisotsenko bottoming cycle (MBC) as a power generation cycle. • Performed Thermodynamic optimization for MBC and air bottoming cycle (ABC). • Performed detailed sensitivity analysis for MBC under different operating conditions. • MBC has higher efficiency and specific net work output as compared to ABC
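
A back-of-the-envelope check shows how an efficiency gain of a few percentage points translates into annual fuel savings. The plant size, operating hours, LHV, and baseline efficiency below are assumptions chosen for illustration; they are not the paper's configuration and do not reproduce its 2600-tonne figure exactly.

```python
def annual_fuel_tonnes(power_mw, efficiency, hours=8000, lhv_mj_per_kg=50.0):
    """Annual fuel mass (tonnes) for a plant of given net electric output
    and thermal efficiency; natural gas LHV ~50 MJ/kg (assumed)."""
    energy_out_mj = power_mw * 3600 * hours        # 1 MW = 1 MJ/s
    fuel_energy_mj = energy_out_mj / efficiency
    return fuel_energy_mj / lhv_mj_per_kg / 1000.0  # kg -> tonnes

# Assumed illustrative plant: 50 MW net output, 8000 operating hours/year.
base = annual_fuel_tonnes(50, 0.40)
improved = annual_fuel_tonnes(50, 0.437)  # +3.7 percentage points
saving = base - improved
```

The structure of the calculation is the point: fuel savings scale with plant size and hours, so the same efficiency gain yields very different absolute tonnage for different installations.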

  17. Nuclear reactor construction with bottom supported reactor vessel

    International Nuclear Information System (INIS)

    Sharbaugh, J.E.

    1987-01-01

    This patent describes an improved liquid metal nuclear reactor construction comprising: (a) a nuclear reactor core having a bottom platform support structure; (b) a reactor vessel for holding a large pool of low pressure liquid metal coolant and housing the core; (c) a containment structure surrounding the reactor vessel, having a side wall spaced outwardly from the reactor vessel side wall and a base mat spaced below the reactor vessel bottom end wall; (d) a central small diameter post anchored to the containment structure base mat and extending upwardly to the reactor vessel to axially fix the bottom end wall of the reactor vessel and provide a center column support for the lower end of the reactor core; (e) annular support structure disposed in the reactor vessel on the bottom end wall and extending about the lower end of the core; (f) structural support means disposed between the containment structure base mat and the bottom end wall of the reactor vessel, cooperating to support the reactor vessel at its bottom end wall on the containment structure base mat so as to allow the reactor vessel to expand radially but substantially prevent any lateral motions that might be imposed by the occurrence of a seismic event; (g) a bed of insulating material disposed between the containment structure base mat and the bottom end wall of the reactor vessel, uniformly supporting the reactor vessel at its bottom end wall so as to allow it to freely expand radially from the central post as it heats up while providing continuous support thereof; (h) a deck supported upon the wall of the containment vessel above the top open end of the reactor vessel; and (i) extendible and retractable coupling means extending between the deck and the top open end of the reactor vessel, flexibly and sealably interconnecting the reactor vessel at its top end to the deck.

  18. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments; precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
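
The F-measures quoted above are the harmonic mean of precision and recall, which is why recall lagging precision pulls the F-measure below the precision score. A minimal sketch from raw counts; the counts are illustrative, not the challenge's actual confusion matrix.

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and balanced F-measure from raw counts
    of true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts chosen so precision (0.90) exceeds recall (0.75),
# mirroring the pattern reported in the abstract.
p, r, f = precision_recall_f1(tp=90, fp=10, fn=30)
```

Because the harmonic mean is dominated by the smaller operand, improving recall would raise the F-measure faster here than further gains in precision.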

  19. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  20. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Full Text Available Indoor Spatial Data Infrastructure (indoor SDI) is an important SDI for geospatial analysis and location-based services. A Building Information Model (BIM) has a high degree of detail in the geometric and semantic information of a building. This study proposes direct conversion schemes to extract indoor building information from BIM into OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is the major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents each space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge represents the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions of IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data from the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicate that the 3D indoor model (i.e. the IndoorGML model) can be automatically derived from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor SDI.
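
The node/edge mapping described above can be sketched as building a graph in which every space and every door becomes a node, and each door is linked by an edge to the spaces it connects. The dictionaries below are hypothetical stand-ins for parsed IfcSpace/IfcDoor entities, not actual IFC parsing.

```python
# Hypothetical parsed entities: spaces (IfcSpace-like) and doors
# (IfcDoor-like), each door recording the two spaces it joins.
spaces = {"room_a": {}, "room_b": {}, "corridor": {}}
doors = [
    {"id": "door_1", "connects": ("room_a", "corridor")},
    {"id": "door_2", "connects": ("room_b", "corridor")},
]

# Nodes: one per space and one per door.
nodes = list(spaces) + [d["id"] for d in doors]

# Edges: each door is connected to every space it joins, giving the
# dual-space topology that IndoorGML encodes as nodes and edges.
edges = []
for d in doors:
    for space in d["connects"]:
        edges.append((d["id"], space))
```

Serializing this graph to the IndoorGML XML schema is then a mechanical step, since IndoorGML's dual-space model is itself a node/edge set.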

  1. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora.

  2. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
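
The signature fields MedEx targets (drug name, strength, route, frequency) can be illustrated with a toy pattern over one common prescription shape. This regex sketch is only an illustration of the fields involved, not MedEx's actual algorithm, which uses a full NLP pipeline; the vocabulary of routes and frequencies is deliberately tiny.

```python
import re

# Toy pattern for "<drug> <strength> <route> <frequency>" phrases.
SIG = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+"
    r"(?P<strength>\d+\s?mg)\s+"
    r"(?P<route>po|iv|im)\s+"
    r"(?P<frequency>qd|bid|tid|qid)",
    re.IGNORECASE,
)

note = "Discharged on metoprolol 25 mg po bid and lisinopril 10 mg po qd."
meds = [m.groupdict() for m in SIG.finditer(note)]
```

Real clinical text breaks such patterns constantly (abbreviations, tapers, free-text instructions), which is precisely why systems like MedEx layer lexicons and parsing on top of surface patterns.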

  3. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, this will make it possible to collect not only atomic and molecular data but also broader scientific information over a wide range of research fields. (author)

  4. Agricultural ammonia emissions in China: reconciling bottom-up and top-down estimates

    Directory of Open Access Journals (Sweden)

    L. Zhang

    2018-01-01

    Full Text Available Current estimates of agricultural ammonia (NH3) emissions in China differ by more than a factor of 2, hindering our understanding of their environmental consequences. Here we apply both bottom-up statistical and top-down inversion methods to quantify NH3 emissions from agriculture in China for the year 2008. We first assimilate satellite observations of NH3 column concentration from the Tropospheric Emission Spectrometer (TES) using the GEOS-Chem adjoint model to optimize Chinese anthropogenic NH3 emissions at the 1/2° × 2/3° horizontal resolution for March–October 2008. Optimized emissions show a strong summer peak, with emissions about 50 % higher in summer than in spring and fall, which is underestimated in current bottom-up NH3 emission estimates. To reconcile the latter with the top-down results, we revisit the processes of agricultural NH3 emissions and develop an improved bottom-up inventory of Chinese NH3 emissions from fertilizer application and livestock waste at the 1/2° × 2/3° resolution. Our bottom-up emission inventory includes more detailed information on crop-specific fertilizer application practices and better accounts for meteorological modulation of NH3 emission factors in China. We find that annual anthropogenic NH3 emissions are 11.7 Tg for 2008, with 5.05 Tg from fertilizer application and 5.31 Tg from livestock waste. The two sources together account for 88 % of total anthropogenic NH3 emissions in China. Our bottom-up emission estimates also show a distinct seasonality peaking in summer, consistent with top-down results from the satellite-based inversion. Further evaluations using surface network measurements show that the model driven by our bottom-up emissions reproduces the observed spatial and seasonal variations of NH3 gas concentrations and ammonium (NH4+) wet deposition fluxes over China well, providing additional credibility to the improvements we have made to our

  5. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial......The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide....... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...

  6. Estimates of bottom roughness length and bottom shear stress in South San Francisco Bay, California

    Science.gov (United States)

    Cheng, R.T.; Ling, C.-H.; Gartner, J.W.; Wang, P.-F.

    1999-01-01

A field investigation of the hydrodynamics and the resuspension and transport of particulate matter in a bottom boundary layer was carried out in South San Francisco Bay (South Bay), California, during March-April 1995. Using broadband acoustic Doppler current profilers, detailed measurements of turbulent mean velocity distribution within 1.5 m above the bed have been obtained. A global method of data analysis was used for estimating bottom roughness length zo and bottom shear stress (or friction velocities u*). Field data have been examined by dividing the time series of velocity profiles into 24-hour periods and independently analyzing the velocity profile time series by flooding and ebbing periods. The global method of solution gives consistent properties of bottom roughness length zo and bottom shear stress values (or friction velocities u*) in South Bay. Estimated mean values of zo and u* for flooding and ebbing cycles are different. The differences in mean zo and u* are shown to be caused by tidal current flood-ebb inequality, rather than the flooding or ebbing of tidal currents. The bed shear stress correlates well with a reference velocity; the slope of the correlation defines a drag coefficient. Forty-three days of field data in South Bay show two regimes of zo (and drag coefficient) as a function of a reference velocity. When the mean velocity is >25-30 cm s-1, the ln zo (and thus the drag coefficient) is inversely proportional to the reference velocity. The cause for the reduction of roughness length is hypothesized as sediment erosion due to intensifying tidal currents thereby reducing bed roughness. When the mean velocity is <25-30 cm s-1, the correlation between zo and the reference velocity is less clear. A plausible explanation of scattered values of zo under this condition may be sediment deposition. Measured sediment data were inadequate to support this hypothesis, but the proposed hypothesis warrants further field investigation.
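
The record does not spell out the "global method", but a standard way to estimate zo and u* from a measured profile is a least-squares fit to the law of the wall, u(z) = (u*/κ) ln(z/zo). A minimal sketch on synthetic data (κ = 0.41 assumed; not the authors' implementation):

```python
import math

KAPPA = 0.41  # von Karman constant (assumed)

def fit_log_profile(z, u):
    """Least-squares fit of the law of the wall u(z) = (u*/kappa) * ln(z/z0).
    Regressing u against ln(z): slope = u*/kappa and
    intercept = -(u*/kappa) * ln(z0)."""
    x = [math.log(zi) for zi in z]
    n = len(x)
    mx, mu = sum(x) / n, sum(u) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (ui - mu) for xi, ui in zip(x, u))
    slope = sxy / sxx
    intercept = mu - slope * mx
    return slope * KAPPA, math.exp(-intercept / slope)  # (u_star, z0)

# Synthetic profile with u* = 2 cm/s and z0 = 0.5 cm
z = [10.0, 25.0, 50.0, 100.0, 150.0]                # heights above bed, cm
u = [2.0 / KAPPA * math.log(zi / 0.5) for zi in z]  # velocities, cm/s
u_star, z0 = fit_log_profile(z, u)
```

On a noise-free log-layer profile the fit recovers the two parameters exactly; with field data, the scatter of the fit is what distinguishes the two regimes described above.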

  7. Visual anticipation biases conscious perception but not bottom-up visual processing

    Directory of Open Access Journals (Sweden)

    Paul F.M.J. Verschure

    2015-01-01

    Full Text Available Theories of consciousness can be grouped with respect to their stance on embodiment, sensori-motor contingencies, prediction and integration. In this list prediction plays a key role and it is not clear which aspects of prediction are most prominent in the conscious scene. An evolving view on the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views though on whether prediction or prediction errors dominate the conscious scene. Yet, due to the lack of efficient indirect measures, the dynamic effects of prediction on perception, decision making and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and psychophysical paradigm that allows us to assess both the hierarchical structuring of perceptual consciousness, its content and the impact of predictions and / or errors on the conscious scene. Using a displacement detection task combined with reverse correlation we reveal signatures of the usage of prediction at three different levels of perception: bottom-up early saccades, top-down driven late saccades and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of information processing to restrict the sensory field using predictions. We observe that cognitive load has a quantifiable effect on this dissociation of the bottom-up sensory and top-down predictive processes. We propose a probabilistic data association model from dynamical systems theory to model this predictive bias in different information processing levels.

  8. Distribution of Fe in waters and bottom sediments of a small estuarine catchment, Pumicestone Region, southeast Queensland, Australia

    International Nuclear Information System (INIS)

    Liaghati, Tania; Cox, Malcolm E.; Preda, Micaela

    2005-01-01

    Dissolved and extractable iron concentrations in surface water, groundwater and bottom sediments were determined for Halls Creek, a small subtropical tidally influenced creek. Dissolved iron concentrations were much higher in fresh surface waters and groundwater compared to the estuarine water. In bottom sediments, iron minerals were determined by X-ray diffraction (XRD); of these, hematite (up to 11%) has formed by precipitation from iron-rich water in the freshwater section of the catchment. Pyrite was only identified in the estuarine reach and demonstrated several morphologies [identified by scanning electron microscopy (SEM)] including loosely and closely packed framboids, and the euhedral form. The forms of pyrite found in bottom sediments indicate in situ production and recrystallisation. In surface waters, pyrite was detected in suspended sediment; due to oxygen concentrations well above 50 μmol/l, it was concluded that framboids do not form in the water column, but are within resuspended bottom sediments or eroded from creek banks. The persistence of framboids in suspended sediments, where oxygen levels are relatively high, could be due to their silica and clay-rich coatings, which prevent a rapid oxidation of the pyrite. In addition to identifying processes of formation and transport of pyrite, this study has environmental significance, as this mineral is a potential source of bioavailable forms of iron, which can be a major nutrient supporting algal growth

  9. Wet physical separation of MSWI bottom ash

    NARCIS (Netherlands)

    Muchova, L.

    2010-01-01

    Bottom ash (BA) from municipal solid waste incineration (MSWI) has high potential for the recovery of valuable secondary materials. For example, the MSWI bottom ash produced by the incinerator at Amsterdam contains materials such as non-ferrous metals (2.3%), ferrous metals (8-13%), gold (0.4 ppm),

  10. Sugaring-out extraction of acetoin from fermentation broth by coupling with fermentation.

    Science.gov (United States)

    Dai, Jian-Ying; Ma, Lin-Hui; Wang, Zhuang-Fei; Guan, Wen-Tian; Xiu, Zhi-Long

    2017-03-01

Acetoin is a natural flavor and an important bio-based chemical that can be separated from fermentation broth by solvent extraction or salting-out extraction, or recovered in the form of derivatives. In this work, a novel method, sugaring-out extraction coupled with fermentation, was applied to acetoin production by Bacillus subtilis DL01. The effects of six solvents on bacterial growth and the distribution of acetoin and glucose in different solvent-glucose systems were explored. The operation parameters, such as standing time, glucose concentration, and volume ratio of ethyl acetate to fermentation broth, were determined. In a system composed of fermentation broth, glucose (100%, m/v) and a two-fold volume of ethyl acetate, nearly 100% of the glucose was distributed into the bottom phase, and 61.2% of the acetoin into the top phase without coloring matters or organic acids. The top phase was treated by vacuum distillation to remove the solvent and purify acetoin, while the bottom phase was used as the carbon source to produce acetoin in the next batch of fermentation.
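
As an illustrative aside not taken from the paper: a simple two-phase mass balance relates the reported 61.2% top-phase recovery at the 2:1 volume ratio to an apparent partition coefficient.

```python
def top_phase_fraction(K, r):
    """Fraction of solute reporting to the top phase for a partition
    coefficient K = C_top/C_bottom and volume ratio r = V_top/V_bottom:
    m_top / (m_top + m_bottom) = K*r / (1 + K*r)."""
    return K * r / (1.0 + K * r)

def partition_coefficient(fraction, r):
    """Invert the mass balance: K = f / (r * (1 - f))."""
    return fraction / (r * (1.0 - fraction))

# Reported: 61.2 % of acetoin in the top phase at a 2:1 volume ratio
K_acetoin = partition_coefficient(0.612, 2.0)  # apparent K, about 0.79
```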

  11. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    International Nuclear Information System (INIS)

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE

  12. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE.

  13. Perceived Effects of Pornography on the Couple Relationship: Initial Findings of Open-Ended, Participant-Informed, "Bottom-Up" Research.

    Science.gov (United States)

    Kohut, Taylor; Fisher, William A; Campbell, Lorne

    2017-02-01

    The current study adopted a participant-informed, "bottom-up," qualitative approach to identifying perceived effects of pornography on the couple relationship. A large sample (N = 430) of men and women in heterosexual relationships in which pornography was used by at least one partner was recruited through online (e.g., Facebook, Twitter, etc.) and offline (e.g., newspapers, radio, etc.) sources. Participants responded to open-ended questions regarding perceived consequences of pornography use for each couple member and for their relationship in the context of an online survey. In the current sample of respondents, "no negative effects" was the most commonly reported impact of pornography use. Among remaining responses, positive perceived effects of pornography use on couple members and their relationship (e.g., improved sexual communication, more sexual experimentation, enhanced sexual comfort) were reported frequently; negative perceived effects of pornography (e.g., unrealistic expectations, decreased sexual interest in partner, increased insecurity) were also reported, albeit with considerably less frequency. The results of this work suggest new research directions that require more systematic attention.

  14. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.]

    1996-09-01

Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development Project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks in completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.
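
The report's word profiles and clustering algorithm are not described in this record; the following is a toy illustration of the underlying idea, namely that words sharing similar context profiles fall into the same semantic category, using co-occurrence counts and cosine similarity (all data and names hypothetical):

```python
import math
from collections import defaultdict

def context_profiles(sentences, window=2):
    """Build a co-occurrence profile per word: counts of neighbouring
    tokens within +/- `window` positions."""
    prof = defaultdict(lambda: defaultdict(int))
    for toks in sentences:
        for i, w in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if j != i:
                    prof[w][toks[j]] += 1
    return prof

def cosine(p, q):
    """Cosine similarity between two sparse count profiles."""
    dot = sum(v * q.get(k, 0) for k, v in p.items())
    na = math.sqrt(sum(v * v for v in p.values()))
    nb = math.sqrt(sum(v * v for v in q.values()))
    return dot / (na * nb) if na and nb else 0.0

sentences = [
    "the acid was heated in the flask".split(),
    "the base was heated in the flask".split(),
    "the report was filed in the cabinet".split(),
]
prof = context_profiles(sentences)
sim_chem = cosine(prof["acid"], prof["base"])      # same contexts
sim_cross = cosine(prof["acid"], prof["report"])   # partly shared contexts
```

Words whose profiles exceed a similarity threshold would be merged into one category during the training run, avoiding hand-built knowledge bases.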

  15. Assessment of heavy metals pollution in bottom sediments of the Arabian Gulf after the Gulf War oil spill 1991

    International Nuclear Information System (INIS)

    Nasr, S.M.; Ahmed, M.H.; El-Raey, M.; Frihy, O.E.; Abdel Motti, A.

    1999-01-01

    The major objective of this study was to carry out a sequential geochemical extraction scheme for the partitioning of Fe, Mn, Co, Cu, Zn, Ni, Cr and Pb in the bottom sediments of the Arabian Gulf to detect any potential pollution impact on the gulf sediments following the 1991 gulf war oil spill, and to differentiate between anthropogenic inputs and natural background of heavy metals

  16. Fall Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Fall Bottom Trawl Survey was initiated in 1963 and covered an area from Hudson Canyon, NY to Nova Scotia, Canada. Throughout the years,...

  17. Winter Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Winter Bottom Trawl Survey was initiated in 1992 and covered offshore areas from the Mid-Atlantic to Georges Bank. Inshore strata were covered...

  18. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders to better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal that was recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by a simple time domain filtering and frequency domain Fourier analysis.
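
The paper's exact time-domain filtering and Fourier steps are not given in this record. The sketch below, under assumed parameters (25 Hz sampling, a 0.1-0.5 Hz respiratory band), scans a synthetic PPG-like signal for the dominant baseline-wander frequency with a plain single-bin DFT:

```python
import math

FS = 25.0  # sampling rate in Hz (assumed)

def breathing_rate_hz(signal, fs, f_lo=0.1, f_hi=0.5, df=0.01):
    """Return the frequency in the respiratory band with the largest
    DFT magnitude (single-bin scan, no windowing)."""
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]  # remove the DC offset
    best_f, best_mag = f_lo, -1.0
    for k in range(int(round((f_hi - f_lo) / df)) + 1):
        f = f_lo + k * df
        re = sum(xi * math.cos(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
        im = sum(xi * math.sin(2 * math.pi * f * i / fs) for i, xi in enumerate(x))
        mag = re * re + im * im
        if mag > best_mag:
            best_f, best_mag = f, mag
    return best_f

# Synthetic forehead PPG: 1.2 Hz cardiac pulse plus 0.25 Hz
# (15 breaths/min) respiratory baseline wander, 60 s record
t = [i / FS for i in range(int(60 * FS))]
ppg = [math.sin(2 * math.pi * 1.2 * ti) + 0.5 * math.sin(2 * math.pi * 0.25 * ti)
       for ti in t]
rate_bpm = breathing_rate_hz(ppg, FS) * 60.0
```

A real implementation would use an FFT and a proper band-pass filter, but the band-limited peak search is the essential step.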

  19. Trace elements distribution in bottom sediments from Amazon River estuary

    International Nuclear Information System (INIS)

    Lara, L.B.L.S.; Nadai Fernandes, E. de; Oliveira, H. de; Bacchi, M.A.

    1994-01-01

The Amazon River discharges into a dynamic marine environment where there have been many interactive processes affecting dissolved and particulate solids, either those settling on the shelf or reaching the ocean. Trace elemental concentrations, especially of the rare earth elements, have been determined by neutron activation analysis in sixty bottom sediment samples of the Amazon River estuary, providing information for the study of the spatial and temporal variation of those elements. (author). 16 refs, 6 figs, 3 tabs

  20. Summer Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sampling the coastal waters of the Gulf of Maine using the Northeast Fishery Science Center standardized bottom trawl has been problematic due to large areas of hard...

  1. Spring Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Spring Bottom Trawl Survey was initiated in 1968 and covered an area from Cape Hatteras, NC, to Nova Scotia, Canada, at depths >27m....

  2. Pretreatment and utilization of waste incineration bottom ashes

    DEFF Research Database (Denmark)

    Astrup, Thomas

    2007-01-01

    Within recent years, researchers and authorities have had increasing focus on leaching properties from waste incineration bottom ashes. Researchers have investigated processes such as those related to carbonation, weathering, metal complexation, and leaching control. Most of these investigations......, however, have had a strong emphasis on lab experiments with little focus on full scale bottom ash upgrading methods. The introduction of regulatory limit values restricting leaching from utilized bottom ashes, has created a need for a better understanding of how lab scale experiences can be utilized...

  3. Analysis of Peach Bottom turbine trip tests

    International Nuclear Information System (INIS)

    Cheng, H.S.; Lu, M.S.; Hsu, C.J.; Shier, W.G.; Diamond, D.J.; Levine, M.M.; Odar, F.

    1979-01-01

Current interest in the analysis of turbine trip transients has been generated by the recent tests performed at the Peach Bottom (Unit 2) reactor. Three tests, simulating turbine trip transients, were performed at different initial power and coolant flow conditions. The data from these tests provide considerable information to aid qualification of computer codes that are currently used in BWR design analysis. Results of an analysis of a turbine trip transient using the RELAP-3B and BNL-TWIGL computer codes are presented. Specific results are provided comparing the calculated reactor power and system pressures with the test data. Excellent agreement for all three test transients is evident from the comparisons.

  4. Bottom water circulation in Cascadia Basin

    Science.gov (United States)

    Hautala, Susan L.; Paul Johnson, H.; Hammond, Douglas E.

    2009-10-01

    A combination of beta spiral and minimum length inverse methods, along with a compilation of historical and recent high-resolution CTD data, are used to produce a quantitative estimate of the subthermocline circulation in Cascadia Basin. Flow in the North Pacific Deep Water, from 900-1900 m, is characterized by a basin-scale anticyclonic gyre. Below 2000 m, two water masses are present within the basin interior, distinguished by different potential temperature-salinity lines. These water masses, referred to as Cascadia Basin Bottom Water (CBBW) and Cascadia Basin Deep Water (CBDW), are separated by a transition zone at about 2400 m depth. Below the depth where it freely communicates with the broader North Pacific, Cascadia Basin is renewed by northward flow through deep gaps in the Blanco Fracture Zone that feeds the lower limb of a vertical circulation cell within the CBBW. Lower CBBW gradually warms and returns to the south at lighter density. Isopycnal layer renewal times, based on combined lateral and diapycnal advective fluxes, increase upwards from the bottom. The densest layer, existing in the southeast quadrant of the basin below ˜2850 m, has an advective flushing time of 0.6 years. The total volume flushing time for the entire CBBW is 2.4 years, corresponding to an average water parcel residence time of 4.7 years. Geothermal heating at the Cascadia Basin seafloor produces a characteristic bottom-intensified temperature anomaly and plays an important role in the conversion of cold bottom water to lighter density within the CBBW. Although covering only about 0.05% of the global seafloor, the combined effects of bottom heat flux and diapycnal mixing within Cascadia Basin provide about 2-3% of the total required global input to the upward branch of the global thermohaline circulation.

  5. Molten salt extractive distillation process for zirconium-hafnium separation

    International Nuclear Information System (INIS)

    McLaughlin, D.F.; Stoltz, R.A.

    1989-01-01

    This patent describes an improvement in a process for zirconium-hafnium separation. It utilizes an extractive distillation column with a mixture of zirconium and hafnium tetrachlorides introduced into a distillation column having a top and bottom with hafnium enriched overheads taken from the top of the column and a molten salt solvent circulated through the column to provide a liquid phase, and with molten salt solvent containing zirconium chloride being taken from the bottom of the distillation column. The improvements comprising: utilizing a molten salt solvent consisting principally of lithium chloride and at least one of sodium, potassium, magnesium and calcium chlorides; stripping of the zirconium chloride taken from the bottom of the distillation column by electrochemically reducing zirconium from the molten salt solvent; and utilizing a pressurized reflux condenser on the top of the column to add the hafnium chloride enriched overheads to the molten salt solvent previously stripped of zirconium chloride

  6. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying
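
A minimal sketch of the rule idea described above, flagging jobs whose free-text fields match a priori keyword lists. The keyword sets and job record here are hypothetical and not taken from the study:

```python
# Hypothetical keyword lists loosely modelled on the approach; the study's
# actual lists and standardized occupation/industry codes are not reproduced.
TCE_KEYWORDS = {"degreas", "vapor degreaser", "trichloroethylene", "tce"}
LEAD_KEYWORDS = {"solder", "battery", "radiator repair", "lead paint"}

def flag_scenarios(job, keyword_sets):
    """Return the set of agents whose keyword list matches any free-text
    field of a job record (case-insensitive substring match)."""
    text = " ".join(job.get(f, "")
                    for f in ("title", "task", "tools", "chemicals")).lower()
    return {agent for agent, kws in keyword_sets.items()
            if any(kw in text for kw in kws)}

job = {"title": "Machinist", "task": "Degreasing metal parts",
       "tools": "Vapor degreaser", "chemicals": "TCE"}
flags = flag_scenarios(job, {"TCE": TCE_KEYWORDS, "lead": LEAD_KEYWORDS})
```

A production version would match on word boundaries rather than raw substrings to avoid false positives, and would combine these flags with the occupation and industry codes before assigning exposure.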

  7. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws' masks to characterize the emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate the performance cross-corpora. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction inspired by visual perception can provide significant classification for ESS systems. The two-dimensional (2-D) TII feature can provide the discrimination between different emotions in visual expressions except for the conveyance of pitch and formant tracks. In addition, de-noising in 2-D images can be more easily completed than de-noising in 1-D speech.
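
The full TII pipeline (spectrogram rendering, cubic contrast curve, the complete Laws mask set) is not reproduced in this record. The sketch below shows only the core Laws-mask texture-energy step on toy arrays, using the standard L5 (level) and E5 (edge) kernels:

```python
L5 = [1, 4, 6, 4, 1]    # level: local average
E5 = [-1, -2, 0, 2, 1]  # edge: first-derivative-like

def outer(a, b):
    """2-D Laws mask as the outer product of two 1-D kernels."""
    return [[ai * bj for bj in b] for ai in a]

def texture_energy(img, mask):
    """Mean absolute response of a 5x5 mask over the image interior
    (valid convolution, no padding or normalisation)."""
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for y in range(h - 4):
        for x in range(w - 4):
            s = sum(mask[i][j] * img[y + i][x + j]
                    for i in range(5) for j in range(5))
            total += abs(s)
            count += 1
    return total / count

LE = outer(L5, E5)  # smooth vertically, detect edges horizontally

flat = [[5] * 16 for _ in range(16)]             # uniform "spectrogram"
edged = [[0] * 8 + [10] * 8 for _ in range(16)]  # vertical intensity step

e_flat = texture_energy(flat, LE)   # zero-sum E5 gives no response
e_edge = texture_energy(edged, LE)  # responds at the step
```

In the full method, energies from several mask pairs form the feature vector that the SVM classifies.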

  8. Development of cask body integrated with bottom plate

    International Nuclear Information System (INIS)

    Yoshida, Takuji; Sasaki, Tomoharu; Koyama, Yoichi; Kumagai, Yasuyuki; Watanabe, Yuichi; Takasa, Seiju

    2017-01-01

The main parts of a metal cask for storage and transport of spent nuclear fuel are the main body, the neutron shield material and the external cylinder. The forged main body has been manufactured as a cup shape by welding together a 'forged body' and a 'forged bottom plate' which are independently forged. JSW has developed the manufacturing technology of a 'cask body integrated with bottom plate' which has no weld line, with the goals of cost reduction, shortening of the manufacturing period and further improvement of reliability. Manufacturing of the prototype 'cask body integrated with bottom plate' has been completed to verify the mechanical properties and uniformity of the product, which satisfy the specified values stipulated in the JSME Code S FA1 2007 edition. Here, we report the manufacturing technology and the obtained properties of the 'cask body integrated with bottom plate'. (author)

  9. Bottom production asymmetries at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Norrbin, E.; Vogt, R.

    1999-01-01

    We present results on bottom hadron production asymmetries at the LHC within both the Lund string fragmentation model and the intrinsic bottom model. The main aspects of the models are summarized and specific predictions for pp collisions at 14 TeV are given. Asymmetries are found to be very small at central rapidities increasing to a few percent at forward rapidities. At very large rapidities intrinsic production could dominate but this region is probably out of reach of any experiment.

  10. Bottom production asymmetries at the LHC

    International Nuclear Information System (INIS)

    Norrbin, E.; Vogt, R.

    1999-01-01

    We present results on bottom hadron production asymmetries at the LHC within both the Lund string fragmentation model and the intrinsic bottom model. The main aspects of the models are summarized and specific predictions for pp collisions at 14 TeV are given. Asymmetries are found to be very small at central rapidities increasing to a few percent at forward rapidities. At very large rapidities intrinsic production could dominate but this region is probably out of reach of any experiment

  11. To fractionate municipal solid waste incineration bottom ash: Key for utilisation?

    Science.gov (United States)

    Sormunen, Laura Annika; Rantsi, Riina

    2015-11-01

For the past decade, the Finnish waste sector has increasingly moved from the landfilling of municipal solid waste towards waste incineration. New challenges are faced with the growing amounts of municipal solid waste incineration bottom ash, which are mainly landfilled at the moment. Since this is not a sustainable or a profitable solution, finding different utilisation applications for the municipal solid waste incineration bottom ash is crucial. This study reports a comprehensive analysis of the properties of bottom ash from one waste incineration plant in Finland; the ash was first treated with a Dutch bottom ash recovery technique called advanced dry recovery. This novel process separates non-ferrous and ferrous metals from bottom ash, generating mineral fractions of different grain sizes (0-2 mm, 2-5 mm, 5-12 mm and 12-50 mm). The main aim of the study was to assess whether the advanced bottom ash treatment technique, producing mineral fractions of different grain sizes and therefore properties, facilitates the utilisation of municipal solid waste incineration bottom ash in Finland. The results were encouraging; the bottom ash mineral fractions have favourable behaviour against frost action, which is especially useful in the Finnish conditions. In addition, the leaching of most hazardous substances did not restrict the utilisation of bottom ash, especially for the larger fractions (>5 mm). Overall, this study has shown that the advanced bottom ash recovery technique can be one solution to increase the utilisation of bottom ash and furthermore decrease its landfilling in Finland. © The Author(s) 2015.

  12. Coal bottom ash and pine wood peelings as root substrates in a circulating nutriculture system

    Energy Technology Data Exchange (ETDEWEB)

    Woodard, M A; Bearce, B C; Cluskey, S; Townsend, E [West Virginia University, Morgantown, WV (USA). Division of Plant and Soil Science

    1993-06-01

'Inca Yellow' marigolds (Tagetes erecta L.) were planted in polyethylene bags containing coal bottom ash (CBA), pine wood peelings (PWP), a mixture of 1 CBA: 1 PWP (v/v), and loose Grodan Rockwool (RW) and grown in a circulating nutriculture system. Three fertigation frequencies of 12, 6, or 4 cycles per 12-hour light period were set with a duration of 5 minutes each. Flower diameters of marigolds grown in CBA, PWP, and CBA-PWP exceeded flower diameters of RW-grown marigolds, and days from planting to harvest were less in CBA and CBA-PWP than in the other two media. There was no interaction between medium and fertigation frequency. Foliar analysis showed no significant differences in plant elemental composition among root media or fertigation frequencies. Postharvest PWP water extracts contained higher P levels than extracts of other media, and CBA-PWP water extracts contained higher K, Ca, and Mg. In the CBA-PWP mixture, decomposition products from PWP may have increased P solubility and solubilized the K, Ca, and Mg in CBA.

  13. Coal bottom ash and pine wood peelings as root substrates in a circulating nutriculture system

    Energy Technology Data Exchange (ETDEWEB)

    Woodard, M.A.; Bearce, B.C.; Cluskey, S.; Townsend, E. (West Virginia University, Morgantown, WV (USA). Division of Plant and Soil Science)

    1993-06-01

    'Inca Yellow' marigolds (Tagetes erecta L.) were planted in polyethylene bags containing coal bottom ash (CBA), pine wood peelings (PWP), a mixture of 1 CBA: 1 PWP (v/v), and loose Grodan Rockwool (RW) and grown in a circulating nutriculture system. Three fertigation frequencies of 12, 6, or 4 cycles per 12-hour light period were set with a duration of 5 minutes each. Flower diameters of marigolds grown in CBA, PWP, and CBA-PWP exceeded flower diameters of RW-grown marigolds, and days from planting to harvest were less in CBA and CBA-PWP than in the other two media. There was no interaction between medium and fertigation frequency. Foliar analysis showed no significant differences in plant elemental composition among root media or fertigation frequencies. Postharvest PWP water extracts contained higher P levels than extracts of other media, and CBA-PWP water extracts contained higher K, Ca, and Mg. In the CBA-PWP mixture, decomposition products from PWP may have increased P solubility and solubilized the K, Ca, and Mg in CBA.

  14. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
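
YAdumper itself is a Java tool driven by an XML template based on an external DTD, so the following is only a minimal stdlib analogue of the general task it performs: streaming relational rows into a structured XML document. The table name, tag names, and toy data are all made up for illustration.

```python
import sqlite3
import xml.etree.ElementTree as ET

def dump_table_to_xml(conn, table, root_tag, row_tag):
    """Stream every row of `table` into an XML tree, one element per row,
    one child element per column (a toy analogue of a template-driven dump)."""
    root = ET.Element(root_tag)
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]   # column names from the cursor
    for row in cur:
        rec = ET.SubElement(root, row_tag)
        for col, val in zip(cols, row):
            ET.SubElement(rec, col).text = str(val)
    return root

# Toy in-memory database standing in for a bioinformatics relational DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE protein (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO protein VALUES (?, ?)",
                 [(1, "p53"), (2, "BRCA1")])

root = dump_table_to_xml(conn, "protein", "proteins", "protein")
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

A real dumper would stream output incrementally rather than build the whole tree in memory, which is precisely the memory issue YAdumper addresses.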

  15. Strong Flows of Bottom Water in Abyssal Channels of the Atlantic

    Science.gov (United States)

    Morozov, E. G.

    Analysis of bottom water transport through the abyssal channels of the Atlantic Ocean is presented. The study is based on recent observations in the Russian expeditions and historical data. A strong flow of Antarctic Bottom Water from the Argentine Basin to the Brazil Basin through the Vema Channel is observed on the basis of lowered profilers and anchored buoys with current meters. The further flow of bottom water in the Brazil Basin splits in the northern part of the basin. Part of the bottom water flows to the East Atlantic through the Romanche and Chain fracture zones. The other part follows the bottom topography and flows northwestward into the North American Basin. Part of the northwesterly flow propagates through the Vema Fracture Zone into the Northeastern Atlantic. This flow generally fills the bottom layer in the Northeastern Atlantic basins. The flows of bottom waters through the Romanche and Chain fracture zones do not spread to the Northeast Atlantic due to strong mixing in the equatorial zone and enhanced transformation of bottom water properties.

  16. Quality assurance of MSWI bottom ash. Environmental properties; Kvalitetssaekring av slaggrus. Miljoemaessiga egenskaper

    Energy Technology Data Exchange (ETDEWEB)

    Flyhammar, Peter [Lund Univ. (Sweden). Dept. of Engineering Geology

    2006-04-15

    In Sweden, several hundred tonnes of MSWI bottom ash are generated annually at 29 incineration plants for municipal solid waste. So far bottom ash has mainly been disposed of in landfills or used as cover material in landfills or in other construction works at landfills. A few applications of bottom ash in construction works outside landfills have been reported. A large problem for the market of bottom ash and other secondary materials outside Swedish waste treatment plants is the lack of rules and regulations for a non-polluting use. During 2002 Hartlen and Groenholm (HG) presented a proposal for a system to assure the quality of bottom ash after homogenization and stabilization. A quality assurance of environmental properties should be based on leaching tests. The aim of this project was to study how the control of environmental properties of bottom ash, earlier described in e.g. a product information sheet, should be worked out. The starting-point has been a control system for bottom ash developed by the Sysav company. Different leaching tests illustrate, however, different aspects of the environmental properties, e.g. short-term and long-term leaching. Limit and target values for different variables could affect both the possibilities to use bottom ash as well as the sampling from storage heaps. We have chosen to investigate: pH, availability and leached amount and the connection between these variables; the possibilities to use pH or the availability to assess both short-term and long-term leaching properties; how the number of subsamples that should be collected from a storage heap is affected by different control variables and quality requirements; how bottom ash is stabilized by today's storage technology and how the technology could be improved. Our sample test of bottom ash from Swedish incineration plants indicates that the availability of elements such as Cd, Cu, Cr, Ni, Pb and Zn in bottom ash usually is below Sysav's target values. Extreme values

  17. Enrichment and geochemical mobility of heavy metals in bottom sediment of the Hoedong reservoir, Korea and their source apportionment.

    Science.gov (United States)

    Lee, Pyeong-Koo; Kang, Min-Ju; Yu, Soonyoung; Ko, Kyung-Seok; Ha, Kyoochul; Shin, Seong-Cheon; Park, Jung Han

    2017-10-01

    Physicochemical characteristics of bottom sediment in the Hoedong reservoir were studied to evaluate the effectiveness of the reservoir as a trap for trace metals. Roadside soil, stream sediment and background soil were also studied for comparison. Sequential extractions were carried out, and lead isotopic compositions of each extraction were determined to apportion Pb sources. In addition, the particle size distribution of roadside soil, and the metal concentrations and Pb isotopes of each size group, were determined to characterize metal contamination. As a result, Zn and Cu were found to be enriched in sediment via roadside soil. The data on metal partitioning implied that Zn posed a potential hazard for water quality. Meanwhile, the noticeable reduction of the 206Pb/207Pb isotopic ratio in the acid-soluble fraction of the 200 μm - 2 mm size group of national roadside soil indicated that this size group was highly contaminated by automotive emissions, with precipitation of acid-soluble secondary minerals during evaporation. Based on the Pb isotopic ratios, the dry deposition of Asian dust (AD) and non-Asian dust (NAD) affected roadside soil, while the effects of AD and NAD on bottom sediment appeared to be low, given the low metal concentrations in sediment. Metal concentrations and Pb isotopic compositions indicated that the sediments were a mixture of background and roadside soil. Source apportionment calculations showed that the average proportion of traffic Pb in bottom and stream sediments was 34 and 31%, respectively, in the non-residual fractions, and 26 and 28% in the residual fraction. The residual fraction of the sediments appeared to be as contaminated as the non-residual fractions. Copyright © 2017 Elsevier Ltd. All rights reserved.
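
The source apportionment described above rests on two-endmember isotopic mixing: a sample's 206Pb/207Pb ratio is treated as a linear blend of a traffic endmember and a background endmember. A minimal sketch of that arithmetic, with hypothetical endmember ratios (the paper's measured values are not reproduced here):

```python
def traffic_fraction(r_sample, r_traffic, r_background):
    """Two-endmember mixing: fraction of Pb attributable to the traffic
    endmember, from 206Pb/207Pb ratios. Linear mixing in ratio space is an
    approximation; rigorous apportionment also weighs by Pb concentration."""
    return (r_sample - r_background) / (r_traffic - r_background)

# Illustrative ratios only -- not the paper's data.
r_traffic = 1.10      # hypothetical automotive-emission endmember
r_background = 1.20   # hypothetical background-soil endmember
r_sediment = 1.166    # hypothetical bottom-sediment measurement

f = traffic_fraction(r_sediment, r_traffic, r_background)
print(f"traffic Pb fraction: {f:.2f}")
```

With these illustrative numbers the sample plots about a third of the way from background toward the traffic endmember, the same order as the ~34% traffic share the study reports for non-residual fractions.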

  18. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important to have that the National Council of Teachers of Mathematics has created ... guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of ... graphical perception.

  19. Bottom friction optimization for a better barotropic tide modelling

    Science.gov (United States)

    Boutet, Martial; Lathuilière, Cyril; Son Hoang, Hong; Baraille, Rémy

    2015-04-01

    At a regional scale, barotropic tides are the dominant source of variability of currents and water heights. A precise representation of these processes is essential because of their great impact on human activities (submersion risks, marine renewable energies, ...). Identified sources of error for tide modelling at a regional scale are the following: bathymetry, boundary forcing and dissipation due to bottom friction. Nevertheless, bathymetric databases are nowadays known with good accuracy, especially over shelves, and global tide model performances are better than ever. The most promising improvement is thus the bottom friction representation. The method used to estimate bottom friction is the simultaneous perturbation stochastic approximation (SPSA), which consists of approximating the gradient from a fixed number of cost function measurements, regardless of the dimension of the vector to be estimated. Indeed, each cost function measurement is obtained by randomly perturbing every component of the parameter vector. An important feature of SPSA is its relative ease of implementation. In particular, the method does not require the development of tangent linear and adjoint versions of the circulation model. Experiments are carried out to estimate bottom friction with the HYbrid Coordinate Ocean Model (HYCOM) in barotropic mode (one isopycnal layer). The study area is the Northeastern Atlantic margin, which is characterized by strong currents and intense dissipation. Bottom friction is parameterized with a quadratic term, and the friction coefficient is computed from the water height and the bottom roughness. The latter parameter is the one to be estimated. Assimilated data are the available tide gauge observations. First, the bottom roughness is estimated taking into account bottom sediment natures and bathymetric ranges. Then, it is estimated with geographical degrees of freedom.
Finally, the impact of the estimation of a mixed quadratic/linear friction
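
The SPSA scheme described above — a full gradient estimate from only two cost evaluations per iteration, regardless of parameter dimension — can be sketched in a few lines. This is a minimal toy, not the HYCOM setup: the cost function, gain constants, and target vector are all illustrative.

```python
import random

def spsa_minimize(cost, theta, iters=2000, a=0.1, c=0.1, seed=0):
    """Simultaneous Perturbation Stochastic Approximation: each iteration
    estimates the gradient from exactly two cost evaluations, obtained by
    randomly perturbing every component of theta at once with +/-1 signs."""
    rng = random.Random(seed)
    theta = list(theta)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602            # standard SPSA gain decay exponents
        ck = c / k ** 0.101
        delta = [rng.choice((-1.0, 1.0)) for _ in theta]
        plus = [t + ck * d for t, d in zip(theta, delta)]
        minus = [t - ck * d for t, d in zip(theta, delta)]
        diff = (cost(plus) - cost(minus)) / (2.0 * ck)
        theta = [t - ak * diff / d for t, d in zip(theta, delta)]
    return theta

# Toy "misfit": squared distance to a known roughness vector, standing in
# for the model-vs-tide-gauge cost (HYCOM itself is obviously not run here).
target = [0.3, -0.7, 1.2]
cost = lambda th: sum((t - g) ** 2 for t, g in zip(th, target))
est = spsa_minimize(cost, [0.0, 0.0, 0.0])
print([round(x, 2) for x in est])  # converges toward [0.3, -0.7, 1.2]
```

The point the abstract makes is visible in the code: `cost` is called only twice per iteration whatever the length of `theta`, and no tangent linear or adjoint model is needed.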

  20. Painful faces-induced attentional blink modulated by top-down and bottom-up mechanisms

    Directory of Open Access Journals (Sweden)

    Chun Zheng

    2015-06-01

    Full Text Available Pain-related stimuli can capture attention in an automatic (bottom-up) or intentional (top-down) fashion. Previous studies have examined attentional capture by pain-related information using spatial attention paradigms that involve mainly a bottom-up mechanism. In the current study, we investigated the pain information–induced attentional blink (AB) using a rapid serial visual presentation (RSVP) task, and compared the effects of task-irrelevant and task-relevant pain distractors. Relationships between accuracy of target identification and individual traits (i.e., empathy and catastrophizing thinking about pain) were also examined. The results demonstrated that task-relevant painful faces had a significant pain information–induced AB effect, whereas task-irrelevant faces showed a near-significant trend toward this effect, supporting the notion that pain-related stimuli can influence the temporal dynamics of attention. Furthermore, we found a significant negative correlation between response accuracy and pain catastrophizing score in task-relevant trials. These findings suggest that active scanning of environmental information related to pain produces greater deficits in cognition than does unintentional attention toward pain, which may represent the different ways in which healthy individuals and patients with chronic pain process pain-relevant information. These results may provide insight into the understanding of maladaptive attentional processing in patients with chronic pain.

  1. Radioactive pollution of the Chernobyl cooling pond bottom sediments. I. Water-physical properties, chemical compound and radioactive pollution of pore water

    Directory of Open Access Journals (Sweden)

    L. S. Pirnach

    2011-03-01

    Full Text Available The first results of a comprehensive study of the Chernobyl cooling pond bottom sediments are presented, and the general problem is outlined. Information was obtained on the vertical distribution of the water-physical properties of the bottom sediments, as well as on the ionic composition of pore water and its radioactive contamination with 137Cs and 90Sr. The inventory of pore-water activity in the bottom sediments was calculated. Strong correlations between the pore-water concentrations of 137Cs, K+, and NH4+ within the sampled sediment columns were found. The results are intended for forecasting changes in the radioecological situation in the cooling pond water-soil complex during drying-up.
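
The "strong correlations" reported between pore-water 137Cs, K+, and NH4+ concentrations are the kind quantified by a Pearson coefficient. A self-contained sketch with hypothetical depth-profile numbers (not the paper's measurements):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical concentrations down a sediment core (illustrative only):
cs137  = [12.0, 18.0, 25.0, 31.0, 40.0]   # e.g. Bq/L at successive depths
k_plus = [1.1, 1.6, 2.2, 2.9, 3.8]        # e.g. mmol/L at the same depths

r = pearson(cs137, k_plus)
print(round(r, 3))
```

A coefficient this close to 1 is what "strong correlation within a sediment column" means operationally: the two profiles rise and fall together down the core.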

  2. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: Contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G.; Quinn, Paul C.; Hu, Chao S.; Qian, Miao; Fu, Genyue; Lee, Kang

    2014-01-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese faces, Caucasian faces, and racially ambiguous morphed face stimuli. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. PMID:25497461

  4. Quality assurance of MSWI bottom ash. Environmental properties; Kvalitetssaekring av slaggrus. Miljoemaessiga egenskaper

    Energy Technology Data Exchange (ETDEWEB)

    Flyhammar, Peter [Lund Univ. (Sweden). Engineering Geology

    2006-04-15

    In Sweden, several hundred tonnes of MSWI bottom ash are generated annually at 29 incineration plants for municipal solid waste. So far bottom ash has mainly been disposed of in landfills or used as cover material in landfills or in other construction works at landfills. A few applications of bottom ash in construction works outside landfills have been reported. A large problem for the market of bottom ash and other secondary materials outside Swedish waste treatment plants is the lack of rules and regulations for a non-polluting use. During 2002 Hartlen and Groenholm presented a proposal for a system to assure the quality of bottom ash after homogenization and stabilization. They note that the leaching of salts and metals to ground water constitutes the largest risk for the environment during use of bottom ash. Therefore, a quality assurance of environmental properties should be based on leaching tests. The aim of this project was to study how the control of environmental properties of bottom ash (primarily leaching properties), earlier described in e.g. a product information sheet, should be worked out. The starting-point has been a control system for bottom ash developed by Sysav. Different leaching tests illustrate, however, different aspects of the environmental properties, e.g. short-term and long-term leaching. Limit and target values for different variables could affect both the possibilities to use bottom ash as well as the sampling from storage heaps. We have chosen to investigate: pH, availability and leached amount and the connection between these variables; the possibilities to use pH or the availability to assess both short-term and long-term leaching properties; how the number of subsamples that should be collected from a storage heap is affected by different control variables and quality requirements; how bottom ash is stabilized by today's storage technology and how the technology could be improved. Our sample test of bottom ash from Swedish

  5. Mapping of an ultrasonic bath for ultrasound assisted extraction of mangiferin from Mangifera indica leaves.

    Science.gov (United States)

    Kulkarni, Vrushali M; Rathod, Virendra K

    2014-03-01

    The present work deals with the mapping of an ultrasonic bath for the maximum extraction of mangiferin from Mangifera indica leaves. I3(-) liberation experiments (chemical transformations) and extraction (physical transformations) were carried out at different locations in an ultrasonic bath and compared. The experimental findings indicated a similar trend in variation in an ultrasonic bath by both these methods. Various parameters such as position and depth of vessel in an ultrasonic bath, diameter and shape of a vessel, frequency and input power which affect the extraction yield have been studied in detail. Maximum yield of mangiferin obtained was approximately 31 mg/g at optimized parameters: distance of 2.54 cm above the bottom of the bath, 7 cm diameter of vessel, flat bottom vessel, 6.35 cm liquid height, 122 W input power and 25 kHz frequency. The present work indicates that the position and depth of vessel in an ultrasonic bath, diameter and shape of a vessel, frequency and input power have significant effect on the extraction yield. This work can be used as a base for all ultrasonic baths to obtain maximum efficiency for ultrasound assisted extraction. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Bottom Trawl Survey Protocol Development (HB0706, EK60)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Cruise objectives include: 1) Investigate performance characteristics of new research bottom trawl; 2) Develop standard operating procedures for the NEFSC Bottom...

  7. Bottom and top physics

    International Nuclear Information System (INIS)

    Foley, K.J.; Fridman, A.; Gilman, F.J.; Herten, G.; Hinchliffe, I.; Jawahery, A.; Sanda, A.; Schmidt, M.P.; Schubert, K.R.

    1987-09-01

    The production of bottom quarks at the SSC and the formalism and phenomenology of observing CP violation in B meson decays are discussed. The production of a heavy t quark which decays into a real W boson, and what we might learn from its decays, is examined

  8. Small Engines as Bottoming Cycle Steam Expanders for Internal Combustion Engines

    OpenAIRE

    Weerasinghe, Rohitha; Hounsham, Sandra

    2017-01-01

    Heat recovery bottoming cycles for internal combustion engines have opened new avenues for research into small steam expanders [1]. Dependable data on small steam expanders will allow us to assess their suitability as bottoming cycle engines and the fuel economy achieved by using them as bottoming cycles. Wankel engines, with their lower resistance properties at small scale, are excellent contenders for bottoming cycle expanders. The present paper is based on results of experiments carried ...

  9. Dissociable effects of top-down and bottom-up attention during episodic encoding

    Science.gov (United States)

    Uncapher, Melina R.; Hutchinson, J. Benjamin; Wagner, Anthony D.

    2011-01-01

    It is well established that the formation of memories for life’s experiences—episodic memory—is influenced by how we attend to those experiences, yet the neural mechanisms by which attention shapes episodic encoding are still unclear. We investigated how top-down and bottom-up attention contribute to memory encoding of visual objects in humans by manipulating both types of attention during functional magnetic resonance imaging (fMRI) of episodic memory formation. We show that dorsal parietal cortex—specifically, intraparietal sulcus (IPS)—was engaged during top-down attention and was also recruited during the successful formation of episodic memories. By contrast, bottom-up attention engaged ventral parietal cortex—specifically, temporoparietal junction (TPJ)—and was also more active during encoding failure. Functional connectivity analyses revealed further dissociations in how top-down and bottom-up attention influenced encoding: while both IPS and TPJ influenced activity in perceptual cortices thought to represent the information being encoded (fusiform/lateral occipital cortex), they each exerted opposite effects on memory encoding. Specifically, during a preparatory period preceding stimulus presentation, a stronger drive from IPS was associated with a higher likelihood that the subsequently attended stimulus would be encoded. By contrast, during stimulus processing, stronger connectivity with TPJ was associated with a lower likelihood the stimulus would be successfully encoded. These findings suggest that during encoding of visual objects into episodic memory, top-down and bottom-up attention can have opposite influences on perceptual areas that subserve visual object representation, suggesting that one manner in which attention modulates memory is by altering the perceptual processing of to-be-encoded stimuli. PMID:21880922

  10. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    Energy Technology Data Exchange (ETDEWEB)

    Çakır, Tunahan, E-mail: tcakir@gyte.edu.tr [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Khatibipour, Mohammad Jafar [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Department of Chemical Engineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey)

    2014-12-03

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of the condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as the optimization-based Flux-Balance Analysis to elucidate the active network. The other approach starts from the condition-specific metabolome data, and processes the data with statistical or optimization-based methods to extract information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, they can both benefit from each other paving the way for the novel integrative analysis methods of metabolome data- and flux-analysis approaches in the post-genomic era. This study reviews the strengths of constraint-based analysis and network inference methods reported in the metabolic systems biology field; then elaborates on the potential paths to reconcile the two approaches to shed better light on how the metabolism functions.

  11. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    International Nuclear Information System (INIS)

    Çakır, Tunahan; Khatibipour, Mohammad Jafar

    2014-01-01

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of the condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as the optimization-based Flux-Balance Analysis to elucidate the active network. The other approach starts from the condition-specific metabolome data, and processes the data with statistical or optimization-based methods to extract information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, they can both benefit from each other paving the way for the novel integrative analysis methods of metabolome data- and flux-analysis approaches in the post-genomic era. This study reviews the strengths of constraint-based analysis and network inference methods reported in the metabolic systems biology field; then elaborates on the potential paths to reconcile the two approaches to shed better light on how the metabolism functions.

  12. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who, did what, when, where, why, and how in Indonesian news articles was extracted by combining a Convolutional Neural Network and a Bidirectional Long Short-Term Memory network. A Convolutional Neural Network can learn semantically meaningful representations of sentences. A Bidirectional LSTM can analyze the relations among words in the sequence. We also use the word2vec word embedding for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms shallower methods, namely IBk, C4.5, and Naïve Bayes, with F-measures of 0.655, 0.645, and 0.595, respectively.
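
The CNN half of such a pipeline builds sentence representations by sliding filters over word embeddings and max-pooling over time. A single-filter toy in pure Python, where tiny hand-written vectors stand in for word2vec embeddings (the BiLSTM half and all training are omitted):

```python
def conv1d_max(embeddings, kernel):
    """Single-filter 1D convolution over a word-embedding sequence,
    followed by max-over-time pooling: slide the kernel across token
    windows, take a dot product at each position, keep the maximum."""
    k = len(kernel)          # kernel width in tokens
    dim = len(kernel[0])     # embedding dimension
    scores = []
    for i in range(len(embeddings) - k + 1):
        s = sum(kernel[j][d] * embeddings[i + j][d]
                for j in range(k) for d in range(dim))
        scores.append(s)
    return max(scores)       # max-over-time pooling

# Toy 2-dimensional "embeddings" for a 4-token sentence (illustrative only).
sent = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]]
kern = [[0.5, 0.5], [0.5, 0.5]]   # one width-2 filter
feat = conv1d_max(sent, kern)
print(feat)  # → 1.5 (the window covering tokens 2-3 scores highest)
```

In the real system many such filters produce a feature vector per sentence, which is then fed with the BiLSTM outputs into the 5W1H classifier.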

  13. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients’ comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respiratory disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respiratory disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleeping laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
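
One widely used family of ECG-derived respiration techniques exploits the breath-by-breath modulation of R-peak amplitude; the thesis's specific methods are not reproduced here. A crude, self-contained sketch with synthetic R-peaks and a zero-crossing rate estimate (all signal parameters are made up):

```python
import math

def resp_rate_from_rpeaks(times, amps):
    """Estimate respiratory rate (breaths/min) from R-peak amplitude
    modulation: remove the mean, count rising zero-crossings of the
    amplitude series (roughly one per breath), divide by record length.
    A deliberately crude stand-in for real ECG-derived respiration."""
    mean = sum(amps) / len(amps)
    centered = [a - mean for a in amps]
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    duration_min = (times[-1] - times[0]) / 60.0
    return crossings / duration_min

# Synthetic data: 60 bpm heart rate (one R-peak per second for 60 s),
# with amplitude modulated at 0.25 Hz, i.e. 15 breaths/min.
times = [float(i) for i in range(60)]
amps = [1.0 + 0.1 * math.sin(2 * math.pi * 0.25 * t + 0.1) for t in times]

r = resp_rate_from_rpeaks(times, amps)
print(round(r, 1))  # ~14 breaths/min (edge effects bias the crude count low)
```

Real methods filter the amplitude series and handle irregular beat timing, but the underlying idea is the same: respiration leaves a slow, recoverable modulation on the ECG.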

  14. Peabody Western Coal cuts costs with bottom-dump haulers

    Energy Technology Data Exchange (ETDEWEB)

    Perla, S.; Baecker, G.; Morgan, W. [Empire Machinery, Mesa, AZ (United States)

    1995-04-01

    A new hauling concept has been introduced at the Black Mesa and Kayenta coal mines of the Peabody Western Coal Co. in northern Arizona, USA. The article describes the switch from Caterpillar 992 wheel loaders with 136 t bottom-dump trucks to 272 t bottom-dump trucks. Cat 789 off-highway trucks were modified to pull bottom-dump trucks. Haulage costs per ton of coal and cost per ton-mile have fallen significantly since the introduction of the new large hauling method. 7 figs., 2 photos.

  15. Spreading of Antarctic Bottom Water in the Atlantic Ocean

    OpenAIRE

    Morozov, E.; Tarakanov, R. Y.; Zenk, Walter

    2012-01-01

    This paper describes the transport of bottom water from its source region in the Weddell Sea through the abyssal channels of the Atlantic Ocean. The research brings together recent observations and historical data. A strong flow of Antarctic Bottom Water through the Vema Channel is analyzed. The mean speed of the flow is 30 cm/s. A temperature increase, which has now been observed for 30 years, was found in the deep Vema Channel. The flow of bottom water in the northern part of the Bra...

  16. Search for scalar bottom quarks from gluino decays in collisions at.

    Science.gov (United States)

    Abulencia, A; Acosta, D; Adelman, J; Affolder, T; Akimoto, T; Albrow, M G; Ambrose, D; Amerio, S; Amidei, D; Anastassov, A; Anikeev, K; Annovi, A; Antos, J; Aoki, M; Apollinari, G; Arguin, J-F; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Azfar, F; Azzi-Bacchetta, P; Azzurri, P; Bacchetta, N; Bachacou, H; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Baroiant, S; Bartsch, V; Bauer, G; Bedeschi, F; Behari, S; Belforte, S; Bellettini, G; Bellinger, J; Belloni, A; Ben-Haim, E; Benjamin, D; Beretvas, A; Beringer, J; Berry, T; Bhatti, A; Binkley, M; Bisello, D; Bishai, M; Blair, R E; Blocker, C; Bloom, K; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bolshov, A; Bortoletto, D; Boudreau, J; Bourov, S; Boveia, A; Brau, B; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Byrum, K L; Cabrera, S; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carlsmith, D; Carosi, R; Carron, S; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chapman, J; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, I; Cho, K; Chokheli, D; Chou, J P; Chu, P H; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Ciljak, M; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Coca, M; Connolly, A; Convery, M E; Conway, J; Cooper, B; Copic, K; Cordelli, M; Cortiana, G; Cruz, A; Cuevas, J; Culbertson, R; Cyr, D; DaRonco, S; D'Auria, S; D'onofrio, M; Dagenhart, D; de Barbaro, P; De Cecco, S; Deisher, A; De Lentdecker, G; Dell'Orso, M; Demers, S; Demortier, L; Deng, J; Deninno, M; De Pedis, D; Derwent, P F; Dionisi, C; Dittmann, J R; Dituro, P; Dörr, C; Dominguez, A; Donati, S; Donega, M; Dong, P; Donini, J; Dorigo, T; Dube, S; Ebina, K; Efron, J; Ehlers, J; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, I; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Field, R; Flanagan, G; Flores-Castillo, L R; Foland, A; Forrester, S; Foster, G 
W; Franklin, M; Freeman, J C; Fujii, Y; Furic, I; Gajjar, A; Gallinaro, M; Galyardt, J; Garcia, J E; Garcia Sciverez, M; Garfinkel, A F; Gay, C; Gerberich, H; Gerchtein, E; Gerdes, D; Giagu, S; di Giovanni, G P; Giannetti, P; Gibson, A; Gibson, K; Ginsburg, C; Giokaris, N; Giolo, K; Giordani, M; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Goldstein, J; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Gotra, Y; Goulianos, K; Gresele, A; Griffiths, M; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Haber, C; Hahn, S R; Hahn, K; Halkiadakis, E; Hamilton, A; Han, B-Y; Handler, R; Happacher, F; Hara, K; Hare, M; Harper, S; Harr, R F; Harris, R M; Hatakeyama, K; Hauser, J; Hays, C; Hayward, H; Heijboer, A; Heinemann, B; Heinrich, J; Hennecke, M; Herndon, M; Heuser, J; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Holloway, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Huston, J; Ikado, K; Incandela, J; Introzzi, G; Iori, M; Ishizawa, Y; Ivanov, A; Iyutin, B; James, E; Jang, D; Jayatilaka, B; Jeans, D; Jensen, H; Jeon, E J; Jones, M; Joo, K K; Jun, S Y; Junk, T R; Kamon, T; Kang, J; Karagoz-Unel, M; Karchin, P E; Kato, Y; Kemp, Y; Kephart, R; Kerzel, U; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, M S; Kim, S B; Kim, S H; Kim, Y K; Kirby, M; Kirsch, L; Klimenko, S; Klute, M; Knuteson, B; Ko, B R; Kobayashi, H; Kondo, K; Kong, D J; Konigsberg, J; Kordas, K; Korytov, A; Kotwal, A V; Kovalev, A; Kraus, J; Kravchenko, I; Kreps, M; Kreymer, A; Kroll, J; Krumnack, N; Kruse, M; Krutelyov, V; Kuhlmann, S E; Kusakabe, Y; Kwang, S; Laasanen, A T; Lai, S; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecci, C; Lecompte, T; Lee, J; Lee, J; Lee, S W; Lefèvre, R; Leonardo, N; Leone, S; Levy, S; Lewis, J D; Li, K; Lin, C; Lin, C S; Lindgren, M; Lipeles, E; Liss, T M; Lister, A; Litvintsev, D O; Liu, T; Liu, Y; 
Lockyer, N S; Loginov, A; Loreti, M; Loverre, P; Lu, R-S; Lucchesi, D; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; Mack, P; MacQueen, D; Madrak, R; Maeshima, K; Maksimovic, P; Manca, G; Margaroli, F; Marginean, R; Marino, C; Martin, A; Martin, M; Martin, V; Martínez, M; Maruyama, T; Matsunaga, H; Mattson, M E; Mazini, R; Mazzanti, P; McFarland, K S; McGivern, D; McIntyre, P; McNamara, P; McNulty, R; Mehta, A; Menzemer, S; Menzione, A; Merkel, P; Mesropian, C; Messina, A; von der Mey, M; Miao, T; Miladinovic, N; Miles, J; Miller, R; Miller, J S; Mills, C; Milnik, M; Miquel, R; Miscetti, S; Mitselmakher, G; Miyamoto, A; Moggi, N; Mohr, B; Moore, R; Morello, M; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Mulhearn, M; Muller, Th; Mumford, R; Munar, A; Murat, P; Nachtman, J; Nahn, S; Nakano, I; Napier, A; Naumov, D; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nigmanov, T; Nodulman, L; Norniella, O; Ogawa, T; Oh, S H; Oh, Y D; Okusawa, T; Oldeman, R; Orava, R; Osterberg, K; Pagliarone, C; Palencia, E; Paoletti, R; Papadimitriou, V; Papikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Piedra, J; Pitts, K; Plager, C; Pondrom, L; Pope, G; Portell, X; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Rakitin, A; Rappoccio, S; Ratnikov, F; Reisert, B; Rekovic, V; van Remortel, N; Renton, P; Rescigno, M; Richter, S; Rimondi, F; Rinnert, K; Ristori, L; Robertson, W J; Robson, A; Rodrigo, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Rott, C; Ruiz, A; Russ, J; Rusu, V; Ryan, D; Saarikko, H; Sabik, S; Safonov, A; Sakumoto, W K; Salamanna, G; Salto, O; Saltzberg, D; Sanchez, C; Santi, L; Sarkar, S; Sato, K; Savard, P; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; 
Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Semeria, F; Sexton-Kennedy, L; Sfiligoi, I; Shapiro, M D; Shears, T; Shepard, P F; Sherman, D; Shimojima, M; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Siegrist, J L; Sill, A; Sinervo, P; Sisakyan, A; Sjolin, J; Skiba, A; Slaughter, A J; Sliwa, K; Smirnov, D; Smith, J R; Snider, F D; Snihur, R; Soderberg, M; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spinella, F; Squillacioti, P; Stanitzki, M; Staveris-Polykalas, A; St Dennis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Sumorok, K; Sun, H; Suzuki, T; Taffard, A; Tafirout, R; Takashima, R; Takeuchi, Y; Takikawa, K; Tanaka, M; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Tether, S; Thom, J; Thompson, A S; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Tönnesmann, M; Torre, S; Torretta, D; Tourneur, S; Trischuk, W; Tsuchiya, R; Tsuno, S; Turini, N; Ukegawa, F; Unverhau, T; Uozumi, S; Usynin, D; Vacavant, L; Vaiciulis, A; Vallecorsa, S; Varganov, A; Vataga, E; Velev, G; Veramendi, G; Veszpremi, V; Vickey, T; Vidal, R; Vila, I; Vilar, R; Vollrath, I; Volobouev, I; Würthwein, F; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wallny, R; Walter, T; Wan, Z; Wang, M J; Wang, S M; Warburton, A; Ward, B; Waschke, S; Waters, D; Watts, T; Weber, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Worm, S; Wright, T; Wu, X; Wynne, S M; Yagil, A; Yamamoto, K; Yamaoka, J; Yamashita, Y; Yang, C; Yang, U K; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zetti, F; Zhang, X; Zhou, J; Zucchelli, S

    2006-05-05

    We searched for scalar bottom quarks in 156 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV recorded by the Collider Detector at Fermilab (CDF II) at the Tevatron. Scalar bottom quarks can be produced from gluino decays in R-parity conserving models of supersymmetry when the mass of the gluino exceeds that of the scalar bottom quark. Then, a scalar bottom quark can decay into a bottom quark and a neutralino. To search for this scenario, we investigated events with large missing transverse energy and at least three jets, two or more of which were identified as containing a secondary vertex from the hadronization of b quarks. We found four candidate events, where 2.6 +/- 0.7 are expected from standard model processes, and placed 95% confidence level lower limits on gluino and scalar bottom quark masses of up to 280 and 240 GeV/c², respectively.

  17. Fuel Summary for Peach Bottom Unit 1 High-Temperature Gas-Cooled Reactor Cores 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Karel I. Kingrey

    2003-04-01

    This fuel summary report contains background and summary information for the Peach Bottom Unit 1, High-Temperature, Gas-Cooled Reactor Cores 1 and 2. This report contains detailed information about the fuel in the two cores, the Peach Bottom Unit 1 operating history, nuclear parameters, physical and chemical characteristics, and shipping and storage canister related data. The data in this document have been compiled from a large number of sources and are not qualified beyond the qualification of the source documents. This report is intended to provide an overview of the existing data pertaining to spent fuel management and point to pertinent reference source documents. For design applications, the original source documentation must be used. While all referenced sources are available as records or controlled documents at the Idaho National Engineering and Environmental Laboratory (INEEL), some of the sources were marked as informal or draft reports. This is noted where applicable. In some instances, source documents are not consistent. Where they are known, this document identifies those instances and provides clarification where possible. However, as stated above, this document has not been independently qualified and such clarifications are only included for information purposes. Some of the information in this summary is available in multiple source documents. An effort has been made to clearly identify at least one record document as the source for the information included in this report.

  18. Addressing the Misuse Potential of Life Science Research-Perspectives From a Bottom-Up Initiative in Switzerland.

    Science.gov (United States)

    Oeschger, Franziska M; Jenal, Ursula

    2018-01-01

    Codes of conduct have received wide attention as a bottom-up approach to foster responsibility for dual use aspects of life science research within the scientific community. In Switzerland, a series of discussion sessions led by the Swiss Academy of Sciences with over 40 representatives of most Swiss academic life science research institutions has revealed that while a formal code of conduct was considered too restrictive, a bottom-up approach toward awareness raising and education and demonstrating scientists' responsibility toward society was highly welcomed. Consequently, an informational brochure on "Misuse potential and biosecurity in life sciences research" was developed to provide material for further discussions and education.

  19. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there are limited works on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining the generic and geology terms from geology dictionaries to train Chinese word segmentation rules of the Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to get a corpus constituted of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected the chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
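
    After segmentation and stop-word removal, the bigram statistics behind such a knowledge graph reduce to counting adjacent content-word pairs and treating them as weighted edges. A minimal sketch follows, with a toy, pre-segmented English corpus standing in for the Chinese segmentation output; the CRF segmenter itself is out of scope here.

```python
from collections import Counter

def bigram_edges(docs, stopwords):
    """Count adjacent content-word pairs; each pair is a weighted graph edge."""
    edges = Counter()
    for doc in docs:
        content = [w for w in doc if w not in stopwords]
        edges.update(zip(content, content[1:]))
    return edges

# Toy stand-in for segmented geoscience text.
docs = [
    ["the", "granite", "intrusion", "hosts", "the", "copper", "deposit"],
    ["copper", "deposit", "in", "granite", "intrusion"],
]
stopwords = {"the", "in", "hosts"}
graph = bigram_edges(docs, stopwords)
print(graph[("copper", "deposit")])  # 2
```

    The resulting counter maps directly onto a bigram graph: keys are edges, counts are edge weights, and the union of key elements gives the node set.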

  20. Bottom friction. A practical approach to modelling coastal oceanography

    Science.gov (United States)

    Bolanos, Rodolfo; Jensen, Palle; Kofoed-Hansen, Henrik; Tornsfeldt Sørensen, Jacob

    2017-04-01

    Coastal processes involve the interaction of the atmosphere, the sea, the coastline and the bottom. Spatial gradients in this zone are normally large, induced by orographic and bathymetric features. Although it is nowadays possible to obtain high-resolution bathymetry, details of the seabed, e.g. sediment type and the presence of biological material and living organisms, are not available. Moreover, these properties, as well as the bathymetry itself, can be highly dynamic. These bottom characteristics are very important for describing the boundary layer of currents and waves, and they control to a large degree the dissipation of flows. Bottom friction is thus typically a calibration parameter in numerical modelling of coastal processes. In this work, we assess this process and put it into the context of the other physical-process uncertainties influencing wind waves and currents in coastal areas. A case study in the North Sea is used, particularly the west coast of Denmark, where water depths of less than 30 m cover a wide fringe along the coast in which several offshore wind farm developments are being carried out. We use the hydrodynamic model MIKE 21 HD and the spectral wave model MIKE 21 SW to simulate atmosphere- and tide-induced flows and wind-wave generation and propagation. Both models represent the state of the art and have been developed for flexible meshes, which are ideal for coastal oceanography as they better represent coastlines and allow variable spatial resolution within the domain. Sensitivity tests of bottom friction formulations are carried out in the context of other processes (e.g. model forcing uncertainties, wind and wave interactions, the wind drag coefficient). Additionally, a map of varying bottom properties is generated based on a literature survey to explore the impact of the spatial variability. Different approaches are assessed in order to establish a best practice for bottom friction in coastal oceanographic modelling.
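
    As a concrete illustration of why bottom friction is a natural calibration handle, the quadratic drag law commonly used in depth-averaged models ties bed shear stress to a roughness-dependent coefficient. The sketch below uses a generic textbook Manning formulation with assumed illustrative values, not the MIKE 21 implementation, and shows how strongly the stress responds to the roughness coefficient.

```python
import numpy as np

RHO, G = 1025.0, 9.81  # seawater density (kg/m^3), gravitational acceleration (m/s^2)

def bed_shear_stress(u, h, n):
    """Quadratic bottom friction with a Manning roughness coefficient n.

    tau_b = rho * g * n^2 * u * |u| / h**(1/3)   [N/m^2]
    """
    return RHO * G * n**2 * u * np.abs(u) / h ** (1.0 / 3.0)

# Illustrative values: a 1 m/s depth-averaged current in 20 m of water,
# with Manning's n spanning a plausible calibration range.
for n in (0.02, 0.03, 0.04):
    print(n, bed_shear_stress(1.0, 20.0, n))
```

    Because the stress scales with n², doubling the roughness coefficient quadruples the dissipation, which is why the calibration choice matters so much in shallow coastal fringes.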

  1. SPREADING OF ANTARCTIC BOTTOM WATER IN THE ATLANTIC OCEAN

    Directory of Open Access Journals (Sweden)

    Eugene Morozov

    2012-01-01

    This paper describes the transport of bottom water from its source region in the Weddell Sea through the abyssal channels of the Atlantic Ocean. The research brings together the recent observations and historical data. A strong flow of Antarctic Bottom Water through the Vema Channel is analyzed. The mean speed of the flow is 30 cm/s. A temperature increase was found in the deep Vema Channel, which has been observed for 30 years already. The flow of bottom water in the northern part of the Brazil Basin splits. Part of the water flows through the Romanche and Chain fracture zones. The other part flows to the North American Basin. Part of the latter flow propagates through the Vema Fracture Zone into the Northeast Atlantic. The properties of bottom water in the Kane Gap and Discovery Gap are also analyzed.

  2. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data, previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping was employed, which provided a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments, and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
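
    The least-squares step of such an unfolding can be illustrated compactly: given a response matrix R (count rate per unit fluence, spheres × energy groups) and measured count rates c, the group fluences φ minimize ‖Rφ − c‖². The sketch below is a generic illustration with a made-up 3×3 response matrix; it is not the authors' response data, and it omits their fuzzy trapezoidal binning.

```python
import numpy as np

def unfold_fluences(response, count_rates):
    """Least-squares neutron fluence unfolding: solve R @ phi ≈ c."""
    phi, *_ = np.linalg.lstsq(response, count_rates, rcond=None)
    return phi

# Hypothetical response matrix: rows = detectors (bare, small sphere, large
# sphere), columns = energy groups (slow, moderate, fast); units arbitrary.
R = np.array([
    [0.9, 0.3, 0.1],
    [0.4, 0.8, 0.5],
    [0.1, 0.4, 0.9],
])
phi_true = np.array([2.0, 1.0, 3.0])
c = R @ phi_true                      # simulated, noise-free count rates
print(unfold_fluences(R, c))         # recovers phi_true
```

    With noisy count rates the same call returns the least-squares (rather than exact) solution, which is where the background stripping and improved response curves described above pay off.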

  3. Developmental toxicity of clarified slurry oil, syntower bottoms, and distillate aromatic extract administered as a single oral dose to pregnant rats

    Energy Technology Data Exchange (ETDEWEB)

    Feuston, M.H.; Mackerer, C.R. [Stonybrook Labs., Princeton, NJ (United States)

    1996-09-01

    Clarified slurry oil (CSO), syntower bottoms (STB), and distillate aromatic extract (DAE) are refinery streams produced by processing crude oil. Available data indicate that some refinery streams are developmentally toxic by the dermal route of exposure. However, there is no conclusive evidence for their being teratogenic. The present studies were designed to further explore the suspected teratogenic potency of refinery streams while at the same time limiting embryolethality. In general, evidence of maternal toxicity (i.e., decreased body weight gain, decreased thymus weight) was observed at doses greater than or equal to 500 mg/kg. For each refinery stream tested, the incidence of resorption was greatest on GD 11. A common pattern of fetal malformations was observed for all of the refinery streams tested and included cleft palate, diaphragmatic hernia, and paw and tail defects. The incidence and type of malformation observed were influenced by the gestation day of exposure. The incidences of external and skeletal malformations were greatest on GD 11 and 12 for fetuses exposed to CSO; on GD 13 and 14, the incidence of malformation was comparable for CSO- and STB-exposed fetuses. The incidence of visceral anomalies was greatest on GD 11-13 for fetuses exposed to CSO and STB; on GD 14, the incidence was comparable for each of the refinery streams tested. In general, the ability to produce adverse effects on development was greatest for CSO and least for DAE. Effects produced by STB were comparable to or less severe than those observed for CSO. 24 refs., 11 tabs.

  4. Bottom-water observations in the Vema fracture zone

    Science.gov (United States)

    Eittreim, Stephen L.; Biscaye, Pierre E.; Jacobs, Stanley S.

    1983-03-01

    The Vema fracture zone trough, at 11°N between 41° and 45°W, is open to the west at the 5000-m level but is silled at the 4650-m level on the east where it intersects the axis of the Mid-Atlantic Ridge. The trough is filled with Antarctic Bottom Water (AABW) with a potential temperature of 1.32°C and a salinity of 34.82 ppt. The bottom water is thermally well mixed in a nearly homogeneous layer about 700 m thick. The great thickness of this bottom layer, as compared with the bottom-water structure of the western Atlantic basin, may result from enhanced mixing induced by topographic constriction at the west end of the fracture zone trough. A benthic thermocline, with potential temperature gradients of about 1.2 mdeg m⁻¹, is associated with an abrupt increase in turbidity with depth at about 1200 m above bottom. A transitional layer of more moderate temperature gradients, about 0.4 mdeg m⁻¹, lies between the benthic thermocline above and the AABW below. The AABW layer, whose depth-averaged suspended particulate concentrations range from 8 to 19 μg L⁻¹, is consistently higher in turbidity than the overlying waters. At the eastern end of the trough, 140 m below sill depth, very low northeastward current velocities, with maxima of 3 cm s⁻¹, were recorded over an 11-day period.

  5. Analytics that Inform the University: Using Data You Already Have

    Science.gov (United States)

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  6. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat
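
    The bottom-up step, grouping stations by biological similarity before looking at the environment, is often built on a dissimilarity measure such as Bray–Curtis. A minimal sketch on fabricated abundance data follows; it is illustrative only and does not reproduce the study's actual clustering or statistical tests.

```python
import numpy as np

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance vectors (0 = identical)."""
    return 1.0 - 2.0 * np.minimum(x, y).sum() / (x.sum() + y.sum())

# Fabricated macrofaunal counts: rows = stations, columns = taxa.
stations = np.array([
    [30, 5, 0, 1],   # station A: sandy-bottom assemblage
    [28, 6, 1, 0],   # station B: similar to A
    [1, 0, 25, 20],  # station C: muddy-bottom assemblage
])
d_ab = bray_curtis(stations[0], stations[1])
d_ac = bray_curtis(stations[0], stations[2])
print(d_ab, d_ac)  # within-habitat pair is far less dissimilar
```

    Stations that group together under such a measure become candidate habitat map units, whose boundaries are then tested against sediment, depth, and other abiotic layers.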

  7. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
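
    The concept-finder/negation-detector pipeline described above can be caricatured in a few lines: split the report into sentences, scan each for lexicon terms, and check a small window before each hit for negation cues. This sketch uses a tiny invented lexicon and a fixed three-word window; it is not the actual BI-RADS semantic grammar or the authors' code.

```python
import re

LEXICON = {"mass", "calcification", "asymmetry"}   # toy stand-in for BI-RADS terms
NEGATIONS = {"no", "without", "denies"}

def extract_concepts(report):
    """Return (concept, negated?) pairs found in a free-text report."""
    findings = []
    for sentence in re.split(r"[.!?]", report.lower()):   # crude sentence split
        words = re.findall(r"[a-z]+", sentence)
        for i, w in enumerate(words):
            if w in LEXICON:
                # Negation check: any cue in the three words preceding the hit.
                negated = any(t in NEGATIONS for t in words[max(0, i - 3):i])
                findings.append((w, negated))
    return findings

report = "There is a spiculated mass in the left breast. No calcification is seen."
print(extract_concepts(report))  # [('mass', False), ('calcification', True)]
```

    A production system would add the semantic grammar, scoped negation (not a fixed window), and the ultrasound-concept filtering mentioned in the abstract.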

  8. A resting bottom sodium cooled fast reactor

    International Nuclear Information System (INIS)

    Costes, D.

    2012-01-01

    This follows ICAPP 2011 paper 11059, 'Fast Reactor with a Cold Bottom Vessel', on sodium-cooled reactor vessels in a thermal gradient, resting on the soil. Sodium is frozen at the vessel bottom plate, with temperature increasing toward the top. The vault cover rests on the safety vessel; the core diagrid, welded to a toric collector, forms a slab supported by skirts resting on the bottom plate. Intermediate exchangers and pumps, fixed on the cover, plunge into the collector. At the vessel top, a skirt hanging from the cover plunges into sodium, leaving a thin circular slit partially filled by sodium covered by argon; this provides leak-tightness, allows vessel dilatation, and gives radial relative holding due to sodium inertia. No 'air conditioning' at 400 deg. C is needed, as it is for hanging vessels, and this allows a large economy. The sodium volume below the slab contains insulating refractory elements that would stop a hypothetical corium flow. The small gas volume around the vessel limits any LOCA. The liner cooling system of the concrete safety vessel may contribute to reactor cooling. The cold resting-bottom vessel, proposed by the author for many years, could avoid the complete visual inspection required for hanging vessels. However, a double vessel containing support skirts would allow the introduction of inspection devices. The stress-limiting thermal gradient is obtained by filling the intermediate space with secondary sodium. (authors)

  9. Improving the extraction-and-loading process in the open mining operations

    Directory of Open Access Journals (Sweden)

    Cheban A. Yu.

    2017-09-01

    Blasting is the main way to prepare solid rock for excavation, and it results in a rock mass of uneven granulometric composition, which makes it impossible to use conveyor quarry transport without preliminary coarse crushing of the blasted rock mass. The greatest technical and economic effect is achieved by full conveyorization of quarry transport, which ensures sequenced-flow transport operations, automation of management, and high labor productivity. Extraction-and-loading machines determine the performance of the mining and transport machinery in the technological flow of the quarry. When extracting a blasted rock mass with single-bucket excavators or loaders working in combination with bottom-hole conveyors, self-propelled crushing and reloading units of various designs are used to grind large fragments to fractions of conditioned size. The presence of a crushing and reloading unit in the pit face along with the excavator requires additional space, complicates the maneuvering of equipment in the pit face, and increases the number of personnel and the cost of the extraction-and-reloading operations. The article proposes an improved method for carrying out the extraction-and-loading process, as well as the design of an extraction-and-grinding unit based on a quarry hydraulic excavator. The proposed design converts the cyclic process of scooping the rock mass into a continuous process of loading it onto the bottom-hole conveyor. Using the extraction-and-grinding unit makes it possible to combine the processes of excavation, preliminary crushing, and loading of the rock mass, which increases the efficiency of mining operations.

  10. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  11. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the
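
    At its simplest, the motion information that optical-flow methods recover can be illustrated with exhaustive block matching: find the integer displacement that best aligns two consecutive frames. The sketch below is a toy global-translation estimator on synthetic frames, far simpler than the dense optical-flow and MPEG-compressed-domain methods used at JET.

```python
import numpy as np

def estimate_shift(frame_a, frame_b, max_shift=4):
    """Exhaustive search for the integer (dy, dx) that minimizes the sum of
    squared differences between frame_b and a circularly shifted frame_a."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.sum((np.roll(frame_a, (dy, dx), axis=(0, 1)) - frame_b) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, (2, -3), axis=(0, 1))   # a known "motion" of (2, -3)
print(estimate_shift(frame_a, frame_b))  # (2, -3)
```

    Dense optical flow generalizes this idea to a per-pixel displacement field under a brightness-constancy assumption, which is what makes tracking of pellets, filaments and other moving structures possible.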

  12. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking the Bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are expounded. Quickbird high spatial resolution satellite data are used to extract the prospecting information of the uranium deposit in the Bashibulake area in the north of the Tarim Basin. By using pertinent image processing methods, information on the ore-bearing bed, ore-controlling structure and mineralized alteration has been extracted. The results show high consistency with the field survey. The aim of this study is to explore the practicability of high spatial resolution satellite data for mineral prospecting, and to broaden approaches to prospecting in similar areas. (authors)
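Band ratioing is a common way to highlight mineralized alteration in high-resolution multispectral data. The toy below, with invented two-band values and an invented threshold, sketches only the idea, not the processing chain used in the paper.

```python
# Band-ratio alteration mapping on a synthetic 4x4 two-band image.
# Bands, ratio choice and threshold are illustrative assumptions.

band_red = [[80, 82, 150, 160], [78, 81, 155, 158],
            [79, 80, 152, 157], [80, 83, 151, 159]]
band_blue = [[76, 80, 60, 62], [75, 79, 61, 60],
             [77, 78, 59, 61], [76, 80, 62, 60]]

# Per-pixel ratio; altered ground shows a high red/blue ratio here.
ratio = [[r / b for r, b in zip(rr, bb)] for rr, bb in zip(band_red, band_blue)]
altered = [[1 if v > 1.5 else 0 for v in row] for row in ratio]
for row in altered:
    print(row)  # 1 marks pixels flagged as altered
```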

  13. Bottom head failure program plan

    International Nuclear Information System (INIS)

    Meyer, R.O.

    1989-01-01

    Earlier this year the NRC staff presented a Revised Severe Accident Research Program Plan (SECY-89-123) to the Commission and initiated work on that plan. Two of the near-term issues in that plan involve failure of the bottom head of the reactor pressure vessel. These two issues are (1) depressurization and DCH and (2) BWR Mark I Containment Shell Melt-through. ORNL has developed models for several competing failure mechanisms for BWRs. INEL has performed analytical and experimental work directly related to bottom head failure in connection with several programs. SNL has conducted a number of analyses and experimental activities to examine the failure of LWR vessels. In addition to the government-sponsored work mentioned above, EPRI and FAI performed studies on vessel failure for the Industry Degraded Core Rulemaking Program (IDCOR). EPRI examined the failure of a PWR vessel bottom head without penetrations, as found in some Combustion Engineering reactors. To give more attention to this subject, as called for by the revised Severe Accident Research Plan, two things are being done. First, work previously done is being reviewed carefully to develop an overall picture and to determine the reliability of assumptions used in those studies. Second, new work is being planned for FY90 to try to complete a reasonable understanding of the failure process. The review and planning are being done in close cooperation with the ACRS. Results of this exercise will be presented in this paper.

  14. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, which can be used for subsequent inversion as an ideal starting model. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages, while the difference is that its ultralow frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency domain inversion is also very convenient. However, because broad frequency bands are often used in pure time domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to the traditional time domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result given a linear initial model and records without low-frequency information.

  15. Top-down and bottom-up lipidomic analysis of rabbit lipoproteins under different metabolic conditions using flow field-flow fractionation, nanoflow liquid chromatography and mass spectrometry.

    Science.gov (United States)

    Byeon, Seul Kee; Kim, Jin Yong; Lee, Ju Yong; Chung, Bong Chul; Seo, Hong Seog; Moon, Myeong Hee

    2015-07-31

    This study demonstrated the performance of top-down and bottom-up approaches in lipidomic analysis of lipoproteins from rabbits raised under different metabolic conditions: healthy controls, carrageenan-induced inflammation, dehydration, high cholesterol (HC) diet, and highest cholesterol diet with inflammation (HCI). In the bottom-up approach, the high density lipoproteins (HDL) and the low density lipoproteins (LDL) were size-sorted and collected on a semi-preparative scale using a multiplexed hollow fiber flow field-flow fractionation (MxHF5), followed by nanoflow liquid chromatography-ESI-MS/MS (nLC-ESI-MS/MS) analysis of the lipids extracted from each lipoprotein fraction. In the top-down method, size-fractionated lipoproteins were directly infused into the MS for quantitative analysis of targeted lipids using chip-type asymmetrical flow field-flow fractionation-electrospray ionization-tandem mass spectrometry (cAF4-ESI-MS/MS) in selected reaction monitoring (SRM) mode. The comprehensive bottom-up analysis yielded 122 and 104 lipids from HDL and LDL, respectively. Rabbits within the HC and HCI groups had lipid patterns that contrasted most substantially with those of controls, suggesting that an HC diet significantly alters the lipid composition of lipoproteins. Among the identified lipids, 20 lipid species that exhibited large differences (>10-fold) were selected as targets for the top-down quantitative analysis in order to compare the results with those from the bottom-up method. Statistical comparison of the results from the two methods revealed that the results were not significantly different for most of the selected species, except for those species with only small differences in concentration between groups. The current study demonstrated that top-down lipid analysis using cAF4-ESI-MS/MS is a powerful high-speed analytical platform for targeted lipidomic analysis that does not require the extraction of lipids from blood samples. Copyright © 2015 Elsevier B

  16. Cascadia Initiative Ocean Bottom Seismograph Performance

    Science.gov (United States)

    Evers, B.; Aderhold, K.

    2017-12-01

    The Ocean Bottom Seismograph Instrument Pool (OBSIP) provided instrumentation and operations support for the Cascadia Initiative community experiment. This experiment investigated geophysical processes across the Cascadia subduction zone through a combination of onshore and offshore seismic data. The recovery of Year 4 instruments in September 2015 marked the conclusion of a multi-year experiment that utilized 60 ocean-bottom seismographs (OBSs) specifically designed for the subduction zone boundary, including shallow/deep water deployments and active fisheries. The new instruments featured trawl-resistant enclosures designed by Lamont-Doherty Earth Observatory (LDEO) and Scripps Institution of Oceanography (SIO) for shallow deployment [water depth ≤ 500 m], as well as new deep-water instruments designed by Woods Hole Oceanographic Institution (WHOI). Existing OBSIP instruments were also deployed along the Blanco Transform Fault and on the Gorda Plate through complementary experiments. Station instrumentation included weak and strong motion seismometers, differential pressure gauges (DPG) and absolute pressure gauges (APG). All data collected from the Cascadia, Blanco, and Gorda deployments are available through the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC). The Cascadia Initiative is the largest amphibious seismic experiment undertaken to date, encompassing a diverse technical implementation and demonstrating an effective structure for community experiments. Thus, the results from Cascadia serve as both a technical and operational resource for the development of future community experiments, such as might be contemplated as part of the SZ4D Initiative. To guide future efforts, we investigate and summarize the quality of the Cascadia OBS data using basic metrics such as instrument recovery and more advanced metrics such as noise characteristics through power spectral density analysis. We also use this broad and diverse
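Noise characteristics via power spectral density, as used above, are typically computed with Welch's averaged-periodogram method. Below is a stdlib-only sketch on a synthetic channel (sampling rate, tone frequency and noise level are invented; real OBS work would use a library such as ObsPy or scipy.signal.welch).

```python
import cmath
import math
import random

def welch_psd(x, seg_len, fs):
    """Averaged-periodogram PSD estimate (Welch's method: Hann window,
    50% overlap), the kind of noise metric used to characterise OBS data."""
    hop = seg_len // 2
    win = [0.5 - 0.5 * math.cos(2 * math.pi * i / (seg_len - 1))
           for i in range(seg_len)]
    norm = fs * sum(w * w for w in win)
    psd = [0.0] * (seg_len // 2 + 1)
    nseg = 0
    for start in range(0, len(x) - seg_len + 1, hop):
        seg = [x[start + i] * win[i] for i in range(seg_len)]
        for k in range(seg_len // 2 + 1):
            X = sum(seg[t] * cmath.exp(-2j * math.pi * k * t / seg_len)
                    for t in range(seg_len))
            psd[k] += abs(X) ** 2 / norm
        nseg += 1
    return [p / nseg for p in psd]

random.seed(0)
fs, f0 = 100.0, 12.5
# Sensor noise plus a narrow-band, microseism-like tone at 12.5 Hz.
x = [random.gauss(0, 0.1) + math.sin(2 * math.pi * f0 * t / fs)
     for t in range(2048)]
psd = welch_psd(x, 256, fs)
peak_bin = max(range(len(psd)), key=psd.__getitem__)
print(peak_bin * fs / 256)  # frequency of the spectral peak, ~12.5 Hz
```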

  17. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    Full Text Available OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected, relative to its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer, in which the SES was calculated using the HLM, was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES values, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
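The fixed-effect meta-analysis used above pools study effect sizes by inverse-variance weighting. A minimal sketch with invented study data (log relative risks and standard errors), not the review's actual numbers:

```python
import math

def fixed_effect_pool(log_rr, se):
    """Inverse-variance fixed-effect summary of effect sizes (here log
    relative risks), the computation behind the SES in the abstract above."""
    w = [1.0 / s ** 2 for s in se]
    ses = sum(wi * e for wi, e in zip(w, log_rr)) / sum(w)
    se_ses = math.sqrt(1.0 / sum(w))
    ci = (ses - 1.96 * se_ses, ses + 1.96 * se_ses)
    return ses, ci

# Three hypothetical studies: log RR and its standard error.
log_rr = [math.log(0.8), math.log(0.7), math.log(0.95)]
se = [0.15, 0.25, 0.20]
ses, (lo, hi) = fixed_effect_pool(log_rr, se)
# Back-transform to the RR scale for reporting.
print(round(math.exp(ses), 3), round(math.exp(lo), 3), round(math.exp(hi), 3))
```

The HLM/ICM contrast in the abstract changes which ES values enter `log_rr` and `se`; the pooling step itself is identical for both.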

  18. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Full Text Available Web mining research has become increasingly important because a large share of information is now managed through the web, and web usage is expanding in an uncontrolled way. A dedicated framework is required for handling such an extensive amount of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology based on Bayesian Networks (BN), in which web data extraction and feature discovery were learned through a Bayesian approach. Motivated by that work, we propose a web content mining methodology based on a deep learning algorithm. Deep learning is preferred over BN on the grounds that BN does not incorporate a learning architecture comparable to the proposed system. The main objective of this investigation is web document extraction using different classification algorithms. This work extracts data from web URLs and compares three classification algorithms: a deep learning algorithm, a Bayesian algorithm and a BPNN algorithm. Deep learning is a powerful family of techniques for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometrics; it is a comparatively simple classification technique with low classification time. Naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then used for classification. Initially, the training and testing dataset contains multiple URLs, from which the content is extracted. The
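A multinomial Naive Bayes classifier of the kind compared above can be written in a few lines of standard-library Python. The documents, labels and vocabulary below are invented for illustration:

```python
import math
from collections import Counter, defaultdict

# Multinomial Naive Bayes with Laplace smoothing: a stdlib sketch of the
# Bayesian baseline the abstract compares against. Toy data, not the study's.

def train(docs):
    class_counts = Counter(lbl for _, lbl in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, lbl in docs:
        for w in text.lower().split():
            word_counts[lbl][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def predict(model, text):
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for lbl, c in class_counts.items():
        lp = math.log(c / total_docs)  # log prior
        denom = sum(word_counts[lbl].values()) + len(vocab)
        for w in text.lower().split():
            # Laplace-smoothed log likelihood of each word given the class.
            lp += math.log((word_counts[lbl][w] + 1) / denom)
        if lp > best_lp:
            best_lp, best = lp, lbl
    return best

docs = [("cheap pills buy now", "spam"),
        ("limited offer buy cheap", "spam"),
        ("seminar on neural networks", "ham"),
        ("lecture notes on bayesian networks", "ham")]
model = train(docs)
print(predict(model, "buy cheap pills"))    # -> spam
print(predict(model, "notes on networks"))  # -> ham
```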

  19. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  20. Aluminium alloys in municipal solid waste incineration bottom ash.

    Science.gov (United States)

    Hu, Yanjun; Rem, Peter

    2009-05-01

    With the increasing growth of incineration of household waste, more and more aluminium is retained in municipal solid waste incinerator bottom ash. Therefore recycling of aluminium from bottom ash becomes increasingly important. Previous research suggests that aluminium from different sources is found in different size fractions resulting in different recycling rates. The purpose of this study was to develop analytical and sampling techniques to measure the particle size distribution of individual alloys in bottom ash. In particular, cast aluminium alloys were investigated. Based on the particle size distribution it was computed how well these alloys were recovered in a typical state-of-the-art treatment plant. Assessment of the cast alloy distribution was carried out by wet physical separation processes, as well as chemical methods, X-ray fluorescence analysis and electron microprobe analysis. The results from laboratory analyses showed that cast alloys tend to concentrate in the coarser fractions and therefore are better recovered in bottom ash treatment plants.
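The recovery computation described above, combining a particle size distribution with per-fraction plant recovery, reduces to a weighted sum. All numbers below are hypothetical, not the study's measurements:

```python
# Overall recovery of cast aluminium from bottom ash: each size bin's mass
# share times the treatment plant's recovery efficiency for that bin.
# Size bins, shares and efficiencies are invented for illustration.

size_bins_mm = ["<2", "2-6", "6-20", ">20"]
mass_share   = [0.10, 0.20, 0.40, 0.30]   # cast-alloy mass fraction per bin
recovery     = [0.05, 0.45, 0.85, 0.95]   # plant recovery efficiency per bin

overall = sum(m * r for m, r in zip(mass_share, recovery))
print(f"overall cast-alloy recovery: {overall:.1%}")
```

Because cast alloys concentrate in the coarser bins, where recovery is high, the weighted sum comes out well above what the fine-fraction efficiencies alone would suggest.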

  1. Bottom-up modeling of oil production: A review of approaches

    International Nuclear Information System (INIS)

    Jakobsson, Kristofer; Söderbergh, Bengt; Snowden, Simon; Aleklett, Kjell

    2014-01-01

    Bottom-up models of oil production are continuously being used to guide investments and policymaking. Compared to simpler top-down models, bottom-up models have a number of advantages due to their modularity, flexibility and concreteness. The purposes of this paper are to identify the crucial modeling challenges, compare the different ways in which nine existing models handle them, assess the appropriateness of these models, and point to possibilities of further development. The conclusions are that the high level of detail in bottom-up models is of questionable value for predictive accuracy, but of great value for identifying areas of uncertainty and new research questions. There is a potential for improved qualitative insights through systematic sensitivity analysis. This potential is at present largely unrealized. - Highlights: • Bottom-up models are influential in the study of the oil production supply chain. • Nine existing bottom-up models are reviewed. • The high level of detail is of questionable value for predictive accuracy. • There is a potential for more systematic sensitivity analysis
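The systematic sensitivity analysis the review calls for can be illustrated with a toy bottom-up model: total output as a sum of per-field profiles (plateau, then exponential decline), with a one-at-a-time perturbation of each field's decline rate. Fields and parameters are invented:

```python
import math

# Toy bottom-up production model: total output is the sum of individual
# field profiles. Fields and parameters below are invented.

def field_profile(plateau, plateau_years, decline, years=30):
    """Plateau production followed by exponential decline."""
    return [plateau if t < plateau_years
            else plateau * math.exp(-decline * (t - plateau_years))
            for t in range(years)]

def total_production(fields, years=30):
    return [sum(field_profile(*f, years=years)[t] for f in fields)
            for t in range(years)]

fields = [(100.0, 5, 0.10), (60.0, 8, 0.15), (40.0, 3, 0.08)]  # kb/d, yr, 1/yr
base_cum = sum(total_production(fields))

# One-at-a-time sensitivity: perturb each field's decline rate by +20%
# and report the change in 30-year cumulative production.
for i, (p, py, d) in enumerate(fields):
    perturbed = fields[:i] + [(p, py, d * 1.2)] + fields[i + 1:]
    delta = sum(total_production(perturbed)) - base_cum
    print(f"field {i}: {delta:+.1f}")
```

Ranking the fields by the magnitude of these deltas is exactly the kind of qualitative insight the review argues detailed bottom-up models are good for, even when their point forecasts are uncertain.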

  2. Delamination propensity of pharmaceutical glass containers by accelerated testing with different extraction media.

    Science.gov (United States)

    Guadagnino, Emanuel; Zuccato, Daniele

    2012-01-01

    can cause glass particles to appear in vials, a problem that has forced a number of drug product recalls in recent years. To combat this, pharmaceutical and biopharmaceutical manufacturers need to understand the reasons for glass delamination. The most recent cases of product recall due to the presence of particles in the filling liquid have involved borosilicate glass containers carrying drugs made of active components with a known ability to corrode glass and to dissolve the silica matrix. Sometimes these ingredients are dissolved in an alkaline medium that dramatically increases the glass corrosion and potentially causes the issue. As this action is strongly affected by time and temperature, flaking may become visible only after a long incubation during storage and requires systematic monitoring to be detected at its early stage. While the nature of the filling liquid is the driving force of the phenomenon, other factors are also of primary importance. The surface morphology created during vial forming is a key issue, being a function of the forming temperature, which is higher during the cutting step and the forming of the bottom. Delamination occurs generally on the vial's bottom and shoulder, where extensive flaming can favor a strong evaporation of alkali and borate species and the formation of heavily enriched silica layers. When these layers are in contact with a solution, they are subject to a differential re-hydration that may result in cracking and detachment of scales. The purpose of this investigation is to identify testing conditions and parameters that can be used as indicators of an incipient delamination process. Extractions with 0.9% KCl solution for 1 h at 121 °C can be used to simulate long-term contact with aggressive pharmaceutical preparations, while the SiO(2) concentration in the extract solution can be taken as an index of glass dissolution. The conclusions developed by this study can provide pharmaceutical manufacturers with information needed to help

  3. Adsorption of organic pollutants from coking and papermaking wastewaters by bottom ash.

    Science.gov (United States)

    Sun, Wei-ling; Qu, Yan-zhi; Yu, Qing; Ni, Jin-ren

    2008-06-15

    Bottom ash, a power plant waste, was used to remove the organic pollutants in coking wastewater and papermaking wastewater. Particular attention was paid on the effect of bottom ash particle size and dosage on the removal of chemical oxygen demand (COD). UV-vis spectra, fluorescence excitation-emission matrix (FEEM) spectra, Fourier transform infrared (FTIR) spectra, and scanning electron microscopic (SEM) photographs were investigated to characterize the wastewaters and bottom ash. The results show that the COD removal efficiencies increase with decreasing particle sizes of bottom ash, and the COD removal efficiency for coking wastewater is much higher than that for papermaking wastewater due to its high percentage of particle organic carbon (POC). Different trends of COD removal efficiency with bottom ash dosage are also observed for coking and papermaking wastewaters because of their various POC concentrations. Significant variations are observed in the FEEM spectra of wastewaters after treatment by bottom ash. New excitation-emission peaks are found in FEEM spectra, and the fluorescence intensities of the peaks decrease. A new transmittance band in the region of 1400-1420 cm(-1) is observed in FTIR spectra of bottom ash after adsorption. The SEM photographs reveal that the surface of bottom ash particles varies evidently after adsorption.

  4. Adsorption of organic pollutants from coking and papermaking wastewaters by bottom ash

    International Nuclear Information System (INIS)

    Sun Weiling; Qu Yanzhi; Yu Qing; Ni Jinren

    2008-01-01

    Bottom ash, a power plant waste, was used to remove the organic pollutants in coking wastewater and papermaking wastewater. Particular attention was paid on the effect of bottom ash particle size and dosage on the removal of chemical oxygen demand (COD). UV-vis spectra, fluorescence excitation-emission matrix (FEEM) spectra, Fourier transform infrared (FTIR) spectra, and scanning electron microscopic (SEM) photographs were investigated to characterize the wastewaters and bottom ash. The results show that the COD removal efficiencies increase with decreasing particle sizes of bottom ash, and the COD removal efficiency for coking wastewater is much higher than that for papermaking wastewater due to its high percentage of particle organic carbon (POC). Different trends of COD removal efficiency with bottom ash dosage are also observed for coking and papermaking wastewaters because of their various POC concentrations. Significant variations are observed in the FEEM spectra of wastewaters after treatment by bottom ash. New excitation-emission peaks are found in FEEM spectra, and the fluorescence intensities of the peaks decrease. A new transmittance band in the region of 1400-1420 cm(-1) is observed in FTIR spectra of bottom ash after adsorption. The SEM photographs reveal that the surface of bottom ash particles varies evidently after adsorption.

  5. ASSESSMENT OF THE CHEMICAL POLLUTION OF THE SOIL, GROUND AND BOTTOM SEDIMENTS AT KLEN GOLD AND SILVER DEPOSIT

    Directory of Open Access Journals (Sweden)

    Bryukhan' Fedor Fedorovich

    2012-10-01

    Full Text Available Currently, prospecting and design-related works are performed prior to the upcoming launch of mining operations at Klen gold and silver deposit in Chukot Autonomous District. This formerly intact territory has been subject to the anthropogenic impact of geological exploration since 1984. A considerable amount of borehole drilling, prospecting, road building, and temporary housing development has been performed. The engineering research, including ecological surveys, has been completed to assess the ecological impact of upcoming exploratory and mining operations at the deposit. Assessment of the geochemical condition of the landscape constituents, including the soil, ground and bottom sediments, is of special importance in terms of their engineering protection and rational management of the natural environment. The above assessments were based on the field sampling made by «Sibgeoconsulting», CJSC (Krasnoyarsk) and the laboratory research made by accredited laboratories of Federal State Unitary Geological Enterprise «Urangeolograzvedka» (Irkutsk) and «Krasnoyarskgeologiya» (Krasnoyarsk). The analysis of the chemical pollution of soils, ground and bottom sediments is based on the examination of 30 samples. Peculiarities of the chemical composition of samples extracted at the deposit were identified. It has been discovered that pH values of the soil vary from 5.1 to 7.3. The concentration of metal in bottom sediments exceeds its concentration in the soil by far. Almost all irregular features of the sample water in the whole territory of the deposit are caused by the anthropogenic impact. In general, the metal content in soils, ground and bottom sediments within the territory of the deposit differs only slightly from regular clarke values.

  6. 75 FR 37253 - Classified National Security Information

    Science.gov (United States)

    2010-06-28

    ... ``Secret.'' (3) Each interior page of a classified document shall be marked at the top and bottom either... ``(TS)'' for Top Secret, ``(S)'' for Secret, and ``(C)'' for Confidential will be used. (2) Portions... from the informational text. (1) Conspicuously place the overall classification at the top and bottom...

  7. Three-Dimensional Adjustment of Stratified Flow Over a Sloping Bottom

    National Research Council Canada - National Science Library

    Chapman, David

    2002-01-01

    This study focused on understanding how advection of density within the bottom boundary layer influences the three-dimensional structure, evolution, and dynamics of both the bottom boundary layer and the overlying (interior) flow...

  8. Polymethylmethacrylate-based luminescent solar concentrators with bottom-mounted solar cells

    International Nuclear Information System (INIS)

    Zhang, Yi; Sun, Song; Kang, Rui; Zhang, Jun; Zhang, Ningning; Yan, Wenhao; Xie, Wei; Ding, Jianjun; Bao, Jun; Gao, Chen

    2015-01-01

    Graphical abstract: - Highlights: • Bottom-mounted luminescent solar concentrators on dye-doped plates were studied. • The mechanism of the transport process was proposed. • The fabricated luminescent solar concentrator achieved a gain of 1.38. • Power conversion efficiency of 5.03% was obtained with cell area coverage of 27%. • The lowest cost per watt of $1.89 was obtained with cell area coverage of 18%. - Abstract: Luminescent solar concentrators offer an attractive approach to concentrate sunlight economically without tracking, but the narrow absorption band of luminescent materials hinders their further development. This paper describes bottom-mounted luminescent solar concentrators on dye-doped polymethylmethacrylate plates that absorb not only the waveguided light but also the transmitted sunlight and part of the fluorescent light in the escape cone. A series of bottom-mounted luminescent solar concentrators with size of 78 mm × 78 mm × 7 mm were fabricated and their gain and power conversion efficiency were investigated. The transport process of the waveguided light and the relationship between the bottom-mounted cells were studied to optimize the performance of the device. The bottom-mounted luminescent solar concentrator with cell area coverage of 9% displayed a cell gain of 1.38, which is, to the best of our knowledge, the highest value for dye-doped polymethylmethacrylate plate luminescent solar concentrators. Power conversion efficiency as high as 5.03% was obtained with cell area coverage of 27%. Furthermore, the bottom-mounted luminescent solar concentrator was found to have a lowest cost per watt of $1.89 with cell area coverage of 18%. These results suggested that the fabricated bottom-mounted luminescent solar concentrator may have potential in low-cost building-integrated photovoltaic applications

  9. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    Science.gov (United States)

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to low nmol L(-1) concentration ranges, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L(-1) by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L(-1) (median = 0.030 nmol L(-1)) for NTME and from 0.001 to 5.684 nmol L(-1) (median = 0.043 nmol L(-1)) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.
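Limits of detection like those reported above are commonly derived from a calibration line, for example LOD = 3.3·s/b with s the residual standard deviation of the fit and b the slope. That is one common convention; the abstract does not state which one the authors used, and the calibration points below are invented.

```python
import math

def linfit(x, y):
    """Ordinary least-squares line y = a + b*x, plus the residual
    standard deviation s of the fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
         sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    s = math.sqrt(sum((yi - (a + b * xi)) ** 2
                      for xi, yi in zip(x, y)) / (n - 2))
    return a, b, s

conc = [0.05, 0.5, 2.0, 10.0, 30.0, 62.0]   # nmol/L standards (invented)
area = [12, 110, 430, 2150, 6420, 13300]    # detector response, a.u. (invented)
a, b, s = linfit(conc, area)
print(f"LOD = {3.3 * s / b:.3f} nmol/L")
```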

  10. Reuse potential of low-calcium bottom ash as aggregate through pelletization.

    Science.gov (United States)

    Geetha, S; Ramamurthy, K

    2010-01-01

    Coal combustion residues, which include fly ash, bottom ash and boiler slag, are among the major pollutants, as these residues require large land areas for their disposal. Among these residues, utilization of bottom ash in the construction industry is very low. This paper explains the use of bottom ash through pelletization. Raw bottom ash could not be pelletized as such due to its coarseness. Though pulverized bottom ash could be pelletized, the pelletization efficiency was low, and the aggregates were too weak to withstand the handling stresses. To improve the pelletization efficiency, different clay and cementitious binders were used with bottom ash. The influence of different factors and their interaction effects on the duration of the pelletization process and the pelletization efficiency was studied through a fractional factorial design. Addition of binders facilitated conversion of low-calcium bottom ash into aggregates. To achieve maximum pelletization efficiency, the binder content and moisture requirements vary with the type of binder. Addition of Ca(OH)(2) (i) improved the pelletization efficiency, (ii) reduced the duration of the pelletization process from an average of 14 min to 7 min, and (iii) reduced the binder dosage for a given pelletization efficiency. For aggregates with clay binders and cementitious binder, Ca(OH)(2) and binder dosage have a significant effect in reducing the duration of the pelletization process. 2010 Elsevier Ltd. All rights reserved.
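Effect estimation in a two-level (fractional) factorial design, as applied above, reduces to differences of averages between the high and low settings of each factor. The response function below is invented; it only illustrates the computation:

```python
from itertools import product

# Main-effect estimation for a two-level factorial design, a sketch of the
# screening analysis the abstract applies to pelletization. The response
# below (pelletization "efficiency" vs. coded -1/+1 levels of binder dose,
# moisture and Ca(OH)2) is invented for illustration.

def response(binder, moisture, caoh2):
    return 60 + 8 * binder + 3 * moisture + 10 * caoh2 + 2 * binder * caoh2

runs = list(product([-1, 1], repeat=3))  # full 2^3 design
y = [response(*r) for r in runs]

def main_effect(factor_idx):
    """Average response at the high level minus average at the low level."""
    hi = [yi for r, yi in zip(runs, y) if r[factor_idx] == 1]
    lo = [yi for r, yi in zip(runs, y) if r[factor_idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for name, i in [("binder", 0), ("moisture", 1), ("Ca(OH)2", 2)]:
    print(f"{name}: {main_effect(i):+.1f}")
```

With this invented response the computed main effects recover the built-in coefficients doubled (+16, +6, +20), because a balanced design averages the interaction term out of each main effect.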

  11. Refinement of the bottom boundary of the INES scale

    International Nuclear Information System (INIS)

    Ferjencik, Milos

    2010-01-01

    No existing edition of the International Nuclear Events Scale (INES) Manual addresses in depth the determination of the bottom boundary of the Scale, although a need for a definition is felt. The article introduces a method for determining the INES bottom boundary applicable to pressurized water reactors. This bottom boundary is taken to be identical to the threshold of degradation of the installation's nuclear safety assurance. A comprehensive flowchart has been developed as the main outcome of the analysis of the nuclear safety assurance violation issue. The use of this flowchart in INES classification to replace the introductory question in the General INES Rating Procedure in the INES Manual is recommended. (orig.)

  12. Acid Extraction - Ion Exchange Recovery of Cinchona Alkaloids Process and Plant Development

    Science.gov (United States)

    1945-06-08

    center of the tank and dispersed by the rising jet from the bottom. From the operation of the scale model it was ascertained that the rate of...were then extracted from the precipitate by means of an alkali and an organic solvent. (18) Fink has suggested the use of a mixture of kaolin and

  13. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    Full Text Available This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four packages (Urllib, Selenium, BeautifulSoup, Scrapy) in Python can automate almost any web page for projects of any size. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
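The extraction pattern the article automates with those packages can be sketched with nothing but the Python standard library. The HTML snippet and the "warrant data" field names below are hypothetical placeholders, not the article's actual dataset; real projects would fetch pages with Urllib or Scrapy and parse them with BeautifulSoup rather than a hand-rolled parser.

```python
# Minimal stdlib-only sketch of scraping tabular data from a web page.
from html.parser import HTMLParser

class RowExtractor(HTMLParser):
    """Collect the text of every <td> cell, grouped into table rows."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

# A hypothetical snippet standing in for a fetched warrant-data page.
html = ("<table><tr><td>Case 1</td><td>2018-01-05</td></tr>"
        "<tr><td>Case 2</td><td>2018-02-11</td></tr></table>")
parser = RowExtractor()
parser.feed(html)
print(parser.rows)  # → [['Case 1', '2018-01-05'], ['Case 2', '2018-02-11']]
```

The same row-accumulation idea is what BeautifulSoup's `find_all("tr")` or an XPath expression expresses in one line; the value of the dedicated packages is that they tolerate the malformed HTML common on real pages.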

  14. New, national bottom-up estimate for tree-based biological ...

    Science.gov (United States)

    Nitrogen is a limiting nutrient in many ecosystems, but is also a chief pollutant from human activity. Quantifying human impacts on the nitrogen cycle and investigating natural ecosystem nitrogen cycling both require an understanding of the magnitude of nitrogen inputs from biological nitrogen fixation (BNF). A bottom-up approach to estimating BNF—scaling rates up from measurements to broader scales—is attractive because it is rooted in actual BNF measurements. However, bottom-up approaches have been hindered by scaling difficulties, and a recent top-down approach suggested that the previous bottom-up estimate was much too large. Here, we used a bottom-up approach for tree-based BNF, overcoming scaling difficulties with the systematic, immense (>70,000 N-fixing trees) Forest Inventory and Analysis (FIA) database. We employed two approaches to estimate species-specific BNF rates: published ecosystem-scale rates (kg N ha-1 yr-1) and published estimates of the percent of N derived from the atmosphere (%Ndfa) combined with FIA-derived growth rates. Species-specific rates can vary for a variety of reasons, so for each approach we examined how different assumptions influenced our results. Specifically, we allowed BNF rates to vary with stand age, N-fixer density, and canopy position (since N-fixation is known to require substantial light). Our estimates from this bottom-up technique are several orders of magnitude lower than previous estimates indicating

  15. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    DEFF Research Database (Denmark)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai

    2016-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method...... by comparing and integrating the data collected in several European Campuses during two different academic years, 2014-15 and 2015-16. The overall results are: a) a more adequate and robust definition of the orthogonal multidimensional space of representation of the smartness, and b) the definition...

  16. Out-pile test of the capsule with cone shape bottom structures

    Energy Technology Data Exchange (ETDEWEB)

    Choi, M. H.; Kang, Y. H.; Cho, M. S.; Choo, K. N.; Kim, B. G.; Son, J. M.; Park, S. J.; Shin, Y. T.; Oh, J. M

    2004-01-01

    The bottom guide structure of the instrumented capsule used for irradiation tests in the research reactor HANARO was redesigned because of a cutting problem with the bottom guide arm's pin. The previous 3-pin arm structure was changed into a one-body cone shape. Specimens of the bottom end cap ring in three different sizes (φ68 mm, φ70 mm, φ72 mm) were designed and manufactured. An out-pile test of the capsule with the previous 3-pin arm and the three new cone-shaped bottom structures was performed using the one-channel flow test facilities. To assess compatibility with HANARO and the structural stability and integrity of the capsule, out-pile tests such as a loading/unloading test, a pressure drop test, a thermal performance test, a displacement measurement under vibration, and an endurance test were conducted, and the outer diameter of the bottom end cap ring that meets the HANARO requirements was selected. The out-pile test results show that the capsule with the cone-shaped bottom structure is structurally stable and favorable with respect to fluid flow. Among the three bottom end cap rings, the size that satisfied the various requirements is 70 mm in diameter. The new cone-shaped bottom structure with a 70 mm diameter is expected to be applicable to all material and special capsules designed and manufactured for irradiation tests in the future.

  17. Detection of Higgs bosons decaying to bottom quarks

    International Nuclear Information System (INIS)

    Gilman, F.J.; Price, L.E.

    1986-11-01

    Several developments affecting the possibility of Higgs detection are discussed. These include the level of certainty about the t quark mass, Monte Carlo programs to generate both signal and background events, separation and/or enhancement of heavy-quark jets from jets due to light quarks or gluons, and the possibility that neutral Higgs decay into bottom quarks might be the decay mode of choice for detecting the intermediate-mass Higgs. Possible means of detecting an intermediate-mass Higgs at the SSC, particularly if a prominent decay mode is to bottom quarks, are examined using the PYTHIA Monte Carlo program to generate both signal and background events. For the signal, events were generated in which Higgs bosons are created in proton-proton collisions, with the Higgs decaying into bottom quarks. The presence of W or Z bosons created in the same proton-proton collision is used to enhance the likelihood of Higgs production and to reduce the potentially enormous background. It is found that Higgs decay to bottom quarks, if important, would be more favorable for detection than decay to top quarks because of the smaller background. 3 refs., 4 figs

  18. THE FACE EXTRACTION METHOD FOR MOBILE DEVICES

    Directory of Open Access Journals (Sweden)

    Viktor Borodin

    2013-10-01

    Full Text Available The problem of automatic face recognition in images is considered. A method for extracting the face ellipse from a photograph and methods for extracting facial feature points are proposed.

  19. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the uses of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both cases of leptons produced through a virtual and a real photon are considered. Measurements of the angular correlation between the (K pi) and (e+ e-) decay planes in B --> K*(--> K pi) gamma(*)(--> e+ e-) decays can be used to determine the helicity amplitudes in the radiative B --> K* gamma decays. A large right-handed helicity amplitude in B-bar decays is a signal of new physics. The time-dependent CP asymmetry in the B0 decay angular correlation is shown to measure sin 2-beta and cos 2-beta with little hadronic uncertainty.

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2007 (NODC Accession 0093023)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Wellwood, 2006 - 2009 (NODC Accession 0093067)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge (NODC Accession 0002750)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Influence of bottom ash of palm oil on compressive strength of concrete

    Science.gov (United States)

    Saputra, Andika Ade Indra; Basyaruddin, Laksono, Muhamad Hasby; Muntaha, Mohamad

    2017-11-01

    The technological development of concrete demands innovation in alternative materials as part of the effort to improve quality and reduce reliance on currently used raw materials; one such material is bottom ash from palm oil processing. Bottom ash, a waste product of palm oil cultivation in East Kalimantan, contains silica. Similar to cement in texture and size, bottom ash can be mixed into concrete, where its silica could help increase the compressive strength. This research compared normal concrete with concrete in which bottom ash partially replaced cement. The bottom ash used in this research passed a #200 sieve. The compositions tested were cement-to-bottom-ash ratios of 100%:0%, 90%:10%, 85%:15% and 80%:20%. All mixtures were designed for the same target compressive strength (fc 25 MPa), and the compressive strength was tested at ages of 7, 14, and 28 days. The results show that the addition of bottom ash influenced the workability of the concrete, but it did not significantly influence the compressive strength. Based on the compressive strength tests, the optimal compressive strength was obtained from the mixture of 100% cement and 0% bottom ash.

  4. Radiator Enhanced Geothermal System - A Revolutionary Method for Extracting Geothermal Energy

    Science.gov (United States)

    Karimi, S.; Marsh, B. D.; Hilpert, M.

    2017-12-01

    A new method of extracting geothermal energy, the Radiator Enhanced Geothermal System (RAD-EGS), has been developed. RAD-EGS attempts to mimic natural hydrothermal systems by 1) generating a vertical vane of artificially produced high porosity/permeability material deep in a hot sedimentary aquifer, 2) injecting water at surface temperatures to the bottom of the vane, where the rock is the hottest, and 3) extracting super-heated water at the top of the vane. The novel RAD-EGS differs greatly from the currently available Enhanced Geothermal Systems in vane orientation, determined in the governing local crustal stress field by Shmax and Sl (meaning it is vertical), and in the vane location in a hot sedimentary aquifer, which naturally increases the longevity of the system. In this study, we explore several parameter regimes affecting the water temperature in the extraction well, keeping in mind that the minimum temperature of the extracted water has to be 150 °C in order for a geothermal system to be commercially viable. We used the COMSOL finite element package to simulate coupled heat and fluid transfer within the RAD-EGS model. The following geologic layers from top to bottom are accounted for in the model: i) confining upper layer, ii) hot sedimentary aquifer, and iii) underlying basement rock. The vane is placed vertically within the sedimentary aquifer. An injection well and an extraction well are also included in the simulation. We tested the model for a wide range of parameters including background heat flux, thickness of geologic layers, geometric properties of the vane, diameter and location of the wells, fluid flow within the wells, regional hydraulic gradient, and permeability and porosity of the layers. The results show that among the aforementioned parameters, background heat flux and the depth of vane emplacement are highly significant in determining the level of commercial viability of the geothermal system. These results indicate that for the

  5. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.
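The matrix idea the abstract describes can be illustrated with a toy example: compounds sharing a core scaffold are arranged so that rows are cores and columns are substituents, with activity values in the cells. The cores, R-groups, and pKi-like activity values below are invented placeholders; the published method derives the decomposition by systematic fragmentation rather than taking it as given.

```python
# Toy sketch of a SAR matrix: rows = core scaffolds, columns = R-groups,
# cells = activity values (hypothetical data). Empty cells would mark
# virtual analogues that have not yet been synthesized.
compounds = [
    ("core-A", "R1=Cl",  7.2),
    ("core-A", "R1=OMe", 6.1),
    ("core-B", "R1=Cl",  8.0),
    ("core-B", "R1=OMe", 5.4),
]

cores = sorted({c for c, _, _ in compounds})
subs = sorted({s for _, s, _ in compounds})
cell = {(c, s): a for c, s, a in compounds}

# Print the matrix in a SAR-table-like layout.
print("core     " + "  ".join(f"{s:>7}" for s in subs))
for c in cores:
    row = "  ".join(f"{cell.get((c, s), '-'):>7}" for s in subs)
    print(f"{c:<9}{row}")
```

Sorting such matrices by an SAR information score (e.g. the spread of activities within rows and columns) is what lets the method surface the most interpretable tables from a large compound set.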

  6. Decay of the Bottom mesons

    International Nuclear Information System (INIS)

    Duong Van Phi; Duong Anh Duc

    1992-12-01

    The channels of decay of the Bottom mesons are deduced from a selection rule and Lagrangians formed on the LxO(4) invariance and the principle of minimal structure. Estimates of the corresponding decay probabilities are considered. (author). 21 refs

  7. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform multivariate statistical analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from Kβ X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation-state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
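The "spectrum as a probability distribution" idea above can be sketched directly: after normalising the emission intensities to unit area, spectral moments (centroid, width, skewness) become parameters sensitive to the chemical state. The energy grid and intensities below are synthetic toy values, not measured Kβ data.

```python
# Compute statistical parameters of a spectrum treated as a probability
# distribution: centroid (1st moment), width (sqrt of 2nd central
# moment), and skewness (standardised 3rd central moment).
def spectral_moments(energies, intensities):
    total = sum(intensities)
    p = [i / total for i in intensities]              # normalise to a pdf
    mean = sum(e * w for e, w in zip(energies, p))    # centroid
    var = sum(w * (e - mean) ** 2 for e, w in zip(energies, p))
    sd = var ** 0.5
    skew = sum(w * ((e - mean) / sd) ** 3 for e, w in zip(energies, p))
    return mean, sd, skew

energies = [6485.0, 6486.0, 6487.0, 6488.0, 6489.0]   # eV, synthetic grid
intensities = [1.0, 4.0, 10.0, 4.0, 1.0]              # symmetric toy peak
mean, sd, skew = spectral_moments(energies, intensities)
# abs() guards against printing -0.0 for this symmetric toy peak.
print("centroid:", round(mean, 3), "skew:", round(abs(skew), 6))
```

A shift of the centroid or a change in skewness between spectra of the Mn2+ and Mn4+ mixtures is the kind of signal such parameters are designed to quantify without fitting individual spectral features.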

  8. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    information. In particular, for feature extraction, we develop a new set of kernel descriptors−Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus...... improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting...... as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own...

  9. Ocean bottom seismometer technology

    Science.gov (United States)

    Prothero, William A., Jr.

    Seismometers have been placed on the ocean bottom for about 45 years, beginning with the work of Ewing and Vine [1938], and their current use to measure signals from earthquakes and explosions constitutes an important research method for seismological studies. Approximately 20 research groups are active in the United Kingdom, France, West Germany, Japan, Canada, and the United States. A review of ocean bottom seismometer (OBS) instrument characteristics and OBS scientific studies may be found in Whitmarsh and Lilwall [1984]. OBS instrumentation is also important for land seismology. The recording systems that have been developed have been generally more sophisticated than those available for land use, and several modern land seismic recording systems are based on OBS recording system designs.The instrumentation developed for OBS work was the topic of a meeting held at the University of California, Santa Barbara, in July 1982. This article will discuss the state of the art of OBS Technology, some of the problems remaining to be solved, and some of the solutions proposed and implemented by OBS scientists and engineers. It is not intended as a comprehensive review of existing instrumentation.

  10. Experimental Investigation of Discharge Coefficient in Mesh Panel Bottom Intakes

    Directory of Open Access Journals (Sweden)

    keivan bina

    2012-04-01

    Full Text Available Bottom racks are hydraulic structures placed in the bed of a stream, through which part of the flow in the main channel is diverted. These structures are widely applied in industry, irrigation, drainage, etc. Much attention has been paid to the study of such structures, but the characteristics of flow through bottom racks are complex. The present study estimates the discharge coefficient of a new kind of bottom rack, named "mesh panel racks", that includes both transverse and longitudinal bars, without considering any solids in the fluid. This kind of bottom intake has advantages from a structural point of view and exhibits less deformation under static and dynamic loads. A laboratory setup with three mesh panel intakes was built, and the effects of various parameters such as rack slope, porosity and geometry were explored. A dimensional analysis using the Buckingham theorem identified the hydraulic and geometric factors that affect the discharge coefficient (Cd) of bottom racks. A statistical model for the discharge coefficient of a rack structure was then developed with linear and nonlinear regression using SPSS software. The efficiency of the proposed technique is high enough that the associated error is limited to 10%. Finally, the hydraulic performance of mesh panel intakes was compared with the regular type of bottom intake, which consists of longitudinal bars. For this purpose, the diverted discharge through both types of intakes was calculated under the same conditions
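The regression step described above can be sketched as an ordinary least-squares fit. The power-law form Cd = a * porosity^b and the data points below are hypothetical placeholders, not the study's fitted model, which related Cd to several hydraulic and geometric parameters using SPSS.

```python
# Hedged sketch: fit a one-predictor power law Cd = a * porosity**b by
# linearising (ln Cd = ln a + b * ln porosity) and applying closed-form
# ordinary least squares. Data are invented for illustration.
import math

porosity = [0.30, 0.40, 0.50, 0.60]
cd       = [0.42, 0.50, 0.57, 0.63]

x = [math.log(v) for v in porosity]
y = [math.log(v) for v in cd]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
    / sum((xi - xbar) ** 2 for xi in x)
a = math.exp(ybar - b * xbar)
print(f"Cd ≈ {a:.3f} * porosity^{b:.3f}")
```

A multi-variable model (adding slope, Froude number, etc.) follows the same pattern with a matrix least-squares solve in place of the closed-form slope.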

  11. Constructing bottom barriers with jet grouting

    International Nuclear Information System (INIS)

    Shibazaki, M.; Yoshida, H.

    1997-01-01

    Installing a bottom barrier using conventional high-pressure jetting technology while ensuring barrier continuity is challenging. This paper describes technology that has been developed and demonstrated for the emplacement of bottom barriers using pressures and flow rates above the conventional high-pressure jetting parameters. The innovation, capable of creating an improved body exceeding 5 meters in diameter, has achieved satisfactory connection and adherence between the treated columns. Moreover, the interfaces among the improved bodies attain the same strength as the body itself and a permeability lower than 1 × 10⁻⁷ cm/sec. A wide range of thicknesses and diameters of the improved mass optimizes the application, and the method is nearing completion. The paper explains the approach and briefly presents case histories

  12. Bottom-simulating reflector variability at the Costa Rica subduction zone and corresponding heat flow model

    Science.gov (United States)

    Cavanaugh, S.; Bangs, N. L.; Hornbach, M. J.; McIntosh, K. D.

    2011-12-01

    We use 3D seismic reflection data acquired in April - May 2011 by the R/V Marcus G. Langseth to extract heat flow information using the bottom-simulating reflector across the Costa Rica convergent margin. These data are part of the CRISP Project, which will image the Middle America subduction zone in 3D. The survey was conducted in an area approximately 55 x 11 km, to the northwest of the Osa Peninsula, Costa Rica. For the analysis presented here, 3D seismic data were processed with Paradigm Focus software through post-stack time migration. The bottom-simulating reflector (BSR)-a reverse polarity reflection indicating the base of the gas hydrate phase boundary-is imaged very clearly in two regions within the slope-cover sediments in the accretionary prism. In deep water environments, the BSR acts as a temperature gauge revealing subsurface temperatures across the margin. We predict BSR depth using a true 3D diffusive heat flow model combined with IODP drilling data and compare results with actual BSR depth observations to determine anomalies in heat flow. Uniform heat flow in the region should result in a deepening BSR downslope toward the trench; however, our initial results indicate the BSR shoals near the trench to its shallowest level, approximately 96 m below the sea floor, suggesting elevated heat flow towards the toe. Landward, the BSR deepens to about 333 m below the sea floor, indicating lower heat flow. Both BSR segments display a trend of deepening landward from the trench; however, the depth below the sea floor is greater overall for the landward segment than for the segment near the toe. We suggest two regimes with differing heat flow exist across the margin, likely representing two separate fluid flow regimes - one through recently accreted sediments near the prism toe and the other through the older materials making up the prism.

  13. Properties of the Water Column and Bottom Derived from AVIRIS Data

    Science.gov (United States)

    Lee, Zhong-Ping; Carder, Kendall L.; Chen, F. Robert; Peacock, Thomas G.

    2001-01-01

    Using AVIRIS data as an example, we show in this study that the optical properties of the water column and bottom of a large, shallow area can be adequately retrieved using a model-driven optimization technique. The simultaneously derived properties include bottom depth, bottom albedo, and water absorption and backscattering coefficients, which in turn could be used to derive concentrations of chlorophyll, dissolved organic matter, and suspended sediments. The derived bottom depths were compared with a bathymetry chart and a boat survey and were found to agree very well. Also, the derived bottom-albedo image shows clear spatial patterns, with end members consistent with sand and seagrass. The image of absorption and backscattering coefficients indicates that the water is quite horizontally mixed. These results suggest that the model and approach used work very well for the retrieval of sub-surface properties of shallow-water environments even for rather turbid environments like Tampa Bay, Florida.

  14. Eco-friendly porous concrete using bottom ash aggregate for marine ranch application.

    Science.gov (United States)

    Lee, Byung Jae; Prabhu, G Ganesh; Lee, Bong Chun; Kim, Yun Yong

    2016-03-01

    This article presents the test results of an investigation carried out on the reuse of coal bottom ash aggregate as a substitute material for coarse aggregate in porous concrete production for marine ranch applications. The experimental parameters were the rate of bottom ash aggregate substitution (30%, 50% and 100%) and the target void ratio (15%, 20% and 25%). Cement-coated granular fertiliser was substituted into the bottom ash aggregate concrete mixture to improve its suitability for marine ranch applications. The results of leaching tests revealed that the bottom ash aggregate contains only a negligible amount of the ten deleterious substances specified in the Ministry of Environment - Enforcement Regulation of the Waste Management Act of the Republic of Korea. The large number of bubbles/air gaps in the bottom ash aggregate increased the voids of the concrete mixtures at all target void ratios and decreased the compressive strength of the porous concrete mixture; however, the mixture substituted with 30% bottom ash aggregate and 10% granular fertiliser showed strength equal to the control mixture. The sea water resistance of the bottom ash aggregate substituted mixture was roughly equal to that of the control mixture, and it also showed a great deal of improvement in the degree of marine organism adhesion compared with the control mixture. No fish fatalities were observed in the fish toxicity test, which suggested that bottom ash aggregate is a harmless material and that the combination of bottom ash aggregate and granular fertiliser, with substitution rates of 30% and 10% respectively, can be effectively used in porous concrete production for marine ranch application. © The Author(s) 2015.

  15. Monitoring of metals in Tilapia nilotica tissues, bottom sediments ...

    African Journals Online (AJOL)

    Tilapia (Tilapia nilotica), bottom sediments and water were collected from the Nworie River and Oguta Lake. The muscle, liver and gills of the fish as well as the bottom sediments and water were analysed for Al, Cr, Cd, Pb, As, Zn, Mn, Co, Se, Cu, Ni and Fe using an atomic absorption spectrophotometer to highlight the importance ...

  16. A plea for Global Health Action bottom-up

    Directory of Open Access Journals (Sweden)

    Ulrich Laaser

    2016-10-01

    Full Text Available This opinion piece focuses on global health action through hands-on, bottom-up practice. Initiating an organizational framework and securing financial efficiency are, however, essential, and both are clearly a domain of well-trained public health professionals. Examples of action are cited in the four main areas of global threats: planetary climate change, global divides and inequity, global insecurity and violent conflicts, and global instability and financial crises. In conclusion, a stable health systems policy framework would greatly enhance success. However, such an organisational framework dries out if not linked to public debates channelling fresh thoughts and controversial proposals: structural stabilisation is essential but has to serve, not dominate, bottom-up activities. In other words, horizontal management is required, a balanced equilibrium between bottom-up initiative and top-down support. Last but not least, rewarding voluntary and charity work through public acknowledgement is essential.

  17. OIL DECONTAMINATION OF BOTTOM SEDIMENTS EXPERIMENTAL WORK RESULTS

    Directory of Open Access Journals (Sweden)

    Lushnikov Sergey V.

    2006-08-01

    Full Text Available This article presents the results of experimental work during 2004-2005 on oil decontamination of the bottom sediments of Lake Schuchye, situated in the Komi Republic (Northern Russia). The contamination was caused by huge oil spills that occurred after a series of accidental ruptures on the Harjaga-Usinsk and Vozej-Usinsk oil pipelines in 1994. Flotation technology was used for the cleaning of the bottom sediments. 157 tons of crude oil were removed from an area of 4.1 ha during the course of the 2-year experimental work. The content of aliphatic and alicyclic oil hydrocarbons was reduced from 53.3 g/kg to 2.2 g/kg, on average. Hydrobiological investigations revealed that the bottom sediments began to be inhabited by benthic organisms, dominantly Oligochaeta. Besides Oligochaeta, Chironomidae maggots and Bivalvia were detected. The appearance of macrozoobenthos organisms can serve as a bioindicator of water quality.

  18. Evaluation of aseismic integrity in HTTR core-bottom structure. Pt. 1. Aseismic test for core-bottom structure

    International Nuclear Information System (INIS)

    Iyoku, T.; Futakawa, M.; Ishihara, M.

    1994-01-01

    Aseismic tests were carried out using 1/5-scale and 1/3-scale models of the core-bottom structure of the HTTR to quantitatively evaluate the response of acceleration, strain, impact load, etc. The following conclusions were obtained. (i) The frequency response of the keyway strain correlates with that of the impact acceleration on the hot plenum block. (ii) It was confirmed through the 1/5-scale and 1/3-scale model tests that the applied similarity law is valid for evaluating the seismic response characteristics of the core-bottom structure. (iii) The stress of the graphite components estimated from the scale model test using S2-earthquake excitation was sufficiently lower than the allowable stress used as the design criterion. (orig.)

  19. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems are encountered for full-text processing when biomedical text analytics simply reuse existing NLP pipelines which were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6% F-score points when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this short cut.

  20. 12 Trace Metals Distribution in Fish Tissues, Bottom Sediments and ...

    African Journals Online (AJOL)

    Abstract. Water samples, bottom sediments, Tilapia, and Cat Fish from the Okumeshi River in Delta State of Nigeria were analysed ... Keywords: Trace metals, Fish Tissues, Water, Bottom sediments, Okumeshi River.

  1. Effect of bottom slope on the nonlinear triad interactions in shallow water

    Science.gov (United States)

    Chen, Hongzhou; Tang, Xiaocheng; Zhang, Ri; Gao, Junliang

    2018-05-01

    This paper investigates the effect of bottom slope on the nonlinear triad interactions for irregular waves propagating in shallow water. Physical experiments were conducted in a wave flume on the transformation of waves propagating over three bottom slopes (β = 1/15, 1/30, and 1/45). Irregular waves with different types of breaking, mechanically generated based on JONSWAP spectra, were used for the tests. The distinctly different variations of the spectra measured on each bottom reveal a crucial role of the slope effect in the energy transfer between harmonics. The wavelet-based bispectrum was used to examine the bottom slope effect on the nonlinear triad interactions. Results show that the different bottom slopes over which waves propagate cause a significant discrepancy in triad interactions. Discussion of the summed bicoherence, which denotes the distribution of phase coupling at each frequency, further clarifies the effect of bottom slope. Furthermore, the sums of the real and imaginary parts of the bispectrum, which reflect the intensity of frequency components contributing to wave skewness and asymmetry, were also investigated. Results indicate that the values of these parameters increase as the bottom slope gets steeper.

  2. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
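
    Bias correction of the kind discussed here is commonly implemented as CDF matching: each retrieval is mapped to the model value at the same climatological quantile. A minimal nearest-rank sketch (this is a generic illustration, not the operational SMAP Level-4 code; the climatologies below are made up):

```python
import bisect

def cdf_match(value, obs_clim, model_clim):
    """Map an observed value to the model value at the same empirical quantile."""
    obs_sorted = sorted(obs_clim)
    mod_sorted = sorted(model_clim)
    p = bisect.bisect_right(obs_sorted, value) / len(obs_sorted)  # empirical CDF
    idx = min(int(p * len(mod_sorted)), len(mod_sorted) - 1)      # nearest rank
    return mod_sorted[idx]

# Illustrative climatologies: retrievals read systematically drier than the model.
obs_clim = [0.10 + 0.002 * i for i in range(100)]    # m^3/m^3, observation record
model_clim = [0.15 + 0.002 * i for i in range(100)]  # m^3/m^3, model record
corrected = cdf_match(0.20, obs_clim, model_clim)    # shifted toward model climatology
```

    A "global" correction uses one pair of climatologies everywhere; a "local" correction builds them per grid cell, which is what the abstract contrasts.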

  3. Bottom Scour Observed Under Hurricane Ivan

    National Research Council Canada - National Science Library

    Teague, William J; Jarosz, Eva; Keen, Timothy R; Wang, David W; Hulbert, Mark S

    2006-01-01

    Observations that extensive bottom scour along the outer continental shelf under Hurricane Ivan resulted in the displacement of more than 100 million cubic meters of sediment from a 35x15 km region...

  4. Sorption behaviour of cobalt-60 on Suez Canal bottom sediments

    International Nuclear Information System (INIS)

    Abdel Gawad, S.A.; El-Shinawy, R.M.K.; Abdel Malik, W.E.Y.

    1981-01-01

    Mineralogical and elemental analysis and the sorption behaviour of the Suez Canal bottom sediments in the Port Said area were investigated. It was found that the bottom sediments consist mainly of quartz, feldspars, and traces of calcite mineral. The cation-exchange capacity was found to increase as the particle size of the sediment decreased. Sorption of ⁶⁰Co by the bottom sediment increased with contact time up to 6 h. Variation of the solution pH from 4 to 9 showed a limited increase in the sorption of ⁶⁰Co. As carrier concentrations increased from 10⁻⁷ N to 10⁻³ N, sorption of Co was found to increase linearly, following the Freundlich isotherm. The presence of Mg²⁺ and Fe³⁺ in solution depressed the sorption of ⁶⁰Co by the sediments. The desorption of ⁶⁰Co from bottom sediment with distilled and Suez Canal water was found to increase with contact time. (author)
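
    The Freundlich isotherm mentioned above, q = K_F · C^(1/n), is linear in log-log space, so K_F and n can be recovered by ordinary least squares on log-transformed data. A sketch over the same carrier concentration range (the constants are illustrative, not the study's fitted values):

```python
import math

def fit_freundlich(conc, sorbed):
    """Least-squares fit of log10(q) = log10(Kf) + (1/n) * log10(C)."""
    xs = [math.log10(c) for c in conc]
    ys = [math.log10(q) for q in sorbed]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return 10 ** intercept, 1.0 / slope   # (Kf, n)

# Synthetic data over the carrier range in the abstract, 1e-7 N to 1e-3 N.
conc = [10.0 ** e for e in range(-7, -2)]
kf_true, n_true = 2.5, 1.2                          # illustrative constants
sorbed = [kf_true * c ** (1.0 / n_true) for c in conc]
kf, n = fit_freundlich(conc, sorbed)
```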

  5. Distribution of shallow water soft and hard bottom seabeds in the Isla del Coco National Park, Pacific Costa Rica

    Directory of Open Access Journals (Sweden)

    Jeffrey A. Sibaja-Cordero

    2012-11-01

    Full Text Available Geographic Information Systems (GIS) applications used in marine habitats are powerful tools for the management and monitoring of marine reserves and resources. Here, we present a series of maps of the soft and hard substrates in the shallow waters (<80 m depth) of Parque Nacional Isla del Coco (PNIC = Isla del Coco National Park). We use bathymetry data and field data as input for GIS, GAM, and kriging methods to generate a series of maps that describe the bottom characteristics. Eight types of bottom were found in the PNIC by composition and grain size. The shore of the island and islets consisted of rocky formations (mainly basalts), with coral reefs in the subtidal of some areas. Rhodolith beds had a dispersed distribution. The bottom in the southern and southwestern region is hard substrate, while sediments cover the northern and northeastern zones. Slightly gravelly sand dominated the bays, while gravelly sand (with more coarse grains) was frequent offshore. The inner areas of Chatham and Wafer bays have mud and organic matter. The sediments in the area are mostly carbonates, except in Bahía Yglesias, where clastic sediments (from the erosion of basalts) are present. The information generated in this study could be a valuable input for future monitoring in the PNIC.
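
    Interpolating point samples of a bottom property onto a continuous map, as the kriging step above does, can be illustrated with the simpler inverse-distance weighting scheme (a stand-in for kriging, which additionally models spatial covariance; the coordinates and values below are invented):

```python
def idw(x, y, samples, power=2.0):
    """Inverse-distance-weighted estimate of a bottom property at (x, y).
    samples: list of (xs, ys, value) tuples from field stations."""
    num = den = 0.0
    for xs, ys, v in samples:
        d2 = (x - xs) ** 2 + (y - ys) ** 2
        if d2 == 0.0:
            return v                      # exact hit on a sample point
        w = d2 ** (-power / 2.0)          # weight decays with distance^power
        num += w * v
        den += w
    return num / den

# Two hypothetical stations with mean grain sizes (mm); estimate midway between.
grain = idw(0.5, 0.5, [(0.0, 0.0, 1.0), (1.0, 1.0, 3.0)])  # -> 2.0 (equal weights)
```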

  6. Assessment of sludges and tank bottoms treatment processes

    International Nuclear Information System (INIS)

    Bhutto, A.W.; Bazmi, A.A.

    2005-01-01

    The petroleum refining industry generates considerable amounts of sludge and tank bottoms as waste. A petroleum refinery receives crude oil containing emulsified water and solids. As the crude oil storage tanks are repeatedly filled and emptied, the water and solids settle towards the bottom as sludge. For tanks that have been in service for several years, the sludge accumulation becomes several feet deep, resulting in a loss of ullage in refinery crude storage tanks. The accumulation of crude storage tank bottoms is a serious problem experienced by local refineries. The refinery sludge waste is categorized as hazardous waste, which is at present buried in the tank farm ground. Since no hazardous-material landfilling option is available, the disposal of these hazardous materials has become a major problem because of ISO-14000 certification requirements and the expectations of stakeholders. To maximize waste oil recovery from sludge and tank bottoms and to minimize the volume of hazardous waste, a number of waste recovery and treatment processes are available. The process designs and unit operations of each process are different, and each has its own merits in terms of technical complexity, operational friendliness, and costs and economics. A study of each of these technologies and of the subsequent tie-up to the existing unit operations was conducted, and the associated technical comparisons are made. (author)

  7. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned to 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant (although nonsignificant) increase in pulse values was monitored on completion of both versions. Completion of an overdisclosed informed consent document is associated with changes in physiological parameters. The results suggest that overdetailed listing and disclosure before extraction of impacted mandibular third molars can increase patient stress.

  8. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)
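
    The two probabilistic criteria can be combined into a simple screening rule: a trouble report is flagged as significant if its root-cause component's contribution to either the core damage frequency (safety) or the automatic plant trip probability (reliability) exceeds a threshold. The thresholds below are placeholders, not the values developed in the study:

```python
def is_significant(cdf_contribution, aptp_contribution,
                   cdf_threshold=1e-6, aptp_threshold=1e-3):
    """Flag trouble information whose root-cause component matters for
    safety (CDF contribution, per reactor-year) or reliability (APTP
    contribution). Threshold values here are illustrative assumptions."""
    safety_important = cdf_contribution >= cdf_threshold
    reliability_important = aptp_contribution >= aptp_threshold
    return safety_important or reliability_important
```

    Either criterion alone suffices to keep a report, which matches the intent of selecting information significant "in terms of both safety and reliability".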

  9. Bottom quark contribution to spin-dependent dark matter detection

    Directory of Open Access Journals (Sweden)

    Jinmian Li

    2016-05-01

    Full Text Available We investigate a previously overlooked bottom quark contribution to the spin-dependent cross section for Dark Matter (DM) scattering from the nucleon. While the mechanism is relevant to any supersymmetric extension of the Standard Model, for illustrative purposes we explore the consequences within the framework of the Minimal Supersymmetric Standard Model (MSSM). We study two cases, namely those where the DM is predominantly Gaugino or Higgsino. In both cases, there is a substantial, viable region in parameter space (m_b̃ − m_χ ≲ O(100 GeV)) in which the bottom contribution becomes important. We show that a relatively large contribution from the bottom quark is consistent with constraints from spin-independent DM searches, as well as some incidental model-dependent constraints.

  10. Measurement of the bottom hadron lifetime at the Z⁰ resonance

    International Nuclear Information System (INIS)

    Fujino, D.H.

    1992-06-01

    We have measured the bottom hadron lifetime from b b̄ events produced at the Z⁰ resonance. Using the precision vertex detectors of the Mark II detector at the Stanford Linear Collider, we developed an impact parameter tag to identify bottom hadrons. The vertex tracking system resolved impact parameters to 30 μm for high momentum tracks, and 70 μm for tracks with a momentum of 1 GeV. We selected B hadrons with an efficiency of 40% and a sample purity of 80% by requiring that there be at least two tracks in a single jet that significantly miss the Z⁰ decay vertex. From a total of 208 hadronic Z⁰ events collected by the Mark II detector in 1990, we tagged 53 jets, of which 22 came from 11 double-tagged events. The jets opposite the tagged ones, referred to as the ''untagged'' sample, are rich in B hadrons and unbiased in B decay times. The variable Σδ is the sum of impact parameters from tracks in the jet, and contains vital information on the B decay time. We measured the B lifetime from a one-parameter likelihood fit to the untagged Σδ distribution, obtaining τ_b = 1.53 +0.55/−0.45 ± 0.16 ps, which agrees with the current world average. The first error is statistical and the second is systematic. The systematic error was dominated by uncertainties in the track resolution function. As a check, we also obtained consistent results using the Σδ distribution from the tagged jets and from the entire hadronic sample without any bottom enrichment.
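
    The impact-parameter tag described above requires at least two tracks in a jet that significantly miss the decay vertex. A toy version of such a counting tag (the 3σ significance cut is an assumed placeholder, not the analysis value; the resolutions echo the 30-70 μm figures quoted in the abstract):

```python
def tag_b_jet(track_impact_um, track_sigma_um, nsigma=3.0, min_tracks=2):
    """Tag a jet as bottom-like if at least min_tracks tracks have
    impact-parameter significance d/sigma above nsigma."""
    n_significant = sum(1 for d, s in zip(track_impact_um, track_sigma_um)
                        if d / s > nsigma)
    return n_significant >= min_tracks

# Two tracks at 300 and 250 um with 70 um resolution clearly miss the vertex.
tagged = tag_b_jet([300.0, 250.0, 20.0], [70.0, 70.0, 30.0])  # -> True
```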

  11. Integrating Top-down and Bottom-up Cybersecurity Guidance using XML

    Science.gov (United States)

    Lubell, Joshua

    2016-01-01

    This paper describes a markup-based approach for synthesizing disparate information sources and discusses a software implementation of the approach. The implementation makes it easier for people to use two complementary, but differently structured, guidance specifications together: the (top-down) Cybersecurity Framework and the (bottom-up) National Institute of Standards and Technology Special Publication 800-53 security control catalog. An example scenario demonstrates how the software implementation can help a security professional select the appropriate safeguards for restricting unauthorized access to an Industrial Control System. The implementation and example show the benefits of this approach and suggest its potential application to disciplines other than cybersecurity. PMID:27795810

  12. Integrating Top-down and Bottom-up Cybersecurity Guidance using XML.

    Science.gov (United States)

    Lubell, Joshua

    2016-08-01

    This paper describes a markup-based approach for synthesizing disparate information sources and discusses a software implementation of the approach. The implementation makes it easier for people to use two complementary, but differently structured, guidance specifications together: the (top-down) Cybersecurity Framework and the (bottom-up) National Institute of Standards and Technology Special Publication 800-53 security control catalog. An example scenario demonstrates how the software implementation can help a security professional select the appropriate safeguards for restricting unauthorized access to an Industrial Control System. The implementation and example show the benefits of this approach and suggest its potential application to disciplines other than cybersecurity.
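
    One way to picture the markup-based synthesis is a small XML document cross-referencing (top-down) Framework subcategories to (bottom-up) SP 800-53 controls, which software can then query. The schema below is invented for illustration and is not the paper's actual format:

```python
import xml.etree.ElementTree as ET

# Hypothetical cross-reference document (not the paper's actual schema).
DOC = """\
<mapping>
  <subcategory id="PR.AC-1">
    <control>AC-2</control>
    <control>IA-2</control>
  </subcategory>
  <subcategory id="PR.PT-3">
    <control>CM-7</control>
  </subcategory>
</mapping>"""

root = ET.fromstring(DOC)
# Map each Framework subcategory to its candidate 800-53 safeguards.
controls = {sc.get("id"): [c.text for c in sc.findall("control")]
            for sc in root.findall("subcategory")}
```

    A security professional restricting access to an Industrial Control System could then look up a subcategory such as PR.AC-1 and retrieve the associated controls directly.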

  13. Acoustic Profiling of Bottom Sediments in Large Oil Storage Tanks

    Science.gov (United States)

    Svet, V. D.; Tsysar', S. A.

    2018-01-01

    Characteristic features of acoustic profiling of bottom sediments in large oil storage tanks are considered. Basic acoustic parameters of crude oil and bottom sediments are presented. It is shown that, because of the presence of transition layers in the crude oil and strong reverberation effects in oil tanks, the volume of bottom sediments calculated from an acoustic surface image is generally overestimated. To reduce the error, additional post-processing of acoustic profilometry data is proposed, in combination with additional measurements of the vertical viscosity and density distributions at several points of the tank.

  14. Combining bottom-up and top-down

    International Nuclear Information System (INIS)

    Boehringer, Christoph; Rutherford, Thomas F.

    2008-01-01

    We motivate the formulation of market equilibrium as a mixed complementarity problem which explicitly represents weak inequalities and complementarity between decision variables and equilibrium conditions. The complementarity format permits an energy-economy model to combine technological detail of a bottom-up energy system with a second-best characterization of the over-all economy. Our primary objective is pedagogic. We first lay out the complementarity features of economic equilibrium and demonstrate how we can integrate bottom-up activity analysis into a top-down representation of the broader economy. We then provide a stylized numerical example of an integrated model - within both static and dynamic settings. Finally, we present illustrative applications to three themes figuring prominently on the energy policy agenda of many industrialized countries: nuclear phase-out, green quotas, and environmental tax reforms

  15. Combining bottom-up and top-down

    Energy Technology Data Exchange (ETDEWEB)

    Boehringer, Christoph [Department of Economics, University of Oldenburg, Oldenburg (Germany); Centre for European Economic Research (ZEW), Mannheim (Germany); Rutherford, Thomas F. [Ann Arbor, Michigan (United States)

    2008-03-15

    We motivate the formulation of market equilibrium as a mixed complementarity problem which explicitly represents weak inequalities and complementarity between decision variables and equilibrium conditions. The complementarity format permits an energy-economy model to combine technological detail of a bottom-up energy system with a second-best characterization of the over-all economy. Our primary objective is pedagogic. We first lay out the complementarity features of economic equilibrium and demonstrate how we can integrate bottom-up activity analysis into a top-down representation of the broader economy. We then provide a stylized numerical example of an integrated model - within both static and dynamic settings. Finally, we present illustrative applications to three themes figuring prominently on the energy policy agenda of many industrialized countries: nuclear phase-out, green quotas, and environmental tax reforms. (author)

  16. Daylighting performance evaluation of a bottom-up motorized roller shade

    Energy Technology Data Exchange (ETDEWEB)

    Kapsis, K.; Athienitis, A.K.; Zmeureanu, R.G. [Department of Building, Civil and Environmental Engineering, Concordia University, Montreal, QC (Canada); Tzempelikos, A. [School of Civil Engineering, Purdue University, West Lafayette, IN (United States)

    2010-12-15

    This paper presents an experimental and simulation study for quantifying the daylighting performance of bottom-up roller shades installed in office spaces. The bottom-up shade is a motorized roller shade that opens from top to bottom operating in the opposite direction of a conventional roller shade, so as to cover the bottom part of the window, while allowing daylight to enter from the top part of the window, reaching deeper into the room. A daylighting simulation model, validated with full-scale experiments, was developed in order to establish correlations between the shade position, outdoor illuminance and work plane illuminance for different outdoor conditions. Then, a shading control algorithm was developed for application in any location and orientation. The validated model was employed for a sensitivity analysis of the impact of shade optical properties and control on the potential energy savings due to the use of daylighting. The results showed that Daylight Autonomy for the bottom-up shade is 8-58% higher compared to a conventional roller shade, with a difference of 46% further away from the facade, where the use of electric lighting is needed most of the time. The potential reduction in energy consumption for lighting is 21-41%. (author)
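
    Daylight Autonomy, the metric compared above, is the fraction of occupied hours in which daylight alone meets a target work-plane illuminance. A minimal sketch (the 500 lx target is a common default, assumed here rather than taken from the paper; the hourly values are invented):

```python
def daylight_autonomy(hourly_lux, target_lux=500.0):
    """Fraction of occupied hours with work-plane illuminance >= target."""
    met = sum(1 for e in hourly_lux if e >= target_lux)
    return met / len(hourly_lux)

# Illustrative day: daylight meets the target in 2 of 4 occupied hours.
da = daylight_autonomy([620.0, 480.0, 510.0, 90.0])  # -> 0.5
```

    The 21-41% lighting energy savings follow directly: hours counted toward Daylight Autonomy are hours in which the electric lighting can stay off or dimmed.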

  17. Bottom up

    International Nuclear Information System (INIS)

    Ockenden, James

    1999-01-01

    This article presents an overview of the electricity supply industries in Eastern Europe. The development of more competitive and efficient plant in Poland and work on emissions control ahead of EU membership; the Czech Republic's complicated tariff system; Hungary's promised 8% return on investment in its electricity supply industry and its tariff problems; Bulgaria's and Ukraine's desperate need for investment to build alternatives to their aging nuclear plants; and demand outstripping supply in Romania are among the topics considered. The vicious circle of poor service and low utility income is examined, and the top-down approach to breaking the cycle by improving plant efficiency, as well as the bottom-up approach of improving plant income as practiced in Moldova, are explained. (UK)

  18. Sediment movement along the U.S. east coast continental shelf-I. Estimates of bottom stress using the Grant-Madsen model and near-bottom wave and current measurements

    Science.gov (United States)

    Lyne, V.D.; Butman, B.; Grant, W.D.

    1990-01-01

    Bottom stress is calculated for several long-term time-series observations, made on the U.S. east coast continental shelf during winter, using the wave-current interaction and moveable bed models of Grant and Madsen (1979, Journal of Geophysical Research, 84, 1797-1808; 1982, Journal of Geophysical Research, 87, 469-482). The wave and current measurements were obtained by means of a bottom tripod system which measured current using a Savonius rotor and vane and waves by means of a pressure sensor. The variables were burst sampled about 10% of the time. Wave energy was reasonably resolved, although aliased by wave groupiness, and wave period was accurate to 1-2 s during large storms. Errors in current speed and direction depend on the speed of the mean current relative to the wave current. In general, errors in bottom stress caused by uncertainties in measured current speed and wave characteristics were 10-20%. During storms, the bottom stress calculated using the Grant-Madsen models exceeded stress computed from conventional drag laws by a factor of about 1.5 on average and 3 or more during storm peaks. Thus, even in water as deep as 80 m, oscillatory near-bottom currents associated with surface gravity waves of period 12 s or longer will contribute substantially to bottom stress. Given that the Grant-Madsen model is correct, parameterizations of bottom stress that do not incorporate wave effects will substantially underestimate stress and sediment transport in this region of the continental shelf.
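
    The "conventional drag law" that the Grant-Madsen stress is compared against treats bottom stress as quadratic in the near-bottom current alone, with no wave contribution. A sketch (the drag coefficient is a typical shelf value assumed here, not a number from the paper):

```python
RHO_SEAWATER = 1025.0   # seawater density, kg/m^3
CD = 2.5e-3             # quadratic drag coefficient (assumed typical value)

def drag_law_stress(u_bottom):
    """Current-only bottom stress tau = rho * Cd * |u| * u, in N/m^2.
    Wave-current interaction models such as Grant-Madsen add the oscillatory
    wave velocity and can exceed this estimate severalfold during storms."""
    return RHO_SEAWATER * CD * abs(u_bottom) * u_bottom

tau = drag_law_stress(0.2)   # stress for a 0.2 m/s near-bottom current
```

    The abstract's finding is that during storms the Grant-Madsen stress exceeds this current-only estimate by a factor of about 1.5 on average and 3 or more at storm peaks.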

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 1989 - 2003 (NODC Accession 0002809)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2006 - 2007 (NODC Accession 0029107)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2005 - 2006 (NODC Accession 0014268)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Alligator Reef, 2005 - 2007 (NODC Accession 0019351)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 2005 - 2007 (NODC Accession 0039973)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Tennessee Reef, 1990 - 2004 (NODC Accession 0002749)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Triumph Reef, 1990 - 2006 (NODC accession 0013166)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 1990 - 2005 (NODC Accession 0011143)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Cape Florida, 2005 - 2006 (NODC Accession 0014185)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Iselin, 2006 - 2007 (NODC Accession 0039240)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bullard Bank, 1992 - 2005 (NODC Accession 0010426)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Cape Florida, 1996 - 2005 (NODC Accession 0002788)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  11. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sprigger Bank, 1992 - 2006 (NODC accession 0013114)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  12. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Iselin, 2004-2006 (NODC Accession 0014271)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  13. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Long Key, 2008 - 2010 (NODC Accession 0093063)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Smith Shoal, 1998 - 2006 (NODC Accession 0014121)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Buoy, 1988 - 2004 (NODC Accession 0002616)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bullard Bank, 2005 - 2007 (NODC Accession 0039881)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  17. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Triumph Reef, 1990 - 2006 (NODC Accession 0013166)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Carysfort Reef, 2006-2010 (NODC Accession 0093022)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 2007 - 2010 (NODC Accession 0093026)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Alligator Reef, 2007-2010 (NODC Accession 0093017)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Tennessee Reef, 2004-2006 (NODC Accession 0014272)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Long Key, 2005-2006 (NODC Accession 0014269)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Evaluation of the contamination level of sea bottom sediments on the Crimean coast of the Black and Azov Seas

    Directory of Open Access Journals (Sweden)

    Tikhonova Elena

    2016-12-01

    At most stations in the Azov Sea, the content of heavy metals (HM) exceeded the values obtained in the Black Sea. At present (2016), bottom sediments of the open Crimean coast of the Black Sea have properties typical of marine sediments of the studied area. There is an upward trend in the content of chloroform-extracted substances in the Black Sea region, but the sediments are not contaminated with oil products. Taking into account the physical-chemical characteristics of the marine sediments, it can be stated that the condition of the studied area as a whole is safe.

  4. Fluidized bed combustion bottom ash: A better and alternative geo-material resource for construction.

    Science.gov (United States)

    Mandal, A K; Paramkusam, Bala Ramudu; Sinha, O P

    2018-04-01

Though the majority of research on fly ash has proved its worth as a construction material, the utility of bottom ash is still questionable because it is generated during the pulverized combustion process. The bottom ash produced during the fluidized bed combustion (FBC) process is attracting more attention due to the novelty of this coal combustion technology. However, to establish its suitability as a construction material, it must be characterized thoroughly from both geotechnical and mineralogical points of view. To fulfil these objectives, the present study aims at characterizing FBC bottom ash and comparing it with pulverized coal combustion (PCC) bottom ash derived from coal of the same origin. The suitability of FBC bottom ash, in contrast to PCC bottom ash, as a dike filter material replacing a traditional filter material such as sand was also studied. The suitability criteria for utilization of both bottom ash and river sand as filter material on pond ash as a base material were evaluated, and both river sand and FBC bottom ash were found to be satisfactory. The study shows that FBC bottom ash is a better geo-material than PCC bottom ash, and it can be recommended as an alternative filter material for constructing ash dikes in place of conventional sand.

  5. Bottom-type scattering layers and equatorial spread F

    Directory of Open Access Journals (Sweden)

    D. L. Hysell

    2004-12-01

Full Text Available: Jicamarca radar observations of bottom-type coherent scattering layers in the post-sunset bottomside F-region ionosphere are presented and analyzed. The morphology of the primary waves seen in radar images of the layers supports the hypothesis of Kudeki and Bhattacharyya (1999) that wind-driven gradient drift instabilities are operating. In one layer event, when topside spread F did not occur, irregularities were distributed uniformly in space throughout the layers. In another event, when topside spread F did eventually occur, the irregularities within the pre-existing bottom-type layers were horizontally clustered, with clusters separated by about 30 km. The same horizontal periodicity was evident in the radar plumes and large-scale irregularities that emerged later in the event. We surmise that horizontal periodicity in the bottom-type layer irregularity distribution is indicative of large-scale horizontal waves in the bottomside F-region that may serve as seed waves for large-scale Rayleigh-Taylor instabilities. Key words. Ionosphere (equatorial ionosphere; ionospheric irregularities; plasma waves and instabilities)

  6. Bottom mass from nonrelativistic sum rules at NNLL

    Energy Technology Data Exchange (ETDEWEB)

    Stahlhofen, Maximilian

    2013-01-15

We report on a recent determination of the bottom quark mass from nonrelativistic (large-n) Upsilon sum rules with renormalization group improvement (RGI) at next-to-next-to-leading logarithmic (NNLL) order. The comparison to previous fixed-order analyses shows that the RGI computed in the vNRQCD framework leads to a substantial stabilization of the theoretical sum rule moments with respect to scale variations. A single moment fit (n=10) to the available experimental data yields M_b^{1S} = 4.755 ± 0.057_pert ± 0.009_αs ± 0.003_exp GeV for the bottom 1S mass and m̄_b(m̄_b) = 4.235 ± 0.055_pert ± 0.003_exp GeV for the bottom MS-bar mass. The quoted uncertainties refer to the perturbative error and the uncertainties associated with the strong coupling and the experimental input.
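Independent error sources like those quoted above (perturbative, alpha_s, experimental) are conventionally combined in quadrature to obtain a total uncertainty; the combination rule below is a standard assumption on our part, not something stated in the record:

```python
import math

def combine_in_quadrature(*errors):
    """Total uncertainty from independent error sources, added in quadrature."""
    return math.sqrt(sum(e * e for e in errors))

# 1S mass: perturbative, alpha_s, and experimental uncertainties (GeV).
total_1s = combine_in_quadrature(0.057, 0.009, 0.003)
# MS-bar mass: perturbative and experimental uncertainties (GeV).
total_msbar = combine_in_quadrature(0.055, 0.003)
```

This gives roughly ±0.058 GeV on the 1S mass and ±0.055 GeV on the MS-bar mass, i.e. the perturbative error dominates both results.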

  7. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

Reasoning from information extraction by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including the sensor's biased data, each tessera in the high-density point cloud from the 3D-captured complex mosaics of Germigny-des-prés (France) is segmented via a colour multi-scale, abstraction-based feature extraction that accounts for connectivity. A 2D surface and an outline polygon of each tessera are generated by RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
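The outline-polygon step (convex hull fitting on tessera points) can be sketched with Andrew's monotone chain algorithm. This is a generic illustration, not the authors' implementation, and it assumes the RANSAC plane step has already projected the points to 2D:

```python
def convex_hull(points):
    """Andrew's monotone chain: convex hull of 2D points, counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a counter-clockwise turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:  # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):  # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Outline of a square tessera with one interior point: the interior point drops out.
hull = convex_hull([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])
```

In a real pipeline the hull vertices would then be stored as the tessera's outline polygon for the knowledge-based classification step.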

  8. SIMULATION OF ANALYTICAL TRANSIENT WAVE DUE TO DOWNWARD BOTTOM THRUST

    Directory of Open Access Journals (Sweden)

    Sugih Sudharma Tjandra

    2015-11-01

The generation process is an important part of understanding waves, especially tsunamis. A large earthquake under the sea is one major cause of tsunamis. The sea surface deforms as a response to the sea bottom motion caused by the earthquake. An analytical description of the surface wave generated by bottom motion can be obtained from the linearized dispersive model. For a bottom motion in the form of a downward movement, the result is expressed in terms of an improper integral. Here, we focus on analyzing the convergence of this integral; the improper integral is then approximated by a finite integral so that it can be evaluated numerically. Further, we simulate the free surface elevation for three different types of bottom motion, classified as impulsive, intermediate, and slow movements. We demonstrate that the wave propagates to the right, with a depression as the leading wave, followed by subsequent wave crests. This phenomenon is often observed in most tsunami events.
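The truncation idea above, replacing an improper integral over [0, ∞) with a finite one evaluated numerically, can be sketched as follows. The integrand here is a toy decaying oscillation with known value 1/2, chosen only to illustrate the method; it is not the actual kernel of the linearized dispersive model:

```python
import math

def truncated_integral(f, upper, n=100_000):
    """Approximate an improper integral over [0, inf) by truncating at `upper`
    and applying the composite trapezoidal rule with n subintervals."""
    h = upper / n
    total = 0.5 * (f(0.0) + f(upper))
    for i in range(1, n):
        total += f(i * h)
    return total * h

# Toy integrand whose decay guarantees convergence: int_0^inf e^{-k} cos(k) dk = 1/2.
approx = truncated_integral(lambda k: math.exp(-k) * math.cos(k), upper=40.0)
```

The truncation point is chosen where the decaying factor makes the tail negligible (here e^{-40}), which mirrors the convergence analysis described in the abstract.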

  9. Safe jack-up method permits repairs of tank bottoms and foundations

    International Nuclear Information System (INIS)

    de Wit, J.

    1991-01-01

The oil and chemical industries use many thousands of steel tanks to store crude oil, oil products, and chemical liquids. The majority of these tanks are 30-40 years old, and tank bottoms are likely to begin leaking in the coming years as these tanks get older. The European technique of jacking up a tank and repairing its foundation allows thorough inspection of the underside of the tank bottom and removal of saturated foundation material, and the possibility of soil and groundwater pollution is reduced to a minimum. With good, regular maintenance, the lifetime of a storage tank is very long, but experience has shown that special attention should be paid to the tank's bottom. Tank bottoms are only 5 or 6 mm thick, and in the last 10 years an increasing number of leaks in tank bottoms have been reported. Tank foundations are affected by these leaks. This article describes the resulting procedure, which is used successfully in many European countries but is not yet common in the U.S.

  10. The Vaendoera test road, Sweden: A case study of long-term properties of roads constructed with MSWI bottom ash; Projekt Vaendoera: En studie av laangtidsegenskaper hos vaegar anlagda med bottenaska fraan avfallsfoerbraenning

    Energy Technology Data Exchange (ETDEWEB)

    Bendz, David; Arm, Maria; Westberg, Gunnar; Sjoestrand, Karin; Lyth, Martin; Wik, Ola [Swedish Geotechnical Inst., Linkoeping (Sweden); Flyhammar, Peter [Lund Inst. of Technology (Sweden). Dept. of Water Resources Engineering

    2006-03-15

The accumulated effects of leaching and aging in a subbase layer of bottom ash were investigated in this study. The paved test road was constructed in 1987 in Linkoeping, Sweden, and was used until the start of this study. The objective of this study was to investigate: (i) the accumulated effects of leaching and aging; (ii) the accumulated effects of load and aging on the geotechnical properties; (iii) the prerequisites for separate excavation of the bottom ash for possible reuse. The study started in September 2003 and included tests with a falling weight deflectometer, triaxial testing on undisturbed core samples of bottom ash, and sampling for chemical analysis. Three trenches were excavated in the test road; samples of the subbase layer and the subgrade were taken from the shaft walls and brought to the laboratory for leaching tests (EN 12457-2) and extraction, respectively. The extraction procedure was used to estimate the extractable and chemically available fractions. It was found that the steady increase in stiffness that had been detected by falling weight deflectometer during the first years after construction had ceased. The undisturbed samples showed stiffness comparable with recently produced bottom ash from the same incineration plant, but lower stiffness compared with the reference material of crushed rock. The permanent deformation was significantly larger for the samples compared with the crushed rock and recent (1999-2001) bottom ash from other incineration plants. The spatial distribution patterns of leachable, easily soluble constituents reveal the existence of horizontal gradients directed from the center of the road towards its shoulders. This implies that horizontal transport by diffusion is the rate-limiting leaching process for all easily soluble constituents underneath the pavement in a road. The bottom ash that was used in the sub-base layer was fresh at the time of construction of the test road, with a pH of about 11.
Measured p

  11. Bottom reflector for power reactors

    International Nuclear Information System (INIS)

    Elter, C.; Kissel, K.F.; Schoening, J.; Schwiers, H.G.

    1982-01-01

In pebble bed reactors, erosion and damage due to fuel element movement on the surface of the bottom reflector should be minimized. This can be achieved by chamfering and/or rounding the cover edges of the graphite blocks and the edges between the drilled holes and the surface of the graphite block. (orig.) [de]

  12. 14 CFR 25.533 - Hull and main float bottom pressures.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Hull and main float bottom pressures. 25... AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Structure Water Loads § 25.533 Hull and main float bottom pressures. (a) General. The hull and main float structure, including frames and bulkheads...

  13. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images as an attempt to increase noise robustness in mobile environments. The proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show the effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.
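The multistream HMM technique mentioned above scores each state as a weighted sum of per-stream log-likelihoods, with the weight shifted toward the more reliable stream. A minimal sketch with made-up log-likelihood values (the actual weighting scheme of the paper is not specified in the record):

```python
def fused_log_likelihood(audio_ll, visual_ll, stream_weight):
    """Multistream HMM state score: weighted sum of per-stream log-likelihoods.
    stream_weight in [0, 1] is the exponent applied to the audio stream."""
    return stream_weight * audio_ll + (1.0 - stream_weight) * visual_ll

# In clean speech the audio stream dominates; in noise the weight shifts to video.
clean = fused_log_likelihood(-2.0, -5.0, stream_weight=0.9)
noisy = fused_log_likelihood(-8.0, -5.0, stream_weight=0.3)
```

With a degraded audio score, lowering the audio stream weight keeps the fused score closer to the still-reliable visual stream, which is the mechanism behind the improved noise robustness.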

  14. Co-sintering of treated APC-residues with bottom ash

    DEFF Research Database (Denmark)

    Jensen, Dorthe Lærke; Bergfeldt, Britta; Vehlow, Jürgen

    2001-01-01

Air pollution control residues stabilised by means of the Ferrox process can be safely disposed of due to lower contents of soluble salts and less soluble heavy metals stabilised in iron oxides. Co-combustion tests in the Karlsruhe test incinerator TAMARA were carried out in order to investigate the influence of co-sintering of Ferrox products with bottom ashes on the quality of the residues and the effects on the combustion process. Only a few elements showed higher concentrations in the bottom ashes of these co-combustion tests compared to reference tests. No significant effect on the leaching behaviour of the bottom ashes could be found. During the co-combustion process, an increase in SO2 concentrations in the raw gas and slightly lower temperatures in the fuel bed could be observed.

  15. Processed bottom ash for replacing fine aggregate in making high-volume fly ash concrete

    Directory of Open Access Journals (Sweden)

    Antoni

    2017-01-01

Bottom ash is a coal plant by-product that is abundant and underutilized. There is potential for using bottom ash as a fine aggregate replacement in concrete mixtures; however, the problems of water absorption and uniformity of material quality need to be overcome first. In this study, bottom ash was treated by sieve separation and by pounding to a smaller particle size for use as a sand substitute. The physical and chemical characteristics of the bottom ash were tested after treatment, including water absorption, sieve analysis, and fineness modulus. High-volume fly ash (HVFA) mortar specimens were made, and the compressive strength and flowability of specimens using treated bottom ash were compared with those of the sand specimen. A low water-to-cementitious ratio was used to ensure higher strength from the cementitious paste, and the superplasticizer demand was determined for each treatment. The results showed that bottom ash can be used as a fine aggregate replacement material. Sieve-separated bottom ash could produce 75% of the compressive strength of the control sand specimen, whereas pounded bottom ash could reach up to 96% of the compressive strength of the control specimen. A 28-day compressive strength of 45 MPa was achievable with 100% replacement of fine aggregate with bottom ash.

  16. A Measurement of Bottom Quark Production in $p\\bar{p}$ Collisions at $\\sqrt{s}$ = 1.8-TeV.

    Energy Technology Data Exchange (ETDEWEB)

    Huehn, Thorsten Bernhard [UC, Riverside

    1995-12-01

We present a measurement of the bottom quark production cross section at a center of mass energy of 1.8 TeV in the rapidity region $\mid y \mid$ < 1 and the transverse momentum range 13 to 37 GeV/c. The measurement is extracted from a dataset of 2707 events containing muons and jets, corresponding to an integrated luminosity of $228 nb^{-1}$, taken during the 1992-93 collider run of the Tevatron proton-antiproton collider at Fermilab. The measurement is about two standard deviations above QCD predictions, but is consistent with them within the measurement uncertainties and the uncertainties in the QCD calculation.

  17. Ultraclean single, double, and triple carbon nanotube quantum dots with recessed Re bottom gates

    Science.gov (United States)

    Jung, Minkyung; Schindele, Jens; Nau, Stefan; Weiss, Markus; Baumgartner, Andreas; Schoenenberger, Christian

    2014-03-01

Ultraclean carbon nanotubes (CNTs) that are free from disorder provide a promising platform to manipulate single electron or hole spins for quantum information. Here, we demonstrate that ultraclean single, double, and triple quantum dots (QDs) can be formed reliably in a CNT by a straightforward fabrication technique. The QDs are electrostatically defined in the CNT by closely spaced metallic bottom gates deposited in trenches in silicon dioxide by sputter deposition of Re. The carbon nanotubes are then grown by chemical vapor deposition (CVD) across the trenches and contacted using conventional electron beam lithography. The devices reproducibly exhibit the characteristics of ultraclean QDs even after the subsequent electron beam lithography and chemical processing steps. We demonstrate the high quality using CNT devices with two narrow bottom gates and one global back gate. Tuned by the gate voltages, the device can be operated in four different regimes: i) fully p-type with ballistic transport between the outermost contacts (over a length of 700 nm); ii) clean n-type single QD behavior, where a QD can be induced by either the left or the right bottom gate; iii) n-type double QD; and iv) bipolar triple QD, where the middle QD has opposite doping (p-type). Research at Basel is supported by the NCCR-Nano, NCCR-QIST, ERC project QUEST, and FP7 project SE2ND.

  18. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

A simple programmable Tse processor organization and the arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the Tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  19. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

The rapid growth in IT over the last two decades has led to a growth in the amount of information available online. A new style of sharing information is social media, a continuously and instantly updated source of information. In this position paper, we propose a framework for

  20. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

Based on a frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The overall system architecture and its modules are briefly introduced, the core of the system is then described in detail, and finally a system prototype is presented.
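Frequent subtree mining in its simplest form counts structural patterns that recur across many page trees, so that a shared comment template stands out from page-specific markup. The sketch below mines only depth-one parent→child tag patterns; real frequent-subtree algorithms (e.g. TreeMiner) enumerate larger embedded subtrees, and the record does not specify which algorithm the FSM system uses:

```python
from collections import Counter

def edge_patterns(tree):
    """Yield (parent_tag, child_tag) pairs from a (tag, children) tree."""
    tag, children = tree
    for child in children:
        yield (tag, child[0])
        yield from edge_patterns(child)

def frequent_edges(pages, min_support):
    """Edges occurring in at least `min_support` page trees (per-page support)."""
    support = Counter()
    for page in pages:
        for edge in set(edge_patterns(page)):  # count each edge once per page
            support[edge] += 1
    return {edge for edge, count in support.items() if count >= min_support}

# Three toy comment pages sharing a div -> span comment template.
pages = [
    ("div", [("span", []), ("a", [])]),
    ("div", [("span", []), ("img", [])]),
    ("div", [("span", [])]),
]
common = frequent_edges(pages, min_support=3)
```

Patterns that survive the support threshold across all sampled pages are candidates for the comment template; the extraction stage would then pull text from nodes matching those patterns.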

  1. Use of Incineration Solid Waste Bottom Ash as Cement Mixture in Cement Production

    Science.gov (United States)

    Jun, N. H.; Abdullah, M. M. A. B.; Jin, T. S.; Kadir, A. A.; Tugui, C. A.; Sandu, A. V.

    2017-06-01

Incineration solid waste bottom ash was used to examine its suitability as a substitute in cement production. This study developed an innovative technology option for designing a new equivalent cement that contains incineration solid waste bottom ash. The compressive strength of the samples was determined at 7, 14, 28 and 90 days. The results for the cement mixture containing incineration waste bottom ash were compared to the control cement and showed that the bottom ash cement mixture was able to achieve performance equivalent to the control cement, meeting the requirements of the standards according to EN 196-1. The pozzolanic activity index of the bottom ash cement mixture reached 0.92 at 28 days and 0.95 at 90 days; these values indicate a pozzolanic material with positive pozzolanic activity. Calcium hydroxide in the Portland cement decreased with increasing replacement of bottom ash, owing to the reaction between Ca(OH)2 and active SiO2.

  2. Utilization of power plant bottom ash as aggregates in fiber-reinforced cellular concrete.

    Science.gov (United States)

    Lee, H K; Kim, H K; Hwang, E A

    2010-02-01

Recently, millions of tons of bottom ash waste from thermoelectric power plants have been disposed of in landfills and coastal areas, regardless of its potential for recycling in construction fields. Fiber-reinforced cellular concrete (FRCC) of low density and high strength may be attainable through the addition of bottom ash due to its relatively high strength. This paper focuses on evaluating the feasibility of utilizing bottom ash from thermoelectric power plant waste as aggregate in FRCC. The flow characteristics of cement mortar with bottom ash aggregates and the effect of aggregate type and size on concrete density and compressive strength were investigated. In addition, the effects of adding steel and polypropylene fibers for improving the strength of the concrete were also investigated. The results from this study suggest that bottom ash can be applied as a construction material, which may not only improve the compressive strength of FRCC significantly but also reduce problems related to bottom ash waste.

  3. C-STrap Sample Preparation Method--In-Situ Cysteinyl Peptide Capture for Bottom-Up Proteomics Analysis in the STrap Format.

    Directory of Open Access Journals (Sweden)

    Alexandre Zougman

Recently we introduced the concept of Suspension Trapping (STrap) for bottom-up proteomics sample processing, which is based upon SDS-mediated protein extraction, swift detergent removal and rapid reactor-type protein digestion in a quartz depth filter trap. As the depth filter surface is made of silica, it is readily modifiable with various functional groups using silane coupling chemistries. Thus, during the digest, peptides possessing specific features can be targeted for enrichment by the functionalized depth filter material, while non-targeted peptides can be collected as a distinct unbound fraction after the digest. In the example presented here, the quartz depth filter surface is functionalized with the pyridyldithiol group, thereby enabling reversible in-situ capture of the cysteine-containing peptides generated during the STrap-based digest. The described C-STrap method retains all advantages of the original STrap methodology and provides a robust foundation for targeted in-situ peptide fractionation in the STrap format for bottom-up proteomics. The presented data support the method's use in qualitative and semi-quantitative proteomics experiments.

  4. Effect of Redox Potential on Changing of Binding Forms of Heavy Metals in Bottom Sediments of Anzali International Wetland

    International Nuclear Information System (INIS)

    Saeedi, M.; Fakhari, M.

    2016-01-01

Heavy metals are naturally present in different chemical binding forms within sediment. Different factors affect metal binding in sediment; one of these factors is a change in redox potential, which may vary under the oxic/anoxic conditions in bottom sediments. In the present study, the effect of redox potential on the fractionation and binding of metals within the bottom sediment of the Anzali international wetland is investigated. Sediment samples from the Anzali wetland were aerated for one month, and the redox potential and pH were measured on days 0, 1, 7, 21, and 28. Subsamples of the sediments taken on those days were analyzed for the different chemical binding forms of Cu, Zn, Ni, and Cr using sequential extraction analysis. The results revealed that the majority of Cu was present in the sulfidic/organic fraction, while Zn was associated with Fe/Mn oxides. Nickel and Cr were mostly associated with the resistant residual fraction. At the end of the aeration process, with increasing redox potential, 8-23% of the metals were released from the sediments into the dissolved phase, mainly from the sulfide/organic fraction.

  5. C-STrap Sample Preparation Method--In-Situ Cysteinyl Peptide Capture for Bottom-Up Proteomics Analysis in the STrap Format.

    Science.gov (United States)

    Zougman, Alexandre; Banks, Rosamonde E

    2015-01-01

Recently we introduced the concept of Suspension Trapping (STrap) for bottom-up proteomics sample processing, which is based upon SDS-mediated protein extraction, swift detergent removal and rapid reactor-type protein digestion in a quartz depth filter trap. As the depth filter surface is made of silica, it is readily modifiable with various functional groups using silane coupling chemistries. Thus, during the digest, peptides possessing specific features can be targeted for enrichment by the functionalized depth filter material, while non-targeted peptides can be collected as a distinct unbound fraction after the digest. In the example presented here, the quartz depth filter surface is functionalized with the pyridyldithiol group, thereby enabling reversible in-situ capture of the cysteine-containing peptides generated during the STrap-based digest. The described C-STrap method retains all advantages of the original STrap methodology and provides a robust foundation for targeted in-situ peptide fractionation in the STrap format for bottom-up proteomics. The presented data support the method's use in qualitative and semi-quantitative proteomics experiments.

  6. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  7. Physical and Chemical Properties of Coal Bottom Ash (CBA) from Tanjung Bin Power Plant

    Science.gov (United States)

    Izzati Raihan Ramzi, Nurul; Shahidan, Shahiron; Zulkhairi Maarof, Mohamad; Ali, Noorwirdawati

    2016-11-01

The objective of this study is to determine the physical and chemical characteristics of Coal Bottom Ash (CBA) obtained from the Tanjung Bin Power Plant Station and compare them with the characteristics of natural river sand (as a replacement for fine aggregates). Bottom ash is the by-product of coal combustion during the electricity generating process. However, excess bottom ash production due to the high production of electricity in Malaysia has caused several environmental problems. Therefore, several tests were conducted in order to determine the physical and chemical properties of the bottom ash, such as specific gravity, density, particle size distribution, Scanning Electron Microscopy (SEM) and X-Ray Fluorescence (XRF), in an attempt to produce a sustainable material from waste. The results indicated that the natural fine aggregate and coal bottom ash have very different physical and chemical properties. The bottom ash was classified as Class C ash. The porous structure and the angular, rough texture of the bottom ash affected its specific gravity and particle density. From the tests, it was found that bottom ash can be recommended for use in concrete as a replacement for fine aggregates.

  8. Experimental Study on the Measurement of Water Bottom Vibration Induced by Underwater Drilling Blasting

    Directory of Open Access Journals (Sweden)

    Gu Wenbin

    2015-01-01

Due to the lack of proper instrumentation and the difficulties of underwater measurement, studies of water bottom vibration induced by underwater drilling blasting are seldom reported. In order to investigate the propagation and attenuation laws of blasting-induced water bottom vibration, a water bottom vibration monitor was developed with consideration of the difficulties in underwater measurements. By means of this equipment, the actual water bottom vibration induced by underwater drilling blasting was measured in a field experiment. The results show that the water bottom vibration monitor could collect vibration signals quite effectively in underwater environments. The subsequent signal analysis shows that the characteristics of water bottom vibration and land ground vibration induced by the same underwater drilling blasting are quite different due to the different geological environments. The amplitude and frequency band of water bottom vibration both exceed those of land ground vibration. Water bottom vibration lies mainly in the low-frequency band and is induced by the blasting impact acting directly on rock. Besides the low-frequency component, land vibration contains another, higher-frequency component induced by the subsequent water hammer wave acting on the bank slope.

  9. Aluminium recovery from waste incineration bottom ash, and its oxidation level.

    Science.gov (United States)

    Biganzoli, Laura; Grosso, Mario

    2013-09-01

The recovery of aluminium (Al) scraps from waste incineration bottom ash is becoming a common practice in waste management. However, during the incineration process, Al in the waste undergoes oxidation processes that reduce its recycling potential. This article investigates the behaviour of Al scraps in the furnaces of two selected grate-fired waste-to-energy plants and the amount recoverable from the bottom ash. About 21-23% of the Al fed to the furnace with the residual waste was recovered and potentially recyclable from the bottom ash. Of this amount, 76-87% was found in the bottom ash fraction above 5 mm and can thus be recovered with standard eddy current separation technology. These values depend on the characteristics and the mechanical strength of the Al items in the residual waste. Considering Al packaging materials, about 81% of the Al in cans can be recovered from the bottom ash as an ingot, but this amount decreases to 51% for trays, 27% for a mix of aluminium and poly-laminated foils, and 47% for paper-laminated foils. This shows that the recovery of Al from the incineration residues increases proportionally with the thickness of the packaging.
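The overall recoverable share follows from multiplying the two fractions quoted above: the portion of input Al reporting to the bottom ash and the portion of that found in the >5 mm fraction amenable to eddy current separation. A small arithmetic sketch (our reading of the quoted ranges, not a calculation taken from the paper):

```python
def recoverable_fraction(reported_to_bottom_ash, above_5mm_share):
    """Fraction of input Al recoverable by standard eddy current separation."""
    return reported_to_bottom_ash * above_5mm_share

# Lower and upper bounds built from the ranges quoted in the abstract.
frac_low = recoverable_fraction(0.21, 0.76)   # ~16% of input Al
frac_high = recoverable_fraction(0.23, 0.87)  # ~20% of input Al
```

So, under this reading, roughly 16-20% of the Al fed to the furnace is recoverable with standard equipment; the remainder is oxidized or locked in the fine fraction.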

  10. Layered growth with bottom-spray granulation for spray deposition of drug.

    Science.gov (United States)

    Er, Dawn Z L; Liew, Celine V; Heng, Paul W S

    2009-07-30

    The gap in scientific knowledge on bottom-spray fluidized bed granulation has emphasized the need for more studies in this area. This paper comparatively studied the applicability of a modified bottom-spray process and the conventional top-spray process for the spray deposition of a micronized drug during granulation. The differences in circulation pattern, mode of growth and resultant granule properties between the two processes were highlighted. The more ordered and consistent circulation pattern of particles in a bottom-spray fluidized bed was observed to give rise to layered granule growth. This resulted in better drug content uniformity among the granule batches and within a granule batch. The processes' sensitivities to wetting and feed material characteristics were also compared and found to differ markedly. Less robustness to differing process conditions was observed for the top-spray process. The resultant bottom-spray granules formed were observed to be less porous, more spherical and had good flow properties. The bottom-spray technique can thus be potentially applied for the spray deposition of drug during granulation and was observed to be a good alternative to the conventional technique for preparing granules.

  11. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  12. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

Full Text Available Abstract Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs) ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy- and decision-makers with robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to

  13. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Science.gov (United States)

    2012-01-01

Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs) ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy- and decision-makers with robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to provide information on the
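The bottom-up framework amounts to summing per-tonne cost components into a break-even REDD+ price and examining each component's share. A minimal sketch, using mid-range values picked from the ranges quoted above (not the study's exact project figures):

```python
# Bottom-up aggregation of REDD+ cost elements (US$ per tCO2).
# Component values are illustrative mid-range assumptions.
costs = {
    "opportunity":    11.3,   # mean opportunity cost, 10.1-12.5 range
    "implementation":  8.0,   # within the 4.5-12.2 range
    "transaction":     0.8,   # MRV and compliance, 0.21-1.46 range
    "institutional":   0.09,  # 0.06-0.11 range
}

total = sum(costs.values())
shares = {name: value / total for name, value in costs.items()}

print(f"Break-even REDD+ cost: US$ {total:.2f}/tCO2")
for name, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"  {name:>14}: {share:.1%}")
```

With these assumptions, opportunity and implementation costs dominate the total, consistent with the abstract's finding that transaction and institutional costs are a minor share.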

  14. Ocean Bottom Seismic Scattering

    Science.gov (United States)

    1989-11-01

EPR, the Clipperton and Orozco fracture zones, and along the coast of Mexico, were recorded for a two-month period using ocean bottom seismometers... Tuthill, J.D., Lewis, B.R., and Garmany, J.D., 1981, Stoneley waves, Lopez Island noise, and deep sea noise from 1 to 5 Hz, Marine Geophysical...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 1991 - 2005 (NODC Accession 0012739)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2004-2006 (NODC Accession 0014184)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  17. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 1990 - 2005 (NODC Accession 0012769)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 2005 - 2007 (NODC Accession 0039986)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 2007 - 2010 (NODC Accession 0093065)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Card Sound Bridge, 2004 - 2006 (NODC Accession 0014266)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 2005 - 2007 (NODC Accession 0040080)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 9-Ft Shoal, 2007-2010 (NODC Accession 0092549)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Ball Buoy Reef, 1990 - 1998 (NODC Accession 0002781)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 1998 - 2006 (NODC Accession 0039481)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bahia Honda Bridge, 2007-2011 (NODC Accession 0093018)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Snake Creek Bridge, 1989 - 2005 (NODC accession 0013148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Harbor Key Bank, 1992 - 1997 (NODC Accession 0013553)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sombrero Reef Lighthouse, 1991-2005 (NODC Accession 0013726)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge, 2005 - 2007 (NODC Accession 0039469)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge, 2007-2010 (NODC Accession 0092548)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  11. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 9-FT Shoal, 2005-2007 (NODC Accession 0039533)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  12. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 2006 - 2007 (NODC Accession 0039817)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  13. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2006 - 2007 (NODC Accession 0039818)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Snake Creek Bridge, 1989 - 2005 (NODC Accession 0013148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 2007-2009 (NODC Accession 0090835)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Chemical quality of water and bottom sediment, Stillwater National Wildlife Refuge, Lahontan Valley, Nevada

    Science.gov (United States)

    Thodal, Carl E.

    2017-12-28

The U.S. Geological Survey, in cooperation with the U.S. Fish and Wildlife Service, collected data on water and bottom-sediment chemistry to be used to evaluate a new water rights acquisition program designed to enhance wetland habitat in Stillwater National Wildlife Refuge and in Lahontan Valley, Churchill County, Nevada. The area supports habitat critical to the feeding and resting of migratory birds travelling the Pacific Flyway. Information about how water rights acquisitions may affect the quality of water delivered to the wetlands is needed by stakeholders and Stillwater National Wildlife Refuge managers in order to evaluate the effectiveness of this approach to wetlands management. A network of six sites on waterways that deliver the majority of water to Refuge wetlands was established to monitor the quality of streamflow and bottom sediment. Each site was visited every 4 to 6 weeks, and selected water-quality field parameters were measured when flowing water was present. Water samples were collected at varying frequencies and analyzed for major ions, silica, and organic carbon, and for selected species of nitrogen and phosphorus, trace elements, pharmaceuticals, and other trace organic compounds. Bottom-sediment samples were collected for analysis of selected trace elements. Dissolved-solids concentrations exceeded the recommended criterion for protection of aquatic life (500 milligrams per liter) in 33 of 62 filtered water samples. The maximum arsenic criterion (340 micrograms per liter) was exceeded twice and the continuous criterion was exceeded seven times. Criteria protecting aquatic life from continuous exposure to aluminum, cadmium, lead, and mercury (87, 0.72, 2.5, and 0.77 micrograms per liter, respectively) were exceeded only once in filtered samples (27, 40, 32, and 36 samples, respectively). Mercury was the only trace element analyzed in bottom-sediment samples to exceed the published probable effect concentration (1,060 micrograms per kilogram).
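The exceedance counts reported above come from a simple screening step: each sample concentration is compared against the relevant aquatic-life criterion. A minimal sketch, with the criteria quoted in the text but hypothetical sample values:

```python
# Exceedance screening of water-quality samples against aquatic-life
# criteria. Criteria are those quoted above (ug/L, except dissolved
# solids in mg/L); the sample values are invented for illustration.
criteria = {
    "dissolved_solids": 500,   # mg/L, recommended aquatic-life criterion
    "arsenic_max": 340,        # ug/L, maximum criterion
    "aluminum": 87, "cadmium": 0.72, "lead": 2.5, "mercury": 0.77,  # ug/L
}

def count_exceedances(samples, limit):
    """Number of samples whose concentration exceeds the criterion."""
    return sum(1 for concentration in samples if concentration > limit)

tds_samples = [320, 510, 480, 730, 495, 650]   # hypothetical mg/L values
print(count_exceedances(tds_samples, criteria["dissolved_solids"]))
```

Running the same check per analyte over the full sample set reproduces tallies like the "33 of 62 filtered water samples" figure above.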

  17. Area-intensive bottom culture production of blue mussels, Mytilus edulis (L.)

    DEFF Research Database (Denmark)

    Christensen, Helle Torp

column have the potential to become an alternative seed source for mussel production in bottom cultures. When compared to mussels collected from natural benthic mussel beds, suspended mussels had an active predator response by developing a significantly stronger attachment to the substrate and having...... a more pronounced aggregation behaviour. Bottom mussels exhibited a passive strategy by developing a thicker shell and larger relative size of the posterior adductor muscle. When comparing the performance of suspended and bottom seed mussels on complex and smooth substrate, respectively, originally...

  18. Comparative Study of the Use of Different Biomass Bottom Ash in the Manufacture of Ceramic Bricks

    OpenAIRE

    Eliche-Quesada, D; Felipe Sesé, Manuel Ángel (UNIR); Martínez-Martínez, S; Pérez-Villarejo, L

    2017-01-01

    The present study evaluates the suitability of several types of biomass bottom ashes [wood bottom ash (WBA), pine-olive pruning bottom ash (POPBA), olive stone bottom ash (OSBA), and olive pomace bottom ash (OPBA)] as an alternative source to replace ceramic raw material in the production of clay bricks. The clay and biomass bottom ash were characterized by means of X-ray diffraction (XRD) (crystallinity), X-ray fluorescence spectroscopy (XRF) (chemical composition), carbon, nitrogen, hydroge...

  19. Construction of reactor vessel bottom of prestressed reinforced concrete

    International Nuclear Information System (INIS)

    Sitnikov, M.I.; Metel'skij, V.P.

    1980-01-01

Methods are described for building reactor vessel bottoms of prestressed reinforced concrete during NPP construction in Great Britain, France, Germany (F.R.) and the USA. A schematic of the operations performed in succession is presented. Different versions of one of the methods, concreting the space under a facing by forcing concrete through a hole in the facing, are considered. This method ensures tight adhesion of the facing to the reactor vessel bottom concrete.

  20. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

Full Text Available This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert time series of the digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
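The pipeline described above, partition the signal space, map samples to symbols, then estimate a PFSA from the symbol string, can be sketched in a few lines. This sketch substitutes simple uniform binning for the paper's mutual-information-optimized partitioning, and a one-step Markov transition matrix for the full PFSA construction:

```python
import numpy as np

def symbolize(signal, n_symbols):
    """Map each sample to a symbol 0..n_symbols-1 by uniform binning
    of the signal's amplitude range (a stand-in for optimized partitioning)."""
    edges = np.linspace(signal.min(), signal.max(), n_symbols + 1)[1:-1]
    return np.digitize(signal, edges)

def transition_matrix(symbols, n_symbols):
    """Row-stochastic matrix of one-step symbol transition probabilities,
    i.e. the state-transition structure of a simple PFSA."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)

# Demo: noisy sine wave symbolized into 4 symbols.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
syms = symbolize(x, 4)
P = transition_matrix(syms, 4)
print(np.round(P, 2))
```

The resulting matrix concentrates probability near the diagonal for a slowly varying signal, which is the kind of temporal pattern the PFSA states are meant to capture.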

  1. 14 CFR 23.533 - Hull and main float bottom pressures.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Hull and main float bottom pressures. 23... Water Loads § 23.533 Hull and main float bottom pressures. (a) General. The hull and main float....00213; K2=hull station weighing factor, in accordance with figure 2 of appendix I of this part; VS1...

  2. Landfilling: Bottom Lining and Leachate Collection

    DEFF Research Database (Denmark)

    Christensen, Thomas Højlund; Manfredi, Simone; Kjeldsen, Peter

    2011-01-01

    from entering the groundwater or surface water. The bottom lining system should cover the full footprint area of the landfill, including both the relatively flat bottom and the sideslopes in the case of an excavated configuration. This prevents the lateral migration of leachate from within the landfill...... triple) liners, are extremely effective in preventing leachate from entering into the environment. In addition, the risk of polluting the groundwater at a landfill by any leakage of leachate depends on several factors related to siting of the landfill: distance to the water table, distance to surface...... water bodies, and the properties of the soil beneath the landfill. In addition to the lining and drainage systems described in this chapter, the siting and hydrogeology of the landfill site (Chapter 10.12) and the top cover (Chapter 10.9) are also part of the barrier system, contributing to reducing...

  3. Experimental Study on the Measurement of Water Bottom Vibration Induced by Underwater Drilling Blasting

    OpenAIRE

    Wenbin, Gu; Jianghai, Chen; Zhenxiong, Wang; Zhihua, Wang; Jianqing, Liu; Ming, Lu

    2015-01-01

    Due to the lack of proper instrumentations and the difficulties in underwater measurements, the studies about water bottom vibration induced by underwater drilling blasting are seldom reported. In order to investigate the propagation and attenuation laws of blasting induced water bottom vibration, a water bottom vibration monitor was developed with consideration of the difficulties in underwater measurements. By means of this equipment, the actual water bottom vibration induced by underwater ...

  4. Incentives for Collaborative Governance: Top-Down and Bottom-Up Initiatives in the Swedish Mountain Region

    Directory of Open Access Journals (Sweden)

    Katarina Eckerberg

    2015-08-01

Full Text Available Governance collaborations between public and private partners are increasingly used to promote sustainable mountain development, yet information is limited on their nature and precise extent. This article analyzes collaboration on environment and natural resource management in Swedish mountain communities to critically assess the kinds of issues these efforts address, how they evolve, who leads them, and what functional patterns they exhibit based on Margerum's (2008) typology of action, organizational, and policy collaboration. Based on official documents, interviews, and the records of 245 collaborative projects, we explore the role of the state, how perceptions of policy failure may inspire collaboration, and the opportunities that European Union funds have created. Bottom-up collaborations, most of which are relatively recent, usually have an action and sometimes an organizational function. Top-down collaborations, however, are usually organizational or policy oriented. Our findings suggest that top-down and bottom-up collaborations are complementary in situations with considerable conflict over time and where public policies have partly failed, such as for nature protection and reindeer grazing. In less contested areas, such as rural development, improving tracks and access, recreation, and fishing, there is more bottom-up, action-oriented collaboration. State support, especially in the form of funding, is central to explaining the emergence of bottom-up action collaboration. Our findings show that the state both initiates and coordinates policy networks and retains a great deal of power over the nature and functioning of collaborative governance. A practical consequence is that there is great overlap, aggravated by sectorized approaches, which creates a heavy workload for some regional partners.

  5. Classification of bottom composition and bathymetry of shallow waters by passive remote sensing

    Science.gov (United States)

    Spitzer, D.; Dirks, R. W. J.

Remote sensing data are used to develop algorithms that remove the influence of the water column on upwelling optical signals when mapping bottom depth and composition in shallow waters. Calculations relating the reflectance spectra to the parameters of the water column and the diverse bottom types are performed, and measurements of the underwater reflection coefficient of sandy, mud, and vegetation-type sea bottoms are taken. The two-flow radiative transfer model is used. Reflectances within the spectral bands of the Landsat MSS, the Landsat TM, SPOT HRV, and the TIROS-N series AVHRR were computed in order to develop appropriate algorithms suitable for bottom depth and type mapping. Bottom depth and features appear to be observable down to 3-20 m depending on the water composition and bottom type.
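The physics behind such bottom-mapping algorithms can be illustrated with a Lyzenga-style simplification of the two-flow model: observed reflectance relaxes from the bottom albedo toward the deep-water value with optical depth. The coefficients below (deep-water reflectance, attenuation) are illustrative assumptions, not values from the study:

```python
import math

def shallow_reflectance(depth_m, bottom_albedo, r_deep=0.02, k=0.15):
    """R(z) = R_inf + (A_b - R_inf) * exp(-2 K z): subsurface reflectance
    over a bottom of albedo A_b at depth z, with diffuse attenuation K.
    r_deep and k are illustrative, not study values."""
    return r_deep + (bottom_albedo - r_deep) * math.exp(-2.0 * k * depth_m)

# Bright sand (albedo ~0.30) vs dark vegetation (~0.05): the two bottoms
# are easily separable at 3 m but converge toward the deep-water signal
# by ~20 m, matching the observability range quoted above.
for z in (3, 10, 20):
    print(z, round(shallow_reflectance(z, 0.30), 4),
             round(shallow_reflectance(z, 0.05), 4))
```

Inverting this relation per spectral band is what allows both depth and bottom type to be retrieved from multispectral reflectances.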

  6. Combining Top-down and Bottom-up Accountability: Evidence from a Bribery Experiment.

    OpenAIRE

    Danila Serra

    2008-01-01

    Monitoring corruption typically relies on top-down interventions aimed at increasing the probability of external controls and the severity of punishment. An alternative approach to fighting corruption is to induce bottom-up pressure for reform. Recent studies have shown that both top-down and bottom-up mechanisms are rarely able to keep service providers accountable. This paper investigates the effectiveness of an accountability system that combines bottom-up monitoring and top-down auditing ...

  7. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    Neuwahl, Frederik; Mongelli, Ignazio; Delgado, Luis; Loeschel, Andreas

    2008-01-01

This paper analyses the employment consequences of policies aimed at supporting biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuel shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuel targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
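The core of such an input-output employment assessment is the Leontief inverse: a final-demand shift toward biofuels propagates through inter-industry linkages, and sectoral employment coefficients translate the output changes into jobs. A minimal three-sector sketch; the technical-coefficient matrix, employment coefficients, and demand shock are invented for illustration, not the study's data:

```python
import numpy as np

# Technical coefficients A[i, j]: input from sector i per unit of
# output of sector j (illustrative 3-sector economy).
A = np.array([[0.10, 0.05, 0.02],   # agriculture
              [0.20, 0.15, 0.10],   # energy / biofuels
              [0.10, 0.20, 0.25]])  # rest of economy
labour = np.array([12.0, 4.0, 8.0])  # jobs per unit of sectoral output

# Hypothetical biofuel scenario: more agricultural feedstock and
# biofuel output, slightly less demand elsewhere.
delta_y = np.array([0.5, 1.0, -0.3])

# Output response: delta_x = (I - A)^-1 delta_y (Leontief model).
delta_x = np.linalg.solve(np.eye(3) - A, delta_y)
delta_jobs = labour @ delta_x
print(f"Net employment effect: {delta_jobs:+.2f} jobs (model units)")
```

The paper's framework enriches exactly this skeleton with bottom-up technology detail for the biofuel activities and price feedbacks from linked partial-equilibrium models.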

  8. Addressing Risk Assessment for Patient Safety in Hospitals through Information Extraction in Medical Reports

    Science.gov (United States)

    Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène

Hospital-acquired infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and related healthcare costs is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in Computational Intelligence techniques such as Information Extraction, Risk Pattern Detection in documents, and Decision Support Systems now make it possible to address this problem.
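The information-extraction step mentioned above can be illustrated with a rule-based pattern scan over free-text reports. The patterns and the sample report below are invented for the sketch; a production system would rely on the trained extraction and decision-support components the abstract refers to:

```python
import re

# Hypothetical risk patterns suggesting a hospital-acquired infection.
RISK_PATTERNS = [
    r"positive (?:blood|urine|wound) culture",
    r"fever\s+(?:>|above)\s*38(?:\.\d)?",
    r"catheter[- ]associated",
]

def flag_report(text):
    """Return the risk patterns matched in a report (case-insensitive)."""
    return [p for p in RISK_PATTERNS if re.search(p, text, re.IGNORECASE)]

report = ("Day 4 post-op: fever > 38.5, positive blood culture drawn "
          "from central line; suspect catheter-associated infection.")
print(flag_report(report))
```

Flagged reports would then feed the risk-assessment and surveillance workflow rather than serve as a diagnosis on their own.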

  9. Measurement of the bottom hadron lifetime at the Z⁰ resonance

    Energy Technology Data Exchange (ETDEWEB)

    Fujino, D.H.

    1992-06-01

We have measured the bottom hadron lifetime from bb̄ events produced at the Z⁰ resonance. Using the precision vertex detectors of the Mark II detector at the Stanford Linear Collider, we developed an impact parameter tag to identify bottom hadrons. The vertex tracking system resolved impact parameters to 30 μm for high momentum tracks, and 70 μm for tracks with a momentum of 1 GeV. We selected B hadrons with an efficiency of 40% and a sample purity of 80%, by requiring there be at least two tracks in a single jet that significantly miss the Z⁰ decay vertex. From a total of 208 hadronic Z⁰ events collected by the Mark II detector in 1990, we tagged 53 jets, of which 22 came from 11 double-tagged events. The jets opposite the tagged ones, referred to as the "untagged" sample, are rich in B hadrons and unbiased in B decay times. The variable Σδ is the sum of impact parameters from tracks in the jet, and contains vital information on the B decay time. We measured the B lifetime from a one-parameter likelihood fit to the untagged Σδ distribution, obtaining τ_b = 1.53 +0.55/−0.45 ± 0.16 ps, which agrees with the current world average. The first error is statistical and the second is systematic. The systematic error was dominated by uncertainties in the track resolution function. As a check, we also obtained consistent results using the Σδ distribution from the tagged jets and from the entire hadronic sample without any bottom enrichment.
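The statistical core of such a one-parameter lifetime fit can be sketched in a few lines: for a pure exponential decay with perfect resolution, the maximum-likelihood estimate of the lifetime is simply the sample mean of the decay times. The real measurement fits a resolution-smeared likelihood to the Σδ distribution; the toy below only illustrates the estimator, using the measured central value as the generator lifetime:

```python
import math
import random

# Toy maximum-likelihood lifetime fit for an exponential decay.
random.seed(42)
TAU_TRUE = 1.53  # ps; the measured central value, used here as generator

decay_times = [random.expovariate(1.0 / TAU_TRUE) for _ in range(5000)]

tau_hat = sum(decay_times) / len(decay_times)    # MLE for an exponential
tau_err = tau_hat / math.sqrt(len(decay_times))  # statistical error on MLE
print(f"tau = {tau_hat:.3f} +/- {tau_err:.3f} ps")
```

With only ~50 tagged jets, as in the measurement, the relative statistical error grows to roughly 1/sqrt(50) ≈ 14%, consistent with the quoted uncertainty being dominated by statistics.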

  10. Bottom fauna of the Malacca Strait

    Digital Repository Service at National Institute of Oceanography (India)

    Parulekar, A.H.; Ansari, Z.A.

Bottom fauna of the Malacca Strait (connecting the Indian Ocean with the Pacific) in the depth range of 80 to 1350 m is dominated by meiofauna, which exceeds macrofauna by 12.5 times in weight and by more than 780 times in population density. Standing crop...

  11. Thermal treatment of stabilized air pollution control residues in a waste incinerator pilot plant. Part 2: Leaching characteristics of bottom ashes.

    Science.gov (United States)

    Baun, Dorthe L; Christensen, Thomas H; Bergfeldt, Brita; Vehlow, Jürgen; Mogensen, Erhardt P B

    2004-02-01

With the perspective of generating only one solid residue from waste incineration, co-feeding of municipal solid waste and air pollution control residues stabilized by the Ferrox process was investigated in the TAMARA pilot plant incinerator as described in Bergfeldt et al. (Waste Management Research, 22, 49-57, 2004). This paper reports on leaching from the combined bottom ashes. Batch leaching tests, pH-static leaching tests, availability tests and column leaching tests were used to characterize the leaching properties. The leaching properties are key information in the context of reuse in construction or in landfilling of the combined residue. In general, the combined bottom ashes had leaching characteristics similar to the reference bottom ash, which contained no APC residue. However, As and Pb showed slightly elevated leaching from the combined bottom ashes, while Cr showed less leaching. The investigated combined bottom ashes had contents of metals comparable to what is expected at steady state after continuous co-feeding of APC residues. Only Cd and Pb were partly volatilized (30-40%) during the incineration process, and thus the combined bottom ashes had lower contents of Cd and Pb than expected at steady state. Furthermore, a major loss of Hg was, not surprisingly, seen, and co-feeding of Ferrox products together with municipal solid waste will require dedicated removal of Hg in the flue gas to prevent a build-up of Hg in the system. In spite of this, a combined single solid residue from waste incineration seems to be a significant environmental improvement over current technology.

  12. Treated bottom ash medium and method of arsenic removal from drinking water

    Science.gov (United States)

    Gadgil, Ashok

    2009-06-09

    A method for low-cost arsenic removal from drinking water using chemically prepared bottom ash pre-treated with ferrous sulfate and then sodium hydroxide. Deposits on the surface of particles of bottom ash form an activated iron adsorbent with a high affinity for arsenic. In laboratory tests, a minuscule 5 grams of pre-treated bottom ash was sufficient to remove the arsenic from 2 liters of 2400 ppb (parts per billion) arsenic-laden water to a level below 50 ppb (the present United States Environmental Protection Agency limit). By increasing the amount of pre-treated bottom ash, even lower levels of post-treatment arsenic are expected. It is further expected that this invention supplies a very low-cost solution to arsenic poisoning for large population segments.
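    As a back-of-the-envelope illustration of the dosage reported above (5 g of treated ash for 2 L of water, i.e. about 2.5 g/L), the following sketch assumes simple linear scaling of the dose with water volume. That linearity is an assumption of this illustration, not a claim of the patent; actual adsorbent demand depends on isotherms and contact time.

```python
# Illustrative dose scaling from the laboratory figures quoted above.
# ASSUMPTION: the lab dose of 5 g per 2 L (2.5 g/L) scales linearly with
# water volume; real adsorbent demand depends on isotherms and contact time.

LAB_ASH_G = 5.0    # grams of pre-treated bottom ash in the lab test
LAB_WATER_L = 2.0  # litres of 2400 ppb arsenic-laden water treated

def ash_required(volume_l: float) -> float:
    """Grams of pre-treated bottom ash for a given water volume."""
    return volume_l * (LAB_ASH_G / LAB_WATER_L)

print(ash_required(20.0))  # 50.0 g for a hypothetical 20 L container
```

    Under this assumption, treating a hypothetical 20 L household container would take about 50 g of treated ash.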

  13. Strategies for the extraction and analysis of non-extractable polyphenols from plants.

    Science.gov (United States)

    Domínguez-Rodríguez, Gloria; Marina, María Luisa; Plaza, Merichel

    2017-09-08

    The majority of studies of phenolic compounds from plants focus on the extractable fraction derived from an aqueous or aqueous-organic extraction. However, an important fraction of polyphenols is ignored because it remains retained in the extraction residue. These are the so-called non-extractable polyphenols (NEPs), which are high molecular weight polymeric polyphenols or individual low molecular weight phenolics associated with macromolecules. The scarce information available about NEPs shows that these compounds possess interesting biological activities, which is why interest in the study of these compounds has been increasing in recent years. Furthermore, the extraction and characterization of NEPs are considered a challenge because the analytical methodologies developed so far present some limitations. Thus, the present literature review summarizes current knowledge of NEPs and the different methodologies for the extraction of these compounds, with a particular focus on hydrolysis treatments. Besides, this review provides information on the most recent developments in the purification, separation, identification and quantification of NEPs from plants. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Innovation and Creativity at the Bottom of the Pyramid

    Directory of Open Access Journals (Sweden)

    Lauri Erik Lehikoinen

    2018-01-01

    Full Text Available Purpose: The purpose of this study is to illustrate how innovative and creative companies develop products and services in bottom of the economic pyramid (BoP) markets. This paper attempts to gain further insight regarding the usage of the 4A perspective developed by Anderson and Billou (2007) and the Triple Bottom Line (TBL) framework developed by Elkington (1999) as guidelines to achieve success in BoP markets. Design/methodology/approach: The authors of this paper come from three different countries (Sweden, Norway and Belgium), which gave the possibility to gather qualitative data from companies located or founded in these three countries. The 4A perspective and the TBL framework are used as a theoretical foundation to further investigate the phenomenon of how western companies act in BoP markets. Thus, this paper attempts to answer the following research questions: How can social entrepreneurs (or any companies) adapt the 4A perspective to introduce disruptive innovations and still, with the help of the TBL framework, maintain their sustainable, responsible and ethical approach? Additionally, how can the mind-set of innovation and creativity at the bottom of the pyramid in developing markets be transferred to social entrepreneurs in developed markets? Primary data was gathered through interviews with Solvatten (Sweden), MicroStart (Belgium) and Easypaisa (Norway). Findings: The 4A perspective was proven to be an effective tool when approaching BoP markets. Companies must think outside the box of traditional marketing and be creative to achieve their goals. In dynamic markets, a company will struggle to keep up with all constraints. The case companies struggled most with acting sustainably while achieving profitability. Research limitations/implications: To further validate the results, the sample size should be bigger, including several different companies and informants. Originality/value: This paper contributes to the

  15. Flowable Backfill Materials from Bottom Ash for Underground Pipeline

    Directory of Open Access Journals (Sweden)

    Kyung-Joong Lee

    2014-04-01

    Full Text Available The purpose of this study was to investigate the relationship between strength and strain in manufacturing controlled low strength materials to recycle incineration bottom ash. Laboratory tests for controlled low strength materials with bottom ash and recycled in-situ soil have been carried out. The optimum mixing ratios were 25%–45% of in-situ soil, 30% of bottom ash, 10%–20% of fly ash, 0%–3% of crumb rubber, 3% of cement, and 22% of water. Each mixture satisfied the standard specifications: a minimum 20 cm of flowability and 127 kPa of unconfined compressive strength. The average secant modulus (E50) was (0.07–0.08)qu. The ranges of the internal friction angle and cohesion for mixtures were 36.5°–46.6° and 49.1–180 kPa, respectively. The pH of all of the mixtures was over 12, which is strongly alkaline. Small-scale chamber tests for controlled low strength materials with bottom ash and recycled in-situ soil have been carried out. Vertical deflections of 0.88–2.41 mm and horizontal deflections of 0.83–3.72 mm were measured during backfilling. The vertical and horizontal deflections of controlled low strength materials were smaller than those of sand backfill.
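    The two acceptance criteria and the secant-modulus correlation quoted above can be restated as a small helper. This is a minimal sketch: the function names and sample inputs are hypothetical, and the E50 band simply re-expresses the reported correlation E50 = (0.07–0.08)qu.

```python
# Hypothetical helper restating the acceptance criteria quoted above
# (flowability >= 20 cm, unconfined compressive strength qu >= 127 kPa)
# and the reported secant-modulus correlation E50 = (0.07-0.08) * qu.
# Function names and sample inputs are illustrative assumptions.

def meets_spec(flow_cm: float, qu_kpa: float) -> bool:
    """True if a mixture meets both standard specifications."""
    return flow_cm >= 20.0 and qu_kpa >= 127.0

def e50_range_kpa(qu_kpa: float) -> tuple:
    """(low, high) secant-modulus band implied by E50 = (0.07-0.08)*qu."""
    return (0.07 * qu_kpa, 0.08 * qu_kpa)

print(meets_spec(22.0, 150.0))   # True for a passing mixture
print(e50_range_kpa(150.0))      # approximately (10.5, 12.0) kPa
```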

  16. Inspection and repair of storage tank bottoms and foundations using airbag lifting

    International Nuclear Information System (INIS)

    Wildin, I.P.; Adams, N.J.

    1992-01-01

    This paper reports that within the past five years the environmental impact on the operation of petro-chemical product storage tanks, constructed to standards such as API 650, has taken on critical implications for refineries and distribution centers. Pollution of the supporting foundation and possible widespread effects on ground water have resulted in moves to require the installation of double integrity bottoms. That is not to say, necessarily, a tank with two steel bottoms, but alternative means of reducing the failure probability to an acceptable public or statutory level. Clearly, increased inspection of the tank bottom has merit, and visual examination of the bottom from inside the tank can be supplemented by ultrasonic methods, acoustic leak detection and magnetic flux scanning. Tank lifting now offers a very cost-effective method for underfloor inspection, combined with the opportunity to undertake repairs to the bottom and underside painting, together with improvements and repairs to the Bitsand surface of the tank pad. If necessary, an impervious membrane can also be installed with a leak detection trough formed around the tank edge.

  17. Effluent Management Facility Evaporator Bottom-Waste Streams Formulation and Waste Form Qualification Testing

    Energy Technology Data Exchange (ETDEWEB)

    Saslow, Sarah A.; Um, Wooyong; Russell, Renee L.

    2017-08-02

    This report describes the results from grout formulation and cementitious waste form qualification testing performed by Pacific Northwest National Laboratory (PNNL) for Washington River Protection Solutions, LLC (WRPS). These results are part of a screening test that investigates three grout formulations proposed for wide-range treatment of different waste stream compositions expected for the Hanford Effluent Management Facility (EMF) evaporator bottom waste. This work supports the technical development need for alternative disposition paths for the EMF evaporator bottom wastes and future direct feed low-activity waste (DFLAW) operations at the Hanford Site. High-priority activities included simulant production, grout formulation, and cementitious waste form qualification testing. The work contained within this report relates to waste form development and testing, and does not directly support the 2017 Integrated Disposal Facility (IDF) performance assessment (PA). However, this work contains valuable information for use in PA maintenance past FY 2017 and future waste form development efforts. The provided results and data should be used by (1) cementitious waste form scientists to further the understanding of cementitious leach behavior of contaminants of concern (COCs), (2) decision makers interested in off-site waste form disposal, and (3) the U.S. Department of Energy, their Hanford Site contractors and stakeholders as they assess the IDF PA program at the Hanford Site. The results reported help fill existing data gaps, support final selection of a cementitious waste form for the EMF evaporator bottom waste, and improve the technical defensibility of long-term waste form risk estimates.

  18. Adsorption and desorption characteristics of crystal violet in bottom ash column

    Directory of Open Access Journals (Sweden)

    Puthiya Veetil Nidheesh

    2012-06-01

    Full Text Available This study described the adsorption of Crystal Violet (CV) by bottom ash in fixed-bed column mode. Adsorption equilibrium was studied in batch mode to find the adsorption capacity of bottom ash. In fixed-bed column adsorption, the effects of bed height, feed flow rate, and initial concentration were studied by assessing the breakthrough curve. The slope of the breakthrough curve decreased with increasing bed height. The breakthrough time and exhaustion time decreased with increasing influent CV concentration and flow rate. The effects of bed depth, flow rate and CV concentration on the adsorption column design parameters were analyzed. The bed depth service time (BDST) model was applied to analyze crystal violet adsorption in the column. The adsorption capacity of bottom ash was calculated at the 10% breakthrough point for different flow rates and concentrations. Desorption studies reveal that recovery of CV from bottom ash was more effective using CH3COOH than H2SO4, NaOH, HCl or NaCl solutions.
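    The bed depth service time (BDST) model referenced above linearly relates the service time t of a column of depth Z to its operating conditions. A minimal sketch of the standard BDST equation, t = N0·Z/(C0·u) − ln(C0/Cb − 1)/(Ka·C0); the parameter values below are illustrative assumptions, not data from this study.

```python
import math

def bdst_service_time(z_cm, u_cm_min, c0_mg_l, cb_mg_l, n0_mg_l, ka_l_mg_min):
    """BDST service time (min) until breakthrough concentration Cb:
    t = N0*Z / (C0*u) - ln(C0/Cb - 1) / (Ka*C0)
    Z: bed depth, u: linear flow velocity, C0: influent concentration,
    N0: adsorption capacity per bed volume, Ka: rate constant."""
    return (n0_mg_l * z_cm) / (c0_mg_l * u_cm_min) \
        - math.log(c0_mg_l / cb_mg_l - 1.0) / (ka_l_mg_min * c0_mg_l)

# Illustrative (hypothetical) parameters, not values from this study:
t = bdst_service_time(z_cm=30.0, u_cm_min=2.0, c0_mg_l=50.0,
                      cb_mg_l=5.0, n0_mg_l=5000.0, ka_l_mg_min=0.01)
print(round(t, 1))  # predicted service time in minutes
```

    Because the equation is linear in Z, fitting service times measured at several bed depths yields N0 (slope) and Ka (intercept), which is how the model is typically used for column design.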

  20. Hybrid dual gate ferroelectric memory for multilevel information storage

    KAUST Repository

    Khan, Yasser

    2015-01-01

    Here, we report a hybrid organic/inorganic ferroelectric memory with multilevel information storage using a transparent p-type SnO semiconductor and a ferroelectric P(VDF-TrFE) polymer. The dual gate devices include a top ferroelectric field-effect transistor (FeFET) and a bottom thin-film transistor (TFT). The devices are all fabricated at low temperatures (∼200°C), and demonstrate excellent performance with a high hole mobility of 2.7 cm² V⁻¹ s⁻¹, a large memory window of ∼18 V, and a low sub-threshold swing of ∼−4 V dec⁻¹. The channel conductance of the bottom-TFT and the top-FeFET can be controlled independently by the bottom and top gates, respectively. The results demonstrate multilevel nonvolatile information storage using ferroelectric memory devices with good retention characteristics.

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Pillar Coral Forest site, 2006 (NODC accession 0040039)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Bahia Honda Bridge, 2005 - 2007 (NODC Accession 0039226)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 2006 - 2007 (NODC Accession 0020554)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 1989 - 2006 (NODC Accession 0011144)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Broad Creek site, 2006 - 2007 (NODC Accession 0039880)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Broad Creek site, 2007-2009 (NODC Accession 0093020)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Key Back Reef, 2004-2006 (NODC Accession 0014270)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Bahia Honda Bridge, 1990 - 2004 (NODC Accession 0002772)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 2007 - 2011 (NODC Accession 0093027)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Information society studies

    CERN Document Server

    Duff, Alistair S

    2013-01-01

    We are often told that we are "living in an information society" or that we are "information workers." But what exactly do these claims mean, and how might they be verified? In this important methodological study, Alistair S. Duff cuts through the rhetoric to get to the bottom of the "information society thesis." Wide-ranging in coverage, this study will be of interest to scholars in information science, communication and media studies and social theory. It is a key text for the newly-unified specialism of information society studies, and an indispensable guide to the future of this discipline.

  11. Processed bottom ash for replacing fine aggregate in making high-volume fly ash concrete

    OpenAIRE

    Antoni; Sulistio Aldi Vincent; Wahjudi Samuel; Hardjito Djwantoro

    2017-01-01

    Bottom ash is a coal plant by-product that is abundant and underutilized. There is the potential use of bottom ash as a fine aggregate replacement in concrete mixtures; however, the problems of water absorption and uniformity of quality of the material need to be overcome first. In this study, bottom ash was treated by sieve separation and pounding to smaller particle size for use as a sand substitute. The physical and chemical characteristics of bottom ash were tested after treatment includi...

  12. Measuring Sandy Bottom Dynamics by Exploiting Depth from Stereo Video Sequences

    DEFF Research Database (Denmark)

    Musumeci, Rosaria E.; Farinella, Giovanni M.; Foti, Enrico

    2013-01-01

    In this paper an imaging system for measuring sandy bottom dynamics is proposed. The system exploits stereo sequences and projected laser beams to build the 3D shape of the sandy bottom during time. The reconstruction is used by experts of the field to perform accurate measurements and analysis...

  13. Sub-bottom profiling for large-scale maritime archaeological survey An experience-based approach

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2013-01-01

    and wrecks partially or wholly embedded in the sea-floor sediments demands the application of high-resolution sub-bottom profilers. This paper presents a strategy for the cost-effective large-scale mapping of unknown sediment-embedded sites such as submerged Stone Age settlements or wrecks, based on sub...... of the submerged cultural heritage. Elements such as archaeological wreck sites exposed on the sea floor are mapped using side-scan and multi-beam techniques. These can also provide information on bathymetric patterns representing potential Stone Age settlements, whereas the detection of such archaeological sites...

  14. Bottom-up synthetic biology: modular design for making artificial platelets

    Science.gov (United States)

    Majumder, Sagardip; Liu, Allen P.

    2018-01-01

    Engineering artificial cells to mimic one or multiple fundamental cell biological functions is an emerging area of synthetic biology. Reconstituting functional modules from biological components in vitro is a challenging yet an important essence of bottom-up synthetic biology. Here we describe the concept of building artificial platelets using bottom-up synthetic biology and the four functional modules that together could enable such an ambitious effort.

  15. Bottom depth and type for shallow waters: Hyperspectral observations from a blimp

    Energy Technology Data Exchange (ETDEWEB)

    Lee, ZhongPing; Carder, K.; Steward, R. [Univ. of South Florida, St. Petersburg, FL (United States)] [and others]

    1997-08-01

    In a study of a blimp transect over Tampa Bay (Florida), hyperspectral upwelling radiance over the sand and seagrass bottoms was measured. These measurements were converted to hyperspectral remote-sensing reflectances. Using a shallow-water remote-sensing-reflectance model, in-water optical properties, bottom depths and bottom albedos were derived analytically and simultaneously by an optimization procedure. In the process, curvatures of sand and seagrass albedos were used. Also used was a model of absorption spectrum of phytoplankton pigments. The derived bottom depths were compared with bathymetry charts and found to agree well. This study suggests that a low-flying blimp is a useful platform for the study and mapping of coastal water environments. The optical model as well as the data-reduction procedure used are practical for the retrieval of shallow water optical properties.
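    The analytical retrieval described above inverts a shallow-water reflectance model in which the measured signal mixes a water-column term and a bottom-albedo term, both attenuated over the two-way path through the water. The following is a heavily simplified single-band sketch of that idea; the model form, symbols and numbers are assumptions for illustration, not the authors' full multi-band algorithm.

```python
import math

def rrs_shallow(depth_m, k_m1, rrs_deep, albedo):
    """Toy forward model: deep-water reflectance plus a bottom-albedo
    term, weighted by two-way attenuation exp(-2*K*d) through the column."""
    atten = math.exp(-2.0 * k_m1 * depth_m)
    return rrs_deep * (1.0 - atten) + (albedo / math.pi) * atten

def invert_depth(rrs, k_m1, rrs_deep, albedo):
    """Closed-form depth retrieval for the toy single-band model above."""
    atten = (rrs_deep - rrs) / (rrs_deep - albedo / math.pi)
    return -math.log(atten) / (2.0 * k_m1)

# Round-trip check with hypothetical optical properties:
r = rrs_shallow(depth_m=3.0, k_m1=0.1, rrs_deep=0.005, albedo=0.3)
d = invert_depth(r, k_m1=0.1, rrs_deep=0.005, albedo=0.3)
print(round(d, 6))  # recovers the 3.0 m input depth
```

    In the actual procedure, an optimization fits depth, bottom albedo and the in-water optical properties simultaneously across all hyperspectral bands rather than solving one band in closed form.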

  16. Measuring device for water quality at reactor bottom

    Energy Technology Data Exchange (ETDEWEB)

    Urata, Hidehiro; Takagi, Jun-ichi

    1995-10-27

    The present invention concerns measurement for water quality at the bottom of a reactor of a BWR type plant, in which reactor water is sampled and analyzed in a state approximate to conditions in a pressure vessel. Based on the result, hydrogen injection amount is controlled during hydrogen injection operation. Namely, a monitor for water quality is disposed to a sampling line in communication with the bottom of a pressure vessel. A water quality monitor is disposed to a drain sampling line in communication with the bottom of the pressure vessel. A corrosion potentiometer is disposed to the pressure sampling line or the drain sampling line. A dissolved oxygen measuring device is disposed to the pressure vessel sampling line or the drain sampling line. With such a constitution, the reactor water can be sampled and analyzed in a state approximate to the conditions in the pressure vessel. In addition, signals from the water quality monitor are inputted to a hydrogen injection amount control device. As a result, the amount of hydrogen injected to primary coolants can be controlled in a state approximate to the conditions in the pressure vessel. (I.S.).

  18. Current Enhancement with Contact-Area-Limited Doping for Bottom-Gate, Bottom-Contact Organic Thin-Film Transistors

    Science.gov (United States)

    Noda, Kei; Wakatsuki, Yusuke; Yamagishi, Yuji; Wada, Yasuo; Toyabe, Toru; Matsushige, Kazumi

    2013-02-01

    The current enhancement mechanism in contact-area-limited doping for bottom-gate, bottom-contact (BGBC) p-channel organic thin-film transistors (OTFTs) was investigated both by simulation and experiment. Simulation results suggest that carrier shortage and large potential drop occur in the source-electrode/channel interface region in a conventional BGBC OTFT during operation, which results in a decrease in the effective field-effect mobility. These phenomena are attributed to the low carrier concentration of active semiconductor layers in OTFTs and can be alleviated by contact-area-limited doping, where highly doped layers are prepared over source-drain electrodes. According to two-dimensional current distribution obtained from the device simulation, a current flow from the source electrode to the channel region via highly doped layers is generated in addition to the direct carrier injection from the source electrode to the channel, leading to the enhancement of the drain current and effective field-effect mobility. The expected current enhancement mechanism in contact-area-limited doping was experimentally confirmed in typical α-sexithiophene (α-6T) BGBC thin-film transistors.

  19. Bottom Production

    CERN Document Server

    Nason, P.; Schneider, O.; Tartarelli, G.F.; Vikas, P.; Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell'Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, Michelangelo L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-01-01

    We review the prospects for bottom production physics at the LHC. Members of the working group who have contributed to this document are: J. Baines, S.P. Baranov, P. Bartalini, A. Bay, E. Bouhova, M. Cacciari, A. Caner, Y. Coadou, G. Corti, J. Damet, R. Dell'Orso, J.R.T. De Mello Neto, J.L. Domenech, V. Drollinger, P. Eerola, N. Ellis, B. Epp, S. Frixione, S. Gadomski, I. Gavrilenko, S. Gennai, S. George, V.M. Ghete, L. Guy, Y. Hasegawa, P. Iengo, A. Jacholkowska, R. Jones, A. Kharchilava, E. Kneringer, P. Koppenburg, H. Korsmo, M. Kraemer, N. Labanca, M. Lehto, F. Maltoni, M.L. Mangano, S. Mele, A.M. Nairz, T. Nakada, N. Nikitin, A. Nisati, E. Norrbin, F. Palla, F. Rizatdinova, S. Robins, D. Rousseau, M.A. Sanchis-Lozano, M. Shapiro, P. Sherwood, L. Smirnova, M. Smizanska, A. Starodumov, N. Stepanov, R. Vogt

  20. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, in complement to cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive as to the mechanism of action of compounds, and are thus capable of informing the appropriate selection of further time-consuming and more expensive biological evaluations required to elucidate a mechanism.

  1. Methodological peculiarities of the Braun-Blanquet method used for marine bottom vegetation classification

    Directory of Open Access Journals (Sweden)

    AFANASYEV Dmitry F.

    2012-09-01

    Full Text Available The features of applying the Braun-Blanquet method to the classification of the Black and Azov Seas bottom phytocenoses are discussed. Special attention is given to the following methodological questions: the duration of observations necessary for revealing associations, the size of the relevé area, the features of underwater geobotanical survey technology, the description of bottom communities with epiphytes, and the peculiarities of the syntaxonomic analysis of bottom vegetation.

  2. A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets

    Science.gov (United States)

    Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.

    2009-12-01

    The effort to constrain regional scale carbon budgets benefits from assimilating as many high quality data sources as possible in order to reduce uncertainties. Two of the most common approaches used in this field, bottom-up and top-down techniques, both have their strengths and weaknesses, and partly build on very different sources of information to train, drive, and validate the models. Within the context of the ORCA2 project, we follow both bottom-up and top-down modeling strategies with the ultimate objective of reconciling their surface flux estimates. The ORCA2 top-down component builds on a coupled WRF-STILT transport module that resolves the footprint function of a CO2 concentration measurement in high temporal and spatial resolution. Datasets involved in the current setup comprise GDAS meteorology, remote sensing products, VULCAN fossil fuel inventories, boundary conditions from CarbonTracker, and high-accuracy time series of atmospheric CO2 concentrations. Surface fluxes of CO2 are normally provided through a simple diagnostic model which is optimized against atmospheric observations. For the present study, we replaced the simple model with fluxes generated by an advanced bottom-up process model, Biome-BGC, which uses state-of-the-art algorithms to resolve plant-physiological processes, and 'grow' a biosphere based on biogeochemical conditions and climate history. This approach provides a more realistic description of biomass and nutrient pools than is the case for the simple model. The process model ingests various remote sensing data sources as well as high-resolution reanalysis meteorology, and can be trained against biometric inventories and eddy-covariance data. 
Linking the bottom-up flux fields to the atmospheric CO2 concentrations through the transport module allows evaluating the spatial representativeness of the BGC flux fields, and in that way assimilates more of the available information than either of the individual modeling techniques alone.

  3. Novel compact tiltmeter for ocean bottom and other frontier observations

    International Nuclear Information System (INIS)

    Takamori, Akiteru; Araya, Akito; Kanazawa, Toshihiko; Shinohara, Masanao; Bertolini, Alessandro; DeSalvo, Riccardo

    2011-01-01

    Long-term observations conducted with large arrays of tiltmeters deployed in ocean-bottom boreholes, on the seafloor and in other hazardous regions are expected to provide rich information useful in geosciences, social engineering, resource monitoring and other applications. To facilitate such observations, we designed and built a compact, highly sensitive tiltmeter with sufficient performance, comparable to that of much larger instruments that are difficult to operate in the target locations. The new tiltmeter is suitable for observations requiring multiple instruments because its design is optimized for low-cost mass production. This paper describes its key technologies, including a very compact folded pendulum and an optical fiber readout. Preliminary results of test observations conducted using a prototype tiltmeter are compared with a conventional water-tube tiltmeter.

  4. Design, test and start up of a cleaning system for the moderator tank bottom of Atucha I Nuclear Power Plant

    International Nuclear Information System (INIS)

    Duca, J.; Gerber, O.; Ibero, M.; Riga, N.

    1989-01-01

    In order to clean the moderator tank bottom during the repair of the Atucha I nuclear power plant (CNA I) failure, the Empresa Nuclear Argentina de Centrales Electricas (ENACE S.A.) designed a system meeting the following requirements set by CNA I: a) To aspirate and retain free solid particles, uranium dioxide pellets and small coolant channel insulation foils settled at the moderator tank bottom, with the reactor in the mid-loop state. b) To allow radial cleaning up to 1.4 m from the extracted channel. c) To provide a lay-out attaining ALARA dose exposure. The designed system basically consists of: a) Flexible suction intake: allows movement inside the moderator tank and provides the speed adequate to raise the particles. b) Filter: retains the aspirated particles, pellets and foils. Its capacity is 1.8 dm³ and the minimum size of retained particles is 200 μm. The ALARA dose exposure concept is attained because the filter is located inside the moderator tank. c) Filtering column: contains the filter and allows the entrance of the extraction and exchange tool (for the flexible intake and filter). d) Suction hose: connects the filtering column with the pump. Its flexibility allows its use in any channel while maintaining the same positions of the discharge pump and the return piping. e) Discharge pump: a canned centrifugal pump with a low required net positive suction head. f) Return piping: discharges the filtered water into the moderator tank. The system fulfilled all requirements satisfactorily during its operation. (Author)

  5. Bottom-up and top-down emotion generation: implications for emotion regulation

    Science.gov (United States)

    Misra, Supriya; Prasad, Aditya K.; Pereira, Sean C.; Gross, James J.

    2012-01-01

    Emotion regulation plays a crucial role in adaptive functioning and mounting evidence suggests that some emotion regulation strategies are often more effective than others. However, little attention has been paid to the different ways emotions can be generated: from the ‘bottom-up’ (in response to inherently emotional perceptual properties of the stimulus) or ‘top-down’ (in response to cognitive evaluations). Based on a process priming principle, we hypothesized that mode of emotion generation would interact with subsequent emotion regulation. Specifically, we predicted that top-down emotions would be more successfully regulated by a top-down regulation strategy than bottom-up emotions. To test this hypothesis, we induced bottom-up and top-down emotions, and asked participants to decrease the negative impact of these emotions using cognitive reappraisal. We observed the predicted interaction between generation and regulation in two measures of emotional responding. As measured by self-reported affect, cognitive reappraisal was more successful on top-down generated emotions than bottom-up generated emotions. Neurally, reappraisal of bottom-up generated emotions resulted in a paradoxical increase of amygdala activity. This interaction between mode of emotion generation and subsequent regulation should be taken into account when comparing the efficacy of different types of emotion regulation, as well as when reappraisal is used to treat different types of clinical disorders. PMID:21296865

  6. Using the DOM Tree for Content Extraction

    Directory of Open Access Journals (Sweden)

    David Insa

    2012-10-01

    Full Text Available The main information of a webpage is usually mixed in with menus, advertisements, panels, and other not necessarily related information, and it is often difficult to automatically isolate this information. This is precisely the objective of content extraction, a research area of wide interest due to its many applications. Content extraction is useful not only for the final human user, but it is also frequently used as a preprocessing stage of different systems that need to extract the main content of a web document to avoid the treatment and processing of other useless information. Another interesting application where content extraction is particularly used is displaying webpages on small screens such as mobile phones or PDAs. In this work we present a new technique for content extraction that uses the DOM tree of the webpage to analyze the hierarchical relations of the elements in the webpage. Thanks to this information, the technique achieves considerable recall and precision. Using the DOM structure for content extraction gives us the benefits of other approaches based on the syntax of the webpage (such as characters, words and tags), but it also gives us very precise information regarding the related components in a block, thus producing very cohesive blocks.
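
The general idea of scoring DOM blocks to find the main content can be sketched as follows. This is a minimal illustration of one common heuristic (text-to-tag density per block), not the authors' actual algorithm, and the class and function names are ours:

```python
from html.parser import HTMLParser

# Minimal sketch: score each top-level <div> block by its text-to-tag ratio
# and keep the densest one as the "main content" candidate. Menus and ad
# panels typically contain many tags and little text, so they score low.
class BlockScorer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0          # nesting depth inside the current <div> block
        self.blocks = []        # [text_chars, tag_count] per top-level <div>

    def handle_starttag(self, tag, attrs):
        if tag == "div":
            if self.depth == 0:
                self.blocks.append([0, 0])
            self.depth += 1
        if self.depth > 0:
            self.blocks[-1][1] += 1

    def handle_endtag(self, tag):
        if tag == "div" and self.depth > 0:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth > 0:
            self.blocks[-1][0] += len(data.strip())

def densest_block_index(html):
    p = BlockScorer()
    p.feed(html)
    # density = text length per tag
    return max(range(len(p.blocks)),
               key=lambda i: p.blocks[i][0] / p.blocks[i][1])

html = ("<div><a>Home</a><a>About</a><a>Contact</a></div>"
        "<div><p>This is the long main article text of the page.</p></div>")
print(densest_block_index(html))  # → 1 (the article block wins over the menu)
```

A hierarchical version in the spirit of the paper would apply the same scoring recursively down the DOM tree rather than only at the top level.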

  7. MAPPING OF THE RUSSIAN NORTHERN SEAS BOTTOM RELIEF USING DIGITAL ELEVATION MODELS

    Directory of Open Access Journals (Sweden)

    S. M. Koshel

    2014-01-01

    Full Text Available The task of the project is the design of digital elevation models (DEMs) of the bottoms of the Barents Sea, the Pechora Sea, and the White Sea. The accuracy (resolution) of the DEMs allows for adequate delineation of morphological structures and peculiarities of the sea bottoms and for the design of bathymetrical and derivative maps. DEMs of the sea bottom were compiled using data from navigation charts of different scales, on which additional isobaths were drawn manually taking into account the classification features of the bottom topography forms. The following procedures were carried out: scanning of these charts, processing of the scanned images, isobath vectorization and creation of attribute tables, transformation of vector layers to geographical coordinates, as well as editing, merging and joining of the map sheets and correction of geometry and attributes. For the generation of a digital model of bottom topography it is important to choose an algorithm that represents, in the greatest detail, all of the sea-bottom features expressed by isobaths. An original algorithm based on fast calculation of distances to the two different nearest isobaths was used. Interpretation of isolines as vector linear objects is the main peculiarity of this algorithm. The resulting DEMs were used to design bathymetrical maps of the Barents Sea at 1:2 500 000 scale, the Pechora Sea at 1:1 000 000 scale, and the White Sea at 1:750 000 scale. Different derivative maps were compiled based on the DEM of the White Sea.
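
The interpolation idea behind such an algorithm, a grid node's depth estimated from its distances to the two nearest isobaths of different value, can be sketched as below. This is a toy reading of the approach with naive distance computation (the original's "fast calculation" is not reproduced), and all helper names are ours:

```python
import math

def nearest_distance(point, polyline):
    """Distance from a point to a polyline sampled as a list of vertices (naive)."""
    return min(math.dist(point, v) for v in polyline)

def interpolate_depth(point, isobaths):
    """isobaths: list of (depth_value, [vertices]); returns an estimated depth
    by inverse-distance weighting between the two nearest distinct isobath values."""
    dists = {}
    for value, line in isobaths:
        d = nearest_distance(point, line)
        dists[value] = min(d, dists.get(value, float("inf")))
    (v1, d1), (v2, d2) = sorted(dists.items(), key=lambda kv: kv[1])[:2]
    if d1 == 0:
        return v1  # the point lies on an isobath
    return (v1 / d1 + v2 / d2) / (1 / d1 + 1 / d2)

# two straight isobaths: 100 m along y=0 and 200 m along y=1
isobaths = [(100.0, [(x / 10, 0.0) for x in range(11)]),
            (200.0, [(x / 10, 1.0) for x in range(11)])]
print(interpolate_depth((0.5, 0.25), isobaths))  # → 125.0, a quarter of the way down
```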

  8. Z decay into a bottom quark, a light sbottom, and a light gluino

    International Nuclear Information System (INIS)

    Cheung Kingman; Keung Waiyee

    2003-01-01

    The discrepancy between the measured and theoretical production cross section of b quarks at the Fermilab Tevatron can probably be explained by the recently proposed scenario of light gluinos of mass 12-16 GeV and light sbottoms of mass 2-5.5 GeV. In this scenario, we study a related process at the Z pole, Z → b b̃₁* g̃ + b̄ b̃₁ g̃, followed by g̃ → b b̃₁* / b̄ b̃₁. The hadronic branching ratio for this channel is (1-3)×10⁻³, which is of the order of the size of the uncertainty in R_b. We find that a typical event consists of an energetic prompt bottom jet back to back with a 'fat' bottom jet, which consists of a bottom quark and two bottom squarks. Such events with a 10⁻³ branching ratio may affect the measurement of R_b; they are even more interesting if the fat bottom jet can be identified.

  9. Development of built-in debris-filter bottom nozzle for PWR fuel assemblies

    International Nuclear Information System (INIS)

    Juntaro Shimizu; Kazuki Monaka; Masaji Mori; Kazuo Ikeda

    2005-01-01

    Mitsubishi Heavy Industries, Ltd. (MHI) has worked to improve the anti-debris capability of the bottom nozzle for a PWR fuel assembly. The current debris filter bottom nozzle (DFBN), having 4 mm diameter flow holes, can only capture debris larger than the flow-hole inner diameter. MHI has completed the development of the built-in debris filter bottom nozzle, a new debris-filter concept for high burnup (55 GWd/t assembly average burnup). The built-in debris filter bottom nozzle consists of blades and a nozzle body. The blades, made from Inconel strip, are embedded and welded on the grooved top surface of the bottom nozzle adapter plate. Each flow hole is divided by a blade, reducing the trap size of the debris. Because the blades block the coolant flow, the pressure loss of the nozzle was expected to increase; however, by adjusting the relation between the blade and the taper shape of the flow hole, the pressure loss was successfully maintained at a satisfactory level. Grooves are cut in the nozzle plate; nevertheless, the additional skirts on the four sides of the nozzle compensate for the structural strength. (authors)

  10. Utilization of power plant bottom-ash particles as stabilizer in aluminum foams

    Energy Technology Data Exchange (ETDEWEB)

    Asavavisithchai, Seksak; Prapajaraswong, Attanadol [Chulalongkorn Univ., Bangkok (Thailand). Dept. of Metallurgical Engineering

    2013-07-01

    Aluminum foams produced via the powder metallurgical (PM) process normally require the addition of ceramic particles in the compaction stage, in order to increase both the foamability of the precursors and the mechanical properties of the final foam products. Bottom ash particles are a by-product waste obtained from thermoelectric power plants and are commonly disposed of in landfill facilities. The major chemical constituent of bottom ash particles, approximately between 30 wt.-% and 60 wt.-%, is SiO₂, depending on the chemical composition of the coal, the sintering condition and environment, and other process parameters. In this study, we explore the feasibility of utilizing bottom ash particles from thermoelectric power plant wastes as a stabilizer in aluminum foams. Small amounts, between 1 wt.-% and 5 wt.-%, of bottom ash particles of two sizes (mean sizes of 78 μm and 186 μm) were added to aluminum foams. Foam expansion, macro- and microstructures, as well as mechanical properties such as compressive strength and microhardness, were investigated. The results from the present study suggest that bottom ash particles can be used as a stabilizing material which can improve both the cellular structure and the mechanical properties of aluminum foams. (orig.)

  11. Testing the possibility for reusing mswi bottom ash in Greenlandic road construction

    DEFF Research Database (Denmark)

    Kirkelund, Gunvor Marie; Jørgensen, Anders Stuhr; Villumsen, Arne

    2012-01-01

    ...which can influence the quality of MSWI residues. About 15,000 tons of MSWI bottom ash is produced annually in Greenland and is disposed of at open disposal sites without leachate collection or encapsulation. The MSWI bottom ash could have value as a secondary resource in construction work in Greenland... requirements (a grain size distribution, wear resistance, visual fraction analysis and bearing capacity) for reuse as fill material in road construction [2]. Environmental classification based on heavy metal content and leachability was also investigated. The tests showed that it will not be possible to use the bottom ash directly after the incineration, as the bottom ash did not comply with all the requirements specified by the Danish Road Directorate. These technical requirements could be met by removing large fractions (> 45 mm) and metal parts as well as changing the grain size distribution...

  12. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  13. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Mineralization-alteration and structure information extraction plays an important role in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Taking copper and gold mines in Burma as an example, the authors adopt band ratios, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. A digital elevation model (DEM, 30 m spatial resolution) and ETM+ data were used to extract linear and circular faults that are associated with copper and gold mineralization. Combining geological data with the above information, the weights-of-evidence method and the C-A fractal model were used to integrate the information and identify ore-forming favourable zones in this area. Research results show that the high-grade potential targets coincide with the known copper and gold deposits, and that the integrated information can be used in further exploration and mineral-resource decision-making.
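
The band-ratio plus threshold-segmentation step can be sketched as follows. The band pair (TM/ETM+ bands 5/7, a commonly used ratio for hydroxyl-bearing alteration minerals) and the threshold value are illustrative assumptions, not parameters taken from the study:

```python
import numpy as np

def band_ratio_mask(band_a, band_b, threshold):
    """Ratio image band_a / band_b, then a binary mask where the ratio
    exceeds the threshold (simple threshold segmentation)."""
    ratio = band_a.astype(float) / np.maximum(band_b.astype(float), 1e-6)
    return ratio > threshold

rng = np.random.default_rng(0)
# synthetic ETM+-like digital numbers, 4x4 pixels
b5 = rng.integers(50, 200, (4, 4))
b7 = rng.integers(50, 200, (4, 4))
# e.g. a 5/7 ratio highlights hydroxyl alteration; 1.5 is an arbitrary cutoff
mask = band_ratio_mask(b5, b7, threshold=1.5)
print(mask.sum(), "candidate alteration pixels")
```

In practice the mask would then be combined with the PCA-derived alteration layers and fault layers inside the weights-of-evidence integration.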

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the New Ground Shoal site, 1992 - 2006 (NODC Accession 0012845)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2007 - 2008 and 2012 (NODC Accession 0093019)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Pillar Coral Forest site, 1996 - 2006 (NODC accession 0013096)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at M/V ELPIS Restoration Site, 2007 - 2011 (NODC Accession 0093024)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Bottom production

    Energy Technology Data Exchange (ETDEWEB)

    Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell-Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, M.L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-03-15

    In the context of the LHC experiments, the physics of bottom-flavoured hadrons enters in different contexts. It can be used for QCD tests, it affects the possibilities of B-decay studies, and it is an important source of background for several processes of interest. The physics of b production at hadron colliders has a rather long history, dating back to its first observation in the UA1 experiment. Subsequently, b production has been studied at the Tevatron. Besides the transverse momentum spectrum of a single b, it has also become possible, in recent times, to study correlations in the production characteristics of the b and the b̄. At the LHC new opportunities will be offered by the high statistics and the high energy reach. One expects to be able to study the transverse momentum spectrum at higher transverse momenta, and also to exploit the large statistics to perform more accurate studies of correlations.

  1. Bottom production

    International Nuclear Information System (INIS)

    Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell-Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, M.L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-01-01

    In the context of the LHC experiments, the physics of bottom-flavoured hadrons enters in different contexts. It can be used for QCD tests, it affects the possibilities of B-decay studies, and it is an important source of background for several processes of interest. The physics of b production at hadron colliders has a rather long history, dating back to its first observation in the UA1 experiment. Subsequently, b production has been studied at the Tevatron. Besides the transverse momentum spectrum of a single b, it has also become possible, in recent times, to study correlations in the production characteristics of the b and the b̄. At the LHC new opportunities will be offered by the high statistics and the high energy reach. One expects to be able to study the transverse momentum spectrum at higher transverse momenta, and also to exploit the large statistics to perform more accurate studies of correlations.

  2. Measurement of the bottom hadron lifetime at the Z0 resonancce

    Energy Technology Data Exchange (ETDEWEB)

    Fujino, Donald Hideo [Stanford Univ., CA (United States)

    1992-06-01

    We have measured the bottom hadron lifetime from bb̄ events produced at the Z⁰ resonance. Using the precision vertex detectors of the Mark II detector at the Stanford Linear Collider, we developed an impact parameter tag to identify bottom hadrons. The vertex tracking system resolved impact parameters to 30 μm for high-momentum tracks, and 70 μm for tracks with a momentum of 1 GeV. We selected B hadrons with an efficiency of 40% and a sample purity of 80% by requiring that there be at least two tracks in a single jet that significantly miss the Z⁰ decay vertex. From a total of 208 hadronic Z⁰ events collected by the Mark II detector in 1990, we tagged 53 jets, of which 22 came from 11 double-tagged events. The jets opposite the tagged ones, referred to as the 'untagged' sample, are rich in B hadrons and unbiased in B decay times. The variable Σδ is the sum of impact parameters from tracks in the jet, and contains vital information on the B decay time. We measured the B lifetime from a one-parameter likelihood fit to the untagged Σδ distribution, obtaining τ_b = 1.53 (+0.55/−0.45) ± 0.16 ps, which agrees with the current world average. The first error is statistical and the second is systematic. The systematic error was dominated by uncertainties in the track resolution function. As a check, we also obtained consistent results using the Σδ distribution from the tagged jets and from the entire hadronic sample without any bottom enrichment.
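
As a toy illustration of a one-parameter lifetime likelihood fit (not the Mark II Σδ analysis, which must additionally model resolution and selection effects): for a pure exponential decay-time distribution, the maximum-likelihood lifetime estimate is simply the sample mean, with statistical uncertainty τ̂/√n.

```python
import random
import statistics

random.seed(1)
TRUE_TAU = 1.53  # ps, the central value reported above, used to generate toys
times = [random.expovariate(1.0 / TRUE_TAU) for _ in range(5000)]

tau_hat = statistics.mean(times)            # ML estimator for an exponential
stat_err = tau_hat / len(times) ** 0.5      # statistical uncertainty of the mean
print(f"tau = {tau_hat:.2f} +/- {stat_err:.2f} ps")
```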

  3. Coil in bottom part of splitter magnet

    CERN Multimedia

    CERN PhotoLab

    1976-01-01

    Radiation-resistant coil being bedded into the bottom part of a splitter magnet. This very particular magnet split the beam into 3 branches, for 3 target stations in the West-Area. See Annual Report 1975, p.176, Figs.14 and 15.

  4. Using Local Grammar for Entity Extraction from Clinical Reports

    Directory of Open Access Journals (Sweden)

    Aicha Ghoulam

    2015-06-01

    Full Text Available Information extraction (IE) is a natural language processing (NLP) task whose aim is to analyze texts written in natural language to extract structured and useful information such as named entities and the semantic relations linking these entities. Information extraction is an important task for many applications such as bio-medical literature mining, customer care, community websites, and personal information management. The increasing amount of information available in patient clinical reports is difficult to access. Because it is often in unstructured text form, doctors need tools that give them access to this information and the ability to search it. Hence, a system that extracts this information in a structured form can benefit healthcare professionals. The work presented in this paper uses a local grammar approach to extract medical named entities from French patient clinical reports. Experimental results show that the proposed approach achieved an F-measure of 90.06%.
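
The local-grammar idea, a trigger word plus its immediate context identifying an entity, can be illustrated with a tiny sketch. The rules below are hypothetical examples in English, not the paper's actual grammar for French reports:

```python
import re

# Hypothetical local-grammar style rules: each pattern anchors an entity to a
# local trigger context (the word "diagnosed with", a number followed by "mg").
RULES = [
    (r"diagnosed with ([a-z0-9 ]+?)(?:\.|,|;)", "DIAGNOSIS"),
    (r"(\d+(?:\.\d+)?\s?mg)\b", "DOSAGE"),
]

def extract_entities(text):
    found = []
    for pattern, label in RULES:
        for m in re.finditer(pattern, text, flags=re.IGNORECASE):
            found.append((label, m.group(1).strip()))
    return found

report = "Patient was diagnosed with type 2 diabetes, and given 500 mg metformin."
print(extract_entities(report))
# → [('DIAGNOSIS', 'type 2 diabetes'), ('DOSAGE', '500 mg')]
```

Real local-grammar systems compose many such finite-state patterns per entity type rather than a handful of regular expressions.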

  5. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on the task of protein-protein interaction (PPI) extraction, where it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  6. The Role of Rough Topography in Mediating Impacts of Bottom Drag in Eddying Ocean Circulation Models.

    Science.gov (United States)

    Trossman, David S; Arbic, Brian K; Straub, David N; Richman, James G; Chassignet, Eric P; Wallcraft, Alan J; Xu, Xiaobiao

    2017-08-01

    Motivated by the substantial sensitivity of eddies in two-layer quasi-geostrophic (QG) turbulence models to the strength of bottom drag, this study explores the sensitivity of eddies in more realistic ocean general circulation model (OGCM) simulations to bottom drag strength. The OGCM results are interpreted using previous results from horizontally homogeneous, two-layer, flat-bottom, f-plane, doubly periodic QG turbulence simulations and new results from two-layer β-plane QG turbulence simulations run in a basin geometry with both flat and rough bottoms. Baroclinicity in all of the simulations varies greatly with drag strength, with weak drag corresponding to more barotropic flow and strong drag corresponding to more baroclinic flow. The sensitivity of the baroclinicity in the QG basin simulations to bottom drag is considerably reduced, however, when rough topography is used in lieu of a flat bottom. Rough topography reduces the sensitivity of the eddy kinetic energy amplitude and horizontal length scales in the QG basin simulations to bottom drag to an even greater degree. The OGCM simulation behavior is qualitatively similar to that in the QG rough bottom basin simulations in that baroclinicity is more sensitive to bottom drag strength than are eddy amplitudes or horizontal length scales. Rough topography therefore appears to mediate the sensitivity of eddies in models to the strength of bottom drag. The sensitivity of eddies to parameterized topographic internal lee wave drag, which has recently been introduced into some OGCMs, is also briefly discussed. Wave drag acts like a strong bottom drag in that it increases the baroclinicity of the flow, without strongly affecting eddy horizontal length scales.

  7. Bottom-pressure observations of deep-sea internal hydrostatic and non-hydrostatic motions

    NARCIS (Netherlands)

    van Haren, H.

    2013-01-01

    In the ocean, sloping bottom topography is important for the generation and dissipation of internal waves. Here, the transition of such waves to turbulence is demonstrated using an accurate bottom-pressure sensor that was moored with an acoustic Doppler current profiler and high-resolution

  8. Air demand estimation in bottom outlets with the particle finite element method. Susqueda Dam case study

    Science.gov (United States)

    Salazar, Fernando; San-Mauro, Javier; Celigueta, Miguel Ángel; Oñate, Eugenio

    2017-07-01

    Dam bottom outlets play a vital role in dam operation and safety, as they allow controlling the water surface elevation below the spillway level. For partial openings, water flows under the gate lip at high velocity and drags air downstream of the gate, which may cause damage due to cavitation and vibration. The convenience of installing air vents in dam bottom outlets is well known by practitioners. The design of this element depends basically on the maximum air flow through the air vent, which in turn is a function of the specific geometry and the boundary conditions. The intrinsic features of this phenomenon make it hard to analyse either on site or in full-scale experimental facilities. As a consequence, empirical formulas are frequently employed, which offer a conservative estimate of the maximum air flow. In this work, the particle finite element method (PFEM) was used to model the air-water interaction in the Susqueda Dam bottom outlet, with different gate openings. Specific enhancements of the formulation were developed to consider air-water interaction. The results were compared with the conventional design criteria and with information gathered on site during the gate operation tests. This analysis suggests that numerical modelling with the PFEM can be helpful for the design of this kind of hydraulic works.
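
One widely cited empirical relation of the kind mentioned above is the Kalinske-Robertson formula for air entrainment by a hydraulic jump in a closed conduit, β = Q_air/Q_water = 0.0066 (F − 1)^1.4, where F is the Froude number at the vena contracta. The sketch below applies it with purely illustrative numbers, not values from the Susqueda Dam study:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude(velocity, depth):
    """Froude number of a free-surface flow of given depth."""
    return velocity / math.sqrt(G * depth)

def air_demand(q_water, velocity, depth):
    """Air discharge (m^3/s) from the Kalinske-Robertson relation
    beta = 0.0066 * (F - 1)**1.4, with F at the vena contracta."""
    f = froude(velocity, depth)
    beta = 0.0066 * (f - 1.0) ** 1.4
    return beta * q_water

# illustrative example: 55 m^3/s of water at 22 m/s over a 0.5 m contracted depth
q_air = air_demand(q_water=55.0, velocity=22.0, depth=0.5)
print(round(q_air, 1), "m^3/s of entrained air")
```

An air vent sized from such a formula is then checked against a limiting air velocity in the vent, which is one reason the empirical estimates tend to be conservative compared with the simulated demand.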

  9. Fresh Properties and Flexural Strength of Self-Compacting Concrete Integrating Coal Bottom Ash

    Directory of Open Access Journals (Sweden)

    Jamaluddin Norwati

    2016-01-01

    Full Text Available This paper presents the effect of using coal bottom ash as a partial replacement for fine aggregates in self-compacting concrete (SCC) on its fresh properties and flexural strength. Natural sand was replaced with coal bottom ash up to 30% by volume. A comparison between SCC mixes with various replacement levels of fine aggregates with coal bottom ash showed that flexural strength decreased as the water-cement ratio increased from 0.35 to 0.45. The fresh properties were investigated by the slump flow, T500 spread time, L-box and sieve segregation resistance tests, in order to evaluate self-compactability in comparison to control samples made with natural sand. The results revealed that the flowability and passing ability of the SCC mixtures decrease with higher contents of coal bottom ash replacement. The results also showed that the flexural strength is affected by the presence of coal bottom ash in the concrete. In addition, the water-cement ratio has a significant influence in concretes with higher binder content.

  10. Large Eddy Simulations of a Bottom Boundary Layer Under a Shallow Geostrophic Front

    Science.gov (United States)

    Bateman, S. P.; Simeonov, J.; Calantoni, J.

    2017-12-01

    The unstratified surf zone and the stratified shelf waters are often separated by dynamic fronts that can strongly impact the character of the Ekman bottom boundary layer. Here, we use large eddy simulations to study the turbulent bottom boundary layer associated with a geostrophic current on a stratified shelf of uniform depth. The simulations are initialized with a spatially uniform vertical shear that is in geostrophic balance with a pressure gradient due to a linear horizontal temperature variation. Superposed on the temperature front is a stable vertical temperature gradient. As turbulence develops near the bottom, the turbulence-induced mixing gradually erodes the initial uniform temperature stratification and a well-mixed layer grows in height until the turbulence becomes fully developed. The simulations provide the spatial distribution of the turbulent dissipation and the Reynolds stresses in the fully developed boundary layer. We vary the initial linear stratification and investigate its effect on the height of the bottom boundary layer and the turbulence statistics. The results are compared to previous models and simulations of stratified bottom Ekman layers.

  11. 90Sr content in the Black Sea bottom sediments after the Chernobyl NPP accident and its use as a radiotracer for an assessment of bottom settlement rate

    International Nuclear Information System (INIS)

    Mirzoyeva, N. Y.; Egorov, V. N.; Polikarpov, G. G.

    2006-01-01

    An increase of 90Sr concentrations in the Black Sea bottom sediments along the western coast of the Black Sea and the southern part of Crimea was observed in 1987-1988. In our opinion, this was connected with hydrological processes (for example, currents) occurring in the given parts of the sea. The areas most polluted by post-Chernobyl 90Sr were the bottom sediments of the Dnieper, Dniester and Danube River deltas, the territory of the main channel of the North-Crimea Channel in the region of the Tarkhankut peninsula, and the southeastern part of Crimea (Feodosiya area). Not only has the confinement of the greatest 90Sr contents to the specified areas persisted with time (until 2000), but a further increase of the 90Sr concentration in the bottom sediments of the investigated regions is observed. Thus, the average concentration of 90Sr in Dnieper River delta bottom sediments was 28.5 Bq kg⁻¹ dry weight in 1987 and 148.2 Bq kg⁻¹ dry weight in 2000. This character of 90Sr redistribution shows that, both in the first years after the Chernobyl NPP accident and in the following time, the entry of 90Sr into the Black Sea basin occurs basically with the water flow of the large rivers in the northwestern part of the Black Sea and the discharge waters of the North-Crimean Channel. These sources of 90Sr input to the Black Sea ecosystem considerably prevailed over the direct atmospheric pollution by the given radionuclide in April-May 1986 immediately after the Chernobyl NPP accident. On the basis of the monitoring results, maps of the dynamics of 90Sr redistribution in the Black Sea bottom sediments (0-5 cm) from 1986 (the Chernobyl NPP accident) to 2000 were sketched out. The distribution of the 90Sr radionuclide in bottom sediment columns selected from the Corukh river mouth region and from the Dnieper-Bug estuary area was investigated. Peaks of increased 90Sr content were found in the profile of its vertical distribution in the bottom sediments. These peaks correspond to the periods of 90Sr

  12. Calculating systems-scale energy efficiency and net energy returns: A bottom-up matrix-based approach

    International Nuclear Information System (INIS)

    Brandt, Adam R.; Dale, Michael; Barnhart, Charles J.

    2013-01-01

    In this paper we expand the work of Brandt and Dale (2011) on ERRs (energy return ratios) such as EROI (energy return on investment). This paper describes a “bottom-up” mathematical formulation which uses matrix-based computations adapted from the LCA (life cycle assessment) literature. The framework allows multiple energy pathways and flexible inclusion of non-energy sectors. This framework is then used to define a variety of ERRs that measure the amount of energy supplied by an energy extraction and processing pathway compared to the amount of energy consumed in producing the energy. ERRs that were previously defined in the literature are cast in our framework for calculation and comparison. For illustration, our framework is applied to include oil production and processing and generation of electricity from PV (photovoltaic) systems. Results show that ERR values will decline as system boundaries expand to include more processes. NERs (net energy return ratios) tend to be lower than GERs (gross energy return ratios). External energy return ratios (such as net external energy return, or NEER (net external energy ratio)) tend to be higher than their equivalent total energy return ratios. - Highlights: • An improved bottom-up mathematical method for computing net energy return metrics is developed. • Our methodology allows arbitrary numbers of interacting processes acting as an energy system. • Our methodology allows much more specific and rigorous definition of energy return ratios such as EROI or NER
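The matrix-based formulation can be illustrated with a toy two-process example (entirely hypothetical numbers, not the paper's model): following the LCA convention, gross process outputs x solve (I − A)x = d, where A holds inter-process requirements and d is the net energy delivered to society; an energy return ratio then compares delivered energy to the energy the pathway consumes internally.

```python
# Hypothetical two-process energy pathway: "extraction" and "refining".
# A[i][j] = units of product i consumed per unit of output of process j.
A = [[0.05, 0.10],   # extraction output used by extraction, refining
     [0.02, 0.03]]   # refined energy fed back into both processes
demand = [0.0, 1.0]  # deliver 1 unit of refined energy to society

# Solve (I - A) x = demand for gross outputs x (2x2 closed form).
a, b = 1 - A[0][0], -A[0][1]
c, d = -A[1][0], 1 - A[1][1]
det = a * d - b * c
x = [(d * demand[0] - b * demand[1]) / det,
     (-c * demand[0] + a * demand[1]) / det]

# Energy consumed inside the pathway = gross output - net delivery.
internal = [xi - di for xi, di in zip(x, demand)]
invested = sum(internal)
# One possible net-style energy return ratio (definitions vary; see paper).
err = demand[1] / invested
```

The paper's framework generalizes exactly this pattern to arbitrary numbers of interacting processes and to the several ERR definitions (EROI, NER, NEER) it compares.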

  13. Cathodic protection simulation of above ground storage tank bottom: Experimental and numerical results

    Energy Technology Data Exchange (ETDEWEB)

    Schultz, Marcelo [Inspection Department, Rio de Janeiro Refinery - REDUC, Petrobras, Rio de Janeiro (Brazil); Brasil, Simone L.D.C. [Chemistry School, Federal University of Rio de Janeiro, UFRJ, Rio de Janeiro (Brazil); Baptista, Walmar [Corrosion Department, Research Centre - CENPES, Petrobras (Brazil); Miranda, Luiz de [Materials and Metallurgical Engineering Program, COPPE, UFRJ, Rio de Janeiro (Brazil); Brito, Rosane F. [Corrosion Department, Research Centre, CENPES, Petrobras, Rio de Janeiro (Brazil)

    2004-07-01

    The deterioration history of aboveground storage tanks (AST) at Petrobras' refineries shows that the greatest incidence of corrosion on AST bottoms occurs on the external side. This is a problem for the storage of crude oil and other final products. At this refinery, all ASTs are built over a concrete base with many piles to support the structure and distribute the load homogeneously. Because of this, it is very difficult to use cathodic protection as an anti-corrosion method for each of these tanks. This work presents an alternative cathodic protection system to protect the external side of the tank bottom using a new metallic bottom placed at varying distances from the original one. The space between the two bottoms was filled with one of two kinds of soil, sand or clay, both more conductive than the concrete. Using a prototype tank, the potential distributions over the new tank bottom were studied for different system parameters, such as soil resistivity and the number and position of anodes located on the old bottom. These experimental results were compared to numerical simulations carried out using software based on the Boundary Element Method. The computer simulation validates this protection method, confirming it to be a very useful tool for defining the optimized cathodic protection system configuration. (authors)

  14. Bottom-up vs. top-down effects on terrestrial insect herbivores: a meta-analysis.

    Science.gov (United States)

    Vidal, Mayra C; Murphy, Shannon M

    2018-01-01

    Primary consumers are under strong selection from resource ('bottom-up') and consumer ('top-down') controls, but the relative importance of these selective forces is unknown. We performed a meta-analysis to compare the strength of top-down and bottom-up forces on consumer fitness, considering multiple predictors that can modulate these effects: diet breadth, feeding guild, habitat/environment, type of bottom-up effects, type of top-down effects and how consumer fitness effects are measured. We focused our analyses on the most diverse group of primary consumers, herbivorous insects, and found that in general top-down forces were stronger than bottom-up forces. Notably, chewing, sucking and gall-making herbivores were more affected by top-down than bottom-up forces, top-down forces were stronger than bottom-up in both natural and controlled (cultivated) environments, and parasitoids and predators had equally strong top-down effects on insect herbivores. Future studies should broaden the scope of focal consumers, particularly in understudied terrestrial systems, guilds, taxonomic groups and top-down controls (e.g. pathogens), and test for more complex indirect community interactions. Our results demonstrate the surprising strength of forces exerted by natural enemies on herbivorous insects, and thus the necessity of using a tri-trophic approach when studying insect-plant interactions. © 2017 John Wiley & Sons Ltd/CNRS.
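Meta-analyses of this kind aggregate standardized effect sizes across studies. A minimal sketch of Hedges' g, one common effect-size metric for comparing treatment (e.g. enemy-exclusion) and control groups, is below; the numbers in the usage example are hypothetical, not data from this study, and the paper's exact metric may differ.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference between two groups with
    Hedges' small-sample bias correction."""
    # pooled standard deviation
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp              # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor
    return j * d

# Hypothetical example: herbivore fitness with vs. without predators
g = hedges_g(2.0, 1.0, 10, 1.0, 1.0, 10)
```

In a full meta-analysis, each study's g would be weighted by the inverse of its sampling variance before computing the overall top-down vs. bottom-up comparison.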

  15. A hybrid approach for robust multilingual toponym extraction and disambiguation

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    Toponym extraction and disambiguation are key topics recently addressed in the fields of Information Extraction and Geographical Information Retrieval. Toponym extraction and disambiguation are highly interdependent processes: not only does toponym extraction effectiveness affect disambiguation, but also

  16. Stabilization of bottom sediments from Rzeszowski Reservoir

    Directory of Open Access Journals (Sweden)

    Koś Karolina

    2015-06-01

    Full Text Available The paper presents results of the stabilization of bottom sediments from Rzeszowski Reservoir. Based on the geotechnical characteristics of the tested sediments, it was found that they do not fulfill all the criteria set for soils used in earth embankments. Therefore, an attempt was made to improve their parameters using two additives, cement and lime. Unconfined compressive strength, shear strength, bearing ratio and pH were determined on samples after different curing times. Based on the tests carried out, it was found that the obtained values of unconfined compressive strength of sediments stabilized with cement were relatively low and did not fulfill the requirements of the Polish standard for materials in road engineering. In the case of lime stabilization, it was found that the tested sediments with a 6% addition of the additive can be used for the bottom layers of an improved road base.

  17. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    OpenAIRE

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboa...

  18. Importance of Nonperturbative QCD Parameters for Bottom Mesons

    Directory of Open Access Journals (Sweden)

    A. Upadhyay

    2014-01-01

    Full Text Available The importance of nonperturbative quantum chromodynamics (QCD) parameters is discussed in the context of their predictive power for bottom meson masses and isospin splitting. In the framework of heavy quark effective theory, the work presented here focuses on the different allowed values of the two nonperturbative QCD parameters used in the heavy quark effective theory mass formula; using the best-fit parameters, the masses of the excited bottom meson states in the j^P = 1/2^+ doublet in the strange and nonstrange sectors are calculated. The calculated masses are found to match well with experiments and other phenomenological models. The mass splitting and hyperfine splitting have also been analyzed for both strange and nonstrange heavy mesons with respect to spin and flavor symmetries.
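For orientation, the two nonperturbative parameters such HQET analyses refer to are conventionally the light-degrees-of-freedom energy Λ̄ and the heavy-quark kinetic-energy parameter λ₁; in one standard convention (assumed here, not quoted from the paper) the heavy meson mass is expanded as

```latex
M_H \;=\; m_Q \;+\; \bar{\Lambda} \;-\; \frac{\lambda_1 + d_H\,\lambda_2}{2\,m_Q} \;+\; \mathcal{O}\!\left(\frac{1}{m_Q^2}\right),
```

where m_Q is the heavy quark mass, λ₂ parameterizes the chromomagnetic interaction (fixed by hyperfine splitting), and d_H is a spin-dependent coefficient (d_H = 3 for the pseudoscalar and d_H = −1 for the vector member of a doublet, up to sign conventions).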

  19. Channel Bottom Morphology in the Deltaic Reach of the Song Hau (mekong) River Channel in Vietnam

    Science.gov (United States)

    Allison, M. A.; Weathers, H. D., III; Meselhe, E. A.

    2016-02-01

    Boat-based channel bathymetry and bankline elevation studies were conducted in the tidal and estuarine Mekong River channel using multibeam bathymetry and LIDAR corrected for elevation by RTK satellite positioning. Two mapping campaigns, one at high discharge in October 2014 and one at low discharge in March 2015, were conducted in the lower 100 km reach of the Song Hau distributary channel to (1) examine bottom morphology and its relationship to sediment transport, and (2) provide information to set up the grid for multi-dimensional and reduced-complexity models of channel hydrodynamics and sediment dynamics. Sand fields were identified in the multibeam data by the presence of dunes as large as 2-4 m high with 40-80 m wavelengths and by clean sands in bottom grabs. Extensive areas of sand at the heads and toes of mid-channel islands displayed 10-25 m diameter circular pits that could be correlated with bucket-dredge sand mining activities observed at some of the sites. Large areas of the channel floor were relict (containing little or no modern sediment) during the high discharge campaign, identifiable by the presence of along-channel erosional furrows and terraced outcrops along the channel floor and margins. Laterally extensive flat areas were also observed in the channel thalweg. Both these and the relict areas were sampled by bottom grab as stiff silty clays. Complex cross-channel combinations of these morphologies were observed in some transects, suggesting strong bottom steering of tidal and riverine currents. Relative to high discharge, transects above and below the salt penetration limit showed evidence of shallowing in the thalweg and adjacent sloping areas at low discharge in March 2015. This shallowing, combined with the reduced extent of sand fields and furrowed areas and the soft muds in grabs, suggests seasonal trapping of fine-grained sediment by estuarine and tidal circulation.

  20. BWR fuel assembly bottom nozzle with one-way coolant flow valve

    International Nuclear Information System (INIS)

    Taleyarkhan, R.P.

    1987-01-01

    In a nuclear reactor having a flow of coolant/moderator fluid therein, at least one fuel assembly installed in the fluid flow, the fuel assembly is described comprising in combination: a bundle of elongated fuel rods disposed in side-by-side relationship so as to form an array of spaced fuel rods; an outer tubular flow channel surrounding the fuel rods so as to direct the flow of coolant/moderator fluid along the fuel rods; bottom and top nozzles mounted at opposite ends of the flow channel and having an inlet and outlet respectively for allowing entry and exit of the flow of coolant/moderator fluid into and from the flow channel and along the fuel rods therein; and a coolant flow direction control device operatively disposed in the bottom nozzle so as to open the inlet thereof to the flow of coolant/moderator fluid in an inflow direction into the flow channel through the bottom nozzle inlet but close the inlet to the flow of coolant/moderator fluid from the flow channel through the bottom nozzle inlet upon reversal of coolant/moderator fluid flow from the inflow direction

  1. Spouted bed drying of Bauhinia forficata link extract: the effects of feed atomizer position and operating conditions on equipment performance and product properties

    Directory of Open Access Journals (Sweden)

    C. R. F. Souza

    2005-06-01

    Full Text Available In this paper the effects of feed atomizer position and operating conditions on equipment performance (accumulation rate, product recovery, elutriation and thermal efficiency) and product properties (moisture content, size distribution, flavonoid degradation and flow properties) during spouted bed drying of Bauhinia forficata Link extract are evaluated. The parameters studied were the position of the atomizer system (top spray or bottom spray), the inlet temperature of the spouting gas (80 and 150 °C) and the feed mass flow rate of concentrated extract relative to the evaporation capacity of the dryer, Ws/Wmax (15 to 100%). Higher accumulation rate values were obtained with the atomizer placed at the bottom of the bed. In this configuration, the accumulation rate increases with the increase in the Ws/Wmax ratio. The best drying performance was obtained for the top spray configuration.

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract, at the M/V ELPIS Restoration Site, 2006 - 2007 (NODC Accession 0039899)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract, at the M/V ELPIS Restoration Site, 2004 - 2006 (NODC Accession 0010576)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 2007 - 2010 and 2011 - 2012 (NODC Accession 0093028)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Key-Back Reef, 2008 and 2011 - 2012 (NODC Accession 0093064)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. NMFS Bottom Longline Analytical Dataset Provided to NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Southeast Fisheries Science Center Mississippi Laboratories has conducted standardized bottom longline surveys in the Gulf of Mexico and South Atlantic since...

  7. s-Step Krylov Subspace Methods as Bottom Solvers for Geometric Multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lijewski, Mike [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Almgren, Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carson, Erin [Univ. of California, Berkeley, CA (United States); Knight, Nicholas [Univ. of California, Berkeley, CA (United States); Demmel, James [Univ. of California, Berkeley, CA (United States)

    2014-08-14

    Geometric multigrid solvers within adaptive mesh refinement (AMR) applications often reach a point where further coarsening of the grid becomes impractical as individual subdomain sizes approach unity. At this point the most common solution is to use a bottom solver, such as BiCGStab, to reduce the residual by a fixed factor at the coarsest level. Each iteration of BiCGStab requires multiple global reductions (MPI collectives). As the number of BiCGStab iterations required for convergence grows with problem size, and the time for each collective operation increases with machine scale, bottom solves in large-scale applications can constitute a significant fraction of the overall multigrid solve time. In this paper, we implement, evaluate, and optimize a communication-avoiding s-step formulation of BiCGStab (CABiCGStab for short) as a high-performance, distributed-memory bottom solver for geometric multigrid solvers. This is the first time s-step Krylov subspace methods have been leveraged to improve multigrid bottom solver performance. We use a synthetic benchmark for detailed analysis and integrate the best implementation into BoxLib in order to evaluate the benefit of an s-step Krylov subspace method on the multigrid solves found in the applications LMC and Nyx on up to 32,768 cores on the Cray XE6 at NERSC. Overall, we see bottom solver improvements of up to 4.2x on synthetic problems and up to 2.7x in real applications. This results in as much as a 1.5x improvement in solver performance in real applications.
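As a point of reference, classical (non-communication-avoiding) BiCGStab — the baseline bottom solver the paper improves on — can be sketched in a few lines. This is a generic textbook implementation for small dense systems, not the BoxLib/CABiCGStab code; note the two inner products per iteration that become global MPI reductions at scale.

```python
def bicgstab(A, b, tol=1e-10, max_iter=200):
    """Unpreconditioned BiCGStab for a dense matrix A (list of lists)."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    x = [0.0] * n
    r = [bi - yi for bi, yi in zip(b, matvec(x))]
    r_hat = r[:]                       # fixed shadow residual
    rho = alpha = omega = 1.0
    v = [0.0] * n
    p = [0.0] * n
    for _ in range(max_iter):
        rho_new = dot(r_hat, r)        # global reduction #1 at scale
        beta = (rho_new / rho) * (alpha / omega)
        rho = rho_new
        p = [ri + beta * (pi - omega * vi) for ri, pi, vi in zip(r, p, v)]
        v = matvec(p)
        alpha = rho / dot(r_hat, v)    # global reduction #2 at scale
        s = [ri - alpha * vi for ri, vi in zip(r, v)]
        if dot(s, s) ** 0.5 < tol:     # early exit on half-step residual
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            break
        t = matvec(s)
        omega = dot(t, s) / dot(t, t)
        x = [xi + alpha * pi + omega * si for xi, pi, si in zip(x, p, s)]
        r = [si - omega * ti for si, ti in zip(s, t)]
        if dot(r, r) ** 0.5 < tol:
            break
    return x

# Example: 1-D Poisson-like system whose exact solution is [1, 1, 1]
A = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
x = bicgstab(A, [1.0, 0.0, 1.0])
```

The s-step reformulation reorganizes these recurrences so that s basis vectors are generated per communication phase, amortizing the cost of the reductions over s iterations' worth of progress.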

  8. Phytotoxic effects of bottom sediments from Ignalina NPP wastewater canals and cooler

    International Nuclear Information System (INIS)

    Montvydiene, D.

    2002-01-01

    This paper assesses the impact of Ignalina Nuclear Power Plant (INPP) waste on the phytotoxicity of sediments from Lake Drukshiai. Samples of bottom sediments were collected from various INPP wastewater canals, from the canal of the wastewater treatment plant (WWTP), and from the small lake and rivulet on the route of those wastes into Lake Drukshiai. In 1995, 132 sites in Lake Drukshiai were surveyed in order to assess the phytotoxicity of its bottom sediments. The research was carried out in July of each year from 1993 to 2000. The numbers of somatic mutations (pink, colourless and morphological) and of nonviable stamen hairs (a hair indicating lethality when it contains fewer than 12 cells) in the Tradescantia (clone 02) stamen hair (SH) system were counted. The genotoxic effect of bottom sediments on Tradescantia was estimated according to Sparrow et al. (1972) and Marciulioniene et al. (1996). Genotoxic effects were considered weak if the proportion of somatic mutations did not exceed 1% and there were no nonviable stamen hairs; medium if the proportion of somatic mutations was 1.0-4.0% and nonviable stamen hairs did not reach 40.0%; and strong if the proportions of somatic mutations and nonviable stamen hairs exceeded 4.0% and 40.0%, respectively. L. sativum is a rather sensitive and widely applied biotest because of its simplicity, low cost and short duration. This test, based on the method of Magone (1989), lasted 48 hours, after which seed germination and the root length of seedlings were measured. Tested bottom sediments causing inhibition of 100-60%, 61-40%, 41-20%, and 20-0% were classified as highly toxic, moderately toxic, slightly toxic and non-toxic, respectively. Both assays were run in triplicate. The data were evaluated using analysis of variance with significance defined at α = 0.05. It was established that, according to their phytotoxic impact, the wastes discharged by the INPP into Lake Drukshiai in 1993-2000 are attributed

  9. The influence of triple bottom line on international operations management

    Directory of Open Access Journals (Sweden)

    Francisco Sperotto Flores

    2017-12-01

    Full Text Available This paper takes a triple bottom line perspective to analyze how the international operations literature integrates economic, social, and environmental issues. Additionally, it identifies the main drivers of and barriers to the adoption of triple bottom line practices by companies in an international context. We conducted a review of English-language journals that publish research on production and operations management and sustainability, resulting in a final sample of 29 papers. Results show that social and legal pressure on companies to adopt responsible behavior prompts an isomorphic process that leads them to conduct their operations in pursuit of triple bottom line goals. Behavioral differences between spin-offs in various countries have led institutions to create mechanisms that can press for and change private standards through regulation and enforcement. There is room for progress in studies that analyze a company's relationships in its international experience and its multi-institutional relations.

  10. Carbon deposition at the bottom of gaps in TEXTOR experiments

    International Nuclear Information System (INIS)

    Matveev, D.; Kirschner, A.; Esser, H.G.; Freisinger, M.; Kreter, A.; Van Hoey, O.; Borodin, D.; Litnovsky, A.; Wienhold, P.; Coenen, J.W.; Stoschus, H.; Philipps, V.; Brezinsek, S.; Van Oost, G.

    2013-01-01

    Results of a new dedicated experiment addressing the problem of impurity deposition at the bottom of gaps are presented along with modelling. A test limiter with an isolated gap was exposed to the scrape-off layer plasma in TEXTOR. The exposure was accompanied by injection of ¹³C-marked methane in the vicinity of the gap. Deposition at the bottom of the gap was monitored in situ with quartz microbalance diagnostics. A ¹³C deposition efficiency of about 2.6 × 10⁻⁵ was measured. Post-mortem analysis of the resulting deposited layers performed with SIMS and EPMA techniques yields a value about a factor of 2 smaller, corresponding to an approximately 10% contribution of the gap bottom to the total ¹³C deposition in the gap. This measured contribution is effectively much smaller than observed earlier in TEXTOR, taking the difference in geometry into account, and is in reasonable agreement with modelling performed with the ERO and 3D-GAPS codes

  11. Gear Selectivity of a Longfin Squid Bottom Trawl

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Loligo pealeii (longfin inshore squid) co-occurs with Atlantic butterfish (Peprilus triacanthus) throughout the year and discarding in the L. pealeii bottom trawl...

  12. Simple, miniaturized blood plasma extraction method.

    Science.gov (United States)

    Kim, Jin-Hee; Woenker, Timothy; Adamec, Jiri; Regnier, Fred E

    2013-12-03

    A rapid plasma extraction technology that collects a 2.5 μL aliquot of plasma within three minutes from a finger-stick derived drop of blood was evaluated. The utility of the plasma extraction cards used was that a paper collection disc bearing plasma was produced that could be air-dried in fifteen minutes and placed in a mailing envelope for transport to an analytical laboratory. This circumvents the need for venipuncture and blood collection in specialized vials by a phlebotomist, along with centrifugation and refrigerated storage. Plasma extraction was achieved by applying a blood drop to a membrane stack through which plasma was drawn by capillary action. During the course of plasma migration to a collection disc at the bottom of the membrane stack, blood cells were removed by a combination of adsorption and filtration. After the collection disc filled with an aliquot of plasma, the upper membranes were stripped from the collection card and the collection disc was air-dried. Intercard differences in the volume of plasma collected varied by approximately 1%, while volume variations of less than 2% were seen with hematocrit levels ranging from 20% to 71%. Dried samples bearing metabolites and proteins were then extracted from the disc and analyzed. 25-Hydroxyvitamin D was quantified by LC-MS/MS analysis following derivatization with a secosteroid signal-enhancing tag that imparted a permanent positive charge to the vitamin and reduced the limit of quantification (LOQ) to 1 pg of collected vitamin on the disc, comparable to values observed with liquid-liquid extraction (LLE) of a venipuncture sample. A similar study using conventional proteomics methods and spectral counting for quantification was conducted with yeast enolase added to serum as an internal standard. The LOQ with extracted serum samples for enolase was 1 μM, linear from 1 to 40 μM, the highest concentration examined. In all respects protein quantification with extracted serum samples was comparable to

  13. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
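The Weighted Least Squares adjustment and formal error propagation steps can be illustrated with a minimal 1-D sketch: a straight-line fit in which each observation carries an a priori standard deviation (far simpler than the photogrammetric adjustment in the paper, and with made-up data). The inverse of the normal-equations matrix supplies the predicted parameter variances, which is the same principle behind the predicted feature-location accuracies described above.

```python
def wls_line_fit(xs, ys, sigmas):
    """Fit y = a + b*x by weighted least squares, weight_i = 1/sigma_i^2.
    Returns (a, b, var_a, var_b), with variances from the inverse
    normal-equations matrix (rigorous error propagation, unit weight)."""
    ws = [1.0 / s**2 for s in sigmas]
    S   = sum(ws)
    Sx  = sum(w * x for w, x in zip(ws, xs))
    Sy  = sum(w * y for w, y in zip(ws, ys))
    Sxx = sum(w * x * x for w, x in zip(ws, xs))
    Sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    det = S * Sxx - Sx * Sx           # normal-equations determinant
    a = (Sxx * Sy - Sx * Sxy) / det   # intercept
    b = (S * Sxy - Sx * Sy) / det     # slope
    var_a, var_b = Sxx / det, S / det # diagonal of inverse normal matrix
    return a, b, var_a, var_b

# Hypothetical observations lying exactly on y = 2 + 3x
a, b, var_a, var_b = wls_line_fit([0, 1, 2, 3], [2, 5, 8, 11],
                                  [1.0, 1.0, 1.0, 1.0])
```

In the paper's setting the unknowns are sensor positions/attitudes and 3D feature coordinates rather than a line, but the structure — weighted normal equations plus covariance from their inverse — is the same.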

  14. Analysis of hydrodynamic characteristics of unmanned underwater vehicle moving close to the sea bottom

    Directory of Open Access Journals (Sweden)

    Xiao-xu Du

    2014-03-01

    Full Text Available Accurate research on the hydrodynamics of an unmanned underwater vehicle (UUV) moving close to the sea bottom has great significance for its maneuverability. Structured grids for the computational models with different distances to the sea bottom and different attack angles are generated with Ansys ICEM, and the flow field near the sea bottom is simulated using CFX. The characteristics of the drag, lift, and pitching moment as influenced by the distance to the sea bottom and the attack angle are studied. The results show that the drag coefficient increases as the distance decreases, and also increases with increasing attack angle. An attraction force exists when the UUV moves close to the sea bottom, and this force increases as the distance decreases. The lift coefficient increases with increasing attack angle. The absolute value of the pitching moment coefficient increases as the distance decreases and as the attack angle increases.
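The drag and lift coefficients discussed above follow the standard nondimensionalization of a force by the dynamic pressure and a reference area; the numbers in the example are illustrative, not values from the study.

```python
def force_coefficient(force, rho, velocity, ref_area):
    """Generic hydrodynamic coefficient C = F / (0.5 * rho * V^2 * A).
    Applies to drag (Cd) and lift (Cl) alike."""
    return force / (0.5 * rho * velocity**2 * ref_area)

# Illustrative: 100 N of drag on a UUV with 0.5 m^2 reference area
# moving at 2 m/s in seawater (rho ~ 1025 kg/m^3)
cd = force_coefficient(100.0, 1025.0, 2.0, 0.5)
```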

  15. A Study on the Environmental Standard of Sediment on the Bottom of the Water

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Yoo, Hye Jin [Korea Environment Institute, Seoul (Korea)

    2000-12-01

    Bottom sediment has been considered one of the water pollutants in Korean environmental management and has therefore been managed simply as a pollutant, as seen in dredging operations in polluted sea areas. To soundly maintain and conserve the aquatic ecosystem, including bottom-dwelling organisms, bottom sediment should be recognized as an independent medium that must maintain a certain quality, like water, the atmosphere, and soil, rather than merely as a source of water pollution. Such recognition means that sediment management should shift from a fragmentary goal centered on after-the-fact measures focused on water management to an ecosystem-level goal that includes bottom-dwelling organisms. From this point of view, this study is significant in suggesting not only the final goal for the management of bottom sediment but also the necessity of developing an environmental standard for bottom sediment, which would serve as a standard for management and judgment in actual sediment management practice (assessment of sediment pollution, removal of polluted sediment, purification of sediment, and disposal of dredged sediment), as well as measures for its development. Considering that even a basic scheme for sediment management has not been prepared at the Government level, the concept of an environmental standard for sediment, foreign examples of such standards, the current state of domestic sediment pollution, and the development scheme presented in this study form an important foundation for establishing a sediment management system at the Government level. 121 refs., 10 figs., 45 tabs.

  16. An Analysis Model for Water Cone Subsidence in Bottom Water Drive Reservoirs

    Science.gov (United States)

    Wang, Jianjun; Xu, Hui; Wu, Shucheng; Yang, Chao; Kong, lingxiao; Zeng, Baoquan; Xu, Haixia; Qu, Tailai

    2017-12-01

    Water coning in bottom water drive reservoirs, which results in earlier water breakthrough, a rapid increase in water cut and a low recovery level, has drawn tremendous attention in the petroleum engineering field. As a simple and effective method to inhibit bottom water coning, shut-in coning control is usually preferred in oilfields to control the water cone and thereby enhance economic performance. However, most water coning research has investigated the behavior of the cone as it grows; reported studies of water cone subsidence are very scarce. The goal of this work is to present an analytical model of water cone subsidence when the well is shut in. Based on the Dupuit critical oil production rate formula, an analytical model is developed to estimate the initial water cone shape at the point of critical drawdown. Then, with the initial water cone shape equation, we propose an analysis model for water cone subsidence in bottom water drive reservoirs. Model analysis and several sensitivity studies are conducted. This work presents an accurate and fast analytical model of water cone subsidence in bottom water drive reservoirs. Given the recent interest in the development of bottom water drive reservoirs, our approach provides a promising technique for better understanding the subsidence of the water cone.

  17. Study of droplet flow in a T-shape microchannel with bottom wall fluctuation

    Science.gov (United States)

    Pang, Yan; Wang, Xiang; Liu, Zhaomiao

    2018-03-01

    Droplet generation in a T-shape microchannel, with a main channel width of 50 μm, side channel width of 25 μm, and height of 50 μm, is simulated to study the effects of forced fluctuation of the bottom wall. Periodic fluctuations of the bottom wall are applied to the part of the main channel near the junction of the T-shape microchannel. The effects of the bottom wall's shape, fluctuation period, and amplitude on droplet generation are covered in this study. In the simulations, the average droplet size is affected little by the fluctuations but significantly by the fixed shape of the deformed bottom wall, while the droplet size range is expanded by the fluctuations under most conditions. Droplet sizes are distributed in a periodic pattern of small amplitude over time when the fluctuation is forced on the bottom wall near the T-junction, while the droplet emergence frequency is not altered by the fluctuation. The droplet velocity is altered by the bottom wall motion, especially for shorter periods and larger amplitudes. When the fluctuation period is similar to the droplet emergence period, the droplet size is as stable as in the non-fluctuation case after a development stage at the beginning of the flow, while the droplet velocity is varied by the moving wall by up to 80% of the average velocity under the conditions of this investigation.
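Droplet formation in such T-junctions is conventionally characterized by the capillary number of the continuous phase (viscous vs. interfacial forces). The abstract does not report it, so the fluid properties below are purely illustrative.

```python
def capillary_number(mu, velocity, sigma):
    """Ca = mu * U / sigma for the continuous phase in microchannel flow."""
    return mu * velocity / sigma

# Illustrative: water-like continuous phase (mu = 1e-3 Pa*s) at 10 mm/s
# with an interfacial tension of 5 mN/m
ca = capillary_number(1e-3, 0.01, 5e-3)
```

At low Ca (roughly below ~0.01) T-junction droplet generation is squeezing-dominated, which is why wall geometry near the junction, as studied here, has such a strong effect on droplet size.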

  18. A Comprehensive Review on the Properties of Coal Bottom Ash in Concrete as Sound Absorption Material

    Directory of Open Access Journals (Sweden)

    Ramzi Hannan Nurul Izzati Raihan

    2017-01-01

    Full Text Available The government is currently implementing policies to increase the usage of coal as fuel for electricity generation, while reducing the dependency on gas. Coal power plants in Malaysia produce large amounts of industrial waste such as bottom ash, which is collected in impoundment ponds (ash ponds); millions of tons of coal bottom ash waste accumulate in ponds near power plant stations. Since bottom ash has been classified as a hazardous material that threatens the health and safety of human life, an innovative and sustainable solution has been introduced: reusing or recycling industrial waste such as coal bottom ash in concrete mixtures to create a greener and more sustainable world. Bottom ash has the potential to be used as a concrete material to replace fine aggregates, coarse aggregates, or both. Hence, this paper provides an overview of previous research that used bottom ash as a fine aggregate replacement in conventional concrete. The workability, compressive strength, flexural strength, and sound absorption of bottom ash concrete are reviewed.

  19. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of their initial declarations and annual updates under the AP. In order to verify the consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency, and specifies what difficulties arise during evaluation with respect to cross-linking international projects and finding gaps in reporting. In addition, the paper discusses how the reporting quality of AP information on R&D activities and the assessment process for R&D information could be improved. (author)
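    The cross-linking step described above, identifying interrelationships between States in different R&D areas and finding gaps in reporting, can be sketched as grouping declaring States by R&D area. The data, State names, and area labels below are hypothetical illustrations, not actual declarations.

```python
from collections import defaultdict

# Hypothetical declared R&D entries as (state, R&D area) pairs, of the kind
# drawn from AP Article 2.a.(i)/2.a.(x)/2.b.(i) declarations.
declarations = [
    ("State A", "enrichment"),
    ("State A", "reprocessing"),
    ("State B", "enrichment"),
    ("State C", "fuel fabrication"),
]

def cross_link(decls):
    """Group declaring States by R&D area to expose interrelationships:
    any area with two or more States is a candidate joint project."""
    by_area = defaultdict(set)
    for state, area in decls:
        by_area[area].add(state)
    return by_area

def reporting_gaps(decls, all_states):
    """States with no declaration at all -- candidates for follow-up."""
    declared = {state for state, _ in decls}
    return set(all_states) - declared
```

    In this toy data, `cross_link` links State A and State B through their shared enrichment declarations, while `reporting_gaps` flags any State in scope that declared nothing.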

  20. An analytical evaluation for the pressure drop characteristics of bottom nozzle flow holes

    International Nuclear Information System (INIS)

    Yang, S. G.; Kim, H. J.; Lim, H. T.; Park, E. J.; Jeon, K. L.

    2002-01-01

    An analytical evaluation of the bottom nozzle flow holes was performed to find the best design concept in terms of pressure drop. For this analysis, the Computational Fluid Dynamics (CFD) code FLUENT 5.5 was selected as the evaluation tool. The applicability of the CFD code was verified by a benchmarking study against Vibration Investigation of Small-scale Test Assemblies (VISTA) test data for several flow conditions and a typical flow hole shape. In this verification, the analytical results agreed with the VISTA test data to within roughly 17%, and the overall trends under various flow conditions were very similar in both cases. Based on the CFD results, it is concluded that deburred and multiple-chamfer hole features at the leading edge are excellent design concepts for decreasing the pressure drop across the bottom nozzle plate. The deburred and multiple-chamfer hole features at the leading edge of the bottom nozzle plate offer 12% and 17% pressure drop reductions, respectively, relative to a single-chamfer hole feature. These design features are meaningful and applicable as a low pressure drop design concept for the bottom nozzle of a Pressurized Water Reactor (PWR) fuel assembly.
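    The reported pressure drop benefits follow directly from the form-loss relation Δp = K·ρ·v²/2: at a fixed hole velocity, a lower loss coefficient K gives a proportionally lower Δp. The loss coefficients and flow values below are illustrative assumptions, not the paper's data.

```python
def pressure_drop(K, rho, v):
    """Form-loss pressure drop (Pa) across a flow hole: dp = K * rho * v^2 / 2."""
    return K * rho * v**2 / 2.0

# Illustrative values: PWR coolant density ~740 kg/m^3, 5 m/s hole velocity.
# A 12% dp reduction (the abstract's deburred-hole figure) corresponds to a
# 12% lower loss coefficient relative to the single-chamfer baseline.
rho, v = 740.0, 5.0
K_single = 1.0                       # assumed baseline loss coefficient
dp_single = pressure_drop(K_single, rho, v)
dp_deburred = pressure_drop(K_single * (1.0 - 0.12), rho, v)
benefit = (dp_single - dp_deburred) / dp_single
```

    Because Δp is linear in K, a 12% lower loss coefficient yields exactly a 12% lower pressure drop at the same velocity, which is why hole-edge treatments that cut K translate one-for-one into plate pressure drop benefit.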