WorldWideScience

Sample records for extracting bottom information

  1. An efficient and not polluting bottom ash extraction system

    International Nuclear Information System (INIS)

    Carrea, A.

    1992-01-01

    This paper reports that boiler wastewater effluent must meet increasingly tight requirements to comply with environmental regulations; sluice water resulting from bottom ash handling is one of the main problems in this context, and many utilities are striving to maximize the reuse of sluice water and, if possible, to achieve zero water discharge from the bottom ash handling system. At the same time, ash reuse efforts are gaining strength in order to minimize waste production. One solution to these problems is an innovative Bottom Ash Extraction System (MAC System), characterized by continuous dry ash removal; the system has been developed over the last four years by MAGALDI INDUSTRIE SRL in collaboration with ANSALDO Ricerche, the R and D department of ANSALDO, the main Italian boiler manufacturer, and is now installed in six ENEL boilers. Eliminating water as the separation element between the bottom of the furnace and the outside atmosphere offers advantages mainly from the environmental viewpoint, but a certain improvement in boiler efficiency has also been demonstrated by the application of the system.

  2. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  3. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance. While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and video…

  4. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  5. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  6. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, such as intensity distribution, spatial frequency content, and several others. A few case studies, including isotropic and heterogeneous…

  7. Bottom-Up Technologies for Reuse: Automated Extractive Adoption of Software Product Lines

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Bissyandé, Tegawendé; Klein, Jacques; Le Traon, Yves

    2017-01-01

    Adopting Software Product Line (SPL) engineering principles demands a high up-front investment. Bottom-Up Technologies for Reuse (BUT4Reuse) is a generic and extensible tool aimed at leveraging existing similar software products in order to help in extractive SPL adoption. The envisioned users are 1) SPL adopters and 2) integrators of techniques and algorithms to provide automation in SPL adoption activities. We present the methodology it implies for both types of users ...

  8. Design of a tool for extracting a plexiglass fallen to the bottom of the TRIGA MKI reactor pool

    International Nuclear Information System (INIS)

    Kankunku, P.K.; Lukanda, M.V.

    2011-01-01

    This paper presents a particular problem: extracting a plexiglas from the bottom of the reactor swimming pool. After two attempts with rudimentary extraction techniques proved unsuccessful, we designed a steel tool that solved the problem of plexiglas extraction.

  9. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes is connected by links that have different connotations. Multiplex networks are ubiquitous, since they describe social, financial, engineering, and biological networks alike. Extending our ability to analyze complex networks to multiplex network structures greatly increases the amount of information that can be extracted from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks, and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
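
    For readers unfamiliar with the underlying machinery, the sketch below implements ordinary single-layer PageRank by power iteration, the building block that Multiplex PageRank generalizes; the layer coupling introduced in the paper is deliberately omitted, and the adjacency matrix is an invented toy example.

    ```python
    import numpy as np

    def pagerank(adj, alpha=0.85, tol=1e-10):
        """Ordinary PageRank by power iteration on a single layer.
        Multiplex PageRank generalizes this by letting one layer's scores
        bias another layer's transition weights; that coupling is omitted here."""
        n = adj.shape[0]
        out_deg = adj.sum(axis=0)
        out_deg[out_deg == 0] = 1.0          # dangling nodes: avoid divide-by-zero
        M = adj / out_deg                     # column-normalized transition matrix
        x = np.full(n, 1.0 / n)
        while True:
            x_new = alpha * M @ x + (1 - alpha) / n
            if np.abs(x_new - x).sum() < tol:
                return x_new
            x = x_new

    # Toy 4-node layer (adj[i, j] = 1 means an edge j -> i).
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [0, 1, 0, 0],
                    [1, 0, 1, 0]], dtype=float)
    print(pagerank(adj))
    ```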

  10. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  11. Selective spatial attention modulates bottom-up informational masking of speech

    OpenAIRE

    Carlile, Simon; Corkhill, Caitlin

    2015-01-01

    To hear out a conversation against other talkers, listeners overcome energetic and informational masking. Largely attributed to top-down processes, information masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers, suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in information masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speech…

  12. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    Science.gov (United States)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo; Gregersen, Niels; Bellet-Amalric, Edith; Nogues, Gilles; Kheng, Kuntheak

    2017-11-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach to fabricate a photonic fiberlike structure around such nanowire quantum dots by depositing an oxide shell using atomic-layer deposition. Simulations suggest that the intensity collected in our NA = 0.6 microscope objective can be increased by a factor of 7 with respect to the bare nanowire case. Combining microphotoluminescence, decay time measurements, and numerical simulations, we obtain a fourfold increase in the collected photoluminescence from the quantum dot. We show that this improvement is due to an increase of the quantum-dot emission rate and a redirection of the emitted light. Our ex situ fabrication technique allows a precise and reproducible fabrication on a large scale. Its improved extraction efficiency is compared to state-of-the-art top-down devices.

  13. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Hyperspectral data are characterized by many continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphological operations to connect the symbols and provide fifteen reference images. The results show that information extraction based on hyperspectral data can provide a better visual experience, which is beneficial to the study of ancient tombs by researchers, and provides some references for archaeological research findings.

  14. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data are characterized by many continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, this article uses three methods to extract the symbols at the bottom of the ancient tomb. Finally, we use morphological operations to connect the symbols and provide fifteen reference images. The results show that information extraction based on hyperspectral data can provide a better visual experience, which is beneficial to the study of ancient tombs by researchers, and provides some references for archaeological research findings.

  15. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

    The rapid growth of IT in the last two decades has led to a growth in the amount of information available online. A new style for sharing information is social media. Social media is a continuously and instantly updated source of information. In this position paper, we propose a framework for…

  16. Information Extraction From Chemical Patents

    Directory of Open Access Journals (Sweden)

    Sandra Bergmann

    2012-01-01

    The development of new chemicals or pharmaceuticals is preceded by an in-depth analysis of published patents in this field. This information retrieval is a costly and time-inefficient step when done by a human reader, yet it is mandatory for the potential success of an investment. The goal of the research project UIMA-HPC is to automate and hence speed up the process of knowledge mining about patents. Multi-threaded analysis engines, developed according to UIMA (Unstructured Information Management Architecture) standards, process texts and images in thousands of documents in parallel. UNICORE (UNiform Interface to COmputing Resources) workflow control structures make it possible to dynamically allocate resources for every given task to achieve the best CPU-time/real-time ratios in an HPC environment.

  17. Ecological and Economic Prerequisites for the Extraction of Solid Minerals from the Bottom of the Arctic Seas

    Directory of Open Access Journals (Sweden)

    Myaskov Alexander

    2017-01-01

    The World Ocean has huge reserves of minerals that are contained directly in the water, as well as on the surface of its bottom and in its subsoil. The deposits of solid minerals on the surface of the ocean bottom are considered the most promising for industrial extraction. Deposits of ferromanganese nodules, cobalt-manganese crusts, and polymetallic sulphides are considered as extraction targets more often than others. The largest deposits of ferromanganese nodules are found in the central and southern parts of the Pacific Ocean, in the central part of the Indian Ocean, and in the seas of the Arctic Ocean near Russia. Deposits of ferromanganese nodules are a serious alternative to deposits of manganese ore on land. However, there are many factors influencing the efficiency of developing ferromanganese deposits; the most significant are the content of the useful component in the ore, the depth of the bottom, and the distance from seaports. It is also necessary to take into account the possible environmental consequences of underwater mining.

  18. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    DEFF Research Database (Denmark)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo

    2017-01-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach...

  19. Selective spatial attention modulates bottom-up informational masking of speech.

    Science.gov (United States)

    Carlile, Simon; Corkhill, Caitlin

    2015-03-02

    To hear out a conversation against other talkers, listeners overcome energetic and informational masking. Largely attributed to top-down processes, information masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers, suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in information masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speech reception threshold, 40% of which was attributed to a release from informational masking. When the across-frequency temporal modulations of the masker talkers are decorrelated, the speech is unintelligible, although the within-frequency modulation characteristics remain identical. Used as a masker as above, information masking accounted for 37% of the spatial unmasking seen with this masker. This unintelligible and highly differentiable masker is unlikely to involve top-down processes. These data provide strong evidence of bottom-up masking involving speech-like, within-frequency modulations and that this presumably low-level process can be modulated by selective spatial attention.

  20. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    Koopmans, C.C.; Te Velde, D.W.; Groot, W.; Hendriks, J.H.A.

    1999-06-01

    The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure, in which new investments, adaptations to existing capital goods (retrofit) and 'good-housekeeping' are discerned. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based either on econometric models or on 'bottom-up information', i.e. disaggregated lists of technical possibilities for and costs of saving energy. Typically, one predicts more energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in the capital stock takes place only gradually. Parameter estimates for 19 sectors point to a long-term technological energy-efficiency improvement trend in Netherlands final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time-series data. Simulations of the effects of the oil price shocks in the seventies and the subsequent fall of oil prices show that NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. This suggests that it…

  1. Uncertainty quantification for radiation measurements: Bottom-up error variance estimation using calibration information

    International Nuclear Information System (INIS)

    Burr, T.; Croft, S.; Krieger, T.; Martin, K.; Norman, C.; Walsh, S.

    2016-01-01

    One example of top-down uncertainty quantification (UQ) involves comparing two or more measurements on each of multiple items. One example of bottom-up UQ expresses a measurement result as a function of one or more input variables that have associated errors, such as a measured count rate, which individually (or collectively) can be evaluated for impact on the uncertainty in the resulting measured value. In practice, it is often found that top-down UQ exhibits larger error variances than bottom-up UQ, because some error sources are present in the fielded assay methods used in top-down UQ that are not present (or not recognized) in the assay studies used in bottom-up UQ. One would like better consistency between the two approaches in order to claim understanding of the measurement process. The purpose of this paper is to refine bottom-up uncertainty estimation by using calibration information so that if there are no unknown error sources, the refined bottom-up uncertainty estimate will agree with the top-down uncertainty estimate to within a specified tolerance. Then, in practice, if the top-down uncertainty estimate is larger than the refined bottom-up uncertainty estimate by more than the specified tolerance, there must be omitted sources of error beyond those predicted from calibration uncertainty. The paper develops a refined bottom-up uncertainty approach for four cases of simple linear calibration: (1) inverse regression with negligible error in predictors, (2) inverse regression with non-negligible error in predictors, (3) classical regression followed by inversion with negligible error in predictors, and (4) classical regression followed by inversion with non-negligible errors in predictors. Our illustrations are of general interest, but are drawn from our experience with nuclear material assay by non-destructive assay. The main example we use is gamma spectroscopy that applies the enrichment meter principle. Previous papers that ignore error in predictors
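
    A minimal numerical sketch of case (3), classical regression followed by inversion with negligible error in the predictors, may help make the refinement concrete; the calibration points, the new measurement, and the noise level below are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical calibration data: reference values x vs. measured responses y.
    x_cal = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y_cal = np.array([10.2, 19.8, 30.5, 39.6, 50.1])

    # Classical linear calibration y = a + b*x, fitted by least squares.
    (b, a), cov = np.polyfit(x_cal, y_cal, 1, cov=True)   # cov is 2x2 for (b, a)
    resid = y_cal - (a + b * x_cal)
    s2 = resid @ resid / (len(x_cal) - 2)                  # residual variance

    # Inversion for a new measurement y_new: x_hat = (y_new - a) / b.
    y_new = 25.0
    x_hat = (y_new - a) / b

    # First-order (delta-method) variance of x_hat, combining the new-measurement
    # noise s2 with the calibration-parameter covariance -- the "bottom-up" part.
    g = np.array([-x_hat / b, -1.0 / b])                   # d x_hat / d(b, a)
    var_x = s2 / b**2 + g @ cov @ g
    print(f"x_hat = {x_hat:.3f} +/- {np.sqrt(var_x):.3f} (1 sigma)")
    ```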

  2. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and the challenges that the ext...

  3. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  4. Remediation Performance and Mechanism of Heavy Metals by a Bottom Up Activation and Extraction System Using Multiple Biochemical Materials.

    Science.gov (United States)

    Xiao, Kemeng; Li, Yunzhen; Sun, Yang; Liu, Ruyue; Li, Junjie; Zhao, Yun; Xu, Heng

    2017-09-13

    Soil contamination with heavy metals has caused serious environmental problems and increased the risks to humans and biota. Herein, we developed an effective bottom-up metal removal system based on the synergy between the activation of immobilized metal-resistant bacteria and the extraction by a bioaccumulator material (Stropharia rugosoannulata). In this system, the advantages of biochar produced at 400 °C and sodium alginate were integrated to immobilize the bacteria. Optimized by response surface methodology, the biochar and bacterial suspension were mixed at a ratio of 1:20 (w:v) for 12 h when 2.5% sodium alginate was added to the mixture. Results demonstrated that the system significantly increased the proportion of acid-soluble Cd and Cu and improved the soil microecology (microbial counts, soil respiration, and enzyme activities). The maximum extractions of Cd and Cu were 8.79 and 77.92 mg kg-1, respectively. Moreover, details of the possible mechanistic insight into the metal removal are discussed, which indicate a positive correlation with the acetic acid extractable metals and soil microecology. Meanwhile, the "dilution effect" in S. rugosoannulata probably plays an important role in the metal removal process. Furthermore, the metal-resistant bacteria in this system were successfully colonized, and the soil bacterial community was evaluated to understand the microbial diversity in metal-contaminated soil after remediation.

  5. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  6. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.
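
    As a rough illustration of link-removal backbone extraction (only a time-aware pruning rule, not the paper's hybrid method), one might filter a user-object bipartite edge list as follows; the edge list and the keep fraction are invented.

    ```python
    # Toy sketch: prune a user-object bipartite network by link age, keeping
    # the most recent fraction of each user's links as a crude "backbone".
    from collections import defaultdict

    # (user, object, timestamp) triples -- invented example data.
    edges = [
        ("u1", "o1", 1), ("u1", "o2", 5), ("u1", "o3", 9),
        ("u2", "o1", 2), ("u2", "o4", 8),
        ("u3", "o2", 3), ("u3", "o3", 4), ("u3", "o4", 7),
    ]

    def time_aware_backbone(edges, keep_fraction=0.5):
        """Keep the newest keep_fraction of links per user, drop the rest."""
        by_user = defaultdict(list)
        for u, o, t in edges:
            by_user[u].append((t, o))
        backbone = []
        for u, links in by_user.items():
            links.sort(reverse=True)                  # newest first
            n_keep = max(1, int(round(keep_fraction * len(links))))
            backbone.extend((u, o, t) for t, o in links[:n_keep])
        return backbone

    print(time_aware_backbone(edges))
    ```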

  7. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for detection and 3-d imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative, simple technique based on counting highly scattered muons in the voxels seems to be efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of a compact high-Z object without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping power measurement, and detection of muonic atom emission.

  8. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency.

  9. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society, and many solutions such as recommender systems have been proposed to filter out irrelevant information. In the literature, researchers have mainly been dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms, while they have overlooked the influence of the topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both their effectiveness and efficiency. PMID:23690946

  10. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasicontinuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs

  11. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  12. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
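
    A schematic sketch of the multireference-function idea is given below: each pixel's time-course is correlated against a bank of reference functions shifted in 100-ms steps, and the best-correlating shift serves as that pixel's time marker. The reference waveform, sampling interval, and noise level are invented for illustration; a realistic implementation would use an hrf-based reference.

    ```python
    import numpy as np

    def best_shift(time_course, reference, dt=1.0, shifts_ms=np.arange(0, 1000, 100)):
        """Correlate one pixel's time-course with a bank of shifted reference
        functions (100-ms steps) and return the shift with the highest correlation."""
        t = np.arange(len(time_course)) * dt              # acquisition times in seconds
        best_r, best_s = -np.inf, None
        for s in shifts_ms:
            shifted = np.interp(t - s / 1000.0, t, reference)   # delay reference by s ms
            r = np.corrcoef(time_course, shifted)[0, 1]
            if r > best_r:
                best_r, best_s = r, s
        return best_s, best_r

    # Example: a noisy time-course that lags the reference by roughly 300 ms.
    t = np.arange(64) * 1.0
    reference = np.sin(2 * np.pi * t / 16.0)
    pixel = np.interp(t - 0.3, t, reference) + 0.3 * np.random.randn(t.size)
    print(best_shift(pixel, reference))
    ```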

  13. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising, based on Donoho's wavelet soft-threshold denoising, is investigated to remove pseudo-Gibbs artifacts from the soft-thresholded image. Extraction of OAS object information based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting the object's information from an interferogram, and that information extraction with translation-invariant wavelet denoising is better than with soft-threshold wavelet denoising.

  14. Outcast labour in Asia: circulation and informalization of the workforce at the bottom of the economy

    NARCIS (Netherlands)

    Breman, J.

    2010-01-01

    Written over the last ten years, these essays focus on labor at the bottom of the rural economy, lacking social, economic, and political wherewithal, and their struggles to find a foothold in the urban economy. The author draws on his fieldwork from India, Indonesia, and China. The volume

  15. Nanoelectronics «bottom – up»: thermodynamics of electric conductor, information-driven battery and quantum entropy

    Directory of Open Access Journals (Sweden)

    Юрий Алексеевич Кругляк

    2015-11-01

    Within the «bottom-up» approach of nanoelectronics, the equilibrium thermodynamics of a conductor carrying a current is presented, and the accumulation of information in a non-equilibrium state is discussed through an analysis of an information-driven battery model, in connection with the Landauer principle on the minimum energy needed to erase one bit of information. The concept of quantum entropy is introduced, and the importance of integrating spintronics and magnetronics in view of the upcoming development of spin architectures for computing devices is discussed.

  16. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients’ comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respirational disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respirational disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleeping laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
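
    One commonly used way to derive a respiratory surrogate from the ECG (not necessarily the method developed in this thesis) is to track the respiratory modulation of R-peak amplitudes; a rough sketch with arbitrary thresholds follows.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def edr_from_rpeak_amplitudes(ecg, fs):
        """Crude ECG-derived respiration (EDR): detect R peaks and use the
        modulation of their amplitudes as a surrogate respiratory signal.
        The detection thresholds below are arbitrary illustration values."""
        peaks, _ = find_peaks(ecg,
                              distance=int(0.4 * fs),            # >= 0.4 s between beats
                              height=np.percentile(ecg, 90))     # crude R-peak height gate
        r_times = peaks / fs
        r_amps = ecg[peaks]
        # Resample the beat-to-beat amplitude series onto a uniform 4 Hz grid.
        t_uniform = np.arange(r_times[0], r_times[-1], 0.25)
        edr = np.interp(t_uniform, r_times, r_amps)
        return t_uniform, edr - edr.mean()
    ```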

  17. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  18. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies to access business information are urgently needed. An agent is described that deals with the problems of extracting Internet information caused by the non-standard and disorganized structure of Chinese websites. The agent includes three modules, each handling a separate part of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to structure the extracted natural-language information is also discussed.

  19. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information, e.g. "zidousya no uriage ga koutyou (Sales of cars were good)". Cause information is useful for investors in selecting companies to invest in. Our method extracts cause information in the form of causal expressions by using statistical information and initial clue expressions automatically. Our method can extract causal expressions without predetermined patterns or complex rules given by hand, and it is expected to be applicable to other tasks for acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that our new method outperforms the previous one.

  20. Cellular Mutagenicity and Heavy Metal Concentrations of Leachates Extracted from the Fly and Bottom Ash Derived from Municipal Solid Waste Incineration

    Science.gov (United States)

    Chen, Po-Wen; Liu, Zhen-Shu; Wun, Min-Jie; Kuo, Tai-Chen

    2016-01-01

    Two incinerators in Taiwan have recently attempted to reuse the fly and bottom ash that they produce, but the mutagenicity of these types of ash has not yet been assessed. Therefore, we evaluated the mutagenicity of the ash with the Ames mutagenicity assay using the TA98, TA100, and TA1535 bacterial strains. We obtained three leachates from three leachants of varying pH values using the toxicity characteristic leaching procedure test recommended by the Taiwan Environmental Protection Agency (Taiwan EPA). We then performed the Ames assay on the harvested leachates. To evaluate the possible relationship between the presence of heavy metals and mutagenicity, the concentrations of five heavy metals (Cd, Cr, Cu, Pb, and Zn) in the leachates were also determined. The concentrations of Cd and Cr in the most acidic leachate from the precipitator fly ash and the Cd concentration in the most acidic leachate from the boiler fly ash exceeded the recommended limits. Notably, none of the nine leachates extracted from the boiler, precipitator, or bottom ashes displayed mutagenic activity. This data partially affirms the safety of the fly and bottom ash produced by certain incinerators. Therefore, the biotoxicity of leachates from recycled ash should be routinely monitored before reusing the ash. PMID:27827867

  1. A Volunteered Geographic Information Framework to Enable Bottom-Up Disaster Management Platforms

    Directory of Open Access Journals (Sweden)

    Mohammad Ebrahim Poorazizi

    2015-08-01

    Recent disasters, such as the 2010 Haiti earthquake, have drawn attention to the potential role of citizens as active information producers. By using location-aware devices such as smartphones to collect geographic information in the form of geo-tagged text, photos, or videos, and sharing this information through online social media, such as Twitter, citizens create Volunteered Geographic Information (VGI). To effectively use this information for disaster management, we developed a VGI framework for the discovery of VGI. This framework consists of four components: (i) a VGI brokering module to provide a standard service interface to retrieve VGI from multiple resources based on spatial, temporal, and semantic parameters; (ii) a VGI quality control component, which employs semantic filtering and cross-referencing techniques to evaluate VGI; (iii) a VGI publisher module, which uses a service-based delivery mechanism to disseminate VGI; and (iv) a VGI discovery component to locate, browse, and query metadata about available VGI datasets. In a case study we employed a FOSS (Free and Open Source Software) strategy, open standards/specifications, and free/open data to show the utility of the framework. We demonstrate that the framework can facilitate data discovery for disaster management. The addition of quality metrics and a single aggregated source of relevant crisis VGI will allow users to make informed policy choices that could save lives, meet basic humanitarian needs earlier, and perhaps limit environmental and economic damage.

  2. Salient region detection by fusing bottom-up and top-down features extracted from a single image.

    Science.gov (United States)

    Tian, Huawei; Fang, Yuming; Zhao, Yao; Lin, Weisi; Ni, Rongrong; Zhu, Zhenfeng

    2014-10-01

    Recently, some global contrast-based salient region detection models have been proposed based on only the low-level feature of color. It is necessary to consider both color and orientation features to overcome their limitations, and thus improve the performance of salient region detection for images with low-contrast in color and high-contrast in orientation. In addition, the existing fusion methods for different feature maps, like the simple averaging method and the selective method, are not effective sufficiently. To overcome these limitations of existing salient region detection models, we propose a novel salient region model based on the bottom-up and top-down mechanisms: the color contrast and orientation contrast are adopted to calculate the bottom-up feature maps, while the top-down cue of depth-from-focus from the same single image is used to guide the generation of final salient regions, since depth-from-focus reflects the photographer's preference and knowledge of the task. A more general and effective fusion method is designed to combine the bottom-up feature maps. According to the degree-of-scattering and eccentricities of feature maps, the proposed fusion method can assign adaptive weights to different feature maps to reflect the confidence level of each feature map. The depth-from-focus of the image as a significant top-down feature for visual attention in the image is used to guide the salient regions during the fusion process; with its aid, the proposed fusion method can filter out the background and highlight salient regions for the image. Experimental results show that the proposed model outperforms the state-of-the-art models on three public available data sets.

  3. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IEP programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  4. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Keywords: information extraction (IE); text mining; text repositories; knowledge discovery from …

  5. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  6. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be interesting to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to maintain the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  7. Addressing Information Proliferation: Applications of Information Extraction and Text Mining

    Science.gov (United States)

    Li, Jingjing

    2013-01-01

    The advent of the Internet and the ever-increasing capacity of storage media have made it easy to store, deliver, and share enormous volumes of data, leading to a proliferation of information on the Web, in online libraries, on news wires, and almost everywhere in our daily lives. Since our ability to process and absorb this information remains…

  8. Integration of top-down and bottom-up information for audio organization and retrieval

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand

    The increasing availability of digital audio and music calls for methods and systems to analyse and organize these digital objects. This thesis investigates three elements related to such systems, focusing on the ability to represent and elicit the user's view on the multimedia object and the system output. The aim is to provide organization and processing which align with the understanding and needs of the users. Audio and music are often characterized by a large amount of heterogeneous information. The first aspect investigated is the integration of such multi-variate and multi-modal information … (indirect scaling). Inference is performed by analytical and simulation-based methods, including the Laplace approximation and expectation propagation. In order to minimize the cost of the often expensive and lengthy experimentation, sequential experiment design or active learning is supported. The setup…

  9. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We

  10. The Impact of Bottom-Up Parking Information Provision in a Real-Life Context: The Case of Antwerp

    Directory of Open Access Journals (Sweden)

    Geert Tasseron

    2017-01-01

    A number of studies have analyzed the possible impacts of bottom-up parking information or parking reservation systems on parking dynamics in abstract simulation environments. In this paper, we take these efforts one step further by investigating the impacts of these systems in a real-life context: the center of the city of Antwerp, Belgium. In our simulation, we assume that all on-street and off-street parking places are equipped with technology able to transmit their occupancy status to so-called smart cars, which can receive information and reserve a parking place. We employ PARKAGENT, an agent-based simulation model, to simulate the behavior of smart and regular cars. We obtain detailed data on parking demand from FEATHERS, an activity-based transport model. The simulation results show that parking information and reservation hardly impact search time but do reduce walking distance for smart cars, leading to a reduction in total parking time, that is, the sum of search time and walking time. Reductions in search time occur only in zones with high occupancy rates, while a drop in walking distance is especially observed in low occupancy areas. Societal benefits of parking information and reservation are limited, because of the low impact on search time and the possible negative health effects of reduced walking distance.

  11. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that get structured representations out of unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide this process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90 % of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with f1 = .989 (micro average) and f1 = .963 (macro average). As a result of keyword matching and restraint concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and documents with uncommon layout. The developed terminology and the proposed information extraction system make it possible to extract fine-grained information from German semi-structured transthoracic echocardiography reports.

  12. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    In this article we show how it is possible to use Channel Theory (Barwise and Seligman, 1997) for modeling the process of information extraction performed by audiences of audio-visual content. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can attempt the extraction of information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process is carried out for each channel, we propose a method of representing all the informative values an agent can obtain from a content, using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation allows reflecting the evolution of the informative items through the evolution of the audio-visual content.

  13. Top-Down and Bottom-Up Identification of Proteins by Liquid Extraction Surface Analysis Mass Spectrometry of Healthy and Diseased Human Liver Tissue

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J.; Lalor, Patricia F.; Bunch, Josephine; Cooper, Helen J.

    2014-09-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  14. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information about lanes is very important. This paper proposes a method for automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method first detects the edges of lanes from the grayscale gradient direction and fits them with an improved Probabilistic Hough transform; it then uses the vanishing-point principle to calculate the geometrical position of each lane, and uses lane characteristics to extract lane semantic information with a decision-tree classifier. In the experiment, 216 road video images captured by a camera mounted on a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
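
    As a rough illustration of the edge-detection and line-fitting stage described above, the following sketch applies Canny edge detection followed by a probabilistic Hough transform with OpenCV. It is not the paper's exact pipeline; the file name and all thresholds are illustrative assumptions and would need tuning.

```python
# A minimal sketch of the edge-detection and line-fitting stage (not the paper's
# exact pipeline): Canny edges followed by a probabilistic Hough transform.
import cv2
import numpy as np

frame = cv2.imread("road_frame.png", cv2.IMREAD_GRAYSCALE)  # illustrative file name

# Edge map from intensity gradients; thresholds would need tuning per camera.
edges = cv2.Canny(frame, 50, 150)

# Probabilistic Hough transform returns line segments as (x1, y1, x2, y2).
segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                           threshold=40, minLineLength=60, maxLineGap=20)

for x1, y1, x2, y2 in (segments.reshape(-1, 4) if segments is not None else []):
    slope = (y2 - y1) / (x2 - x1 + 1e-6)
    # Keep only segments steep enough to be lane markings, not horizon clutter.
    if abs(slope) > 0.3:
        cv2.line(frame, (x1, y1), (x2, y2), 255, 2)

cv2.imwrite("road_frame_lanes.png", frame)
```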

  15. The Bottom Boundary Layer.

    Science.gov (United States)

    Trowbridge, John H; Lentz, Steven J

    2018-01-03

    The oceanic bottom boundary layer extracts energy and momentum from the overlying flow, mediates the fate of near-bottom substances, and generates bedforms that retard the flow and affect benthic processes. The bottom boundary layer is forced by winds, waves, tides, and buoyancy and is influenced by surface waves, internal waves, and stratification by heat, salt, and suspended sediments. This review focuses on the coastal ocean. The main points are that (a) classical turbulence concepts and modern turbulence parameterizations provide accurate representations of the structure and turbulent fluxes under conditions in which the underlying assumptions hold, (b) modern sensors and analyses enable high-quality direct or near-direct measurements of the turbulent fluxes and dissipation rates, and (c) the remaining challenges include the interaction of waves and currents with the erodible seabed, the impact of layer-scale two- and three-dimensional instabilities, and the role of the bottom boundary layer in shelf-slope exchange.

  17. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

    Full Text Available Information extraction is an early stage of a process of textual data analysis. Information extraction is required to get information from textual data that can be used for further analysis, such as classification and categorization. Textual data are strongly influenced by the language. Arabic is gaining significant attention in many studies because the Arabic language is very different from others and, in contrast to other languages, tools and research on Arabic are still lacking. The information extracted using the knowledge dictionary is the concept of an expression. A knowledge dictionary is usually constructed manually by an expert; this takes a long time and is specific to a single problem. This paper proposes a method for automatically building a knowledge dictionary. The knowledge dictionary is formed by clustering sentences that share the same concept, on the assumption that they will have a high similarity value. The concepts that have been extracted can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested with a decision tree classification engine; the highest precision value obtained was 71.0% and the highest recall value was 75.0%.

  18. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  19. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Full Text Available Biomedical Text Mining targets the extraction of significant information from biomedical archives. Bio TM encompasses Information Retrieval (IR) and Information Extraction (IE). Information Retrieval retrieves the relevant biomedical literature documents from various repositories, such as PubMed and MedLine, based on a search query. The IR process ends with the generation of a corpus containing the relevant documents retrieved from the publication databases for the query. The IE task includes preprocessing of the documents, Named Entity Recognition (NER), and relationship extraction. This process draws on Natural Language Processing, data mining techniques, and machine learning algorithms. The preprocessing task includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins, or cell lines. This leads to the next phase, the extraction of relationships (IE). The work was based on the Conditional Random Fields (CRF) machine learning algorithm.
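
    A small sketch of the CRF-based NER step named above, using the sklearn-crfsuite package (an assumption; the paper does not specify an implementation). The toy sentences, features, and BIO labels are invented for illustration.

```python
# A minimal sketch of CRF-based biomedical NER (illustrative only; the paper's
# features and corpus are not reproduced here). Requires sklearn-crfsuite.
import sklearn_crfsuite

def word_features(sentence, i):
    """Simple per-token features: the token itself, shape hints, and neighbours."""
    word = sentence[i]
    return {
        "word.lower": word.lower(),
        "word.isupper": word.isupper(),
        "word.istitle": word.istitle(),
        "has_digit": any(c.isdigit() for c in word),
        "suffix3": word[-3:],
        "prev": sentence[i - 1].lower() if i > 0 else "<BOS>",
        "next": sentence[i + 1].lower() if i < len(sentence) - 1 else "<EOS>",
    }

# Tiny toy training set with BIO labels for protein/gene names (invented example).
sentences = [["BRCA1", "regulates", "DNA", "repair", "."],
             ["The", "p53", "protein", "is", "mutated", "."]]
labels = [["B-PROT", "O", "O", "O", "O"],
          ["O", "B-PROT", "O", "O", "O", "O"]]

X = [[word_features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)

test = ["TP53", "interacts", "with", "MDM2", "."]
print(crf.predict([[word_features(test, i) for i in range(len(test))]]))
```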

  20. Contact effects analyzed by a parameter extraction method based on a single bottom-gate/top-contact organic thin-film transistor

    Science.gov (United States)

    Takagaki, Shunsuke; Yamada, Hirofumi; Noda, Kei

    2018-03-01

    Contact effects in organic thin-film transistors (OTFTs) were examined by using our previously proposed parameter extraction method from the electrical characteristics of a single staggered-type device. Gate-voltage-dependent contact resistance and channel mobility in the linear regime were evaluated for bottom-gate/top-contact (BGTC) pentacene TFTs with active layers of different thicknesses, and for pentacene TFTs with contact-doped layers prepared by coevaporation of pentacene and tetrafluorotetracyanoquinodimethane (F4TCNQ). The extracted parameters suggested that the influence of the contact resistance becomes more prominent with the larger active-layer thickness, and that contact-doping experiments give rise to a drastic decrease in the contact resistance and a concurrent considerable improvement in the channel mobility. Additionally, the estimated energy distributions of trap density in the transistor channel probably reflect the trap filling with charge carriers injected into the channel regions. The analysis results in this study confirm the effectiveness of our proposed method, with which we can investigate contact effects and circumvent the influences of characteristic variations in OTFT fabrication.

  1. Enhanced light extraction from GaN-based LEDs with a bottom-up assembled photonic crystal

    International Nuclear Information System (INIS)

    Gong Haibo; Hao Xiaopeng; Wu Yongzhong; Cao Bingqiang; Xia Wei; Xu Xiangang

    2011-01-01

    Highlights: → Polystyrene (PS) microspheres were employed as a template. → A noninvasive photonic crystal was fabricated on the surface of GaN-based LED. → Periodic arrangement of bowl-like holes served as a photonic crystal with gradually changed fill factors. → The electroluminescence intensity of LED with a photonic crystal was significantly enhanced. - Abstract: Photonic crystal (PhC) structure is an efficient tool for light extraction from light-emitting diodes (LEDs). The fabrication of a large area PhC structure on the light output surface of LEDs often involves sophisticated equipment such as a nanoimprint lithography machine. In this study, a monolayer of polystyrene (PS) microspheres was employed as a template to fabricate a noninvasive photonic crystal of indium tin oxide (ITO) on the surface of a GaN-based LED. PS spheres can help to form a periodic arrangement of bowl-like holes, a photonic crystal with gradually changed fill factors. Importantly, the electroluminescence intensity of the LED with a photonic crystal was significantly enhanced by 1.5 times compared to that of the conventional one under various forward injection currents.

  2. Enhanced light extraction from GaN-based LEDs with a bottom-up assembled photonic crystal

    Energy Technology Data Exchange (ETDEWEB)

    Gong Haibo [State Key Lab of Crystal Materials, Shandong University, Jinan, 250100 (China); School of Materials Science and Engineering, University of Jinan, Jinan, 250022 (China); Hao Xiaopeng, E-mail: xphao@sdu.edu.cn [State Key Lab of Crystal Materials, Shandong University, Jinan, 250100 (China); Wu Yongzhong [State Key Lab of Crystal Materials, Shandong University, Jinan, 250100 (China); Cao Bingqiang [School of Materials Science and Engineering, University of Jinan, Jinan, 250022 (China); Xia Wei [Shandong Huaguang Optoelectronics Company, Ltd., Jinan, 250101 (China); Xu Xiangang [State Key Lab of Crystal Materials, Shandong University, Jinan, 250100 (China); Shandong Huaguang Optoelectronics Company, Ltd., Jinan, 250101 (China)

    2011-08-15

    Highlights: > Polystyrene (PS) microspheres were employed as a template. > A noninvasive photonic crystal was fabricated on the surface of GaN-based LED. > Periodic arrangement of bowl-like holes served as a photonic crystal with gradually changed fill factors. > The electroluminescence intensity of LED with a photonic crystal was significantly enhanced. - Abstract: Photonic crystal (PhC) structure is an efficient tool for light extraction from light-emitting diodes (LEDs). The fabrication of a large area PhC structure on the light output surface of LEDs often involves sophisticated equipment such as a nanoimprint lithography machine. In this study, a monolayer of polystyrene (PS) microspheres was employed as a template to fabricate a noninvasive photonic crystal of indium tin oxide (ITO) on the surface of a GaN-based LED. PS spheres can help to form a periodic arrangement of bowl-like holes, a photonic crystal with gradually changed fill factors. Importantly, the electroluminescence intensity of the LED with a photonic crystal was significantly enhanced by 1.5 times compared to that of the conventional one under various forward injection currents.

  3. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes in recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in reality. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information. Besides word and punctuation indexes, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.

  4. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies were predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied, where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit.

  5. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  6. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
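
    The record above describes the core of the rapid automatic keyword extraction (RAKE) procedure: candidate phrases are delimited by stop words and punctuation, word scores come from co-occurrence degree and frequency, and candidate phrases are ranked by the sum of their word scores. A minimal sketch of that scoring scheme follows; the stop-word list and sample text are illustrative.

```python
# A minimal sketch of the RAKE idea described above: split on stop words and
# punctuation, score words by co-occurrence degree over frequency, and score
# candidate phrases by summing their word scores.
import re
from collections import defaultdict

STOP_WORDS = {"the", "of", "and", "a", "an", "in", "for", "is", "are", "to", "by", "or"}

def rake(text, top_n=3):
    # Candidate phrases are maximal runs of words not interrupted by stop words
    # or punctuation.
    words = re.split(r"[^a-zA-Z0-9]+", text.lower())
    phrases, current = [], []
    for w in words:
        if not w or w in STOP_WORDS:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)

    # Word score = degree / frequency, where degree counts co-occurrences
    # within candidate phrases (including the word itself).
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)
    word_score = {w: degree[w] / freq[w] for w in freq}

    # Keyword score = sum of member word scores; return the highest-scoring phrases.
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(rake("Rapid automatic keyword extraction for information retrieval and analysis of text"))
```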

  7. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    Full Text Available A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times for a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.
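
    Background extraction is named above as the usual first step of vehicle detection. The sketch below uses OpenCV's MOG2 background subtractor as a stand-in for the paper's own background-extraction method; the video file name and parameters are illustrative assumptions.

```python
# A minimal sketch of a background-extraction step for highway surveillance,
# using OpenCV's built-in MOG2 model (not the paper's method).
import cv2

capture = cv2.VideoCapture("highway.mp4")  # illustrative file name
background = cv2.createBackgroundSubtractorMOG2(history=300, varThreshold=25,
                                                detectShadows=True)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    # Foreground mask: moving vehicles are 255, detected shadows are 127.
    mask = background.apply(frame)
    # Drop shadow pixels so cast shadows are not counted as vehicle area.
    _, vehicles = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    print("foreground pixels:", cv2.countNonZero(vehicles))

capture.release()
```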

  8. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses. · Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  9. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  10. Using classic methods in a networked manner: seeing volunteered spatial information in a bottom-up fashion

    NARCIS (Netherlands)

    Carton, L.J.; Ache, P.M.

    2014-01-01

    Using new social media and ICT infrastructures for self-organization, more and more citizen networks and business sectors organize themselves voluntarily around sustainability themes. The paper traces and evaluates one emerging innovation in such bottom-up, networked form of sustainable

  11. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Full Text Available Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source-language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
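
    As a rough illustration of clustering name spelling variants by string similarity, the sketch below groups variants greedily with a normalized similarity ratio and picks a canonical form per cluster. It is a simplified stand-in for the alignment-based method described above; the names and the threshold are illustrative.

```python
# A minimal sketch of grouping transliteration spelling variants by string
# similarity and normalizing each cluster to a canonical form.
from difflib import SequenceMatcher

names = ["Mohammed", "Muhammad", "Mohamed", "Qaddafi", "Gaddafi", "Kadhafi"]

def similar(a: str, b: str, threshold: float = 0.65) -> bool:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Greedy single-pass clustering: attach each name to the first compatible cluster.
clusters = []
for name in names:
    for cluster in clusters:
        if similar(name, cluster[0]):
            cluster.append(name)
            break
    else:
        clusters.append([name])

# Normalize each cluster to a canonical form (here simply the most frequent
# spelling, falling back to an arbitrary member when counts tie).
for cluster in clusters:
    canonical = max(set(cluster), key=cluster.count)
    print(canonical, "<-", cluster)
```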

  12. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources, there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools that will expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus, knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  13. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.

  14. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are being employed more and more extensively, with demand driven also by the new norms sanctioning the legal value of digital documents, provided they are stored on physically unalterable media. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those of magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in extracting information, providing data that are much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  15. Bottom Production

    CERN Document Server

    Nason, P.; Schneider, O.; Tartarelli, G.F.; Vikas, P.; Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell'Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, Michelangelo L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-01-01

    We review the prospects for bottom production physics at the LHC. Members of the working group who have contributed to this document are: J. Baines, S.P. Baranov, P. Bartalini, A. Bay, E. Bouhova, M. Cacciari, A. Caner, Y. Coadou, G. Corti, J. Damet, R. Dell'Orso, J.R.T. De Mello Neto, J.L. Domenech, V. Drollinger, P. Eerola, N. Ellis, B. Epp, S. Frixione, S. Gadomski, I. Gavrilenko, S. Gennai, S. George, V.M. Ghete, L. Guy, Y. Hasegawa, P. Iengo, A. Jacholkowska, R. Jones, A. Kharchilava, E. Kneringer, P. Koppenburg, H. Korsmo, M. Kraemer, N. Labanca, M. Lehto, F. Maltoni, M.L. Mangano, S. Mele, A.M. Nairz, T. Nakada, N. Nikitin, A. Nisati, E. Norrbin, F. Palla, F. Rizatdinova, S. Robins, D. Rousseau, M.A. Sanchis-Lozano, M. Shapiro, P. Sherwood, L. Smirnova, M. Smizanska, A. Starodumov, N. Stepanov, R. Vogt

  16. Bottom up

    International Nuclear Information System (INIS)

    Ockenden, James

    1999-01-01

    This article presents an overview of the electricity supply industries in Eastern Europe. The development of more competitive and efficient plant in Poland and work on emissions control ahead of EU membership; the Czech Republic's complicated tariff system; Hungary's promised 8% return on investment in its electricity supply industry and its tariff problems; Bulgaria's and Ukraine's desperate need for investment to build alternatives to their aging nuclear plants; and demand outstripping supply in Romania are among the topics considered. The vicious circle of poor service and low utility income is considered, and the top-down approach to breaking the cycle by improving plant efficiency, and the bottom-up approach of improving plant income as practiced by Moldavia, are explained. (UK)

  17. Bottom production

    International Nuclear Information System (INIS)

    Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell-Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, M.L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-01-01

    In the context of the LHC experiments, the physics of bottom-flavoured hadrons enters in different contexts. It can be used for QCD tests, it affects the possibilities for studies of B decays, and it is an important source of background for several processes of interest. The physics of b production at hadron colliders has a rather long history, dating back to its first observation in the UA1 experiment. Subsequently, b production has been studied at the Tevatron. Besides the transverse momentum spectrum of a single b, it has also become possible, in recent times, to study correlations in the production characteristics of the b and the b-bar. At the LHC, new opportunities will be offered by the high statistics and the high energy reach. One expects to be able to study the transverse momentum spectrum at higher transverse momenta, and also to exploit the large statistics to perform more accurate studies of correlations.

  18. Bottom production

    Energy Technology Data Exchange (ETDEWEB)

    Baines, J.; Baranov, S.P.; Bartalini, P.; Bay, A.; Bouhova, E.; Cacciari, M.; Caner, A.; Coadou, Y.; Corti, G.; Damet, J.; Dell-Orso, R.; De Mello Neto, J.R.T.; Domenech, J.L.; Drollinger, V.; Eerola, P.; Ellis, N.; Epp, B.; Frixione, S.; Gadomski, S.; Gavrilenko, I.; Gennai, S.; George, S.; Ghete, V.M.; Guy, L.; Hasegawa, Y.; Iengo, P.; Jacholkowska, A.; Jones, R.; Kharchilava, A.; Kneringer, E.; Koppenburg, P.; Korsmo, H.; Kramer, M.; Labanca, N.; Lehto, M.; Maltoni, F.; Mangano, M.L.; Mele, S.; Nairz, A.M.; Nakada, T.; Nikitin, N.; Nisati, A.; Norrbin, E.; Palla, F.; Rizatdinova, F.; Robins, S.; Rousseau, D.; Sanchis-Lozano, M.A.; Shapiro, M.; Sherwood, P.; Smirnova, L.; Smizanska, M.; Starodumov, A.; Stepanov, N.; Vogt, R.

    2000-03-15

    In the context of the LHC experiments, the physics of bottom-flavoured hadrons enters in different contexts. It can be used for QCD tests, it affects the possibilities for studies of B decays, and it is an important source of background for several processes of interest. The physics of b production at hadron colliders has a rather long history, dating back to its first observation in the UA1 experiment. Subsequently, b production has been studied at the Tevatron. Besides the transverse momentum spectrum of a single b, it has also become possible, in recent times, to study correlations in the production characteristics of the b and the b-bar. At the LHC, new opportunities will be offered by the high statistics and the high energy reach. One expects to be able to study the transverse momentum spectrum at higher transverse momenta, and also to exploit the large statistics to perform more accurate studies of correlations.

  19. Developmental toxicity of clarified slurry oil, syntower bottoms, and distillate aromatic extract administered as a single oral dose to pregnant rats

    Energy Technology Data Exchange (ETDEWEB)

    Feuston, M.H.; Mackerer, C.R. [Stonybrook Labs., Princeton, NJ (United States)

    1996-09-01

    Clarified slurry oil (CSO), syntower bottoms (STB), and distillate aromatic extract (DAE) are refinery streams produced by processing crude oil. Available data indicate that some refinery streams are developmentally toxic by the dermal route of exposure. However, there is no conclusive evidence for their being teratogenic. The present studies were designed to further explore the suspected teratogenic potency of refinery streams while at the same time limiting embryolethality. In general, evidence of maternal toxicity (i.e., decreased body weight gain, decreased thymus weight) was observed at doses greater than or equal to 500 mg/kg. For each refinery stream tested, the incidence of resorption was greatest on GD 11. A common pattern of fetal malformations was observed for all of the refinery streams tested and included cleft palate, diaphragmatic hernia, and paw and tail defects. The incidence and type of malformation observed were influenced by the gestation day of exposure. The incidences of external and skeletal malformations were greatest on GD 11 and 12 for fetuses exposed to CSO; on GD 13 and 14, the incidence of malformation was comparable for CSO- and STB-exposed fetuses. The incidence of visceral anomalies was greatest on GD 11-13 for fetuses exposed to CSO and STB; on GD 14, the incidence was comparable for each of the refinery streams tested. In general, the ability to produce adverse effects on development was greatest for CSO and least for DAE. Effects produced by STB were comparable to or less severe than those observed for CSO. 24 refs., 11 tabs.

  20. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader – a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be run independently, in sequence, from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  1. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received considerable attention over the last two decades. Much research has tackled these areas individually. However, information extraction systems should be integrated with data integration

  2. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
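
    As a rough illustration of the concept finder and negation detector described above, the sketch below matches a tiny invented subset of BI-RADS-like terms and flips a finding to absent when a negation cue precedes it in the same sentence. It is far simpler than the paper's semantic grammar; all terms, cues, and report text are illustrative.

```python
# A minimal sketch of a concept finder with a negation check, not the paper's
# algorithm. The lexicon subset, negation cues, and sample report are invented.
import re

LEXICON = {
    "mass": "Mass",
    "calcification": "Calcification",
    "architectural distortion": "Architectural distortion",
}
NEGATION_CUES = ("no ", "without ", "negative for ")

def extract_concepts(report: str):
    findings = []
    for sentence in re.split(r"(?<=[.!?])\s+", report.lower()):
        for phrase, concept in LEXICON.items():
            if phrase in sentence:
                # Negation check: a cue appearing before the phrase in the
                # same sentence flips the finding to absent.
                negated = any(cue in sentence.split(phrase)[0] for cue in NEGATION_CUES)
                findings.append((concept, "absent" if negated else "present"))
    return findings

report = "There is no suspicious mass. Scattered calcifications are noted."
print(extract_concepts(report))
# [('Mass', 'absent'), ('Calcification', 'present')]
```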

  3. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, Propbank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes.

  4. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.]

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  5. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the uses of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both cases of leptons produced through a virtual and real photon are considered. Measurements of the angular correlation between the (Kπ) and (e⁺e⁻) decay planes in B → K*(→ Kπ)γ(*)(→ e⁺e⁻) decays can be used to determine the helicity amplitudes in the radiative B → K*γ decays. A large right-handed helicity amplitude in B-bar decays is a signal of new physics. The time-dependent CP asymmetry in the B⁰ decay angular correlation is shown to measure sin 2β and cos 2β with little hadronic uncertainty.

  6. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data which was previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping is employed, which provides a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments, and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
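
    A worked toy example of the least-squares idea: given sphere response functions collapsed to three energy groups, the group fluences can be estimated from the measured count rates. The response matrix and count rates below are invented for illustration; real response functions come from calibration data such as those of Mares and Schraube.

```python
# A minimal sketch of least-squares unfolding of Bonner-sphere count rates into a
# three-group (slow / moderate / fast) fluence estimate. All numbers are invented.
import numpy as np
from scipy.optimize import nnls

# Rows: spheres (bare, 3", 5", 8", 10"); columns: slow, moderate, fast response.
R = np.array([
    [0.80, 0.10, 0.02],
    [0.50, 0.40, 0.10],
    [0.20, 0.55, 0.30],
    [0.05, 0.35, 0.55],
    [0.01, 0.15, 0.60],
])
counts = np.array([12.0, 18.0, 20.0, 17.0, 11.0])   # counts per second per sphere

# Non-negative least squares keeps the fluences physical (>= 0).
fluence, residual = nnls(R, counts)
for group, value in zip(("slow", "moderate", "fast"), fluence):
    print(f"{group:8s} fluence ~ {value:.1f} (arbitrary units)")
print("residual norm:", residual)
```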

  7. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules...... for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological...... analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora....

  8. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and it brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there is limited work on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining generic terms and geology terms from geology dictionaries to train Chinese word segmentation rules for a Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to obtain a corpus consisting of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
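
    A minimal sketch of the segmentation-plus-co-occurrence-graph idea described above, using jieba's default segmenter in place of the paper's custom CRF rules and networkx for the bigram graph. The sample sentence and stop-word list are illustrative.

```python
# Segment Chinese text, drop stop-words, and build a bigram co-occurrence graph.
import jieba
import networkx as nx

text = "区域成矿地质背景控制金矿床的形成，金矿床受断裂构造控制。"  # illustrative sentence
stop_words = {"的", "，", "。", "受"}

# Segment into words and keep only content words.
tokens = [w for w in jieba.lcut(text) if w.strip() and w not in stop_words]

# Bigram graph: consecutive content words become weighted edges.
graph = nx.Graph()
for a, b in zip(tokens, tokens[1:]):
    if graph.has_edge(a, b):
        graph[a][b]["weight"] += 1
    else:
        graph.add_edge(a, b, weight=1)

# The most connected nodes give a quick overview of key terms in the document.
print(sorted(graph.degree, key=lambda kv: kv[1], reverse=True)[:5])
```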

  9. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of semivolatile organic compounds in bottom sediment by solvent extraction, gel permeation chromatographic fractionation, and capillary-column gas chromatography/mass spectrometry

    Science.gov (United States)

    Furlong, E.T.; Vaught, D.G.; Merten, L.M.; Foreman, W.T.; Gates, Paul M.

    1996-01-01

    A method for the determination of 79 semivolatile organic compounds (SOCs) and 4 surrogate compounds in soils and bottom sediment is described. The SOCs are extracted from bottom sediment by solvent extraction, followed by partial isolation using high-performance gel permeation chromatography (GPC). The SOCs then are qualitatively identified and quantitative concentrations determined by capillary-column gas chromatography/mass spectrometry (GC/MS). This method also is designed for an optional simultaneous isolation of polychlorinated biphenyls (PCBs) and organochlorine (OC) insecticides, including toxaphene. When OCs and PCBs are determined, an additional alumina-over-silica column chromatography step follows GPC cleanup, and quantitation is by dual capillary-column gas chromatography with electron-capture detection (GC/ECD). Bottom-sediment samples are centrifuged to remove excess water and extracted overnight with dichloromethane. The extract is concentrated, centrifuged, and then filtered through a 0.2-micrometer polytetrafluoroethylene syringe filter. Two aliquots of the sample extract then are quantitatively injected onto two polystyrene-divinylbenzene GPC columns connected in series. The SOCs are eluted with dichloromethane, a fraction containing the SOCs is collected, and some coextracted interferences, including elemental sulfur, are separated and discarded. The SOC-containing GPC fraction then is analyzed by GC/MS. When desired, a second aliquot from GPC is further processed for OCs and PCBs by combined alumina-over-silica column chromatography. The two fractions produced in this cleanup then are analyzed by GC/ECD. This report fully describes and is limited to the determination of SOCs by GC/MS.

  10. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
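
    The contrast between global and local bias correction discussed above can be illustrated with a simple mean/variance rescaling of retrievals to the model climatology, either with one global set of statistics or with per-location statistics. The sketch below uses synthetic data; operational systems typically apply seasonal CDF matching rather than this simple match.

```python
# A minimal sketch of global vs. local bias correction by matching the retrieval
# statistics to the model climatology. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_times = 5, 400
model = 0.25 + 0.05 * rng.standard_normal((n_sites, n_times))        # model climatology
retrieval = 0.32 + 0.08 * rng.standard_normal((n_sites, n_times))    # biased retrievals

def rescale(x, ref):
    """Match the mean and standard deviation of x to the reference series."""
    return (x - x.mean()) / x.std() * ref.std() + ref.mean()

# Global bias correction: one set of statistics over all sites and times.
global_corrected = rescale(retrieval, model)

# Local bias correction: statistics computed separately at each site.
local_corrected = np.vstack([rescale(retrieval[i], model[i]) for i in range(n_sites)])

print("global-corrected mean:", round(float(global_corrected.mean()), 3))
print("local-corrected mean: ", round(float(local_corrected.mean()), 3))
```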

  11. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  12. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that cover the science domain and data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  13. Testing the reliability of information extracted from ancient zircon

    Science.gov (United States)

    Kielman, Ross; Whitehouse, Martin; Nemchin, Alexander

    2015-04-01

    Studies combining zircon U-Pb chronology, trace element distributions, and O and Hf isotope systematics are a powerful way to gain understanding of the processes shaping Earth's evolution, especially in detrital populations where constraints from the original host are missing. Such studies of the Hadean detrital zircon population abundant in sedimentary rocks in Western Australia have involved analysis of an unusually large number of individual grains, but have also highlighted potential problems with the approach, only apparent when multiple analyses are obtained from individual grains. A common feature of the Hadean as well as many early Archaean zircon populations is their apparent inhomogeneity, which reduces confidence in conclusions based on studies combining the chemistry and isotopic characteristics of zircon. In order to test the reliability of information extracted from early Earth zircon, we report results from one of the first in-depth multi-method studies of zircon from a relatively simple early Archaean magmatic rock, used as an analogue of ancient detrital zircon. The approach involves making multiple SIMS analyses in individual grains in order to be comparable to the most advanced studies of detrital zircon populations. The investigated sample is a relatively undeformed, non-migmatitic ca. 3.8 Ga tonalite collected a few kilometres south of the Isua Greenstone Belt, southwest Greenland. Extracted zircon grains can be combined into three different groups based on the behavior of their U-Pb systems: (i) grains that show internally consistent and concordant ages and define an average age of 3805±15 Ma, taken to be the age of the rock; (ii) grains that are distributed close to the concordia line, but with significant variability between multiple analyses, suggesting ancient Pb loss; and (iii) grains that have multiple analyses distributed along a discordia pointing towards a zero intercept, indicating geologically recent Pb loss. This overall behavior has

  14. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
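
    The record above describes reading exposure parameters directly from structured DICOM metadata rather than from screen-captured dose pages. As a rough illustration of the same idea (not the authors' Matlab program), the sketch below reads a few dose-relevant header fields with the pydicom library; the keyword list, folder name and file pattern are assumptions made only for the example.

      # Hedged sketch: pull exposure-related DICOM header fields with pydicom.
      # The keyword list and folder layout are illustrative assumptions.
      from pathlib import Path
      import pydicom

      DOSE_KEYWORDS = ["KVP", "XRayTubeCurrent", "ExposureTime", "Exposure", "CTDIvol"]

      def extract_dose_fields(dicom_path):
          """Return a dict of exposure-related values found in one DICOM file."""
          ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)
          return {kw: getattr(ds, kw, None) for kw in DOSE_KEYWORDS}

      def scan_folder(folder):
          """Collect dose fields for every .dcm file in a folder."""
          return {p.name: extract_dose_fields(p) for p in Path(folder).glob("*.dcm")}

      if __name__ == "__main__":
          for name, fields in scan_folder("./ct_study").items():
              print(name, fields)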

  15. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  16. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: Contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G.; Quinn, Paul C.; Hu, Chao S.; Qian, Miao; Fu, Genyue; Lee, Kang

    2014-01-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese faces, Caucasian faces, and racially ambiguous morphed face stimuli. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. PMID:25497461

  18. Bottom-up and middle-out approaches to electronic patient information systems: a focus on healthcare pathways

    Directory of Open Access Journals (Sweden)

    Ken Eason

    2013-12-01

    Background: A study is reported that examines the use of electronic health record (EHR) systems in two UK local health communities. Objective: These systems were developed locally, and the aim of the study was to explore how well they were supporting the coordination of care along healthcare pathways that cross the organisational boundaries between the agencies delivering health care. Results: The paper presents the findings for two healthcare pathways: the Stroke Pathway and a pathway for the care of the frail elderly in their own homes. All the pathways examined involved multiple agencies, and many locally tailored EHR systems are in use to aid the coordination of care. However, the ability to share electronic patient information along the pathways was patchy. The development of systems that enabled effective sharing of information was characterised by sociotechnical system development, i.e. associating the technical development with process changes and organisational changes, with local development teams that drew on all the relevant agencies in the local health community, and on evolutionary development, as experience grew of the benefits that EHR systems could deliver. Conclusions: The study concludes that whilst there may be a role for a national IT strategy, for example to set standards for systems procurement that facilitate data interchange, most systems development work needs to be done at a ‘middle-out’ level in the local health community, where joint planning between healthcare agencies can occur, and at the local healthcare pathway level, where systems can be matched to specific needs for information sharing.

  19. Information from uranium and thorium isotopes recorded in lake bottom sediment - Lake Kawaguchi. Attempt to evaluate environmental changes

    International Nuclear Information System (INIS)

    Sakaguchi, A.; Yamamoto, M.; Shimizu, T.; Sasaki, K.; Koshimizu, S.

    2003-01-01

    Lake sediments, as well as ice cores and marine sediments, have been used to reveal past environmental changes caused by both natural and artificial events on local and global scales. Particles in a lake originate from soil and other suspended matter carried in by the inflowing water or by direct discharge (lithogenous particles), and they are also formed in the lake as a result of the growth, metabolism and death of plants and animals (autogenous particles). The settling particles contain U and Th isotopes from the lithogenous particles (soil), as well as U adsorbed onto the particles. Thorium has an exceedingly low solubility in water and is very strongly adsorbed onto particles. If we can distinguish these two different components, the U contained in the lithogenous particles themselves and the adsorbed fraction, the former might provide useful information on past environmental changes caused by natural events, while the latter might provide information on past environmental changes caused by artificial events. In this paper, we aimed to test the above hypothesis using data on U and Th isotopes from sediment cores (0 to ca. 40 cm depth, covering the past several hundred years) from 3 points in Lake Kawaguchi of Fuji-Goko in Japan. Using a model equation and the results obtained from the analysis, we distinguished U due to lithogenous and autogenous particles. These depth profiles were compared with changes in rainfall during the period 1933-2001. Although the changes with depth in the 238U/232Th ratios of the lithogenous particles and in rainfall do not fluctuate synchronously, some depth intervals coincided with each other. The results strongly suggest that variation in the U and Th isotopic ratios separated by the model might be helpful in tracing past environmental changes on a regional scale. To check the usefulness of this method, physical and chemical data such as grain size, grain density, water content and biogenic SiO2 in the sediment will be further compared with the

  20. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  1. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    As today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  2. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  3. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  4. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity. They bring great harm to people's lives and have attracted strong attention from the state and broad concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometric information. Therefore, it is feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric and landform features of the image, we establish extraction rules to extract landslide disaster information. The extraction results show that there are 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.
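
    The abstract above outlines an object-oriented workflow: segment the image at an optimal scale, then classify segments with spectral, texture, geometric and terrain rules. The sketch below illustrates that general pattern with scikit-image segmentation and simple hypothetical rules; it is not the authors' GF-1 rule set, and the thresholds and input layers (RGB, NDVI, slope) are assumptions.

      # Hedged sketch of segmentation followed by rule-based object classification.
      import numpy as np
      from skimage.segmentation import felzenszwalb
      from skimage.measure import regionprops

      def classify_landslide_objects(image_rgb, ndvi, slope, scale=200):
          """Flag segments whose mean spectra/terrain match simple (hypothetical) landslide rules."""
          segments = felzenszwalb(image_rgb, scale=scale, sigma=0.8, min_size=50)
          brightness = image_rgb.mean(axis=2)
          landslide_mask = np.zeros(segments.shape, dtype=bool)
          for region in regionprops(segments + 1):  # +1 so the first segment label is not treated as background
              rows, cols = region.coords[:, 0], region.coords[:, 1]
              bright = brightness[rows, cols].mean()
              veg = ndvi[rows, cols].mean()
              slp = slope[rows, cols].mean()
              # Illustrative rules: bright, sparsely vegetated objects on moderate-to-steep slopes
              if bright > 0.45 and veg < 0.2 and slp > 15:
                  landslide_mask[rows, cols] = True
          return landslide_mask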

  5. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Information quantity is addressed in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study. Raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed for the extraction of specific information and the formation of knowledge. The results of the entropy analysis point out the information that needs to be acquired by the organisation involved, presented as a specific knowledge type.
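
    Since the record above rests on the Shannon entropy formula, a minimal worked example may help; the nonconformity categories below are hypothetical stand-ins for the company data described.

      # Hedged sketch: Shannon entropy H = -sum(p_i * log2(p_i)) over an empirical distribution.
      import math
      from collections import Counter

      def shannon_entropy(observations):
          """Entropy in bits of the empirical category distribution."""
          counts = Counter(observations)
          total = sum(counts.values())
          return -sum((c / total) * math.log2(c / total) for c in counts.values())

      nonconformities = ["dimension", "surface", "dimension", "material", "dimension", "surface"]
      print(f"Entropy: {shannon_entropy(nonconformities):.3f} bits")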

  6. Extracting local information from crowds through betting markets

    Science.gov (United States)

    Weijs, Steven

    2015-04-01

    In this research, a set-up is considered in which users can bet against a forecasting agency to challenge their probabilistic forecasts. From an information theory standpoint, a reward structure is considered that either provides the forecasting agency with better information, paying the successful providers of information for their winning bets, or funds excellent forecasting agencies through users that think they know better. Especially for local forecasts, the approach may help to diagnose model biases and to identify local predictive information that can be incorporated in the models. The challenges and opportunities for implementing such a system in practice are also discussed.

  7. Spoken Language Understanding Systems for Extracting Semantic Information from Speech

    CERN Document Server

    Tur, Gokhan

    2011-01-01

    Spoken language understanding (SLU) is an emerging field in between speech and language processing, investigating human/ machine and human/ human communication by leveraging technologies from signal processing, pattern recognition, machine learning and artificial intelligence. SLU systems are designed to extract the meaning from speech utterances and its applications are vast, from voice search in mobile devices to meeting summarization, attracting interest from both commercial and academic sectors. Both human/machine and human/human communications can benefit from the application of SLU, usin

  8. Sifting Through Chaos: Extracting Information from Unstructured Legal Opinions.

    Science.gov (United States)

    Oliveira, Bruno Miguel; Guimarães, Rui Vasconcellos; Antunes, Luís; Rodrigues, Pedro Pereira

    2018-01-01

    Abiding by the law is, in some cases, a delicate balance between the rights of different players. Re-using health records is such a case. While the law grants reuse rights to public administration documents, in which health records produced in public health institutions are included, it also grants privacy to personal records. To safeguard correct usage of data, public hospitals in Portugal employ jurists who are responsible for granting or withholding access rights to health records. To help decision making, these jurists can consult the legal opinions issued by the national committee on public administration document usage. While these legal opinions are of undeniable value, due to their doctrinal contribution, they are only available in a format best suited for printing, forcing individual consultation of each document, with no option whatsoever for clustered search, filtering or indexing, which are standard operations nowadays in a document management system. When having to decide on tens of data requests a day, it becomes unfeasible to consult the hundreds of legal opinions already available. With the objective of creating a modern document management system, we devised an open, platform-agnostic system that retrieves and compiles the legal opinions, extracts their contents and produces metadata, allowing for fast searching and filtering of said legal opinions.

  9. Information extraction from FN plots of tungsten microemitters

    Energy Technology Data Exchange (ETDEWEB)

    Mussa, Khalil O. [Department of Physics, Mu' tah University, Al-Karak (Jordan); Mousa, Marwan S., E-mail: mmousa@mutah.edu.jo [Department of Physics, Mu' tah University, Al-Karak (Jordan); Fischer, Andreas, E-mail: andreas.fischer@physik.tu-chemnitz.de [Institut für Physik, Technische Universität Chemnitz, Chemnitz (Germany)

    2013-09-15

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied ranging from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite micro-emitters considered here consist of a tungsten core coated with different dielectric materials, such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed through a FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in

  10. Information extraction from FN plots of tungsten microemitters

    International Nuclear Information System (INIS)

    Mussa, Khalil O.; Mousa, Marwan S.; Fischer, Andreas

    2013-01-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied ranging from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite micro-emitters considered here consist of a tungsten core coated with different dielectric materials, such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed through a FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in

  11. Perceived Effects of Pornography on the Couple Relationship: Initial Findings of Open-Ended, Participant-Informed, "Bottom-Up" Research.

    Science.gov (United States)

    Kohut, Taylor; Fisher, William A; Campbell, Lorne

    2017-02-01

    The current study adopted a participant-informed, "bottom-up," qualitative approach to identifying perceived effects of pornography on the couple relationship. A large sample (N = 430) of men and women in heterosexual relationships in which pornography was used by at least one partner was recruited through online (e.g., Facebook, Twitter, etc.) and offline (e.g., newspapers, radio, etc.) sources. Participants responded to open-ended questions regarding perceived consequences of pornography use for each couple member and for their relationship in the context of an online survey. In the current sample of respondents, "no negative effects" was the most commonly reported impact of pornography use. Among remaining responses, positive perceived effects of pornography use on couple members and their relationship (e.g., improved sexual communication, more sexual experimentation, enhanced sexual comfort) were reported frequently; negative perceived effects of pornography (e.g., unrealistic expectations, decreased sexual interest in partner, increased insecurity) were also reported, albeit with considerably less frequency. The results of this work suggest new research directions that require more systematic attention.

  12. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    Neuwahl, Frederik; Mongelli, Ignazio; Delgado, Luis; Loeschel, Andreas

    2008-01-01

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)

  13. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud with good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply a perceptual metric, Just-Noticeable-Difference, to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.

  14. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud with good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply a perceptual metric, Just-Noticeable-Difference, to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction of objects.
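
    To make the importance-driven reduction idea concrete, the sketch below scores each point by a simple local surface-variation measure and keeps the highest-scoring fraction. This is a much simpler stand-in for the multi-scale radial-basis-function construction and Just-Noticeable-Difference metric described above; the neighbourhood size and keep ratio are assumptions.

      # Hedged sketch of importance-based point cloud reduction (not the paper's method).
      import numpy as np
      from scipy.spatial import cKDTree

      def surface_variation(points, k=16):
          """Smallest-eigenvalue share of the local covariance: higher near edges and fine detail."""
          tree = cKDTree(points)
          _, idx = tree.query(points, k=k)
          scores = np.empty(len(points))
          for i, neighbours in enumerate(idx):
              cov = np.cov(points[neighbours].T)
              eig = np.sort(np.linalg.eigvalsh(cov))
              scores[i] = eig[0] / max(eig.sum(), 1e-12)
          return scores

      def reduce_cloud(points, keep_ratio=0.3):
          """Keep the most geometrically 'important' fraction of points."""
          scores = surface_variation(points)
          keep = np.argsort(scores)[-int(len(points) * keep_ratio):]
          return points[keep]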

  15. Information extraction from FN plots of tungsten microemitters.

    Science.gov (United States)

    Mussa, Khalil O; Mousa, Marwan S; Fischer, Andreas

    2013-09-01

    Tungsten-based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied ranging from 25 to 500 nm. These tips were manufactured by electrochemically etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of a two molar NaOH solution. The composite micro-emitters considered here consist of a tungsten core coated with different dielectric materials, such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worth noting here that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured, including current-voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)-screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10⁻⁸ mbar when baked at up to ∼180 °C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphor screen (the anode). Mechanical characterization has been performed through a FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler-Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of the different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in particular

  16. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. Models for weak information extraction are realized on a GIS software platform. Application tests of weak information extraction are completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  17. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  18. Extracting information of fixational eye movements through pupil tracking

    Science.gov (United States)

    Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng

    2018-01-01

    Human eyes are never completely static, even when they are fixating a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades and drifts, can prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has been experimentally demonstrated recently. However, the characteristics of fixational eye movements and their roles in visual processing have not been explained clearly, because these signals cannot yet be completely extracted. In this paper, we developed a new eye movement detection device with a high-speed camera. This device includes a beam splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil tracking experiments were conducted. By localizing the pupil center and performing spectrum analysis, the envelope frequency spectra of micro-saccades, micro-tremors and drifts are clearly revealed. The experimental results show that the device is feasible and effective, and that it can be applied in further characteristic analysis.
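
    The measurement chain described above (locate the pupil centre frame by frame, then examine the frequency content of the resulting displacement signal) can be illustrated with a short sketch; the dark-pupil threshold and the 200 Hz frame rate below are assumptions for the example, not calibrated device parameters.

      # Hedged sketch: pupil-centre localisation via image moments plus an amplitude spectrum.
      import cv2
      import numpy as np

      def pupil_center(gray_frame, dark_threshold=40):
          """Centroid of the dark pupil blob in a grayscale eye image (None if not found)."""
          _, mask = cv2.threshold(gray_frame, dark_threshold, 255, cv2.THRESH_BINARY_INV)
          m = cv2.moments(mask)
          if m["m00"] == 0:
              return None
          return m["m10"] / m["m00"], m["m01"] / m["m00"]

      def displacement_spectrum(x_positions, frame_rate=200.0):
          """Amplitude spectrum of the centred horizontal pupil trace."""
          x = np.asarray(x_positions, dtype=float)
          x -= x.mean()
          freqs = np.fft.rfftfreq(len(x), d=1.0 / frame_rate)
          return freqs, np.abs(np.fft.rfft(x))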

  19. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  20. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high- throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  1. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing thematic information depends on this extraction. On the basis of WorldView-2 high-resolution data, the following processes were conducted in this study to determine the optimal segmentation parameters for object-oriented image segmentation and high-resolution image information extraction. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and the combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment through reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
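
    The scale-selection step named in the abstract can be illustrated with an area-weighted variance criterion evaluated over candidate scales; the segmenter, the candidate scales and the "largest rate of change" rule below are assumptions, not the authors' exact improved weighted mean-variance method.

      # Hedged sketch of choosing a segmentation scale from an area-weighted variance curve.
      import numpy as np
      from skimage.segmentation import felzenszwalb

      def weighted_segment_variance(gray, labels):
          """Area-weighted mean of per-segment variances."""
          total = 0.0
          for lab in np.unique(labels):
              pix = gray[labels == lab]
              total += pix.size * pix.var()
          return total / gray.size

      def pick_scale(gray, scales=(50, 100, 200, 400, 800)):
          """Return the scale after the largest relative jump in the variance curve."""
          variances = [weighted_segment_variance(gray, felzenszwalb(gray, scale=s)) for s in scales]
          roc = np.diff(variances) / np.array(variances[:-1])  # rate of change between consecutive scales
          return scales[int(np.argmax(roc)) + 1], variances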

  2. Overview of ImageCLEF 2017: information extraction from images

    OpenAIRE

    Ionescu, Bogdan; Müller, Henning; Villegas, Mauricio; Arenas, Helbert; Boato, Giulia; Dang Nguyen, Duc Tien; Dicente Cid, Yashin; Eickhoff, Carsten; Seco de Herrera, Alba G.; Gurrin, Cathal; Islam, Bayzidul; Kovalev, Vassili; Liauchuk, Vitali; Mothe, Josiane; Piras, Luca

    2017-01-01

    This paper presents an overview of the ImageCLEF 2017 evaluation campaign, an event that was organized as part of the CLEF (Conference and Labs of the Evaluation Forum) labs 2017. ImageCLEF is an ongoing initiative (started in 2003) that promotes the evaluation of technologies for annotation, indexing and retrieval for providing information access to collections of images in various usage scenarios and domains. In 2017, the 15th edition of ImageCLEF, three main tasks were proposed and one pil...

  3. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
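
    A statistical retrieval of the kind described (mapping brightness temperatures to soil moisture in a model climatology) can be sketched with a small regression network; the synthetic feature layout, target relation and network size below are assumptions and not the operational SMAP or Catchment model code.

      # Hedged sketch: train a small neural network as a statistical soil-moisture retrieval.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      # Hypothetical samples: [Tb_H (K), Tb_V (K), surface temperature (K)] -> soil moisture (m3/m3)
      X = rng.uniform([220.0, 240.0, 270.0], [290.0, 300.0, 310.0], size=(5000, 3))
      y = np.clip(0.6 - 0.004 * (X[:, 0] - 220.0) + 0.02 * rng.standard_normal(5000), 0.02, 0.5)

      retrieval = make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0))
      retrieval.fit(X, y)
      print(retrieval.predict([[250.0, 265.0, 290.0]]))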

  4. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and these approaches also lack an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, which addresses the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts structured emergency information and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and the shortest path algorithm and allows toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that event frame technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of schedule for defense and disaster reduction. The technology can decrease casualties and property damage, which is of great significance to the state and society.

  5. [Extraction of management information from the national quality assurance program].

    Science.gov (United States)

    Stausberg, Jürgen; Bartels, Claus; Bobrowski, Christoph

    2007-07-15

    Starting with clinically motivated projects, the national quality assurance program has established a legally obligatory framework. Annual feedback of results is an important means of quality control. The annual reports cover quality-related information with high granularity. A synopsis for corporate management is missing, however. Therefore, the results of the University Clinics in Greifswald, Germany, have been analyzed and aggregated to support hospital management. Strengths were identified by ranking the results within the state for each quality indicator, weaknesses by comparison with national reference values. The assessment was aggregated per clinical discipline and per category (indication, process, and outcome). A composite of quality indicators has been demanded multiple times, but a coherent concept is still missing. The method presented establishes a plausible summary of the strengths and weaknesses of a hospital from the point of view of the national quality assurance program. Nevertheless, further adaptation of the program is needed to better assist corporate management.

  6. Bottom head assembly

    International Nuclear Information System (INIS)

    Fife, A.B.

    1998-01-01

    A bottom head dome assembly is described which includes, in one embodiment, a bottom head dome and a liner configured to be positioned proximate the bottom head dome. The bottom head dome has a plurality of openings extending there through. The liner also has a plurality of openings extending there through, and each liner opening aligns with a respective bottom head dome opening. A seal is formed, such as by welding, between the liner and the bottom head dome to resist entry of water between the liner and the bottom head dome at the edge of the liner. In the one embodiment, a plurality of stub tubes are secured to the liner. Each stub tube has a bore extending there through, and each stub tube bore is coaxially aligned with a respective liner opening. A seat portion is formed by each liner opening for receiving a portion of the respective stub tube. The assembly also includes a plurality of support shims positioned between the bottom head dome and the liner for supporting the liner. In one embodiment, each support shim includes a support stub having a bore there through, and each support stub bore aligns with a respective bottom head dome opening. 2 figs

  7. Extracting of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    The article deals with the phonetic and lexical-morphological language means participating in the process of extracting implicit information in English-language advertising texts for men and women. The functioning of phonetic means of the English language is not the basis for the implication of information in advertising texts. Lexical and morphological means act as markers of relevant information and as activators of implicit information in advertising texts.

  8. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Many methods are utilized to extract and process query results from the deep Web, relying on the different structures of Web pages and various database design modes. However, some semantic meanings and relations are ignored. In this paper, we present an approach for post-processing deep Web query results based on a domain ontology which can utilize these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks that are relevant to a specific domain after reducing noisy nodes. The feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which not only removes the limitation of Web page structures when extracting data information, but also makes use of the semantic meanings of the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.
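
    The vector-space step mentioned above (scoring extracted blocks against a domain feature vector) can be illustrated with TF-IDF cosine similarity; the example texts and the simple ranking rule are hypothetical.

      # Hedged sketch: rank candidate result blocks by similarity to a domain profile.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      domain_profile = "book title author publisher isbn price edition"
      result_blocks = [
          "Deep Learning, Ian Goodfellow, MIT Press, ISBN 9780262035613, price $72",
          "Sign in | register | shopping cart | help | contact us",
      ]

      vectorizer = TfidfVectorizer()
      matrix = vectorizer.fit_transform([domain_profile] + result_blocks)
      scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
      for score, block in sorted(zip(scores, result_blocks), reverse=True):
          print(f"{score:.2f}  {block}")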

  9. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its versatility and almost all-weather, day-and-night working capability. In view of the fact that the inherent statistical distribution of speckle in SAR images has not been used to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution of SAR images is used to reflect the uniformity of the target in order to extract the collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but can also be applied to extract collapsed building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyze the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for data selection in collapsed building information extraction.

  10. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  11. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptology key. Estimates of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. In order to optimize parameters, the corresponding wavelet transforms were tuned to obtain the highest possible entropy and the lowest possible mutual information in the transform domain, which sets the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics, without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, No. TR32054 and No. III44006]
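
    The two quantities named above, local entropy and mutual information, can be estimated from pixel histograms as in the sketch below; the bin count and the synthetic "iris patches" are assumptions used only to make the example runnable.

      # Hedged sketch: histogram-based entropy and mutual information of image regions.
      import numpy as np

      def entropy_bits(values, bins=32):
          """Shannon entropy of a region's pixel-value histogram."""
          hist, _ = np.histogram(values, bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      def mutual_information_bits(a, b, bins=32):
          """Mutual information between two equally sized regions, from the joint histogram."""
          joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

      rng = np.random.default_rng(1)
      region_a = rng.integers(0, 256, size=(64, 64))                     # hypothetical iris patch
      region_b = (region_a + rng.integers(0, 16, size=(64, 64))) % 256   # correlated patch
      print(entropy_bits(region_a), mutual_information_bits(region_a, region_b))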

  12. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognitions of clinical events and temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
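
    The rule-based half of a system like the one described can be illustrated with a couple of regular expressions and a simple normalisation against an admission date; the patterns and the normalisation rule below are simplified assumptions, not MedTime's actual rules.

      # Hedged sketch: recognise and normalise a few temporal expressions in clinical text.
      import re
      from datetime import date, timedelta

      DATE_RE = re.compile(r"\b(\d{1,2})/(\d{1,2})/(\d{4})\b")
      RELATIVE_RE = re.compile(r"\b(\d+)\s+days?\s+(before|after)\s+admission\b", re.IGNORECASE)

      def normalize(text, admission):
          """Return (expression, ISO date) pairs found in a clinical sentence."""
          results = []
          for m in DATE_RE.finditer(text):
              month, day, year = map(int, m.groups())
              results.append((m.group(0), date(year, month, day).isoformat()))
          for m in RELATIVE_RE.finditer(text):
              n, direction = int(m.group(1)), m.group(2).lower()
              delta = timedelta(days=n if direction == "after" else -n)
              results.append((m.group(0), (admission + delta).isoformat()))
          return results

      print(normalize("Chest pain started 3 days before admission; seen on 02/14/2012.",
                      admission=date(2012, 2, 17)))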

  13. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in the application of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have some obvious disadvantages, such as the neglect of spectral information and the trade-off between extraction rate and extraction accuracy. The purpose of this research is to develop an effective method to detect building information from Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image and image enhancement is used to highlight the useful information. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to obtain the candidate building objects. Furthermore, in order to refine the building objects and remove false objects, post-processing (e.g., shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), overall accuracy (OA) and Kappa are used in the final evaluation. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%. At the same time, Kappa increases by 16.09%. In the experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.
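
    A morphological building index of the general kind named above rests on top-hat responses of a brightness image over a range of structuring-element sizes. The sketch below is a deliberately simplified stand-in (disk elements, arbitrary radii and threshold), not the paper's improved MBI.

      # Hedged sketch: a simplified morphological building index from multi-scale white top-hats.
      import numpy as np
      from skimage.morphology import disk, white_tophat

      def simple_mbi(image_bands, radii=(2, 4, 6, 8)):
          """image_bands: (H, W, B) reflectance array; returns a per-pixel building index."""
          brightness = image_bands.max(axis=2)
          index = np.zeros_like(brightness, dtype=float)
          for r in radii:
              index += white_tophat(brightness, disk(r))
          return index / len(radii)

      def building_candidates(image_bands, threshold=0.05):
          """Threshold the index to get a coarse building mask (threshold is illustrative)."""
          return simple_mbi(image_bands) > threshold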

  14. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm was applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land and other information) were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.

  15. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  16. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask layer by layer all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problem of same object, different spectra and that of same spectrum, different objects. With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  17. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single
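
    The best-performing approach reported above couples conditional random fields with a rich feature set. The sketch below shows the shape of such a sequence labeller using the sklearn-crfsuite package with a toy sentence; the features, labels and training data are hypothetical, not the PhenoCHF corpus or its exact feature set.

      # Hedged sketch: a tiny CRF sequence labeller for phenotype-style entity tagging.
      import sklearn_crfsuite

      def word_features(sent, i):
          """Small per-token feature dictionary (a stand-in for a richer clinical feature set)."""
          w = sent[i]
          return {
              "lower": w.lower(),
              "is_title": w.istitle(),
              "suffix3": w[-3:],
              "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
              "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
          }

      train_sents = [["Patient", "has", "congestive", "heart", "failure", "."]]
      train_labels = [["O", "O", "B-CAUSE", "I-CAUSE", "I-CAUSE", "O"]]

      X = [[word_features(s, i) for i in range(len(s))] for s in train_sents]
      crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
      crf.fit(X, train_labels)
      test = ["shortness", "of", "breath"]
      print(crf.predict([[word_features(test, i) for i in range(len(test))]]))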

  18. Fall Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Fall Bottom Trawl Survey was initiated in 1963 and covered an area from Hudson Canyon, NY to Nova Scotia, Canada. Throughout the years,...

  19. Summer Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sampling the coastal waters of the Gulf of Maine using the Northeast Fishery Science Center standardized bottom trawl has been problematic due to large areas of hard...

  20. Spring Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Spring Bottom Trawl Survey was initiated in 1968 and covered an area from Cape Hatteras, NC, to Nova Scotia, Canada, at depths >27m....

  1. Winter Bottom Trawl Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The standardized NEFSC Winter Bottom Trawl Survey was initiated in 1992 and covered offshore areas from the Mid-Atlantic to Georges Bank. Inshore strata were covered...

  2. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important step in producing an accurate digital terrain model (DTM). However, most spatial filtering methods use only geometric data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available from some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape in Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points that fall within the corresponding preset spectral thresholds are identified as belonging to that specific feature class. This terrain extraction process was implemented in MATLAB. Results demonstrate that a passive image with higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
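    The core thresholding step can be illustrated in a few lines. The study implemented it in MATLAB, so the Python below is only a stand-in, and the RGB rule and point values are invented.

```python
# Sketch of classifying coloured TLS points by preset RGB thresholds
# (Python stand-in for the MATLAB code described above; values are made up).
import numpy as np

# columns: x, y, z, r, g, b
points = np.array([
    [1.0, 2.0, 0.1, 110,  90,  60],   # bare soil (brownish)
    [1.2, 2.1, 0.4,  60, 120,  55],   # vegetation (green-dominant)
    [1.4, 2.2, 0.2, 105,  95,  70],
])

r, g, b = points[:, 3], points[:, 4], points[:, 5]

# Illustrative spectral rule: terrain points are not green-dominant.
is_vegetation = (g > r) & (g > b)
terrain = points[~is_vegetation]

print(f"{terrain.shape[0]} of {points.shape[0]} points kept as terrain")
```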

  3. Rita Bottoms: Polyartist Librarian

    OpenAIRE

    Bottoms, Rita; Reti, Irene; Regional History Project, UCSC Library

    2005-01-01

    Project Director Irene Reti conducted fourteen hours of interviews with Rita Bottoms, Head of Special Collections at the University Library, UC Santa Cruz, shortly before her retirement in March 2003. This oral history provides a vivid and intimate look at thirty-seven years behind the scenes in the library's Special Collections. For thirty-seven years Bottoms dedicated herself to collecting work by some of the most eminent writers and photographers of the twentieth century, includin...

  4. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means of information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in a foreign and/or native language) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in the native and foreign language. The research, aiming to detect the role of terminology in online resources, was conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a medical professional, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods on the English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall

  5. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    For an established telenanomanipulation system, this paper studies methods of extracting location information and strategies for probe operation. First, the machine learning algorithms of OpenCV were used to extract location information from SEM images, so that nanowires and the probe in SEM images can be automatically tracked and the region of interest (ROI) can be marked quickly. The locations of the nanowire and the probe can then be extracted from the ROI. To study the probe operation strategy, the van der Waals force between the probe and a nanowire was computed to obtain the relevant operating parameters. With these operating parameters, the nanowire can be pre-operated in a 3D virtual environment and an optimal path for the probe can be obtained. The actual probe then runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and the results show that the designed methods achieve the expected effect.
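    The sketch below shows one simple way to localize the probe in an SEM frame with OpenCV. It uses plain template matching as a stand-in for the learning-based tracking described in the paper, and the file names are placeholders.

```python
# Sketch of locating the probe tip in an SEM frame with OpenCV template
# matching, a simple stand-in for the learning-based tracker in the paper.
# File names are placeholders.
import cv2

frame = cv2.imread("sem_frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("probe_template.png", cv2.IMREAD_GRAYSCALE)

result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

h, w = template.shape
x, y = max_loc
roi = frame[y:y + h, x:x + w]          # region of interest around the probe
print(f"probe located at ({x}, {y}) with score {max_val:.2f}")
```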

  6. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic versions of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, it will make it possible to collect not only atomic and molecular data but also broader scientific information over a wide range of research fields. (author)

  7. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  8. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  9. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.

  10. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on a frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The paper briefly introduces the overall system architecture and its modules, then describes the core of the system in detail, and finally presents a system prototype.

  11. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have the... ..., organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual... and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  12. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, the creation of geographical information system (GIS) databases, and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, are labour intensive, need well-trained personnel, and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term which is tailored to preserving the convergence properties of the snake model; its use on unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are formed illegally within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance
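    For orientation, the sketch below runs a standard active contour (snake) from scikit-image around an approximate object centre. It is not the modified snake of the thesis (the off-the-shelf implementation keeps its usual energy terms), and the test image and parameters are illustrative.

```python
# Illustrative outline refinement with a standard active contour (scikit-image);
# the thesis uses a modified snake, so this is only a sketch of the general idea.
import numpy as np
from skimage import data, filters
from skimage.segmentation import active_contour

image = filters.gaussian(data.camera(), sigma=3)   # stand-in for a satellite chip

# Initial contour: a circle around an approximate object centre point.
cy, cx, radius = 100, 220, 60
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([cy + radius * np.sin(theta), cx + radius * np.cos(theta)])

snake = active_contour(image, init, alpha=0.015, beta=10.0, gamma=0.001)
print(snake.shape)   # refined (row, col) outline vertices
```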

  13. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Northern Tibet lies in the sub-frigid arid climate zone of the plateau. It is rarely visited by people and the geological working conditions are very poor, but stratum exposure is good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information there has typical significance and good application prospects. Using object-oriented classification of WorldView-2 high-resolution remote sensing data in northern Tibet, combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations, and topological relations of various geological units were exploited. By setting thresholds within a hierarchical classification, eight kinds of geological information were classified and extracted. Accuracy analysis against existing geological maps shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  14. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourced trajectory data are an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively so that there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors from the Voronoi cell areas and triangle edge lengths. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) through the DT. Third, the detection model is used to detect road boundaries from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multiple types of road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was shown to be of higher quality.
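    A minimal sketch of the geometric descriptors named above (Delaunay edge lengths and Voronoi cell areas) is given below. The point coordinates are random stand-ins for projected GPS tracking points, and the descriptor choices are simplified relative to the paper.

```python
# Sketch of Delaunay/Voronoi descriptors computed from tracking points.
# Point coordinates are made up; the paper's full detection model is not shown.
import numpy as np
from scipy.spatial import Delaunay, Voronoi

points = np.random.rand(200, 2) * 100.0   # stand-in for projected GPS points (m)

tri = Delaunay(points)
vor = Voronoi(points)

# Longest Delaunay edge attached to each point (long edges hint at boundaries).
edge_len = np.zeros(len(points))
for simplex in tri.simplices:
    for i in range(3):
        a, b = simplex[i], simplex[(i + 1) % 3]
        d = np.linalg.norm(points[a] - points[b])
        edge_len[a] = max(edge_len[a], d)
        edge_len[b] = max(edge_len[b], d)

def cell_area(region):
    """Shoelace area of a finite Voronoi cell; infinite cells return nan."""
    if -1 in region or len(region) < 3:
        return np.nan
    poly = vor.vertices[region]
    x, y = poly[:, 0], poly[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

areas = np.array([cell_area(vor.regions[vor.point_region[i]]) for i in range(len(points))])
print(np.nanmean(areas), edge_len.mean())
```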

  15. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Crowdsourced trajectory data are an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively so that there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors from the Voronoi cell areas and triangle edge lengths. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) through the DT. Third, the detection model is used to detect road boundaries from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multiple types of road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information was shown to be of higher quality.

  16. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodic dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool for the complete, structured download of information from relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.

  17. Bottom-linked innovation

    DEFF Research Database (Denmark)

    Kristensen, Catharina Juul

    2018-01-01

    Employee-driven innovation is gaining ground as a strategy for developing sustainable organisations in the public and private sector. This type of innovation is characterised by active employee participation, and the bottom-up perspective is often emphasised. This article explores an issue that has hitherto been paid little explicit attention, namely collaboration between middle managers and employees in innovation processes. In contrast to most studies, middle managers and employees are here both subjects of explicit investigation. The collaboration processes explored in this article are termed... 'bottom-linked innovation'. The empirical analysis is based on an in-depth qualitative study of bottom-linked innovation in a public frontline institution in Denmark. By combining research on employee-driven innovation and middle management, the article offers new insights into such collaborative...

  18. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat

  19. Bottom and top physics

    International Nuclear Information System (INIS)

    Foley, K.J.; Fridman, A.; Gilman, F.J.; Herten, G.; Hinchliffe, I.; Jawahery, A.; Sanda, A.; Schmidt, M.P.; Schubert, K.R.

    1987-09-01

    The production of bottom quarks at the SSC and the formalism and phenomenology of observing CP violation in B meson decays is discussed. The production of a heavy t quark which decays into a real W boson, and what we might learn from its decays is examined

  20. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data and extraction of relevant information from these data demands a cautious and time consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear... or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination... of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary...
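    As a rough illustration of the strategy (partial least squares regression followed by variable selection), the sketch below fits a PLS model to a random matrix standing in for matched spot volumes and ranks spots by the magnitude of their regression coefficients; it is not the authors' pipeline.

```python
# Minimal PLS regression sketch on spot-volume data (scikit-learn); the matrix
# is random and stands in for gel spot volumes (samples x spots).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 300))          # 20 gels, 300 matched spots
y = X[:, 5] * 2.0 - X[:, 42] + rng.normal(scale=0.1, size=20)   # response variable

pls = PLSRegression(n_components=2)
pls.fit(X, y)

# Spots with the largest absolute regression coefficients are candidates for
# the variable-selection step described in the abstract.
coef = np.abs(pls.coef_).ravel()
print(np.argsort(coef)[::-1][:5])       # indices of the 5 most influential spots
```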

  1. From remote sensing data about information extraction for 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements, aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. The work aims to design the workflow to be as generic as possible with regard to the data used and the fields of application. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency if large volumes of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, considering the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  2. Addressing Risk Assessment for Patient Safety in Hospitals through Information Extraction in Medical Reports

    Science.gov (United States)

    Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène

    Hospital-acquired infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare cost is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in computational intelligence techniques such as information extraction, risk pattern detection in documents, and decision support systems now make it possible to address this problem.

  3. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important that the National Council of Teachers of Mathematics has created guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of...

  4. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders to better assess a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal that was recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by simple time-domain filtering and frequency-domain Fourier analysis.
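    A minimal sketch of the frequency-domain step is shown below: the dominant peak of the signal spectrum within a plausible respiratory band is taken as the breathing rate. The PPG signal here is synthetic and the band limits are assumptions, not the paper's settings.

```python
# Sketch of estimating breathing rate from a PPG signal via the dominant
# spectral peak in an assumed respiratory band (0.1-0.5 Hz). Signal is synthetic.
import numpy as np

fs = 50.0                                   # sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs)              # 60 s of data
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 0.25 * t)  # cardiac + respiratory

sig = ppg - ppg.mean()
spectrum = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)

band = (freqs >= 0.1) & (freqs <= 0.5)      # plausible breathing frequencies
breath_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {breath_hz * 60:.1f} breaths/min")   # ~15
```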

  5. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to quantify and spatially characterize land use/cover changes (LUCC) in the 1990s. To keep the database reliable, it needs to be updated over time, but it is difficult to obtain remote sensing data for extracting land cover change information at large scales. Because optical remote sensing data are hard to acquire over the Chengdu Plain, the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu Plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for extracting land cover change information.

  6. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  7. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.

  8. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the "highest versus lowest intake" method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR for evaluating the citrus fruit intake and risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more-consistent ES and SES, and narrower confidence intervals than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
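    For reference, the sketch below shows the fixed-effect (inverse-variance) pooling used to obtain a summary effect size; the per-study effect sizes and standard errors are invented and do not come from the cited review.

```python
# Sketch of fixed-effect (inverse-variance) pooling of effect sizes.
# The log relative risks and standard errors below are invented examples.
import math

log_rr = [-0.22, -0.11, -0.35, 0.05]        # per-study log effect sizes
se     = [0.10, 0.15, 0.20, 0.12]           # their standard errors

w = [1.0 / s**2 for s in se]                # inverse-variance weights
pooled = sum(wi * es for wi, es in zip(w, log_rr)) / sum(w)
se_pooled = math.sqrt(1.0 / sum(w))

lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"SES (RR) = {math.exp(pooled):.2f}  95% CI [{math.exp(lo):.2f}, {math.exp(hi):.2f}]")
```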

  9. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    information. In particular, for feature extraction, we develop a new set of kernel descriptors−Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus...... improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting...... as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own...

  10. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information from overseas nuclear power plants, it is necessary to select the information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF) used in probabilistic reactor analysis was adopted; regarding reliability, the automatic plant trip probability (APTP) used in the probabilistic analysis of automatic reactor trips was adopted. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information from overseas nuclear power plants was developed. (author)

  11. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach for more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.

  12. Sediment Budgets and Sources Inform a Novel Valley Bottom Restoration Practice Impacted by Legacy Sediment: The Big Spring Run, PA, Restoration Experiment

    Science.gov (United States)

    Walter, R. C.; Merritts, D.; Rahnis, M. A.; Gellis, A.; Hartranft, J.; Mayer, P. M.; Langland, M.; Forshay, K.; Weitzman, J. N.; Schwarz, E.; Bai, Y.; Blair, A.; Carter, A.; Daniels, S. S.; Lewis, E.; Ohlson, E.; Peck, E. K.; Schulte, K.; Smith, D.; Stein, Z.; Verna, D.; Wilson, E.

    2017-12-01

    Big Spring Run (BSR), a small agricultural watershed in southeastern Pennsylvania, is located in the Piedmont Physiographic Province, which has the highest nutrient and sediment yields in the Chesapeake Bay watershed. To effectively reduce nutrient and sediment loading it is important to monitor the effect of management practices on pollutant reduction. Here we present results of an ongoing study, begun in 2008, to understand the impact of a new valley bottom restoration strategy for reducing surface water sediment and nutrient loads. We test the hypotheses that removing legacy sediments will reduce sediment and phosphorus loads, and that restoring eco-hydrological functions of a buried Holocene wetland (Walter & Merritts 2008) will improve surface and groundwater quality by creating accommodation space to trap sediment and process nutrients. Comparisons of pre- and post-restoration gage data show that restoration lowered the annual sediment load by at least 118 t yr-1, or >75%, from the 1000 m-long restoration reach, with the entire reduction accounted for by legacy sediment removal. Repeat RTK-GPS surveys of pre-restoration stream banks verified that >90 t yr-1 of suspended sediment was from bank erosion within the restoration reach. Mass balance calculations of 137Cs data indicate 85-100% of both the pre-restoration and post-restoration suspended sediment storm load was from stream bank sources. This is consistent with trace element data which show that 80-90 % of the pre-restoration outgoing suspended sediment load at BSR was from bank erosion. Meanwhile, an inventory of fallout 137Cs activity from two hill slope transects adjacent to BSR yields average modern upland erosion rates of 2.7 t ha-1 yr-1 and 5.1 t ha-1 yr-1, showing modest erosion on slopes and deposition at toe of slopes. We conclude that upland farm slopes contribute little soil to the suspended sediment supply within this study area, and removal of historic valley bottom sediment effectively

  13. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). The idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from time-frequency representations of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image is extracted using Laws' masks to characterize the emotional state. In order to evaluate the effectiveness of the proposed emotion recognition across different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate cross-corpus performance. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, can provide significant classification performance for ESS systems. The two-dimensional (2-D) TII feature can discriminate between different emotions in the visual representation beyond what is conveyed by pitch and formant tracks. In addition, de-noising in 2-D images can be completed more easily than de-noising in 1-D speech.
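    A minimal sketch of the Laws' mask step is given below: a few 5x5 masks are built from the classic 1-D Laws vectors and convolved with a spectrogram image to yield texture-energy features. The spectrogram here is random data, and the particular masks and energy statistic are illustrative rather than the paper's exact configuration.

```python
# Sketch of Laws' texture-energy computation on a spectrogram image; the
# spectrogram is random data standing in for a real time-frequency image.
import numpy as np
from scipy.signal import convolve2d

L5 = np.array([1, 4, 6, 4, 1])      # level
E5 = np.array([-1, -2, 0, 2, 1])    # edge
S5 = np.array([-1, 0, 2, 0, -1])    # spot

masks = {
    "L5E5": np.outer(L5, E5),
    "E5S5": np.outer(E5, S5),
    "S5S5": np.outer(S5, S5),
}

spectrogram = np.random.rand(128, 256)       # stand-in for an |STFT| image

features = {}
for name, mask in masks.items():
    filtered = convolve2d(spectrogram, mask, mode="same", boundary="symm")
    features[name] = np.mean(np.abs(filtered))   # texture energy per mask
print(features)
```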

  14. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction is proposed. This method takes full advantage of the self-adaptive filtering characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. The research combines the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of these four indexes are combined into a feature vector. Then, the intrinsic characteristic components of the vibration signal are accurately extracted by Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively removed. The large cumulative error of the traditional time-domain integral is overcome, and the large low-frequency error of the traditional frequency-domain integral is avoided. Compared with traditional integral methods, this method is better at removing noise while retaining useful feature information, and shows higher accuracy.

  15. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Background: Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods: We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results: The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions: This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.

  16. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To make more efficient use of GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are merged from segmented image objects with similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) derived from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province, is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87 and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
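    The volume estimate can be illustrated by simple DEM differencing over the mapped landslide extent, as sketched below with synthetic arrays; the cell size, mask rule, and elevations are invented and are not the study's data.

```python
# Sketch of a displaced-earth-volume estimate by differencing pre- and
# post-event DEMs over the mapped landslide area; the arrays are synthetic.
import numpy as np

cell = 30.0                                  # DEM cell size (m)
pre  = np.full((200, 200), 1500.0)           # pre-disaster elevations (m)
post = pre.copy()
post[80:120, 60:140] -= 4.0                  # depletion zone lowered by the slide

landslide_mask = np.abs(post - pre) > 0.5    # mapped landslide extent
dh = (pre - post)[landslide_mask]            # positive where material was removed

volume = np.sum(dh[dh > 0]) * cell * cell    # displaced volume (m^3)
area_ha = landslide_mask.sum() * cell * cell / 1e4
print(f"area = {area_ha:.1f} ha, displaced volume = {volume:.0f} m^3")
```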

  17. Ocean Bottom Seismic Scattering

    Science.gov (United States)

    1989-11-01

    ...EPR, the Clipperton and Orozco fracture zones, and along the coast of Mexico, were recorded for a two-month period using ocean bottom seismometers...

  18. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Indoor Spatial Data Infrastructure (indoor SDI) is an important SDI for geospatial analysis and location-based services. Building Information Models (BIM) have a high degree of detail in geometric and semantic information for buildings. This study proposes direct conversion schemes to extract indoor building information from BIM into OGC IndoorGML. The major steps of the research are (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is the major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents every space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions in IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data from the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicate that the 3D indoor model (i.e. the IndoorGML model) can be automatically derived from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to the indoor SDI.
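    As a sketch of the node-extraction step, the code below reads IfcSpace entities with the third-party ifcopenshell package and emits a bare node list; the file name is a placeholder and the output is not a schema-complete IndoorGML document.

```python
# Sketch only: list IfcSpace entities from an IFC file as simple node elements.
# "building.ifc" is a placeholder; the output is not full IndoorGML.
import xml.etree.ElementTree as ET
import ifcopenshell

model = ifcopenshell.open("building.ifc")

root = ET.Element("Nodes")
for space in model.by_type("IfcSpace"):
    node = ET.SubElement(root, "Node", id=space.GlobalId)
    ET.SubElement(node, "name").text = space.Name or ""

print(ET.tostring(root, encoding="unicode"))
```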

  19. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction and is implemented on several platforms. The exploitation of the geometric information has been complemented by the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, which are able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state of the art of these techniques and presents modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on 'Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  20. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned to 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant, although nonsignificant, increase in pulse values was observed on completion of both versions. Completion of an over-disclosed informed consent document is associated with changes in physiological parameters. The results suggest that overly detailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  1. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction and is implemented on several platforms. The exploitation of the geometric information has been complemented by the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, which are able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state of the art of these techniques and presents modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on 'Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  2. About increasing informativity of diagnostic system of asynchronous electric motor by extracting additional information from values of consumed current parameter

    Science.gov (United States)

    Zhukovskiy, Y.; Korolev, N.; Koteleva, N.

    2018-05-01

    This article is devoted to expanding the possibilities of assessing the technical state of asynchronous electric drives from their current consumption, and to increasing the information capacity of diagnostic methods under conditions of limited access to the equipment and incomplete information. Spectral analysis of the electric drive current can be supplemented by an analysis of the components of the Park's vector current. The evolution of the hodograph at the moment of appearance and during the development of defects was investigated using the example of current asymmetry in the phases of an induction motor. The result of the study is a set of new diagnostic parameters for the asynchronous electric drive. The study proved that the proposed diagnostic parameters allow the type and level of a defect to be determined, without the need to stop the equipment and take it out of service for repair. Modern digital control and monitoring systems can use the proposed parameters, based on the stator current of the electrical machine, to improve the accuracy and reliability of obtaining diagnostic patterns and predicting their changes in order to improve equipment maintenance systems. This approach can also be used in systems and objects where there are significant parasitic vibrations and unsteady loads. The extraction of useful information can be carried out in electric drive systems whose structure includes a power electronic converter.
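    The Park's vector construction itself is compact; the sketch below computes the (iD, iQ) components from synthetic three-phase currents with a small asymmetry injected into one phase and reports a simple distortion measure of the hodograph. The formulas are the standard Clarke/Park projection, not necessarily the exact diagnostic parameters proposed in the article.

```python
# Sketch of the Park's vector hodograph from three-phase stator currents; the
# currents are synthetic, with a small asymmetry injected into phase a.
import numpy as np

t = np.linspace(0, 0.2, 2000)
w = 2 * np.pi * 50.0
ia = 10.2 * np.sin(w * t)                    # slightly larger amplitude, i.e. asymmetry
ib = 10.0 * np.sin(w * t - 2 * np.pi / 3)
ic = 10.0 * np.sin(w * t + 2 * np.pi / 3)

# Park's vector components; a healthy machine traces a circle in the (iD, iQ) plane.
iD = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
iQ = (ib - ic) / np.sqrt(2)

radius = np.hypot(iD, iQ)
distortion = (radius.max() - radius.min()) / radius.mean()
print(f"hodograph distortion: {distortion:.3f}")   # grows with phase asymmetry
```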

  3. Multi-Paradigm and Multi-Lingual Information Extraction as Support for Medical Web Labelling Authorities

    Directory of Open Access Journals (Sweden)

    Martin Labsky

    2010-10-01

    Until recently, quality labelling of medical web content has been a predominantly manual activity. However, advances in automated text processing have opened the way to computerised support of this activity. The core enabling technology is information extraction (IE). However, the heterogeneity of websites offering medical content imposes particular requirements on the IE techniques to be applied. In the paper we discuss these requirements and describe a multi-paradigm approach to IE that addresses them. Experiments on multi-lingual data are reported. The research has been carried out within the EU MedIEQ project.

  4. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems are encountered for full-text processing when biomedical text analytics simply reuse existing NLP pipelines that were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6 percentage points of F-score when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this shortcut.

  5. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  6. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    Science.gov (United States)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many US states, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has, for its 100 counties, developed a database of numerous HEC-RAS models which are available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modelled flood extents for different extreme event return periods. In this work, data from over 4700 HEC-RAS models were integrated and upscaled to utilize detailed cross-section information and 100-year modelled flood extent information, enabling river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as entrenchment ratio and incision ratio from these models. The entrenchment ratio quantifies the vertical containment of a river, and thereby its vulnerability to flooding, while the incision ratio quantifies depth per unit width. A map of entrenchment ratio for the whole state was derived by linking these model results to a geodatabase. A ranking of highly entrenched counties, enabling prioritization for flood allowance and mitigation, was obtained. The results were shared through HydroShare, and web maps were developed for their visualization using the Google Maps Engine API.
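    As an illustration of the geomorphic properties mentioned above, the sketch below computes a Rosgen-style entrenchment ratio (flood-prone width at twice the maximum bankfull depth divided by bankfull width) for a single surveyed cross-section; the station/elevation arrays, the bankfull elevation and the toy geometry are assumptions, not data from the North Carolina FRIS models.

```python
import numpy as np

def entrenchment_ratio(stations, elevations, bankfull_elev):
    """Approximate entrenchment ratio for one cross-section: flood-prone
    width (at twice the maximum bankfull depth) divided by bankfull width.
    Inputs are assumed to come from an exported cross-section geometry."""
    stations = np.asarray(stations, dtype=float)
    elevations = np.asarray(elevations, dtype=float)

    def wetted_width(level):
        # Width of the cross-section lying below a given water level
        below = elevations < level
        if not below.any():
            return 0.0
        return stations[below].max() - stations[below].min()

    thalweg = elevations.min()
    max_depth = bankfull_elev - thalweg
    flood_prone_elev = thalweg + 2.0 * max_depth
    bankfull_width = wetted_width(bankfull_elev)
    flood_prone_width = wetted_width(flood_prone_elev)
    return flood_prone_width / bankfull_width if bankfull_width > 0 else np.nan

# Toy trapezoidal cross-section (stations and elevations in metres)
stations = [0, 5, 10, 15, 20, 25, 30, 35, 40]
elevs    = [12, 11, 10,  8,  7,  8, 10, 11, 12]
print(entrenchment_ratio(stations, elevs, bankfull_elev=10.0))
```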

  7. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, which can then be used as an ideal starting model for subsequent inversion. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages; the difference is that its ultralow-frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency-domain inversion is also very convenient. However, because broad frequency bands are often used in pure time-domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to traditional time-domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and then move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result with a linear initial model and records lacking low-frequency information.
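    A minimal sketch of the time-attenuation idea described above: damping a recorded residual by exp(-σt) and taking its zero-frequency component yields the Laplace-domain quantity that carries the ultralow-frequency information. The synthetic trace and the damping constants are assumptions for illustration only.

```python
import numpy as np

def laplace_component(residual, dt, sigma):
    """Zero-frequency Fourier component of the time-damped residual,
    i.e. the Laplace transform of the trace evaluated at s = sigma.
    Larger sigma emphasizes early arrivals."""
    t = np.arange(residual.shape[-1]) * dt
    return np.sum(residual * np.exp(-sigma * t), axis=-1) * dt

# A synthetic residual trace without usable low-frequency content
dt = 0.002                                   # 2 ms sampling
t = np.arange(2000) * dt
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 1.5) / 0.1) ** 2)

for sigma in (2.0, 5.0, 10.0):               # assumed damping constants (1/s)
    value = laplace_component(trace, dt, sigma)
    print(f"sigma = {sigma:4.1f}  Laplace component = {value:.3e}")
```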

  8. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

    This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images, as an attempt to increase noise robustness in mobile environments. The proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show the effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.

  9. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four packages (Urllib, Selenium, BeautifulSoup, Scrapy) in Python can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
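    In the spirit of the article, the sketch below fetches one page with the standard-library urllib and extracts table rows with BeautifulSoup into a small CSV dataset; the URL, table id and column layout are purely hypothetical, not the article's actual warrant-data source.

```python
import csv
from urllib.request import urlopen

from bs4 import BeautifulSoup

url = "https://example.org/warrants.html"   # hypothetical page
with urlopen(url) as response:
    soup = BeautifulSoup(response.read(), "html.parser")

rows = []
table = soup.find("table", id="warrants")   # assumed table id
if table is not None:
    for tr in table.find_all("tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(cells)

# Assemble the scraped rows into a custom dataset
with open("warrants.csv", "w", newline="") as fh:
    csv.writer(fh).writerows(rows)
print(f"saved {len(rows)} rows")
```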

  10. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the web sites of national and local governments, has enabled discussion of macroscopic financial trends. However, it is still difficult to grasp, nationwide, how each location has been changed by public works. The purpose of this research is to efficiently collect the road update information that various road managers provide, in order to realize efficient updating of various maps such as car navigation maps. In particular, we develop a system that automatically extracts the relevant public works from the public work order outlooks released by each local government and registers summaries, including position information, in a database, by combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan and confirm the feasibility of our method.

  11. Ocean bottom seismometer technology

    Science.gov (United States)

    Prothero, William A., Jr.

    Seismometers have been placed on the ocean bottom for about 45 years, beginning with the work of Ewing and Vine [1938], and their current use to measure signals from earthquakes and explosions constitutes an important research method for seismological studies. Approximately 20 research groups are active in the United Kingdom, France, West Germany, Japan, Canada, and the United States. A review of ocean bottom seismometer (OBS) instrument characteristics and OBS scientific studies may be found in Whitmarsh and Lilwall [1984]. OBS instrumentation is also important for land seismology. The recording systems that have been developed have generally been more sophisticated than those available for land use, and several modern land seismic recording systems are based on OBS recording system designs. The instrumentation developed for OBS work was the topic of a meeting held at the University of California, Santa Barbara, in July 1982. This article will discuss the state of the art of OBS technology, some of the problems remaining to be solved, and some of the solutions proposed and implemented by OBS scientists and engineers. It is not intended as a comprehensive review of existing instrumentation.

  12. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and the subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical-flow matching techniques from computer vision; an a priori estimate of sensor attitude is then computed from the GPS sensor positions contained in the video metadata using a photogrammetric, search-based structure-from-motion algorithm; and a weighted least squares adjustment of all a priori metadata across the frames is then performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and the corresponding accuracy.

  13. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  14. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information on urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of modest demands on professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved when the digital elevation model (DEM) and sensor orientation model were of sufficiently high precision and the off-nadir view angle was favourable.

  15. Understanding latent structures of clinical information logistics: A bottom-up approach for model building and validating the workflow composite score.

    Science.gov (United States)

    Esdar, Moritz; Hübner, Ursula; Liebe, Jan-David; Hüsers, Jens; Thye, Johannes

    2017-01-01

    Clinical information logistics is a construct that aims to describe and explain various phenomena of information provision that drive clinical processes. It can be measured by the workflow composite score, an aggregated indicator of the degree of IT support in clinical processes. This study primarily aimed to investigate the as yet unknown empirical patterns constituting this construct. The second goal was to derive a data-driven weighting scheme for the constituents of the workflow composite score and to contrast this scheme with a literature-based, top-down procedure. This approach should finally test the validity and robustness of the workflow composite score. Based on secondary data from 183 German hospitals, a tiered factor analytic approach (confirmatory and subsequent exploratory factor analysis) was pursued. A weighting scheme based on the factor loadings obtained in the analyses was put into practice. We were able to identify five statistically significant factors of clinical information logistics that accounted for 63% of the overall variance. These factors were "flow of data and information", "mobility", "clinical decision support and patient safety", "electronic patient record" and "integration and distribution". The system of weights derived from the factor loadings resulted in values for the workflow composite score that differed only slightly from the score values previously published on the basis of a top-down approach. Our findings give insight into the internal composition of clinical information logistics, both in terms of factors and weights. They also allowed us to propose a coherent model of clinical information logistics from a technical perspective that joins empirical findings with theoretical knowledge. Despite the new scheme of weights applied to the calculation of the workflow composite score, the score behaved robustly, which is yet another indication of its validity and therefore its usefulness.

  16. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the
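    The JET analysis chain itself is not reproduced here, but the optical-flow step it relies on can be sketched with OpenCV's Farnebäck dense flow on two consecutive grey-scale frames; the frame file names and the 99th-percentile threshold for flagging fast-moving structures are assumptions.

```python
import cv2
import numpy as np

# Two consecutive frames of a camera video (hypothetical file names)
prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# Dense optical flow between the two frames (Farneback algorithm)
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
# Fast-moving structures (e.g. filaments or pellets) show up as regions
# of large flow magnitude
moving = magnitude > np.percentile(magnitude, 99)
print(f"pixels flagged as fast-moving: {moving.sum()}")
```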

  17. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  18. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency and specifies what kinds of difficulties arise during evaluation in respect to cross-linking international projects and finding gaps in reporting. In addition, the paper tries to elaborate how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  19. Top-Down and Bottom-Up Visual Information Processing of Non-Social Stimuli in High-Functioning Autism Spectrum Disorder

    Science.gov (United States)

    Maekawa, Toshihiko; Tobimatsu, Shozo; Inada, Naoko; Oribe, Naoya; Onitsuka, Toshiaki; Kanba, Shigenobu; Kamio, Yoko

    2011-01-01

    Individuals with high-functioning autism spectrum disorder (HF-ASD) often show superior performance in simple visual tasks, despite difficulties in the perception of socially important information such as facial expression. The neural basis of visual perception abnormalities associated with HF-ASD is currently unclear. We sought to elucidate the…

  20. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  1. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    Directory of Open Access Journals (Sweden)

    Li Yao

    2016-01-01

    Both static features and motion features have shown promising performance in the human activity recognition task. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting relational information between static features and motion features for human activity recognition. The videos are represented by a classical Bag-of-Words (BoW) model, which has proven useful in many works. To get a compact and discriminative codebook of small dimension, we employ a divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of the different feature sets. We then use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm’s projective function. We test our work on several datasets and obtain very promising results.

  2. Bottom and top physics

    International Nuclear Information System (INIS)

    Foley, K.J.; Gilman, F.J.; Herten, G.; Hinchliffe, I.; Jawahery, A.; Sanda, A.; Schmidt, M.P.; Schubert, K.R.; Fridman, A.

    1988-01-01

    The production of heavy quark flavors occurs primarily via the strong interactions and offers another arena in which to test QCD and to probe gluon distributions at very small values of x. Such quarks can also be produced as decay products of possible new, yet undiscovered particles, e.g., Higgs bosons, and are therefore a necessary key to reconstructing such particles. The decay products of heavy quarks, especially from their semileptonic decays, can themselves form a background to other new physics processes. The production of bottom quarks at the SSC and the formalism and phenomenology of observing CP violation in B meson decays are discussed. The production of a heavy t quark, which decays into a real W boson, is examined, along with what might be learned from its decays.

  3. Intelligent Evaluation Method of Tank Bottom Corrosion Status Based on Improved BP Artificial Neural Network

    Science.gov (United States)

    Qiu, Feng; Dai, Guang; Zhang, Ying

    Based on the acoustic emission information and appearance inspection information from online testing of tank bottoms, the external factors associated with tank bottom corrosion status are identified. Applying an artificial neural network intelligent evaluation method, three evaluation models of tank bottom corrosion status are established, based on appearance inspection information, acoustic emission information, and combined online testing information, respectively. Compared with the results of acoustic emission online testing on an evaluation test sample, the accuracy of the evaluation model based on online testing information is 94%. The evaluation model can assess tank bottom corrosion accurately and realizes intelligent evaluation of acoustic emission online testing of tank bottoms.
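    A hedged sketch of the kind of model the abstract describes, with scikit-learn's MLPClassifier standing in for the paper's improved BP network; the feature definitions (acoustic emission hit rate and amplitude, appearance damage score) and the synthetic training data are invented for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Columns: AE hit rate, AE mean amplitude, coating damage score (assumed features)
X = rng.normal(size=(200, 3))
# Synthetic corrosion grade 0-2, correlated with the first two features
y = (X[:, 0] + X[:, 1] > 0).astype(int) + (X[:, 0] + X[:, 1] > 1.5).astype(int)

# Small back-propagation network mapping online-testing features to a grade
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```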

  4. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797

  5. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, in complement to cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive as to the mechanism of action of compounds, and are thus capable of informing the appropriate selection of further time-consuming and more expensive biological evaluations required to elucidate a mechanism

  6. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who did what, when, where, why, and how was extracted from Indonesian news articles by combining a Convolutional Neural Network and a Bidirectional Long Short-Term Memory network. The Convolutional Neural Network can learn semantically meaningful representations of sentences. The Bidirectional LSTM can analyze the relations among words in the sequence. We also use word2vec word embeddings for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms other shallow methods, namely IBk, C4.5, and Naïve Bayes, with F-measures of 0.655, 0.645, and 0.595, respectively.
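    An untrained sketch of a CNN plus bidirectional-LSTM tagger of the kind described above, written with tf.keras; the vocabulary size, embedding dimension, sequence length and 5W1H tag-set size are placeholder assumptions, and the paper's exact architecture and word2vec embeddings are not reproduced.

```python
import tensorflow as tf

vocab_size, embed_dim, n_tags, max_len = 20000, 100, 13, 120  # assumed sizes

inputs = tf.keras.Input(shape=(max_len,), dtype="int32")
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(inputs)
x = tf.keras.layers.Conv1D(64, kernel_size=3, padding="same",
                           activation="relu")(x)              # local n-gram features
x = tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, return_sequences=True))(x)   # sequence context
outputs = tf.keras.layers.TimeDistributed(
        tf.keras.layers.Dense(n_tags, activation="softmax"))(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```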

  7. Metaproteomics: extracting and mining proteome information to characterize metabolic activities in microbial communities.

    Science.gov (United States)

    Abraham, Paul E; Giannone, Richard J; Xiong, Weili; Hettich, Robert L

    2014-06-17

    Contemporary microbial ecology studies usually employ one or more "omics" approaches to investigate the structure and function of microbial communities. Among these, metaproteomics aims to characterize the metabolic activities of the microbial membership, providing a direct link between the genetic potential and functional metabolism. The successful deployment of metaproteomics research depends on the integration of high-quality experimental and bioinformatic techniques for uncovering the metabolic activities of a microbial community in a way that is complementary to other "meta-omic" approaches. The essential, quality-defining informatics steps in metaproteomics investigations are: (1) construction of the metagenome, (2) functional annotation of predicted protein-coding genes, (3) protein database searching, (4) protein inference, and (5) extraction of metabolic information. In this article, we provide an overview of current bioinformatic approaches and software implementations in metaproteome studies in order to highlight the key considerations needed for successful implementation of this powerful community-biology tool. Copyright © 2014 John Wiley & Sons, Inc.

  8. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  9. An innovative method for extracting isotopic information from low-resolution gamma spectra

    International Nuclear Information System (INIS)

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-01-01

    A method is described for the extraction of isotopic information from attenuated gamma ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low burnup Pu, 137 Cs, and 133 Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied
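    A simplified illustration of the one-variable fit mentioned above: the mixing fraction of two reference spectra is found by scalar minimization. The real GC-MBS model also folds the unknown absorber attenuation into the basis spectra, which is omitted here; the toy peaks and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
channels = np.arange(512)
# Two toy reference (basis) spectra, each a single Gaussian peak
ref_a = np.exp(-0.5 * ((channels - 186) / 8.0) ** 2)
ref_b = np.exp(-0.5 * ((channels - 331) / 10.0) ** 2)

true_f = 0.7
measured = true_f * ref_a + (1 - true_f) * ref_b + rng.normal(0, 0.01, channels.size)

def misfit(f):
    # Sum-of-squares misfit between the measured spectrum and a mixture model
    model = f * ref_a + (1 - f) * ref_b
    return np.sum((measured - model) ** 2)

result = minimize_scalar(misfit, bounds=(0.0, 1.0), method="bounded")
print(f"estimated fraction of component A: {result.x:.3f}")
```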

  10. EnvMine: A text-mining system for the automatic extraction of contextual information

    Directory of Open Access Journals (Sweden)

    de Lorenzo Victor

    2010-06-01

    Background: For ecological studies, it is crucial to count on adequate descriptions of the environments and samples being studied. Such a description must be made in terms of their physicochemical characteristics, allowing a direct comparison between different environments that would otherwise be difficult. The characterization must also include the precise geographical location, to make possible the study of geographical distributions and biogeographical patterns. Currently, there is no schema for annotating these environmental features, and these data have to be extracted from textual sources (published articles). So far, this had to be performed by manual inspection of the corresponding documents. To facilitate this task, we have developed EnvMine, a set of text-mining tools devoted to retrieving contextual information (physicochemical variables and geographical locations) from textual sources of any kind. Results: EnvMine is capable of retrieving the physicochemical variables cited in the text by means of the accurate identification of their associated units of measurement. In this task, the system achieves a recall (percentage of items retrieved) of 92% with less than 1% error. A Bayesian classifier was also tested for distinguishing parts of the text describing environmental characteristics from others dealing with, for instance, experimental settings. Regarding the identification of geographical locations, the system takes advantage of existing databases such as GeoNames to achieve 86% recall with 92% precision. The identification of a location also includes the determination of its exact coordinates (latitude and longitude), thus allowing the calculation of distances between the individual locations. Conclusion: EnvMine is a very efficient method for extracting contextual information from different text sources, like published articles or web pages. This tool can help in determining the precise location and physicochemical
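    A toy regular-expression pass in the spirit of EnvMine's variable/unit identification: numbers followed by a unit of measurement are reported as candidate physicochemical variables. The unit list and the pattern are illustrative assumptions, far simpler than the published system.

```python
import re

# A small, assumed unit vocabulary; longer units are listed before shorter ones
UNIT = r"(?:°C|mg/L|g/L|µM|mM|km|m|%)"
PATTERN = re.compile(rf"(\d+(?:\.\d+)?)\s*({UNIT})")

text = ("Samples were taken at 35 m depth, water temperature was 12.5 °C "
        "and salinity reached 3.4 %.")

for value, unit in PATTERN.findall(text):
    print(f"candidate variable: {value} {unit}")
```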

  11. Cylinder-type bottom reflector

    International Nuclear Information System (INIS)

    Elter, C.; Fritz, R.; Kissel, K.F.; Schoening, J.

    1982-01-01

    Proposal of a bottom reflector for gas-cooled nuclear reactor plants with a pebble bed of spherical fuel elements, where the horizontal forces acting from the core and the bottom reflector upon the side reflector are equally distributed. This is attained by the upper edge of the bottom reflector being placed level and by varying the angle of inclination of the recesses. (orig.)

  12. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking Bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are expounded. QuickBird high spatial resolution satellite data are used to extract prospecting information for the uranium deposit in the Bashibulake area in the north of the Tarim Basin. By using pertinent image processing methods, information on the ore-bearing bed, ore-controlling structures and mineralized alteration has been extracted. The results show a high consistency with the field survey. The aim of this study is to explore the practicability of high spatial resolution satellite data for mineral prospecting, and to broaden prospecting approaches in similar areas. (authors)

  13. Bottom sample taker

    Energy Technology Data Exchange (ETDEWEB)

    Garbarenko, O V; Slonimskiy, L D

    1982-01-01

    In order to improve the quality of the samples taken during offshore exploration from benthic sediments, the proposed design of the sample taker has a device which makes it possible to regulate the depth of submersion of the core lifter. For this purpose the upper part of the core lifter has an inner delimiting ring, and within the core lifter there is a piston suspended on a cable. The position of the piston in relation to the core lifter is previously assigned depending on the compactness of the benthic sediments and is fixed by tension of the cable which is held by a clamp in the cover of the core taker housing. When lowered to the bottom, the core taker is released, and under the influence of hydrostatic pressure of sea water, it enters the sediments. The magnitude of penetration is limited by the distance between the piston and the stopping ring. The piston also guarantees better preservation of the sample when the instrument is lifted to the surface.

  14. Rewetting during bottom flooding

    International Nuclear Information System (INIS)

    Pearson, K.G.

    1984-11-01

    A qualitative description of the rewetting process during bottom reflooding of a PWR is presented. Rewetting is seen as the end product of a path taken over a heat transfer surface which defines how the surface heat flux varies with surface temperature and with distance from the rewetting front. The main components are liquid contact, vapour convection and thermal radiation. In this paper the general topography of the heat transfer surface is deduced from consideration of the ways in which the conditions of the vapour and liquid phases in the flow are expected to vary with distance from the rewetting front. The deduced surface has a heat transfer ridge which decreases in height, and whose steep face moves to lower temperatures, with increasing distance from the rewetting front, and a valley which becomes negative with increasing distance. There is a different surface for each position along a subchannel, strongly influenced by the proximity of spacer grids, and by whether these grids are wet or dry. The form of this family of heat transfer surfaces is used to explain the phenomena of reflooding of clusters of heated rods. (U.K.)

  15. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as of the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.

  16. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    Science.gov (United States)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for admiration, scientific research and universal education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional remote sensing interpretation methods, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing images to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images and a 5 m interval DEM to establish a digital earth model. Based on the three-dimensional shape, colour tone, shadow, texture and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, Daigu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that features are highly recognizable with this method, making the interpretation more accurate and comprehensive.

  17. Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode

    Directory of Open Access Journals (Sweden)

    Renata Maria Abrantes Baracho

    2016-04-01

    This research addresses the use of information management processes to extract student dropout indicators in distance mode courses. Distance education in Brazil aims to facilitate access to information. The MEC (Ministry of Education) announced, in the second semester of 2013, that the main obstacles faced by institutions offering courses in this mode were students dropping out and the resistance of both educators and students to this mode. The research used a mixed methodology, qualitative and quantitative, to obtain student dropout indicators. The factors found and validated in this research were: lack of interest from students, insufficient training of students in the use of the virtual learning environment, structural problems in the schools chosen to offer the course, students without e-mail, incoherent answers to course activities, and lack of knowledge on the part of the student when using the computer tool. The scenario considered was a course offered in distance mode called Aluno Integrado (Integrated Student).

  18. Measuring nuclear reaction cross sections to extract information on neutrinoless double beta decay

    Science.gov (United States)

    Cavallaro, M.; Cappuzzello, F.; Agodi, C.; Acosta, L.; Auerbach, N.; Bellone, J.; Bijker, R.; Bonanno, D.; Bongiovanni, D.; Borello-Lewin, T.; Boztosun, I.; Branchina, V.; Bussa, M. P.; Calabrese, S.; Calabretta, L.; Calanna, A.; Calvo, D.; Carbone, D.; Chávez Lomelí, E. R.; Coban, A.; Colonna, M.; D'Agostino, G.; De Geronimo, G.; Delaunay, F.; Deshmukh, N.; de Faria, P. N.; Ferraresi, C.; Ferreira, J. L.; Finocchiaro, P.; Fisichella, M.; Foti, A.; Gallo, G.; Garcia, U.; Giraudo, G.; Greco, V.; Hacisalihoglu, A.; Kotila, J.; Iazzi, F.; Introzzi, R.; Lanzalone, G.; Lavagno, A.; La Via, F.; Lay, J. A.; Lenske, H.; Linares, R.; Litrico, G.; Longhitano, F.; Lo Presti, D.; Lubian, J.; Medina, N.; Mendes, D. R.; Muoio, A.; Oliveira, J. R. B.; Pakou, A.; Pandola, L.; Petrascu, H.; Pinna, F.; Reito, S.; Rifuggiato, D.; Rodrigues, M. R. D.; Russo, A. D.; Russo, G.; Santagati, G.; Santopinto, E.; Sgouros, O.; Solakci, S. O.; Souliotis, G.; Soukeras, V.; Spatafora, A.; Torresi, D.; Tudisco, S.; Vsevolodovna, R. I. M.; Wheadon, R. J.; Yildirin, A.; Zagatto, V. A. B.

    2018-02-01

    Neutrinoless double beta decay (0vββ) is considered the best potential resource for accessing the absolute neutrino mass scale. Moreover, if observed, it will signal that neutrinos are their own anti-particles (Majorana particles). Presently, this physics case is one of the most important lines of research “beyond the Standard Model” and might guide the way towards a Grand Unified Theory of fundamental interactions. Since the 0vββ decay process involves nuclei, its analysis necessarily implies nuclear structure issues. In the NURE project, supported by a Starting Grant of the European Research Council (ERC), nuclear reactions of double charge-exchange (DCE) are used as a tool to extract information on the 0vββ Nuclear Matrix Elements. Indeed, in DCE reactions and ββ decay the initial and final nuclear states are the same and the transition operators have a similar structure. Thus the measurement of DCE absolute cross-sections can give crucial information on ββ matrix elements. In a wider view, the NUMEN international collaboration plans a major upgrade of the INFN-LNS facilities in the coming years in order to increase the experimental production of nuclei by at least two orders of magnitude, thus making feasible a systematic study of all the cases of interest as candidates for 0vββ.

  19. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    This paper formulates an unsupervised algorithm for the symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert the time series of a digital signal into a string of (spatially) discrete symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., with no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of the time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
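    A bare-bones sketch of the symbolization step described above, with a quantile (maximum-entropy) partition standing in for the paper's mutual-information criterion; symbol-to-symbol transition counts give a crude PFSA-style model. The test signal and alphabet size are assumptions.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Partition the signal range by quantiles and map samples to symbols."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)             # integer symbols 0..n_symbols-1

def transition_matrix(symbols, n_symbols=4):
    """Row-normalized symbol-to-symbol transition counts (PFSA-style)."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

t = np.linspace(0, 10, 2000)
signal = np.sin(2 * np.pi * 1.3 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
syms = symbolize(signal)
print(np.round(transition_matrix(syms), 2))
```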

  20. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These techniques for the analysis, modelling and simulation allow optimizing the control and functionality of devices developed using materials under study, and have been tested using data obtained from experimental samples

  1. Extraction of Seabed/Subsurface Features in a Potential CO2 Sequestration Site in the Southern Baltic Sea, Using Wavelet Transform of High-resolution Sub-Bottom Profiler Data

    Science.gov (United States)

    Tegowski, J.; Zajfert, G.

    2014-12-01

    Carbon Capture & Storage (CCS) efficiently prevents the release of anthropogenic CO2 into the atmosphere. We investigate a potential site in the Polish sector of the Baltic Sea (B3 field site), consisting of a depleted oil and gas reservoir. An area of ca. 30 x 8 km was surveyed along 138 acoustic transects from R/V St. Barbara in 2012, combining multibeam echosounder, sidescan sonar and sub-bottom profiler. Preparation of CCS sites requires accurate knowledge of the subsurface structure of the seafloor, in particular deposit compactness. Gas leaks in the water column were monitored, along with the structure of the upper sediment layers. Our analyses show that the shallow sub-seabed is layered and quantify the spatial distribution of gas diffusion chimneys and seabed effusion craters. Remote detection of gas-containing surface sediments can be rather complex if bubbles are not emitted directly into the overlying water and are thus not detectable acoustically. The heterogeneity of gassy sediments makes conventional bottom sampling methods inefficient. Therefore, we propose a new approach to the identification, mapping, and monitoring of potentially gassy surface sediments, based on wavelet analysis of the echo signal envelopes of a chirp sub-bottom profiler (EdgeTech SB-0512). Each echo envelope was subjected to a wavelet transformation, whose coefficients were used to calculate wavelet energies. The set of echo envelope parameters was input to fuzzy logic and c-means algorithms. The resulting classification highlights seafloor areas with different subsurface morphological features, which can indicate gassy sediments. This work has been conducted under EC FP7-CP-IP project No. 265847: Sub-seabed CO2 Storage: Impact on Marine Ecosystems (ECO2).
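    A schematic version of the echo-envelope processing described above: each envelope is decomposed with a discrete wavelet transform, the energy per decomposition level forms a feature vector, and the pings are clustered. KMeans stands in here for the paper's fuzzy c-means, and the envelopes are synthetic.

```python
import numpy as np
import pywt
from sklearn.cluster import KMeans

def wavelet_energies(envelope, wavelet="db4", level=5):
    """Normalized energy per wavelet decomposition level of one echo envelope."""
    coeffs = pywt.wavedec(envelope, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

rng = np.random.default_rng(0)
n_pings, n_samples = 200, 1024
envelopes = rng.rayleigh(1.0, size=(n_pings, n_samples))
envelopes[100:] *= np.linspace(1.0, 3.0, n_samples)   # second, 'harder' bottom type (toy)

features = np.array([wavelet_energies(e) for e in envelopes])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))
```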

  2. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.

  3. Shallow flows with bottom topography

    NARCIS (Netherlands)

    Heijst, van G.J.F.; Kamp, L.P.J.; Theunissen, R.; Rodi, W.; Uhlmann, M.

    2012-01-01

    This paper discusses laboratory experiments and numerical simulations of dipolar vortex flows in a shallow fluid layer with bottom topography. Two cases are considered: a step topography and a linearly sloping bottom. It is found that viscous effects – i.e., no-slip conditions at the non-horizontal

  4. Information Extraction and Dependency on Open Government Data (ogd) for Environmental Monitoring

    Science.gov (United States)

    Abdulmuttalib, Hussein

    2016-06-01

    Environmental monitoring practices support decision makers in government and private institutions, as well as environmentalists and planners, among others. This support helps them act towards the sustainability of our environment and take efficient measures for protecting human beings in general, but it is difficult to extract useful information from Open Government Data (OGD) and assure its quality for this purpose. On the other hand, monitoring itself comprises detecting changes as they happen, or within the mitigation period, which means that any data source to be used for monitoring should reflect the information related to the period of environmental monitoring, or it is otherwise considered almost useless, or merely historical. In this paper, an assessment is performed of the extraction and structuring of information from Open Government Data that can be useful for environmental monitoring, looking into availability and usefulness for a certain type of environmental monitoring, and checking its repetition period and dependences. The particular assessment is performed on a small sample selected from OGD, bearing in mind the type of environmental change monitored, such as the increase and concentration of built-up areas and the reduction of green areas, or monitoring the change of temperature in a specific area. The World Bank mentioned in its blog that data is open if it satisfies both conditions of being technically open and legally open. The use of Open Data is thus regulated by published terms of use, or an agreement which implies some conditions, without violating the above mentioned two conditions. Within the scope of the paper I wish to share the experience of using some OGD to support an environmental monitoring effort, performed to mitigate the production of carbon dioxide by regulating energy consumption and by properly designing the test area's landscapes, thus using Geodesign tactics, and meanwhile wish to add to the results achieved by many

  5. Content and forms of heavy metals in bottom sediments in the zone of industrial pollution sources

    Directory of Open Access Journals (Sweden)

    Voytyuk Y.Y.

    2014-12-01

    Regularities in the distribution of heavy metals in sediments in the zone of influence of the steel industry in Mariupol are established. Results of a study of the forms of occurrence of Zn, Pb, Cu, Cr and Ni are presented. An ecological and geochemical assessment of sediment contamination by heavy metals is performed. The main sources of pollution of bottom sediments are airborne emissions from industrial plants, hydrogenous pollution from industrial sewage entering the water, sewage sludge, ash dumps, slag, ore, sludge, oil spills and salt solutions. Hydrogenous pollution of sediments may be significant; contaminated sediments are a source of long-term contamination of water even after the discharge of untreated wastewater into rivers has ceased. Assessing the environmental condition of bottom sediments from the gross content of heavy metals alone is of limited informative value, because gross contents do not reflect the transformation and further migration of metals into adjacent environments. The study of the forms of occurrence gives objective information for ecological and geochemical evaluation. The forms of heavy metals in the sediments were studied by sequential extraction. Concentrations of heavy metals in the extracts were determined on a CAS-115 atomic absorption spectrometer. It was established that the content of a number of elements in the bottom sediments exceeds background values, most likely due to their technogenic origin. Man-made pollution of the bottom sediments of Mariupol has disrupted the natural ratio of the forms of heavy metals. In the studied sediments, the content of the ion-exchange form of heavy metals is increased, which contributes to their migration in the aquatic environment.

  6. Machine learning classification of surgical pathology reports and chunk recognition for information extraction noise reduction.

    Science.gov (United States)

    Napolitano, Giulio; Marshall, Adele; Hamilton, Peter; Gavin, Anna T

    2016-06-01

    Machine learning techniques for the text mining of cancer-related clinical documents have not been sufficiently explored. Here some techniques are presented for the pre-processing of free-text breast cancer pathology reports, with the aim of facilitating the extraction of information relevant to cancer staging. The first technique was implemented using the freely available software RapidMiner to classify the reports according to their general layout: 'semi-structured' and 'unstructured'. The second technique was developed using the open source language engineering framework GATE and aimed at the prediction of chunks of the report text containing information pertaining to the cancer morphology, the tumour size, its hormone receptor status and the number of positive nodes. The classifiers were trained and tested respectively on sets of 635 and 163 manually classified or annotated reports from the Northern Ireland Cancer Registry. The best result of 99.4% accuracy - which included only one semi-structured report predicted as unstructured - was produced by the layout classifier with the k-nearest neighbour algorithm, using the binary term occurrence word vector type with stopword filter and pruning. For chunk recognition, the best results were found using the PAUM algorithm with the same parameters for all cases, except for the prediction of chunks containing cancer morphology. For semi-structured reports the performance ranged from 0.97 to 0.94 in precision and from 0.92 to 0.83 in recall, while for unstructured reports performance ranged from 0.91 to 0.64 in precision and from 0.68 to 0.41 in recall. Poor results were found when the classifier was trained on semi-structured reports but tested on unstructured ones. These results show that it is possible and beneficial to predict the layout of reports and that the accuracy of predicting which segments of a report may contain certain information is sensitive to the report layout and the type of information sought.

  7. Study of time-frequency characteristics of single snores: extracting new information for sleep apnea diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Castillo Escario, Y.; Blanco Almazan, D.; Camara Vazquez, M.A.; Jane Campos, R.

    2016-07-01

    Obstructive sleep apnea (OSA) is a highly prevalent chronic disease, especially in elderly and obese population. Despite constituting a huge health and economic problem, most patients remain undiagnosed due to limitations in current strategies. Therefore, it is essential to find cost-effective diagnostic alternatives. One of these novel approaches is the analysis of acoustic snoring signals. Snoring is an early symptom of OSA which carries pathophysiological information of high diagnostic value. For this reason, the main objective of this work is to study the characteristics of single snores of different types, from healthy and OSA subjects. To do that, we analyzed snoring signals from previous databases and developed an experimental protocol to record simulated OSA-related sounds and characterize the response of two commercial tracheal microphones. Automatic programs for filtering, downsampling, event detection and time-frequency analysis were built in MATLAB. We found that time-frequency maps and spectral parameters (central, mean and peak frequency and energy in the 100-500 Hz band) allow distinguishing regular snores of healthy subjects from non-regular snores and snores of OSA subjects. Regarding the two commercial microphones, we found that one of them was a suitable snoring sensor, while the other had a too restricted frequency response. Future work shall include a higher number of episodes and subjects, but our study has contributed to show how important the differences between regular and non-regular snores can be for OSA diagnosis, and how much clinically relevant information can be extracted from time-frequency maps and spectral parameters of single snores. (Author)
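
    The spectral parameters mentioned above (peak and mean frequency, energy in the 100-500 Hz band) can be computed from a power spectral density estimate; the sketch below does this for a synthetic snore-like signal, with the sampling rate and signal model chosen purely for illustration.

```python
# Sketch: spectral parameters of a single snore-like episode. The signal is
# synthetic (140 Hz tone plus noise); sampling rate and duration are assumptions.
import numpy as np
from scipy.signal import welch

fs = 5000                                    # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
snore = np.sin(2 * np.pi * 140 * t) + 0.3 * np.random.randn(t.size)

f, pxx = welch(snore, fs=fs, nperseg=1024)   # power spectral density
peak_freq = f[np.argmax(pxx)]
mean_freq = np.sum(f * pxx) / np.sum(pxx)    # spectral centroid
band = (f >= 100) & (f <= 500)
band_fraction = pxx[band].sum() / pxx.sum()  # share of power in 100-500 Hz

print(f"peak {peak_freq:.0f} Hz, mean {mean_freq:.0f} Hz, "
      f"100-500 Hz power fraction {band_fraction:.2f}")
```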

  8. Quantum measurement information as a key to energy extraction from local vacuums

    International Nuclear Information System (INIS)

    Hotta, Masahiro

    2008-01-01

    In this paper, a protocol is proposed in which energy extraction from local vacuum states is possible by using quantum measurement information for the vacuum state of quantum fields. In the protocol, Alice, who stays at a spatial point, excites the ground state of the fields by a local measurement. Consequently, wave packets generated by Alice's measurement propagate through the vacuum to spatial infinity. Let us assume that Bob stays away from Alice and fails to catch the excitation energy when the wave packets pass in front of him. Next Alice announces her local measurement result to Bob by classical communication. Bob performs a local unitary operation depending on the measurement result. In this process, positive energy is released from the fields to Bob's apparatus performing the unitary operation. In the field systems, wave packets are generated with negative energy around Bob's location. Soon afterwards, the negative-energy wave packets begin to chase after the positive-energy wave packets generated by Alice and form loosely bound states.

  9. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO{sub 6} octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
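
    The 2D Gaussian fitting mentioned in the highlights can be reproduced in a few lines; the sketch below fits one simulated atomic-column peak with scipy, and the patch size, noise level and initial guess are arbitrary assumptions rather than parameters of the published tool.

```python
# Sketch of locating one atom column by 2D Gaussian fitting. The image patch is
# simulated; it is not output of the software tool described in the abstract.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    x, y = coords
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return (amp * np.exp(-r2 / (2 * sigma ** 2)) + offset).ravel()

y, x = np.mgrid[0:15, 0:15]
true_params = (1.0, 7.3, 6.8, 2.0, 0.1)      # amplitude, x0, y0, sigma, offset
patch = gauss2d((x, y), *true_params).reshape(15, 15)
patch += 0.02 * np.random.randn(15, 15)      # add some noise

p0 = (patch.max(), 7.0, 7.0, 2.0, 0.0)       # initial guess near patch centre
popt, _ = curve_fit(gauss2d, (x, y), patch.ravel(), p0=p0)
print("fitted column position: x = %.2f, y = %.2f (pixels)" % (popt[1], popt[2]))
```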

  10. Note on difference spectra for fast extraction of global image information.

    CSIR Research Space (South Africa)

    Van Wyk, BJ

    2007-06-01

    Full Text Available NOTE ON DIFFERENCE SPECTRA FOR FAST EXTRACTION OF GLOBAL IMAGE INFORMATION. B.J. van Wyk*, M.A. van Wyk* and F. van den Bergh**. * French South African Technical Institute in Electronics (F'SATIE) at the Tshwane University of Technology, Private Bag X680, Pretoria 0001. ** Remote Sensing Research Group, Meraka Institute...

  11. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network-collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...

  12. Extraction as a source of additional information when concentrations in multicomponent systems are simultaneously determined

    International Nuclear Information System (INIS)

    Perkov, I.G.

    1988-01-01

    Using as an example the photometric determination of Nd and Sm in their joint presence, the possibility of using the influence of extraction on the increase of the analytical signal is considered. It is shown that interligand exchange in extracts, in combination with simultaneous determination of concentrations, can be used as a simple means of increasing the accuracy of determination. 5 refs.; 2 figs.; 3 tabs

  13. Validation and extraction of molecular-geometry information from small-molecule databases.

    Science.gov (United States)

    Long, Fei; Nicholls, Robert A; Emsley, Paul; Gražulis, Saulius; Merkys, Andrius; Vaitkus, Antanas; Murshudov, Garib N

    2017-02-01

    A freely available small-molecule structure database, the Crystallography Open Database (COD), is used for the extraction of molecular-geometry information on small-molecule compounds. The results are used for the generation of new ligand descriptions, which are subsequently used by macromolecular model-building and structure-refinement software. To increase the reliability of the derived data, and therefore the new ligand descriptions, the entries from this database were subjected to very strict validation. The selection criteria made sure that the crystal structures used to derive atom types, bond and angle classes are of sufficiently high quality. Any suspicious entries at a crystal or molecular level were removed from further consideration. The selection criteria included (i) the resolution of the data used for refinement (entries solved at 0.84 Å resolution or higher) and (ii) the structure-solution method (structures must be from a single-crystal experiment and all atoms of generated molecules must have full occupancies), as well as basic sanity checks such as (iii) consistency between the valences and the number of connections between atoms, (iv) acceptable bond-length deviations from the expected values and (v) detection of atomic collisions. The derived atom types and bond classes were then validated using high-order moment-based statistical techniques. The results of the statistical analyses were fed back to fine-tune the atom typing. The developed procedure was repeated four times, resulting in fine-grained atom typing, bond and angle classes. The procedure will be repeated in the future as and when new entries are deposited in the COD. The whole procedure can also be applied to any source of small-molecule structures, including the Cambridge Structural Database and the ZINC database.

  14. Extracting respiratory information from seismocardiogram signals acquired on the chest using a miniature accelerometer

    International Nuclear Information System (INIS)

    Pandia, Keya; Inan, Omer T; Kovacs, Gregory T A; Giovangrandi, Laurent

    2012-01-01

    Seismocardiography (SCG) is a non-invasive measurement of the vibrations of the chest caused by the heartbeat. SCG signals can be measured using a miniature accelerometer attached to the chest, and are thus well-suited for unobtrusive and long-term patient monitoring. Additionally, SCG contains information relating to both cardiovascular and respiratory systems. In this work, algorithms were developed for extracting three respiration-dependent features of the SCG signal: intensity modulation, timing interval changes within each heartbeat, and timing interval changes between successive heartbeats. Simultaneously with a reference respiration belt, SCG signals were measured from 20 healthy subjects and a respiration rate was estimated using each of the three SCG features and the reference signal. The agreement between each of the three accelerometer-derived respiration rate measurements was computed with respect to the respiration rate derived from the reference respiration belt. The respiration rate obtained from the intensity modulation in the SCG signal was found to be in closest agreement with the respiration rate obtained from the reference respiration belt: the bias was found to be 0.06 breaths per minute with a 95% confidence interval of −0.99 to 1.11 breaths per minute. The limits of agreement between the respiration rates estimated using SCG (intensity modulation) and the reference were within the clinically relevant ranges given in existing literature, demonstrating that SCG could be used for both cardiovascular and respiratory monitoring. Furthermore, phases of each of the three SCG parameters were investigated at four instances of a respiration cycle—start inspiration, peak inspiration, start expiration, and peak expiration—and during breath hold (apnea). The phases of the three SCG parameters observed during the respiration cycle were congruent with existing literature and physiologically expected trends. (paper)
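
    The intensity-modulation feature described above amounts to demodulating the heartbeat envelope and reading off its dominant slow frequency; a minimal sketch on a synthetic SCG-like trace is shown below, with the sampling rate, heart rate and modulation depth all assumed for illustration.

```python
# Sketch: respiration rate from amplitude modulation of a heartbeat-like signal.
# The trace is synthetic, not accelerometer data from the study.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                     # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
resp_hz = 15 / 60                            # 15 breaths per minute
carrier = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm cardiac component
scg = (1 + 0.4 * np.sin(2 * np.pi * resp_hz * t)) * carrier

envelope = np.abs(hilbert(scg))              # heartbeat intensity envelope
b, a = butter(2, 0.7 / (fs / 2), btype="low")
resp = filtfilt(b, a, envelope)              # keep only the slow respiratory part

freqs = np.fft.rfftfreq(resp.size, 1 / fs)
spectrum = np.abs(np.fft.rfft(resp - resp.mean()))
print("estimated rate: %.1f breaths/min" % (60 * freqs[np.argmax(spectrum)]))
```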

  15. Extracting key information from historical data to quantify the transmission dynamics of smallpox

    Directory of Open Access Journals (Sweden)

    Brockmann Stefan O

    2008-08-01

    Full Text Available Abstract Background Quantification of the transmission dynamics of smallpox is crucial for optimizing intervention strategies in the event of a bioterrorist attack. This article reviews basic methods and findings in mathematical and statistical studies of smallpox which estimate key transmission parameters from historical data. Main findings First, critically important aspects in extracting key information from historical data are briefly summarized. We mention different sources of heterogeneity and potential pitfalls in utilizing historical records. Second, we discuss how smallpox spreads in the absence of interventions and how the optimal timing of quarantine and isolation measures can be determined. Case studies demonstrate the following. (1) The upper confidence limit of the 99th percentile of the incubation period is 22.2 days, suggesting that quarantine should last 23 days. (2) The highest frequency (61.8%) of secondary transmissions occurs 3–5 days after onset of fever so that infected individuals should be isolated before the appearance of rash. (3) The U-shaped age-specific case fatality implies a vulnerability of infants and elderly among non-immune individuals. Estimates of the transmission potential are subsequently reviewed, followed by an assessment of vaccination effects and of the expected effectiveness of interventions. Conclusion Current debates on bio-terrorism preparedness indicate that public health decision making must account for the complex interplay and balance between vaccination strategies and other public health measures (e.g. case isolation and contact tracing), taking into account the frequency of adverse events to vaccination. In this review, we summarize what has already been clarified and point out needs to analyze previous smallpox outbreaks systematically.

  16. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    Science.gov (United States)

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to low nmol L-1 concentration ranges, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L-1 by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L-1 (median = 0.030 nmol L-1) for NTME and from 0.001 to 5.684 nmol L-1 (median = 0.043 nmol L-1) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.

  17. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Full Text Available Web mining related research is becoming more essential these days because a large amount of information is managed through the web. Web usage is expanding in an uncontrolled way, and a particular framework is required for controlling such an extensive amount of information in the web space. Web mining is ordered into three noteworthy divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology with the aid of Bayesian Networks (BN), learning to separate web data and discover characteristics on the basis of the Bayesian approach. Motivated by that investigation, we propose a web content mining methodology based on a Deep Learning algorithm. The Deep Learning algorithm is of interest over BN because BN is not considered in a learning architecture of the kind used in the proposed system. The main objective of this investigation is web document extraction utilizing different classification algorithms and their analysis. This work extracts the data from web URLs and compares three classification algorithms: a Deep Learning algorithm, a Naive Bayes algorithm and a BPNN algorithm. Deep Learning is a powerful set of strategies for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometric systems. Deep Learning is a simple classification technique that is utilized for a subset of this extensive field, and it requires less time for classification. Naive Bayes classifiers are a group of basic probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then utilized for classification. Initially the training and testing dataset contains many URLs, from which the content is extracted. The

  18. Pressing technology for large bottoms

    International Nuclear Information System (INIS)

    Jilek, L.

    1986-01-01

    For pressing bottoms of pressure vessels from a circular plate of large diameter, the technology selected is that of a circular plate bent into the shape of a trough. The initial sheet is first bent in the middle by heating, with the edges remaining straight. These are then welded longitudinally by electroslag welding and the circular shape is flame cut. The result is a plate with a straight surface in the middle and raised edges, which may be pressed into the desired shape. In this manner it is also possible to press pressure vessel bottoms with tube couplings from plates which are thickened in the middle and drilled; additional welding is then eliminated. Deformation from heat treatment may be avoided by the use of a fixture in the shape of a ring with a groove into which the edge of the bottom is fixed. During hardening of the bottom it is necessary to provide for the withdrawal of vapours and gases which would hamper uniform cooling. Hardening the bottom with the grill and the cupola downwards has been proven. Deformation which occurs during treatment may to a certain extent be removed by calibration, which cannot, however, be done without special fixtures and instruments. (J.B.)

  19. Information extraction from dynamic PS-InSAR time series using machine learning

    Science.gov (United States)

    van de Kerkhof, B.; Pankratius, V.; Chang, L.; van Swol, R.; Hanssen, R. F.

    2017-12-01

    Due to the increasing number of SAR satellites, with shorter repeat intervals and higher resolutions, SAR data volumes are exploding. Time series analyses of SAR data, i.e. Persistent Scatterer (PS) InSAR, enable the deformation monitoring of the built environment at an unprecedented scale, with hundreds of scatterers per km2, updated weekly. Potential hazards, e.g. due to failure of aging infrastructure, can be detected at an early stage. Yet, this requires the operational data processing of billions of measurement points, over hundreds of epochs, updating this data set dynamically as new data come in, and testing whether points (start to) behave in an anomalous way. Moreover, the quality of PS-InSAR measurements is ambiguous and heterogeneous, which will yield false positives and false negatives. Such analyses are numerically challenging. Here we extract relevant information from PS-InSAR time series using machine learning algorithms. We cluster (group together) time series with similar behaviour, even though they may not be spatially close, such that the results can be used for further analysis. First we reduce the dimensionality of the dataset in order to be able to cluster the data, since applying clustering techniques on high dimensional datasets often results in unsatisfactory results. Our approach is to apply t-distributed Stochastic Neighbor Embedding (t-SNE), a machine learning algorithm for dimensionality reduction of high-dimensional data to a 2D or 3D map, and cluster this result using Density-Based Spatial Clustering of Applications with Noise (DBSCAN). The results show that we are able to detect and cluster time series with similar behaviour, which is the starting point for more extensive analysis into the underlying driving mechanisms. The results of the methods are compared to conventional hypothesis testing as well as a Self-Organising Map (SOM) approach. Hypothesis testing is robust and takes the stochastic nature of the observations into account
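
    The reduce-then-cluster step described above can be sketched with scikit-learn; the synthetic deformation time series, perplexity and DBSCAN radius below are illustrative assumptions, not the parameters used on the PS-InSAR data.

```python
# Sketch: t-SNE dimensionality reduction of per-point time series followed by
# DBSCAN clustering, on synthetic data rather than real PS-InSAR observations.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
epochs = np.arange(50)
stable = rng.normal(0.0, 0.5, (100, 50))                     # stable scatterers
subsiding = -0.2 * epochs + rng.normal(0.0, 0.5, (100, 50))  # linear subsidence
series = np.vstack([stable, subsiding])

embedding = TSNE(n_components=2, perplexity=30,
                 random_state=0).fit_transform(series)       # 2D map of behaviours
labels = DBSCAN(eps=3.0, min_samples=5).fit_predict(embedding)
print("clusters found (-1 = noise):", sorted(set(labels)))
```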

  20. Synthesis of High-Frequency Ground Motion Using Information Extracted from Low-Frequency Ground Motion

    Science.gov (United States)

    Iwaki, A.; Fujiwara, H.

    2012-12-01

    Broadband ground motion computations of scenario earthquakes are often based on hybrid methods that are combinations of a deterministic approach in the lower frequency band and a stochastic approach in the higher frequency band. Typical computation methods for low-frequency and high-frequency (LF and HF, respectively) ground motions are numerical simulations, such as finite-difference and finite-element methods based on a three-dimensional velocity structure model, and the stochastic Green's function method, respectively. In such hybrid methods, LF and HF wave fields are generated through two different methods that are completely independent of each other, and are combined at the matching frequency. However, LF and HF wave fields are essentially not independent as long as they are from the same event. In this study, we focus on the relation among acceleration envelopes at different frequency bands, and attempt to synthesize HF ground motion using the information extracted from LF ground motion, aiming to propose a new method for broad-band strong motion prediction. Our study area is Kanto area, Japan. We use the K-NET and KiK-net surface acceleration data and compute RMS envelopes in five frequency bands: 0.5-1.0 Hz, 1.0-2.0 Hz, 2.0-4.0 Hz, 4.0-8.0 Hz, and 8.0-16.0 Hz. Taking the ratio of the envelopes of adjacent bands, we find that the envelope ratios have stable shapes at each site. The empirical envelope-ratio characteristics are combined with the low-frequency envelope of the target earthquake to synthesize HF ground motion. We have applied the method to M5-class earthquakes and an M7 target earthquake that occurred in the vicinity of Kanto area, and successfully reproduced the observed HF ground motion of the target earthquake. The method can be applied to a broad band ground motion simulation for a scenario earthquake by combining numerically-computed low-frequency (~1 Hz) ground motion with the empirical envelope ratio characteristics to generate broadband ground motion
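
    The band-limited RMS envelopes and adjacent-band envelope ratios described above can be sketched as follows; the synthetic accelerogram, filter order and 1-s smoothing window are assumptions for illustration only.

```python
# Sketch: RMS acceleration envelopes in two adjacent frequency bands and their
# ratio. The accelerogram is a toy noise burst, not K-NET/KiK-net data.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100                                         # assumed sampling rate (Hz)
t = np.arange(0, 40, 1 / fs)
acc = np.random.randn(t.size) * np.exp(-((t - 10) / 8) ** 2)

def band_rms_envelope(x, fs, lo, hi, win=1.0):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    xb = filtfilt(b, a, x)
    n = int(win * fs)                            # moving RMS over a 1-s window
    return np.sqrt(np.convolve(xb ** 2, np.ones(n) / n, mode="same"))

env_lo = band_rms_envelope(acc, fs, 1.0, 2.0)
env_hi = band_rms_envelope(acc, fs, 2.0, 4.0)
ratio = env_hi / np.maximum(env_lo, 1e-12)       # adjacent-band envelope ratio
print("median envelope ratio (2-4 Hz / 1-2 Hz): %.2f" % np.median(ratio))
```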

  1. TempoWordNet: a lexical resource for temporal information extraction

    OpenAIRE

    Hasanuzzaman , Mohammed

    2016-01-01

    The ability to capture the time information conveyed in natural language, where that information is expressed either explicitly, or implicitly, or connotative, is essential to many natural language processing applications such as information retrieval, question answering, automatic summarization, targeted marketing, loan repayment forecasting, and understanding economic patterns. Associating word senses with temporal orientation to grasp the temporal information in language is relatively stra...

  2. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...

  3. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    OpenAIRE

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboa...

  4. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    Science.gov (United States)

    2007-01-01

    more reliable than BLEU and that it is easier to understand in terms familiar to NLP researchers. ... METEOR: Researchers at Carnegie Mellon ... essential elements of information from output generated by three types of Arabic-English MT engines. The information extraction experiment was one of three ... reviewing the task hierarchy and examining the MT output of several engines. A small, prior pilot experiment to evaluate Arabic-English MT engines for...

  5. Comparison of Qinzhou bay wetland landscape information extraction by three methods

    Directory of Open Access Journals (Sweden)

    X. Chang

    2014-04-01

    and OO is 219 km2, 193.70 km2, 217.40 km2 respectively. The result indicates that SC ranks first, followed by the OO approach, with the DT method third, when used to extract Qinzhou Bay coastal wetland.

  6. Extracting topographic structure from digital elevation data for geographic information-system analysis

    Science.gov (United States)

    Jenson, Susan K.; Domingue, Julia O.

    1988-01-01

    Software tools have been developed at the U.S. Geological Survey's EROS Data Center to extract topographic structure and to delineate watersheds and overland flow paths from digital elevation models. The tools are special-purpose FORTRAN programs interfaced with general-purpose raster and vector spatial analysis and relational data base management packages.
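
    A core step in tools of this kind is assigning each grid cell a flow direction toward its steepest-descent neighbour (the D8 scheme); the sketch below shows that idea on a tiny synthetic elevation grid and is not a reimplementation of the EROS Data Center programs.

```python
# Sketch of D8 flow direction: each cell drains to the neighbour with the
# steepest downward slope. The 3x3 elevation grid is a made-up example.
import numpy as np

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])

offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]          # the eight D8 neighbours
dists = [np.hypot(dr, dc) for dr, dc in offsets]

def d8_direction(dem, r, c):
    best, best_drop = None, 0.0
    for (dr, dc), d in zip(offsets, dists):
        rr, cc = r + dr, c + dc
        if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
            drop = (dem[r, c] - dem[rr, cc]) / d     # slope toward the neighbour
            if drop > best_drop:
                best, best_drop = (dr, dc), drop
    return best                                      # None for sinks and flats

print("centre cell drains toward offset:", d8_direction(dem, 1, 1))
```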

  7. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  8. Decay of the Bottom mesons

    International Nuclear Information System (INIS)

    Duong Van Phi; Duong Anh Duc

    1992-12-01

    The channels of the decay of Bottom mesons are deduced from a selection rule and the Lagrangians which are formed on the LxO(4) invariance and the principle of minimal structure. The estimation of the corresponding decay probabilities is considered. (author). 21 refs

  9. Bottom reflector for power reactors

    International Nuclear Information System (INIS)

    Elter, C.; Kissel, K.F.; Schoening, J.; Schwiers, H.G.

    1982-01-01

    In pebble bed reactors, erosion and damage due to fuel element movement on the surface of the bottom reflector should be minimized. This can be achieved by chamfering and/or rounding the cover edges of the graphite blocks and the edges between the drilled holes and the surface of the graphite block. (orig.) [de

  10. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying
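
    The a priori keyword lists described above lend themselves to a simple programmable decision rule; the sketch below flags possibly TCE-exposed jobs in Python rather than the SAS macro used in the study, and the keywords and job records are invented examples.

```python
# Sketch of keyword-based flagging of possibly TCE-exposed jobs. Keyword lists
# and job records are illustrative, not the study's actual rules or data.
import re

task_keywords = ["degreas", "metal clean", "dry clean", "vapor degrease"]
chemical_keywords = ["trichloroethylene", "tce", "solvent"]

jobs = [
    {"title": "machinist", "task": "degreasing metal parts", "chemicals": "TCE"},
    {"title": "teacher", "task": "classroom instruction", "chemicals": ""},
]

def possibly_tce_exposed(job):
    task = job["task"].lower()
    chem = job["chemicals"].lower()
    task_hit = any(k in task for k in task_keywords)
    chem_hit = any(re.search(r"\b" + re.escape(k) + r"\b", chem)
                   for k in chemical_keywords)
    return task_hit or chem_hit

for job in jobs:
    print(job["title"], "-> possibly TCE-exposed:", possibly_tce_exposed(job))
```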

  11. Bottom Dissolved Oxygen Maps From SEAMAP Summer Groundfish/Shrimp Surveys

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Bottom dissolved oxygen (DO) data was extracted from environmental profiles acquired during the Southeast Fisheries Science Center Mississippi Laboratories summer...

  12. MIDAS. An algorithm for the extraction of modal information from experimentally determined transfer functions

    International Nuclear Information System (INIS)

    Durrans, R.F.

    1978-12-01

    In order to design reactor structures to withstand the large flow and acoustic forces present it is necessary to know something of their dynamic properties. In many cases these properties cannot be predicted theoretically and it is necessary to determine them experimentally. The algorithm MIDAS (Modal Identification for the Dynamic Analysis of Structures) which has been developed at B.N.L. for extracting these structural properties from experimental data is described. (author)

  13. Extracting Information about the Initial State from the Black Hole Radiation.

    Science.gov (United States)

    Lochan, Kinjalk; Padmanabhan, T

    2016-02-05

    The crux of the black hole information paradox is related to the fact that the complete information about the initial state of a quantum field in a collapsing spacetime is not available to future asymptotic observers, belying the expectations from a unitary quantum theory. We study the imprints of the initial quantum state contained in a specific class of distortions of the black hole radiation and identify the classes of in states that can be partially or fully reconstructed from the information contained within. Even for the general in state, we can uncover some specific information. These results suggest that a classical collapse scenario ignores this richness of information in the resulting spectrum and a consistent quantum treatment of the entire collapse process might allow us to retrieve much more information from the spectrum of the final radiation.

  14. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features including sensor's biased data, each tessera in the high-density point cloud from the 3D captured complex mosaics of Germigny-des-prés (France) is segmented via a colour multi-scale abstraction-based feature extracting connectivity. A 2D surface and outline polygon of each tessera is generated by a RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' class. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
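
    The per-tessera RANSAC plane extraction and convex-hull outline generation mentioned above can be sketched with numpy and scipy; the synthetic near-planar patch, iteration count and inlier tolerance below are assumptions, not values from the Germigny-des-prés data.

```python
# Sketch: RANSAC plane fit on a near-planar patch of points, then a 2D convex
# hull as the outline polygon. The points are synthetic, not mosaic data.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)
pts = np.column_stack([rng.uniform(0, 1, 200),
                       rng.uniform(0, 1, 200),
                       rng.normal(0, 0.002, 200)])   # thin, almost flat patch

def ransac_plane(points, n_iter=200, tol=0.005):
    best = np.array([], dtype=int)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                                 # degenerate (collinear) sample
        dist = np.abs((points - sample[0]) @ (normal / norm))
        inliers = np.where(dist < tol)[0]
        if len(inliers) > len(best):
            best = inliers
    return best

inliers = ransac_plane(pts)
hull = ConvexHull(pts[inliers][:, :2])               # outline polygon in the plane
print(len(inliers), "inliers; outline has", len(hull.vertices), "vertices")
```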

  15. Lung region extraction based on the model information and the inversed MIP method by using chest CT images

    International Nuclear Information System (INIS)

    Tomita, Toshihiro; Miguchi, Ryosuke; Okumura, Toshiaki; Yamamoto, Shinji; Matsumoto, Mitsuomi; Tateno, Yukio; Iinuma, Takeshi; Matsumoto, Toru.

    1997-01-01

    We developed a lung region extraction method based on the model information and the inversed MIP method in the Lung Cancer Screening CT (LSCT). The original model is composed of typical 3-D lung contour lines, a body axis, an apical point, and a convex hull. First, the body axis, the apical point, and the convex hull are automatically extracted from the input image. Next, the model is properly transformed to fit to those of the input image by the affine transformation. Using the same affine transformation coefficients, typical lung contour lines are also transferred, which correspond to rough contour lines of the input image. Experimental results applied to 68 samples showed this method to be quite promising. (author)

  16. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact to the task of PPI extraction and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  17. Unsupervised improvement of named entity extraction in short informal context using disambiguation clues

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2012-01-01

    Short context messages (like tweets and SMS’s) are a potentially rich source of continuously and instantly updated information. Shortness and informality of such messages are challenges for Natural Language Processing tasks. Most efforts done in this direction rely on machine learning techniques

  18. Automated Methods to Extract Patient New Information from Clinical Notes in Electronic Health Record Systems

    Science.gov (United States)

    Zhang, Rui

    2013-01-01

    The widespread adoption of Electronic Health Record (EHR) has resulted in rapid text proliferation within clinical care. Clinicians' use of copying and pasting functions in EHR systems further compounds this by creating a large amount of redundant clinical information in clinical documents. A mixture of redundant information (especially outdated…

  19. Extracting principles for information management adaptability during crisis response : A dynamic capability view

    NARCIS (Netherlands)

    Bharosa, N.; Janssen, M.F.W.H.A.

    2010-01-01

    During crises, relief agency commanders have to make decisions in a complex and uncertain environment, requiring them to continuously adapt to unforeseen environmental changes. In the process of adaptation, the commanders depend on information management systems for information. Yet there are still

  20. Extracting protein dynamics information from overlapped NMR signals using relaxation dispersion difference NMR spectroscopy.

    Science.gov (United States)

    Konuma, Tsuyoshi; Harada, Erisa; Sugase, Kenji

    2015-12-01

    Protein dynamics plays important roles in many biological events, such as ligand binding and enzyme reactions. NMR is mostly used for investigating such protein dynamics in a site-specific manner. Recently, NMR has been actively applied to large proteins and intrinsically disordered proteins, which are attractive research targets. However, signal overlap, which is often observed for such proteins, hampers accurate analysis of NMR data. In this study, we have developed a new methodology called relaxation dispersion difference that can extract conformational exchange parameters from overlapped NMR signals measured using relaxation dispersion spectroscopy. In relaxation dispersion measurements, the signal intensities of fluctuating residues vary according to the Carr-Purcell-Meiboon-Gill pulsing interval, whereas those of non-fluctuating residues are constant. Therefore, subtraction of each relaxation dispersion spectrum from that with the highest signal intensities, measured at the shortest pulsing interval, leaves only the signals of the fluctuating residues. This is the principle of the relaxation dispersion difference method. This new method enabled us to extract exchange parameters from overlapped signals of heme oxygenase-1, which is a relatively large protein. The results indicate that the structural flexibility of a kink in the heme-binding site is important for efficient heme binding. Relaxation dispersion difference requires neither selectively labeled samples nor modification of pulse programs; thus it will have wide applications in protein dynamics analysis.
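
    The subtraction at the heart of the relaxation dispersion difference method can be illustrated with simulated one-dimensional traces; peak positions, widths and the intensity loss below are arbitrary assumptions, not measured heme oxygenase-1 data.

```python
# Sketch: subtracting a longer-pulsing-interval spectrum from the reference
# (shortest CPMG interval) cancels non-fluctuating peaks and leaves only the
# fluctuating residue. Spectra here are simulated 1D Gaussian peaks.
import numpy as np

ppm = np.linspace(6.0, 10.0, 400)

def peak(center, height, width=0.03):
    return height * np.exp(-((ppm - center) / width) ** 2)

reference = peak(7.2, 1.0) + peak(8.5, 1.0)      # shortest pulsing interval
longer = peak(7.2, 1.0) + peak(8.5, 0.6)         # fluctuating residue attenuated

difference = reference - longer                  # static peak at 7.2 ppm cancels
print("residual peak near %.2f ppm" % ppm[np.argmax(difference)])
```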

  1. Extracting protein dynamics information from overlapped NMR signals using relaxation dispersion difference NMR spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Konuma, Tsuyoshi [Icahn School of Medicine at Mount Sinai, Department of Structural and Chemical Biology (United States); Harada, Erisa [Suntory Foundation for Life Sciences, Bioorganic Research Institute (Japan); Sugase, Kenji, E-mail: sugase@sunbor.or.jp, E-mail: sugase@moleng.kyoto-u.ac.jp [Kyoto University, Department of Molecular Engineering, Graduate School of Engineering (Japan)

    2015-12-15

    Protein dynamics plays important roles in many biological events, such as ligand binding and enzyme reactions. NMR is mostly used for investigating such protein dynamics in a site-specific manner. Recently, NMR has been actively applied to large proteins and intrinsically disordered proteins, which are attractive research targets. However, signal overlap, which is often observed for such proteins, hampers accurate analysis of NMR data. In this study, we have developed a new methodology called relaxation dispersion difference that can extract conformational exchange parameters from overlapped NMR signals measured using relaxation dispersion spectroscopy. In relaxation dispersion measurements, the signal intensities of fluctuating residues vary according to the Carr-Purcell-Meiboon-Gill pulsing interval, whereas those of non-fluctuating residues are constant. Therefore, subtraction of each relaxation dispersion spectrum from that with the highest signal intensities, measured at the shortest pulsing interval, leaves only the signals of the fluctuating residues. This is the principle of the relaxation dispersion difference method. This new method enabled us to extract exchange parameters from overlapped signals of heme oxygenase-1, which is a relatively large protein. The results indicate that the structural flexibility of a kink in the heme-binding site is important for efficient heme binding. Relaxation dispersion difference requires neither selectively labeled samples nor modification of pulse programs; thus it will have wide applications in protein dynamics analysis.

  2. Bottom head failure program plan

    International Nuclear Information System (INIS)

    Meyer, R.O.

    1989-01-01

    Earlier this year the NRC staff presented a Revised Severe Accident Research Program Plan (SECY-89-123) to the Commission and initiated work on that plan. Two of the near-term issues in that plan involve failure of the bottom head of the reactor pressure vessel. These two issues are (1) depressurization and DCH and (2) BWR Mark I Containment Shell Meltthrough. ORNL has developed models for several competing failure mechanisms for BWRs. INEL has performed analytical and experimental work directly related to bottom head failure in connection with several programs. SNL has conducted a number of analyses and experimental activities to examine the failure of LWR vessels. In addition to the government-sponsored work mentioned above, EPRI and FAI performed studies on vessel failure for the Industry Degraded Core Rulemaking Program (IDCOR). EPRI examined the failure of a PWR vessel bottom head without penetrations, as found in some Combustion Engineering reactors. To give more attention to this subject as called for by the revised Severe Accident Research Plan, two things are being done. First, work previously done is being reviewed carefully to develop an overall picture and to determine the reliability of assumptions used in those studies. Second, new work is being planned for FY90 to try to complete a reasonable understanding of the failure process. The review and planning are being done in close cooperation with the ACRS. Results of this exercise will be presented in this paper

  3. Wavelet analysis of molecular dynamics: Efficient extraction of time-frequency information in ultrafast optical processes

    International Nuclear Information System (INIS)

    Prior, Javier; Castro, Enrique; Chin, Alex W.; Almeida, Javier; Huelga, Susana F.; Plenio, Martin B.

    2013-01-01

    New experimental techniques based on nonlinear ultrafast spectroscopies have been developed over the last few years, and have been demonstrated to provide powerful probes of quantum dynamics in different types of molecular aggregates, including both natural and artificial light harvesting complexes. Fourier transform-based spectroscopies have been particularly successful, yet “complete” spectral information normally necessitates the loss of all information on the temporal sequence of events in a signal. This information though is particularly important in transient or multi-stage processes, in which the spectral decomposition of the data evolves in time. By going through several examples of ultrafast quantum dynamics, we demonstrate that the use of wavelets provide an efficient and accurate way to simultaneously acquire both temporal and frequency information about a signal, and argue that this greatly aids the elucidation and interpretation of physical process responsible for non-stationary spectroscopic features, such as those encountered in coherent excitonic energy transport
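
    A continuous wavelet transform is one way to obtain the kind of joint time-frequency map discussed above; the sketch below uses PyWavelets on a synthetic signal whose frequency content changes halfway through the record, with the wavelet, scales and sampling rate chosen only for illustration.

```python
# Sketch: scalogram (time-frequency map) of a non-stationary signal via the
# continuous wavelet transform. The signal is synthetic, not spectroscopy data.
import numpy as np
import pywt

fs = 1000
t = np.arange(0, 1, 1 / fs)
signal = np.where(t < 0.5,
                  np.sin(2 * np.pi * 30 * t),    # 30 Hz in the first half
                  np.sin(2 * np.pi * 80 * t))    # 80 Hz in the second half

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
power = np.abs(coeffs) ** 2                      # rows: frequency, columns: time

mid = signal.size // 2
print("dominant early: %.0f Hz" % freqs[np.argmax(power[:, :mid].mean(axis=1))])
print("dominant late:  %.0f Hz" % freqs[np.argmax(power[:, mid:].mean(axis=1))])
```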

  4. Extracting information from an ensemble of GCMs to reliably assess future global runoff change

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Weerts, A.H.; Bierkens, M.F.P.

    2011-01-01

    Future runoff projections derived from different global climate models (GCMs) show large differences. Therefore, within this study, the information from multiple GCMs has been combined to better assess hydrological changes. For projections of precipitation and temperature the Reliability ensemble

  5. Investigation of the Impact of Extracting and Exchanging Health Information by Using Internet and Social Networks.

    Science.gov (United States)

    Pistolis, John; Zimeras, Stelios; Chardalias, Kostas; Roupa, Zoe; Fildisis, George; Diomidous, Marianna

    2016-06-01

    Social networks (1) have been embedded in our daily life for a long time. They constitute a powerful tool used nowadays for both searching and exchanging information on different issues by using Internet searching engines (Google, Bing, etc.) and Social Networks (Facebook, Twitter etc.). In this paper, the results are presented of a study of the frequency and the type of usage of the Internet and Social Networks by the general public and health professionals. The objectives of the research were focused on the investigation of the frequency of seeking and meticulously searching for health information in the social media by both individuals and health practitioners. The exchange of information is a procedure that involves the issues of reliability and quality of information. In this research, by using advanced statistical techniques, an effort is made to investigate the participants' profiles in using social networks for searching and exchanging information on health issues. Based on the answers, 93% of the people use the Internet to find information on health subjects. Considering principal component analysis, the most important health subjects were nutrition (0.719%), respiratory issues (0.79%), cardiological issues (0.777%), psychological issues (0.667%) and total (73.8%). The research results, based on different statistical techniques, revealed that 61.2% of the males and 56.4% of the females intended to use the social networks for searching medical information. Based on the principal components analysis, the most important sources that the participants mentioned were the use of the Internet and social networks for exchanging information on health issues. These sources proved to be of paramount importance to the participants of the study. The same holds for nursing, medical and administrative staff in hospitals.

  6. Amplitude extraction in pseudoscalar-meson photoproduction: towards a situation of complete information

    International Nuclear Information System (INIS)

    Nys, Jannes; Vrancx, Tom; Ryckebusch, Jan

    2015-01-01

    A complete set for pseudoscalar-meson photoproduction is a minimum set of observables from which one can determine the underlying reaction amplitudes unambiguously. The complete sets considered in this work involve single- and double-polarization observables. It is argued that for extracting amplitudes from data, the transversity representation of the reaction amplitudes offers advantages over alternate representations. It is shown that with the available single-polarization data for the p(γ,K + )Λ reaction, the energy and angular dependence of the moduli of the normalized transversity amplitudes in the resonance region can be determined to a fair accuracy. Determining the relative phases of the amplitudes from double-polarization observables is far less evident. (paper)

  7. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest land vegetation type and plays an irreplaceable role with its unique value. At the landscape scale, research on forest landscape pattern has become a current hot spot, and within it the study of forest canopy structure is very important. Canopy structure determines the processes and the strength of forest energy flows, which to some extent influence the adjustment of the ecosystem to climate and species diversity. The extraction of the factors influencing canopy structure and the analysis of the vegetation distribution pattern are therefore especially important. To address these problems, remote sensing technology, which is superior to other technical means because of its good timeliness and large-scale monitoring capability, is applied to the study. Taking Lingkong Mountain as the study area, the paper uses remote sensing imagery to analyze the forest distribution pattern and obtain the spatial characteristics of the canopy structure distribution, and DEM data are used as the basic data to extract the factors influencing canopy structure. In this paper, the pattern of tree distribution is further analyzed by using terrain parameters, spatial analysis tools and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and the corresponding algorithms are applied to determine surface water flow paths, the river network and basin boundaries. Results show that the forest vegetation distribution of the dominant tree species presents a patchy pattern at the landscape scale and that its distribution has spatial heterogeneity which is closely related to terrain factors. After the overlay analysis of aspect, slope and forest distribution pattern respectively, the most suitable area for stand growth and the better living conditions are obtained.

  8. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information and aid in the organizing and integrating of it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information extraction. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  9. The Application of Chinese High-Spatial Remote Sensing Satellite Image in Land Law Enforcement Information Extraction

    Science.gov (United States)

    Wang, N.; Yang, R.

    2018-04-01

    Chinese high-resolution (HR) remote sensing satellites have made a huge leap in the past decade. Commercial satellite datasets, such as GF-1, GF-2 and ZY-3 images, whose panchromatic image (PAN) resolutions are 2 m, 1 m and 2.1 m and whose multispectral image (MS) resolutions are 8 m, 4 m and 5.8 m respectively, have emerged in recent years. Chinese HR satellite imagery has been made freely downloadable for public welfare purposes. Local governments began to employ more professional technicians to improve traditional land management technology. This paper focuses on analysing the actual requirements of the applications in government land law enforcement in Guangxi Autonomous Region. 66 counties in Guangxi Autonomous Region were selected for illegal land utilization spot extraction with fused Chinese HR images. The procedure contains: A. Definition of illegal land utilization spot types. B. Data collection: GF-1, GF-2 and ZY-3 datasets were acquired in the first half of 2016 and other auxiliary data were collected in 2015. C. Batch processing: HR images were collected for batch preprocessing through the ENVI/IDL tool. D. Illegal land utilization spot extraction by visual interpretation. E. Obtaining attribute data with the ArcGIS Geoprocessor (GP) model. F. Thematic mapping and surveying. Through analysing the results of 42 counties, law enforcement officials found 1092 illegal land using spots and 16 suspicious illegal mining spots. The results show that Chinese HR satellite images have great potential for feature information extraction and the processing procedure appears robust.

  10. Implementation of generalized quantum measurements: Superadditive quantum coding, accessible information extraction, and classical capacity limit

    International Nuclear Information System (INIS)

    Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun; Sasaki, Masahide

    2004-01-01

    Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques

  11. Extraction of basic roadway information for non-state roads in Florida : [summary].

    Science.gov (United States)

    2015-07-01

    The Florida Department of Transportation (FDOT) maintains a map of all the roads in Florida, containing over one and a half million road links. For planning purposes, a wide variety of information, such as stop lights, signage, lane number, and s...

  12. Extracting additional risk managers information from a risk assessment of Listeria monocytogenes in deli meats

    NARCIS (Netherlands)

    Pérez-Rodríguez, F.; Asselt, van E.D.; García-Gimeno, R.M.; Zurera, G.; Zwietering, M.H.

    2007-01-01

    The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and

  13. Synthetic aperture radar ship discrimination, generation and latent variable extraction using information maximizing generative adversarial networks

    CSIR Research Space (South Africa)

    Schwegmann, Colin P

    2017-07-01

    Full Text Available such as Synthetic Aperture Radar imagery. To aid in the creation of improved machine learning-based ship detection and discrimination methods, this paper applies a type of neural network known as an Information Maximizing Generative Adversarial Network. Generative...

  14. You had me at "Hello": Rapid extraction of dialect information from spoken words.

    Science.gov (United States)

    Scharinger, Mathias; Monahan, Philip J; Idsardi, William J

    2011-06-15

    Research on the neuronal underpinnings of speaker identity recognition has identified voice-selective areas in the human brain with evolutionary homologues in non-human primates who have comparable areas for processing species-specific calls. Most studies have focused on estimating the extent and location of these areas. In contrast, relatively few experiments have investigated the time-course of speaker identity, and in particular, dialect processing and identification by electro- or neuromagnetic means. We show here that dialect extraction occurs speaker-independently, pre-attentively and categorically. We used Standard American English and African-American English exemplars of 'Hello' in a magnetoencephalographic (MEG) Mismatch Negativity (MMN) experiment. The MMN as an automatic change detection response of the brain reflected dialect differences that were not entirely reducible to acoustic differences between the pronunciations of 'Hello'. Source analyses of the M100, an auditory evoked response to the vowels suggested additional processing in voice-selective areas whenever a dialect change was detected. These findings are not only relevant for the cognitive neuroscience of language, but also for the social sciences concerned with dialect and race perception. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. Extraction of indirectly captured information for use in a comparison of offline pH measurement technologies.

    Science.gov (United States)

    Ritchie, Elspeth K; Martin, Elaine B; Racher, Andy; Jaques, Colin

    2017-06-10

    Understanding the causes of discrepancies in pH readings of a sample can allow more robust pH control strategies to be implemented. It was found that 59.4% of differences between two offline pH measurement technologies for an historical dataset lay outside an expected instrument error range of ±0.02 pH. A new variable, Osmo_Res, was created using multiple linear regression (MLR) to extract information indirectly captured in the recorded measurements for osmolality. Principal component analysis and time series analysis were used to validate the expansion of the historical dataset with the new variable Osmo_Res. MLR was used to identify variables strongly correlated (p<0.05) with differences in pH readings by the two offline pH measurement technologies. These included concentrations of specific chemicals (e.g. glucose) and Osmo_Res, indicating culture medium and bolus feed additions as possible causes of discrepancies between the offline pH measurement technologies. Temperature was also identified as statistically significant. It is suggested that this was a result of differences in pH-temperature compensations employed by the pH measurement technologies. In summary, a method for extracting indirectly captured information has been demonstrated, and it has been shown that competing pH measurement technologies were not necessarily interchangeable at the desired level of control (±0.02 pH). Copyright © 2017 Elsevier B.V. All rights reserved.
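
    The abstract describes Osmo_Res as the part of the osmolality signal not explained by other recorded measurements; a minimal sketch of deriving such a residual variable with ordinary least squares is shown below (the file and column names are hypothetical; pandas and scikit-learn assumed).

```python
# Minimal sketch of deriving a residual-style variable such as Osmo_Res.
# The explanatory columns are placeholders, not the paper's actual variables.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("culture_history.csv")          # hypothetical historical dataset
X = df[["glucose", "lactate", "base_added"]]     # assumed recorded measurements
y = df["osmolality"]

model = LinearRegression().fit(X, y)
df["Osmo_Res"] = y - model.predict(X)            # osmolality information not explained by X
```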

  16. Extracting 3d Semantic Information from Video Surveillance System Using Deep Learning

    Science.gov (United States)

    Zhang, J. S.; Cao, J.; Mao, B.; Shen, D. Q.

    2018-04-01

    At present, intelligent video analysis technology is widely used in various fields. Object tracking is an important part of intelligent video surveillance, but traditional target tracking based on the pixel coordinate system of the image still has some unavoidable problems: pixel-based tracking cannot reflect the real position of targets, and it is difficult to track objects across scenes. Based on an analysis of Zhengyou Zhang's camera calibration method, this paper presents a method of target tracking in the target's space coordinate system, obtained by converting the target's 2-D pixel coordinates into 3-D coordinates. The experimental results show that our method restores the real position change information of targets well and accurately recovers the trajectory of the target in space.
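
    The abstract does not give the exact conversion used; one common way to go from a pixel to a 3-D ground position, when the intrinsics K and extrinsics R, t are known from a Zhang-style calibration and the target is assumed to lie on the ground plane Z = 0, is the plane homography sketched below (numpy assumed).

```python
# Hedged sketch: map an image pixel to a ground-plane (Z = 0) world coordinate
# using intrinsics K and extrinsics R, t obtained elsewhere (e.g. cv2.calibrateCamera).
import numpy as np

def pixel_to_ground(u, v, K, R, t):
    # Homography from the world ground plane (Z = 0) to the image: H = K [r1 r2 t]
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    Xw = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return Xw[:2] / Xw[2]     # (X, Y) position on the ground plane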

  17. Information extracting and processing with diffraction enhanced imaging of X-ray

    International Nuclear Information System (INIS)

    Chen Bo; Chinese Academy of Science, Beijing; Chen Chunchong; Jiang Fan; Chen Jie; Ming Hai; Shu Hang; Zhu Peiping; Wang Junyue; Yuan Qingxi; Wu Ziyu

    2006-01-01

    X-ray imaging at high energies has been used for many years in many fields. Conventional X-ray imaging is based on differences in absorption within a sample, so it is difficult to distinguish different tissues of a biological sample because of their small difference in absorption. The authors used the diffraction enhanced imaging (DEI) method to take images of absorption, extinction, scattering and refraction, and finally presented high-resolution pictures with all this information combined. (authors)

  18. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  19. What do professional forecasters' stock market expectations tell us about herding, information extraction and beauty contests?

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schmeling, M.; Schrimpf, A.

    2013-01-01

    We study how professional forecasters form equity market expectations based on a new micro-level dataset which includes rich cross-sectional information about individual characteristics. We focus on testing whether agents rely on the beliefs of others, i.e., consensus expectations, when forming...... their own forecast. We find strong evidence that the average of all forecasters' beliefs influences an individual's own forecast. This effect is stronger for young and less experienced forecasters as well as forecasters whose pay depends more on performance relative to a benchmark. Further tests indicate...

  20. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.

  1. A method to extract quantitative information in analyzer-based x-ray phase contrast imaging

    International Nuclear Information System (INIS)

    Pagot, E.; Cloetens, P.; Fiedler, S.; Bravin, A.; Coan, P.; Baruchel, J.; Haertwig, J.; Thomlinson, W.

    2003-01-01

    Analyzer-based imaging is a powerful phase-sensitive technique that generates improved contrast compared to standard absorption radiography. Combining numerically two images taken on either side at ±1/2 of the full width at half-maximum (FWHM) of the rocking curve provides images of 'pure refraction' and of 'apparent absorption'. In this study, a similar approach is made by combining symmetrical images with respect to the peak of the analyzer rocking curve but at general positions, ±α·FWHM. These two approaches do not consider the ultrasmall angle scattering produced by the object independently, which can lead to inconsistent results. An accurate way to separately retrieve the quantitative information intrinsic to the object is proposed. It is based on a statistical analysis of the local rocking curve, and allows one to overcome the problems encountered using the previous approaches
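
    For context, the standard two-image DEI combination that this work generalizes models each measured intensity, to first order in the refraction angle, as the object transmission times the rocking curve evaluated at the working point; a sketch of that linearization (sign conventions vary between papers) is:

```latex
% Linearized DEI model at the two symmetric working points \theta_{L,H} = \theta_B \mp \tfrac{1}{2}\,\mathrm{FWHM}:
I_{L} \approx I_R \left[ R(\theta_L) + \left.\frac{dR}{d\theta}\right|_{\theta_L} \Delta\theta_z \right], \qquad
I_{H} \approx I_R \left[ R(\theta_H) + \left.\frac{dR}{d\theta}\right|_{\theta_H} \Delta\theta_z \right]
% Solving this 2x2 linear system pixel-wise yields the "apparent absorption" image I_R
% and the "pure refraction" image \Delta\theta_z referred to in the abstract.
```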

  2. Breast cancer and quality of life: medical information extraction from health forums.

    Science.gov (United States)

    Opitz, Thomas; Aze, Jérome; Bringay, Sandra; Joutard, Cyrille; Lavergne, Christian; Mollevi, Caroline

    2014-01-01

    Internet health forums are a rich textual resource with content generated through free exchanges among patients and, in certain cases, health professionals. We tackle the problem of retrieving clinically relevant information from such forums, with relevant topics being defined from clinical auto-questionnaires. Texts in forums are largely unstructured and noisy, calling for adapted preprocessing and query methods. We minimize the number of false negatives in queries by using a synonym tool to achieve query expansion of initial topic keywords. To avoid false positives, we propose a new measure based on a statistical comparison of frequent co-occurrences in a large reference corpus (Web) to keep only relevant expansions. Our work is motivated by a study of breast cancer patients' health-related quality of life (QoL). We consider topics defined from a breast-cancer specific QoL-questionnaire. We quantify and structure occurrences in posts of a specialized French forum and outline important future developments.
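
    The abstract does not spell out the co-occurrence statistic used to prune expansions; a pointwise-mutual-information filter against a large reference corpus is one common stand-in, sketched below with invented count inputs.

```python
# Hedged sketch: keep a synonym-based query expansion only if it co-occurs
# strongly enough with the seed keyword in a large reference corpus.
import math

def pmi(pair_count, seed_count, cand_count, n_docs):
    """Pointwise mutual information estimated from document counts."""
    if pair_count == 0:
        return float("-inf")
    return math.log((pair_count / n_docs) / ((seed_count / n_docs) * (cand_count / n_docs)))

def keep_expansion(seed, cand, pair_counts, term_counts, n_docs, threshold=1.0):
    score = pmi(pair_counts.get((seed, cand), 0), term_counts[seed], term_counts[cand], n_docs)
    return score >= threshold
```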

  3. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in this field of research. The majority of benthic cover mapping makes use of satellite images and, usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method to underwater photos, but made use of different classification methods such as neural networks and rapid classification via down-sampling. In this study, an attempt was made to use accurate bathymetric data obtained using a multi-beam echo sounder (MBES) as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies corrections to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even in the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 and 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had a higher overall accuracy, 93.78±0.85%, compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
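
    The area and texture rules quoted above translate almost directly into a rule set; a toy version is sketched below (the standard-deviation threshold and the fallback class are hypothetical, and real OBIA rule sets are evaluated per segment inside an OBIA tool).

```python
# Toy version of the area/texture rule set described in the abstract.
# Segment statistics (pixel area, brightness standard deviation) are assumed
# to be pre-computed by an OBIA segmentation step; the texture threshold is invented.
def classify_segment(area_px, brightness_std, texture_threshold=15.0):
    if area_px <= 700 and brightness_std > texture_threshold:
        return "fish"
    if 700 < area_px <= 10000 and brightness_std > texture_threshold:
        return "rubble"
    return "coral_or_sand"   # separated by spectral rules not shown here
```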

  4. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Mineralization-alteration and structure information extraction plays an important role in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Choosing copper and gold mines in Burma as an example, the authors adopt band ratios, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. A digital elevation model (DEM) (30 m spatial resolution) and ETM+ data were used to extract linear and circular faults that are associated with copper and gold mineralization. Combining geological data and the above information, the weights of evidence method and the C-A fractal model were used to integrate the evidence and identify favourable ore-forming zones in this area. The results show that the high-grade potential targets coincide with the known copper and gold deposits, and the integrated information can be used in the next stage of exploration and mineral resource decision-making
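
    A minimal sketch of the band-ratio part of such hydroxyl-alteration mapping is shown below; the 5/7 ratio is the conventional ETM+ choice for OH-bearing minerals, while the anomaly threshold (mean plus two standard deviations) is an assumption of this sketch, and the PCA step is omitted.

```python
# Band-ratio sketch for hydroxyl alteration (numpy assumed; threshold is an assumption).
import numpy as np

def hydroxyl_anomaly(band5, band7, k=2.0):
    """band5, band7: 2-D reflectance arrays from ETM+ bands 5 and 7."""
    ratio = band5 / np.maximum(band7, 1e-6)      # OH-bearing minerals absorb in band 7
    threshold = np.nanmean(ratio) + k * np.nanstd(ratio)
    return ratio > threshold                      # boolean anomaly mask
```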

  5. State of the soft bottoms of the continental shelf

    International Nuclear Information System (INIS)

    Guzman Alvis, Angela I; Solano, Oscar David

    2002-01-01

    The information presented is based on studies carried out on the continental shelf of the Colombian Caribbean over the last ten years, mainly in the Gulf of Morrosquillo and the Magdalena and Guajira departments. A diagnosis of the state of the soft bottoms of the Colombian continental shelf is given

  6. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track.

    Science.gov (United States)

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboard, that incorporates text mining techniques to support the biocurator in the generation of BEL networks. The underlying UIMA-based text mining pipeline (BELIEF Pipeline) uses several named entity recognition processes and relationship extraction methods to detect concepts and BEL relationships in literature. The BELIEF Dashboard allows easy curation of the automatically generated BEL statements and their context annotations. Resulting BEL statements and their context annotations can be syntactically and semantically verified to ensure consistency in the BEL network. In summary, the workflow supports experts in different stages of systems biology network building. Based on the BioCreative V BEL track evaluation, we show that the BELIEF Pipeline automatically extracts relationships with an F-score of 36.4% and fully correct statements can be obtained with an F-score of 30.8%. Participation in the BioCreative V Interactive task (IAT) track with BELIEF revealed a systems usability scale (SUS) of 67. Considering the complexity of the task for new users (learning BEL, working with a completely new interface, and performing complex curation), a score so close to the overall SUS average highlights the usability of BELIEF. Database URL: BELIEF is available at http://www.scaiview.com/belief/. © The Author(s) 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Analysis of Peach Bottom turbine trip tests

    International Nuclear Information System (INIS)

    Cheng, H.S.; Lu, M.S.; Hsu, C.J.; Shier, W.G.; Diamond, D.J.; Levine, M.M.; Odar, F.

    1979-01-01

    Current interest in the analysis of turbine trip transients has been generated by the recent tests performed at the Peach Bottom (Unit 2) reactor. Three tests, simulating turbine trip transients, were performed at different initial power and coolant flow conditions. The data from these tests provide considerable information to aid qualification of computer codes that are currently used in BWR design analysis. The results of an analysis of a turbine trip transient using the RELAP-3B and BNL-TWIGL computer codes are presented. Specific results are provided comparing the calculated reactor power and system pressures with the test data. Excellent agreement for all three test transients is evident from the comparisons

  8. Citizen-Centric Urban Planning through Extracting Emotion Information from Twitter in an Interdisciplinary Space-Time-Linguistics Algorithm

    Directory of Open Access Journals (Sweden)

    Bernd Resch

    2016-07-01

    Full Text Available Traditional urban planning processes typically happen in offices and behind desks. Modern types of civic participation can enhance those processes by acquiring citizens’ ideas and feedback in participatory sensing approaches like “People as Sensors”. As such, citizen-centric planning can be achieved by analysing Volunteered Geographic Information (VGI) data such as Twitter tweets and posts from other social media channels. These user-generated data comprise several information dimensions, such as spatial and temporal information, and textual content. However, in previous research, these dimensions were generally examined separately in single-disciplinary approaches, which does not allow for holistic conclusions in urban planning. This paper introduces TwEmLab, an interdisciplinary approach towards extracting citizens’ emotions in different locations within a city. More concretely, we analyse tweets in three dimensions (space, time, and linguistics), based on similarities between each pair of tweets as defined by a specific set of functional relationships in each dimension. We use a graph-based semi-supervised learning algorithm to classify the data into discrete emotions (happiness, sadness, fear, anger/disgust, none). Our proposed solution allows tweets to be classified into emotion classes in a multi-parametric approach. Additionally, we created a manually annotated gold standard that can be used to evaluate TwEmLab’s performance. Our experimental results show that we are able to identify tweets carrying emotions and that our approach bears extensive potential to reveal new insights into citizens’ perceptions of the city.
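
    The graph-based semi-supervised step is not detailed in the abstract; as a rough stand-in, the sketch below propagates a handful of seeded emotion labels over a similarity graph with scikit-learn's LabelSpreading (the feature matrix combining the spatial, temporal and linguistic similarities is assumed to exist already, and the file names are hypothetical).

```python
# Hedged sketch of graph-based semi-supervised emotion labelling.
import numpy as np
from sklearn.semi_supervised import LabelSpreading

X = np.load("tweet_features.npy")   # assumed space/time/text similarity features per tweet
y = np.load("tweet_labels.npy")     # 0..4 = seeded emotion classes, -1 = unlabelled

model = LabelSpreading(kernel="knn", n_neighbors=10)
model.fit(X, y)
emotions = model.transduction_      # propagated emotion class for every tweet
```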

  9. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    Science.gov (United States)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of 'Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data in support of scientists, policy-makers, and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections, allowing data discovery, visualization, combination, processing and download. It implements a "virtual datacube" approach where data stored in distributed data centers are made available via standardized OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command line interfaces can be used to access and manipulate the data. The platform can be fully customized to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurances in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  10. A METHOD OF EXTRACTING SHORELINE BASED ON SEMANTIC INFORMATION USING DUAL-LENGTH LiDAR DATA

    Directory of Open Access Journals (Sweden)

    C. Yao

    2017-09-01

    Full Text Available The shoreline is a spatially varying separation between water and land. By utilizing dual-wavelength LiDAR point data together with the semantic information that the shoreline often appears beyond the water surface profile and is observable on the beach, the paper generates the shoreline as follows. (1) Gain the water surface profile: first we obtain the water surface by roughly selecting water points based on several features of the water body, then apply a least-squares fitting method to get the whole water trend surface. We then get the ground surface connecting to the under-water surface by both a TIN progressive filtering method and a surface interpolation method, and intersect the two fitted surfaces to get the water surface profile of the island. (2) Gain the sandy beach: we grid all points and select the water surface profile grid points as seeds, then extract sandy beach points based on an eight-neighborhood method and features, giving all sandy beaches. (3) Get the island shoreline: first we get the sandy beach shoreline based on intensity information, using a threshold value to distinguish wet and dry areas, which gives the shoreline of several sandy beaches. To some extent, the shoreline has the same height values within a small area, so all the sandy shoreline points are used to fit a plane P, and the intersection line of the ground surface and the shoreline plane P can be regarded as the island shoreline. Comparison with the surveyed shoreline shows that the proposed method can successfully extract the shoreline.
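
    The plane P in step (3) can be fitted by ordinary least squares; a minimal sketch (numpy assumed, input array invented) is:

```python
# Least-squares fit of the shoreline plane z = a*x + b*y + c through N shoreline points.
import numpy as np

def fit_plane(points):
    """points: (N, 3) array of shoreline point coordinates (x, y, z)."""
    A = np.column_stack((points[:, 0], points[:, 1], np.ones(len(points))))
    (a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return a, b, c
```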

  11. An audit of the reliability of influenza vaccination and medical information extracted from eHealth records in general practice.

    Science.gov (United States)

    Regan, Annette K; Gibbs, Robyn A; Effler, Paul V

    2018-05-31

    To evaluate the reliability of information in general practice (GP) electronic health records (EHRs), 2100 adult patients were randomly selected for interview regarding the presence of specific medical conditions and recent influenza vaccination. Agreement between self-report and data extracted from EHRs was compared using Cohen's kappa coefficient (k) and interpreted in accordance with Altman's Kappa Benchmarking criteria; 377 (18%) patients declined participation, and 608 (29%) could not be contacted. Of 1115 (53%) remaining, 856 (77%) were active patients (≥3 visits to the GP practice in the last two years) who provided complete information for analysis. Although a higher proportion of patients self-reported being vaccinated or having a medical condition compared to the EHR (50.7% vs 36.9%, and 39.4% vs 30.3%, respectively), there was "good" agreement between self-report and EHR for both vaccination status (κ = 0.67) and medical conditions (κ = 0.66). These findings suggest EHR may be useful for public health surveillance. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.

  12. Cascadia Initiative Ocean Bottom Seismograph Performance

    Science.gov (United States)

    Evers, B.; Aderhold, K.

    2017-12-01

    The Ocean Bottom Seismograph Instrument Pool (OBSIP) provided instrumentation and operations support for the Cascadia Initiative community experiment. This experiment investigated geophysical processes across the Cascadia subduction zone through a combination of onshore and offshore seismic data. The recovery of Year 4 instruments in September 2015 marked the conclusion of a multi-year experiment that utilized 60 ocean-bottom seismographs (OBSs) specifically designed for the subduction zone boundary, including shallow/deep water deployments and active fisheries. The new instruments featured trawl-resistant enclosures designed by Lamont-Doherty Earth Observatory (LDEO) and Scripps Institution of Oceanography (SIO) for shallow deployment [water depth ≤ 500 m], as well as new deep-water instruments designed by Woods Hole Oceanographic Institute (WHOI). Existing OBSIP instruments were also deployed along the Blanco Transform Fault and on the Gorda Plate through complementary experiments. Station instrumentation included weak and strong motion seismometers, differential pressure gauges (DPG) and absolute pressure gauges (APG). All data collected from the Cascadia, Blanco, and Gorda deployments is available through the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC). The Cascadia Initiative is the largest amphibious seismic experiment undertaken to date, encompassing a diverse technical implementation and demonstrating an effective structure for community experiments. Thus, the results from Cascadia serve as both a technical and operational resource for the development of future community experiments, such as might be contemplated as part of the SZ4D Initiative. To guide future efforts, we investigate and summarize the quality of the Cascadia OBS data using basic metrics such as instrument recovery and more advanced metrics such as noise characteristics through power spectral density analysis. We also use this broad and diverse

  13. 76 FR 19125 - Bottom Mount Combination Refrigerator-Freezers From Korea and Mexico

    Science.gov (United States)

    2011-04-06

    ...)] Bottom Mount Combination Refrigerator-Freezers From Korea and Mexico AGENCY: United States International... bottom mount combination refrigerator-freezers from Korea and Mexico, provided for in subheadings 8418.10... five business days thereafter, or by May 23, 2011. For further information concerning the conduct of...

  14. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are known as a global ecological problem but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber numbers have been found to be underestimated with this approach. We propose modifications to these methods that allow us to analyze microplastics in bottom sediments, including small fibers. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported in neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Perceptual learning: top to bottom.

    Science.gov (United States)

    Amitay, Sygal; Zhang, Yu-Xuan; Jones, Pete R; Moore, David R

    2014-06-01

    Perceptual learning has traditionally been portrayed as a bottom-up phenomenon that improves encoding or decoding of the trained stimulus. Cognitive skills such as attention and memory are thought to drive, guide and modulate learning but are, with notable exceptions, not generally considered to undergo changes themselves as a result of training with simple perceptual tasks. Moreover, shifts in threshold are interpreted as shifts in perceptual sensitivity, with no consideration for non-sensory factors (such as response bias) that may contribute to these changes. Accumulating evidence from our own research and others shows that perceptual learning is a conglomeration of effects, with training-induced changes ranging from the lowest (noise reduction in the phase locking of auditory signals) to the highest (working memory capacity) level of processing, and includes contributions from non-sensory factors that affect decision making even on a "simple" auditory task such as frequency discrimination. We discuss our emerging view of learning as a process that increases the signal-to-noise ratio associated with perceptual tasks by tackling noise sources and inefficiencies that cause performance bottlenecks, and present some implications for training populations other than young, smart, attentive and highly-motivated college students. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  16. Status analysis of keyhole bottom in laser-MAG hybrid welding process.

    Science.gov (United States)

    Wang, Lin; Gao, Xiangdong; Chen, Ziqin

    2018-01-08

    The keyhole status is a determining factor of weld quality in the laser-metal active gas arc (MAG) hybrid welding process. For a better evaluation of the hybrid welding process, three different penetration welding experiments were conducted in this work: partial penetration, normal penetration (or full penetration), and excessive penetration. The instantaneous visual phenomena, including metallic vapor, spatter and the keyhole on the bottom surface, were captured by a double high-speed camera system and used to evaluate the keyhole status. The Fourier transform was applied to the bottom weld pool image to remove the image noise around the keyhole, and the bottom weld pool image was then reconstructed through the inverse Fourier transform. Lastly, the keyhole bottom was extracted from the de-noised bottom weld pool image. By analyzing the visual features of the laser-MAG hybrid welding process, the mechanisms of the closed and opened keyhole bottom were revealed. The results show that the stable opened or closed status of the keyhole bottom is directly affected by the MAG droplet transition in the normal penetration welding process, while an unstable opened or closed keyhole bottom appears in excessive penetration and partial penetration welding. The analysis method proposed in this paper could be used to monitor keyhole stability in the laser-MAG hybrid welding process.
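
    As a rough illustration of the Fourier-domain de-noising and reconstruction step described above, the sketch below applies a circular low-pass mask in the frequency domain and inverts the transform (numpy assumed; the cutoff radius is a placeholder, not the paper's value).

```python
# Hedged sketch: low-pass filter an image in the Fourier domain and reconstruct it.
import numpy as np

def fft_lowpass(image, cutoff=30):
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    mask = (y - rows / 2) ** 2 + (x - cols / 2) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
```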

  17. Bottom Dissolved Oxygen Maps From SEAMAP Summer and Fall Groundfish/Shrimp Surveys from 1982 to 1998 (NCEI Accession 0155488)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Bottom dissolved oxygen (DO) data was extracted from environmental profiles acquired during the Southeast Fisheries Science Center Mississippi Laboratories summer...

  18. Achievement report for fiscal 2000 on New Sunshine Project aiding program. Development of hot water utilizing power generation plant (Development of binary cycle power plant - development of system to detect well bottom information during geothermal well drilling); 2000 nendo nessui riyo hatsuden plant to kaihatsu seika hokokusho. Binary cycle hatsuden plant no kaihatsu (Chinetsusei kussakuji koutei joho kenchi system no kaihatsu)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    R and D has been performed on a system to detect well bottom information during geothermal well drilling (MWD) so that well bottom information can be identified on a real-time basis while drilling. This paper summarizes the achievements in fiscal 2000. The device measures the following items with good accuracy at mud water temperatures of 200 degrees C and transmits them to the ground surface during geothermal well drilling: azimuth, inclination, tool face, bit load, bit torque, temperatures in the device, downhole temperature, and downhole pressure. In the current fiscal year the sonde was improved, including a reduction of the sonde length, electric power conservation, enhanced noise immunity, and enhanced operability. For the sonde performance evaluation, high-temperature, long-distance loop, and vibration tests were carried out. In addition, the experiment analysis program (for noise processing) was improved. With regard to the well trajectory control aiding system and the well evaluation aiding system, an operation manual entitled the 'MWD analyzing system' was prepared. Unification of the hardware of the ground surface detection system and the analyzing system was attempted. (NEDO)

  19. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    The exchange of patient health information across different organizations involved in healthcare delivery has potential benefits for a wide range of stakeholders. However, many governments in Europe and in the U.S. have, despite both top-down and bottom-up initiatives, experienced major barriers...... in achieving sustainable models for implementing health information exchange (HIE) throughout their healthcare systems. In the case of the U.S., three years after stimulus funding allocated as part of the 2009 HITECH Act, the extent to which government funding will be needed to sustain health information...... organizations (HIOs) that facilitate HIE across regional stakeholders remains an unanswered question. This research investigates the impacts of top-down and bottom-up initiatives on the evolutionary paths of HIOs in two contingent states in the U.S. (New Jersey and New York) which had different starting...

  20. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Science.gov (United States)

    Braun, Jean; Gemignani, Lorenzo; van der Beek, Peter

    2018-03-01

    One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on an a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or an a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo-Siang-Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo-Siang-Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of any given catchment to the

  1. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Directory of Open Access Journals (Sweden)

    J. Braun

    2018-03-01

    Full Text Available One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on an a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or an a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo–Siang–Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo–Siang–Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of
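
    The core idea, that each sub-catchment contributes to the downstream detrital sample roughly in proportion to its area times its erosion rate, can be written as a simple mixture forward model; the toy sketch below (numpy assumed, all inputs invented) shows the kind of relation such an inversion fits to the observed ages.

```python
# Toy forward model: detrital age distribution at a river sample as a mixture of
# sub-catchment age distributions weighted by area x (relative) erosion rate.
import numpy as np

def predicted_age_pdf(catchment_pdfs, areas, erosion_rates):
    """catchment_pdfs: (n_catchments, n_age_bins) in-situ age distributions (each sums to 1)."""
    weights = np.asarray(areas, float) * np.asarray(erosion_rates, float)
    weights /= weights.sum()
    return weights @ np.asarray(catchment_pdfs)   # mixture over common age bins
```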

  2. Wet physical separation of MSWI bottom ash

    NARCIS (Netherlands)

    Muchova, L.

    2010-01-01

    Bottom ash (BA) from municipal solid waste incineration (MSWI) has high potential for the recovery of valuable secondary materials. For example, the MSWI bottom ash produced by the incinerator at Amsterdam contains materials such as non-ferrous metals (2.3%), ferrous metals (8-13%), gold (0.4 ppm),

  3. CHARACTERISTICS OF SLUDGE BOTTOM MESH

    Directory of Open Access Journals (Sweden)

    Kamil Szydłowski

    2016-05-01

    Full Text Available The main aim of the study was to assess the pollution of the bottom sediments of small water bodies with different catchment management by selected heavy metals. Two ponds located in Mostkowo village were chosen for investigation. The first small water reservoir is surrounded by cereal fields cultivated without the use of organic and mineral fertilizers (NPK). The second reservoir is located in a park near rural buildings. Sediment samples were collected using a KC Denmark sediment core probe. Samples were taken from 4 layers of sediment, from depths of 0–5, 5–10, 10–20 and 20–30 cm. Sampling was carried out once, during the winter period of 2014 when ice occurred on the surface of the small water bodies, from three points. The material was prepared for further analysis according to procedures used in soil science. The contents of heavy metals (Cd, Cr, Cu, Ni, Pb and Zn) were determined by atomic absorption spectrometry (ASA ICE 3000, Thermo Scientific) after prior digestion in a mixture (5:1) of concentrated acids (HNO3 and HClO4). Higher pH values were characteristic of the sediments of the pond located in the park than of the pond located within the agricultural fields. In both small water bodies the highest heavy metal concentrations occurred at the deepest sampling points. In the sediments of the pond located within the crop fields the highest concentrations of cadmium, copper, lead and zinc were observed in the 0–5 cm layer, whereas nickel and chromium were highest in the 20–30 cm layer. In the sediments of the pond located in the park the highest values occurred at the deepest sampling point, in the layer taken from 10–20 cm. Sediments from the second reservoir were characterized by the largest average concentrations of heavy metals, except for the lead content in the sediment from the 10–20 cm layer. According to the geochemical evaluation of sediments proposed by Bojakowska and Sokołowska [1998], the majority of samples belong to Ist

  4. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked because only information that is attended to can be incorporated in the decision process. Little is known, however, about the extent to which bottom-up processes of attention affect stimulus selection and therefore...... the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue, and what does that mean for the choice outcome? To address this, we conducted a combined eye tracking and choice experiment in a consumer choice setting with visual shelf simulations...... salient. The observed effect on attention also carries over into increased choice likelihood. From these results, we conclude that even small changes in the choice set capture attention based on bottom-up processes. Also for eye tracking studies in other domains (e.g. search tasks) this means that stimulus

  5. Evaluation of the bottom water reservoir VAPEX process

    Energy Technology Data Exchange (ETDEWEB)

    Frauenfeld, T.W.J.; Jossy, C.; Kissel, G.A. [Alberta Research Council, Devon, AB (Canada); Rispler, K. [Saskatchewan Research Council, Saskatoon, SK (Canada)

    2004-07-01

    The mobilization of viscous heavy oil requires the dissolution of solvent vapour into the oil as well as the diffusion of the dissolved solvent into the virgin oil. Vapour extraction (VAPEX) is an enhanced oil recovery (EOR) process which involves injecting a solvent into the reservoir to reduce the viscosity of hydrocarbons. This paper describes the contribution of the Alberta Research Council to solvent-assisted oil recovery technology. The bottom water process was also modelled to determine its feasibility for a field-scale oil recovery scheme. Several experiments were conducted in an acrylic visual model in which Pujol and Boberg scaling were used to produce a lab model scaling a field process. The model simulated a slice of a 30 metre thick reservoir, with a 10 metre thick bottom water zone, containing two horizontal wells (25 metres apart) at the oil water interface. The experimental rates were found to be negatively affected by continuous low permeability layers and by oil with an initial gas content. In order to achieve commercial oil recovery rates, the bottom water process must be used to increase the surface area exposed to solvents. A large oil water interface between the wells provides contact for solvent when injecting gas at the interface. High production rates are therefore possible with appropriate well spacing. 11 refs., 4 tabs., 16 figs.

  6. Summary of core damage frequency from internal initiators: Peach Bottom

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Cathey, N.

    1986-01-01

    Probabilistic risk assessments (PRA) based on internal initiators are being conducted on a number of reference plants in order to provide the Nuclear Regulatory Commission (NRC) with updated information about light water reactor risk. The results of these analyses will be used by the NRC to prepare NUREG-1150 which will examine the NRC's current perception of risk. Peach Bottom has been chosen as one of the reference plants

  7. Trace elements distribution in bottom sediments from Amazon River estuary

    International Nuclear Information System (INIS)

    Lara, L.B.L.S.; Nadai Fernandes, E. de; Oliveira, H. de; Bacchi, M.A.

    1994-01-01

    The Amazon River discharges into a dynamic marine environment where many interactive processes affect dissolved and particulate solids, either those settling on the shelf or those reaching the ocean. Trace element concentrations, especially of the rare earth elements, have been determined by neutron activation analysis in sixty bottom sediment samples from the Amazon River estuary, providing information for the study of the spatial and temporal variation of those elements. (author). 16 refs, 6 figs, 3 tabs

  8. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    OpenAIRE

    J. Sharmila; A. Subramani

    2016-01-01

    Research related to web mining is becoming more important these days because a large amount of information is managed through the web. Web utilization is expanding in an uncontrolled way, and a particular framework is required for managing such an extensive amount of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong has proposed a web content mining methodolog...

  9. An analytical framework for extracting hydrological information from time series of small reservoirs in a semi-arid region

    Science.gov (United States)

    Annor, Frank; van de Giesen, Nick; Bogaard, Thom; Eilander, Dirk

    2013-04-01

    small reservoirs in the Upper East Region of Ghana. Reservoirs without obvious large seepage losses (field survey) were selected. To verify this, stable water isotope samples were collected from groundwater upstream and downstream of the reservoir. By looking at possible enrichment of the downstream groundwater, a good estimate of seepage can be made in addition to estimates of evaporation. We estimated the evaporative losses and compared those with field measurements using eddy correlation measurements. Lastly, we determined the cumulative surface runoff curves for the small reservoirs. We will present this analytical framework for extracting hydrological information from time series of small reservoirs and show the first results for our study region of northern Ghana.

  10. Bottom Scour Observed Under Hurricane Ivan

    National Research Council Canada - National Science Library

    Teague, William J; Jarosz, Eva; Keen, Timothy R; Wang, David W; Hulbert, Mark S

    2006-01-01

    Observations that extensive bottom scour along the outer continental shelf under Hurricane Ivan resulted in the displacement of more than 100 million cubic meters of sediment from a 35x15 km region...

  11. Bottom production asymmetries at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Norrbin, E.; Vogt, R.

    1999-01-01

    We present results on bottom hadron production asymmetries at the LHC within both the Lund string fragmentation model and the intrinsic bottom model. The main aspects of the models are summarized and specific predictions for pp collisions at 14 TeV are given. Asymmetries are found to be very small at central rapidities increasing to a few percent at forward rapidities. At very large rapidities intrinsic production could dominate but this region is probably out of reach of any experiment.

  12. Bottom production asymmetries at the LHC

    International Nuclear Information System (INIS)

    Norrbin, E.; Vogt, R.

    1999-01-01

    We present results on bottom hadron production asymmetries at the LHC within both the Lund string fragmentation model and the intrinsic bottom model. The main aspects of the models are summarized and specific predictions for pp collisions at 14 TeV are given. Asymmetries are found to be very small at central rapidities increasing to a few percent at forward rapidities. At very large rapidities intrinsic production could dominate but this region is probably out of reach of any experiment

  13. Multi-angle backscatter classification and sub-bottom profiling for improved seafloor characterization

    Science.gov (United States)

    Alevizos, Evangelos; Snellen, Mirjam; Simons, Dick; Siemes, Kerstin; Greinert, Jens

    2018-06-01

    This study applies three classification methods exploiting the angular dependence of acoustic seafloor backscatter along with high resolution sub-bottom profiling for seafloor sediment characterization in the Eckernförde Bay, Baltic Sea Germany. This area is well suited for acoustic backscatter studies due to its shallowness, its smooth bathymetry and the presence of a wide range of sediment types. Backscatter data were acquired using a Seabeam1180 (180 kHz) multibeam echosounder and sub-bottom profiler data were recorded using a SES-2000 parametric sonar transmitting 6 and 12 kHz. The high density of seafloor soundings allowed extracting backscatter layers for five beam angles over a large part of the surveyed area. A Bayesian probability method was employed for sediment classification based on the backscatter variability at a single incidence angle, whereas Maximum Likelihood Classification (MLC) and Principal Components Analysis (PCA) were applied to the multi-angle layers. The Bayesian approach was used for identifying the optimum number of acoustic classes because cluster validation is carried out prior to class assignment and class outputs are ordinal categorical values. The method is based on the principle that backscatter values from a single incidence angle express a normal distribution for a particular sediment type. The resulting Bayesian classes were well correlated to median grain sizes and the percentage of coarse material. The MLC method uses angular response information from five layers of training areas extracted from the Bayesian classification map. The subsequent PCA analysis is based on the transformation of these five layers into two principal components that comprise most of the data variability. These principal components were clustered in five classes after running an external cluster validation test. In general both methods MLC and PCA, separated the various sediment types effectively, showing good agreement (kappa >0.7) with the Bayesian
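
    As a rough illustration of the single-angle Bayesian idea described above (per-class backscatter modelled as a normal distribution, soundings assigned to the most probable class), a minimal sketch is given below; the class means, standard deviations and priors are placeholders, not values from the study (scipy assumed).

```python
# Hedged sketch: assign each backscatter sounding (one incidence angle) to the
# acoustic class with the highest Gaussian likelihood times prior.
import numpy as np
from scipy.stats import norm

def classify_backscatter(bs_values, class_means, class_stds, priors=None):
    mu, sd = np.asarray(class_means), np.asarray(class_stds)
    pri = np.full(len(mu), 1.0 / len(mu)) if priors is None else np.asarray(priors)
    post = norm.pdf(np.asarray(bs_values)[:, None], loc=mu, scale=sd) * pri
    return np.argmax(post, axis=1)   # ordinal acoustic class index per sounding
```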

  14. Company Command: The Bottom Line

    Science.gov (United States)

    1990-01-01

    personal basis. Be direct and don't pull punches. Be sincere and objective. First sergeants, as a group, agree on one rule: good first sergeants make... You must undo the confusion. LACK OF AGREEMENT ON UNIT GOALS AND STANDARDS: First sergeants and COs should decide jointly on the direction for... Keep the interview informal. In fact, not every interview has to be in your office. Pull one of your mechanics aside in the motor pool for a chat

  15. Extraction of wind and temperature information from hybrid 4D-Var assimilation of stratospheric ozone using NAVGEM

    Science.gov (United States)

    Allen, Douglas R.; Hoppel, Karl W.; Kuhl, David D.

    2018-03-01

    Extraction of wind and temperature information from stratospheric ozone assimilation is examined within the context of the Navy Global Environmental Model (NAVGEM) hybrid 4-D variational assimilation (4D-Var) data assimilation (DA) system. Ozone can improve the wind and temperature through two different DA mechanisms: (1) through the flow-of-the-day ensemble background error covariance that is blended together with the static background error covariance and (2) via the ozone continuity equation in the tangent linear model and adjoint used for minimizing the cost function. All experiments assimilate actual conventional data in order to maintain a similar realistic troposphere. In the stratosphere, the experiments assimilate simulated ozone and/or radiance observations in various combinations. The simulated observations are constructed for a case study based on a 16-day cycling truth experiment (TE), which is an analysis with no stratospheric observations. The impact of ozone on the analysis is evaluated by comparing the experiments to the TE for the last 6 days, allowing for a 10-day spin-up. Ozone assimilation benefits the wind and temperature when data are of sufficient quality and frequency. For example, assimilation of perfect (no applied error) global hourly ozone data constrains the stratospheric wind and temperature to within ˜ 2 m s-1 and ˜ 1 K. This demonstrates that there is dynamical information in the ozone distribution that can potentially be used to improve the stratosphere. This is particularly important for the tropics, where radiance observations have difficulty constraining wind due to breakdown of geostrophic balance. Global ozone assimilation provides the largest benefit when the hybrid blending coefficient is an intermediate value (0.5 was used in this study), rather than 0.0 (no ensemble background error covariance) or 1.0 (no static background error covariance), which is consistent with other hybrid DA studies. When perfect global ozone is
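
    As a hedged note on the "hybrid blending coefficient" mentioned above, hybrid DA schemes typically write the background error covariance as a weighted combination of static and ensemble parts, for example

```latex
% Hybrid background error covariance with blending coefficient \alpha
\mathbf{B}_{\mathrm{hybrid}} = (1-\alpha)\,\mathbf{B}_{\mathrm{static}} + \alpha\,\mathbf{B}_{\mathrm{ens}},
\qquad 0 \le \alpha \le 1
```

    so that a coefficient of 0 corresponds to no ensemble contribution, 1 to no static contribution, and the intermediate value reported here (0.5) mixes both; the exact weighting convention used by NAVGEM is not given in the abstract.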

  16. Extraction of wind and temperature information from hybrid 4D-Var assimilation of stratospheric ozone using NAVGEM

    Directory of Open Access Journals (Sweden)

    D. R. Allen

    2018-03-01

    Full Text Available Extraction of wind and temperature information from stratospheric ozone assimilation is examined within the context of the Navy Global Environmental Model (NAVGEM) hybrid 4-D variational assimilation (4D-Var) data assimilation (DA) system. Ozone can improve the wind and temperature through two different DA mechanisms: (1) through the flow-of-the-day ensemble background error covariance that is blended together with the static background error covariance and (2) via the ozone continuity equation in the tangent linear model and adjoint used for minimizing the cost function. All experiments assimilate actual conventional data in order to maintain a similar realistic troposphere. In the stratosphere, the experiments assimilate simulated ozone and/or radiance observations in various combinations. The simulated observations are constructed for a case study based on a 16-day cycling truth experiment (TE), which is an analysis with no stratospheric observations. The impact of ozone on the analysis is evaluated by comparing the experiments to the TE for the last 6 days, allowing for a 10-day spin-up. Ozone assimilation benefits the wind and temperature when data are of sufficient quality and frequency. For example, assimilation of perfect (no applied error) global hourly ozone data constrains the stratospheric wind and temperature to within ∼ 2 m s−1 and ∼ 1 K. This demonstrates that there is dynamical information in the ozone distribution that can potentially be used to improve the stratosphere. This is particularly important for the tropics, where radiance observations have difficulty constraining wind due to breakdown of geostrophic balance. Global ozone assimilation provides the largest benefit when the hybrid blending coefficient is an intermediate value (0.5 was used in this study), rather than 0.0 (no ensemble background error covariance) or 1.0 (no static background error covariance), which is consistent with other hybrid DA studies. When

  17. Extraction and analysis of reducing alteration information of oil-gas in Bashibulake uranium ore district based on ASTER remote sensing data

    International Nuclear Information System (INIS)

    Ye Fawang; Liu Dechang; Zhao Yingjun; Yang Xu

    2008-01-01

    Beginning with an analysis of the spectral characteristics of sandstone affected by oil-gas reducing alteration in the Bashibulake ore district, a technique for extracting reducing-alteration information from ASTER data is presented. Several remote sensing anomaly zones with reducing-alteration signatures similar to those in the uranium deposit are interpreted in the study area. Building on this, the alteration anomalies are further classified by exploiting the multiple SWIR bands of ASTER data, and the geological significance of each class of anomaly is discussed. As a result, alteration anomalies favorable for uranium prospecting are selected, providing important information for uranium exploration in the surroundings of the Bashibulake uranium ore area. (authors)

  18. A simple method to extract information on anisotropy of particle fluxes from spin-modulated counting rates of cosmic ray telescopes

    International Nuclear Information System (INIS)

    Hsieh, K.C.; Lin, Y.C.; Sullivan, J.D.

    1975-01-01

    A simple method to extract information on anisotropy of particle fluxes from data collected by cosmic ray telescopes on spinning spacecraft but without sectored accumulators is presented. Application of this method to specific satellite data demonstrates that it requires no prior assumption on the form of angular distribution of the fluxes; furthermore, self-consistency ensures the validity of the results thus obtained. The examples show perfect agreement with the corresponding magnetic field directions
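
    As a hedged illustration of the general idea, not the authors' actual procedure, a first-order anisotropy can be recovered from spin-phase-binned counting rates by fitting a first harmonic; the amplitude and phase of that harmonic give the anisotropy magnitude and direction. The data below are synthetic.

        import numpy as np

        def first_harmonic_anisotropy(phases, counts):
            """Fit counts(phi) ~ a0 + a1*cos(phi) + b1*sin(phi).

            Returns the fractional anisotropy amplitude and phase (radians).
            phases: spin-phase bin centres in radians; counts: mean rate per bin.
            """
            A = np.column_stack([np.ones_like(phases), np.cos(phases), np.sin(phases)])
            a0, a1, b1 = np.linalg.lstsq(A, counts, rcond=None)[0]
            amplitude = np.hypot(a1, b1) / a0
            direction = np.arctan2(b1, a1)
            return amplitude, direction

        # Hypothetical spin-modulated rates with a 10% anisotropy at 60 degrees.
        phi = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
        rate = 100.0 * (1.0 + 0.10 * np.cos(phi - np.pi / 3.0))
        print(first_harmonic_anisotropy(phi, rate))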

  19. Computerized extraction of information on the quality of diabetes care from free text in electronic patient records of general practitioners

    NARCIS (Netherlands)

    Voorham, Jaco; Denig, Petra

    2007-01-01

    Objective: This study evaluated a computerized method for extracting numeric clinical measurements related to diabetes care from free text in electronic patient records (EPR) of general practitioners. Design and Measurements: Accuracy of this number-oriented approach was compared to manual chart

  20. An image-processing strategy to extract important information suitable for a low-size stimulus pattern in a retinal prosthesis.

    Science.gov (United States)

    Chen, Yili; Fu, Jixiang; Chu, Dawei; Li, Rongmao; Xie, Yaoqin

    2017-11-27

    A retinal prosthesis is designed to help the blind obtain some sight. It consists of an external part and an internal part. The external part is made up of a camera, an image processor and an RF transmitter; the internal part is made up of an RF receiver, an implant chip and a microelectrode array. Currently, the number of microelectrodes is in the hundreds, and the mechanism by which an electrode stimulates the optic nerve is not fully understood. A simple hypothesis is that the pixels in an image correspond to the electrodes, so the images captured by the camera must be processed by suitable strategies before they can drive electrode stimulation. The question is therefore how to obtain the important information from the captured image. Here, a region-of-interest (ROI) extraction algorithm is used to retain the important information and remove the redundant information. This paper explains the principles and functions of the ROI approach. Because a real-time system is required, the ROI algorithm was simplified and implemented in the external image-processing digital signal processing (DSP) system of the retinal prosthesis. The results show that these image-processing strategies are suitable for a real-time retinal prosthesis: they eliminate redundant information and provide useful information for expression in a low-resolution stimulus pattern.
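
    A minimal sketch of the general ROI idea follows; it is not the authors' simplified algorithm, and the saliency measure, crop size and electrode-grid size are assumptions chosen for illustration. The image is cropped around its most salient region and block-averaged so that each remaining value could drive one electrode.

        import numpy as np

        def roi_to_electrode_pattern(image, roi_size=60, grid=(10, 10)):
            """Crop the most salient square region and downsample it to the electrode grid."""
            # Crude saliency assumption: local contrast via gradient magnitude.
            gy, gx = np.gradient(image.astype(float))
            saliency = np.hypot(gx, gy)
            # Centre the ROI on the saliency peak, clipped to the image border.
            cy, cx = np.unravel_index(np.argmax(saliency), saliency.shape)
            half = roi_size // 2
            y0 = np.clip(cy - half, 0, image.shape[0] - roi_size)
            x0 = np.clip(cx - half, 0, image.shape[1] - roi_size)
            roi = image[y0:y0 + roi_size, x0:x0 + roi_size].astype(float)
            # Block-average down to one value per electrode.
            by, bx = roi_size // grid[0], roi_size // grid[1]
            pattern = roi.reshape(grid[0], by, grid[1], bx).mean(axis=(1, 3))
            return pattern

        frame = np.random.randint(0, 256, size=(240, 320))    # hypothetical camera frame
        print(roi_to_electrode_pattern(frame).shape)           # (10, 10)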

  1. Spectroscopy and lifetime of bottom and charm hadrons

    International Nuclear Information System (INIS)

    F. Ukegawa

    2000-01-01

    There are several motivations for studying masses and lifetimes of the hadrons containing a heavy quark, either the bottom or the charm quark. First, the mass and the lifetime are fundamental properties of an elementary particle. Second, the spectroscopy of hadrons gives insights into the QCD potential between quarks. In particular, a symmetry exists for heavy hadrons when the heavy quark mass is taken to be infinite, providing a powerful tool to predict and understand properties of those heavy hadrons. Third, studies of the lifetimes of heavy hadrons probe their decay mechanisms. A measurement of the lifetime, or the total decay width, is necessary when the authors extract magnitudes of elements of the Kobayashi-Maskawa matrix. Again, in the limit of an infinite heavy quark mass things become simple and decay of a heavy hadron should be the decay of the heavy quark Q. This leads to a prediction that all hadrons containing the heavy quark Q should have the same lifetime, that of the quark Q. This is far from reality in the case of charm hadrons, where the D+ meson lifetime is about 2.5 times longer than the D0 meson lifetime. Perhaps the charm quark is not heavy enough. The simple quark decay picture should be a better approximation for the bottom hadrons because of the larger b quark mass. On the experimental side, the measurements and knowledge of the heavy hadrons (in particular bottom hadrons) have significantly improved over the last decade, thanks to high statistics data accumulated by various experiments. The authors shall review recent developments in these studies in the remainder of this manuscript.

  2. Bottom-simulating reflector variability at the Costa Rica subduction zone and corresponding heat flow model

    Science.gov (United States)

    Cavanaugh, S.; Bangs, N. L.; Hornbach, M. J.; McIntosh, K. D.

    2011-12-01

    We use 3D seismic reflection data acquired in April - May 2011 by the R/V Marcus G. Langseth to extract heat flow information using the bottom-simulating reflector across the Costa Rica convergent margin. These data are part of the CRISP Project, which will image the Middle America subduction zone in 3D. The survey was conducted in an area approximately 55 x 11 km, to the northwest of the Osa Peninsula, Costa Rica. For the analysis presented here, 3D seismic data were processed with Paradigm Focus software through post-stack time migration. The bottom-simulating reflector (BSR), a reverse polarity reflection marking the base of the gas hydrate phase boundary, is imaged very clearly in two regions within the slope-cover sediments in the accretionary prism. In deep water environments, the BSR acts as a temperature gauge revealing subsurface temperatures across the margin. We predict BSR depth using a true 3D diffusive heat flow model combined with IODP drilling data and compare results with actual BSR depth observations to determine anomalies in heat flow. Uniform heat flow in the region should result in a deepening BSR downslope toward the trench; however, our initial results indicate that the BSR shoals near the trench to its shallowest level of approximately 96 m below the sea floor, suggesting elevated heat flow towards the toe. Landward, the BSR deepens to about 333 m below the sea floor, indicating lower heat flow. Both BSR segments display a trend of deepening landward from the trench; however, the depth below the sea floor is greater overall for the landward segment than for the segment near the toe. We suggest that two regimes with differing heat flow exist across the margin, likely representing two separate fluid flow regimes - one through recently accreted sediments near the prism toe and the other through the older materials making up the prism.
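
    The way a BSR depth converts into a heat-flow estimate can be sketched as follows. The hydrate phase-boundary fit, thermal conductivity and seafloor temperature below are illustrative assumptions, not values from the CRISP study; the point is only that heat flow scales with the temperature difference across the BSR depth.

        import numpy as np

        def heat_flow_from_bsr(z_bsr, water_depth, t_seafloor=3.0, k=1.0,
                               rho=1030.0, g=9.81):
            """Estimate conductive heat flow (W m-2) from BSR depth below seafloor (m).

            The BSR marks the base of hydrate stability, so its temperature is taken
            from a simplified, assumed pressure-dependent phase boundary; heat flow is
            then the thermal conductivity k times the seafloor-to-BSR temperature gradient.
            """
            pressure = rho * g * (water_depth + z_bsr)            # Pa, hydrostatic
            # Assumed logarithmic fit to the methane-hydrate stability curve (illustrative).
            t_bsr = 8.9 * np.log(pressure / 1.0e6) - 10.55        # deg C
            gradient = (t_bsr - t_seafloor) / z_bsr               # K per m
            return k * gradient

        # Hypothetical values: a shallow BSR near the toe vs. a deeper BSR landward.
        print(heat_flow_from_bsr(z_bsr=96.0, water_depth=2000.0))
        print(heat_flow_from_bsr(z_bsr=333.0, water_depth=1500.0))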

  3. Where to start? Bottom-up attention improves working memory by determining encoding order.

    Science.gov (United States)

    Ravizza, Susan M; Uitvlugt, Mitchell G; Hazeltine, Eliot

    2016-12-01

    The present study aimed to characterize the mechanism by which working memory is enhanced for items that capture attention because of their novelty or saliency-that is, via bottom-up attention. The first experiment replicated previous research by corroborating that bottom-up attention directed to an item is sufficient for enhancing working memory and, moreover, generalized the effect to the domain of verbal working memory. The subsequent 3 experiments sought to determine how bottom-up attention affects working memory. We considered 2 hypotheses: (1) Bottom-up attention enhances the encoded representation of the stimulus, similar to how voluntary attention functions, or (2) It affects the order of encoding by shifting priority onto the attended stimulus. By manipulating how stimuli were presented (simultaneous/sequential display) and whether the cue predicted the tested items, we found evidence that bottom-up attention improves working memory performance via the order of encoding hypothesis. This finding was observed across change detection and free recall paradigms. In contrast, voluntary attention improved working memory regardless of encoding order and showed greater effects on working memory. We conclude that when multiple information sources compete, bottom-up attention prioritizes the location at which encoding should begin. When encoding order is set, bottom-up attention has little or no benefit to working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Drycon dry ash conveyor: dry bottom ash handling system with reduced operating costs and improved plant efficiency

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The Drycon dry bottom ash extraction system is designed to remove bottom ash from beneath the furnace and cool it without any need for water. Fresh air flowing countercurrent to the ash is used for cooling. Data presented show how savings in time and cost can be achieved with this system and how boiler efficiency can be increased using this technology. Considerable advantages in operational reliability resulting from new design improvements are described. 7 figs.

  5. Control Properties of Bottom Fired Marine Boilers

    DEFF Research Database (Denmark)

    Solberg, Brian; Andersen, Palle; Karstensen, Claus M. S.

    2007-01-01

    This paper focuses on model analysis of a dynamic model of a bottom fired one-pass smoke tube boiler. Linearized versions of the model are analyzed and show large variations in system gains at steady state as function of load whereas gain variations near the desired bandwidth are small. An analys...

  6. A resting bottom sodium cooled fast reactor

    International Nuclear Information System (INIS)

    Costes, D.

    2012-01-01

    This follows ICAPP 2011 paper 11059, 'Fast Reactor with a Cold Bottom Vessel', on sodium-cooled reactor vessels held in a thermal gradient and resting on the soil. Sodium is frozen on the vessel bottom plate, with temperature increasing towards the top. The vault cover rests on the safety vessel; the core diagrid, welded to a toric collector, forms a slab supported by skirts resting on the bottom plate. Intermediate exchangers and pumps, fixed on the cover, plunge into the collector. At the vessel top, a skirt hanging from the cover plunges into the sodium, leaving a thin circular slit partially filled by sodium and covered by argon, which provides leak-tightness, allows vessel dilatation, and gives radial relative holding through sodium inertia. No 'air conditioning' at 400 deg. C is needed, unlike for hanging vessels, and this allows a large economy. The sodium volume below the slab contains insulating refractory elements that would stop a hypothetical corium flow. The small gas volume around the vessel limits any LOCA. The liner cooling system of the concrete safety vessel may contribute to reactor cooling. The cold resting bottom vessel, proposed by the author for many years, could avoid the complete visual inspection required for hanging vessels; however, a double vessel containing the support skirts would allow the introduction of inspection devices. The stress-limiting thermal gradient is obtained by filling the intermediate space with secondary sodium. (authors)

  7. Coil in bottom part of splitter magnet

    CERN Multimedia

    CERN PhotoLab

    1976-01-01

    Radiation-resistant coil being bedded into the bottom part of a splitter magnet. This very particular magnet split the beam into 3 branches, for 3 target stations in the West-Area. See Annual Report 1975, p.176, Figs.14 and 15.

  8. Bottomonia: open bottom strong decays and spectrum

    Directory of Open Access Journals (Sweden)

    Santopinto E.

    2014-05-01

    Full Text Available We present our results for the bottomonium spectrum with self energy corrections. The bare masses used in the calculation are computed within Godfrey and Isgur’s relativized quark model. We also discuss our results for the open bottom strong decay widths of higher bottomonia in the 3P0 pair-creation model.

  9. Bottom fauna of the Malacca Strait

    Digital Repository Service at National Institute of Oceanography (India)

    Parulekar, A.H.; Ansari, Z.A.

    Bottom fauna of Malacca Strait (connecting the Indian Ocean with Pacific) in the depth range of 80 to 1350 m, is dominated by meiofauna which exceeds macrofauna by 12.5 times in weight and by more than 780 times in population density. Standing crop...

  10. Spectroscopy and decays of charm and bottom

    International Nuclear Information System (INIS)

    Butler, J.N.

    1997-10-01

    After a brief review of the quark model, we discuss our present knowledge of the spectroscopy of charm and bottom mesons and baryons. We go on to review the lifetimes and the semileptonic and purely leptonic decays of these particles. We conclude with a brief discussion of B and D mixing and rare decays.

  11. Multineuronal vectorization is more efficient than time-segmental vectorization for information extraction from neuronal activities in the inferior temporal cortex.

    Science.gov (United States)

    Kaneko, Hidekazu; Tamura, Hiroshi; Tate, Shunta; Kawashima, Takahiro; Suzuki, Shinya S; Fujita, Ichiro

    2010-08-01

    In order for patients with disabilities to control assistive devices with their own neural activity, multineuronal spike trains must be efficiently decoded because only limited computational resources can be used to generate prosthetic control signals in portable real-time applications. In this study, we compare the abilities of two vectorizing procedures (multineuronal and time-segmental) to extract information from spike trains during the same total neuron-seconds. In the multineuronal vectorizing procedure, we defined a response vector whose components represented the spike counts of one to five neurons. In the time-segmental vectorizing procedure, a response vector consisted of components representing a neuron's spike counts for one to five time-segment(s) of a response period of 1 s. Spike trains were recorded from neurons in the inferior temporal cortex of monkeys presented with visual stimuli. We examined whether the amount of information of the visual stimuli carried by these neurons differed between the two vectorizing procedures. The amount of information calculated with the multineuronal vectorizing procedure, but not the time-segmental vectorizing procedure, significantly increased with the dimensions of the response vector. We conclude that the multineuronal vectorizing procedure is superior to the time-segmental vectorizing procedure in efficiently extracting information from neuronal signals. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
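
    The two vectorizing procedures compared in this record can be illustrated with a short sketch (an interpretation of the described procedures, not the authors' code): a response vector is built either from the spike counts of several neurons over a 1-s window or from the counts of a single neuron in successive sub-segments of that window.

        import numpy as np

        def multineuronal_vector(spike_times_per_neuron, window=1.0):
            """One component per neuron: each neuron's spike count in the response window."""
            return np.array([np.sum((t >= 0.0) & (t < window))
                             for t in spike_times_per_neuron])

        def time_segmental_vector(spike_times, n_segments=5, window=1.0):
            """One component per time segment of a single neuron's response window."""
            edges = np.linspace(0.0, window, n_segments + 1)
            counts, _ = np.histogram(spike_times, bins=edges)
            return counts

        rng = np.random.default_rng(1)
        # Hypothetical Poisson-like spike trains for five neurons over 1 s.
        trains = [np.sort(rng.uniform(0.0, 1.0, rng.poisson(20))) for _ in range(5)]
        print(multineuronal_vector(trains))       # 5 components, one per neuron
        print(time_segmental_vector(trains[0]))   # 5 components, one per time segment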

  12. Bottom water circulation in Cascadia Basin

    Science.gov (United States)

    Hautala, Susan L.; Paul Johnson, H.; Hammond, Douglas E.

    2009-10-01

    A combination of beta spiral and minimum length inverse methods, along with a compilation of historical and recent high-resolution CTD data, are used to produce a quantitative estimate of the subthermocline circulation in Cascadia Basin. Flow in the North Pacific Deep Water, from 900-1900 m, is characterized by a basin-scale anticyclonic gyre. Below 2000 m, two water masses are present within the basin interior, distinguished by different potential temperature-salinity lines. These water masses, referred to as Cascadia Basin Bottom Water (CBBW) and Cascadia Basin Deep Water (CBDW), are separated by a transition zone at about 2400 m depth. Below the depth where it freely communicates with the broader North Pacific, Cascadia Basin is renewed by northward flow through deep gaps in the Blanco Fracture Zone that feeds the lower limb of a vertical circulation cell within the CBBW. Lower CBBW gradually warms and returns to the south at lighter density. Isopycnal layer renewal times, based on combined lateral and diapycnal advective fluxes, increase upwards from the bottom. The densest layer, existing in the southeast quadrant of the basin below ˜2850 m, has an advective flushing time of 0.6 years. The total volume flushing time for the entire CBBW is 2.4 years, corresponding to an average water parcel residence time of 4.7 years. Geothermal heating at the Cascadia Basin seafloor produces a characteristic bottom-intensified temperature anomaly and plays an important role in the conversion of cold bottom water to lighter density within the CBBW. Although covering only about 0.05% of the global seafloor, the combined effects of bottom heat flux and diapycnal mixing within Cascadia Basin provide about 2-3% of the total required global input to the upward branch of the global thermohaline circulation.
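
    The flushing times quoted above follow from simple volume-over-flux arithmetic; the sketch below uses hypothetical numbers, not the paper's water-mass inventory, purely to show the calculation.

        SECONDS_PER_YEAR = 3.156e7

        def flushing_time_years(volume_m3, inflow_m3_per_s):
            """Advective flushing time = reservoir volume / volumetric renewal flux."""
            return volume_m3 / inflow_m3_per_s / SECONDS_PER_YEAR

        # Hypothetical bottom-water layer: 1.5e13 m^3 renewed by 0.2 Sv of inflow.
        print(round(flushing_time_years(1.5e13, 0.2e6), 2), "years")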

  13. Cathodic protection for the bottoms of above ground storage tanks

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, John P. [Tyco Adhesives, Norwood, MA (United States)

    2004-07-01

    Impressed Current Cathodic Protection has been used for many years to protect the external bottoms of above ground storage tanks. A vertical deep ground bed often treated several bare steel tank bottoms by broadcasting current over a wide area. Environmental concerns and, in some countries, government regulations have introduced the use of dielectric secondary containment liners. The dielectric liner does not allow the protective cathodic protection current to pass and causes corrosion to continue on the newly placed tank bottom. In existing tank bottoms where inadequate protection has been provided, leaks can develop. In one method of remediation, the old bottom is covered with sand and a double bottom is welded above the leaking bottom. The new bottom is welded very close to the old bottom, thus shielding the traditional cathodic protection from protecting the new bottom. These double bottoms often employ a dielectric liner as well. Both the liner and the double bottom leave little space beneath the external tank bottom. This reduced space between the liner, or double bottom, and the bottom to be protected poses a challenge for current distribution in cathodic protection systems. This study examines the practical concerns for application of impressed current cathodic protection and the types of anode materials used in these specific applications. One unique approach for an economical treatment using a conductive polymer cathodic protection method is presented. (author)

  14. Prospects of obtaining samples of bottom sediments from subglacial lake Vostok

    Directory of Open Access Journals (Sweden)

    Н. И. Васильев

    2017-04-01

    Full Text Available The paper demonstrates the timeliness of obtaining and examining bottom sediments from subglacial Lake Vostok. The predicted geological section of Lake Vostok and the information value of its bottom sediments are examined. Strict requirements on the environmental security of lake examinations and bottom sediment sampling rule out the use of conventional drilling technologies, as they would pollute the lake with injection liquid from the borehole. In order to carry out sampling of bottom sediments from the subglacial lake, it is proposed to use a dynamically balanced tool string, which enables rotary drilling without any external support on the borehole walls to transmit counter torque. A theoretical analysis has been carried out to assess the operation of the tool string, which is a two-mass oscillatory electromechanical system of reciprocating and rotating motion (RRM) with two degrees of freedom.

  15. RELATING BOTTOM QUARK MASS IN DR-BAR AND MS-BAR REGULARIZATION SCHEMES

    International Nuclear Information System (INIS)

    2002-01-01

    The value of the bottom quark mass at Q = M_Z in the DR-bar scheme is an important input for the analysis of supersymmetric models with a large value of tan β. Conventionally, however, the running bottom quark mass extracted from experimental data is quoted in the MS-bar scheme at the scale Q = m_b. We describe a two loop procedure for the conversion of the bottom quark mass from the MS-bar to the DR-bar scheme. The Particle Data Group value m_b^MS-bar(m_b^MS-bar) = 4.2 ± 0.2 GeV corresponds to a range of 2.65-3.03 GeV for m_b^DR-bar(M_Z)
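
    For orientation, the one-loop part of this scheme conversion is commonly quoted in the form below; this is a textbook relation written here for illustration, not the two-loop formula of the record above, and obtaining m_b^DR-bar(M_Z) additionally requires running the MS-bar mass from m_b up to M_Z with the QCD renormalization group equations.

        m_b^{\overline{\rm DR}}(\mu) \;=\; m_b^{\overline{\rm MS}}(\mu)
          \left[ 1 - \frac{\alpha_s(\mu)}{3\pi} + \mathcal{O}(\alpha_s^2) \right]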

  16. Removal of COD and color loads in bleached kraft pulp effluents by bottom ashes from boilers.

    Science.gov (United States)

    Van Tran, A

    2008-07-01

    The effectiveness of the bottom ashes from biomass and coal-fired boilers in removing chemical oxygen demand (COD) and color loads in effluents of a kraft pulp bleachery plant is investigated. The effluents tested are those of the sulfuric acid treatment (A stage) of a hardwood kraft pulp, and of the first acidic (chlorine or chlorine dioxide) and second alkaline (extraction) stages in the chlorine and elemental chlorine-free (ECF) bleaching lines of hardwood and softwood kraft pulps. The coal-fired boiler's bottom ashes are unable to remove either COD or color load from the bleached kraft pulp effluents. However, the bottom ashes of the biomass boiler are effective in removing COD and color loads from the acidic and alkaline effluents irrespective of the bleaching process or wood species. In particular, these ashes increase the pH of all the effluents examined.

  17. Development of hot water utilizing power plants in fiscal 1999. Development of binary cycle power plant (Development of system to detect well bottom information when geothermal hot water is excavated); 1999 nendo nessui riyo hatsuden plant nado kaihatsu seika hokokusho. Binary cycle hatsuden plant no kaihatsu (chinetsusei kussakuji kotei joho kenchi system no kaihatsu)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Developmental research has been performed on an MWD system to identify, in real time, well-bottom information (azimuth, inclination, pressure and temperature) during excavation of geothermal hot water wells. This paper summarizes the achievements in fiscal 1999. In the developmental research on the detection device, attempts were made to improve the zonde and to enhance its heat resistance. In addition, data were acquired on electronics parts as a result of the heat resistance identification test. For the on-the-ground devices, the experiment analyzing program was supplemented with a program to remove downhole motor pressure noise. Pressure noise during excavation in actual wells was collected. In the analyzing system, adaptation to PC, improvement and operation checks were performed on the well trace projecting and indicating system. Operation of the well trace estimating system was checked using actual data in order to prepare the operation manual. With regard to the well evaluation supporting system, improvement and operation checks, including checks with actual data, were executed on the PC version of the temperature analyzing system. Performance of the zonde was verified by an actual geothermal well test. (NEDO)

  18. Stabilization of bottom sediments from Rzeszowski Reservoir

    Directory of Open Access Journals (Sweden)

    Koś Karolina

    2015-06-01

    Full Text Available The paper presents results of the stabilization of bottom sediments from Rzeszowski Reservoir. Based on the geotechnical characteristics of the tested sediments, it was stated that they do not fulfill all the criteria set for soils in earth embankments. Therefore, an attempt to improve their parameters was made by using two additives – cement and lime. Unconfined compressive strength, shear strength, bearing ratio and pH were determined on samples after different curing times. Based on the tests carried out, it was stated that the obtained values of unconfined compressive strength of the sediments stabilized with cement were relatively low and did not fulfill the requirements set by the Polish standard concerning materials in road engineering. In the case of lime stabilization, it was stated that the tested sediments with a 6% addition of the additive can be used for the bottom layers of an improved road base.

  19. Constructing bottom barriers with jet grouting

    International Nuclear Information System (INIS)

    Shibazaki, M.; Yoshida, H.

    1997-01-01

    Installing a bottom barrier using conventional high pressure jetting technology and ensuring barrier continuity is challenging. This paper describes technology that has been developed and demonstrated for the emplacement of bottom barriers using pressures and flow rates above conventional high pressure jetting parameters. The innovation, capable of creating an improved body exceeding 5 meters in diameter, results in satisfactory connection and adherence between the treated columns. In addition, the interfaces among the improved bodies attain the same strength as the body itself and a permeability lower than 1 x 10^-7 cm/sec. A wide range of thicknesses and diameters of the improved mass allows the application to be optimized, and the method is nearing completion. The paper explains these aspects and briefly presents case histories

  20. Landfilling: Bottom Lining and Leachate Collection

    DEFF Research Database (Denmark)

    Christensen, Thomas Højlund; Manfredi, Simone; Kjeldsen, Peter

    2011-01-01

    from entering the groundwater or surface water. The bottom lining system should cover the full footprint area of the landfill, including both the relatively flat bottom and the sideslopes in the case of an excavated configuration. This prevents the lateral migration of leachate from within the landfill...... triple) liners, are extremely effective in preventing leachate from entering into the environment. In addition, the risk of polluting the groundwater at a landfill by any leakage of leachate depends on several factors related to siting of the landfill: distance to the water table, distance to surface...... water bodies, and the properties of the soil beneath the landfill. In addition to the lining and drainage systems described in this chapter, the siting and hydrogeology of the landfill site (Chapter 10.12) and the top cover (Chapter 10.9) are also part of the barrier system, contributing to reducing...

  1. The effect of bottom sediment supplement on heavy metals content in plants (Zea mays) and soil

    Directory of Open Access Journals (Sweden)

    Baran A.

    2013-04-01

    Full Text Available An important aspect of bottom sediments is the problem of their management or disposal after their extraction from the bottom of rivers, dam reservoirs, ports, channels or ponds. The research aimed at an assessment of the potential environmental management of bottom sediment used as an admixture to light soil, based on its effect on the contents of heavy metals in plants and soil. The research was conducted on light soil with the granulometric structure of weakly loamy sand. The bottom sediment was added to the light soil in amounts of 0 (control), 5, 10, 30 and 50%. The test plant was maize (Zea mays, 'Bora' c.v.). The sediment applied in the presented research revealed a high share of silt and clay fractions, alkaline pH and low contents of heavy metals; therefore it may be used as an admixture to such soils to improve their productivity. The bottom sediment applied to the soil decreased the Zn, Cd and Pb content in maize in comparison with the treatment without the sediment, whereas it increased the content of Cu, Cr and Ni. No exceedance of the permissible heavy metal contents for forage use was registered in the maize biomass.

  2. Extraction of Pluvial Flood Relevant Volunteered Geographic Information (VGI) by Deep Learning from User Generated Texts and Photos

    Directory of Open Access Journals (Sweden)

    Yu Feng

    2018-01-01

    Full Text Available In recent years, pluvial floods caused by extreme rainfall events have occurred frequently. Especially in urban areas, they lead to serious damages and endanger the citizens’ safety. Therefore, real-time information about such events is desirable. With the increasing popularity of social media platforms, such as Twitter or Instagram, information provided by voluntary users becomes a valuable source for emergency response. Many applications have been built for disaster detection and flood mapping using crowdsourcing. Most of the applications so far have merely used keyword filtering or classical language processing methods to identify disaster relevant documents based on user generated texts. As the reliability of social media information is often under criticism, the precision of information retrieval plays a significant role for further analyses. Thus, in this paper, high quality eyewitnesses of rainfall and flooding events are retrieved from social media by applying deep learning approaches on user generated texts and photos. Subsequently, events are detected through spatiotemporal clustering and visualized together with these high quality eyewitnesses in a web map application. Analyses and case studies are conducted during flooding events in Paris, London and Berlin.
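
    The event-detection step described in this record (spatiotemporal clustering of high-quality eyewitness posts) can be sketched with a standard clustering routine; the space-time scaling and the DBSCAN parameters below are assumptions for illustration, not the authors' configuration.

        import numpy as np
        from sklearn.cluster import DBSCAN

        def cluster_eyewitness_posts(lats, lons, hours, km_per_hour=2.0,
                                     eps_km=5.0, min_samples=5):
            """Group flood-related posts that are close in both space and time.

            Time is converted to an equivalent distance (km_per_hour) so that a
            single DBSCAN radius (eps_km) applies to the combined coordinates.
            """
            km_per_deg = 111.0  # rough degree-to-km conversion
            features = np.column_stack([np.asarray(lats) * km_per_deg,
                                        np.asarray(lons) * km_per_deg,
                                        np.asarray(hours) * km_per_hour])
            return DBSCAN(eps=eps_km, min_samples=min_samples).fit_predict(features)

        # Hypothetical posts: a tight cluster plus two isolated reports.
        labels = cluster_eyewitness_posts(
            lats=[52.50, 52.51, 52.50, 52.52, 52.51, 48.85, 51.50],
            lons=[13.40, 13.41, 13.42, 13.40, 13.41, 2.35, -0.12],
            hours=[1, 1, 2, 2, 3, 10, 20])
        print(labels)   # the first five posts share a cluster label; the rest are noise (-1)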

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 1989 - 2003 (NODC Accession 0002809)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Alligator Reef, 2005 - 2007 (NODC Accession 0019351)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 2005 - 2007 (NODC Accession 0039973)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Tennessee Reef, 1990 - 2004 (NODC Accession 0002749)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Triumph Reef, 1990 - 2006 (NODC accession 0013166)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 1990 - 2005 (NODC Accession 0011143)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Cape Florida, 2005 - 2006 (NODC Accession 0014185)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Iselin, 2006 - 2007 (NODC Accession 0039240)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  11. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bullard Bank, 1992 - 2005 (NODC Accession 0010426)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  12. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Cape Florida, 1996 - 2005 (NODC Accession 0002788)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  13. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sprigger Bank, 1992 - 2006 (NODC accession 0013114)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Long Key, 2008 - 2010 (NODC Accession 0093063)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Smith Shoal, 1998 - 2006 (NODC Accession 0014121)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Buoy, 1988 - 2004 (NODC Accession 0002616)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  17. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2006 - 2007 (NODC Accession 0029107)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2005 - 2006 (NODC Accession 0014268)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Dome Reef, 2007 (NODC Accession 0093023)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Triumph Reef, 1990 - 2006 (NODC Accession 0013166)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Grecian Rocks, 2007 - 2010 (NODC Accession 0093026)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge (NODC Accession 0002750)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bullard Bank, 2005 - 2007 (NODC Accession 0039881)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Wellwood, 2006 - 2009 (NODC Accession 0093067)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    Energy Technology Data Exchange (ETDEWEB)

    Çakır, Tunahan, E-mail: tcakir@gyte.edu.tr [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Khatibipour, Mohammad Jafar [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Department of Chemical Engineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey)

    2014-12-03

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of the condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as the optimization-based Flux-Balance Analysis to elucidate the active network. The other approach starts from the condition-specific metabolome data, and processes the data with statistical or optimization-based methods to extract information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, they can both benefit from each other paving the way for the novel integrative analysis methods of metabolome data- and flux-analysis approaches in the post-genomic era. This study reviews the strengths of constraint-based analysis and network inference methods reported in the metabolic systems biology field; then elaborates on the potential paths to reconcile the two approaches to shed better light on how the metabolism functions.
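
    As a hedged sketch of the bottom-up, constraint-based side described above, the snippet below solves a toy flux-balance problem, maximizing a 'biomass' flux subject to the steady-state mass balance S v = 0 and flux bounds, with a generic linear-programming routine; the stoichiometric matrix and bounds are invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: uptake -> A, A -> B, B -> biomass.
        # Rows are metabolites (A, B); columns are fluxes (v_uptake, v_AB, v_biomass).
        S = np.array([[ 1, -1,  0],
                      [ 0,  1, -1]])
        bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

        # Maximize v_biomass (linprog minimizes, hence the minus sign)
        # subject to the steady-state constraint S v = 0.
        result = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
                         method="highs")
        print(result.x)   # optimal flux distribution, here [10, 10, 10]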

  6. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    International Nuclear Information System (INIS)

    Çakır, Tunahan; Khatibipour, Mohammad Jafar

    2014-01-01

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of the condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as the optimization-based Flux-Balance Analysis to elucidate the active network. The other approach starts from the condition-specific metabolome data, and processes the data with statistical or optimization-based methods to extract information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, they can both benefit from each other paving the way for the novel integrative analysis methods of metabolome data- and flux-analysis approaches in the post-genomic era. This study reviews the strengths of constraint-based analysis and network inference methods reported in the metabolic systems biology field; then elaborates on the potential paths to reconcile the two approaches to shed better light on how the metabolism functions.

  7. Lime application methods, water and bottom soil acidity in fresh water fish ponds

    Directory of Open Access Journals (Sweden)

    Queiroz Julio Ferraz de

    2004-01-01

    Full Text Available Although some methods for determining lime requirement of pond soils are available and commonly used, there is still no consensus on whether it is more effective to apply liming materials to the bottoms of empty ponds or to wait and apply them over the water surface after ponds are filled. There is also little information on how deep lime reacts in pond sediment over time, and whether the depth of reaction is different when liming materials are applied to the water or to the soil. Therefore, three techniques for treating fish ponds with agricultural limestone were evaluated in ponds with clayey soils at a commercial fish farm. Amounts of agricultural limestone equal to the lime requirement of bottom soils were applied to each of three ponds by: direct application over the pond water surface; spread uniformly over the bottom of the empty pond; spread uniformly over the bottom of the empty pond followed by tilling of the bottom. Effectiveness of agricultural limestone applications did not differ among treatment methods. Agricultural limestone also reacted quickly to increase total alkalinity and total hardness of pond water to acceptable concentrations within 2 weeks after application. The reaction of lime to increase soil pH was essentially complete after one to two months, and lime had no effect below a soil depth of 8 cm. Tilling of pond bottoms to incorporate liming materials is unnecessary, and tilling consumes time and is an expensive practice; filled ponds can be limed effectively.

  8. Extracting Information about the Electronic Quality of Organic Solar-Cell Absorbers from Fill Factor and Thickness

    Science.gov (United States)

    Kaienburg, Pascal; Rau, Uwe; Kirchartz, Thomas

    2016-08-01

    Understanding the fill factor in organic solar cells remains challenging due to its complex dependence on a multitude of parameters. By means of drift-diffusion simulations, we thoroughly analyze the fill factor of such low-mobility systems and demonstrate its dependence on a collection coefficient defined in this work. We systematically discuss the effect of different recombination mechanisms, space-charge regions, and contact properties. Based on these findings, we are able to interpret the thickness dependence of the fill factor for different experimental studies from the literature. The presented model provides a facile method to extract the photoactive layer's electronic quality which is of particular importance for the fill factor. We illustrate that over the past 15 years, the electronic quality has not been continuously improved, although organic solar-cell efficiencies increased steadily over the same period of time. Only recent reports show the synthesis of polymers for semiconducting films of high electronic quality that are able to produce new efficiency records.

  9. Investigating the feasibility of using partial least squares as a method of extracting salient information for the evaluation of digital breast tomosynthesis

    Science.gov (United States)

    Zhang, George Z.; Myers, Kyle J.; Park, Subok

    2013-03-01

    Digital breast tomosynthesis (DBT) has shown promise for improving the detection of breast cancer, but it has not yet been fully optimized due to a large space of system parameters to explore. A task-based statistical approach1 is a rigorous method for evaluating and optimizing this promising imaging technique with the use of optimal observers such as the Hotelling observer (HO). However, the high data dimensionality found in DBT has been the bottleneck for the use of a task-based approach in DBT evaluation. To reduce data dimensionality while extracting salient information for performing a given task, efficient channels have to be used for the HO. In the past few years, 2D Laguerre-Gauss (LG) channels, which are a complete basis for stationary backgrounds and rotationally symmetric signals, have been utilized for DBT evaluation2, 3 . But since background and signal statistics from DBT data are neither stationary nor rotationally symmetric, LG channels may not be efficient in providing reliable performance trends as a function of system parameters. Recently, partial least squares (PLS) has been shown to generate efficient channels for the Hotelling observer in detection tasks involving random backgrounds and signals.4 In this study, we investigate the use of PLS as a method for extracting salient information from DBT in order to better evaluate such systems.
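
    A minimal sketch of the PLS-channel idea, assuming hypothetical signal-present and signal-absent image ensembles rather than DBT data: PLS regression of the class label on the images yields a small set of channels, and the channelized Hotelling observer SNR is then computed in the reduced space. This illustrates the general approach, not the study's implementation.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def pls_channel_hotelling_snr(imgs_absent, imgs_present, n_channels=5):
            """Derive PLS channels and return the channelized Hotelling observer SNR."""
            X = np.vstack([imgs_absent, imgs_present])           # images as row vectors
            y = np.r_[np.zeros(len(imgs_absent)), np.ones(len(imgs_present))]
            pls = PLSRegression(n_components=n_channels).fit(X, y)
            T = pls.x_weights_.T                                  # channels x pixels
            u0, u1 = T @ imgs_absent.T, T @ imgs_present.T        # channel outputs
            dmean = u1.mean(axis=1) - u0.mean(axis=1)
            K = 0.5 * (np.cov(u0) + np.cov(u1))                   # intra-class covariance
            w = np.linalg.solve(K, dmean)                         # Hotelling template
            return np.sqrt(dmean @ w)

        rng = np.random.default_rng(2)
        absent = rng.normal(size=(200, 64))                       # hypothetical 8x8 images
        present = rng.normal(size=(200, 64)) + 0.3                # weak uniform signal added
        print(pls_channel_hotelling_snr(absent, present))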

  10. 46 CFR 173.058 - Double bottom requirements.

    Science.gov (United States)

    2010-10-01

    ... PERTAINING TO VESSEL USE School Ships § 173.058 Double bottom requirements. Each new sailing school vessel... service must comply with the double bottom requirements in §§ 171.105 through 171.109, inclusive, of this...

  11. Bottom Trawl Survey Protocol Development (HB0706, EK60)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Cruise objectives include: 1) Investigate performance characteristics of new research bottom trawl; 2) Develop standard operating procedures for the NEFSC Bottom...

  12. Extraction of information on macromolecular interactions from fluorescence micro-spectroscopy measurements in the presence and absence of FRET

    Science.gov (United States)

    Raicu, Valerică

    2018-06-01

    Investigations of static or dynamic interactions between proteins or other biological macromolecules in living cells often rely on the use of fluorescent tags with two different colors in conjunction with adequate theoretical descriptions of Förster Resonance Energy Transfer (FRET) and molecular-level micro-spectroscopic technology. One such method based on these general principles is FRET spectrometry, which allows determination of the quaternary structure of biomolecules from cell-level images of the distributions, or spectra of occurrence frequency of FRET efficiencies. Subsequent refinements allowed combining FRET frequency spectra with molecular concentration information, thereby providing the proportion of molecular complexes with various quaternary structures as well as their binding/dissociation energies. In this paper, we build on the mathematical principles underlying FRET spectrometry to propose two new spectrometric methods, which have distinct advantages compared to other methods. One of these methods relies on statistical analysis of color mixing in subpopulations of fluorescently tagged molecules to probe molecular association stoichiometry, while the other exploits the color shift induced by FRET to also derive geometric information in addition to stoichiometry. The appeal of the first method stems from its sheer simplicity, while the strength of the second consists in its ability to provide structural information.

  13. River bottom sediment from the Vistula as matrix of candidate for a new reference material.

    Science.gov (United States)

    Kiełbasa, Anna; Buszewski, Bogusław

    2017-08-01

    Bottom sediments are very important in aquatic ecosystems. The sediments accumulate heavy metals and compounds belonging to the group of persistent organic pollutants. Accelerated solvent extraction (ASE) was used for the extraction of 16 compounds from the PAH group from bottom sediment of the Vistula. For the matrix of the candidate for a new reference material, moisture content, particle size, loss on ignition, pH and total organic carbon were determined. A gas chromatograph with a selective mass detector (GC/MS) was used for the final analysis. The obtained recoveries ranged from 86% (SD=6.9) for anthracene to 119% (SD=5.4) for dibenzo(ah)anthracene. For the candidate for a new reference material, homogeneity and analyte contents were determined using a validated method. The results are a very important part of the development and certification of new reference materials. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Information

    International Nuclear Information System (INIS)

    Boyard, Pierre.

    1981-01-01

    The fear of nuclear energy, and more particularly of radioactive wastes, is analyzed in its sociological context. Everybody agrees on the need for information; information is available, but there is a problem with its diffusion. Reactions of the public are analyzed, and journalists, scientists and teachers have a role to play [fr

  15. High-frequency internal waves and thick bottom mixed layers observed by gliders in the Gulf Stream

    Science.gov (United States)

    Todd, Robert E.

    2017-06-01

    Autonomous underwater gliders are conducting high-resolution surveys within the Gulf Stream along the U.S. East Coast. Glider surveys reveal two mechanisms by which energy is extracted from the Gulf Stream as it flows over the Blake Plateau, a portion of the outer continental shelf between Florida and North Carolina where bottom depths are less than 1000 m. Internal waves with vertical velocities exceeding 0.1 m s-1 and frequencies just below the local buoyancy frequency are routinely found over the Blake Plateau, particularly near the Charleston Bump, a prominent topographic feature. These waves are likely internal lee waves generated by the subinertial Gulf Stream flow over the irregular bathymetry of the outer continental shelf. Bottom mixed layers with O(100) m thickness are also frequently encountered; these thick bottom mixed layers likely form in the lee of topography due to enhanced turbulence generated by O(1) m s-1 near-bottom flows.

  16. Gobi information extraction based on decision tree classification method

    Institute of Scientific and Technical Information of China (English)

    冯益明; 智长贵; 姚爱冬

    2013-01-01

    Gobi is one of the main landscape types of the earth's surface in the arid region of northwestern China, with a total area of 458 000-757 000 km2, accounting for 4.8%-7.9% of China's total land area. The gobi holds abundant natural resources such as minerals, wind energy and solar power. Many modern cities and towns and some important traffic routes have also been constructed in the gobi region, and the region plays an important role in the development of the western economy. It is therefore important to carry out gobi research under current social and economic conditions, and accurately revealing the distribution and area of gobi is the basis and premise of such research. At present, fieldwork is difficult because of the harsh natural conditions and sparse population of the gobi region, which leads to a scarcity of research documents on the situation, distribution, type classification, transformation and utilization of gobi. The region studied in this paper is a typical gobi distribution region, located in Ejina County in Inner Mongolia, China; its climatic characteristics include scarce rainfall, high evaporation, abundant sunshine, large temperature differences and frequent wind-blown sand. Landsat TM5 and TM7 remote sensing imagery from the plant growth seasons of 2005-2010, a DEM with 30 m spatial resolution, administrative maps, present land use maps, field investigation data and related documents were used as the basic data resources. First, the non-gobi regions were excluded in GIS software by analyzing the DEM. Then, based on an analysis of the spectral characteristics of different typical ground objects, a knowledge-based decision tree information extraction model was constructed to classify the remote sensing imagery, and eroded gobi and accumulated gobi were separated relatively accurately. The overall accuracy of the extracted gobi information reached 91.57%. There were few materials in China on using
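
    The decision-tree classification step described in this record can be illustrated with a generic classifier sketch; the band values, training table and class labels below are invented stand-ins, not the paper's knowledge rules.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical training samples: rows are pixels, columns are TM band
        # reflectances plus a DEM-derived slope value; labels are surface classes.
        X_train = np.array([[0.28, 0.31, 0.35, 2.0],    # eroded gobi
                            [0.30, 0.33, 0.37, 1.5],    # eroded gobi
                            [0.22, 0.26, 0.30, 0.8],    # accumulated gobi
                            [0.20, 0.24, 0.29, 0.5],    # accumulated gobi
                            [0.10, 0.15, 0.40, 0.3],    # vegetation / non-gobi
                            [0.05, 0.07, 0.08, 0.2]])   # water / non-gobi
        y_train = ["eroded", "eroded", "accumulated", "accumulated",
                   "non-gobi", "non-gobi"]

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
        print(tree.predict([[0.29, 0.32, 0.36, 1.8],     # resembles eroded gobi
                            [0.08, 0.10, 0.12, 0.4]]))   # resembles non-gobi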

  17. Control Properties of Bottom Fired Marine Boilers

    DEFF Research Database (Denmark)

    Solberg, Brian; Andersen, Palle; Karstensen, Claus M. S.

    2005-01-01

    This paper focuses on model analysis of a dynamic model of a bottom fired one-pass smoke tube boiler. Linearised versions of the model are analysed to determine how gain, time constants and right half plane zeros (caused by the shrink-and-swell phenomenon) depend on the steam flow load. Furthermore...... the interactions in the system are inspected to analyse potential benefit from using a multivariable control strategy in favour of the current strategy based on single loop theory. An analysis of the nonlinear model is carried out to further determine the nonlinear characteristics of the boiler system...

  18. Rankine bottoming cycle safety analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, G.A.

    1980-02-01

    Vector Engineering Inc. conducted a safety and hazards analysis of three Rankine Bottoming Cycle Systems in public utility applications: a Thermo Electron system using Fluorinal-85 (a mixture of 85 mole % trifluoroethanol and 15 mole % water) as the working fluid; a Sundstrand system using toluene as the working fluid; and a Mechanical Technology system using steam and Freon-II as the working fluids. The properties of the working fluids considered are flammability, toxicity, and degradation, and the risks to both plant workers and the community at large are analyzed.

  19. Utilization of Bottom Ash as Artificial Aggregate

    OpenAIRE

    Nuciferani, Felicia Tria; Antoni, Antoni; Hardjito, Djwantoro

    2014-01-01

    The aim of this study is to explore the possible use of bottom ash as artificial aggregate. It is found that the pelletizer method using a mixer without blades is one possible way to manufacture artificial aggregates. The optimum mixture composition of the artificial aggregate is found to be 3 BA : 1 FA : 0.5 C, by weight, immersed once in cement slurry. The water content in SSD (saturated surface dry) condition is 27%, with an aggregate compressive strength of 2.4 MPa at seven days. Concrete produced with ...

  20. Development of debris resistant bottom end piece

    International Nuclear Information System (INIS)

    Lee, Jae Kyung; Sohn, Dong Seong; Yim, Jeong Sik; Hwang, Dae Hyun; Song, Kee Nam; Oh, Dong Seok; Rhu, Ho Sik; Lee, Chang Woo; Kim, Seong Soo; Oh, Jong Myung

    1993-12-01

    Debris-related fuel failures have been identified as one of the major causes of fuel failures. In order to reduce the possibility of debris-related fuel failures, it is necessary to develop a Debris-Resistant Bottom End Piece (DRBEP). For this development, mechanical strength tests and pressure drop tests were performed, and the test results were analyzed. The laser cutting, laser welding and electron beam welding technologies, which are the core manufacturing technologies of the DRBEP, were developed. The final design was completed, and the final drawings and specifications were prepared. A prototype DRBEP was manufactured according to the developed manufacturing procedure. (Author)

  1. Bottom nozzle of a LWR fuel assembly

    International Nuclear Information System (INIS)

    Leroux, J.C.

    1991-01-01

    The bottom nozzle consists of a transverse element in the form of a box having a bending-resistant grid structure, which has an outer peripheral frame with a cross-section corresponding to that of the fuel assembly and walls defining large cells. The transverse element has a retainer plate with a regular array of openings. The retainer plate is fixed above and parallel to the grid structure, with a spacing, in order to form between the grid structure and the retainer plate a free space for tranquil flow of cooling water and for debris collection [fr]

  2. Bottom loaded filter for radioactive liquids

    International Nuclear Information System (INIS)

    Wendland, W.G.

    1983-01-01

    This invention relates to equipment for filtering liquids and more particularly to filter assemblies for use with radioactive by-products of nuclear power plants. The invention provides a compact, bottom-loaded filter assembly that can be quickly and safely loaded and unloaded without the use of complex remote equipment. The assembly is integrally shielded and does not require external shielding. The closure hatch may be automatically aligned to facilitate quick sealing attachment after replacement of the filter cartridge, and the filter cartridge may be automatically positioned within the filter housing during the replacement operation

  3. A new kind of bottom quark factory

    International Nuclear Information System (INIS)

    Mtingwa, S.K.; Strikman, M.; AN SSSR, Leningrad

    1991-01-01

    We describe a novel method of producing large numbers of B mesons containing bottom quarks. It is known that one should analyze at least 10^9 B meson decays to elucidate the physics of CP violation and rare B decay modes. Using the ultra high energy electron beams from the future generation of electron linear colliders, we Compton backscatter low energy laser beams off these electron beams. From this process, we produce hot photons with energies of hundreds of GeV. Upon scattering these hot photons onto stationary targets, we show that it is possible to photoproduce and measure the necessary 10^9 B mesons per year. 24 refs., 4 figs

  4. Neoliberalism Viewed From the Bottom Up

    DEFF Research Database (Denmark)

    Danneris, Sophie

    2017-01-01

    Drawing on the assumption that it is pivotal to include a bottom up perspective to understand the way in which the welfare system functions, this chapter sets out to explore the lived experience of neoliberalism. The purpose is to gain insight into the consequences of neoliberalism from...... the viewpoint of the vulnerable benefit claimants who encounter it on a daily basis. The analysis is based on a qualitative longitudinal study conducted from 2013 to 2015, which shows how, in varying ways, clients routinely cope with being part of a neoliberal welfare state: by resignation, by taking action...

  5. Revisiting the round bottom flask rainbow experiment

    Science.gov (United States)

    Selmke, Markus; Selmke, Sarah

    2018-01-01

    A popular demonstration experiment in optics uses a round-bottom flask filled with water to project a circular rainbow on a screen with a hole through which the flask is illuminated. We show how the vessel's wall shifts the first- and second-order bows towards each other and consequently reduces the width of Alexander's dark band. We address the challenge this introduces in observing Alexander's dark band, and explain the importance of a sufficient distance between the flask and the screen. The wall-effect also introduces a splitting of the bows that can easily be misinterpreted.

  6. ICT-ENABLED BOTTOM-UP ARCHITECTURAL DESIGN

    Directory of Open Access Journals (Sweden)

    Burak Pak

    2016-04-01

    Full Text Available This paper aims at discussing the potentials of bottom-up design practices in relation to the latest developments in Information and Communication Technologies (ICT) by making an in-depth review of inaugural cases. The first part of the study involves a literature study and the elaboration of basic strategies from the case study. The second part reframes the existing ICT tools and strategies and elaborates on their potentials to support the modes of participation performed in these cases. As a result, by distilling the created knowledge, the study reveals the potentials of novel modes of ICT-enabled design participation which exploit a set of collective action tools to support sustainable ways of self-organization and bottom-up design. The final part explains the relevance of these with solid examples and presents a hypothetical case for future implementation. The paper concludes with a brief reflection on the implications of the findings for the future of architectural design education.

  7. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

    Full Text Available Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 per tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 per tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 - 12.2 per tCO2 for a period of 30 years. Transaction costs for measurement, reporting and verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 per tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs, in a range of US$ 0.06 – 0.11 per tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to

  8. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Science.gov (United States)

    2012-01-01

    Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 per tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 per tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 - 12.2 per tCO2 for a period of 30 years. Transaction costs for measurement, reporting and verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 per tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs, in a range of US$ 0.06 – 0.11 per tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to provide information on the
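
    The opportunity-cost element above follows from a simple relation: the forgone profit of the most attractive alternative land use divided by the CO2 emissions that avoiding deforestation prevents. A hedged sketch of that relation; the land-use returns and carbon stocks below are invented, not figures from the Tanzanian projects.

```python
def opportunity_cost_per_tco2(npv_alternative_usd_ha, npv_forest_usd_ha,
                              carbon_forest_tc_ha, carbon_alternative_tc_ha):
    """Opportunity cost of avoided deforestation, in US$ per tCO2.

    NPVs are discounted returns per hectare of the alternative land use
    (e.g. agriculture) and of keeping the forest; carbon stocks are in tC/ha.
    The factor 44/12 converts carbon to CO2.
    """
    forgone_profit = npv_alternative_usd_ha - npv_forest_usd_ha
    avoided_emissions = (carbon_forest_tc_ha - carbon_alternative_tc_ha) * 44.0 / 12.0
    return forgone_profit / avoided_emissions

# Illustrative numbers only: conversion yields US$ 900/ha more than forest use,
# while deforestation would release the difference between 60 and 10 tC/ha.
print(opportunity_cost_per_tco2(1200.0, 300.0, 60.0, 10.0))  # ~4.9 US$/tCO2
```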

  9. Nuclear expert web mining system: monitoring and analysis of nuclear acceptance by information retrieval and opinion extraction on the Internet

    Energy Technology Data Exchange (ETDEWEB)

    Reis, Thiago; Barroso, Antonio C.O.; Imakuma, Kengo, E-mail: thiagoreis@usp.b, E-mail: barroso@ipen.b, E-mail: kimakuma@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    This paper presents a research initiative that aims to collect nuclear related information and to analyze opinionated texts by mining the hypertextual data environment and social network web sites on the Internet. Different from previous approaches that employed traditional statistical techniques, a novel Web Mining approach, built using the concept of Expert Systems, is proposed for massive and autonomous data collection and analysis. The initial step has been accomplished, resulting in a framework design that is able to gradually encompass a set of evolving techniques, methods, and theories in such a way that this work will build a platform upon which new research can be performed more easily by just substituting modules or plugging in new ones. Upon completion it is expected that this research will contribute to the understanding of the population's views on nuclear technology and its acceptance. (author)

  10. Nuclear expert web mining system: monitoring and analysis of nuclear acceptance by information retrieval and opinion extraction on the Internet

    International Nuclear Information System (INIS)

    Reis, Thiago; Barroso, Antonio C.O.; Imakuma, Kengo

    2011-01-01

    This paper presents a research initiative that aims to collect nuclear related information and to analyze opinionated texts by mining the hypertextual data environment and social network web sites on the Internet. Different from previous approaches that employed traditional statistical techniques, a novel Web Mining approach, built using the concept of Expert Systems, is proposed for massive and autonomous data collection and analysis. The initial step has been accomplished, resulting in a framework design that is able to gradually encompass a set of evolving techniques, methods, and theories in such a way that this work will build a platform upon which new research can be performed more easily by just substituting modules or plugging in new ones. Upon completion it is expected that this research will contribute to the understanding of the population's views on nuclear technology and its acceptance. (author)

  11. Tsunami Simulation Method Assimilating Ocean Bottom Pressure Data Near a Tsunami Source Region

    Science.gov (United States)

    Tanioka, Yuichiro

    2018-02-01

    A new method was developed to reproduce the tsunami height distribution in and around the source area, at a certain time, from a large number of ocean bottom pressure sensors, without information on an earthquake source. A dense cabled observation network called S-NET, which consists of 150 ocean bottom pressure sensors, was installed recently along a wide portion of the seafloor off Kanto, Tohoku, and Hokkaido in Japan. However, in the source area, the ocean bottom pressure sensors cannot observe directly an initial ocean surface displacement. Therefore, we developed the new method. The method was tested and functioned well for a synthetic tsunami from a simple rectangular fault with an ocean bottom pressure sensor network using 10 arc-min, or 20 km, intervals. For a test case that is more realistic, ocean bottom pressure sensors with 15 arc-min intervals along the north-south direction and sensors with 30 arc-min intervals along the east-west direction were used. In the test case, the method also functioned well enough to reproduce the tsunami height field in general. These results indicated that the method could be used for tsunami early warning by estimating the tsunami height field just after a great earthquake without the need for earthquake source information.

  12. Quality assurance of MSWI bottom ash. Environmental properties; Kvalitetssaekring av slaggrus. Miljoemaessiga egenskaper

    Energy Technology Data Exchange (ETDEWEB)

    Flyhammar, Peter [Lund Univ. (Sweden). Engineering Geology

    2006-04-15

    In Sweden, several hundred tonnes of MSWI bottom ash are generated annually at 29 incineration plants for municipal solid waste. So far bottom ash has mainly been disposed of in landfills or used as cover material in landfills or in other construction works at landfills. A few applications of bottom ash in construction works outside landfills have been reported. A large problem for the market for bottom ash and other secondary materials outside Swedish waste treatment plants is the lack of rules and regulations for a non-polluting use. During 2002 Hartlen and Groenholm presented a proposal for a system to assure the quality of bottom ash after homogenization and stabilization. They noted that the leaching of salts and metals to ground water constitutes the largest environmental risk during the use of bottom ash. Therefore, a quality assurance of environmental properties should be based on leaching tests. The aim of this project was to study how the control of the environmental properties of bottom ash (primarily leaching properties), earlier described in e.g. a product information sheet, should be worked out. The starting point has been a control system for bottom ash developed by Sysav. Different leaching tests, however, illustrate different aspects of the environmental properties, e.g. short-term and long-term leaching. Limit and target values for different variables could affect both the possibilities to use bottom ash and the sampling from storage heaps. We have chosen to investigate: pH, availability and leached amount, and the connection between these variables; the possibilities to use pH or the availability to assess both short-term and long-term leaching properties; how the number of subsamples that should be collected from a storage heap is affected by different control variables and quality requirements; and how bottom ash is stabilized by today's storage technology and how the technology could be improved. Our sample test of bottom ash from Swedish

  13. Machine-learned solutions for three stages of clinical information extraction: the state of the art at i2b2 2010.

    Science.gov (United States)

    de Bruijn, Berry; Cherry, Colin; Kiritchenko, Svetlana; Martin, Joel; Zhu, Xiaodan

    2011-01-01

    As clinical text mining continues to mature, its potential as an enabling technology for innovations in patient care and clinical research is becoming a reality. A critical part of that process is rigid benchmark testing of natural language processing methods on realistic clinical narrative. In this paper, the authors describe the design and performance of three state-of-the-art text-mining applications from the National Research Council of Canada on evaluations within the 2010 i2b2 challenge. The three systems perform three key steps in clinical information extraction: (1) extraction of medical problems, tests, and treatments, from discharge summaries and progress notes; (2) classification of assertions made on the medical problems; (3) classification of relations between medical concepts. Machine learning systems performed these tasks using large-dimensional bags of features, as derived from both the text itself and from external sources: UMLS, cTAKES, and Medline. Performance was measured per subtask, using micro-averaged F-scores, as calculated by comparing system annotations with ground-truth annotations on a test set. The systems ranked high among all submitted systems in the competition, with the following F-scores: concept extraction 0.8523 (ranked first); assertion detection 0.9362 (ranked first); relationship detection 0.7313 (ranked second). For all tasks, we found that the introduction of a wide range of features was crucial to success. Importantly, our choice of machine learning algorithms allowed us to be versatile in our feature design, and to introduce a large number of features without overfitting and without encountering computing-resource bottlenecks.
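
    Micro-averaged F-scores, the metric used in the evaluation above, pool the true positives, false positives and false negatives over all documents before computing precision and recall. A small self-contained sketch; the counts in the example are invented.

```python
def micro_f1(per_doc_counts):
    """Micro-averaged precision, recall and F1.

    per_doc_counts is an iterable of (tp, fp, fn) tuples, one per document or
    annotation type; the counts are pooled first, which is what distinguishes
    micro-averaging from averaging per-document scores.
    """
    tp = sum(c[0] for c in per_doc_counts)
    fp = sum(c[1] for c in per_doc_counts)
    fn = sum(c[2] for c in per_doc_counts)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# e.g. three documents with (tp, fp, fn) concept-extraction counts
print(micro_f1([(42, 5, 7), (10, 2, 1), (30, 4, 6)]))
```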

  14. The petroleum industry improving the bottom line

    International Nuclear Information System (INIS)

    Benner, R.I.

    1992-01-01

    The oil and gas exploration and production business environment has presented many challenges over the last decade, notably price volatility and rising costs. Managing the margin and changing a company's cost structure to improve the bottom line is a major issue with company executives. The experiences of Oryx Energy Company since its spinoff from Sun Company in 1988 are used as an example of a company makeover. A generalized exploration and production income statement is employed to present industry cost/portfolio relationships and strategies for improving the bottom line. At Oryx, three major strategies were set in place to enhance shareholder value: an increased emphasis on applied technology, including horizontal drilling, advanced 3-dimensional seismic prospecting, and intensive use of interactive computer workstations; international expansion; and an emphasis on the U.S. Gulf of Mexico, deemphasizing the onshore U.S. and the gas processing business. Specific strategies are outlined in the areas of increasing revenues, reducing production cost and exploration expense, and controlling general and administrative expenses. 8 refs., 18 figs., 2 tabs

  15. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    Science.gov (United States)

    Hosseini-Golgoo, S. M.; Bozorgi, H.; Saberkari, A.

    2015-06-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively.

  16. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    International Nuclear Information System (INIS)

    Hosseini-Golgoo, S M; Bozorgi, H; Saberkari, A

    2015-01-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively. (paper)
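
    Fisher's discriminant ratio used above as the separability measure can be illustrated, for two classes along one projected feature, as the squared difference of class means divided by the sum of class variances. A short sketch with invented data; the feature values are not taken from the study.

```python
import numpy as np

def fisher_discriminant_ratio(x_a, x_b):
    """Fisher's discriminant ratio between two classes along one feature.

    x_a, x_b are 1-D arrays of projected feature values (e.g. one LDA axis)
    for the two classes; larger values indicate better separability.
    """
    mu_a, mu_b = np.mean(x_a), np.mean(x_b)
    var_a, var_b = np.var(x_a), np.var(x_b)
    return (mu_a - mu_b) ** 2 / (var_a + var_b)

# Illustrative comparison of two hypothetical feature extractors on the same
# pair of gases: the extractor with the larger ratio separates them better.
rng = np.random.default_rng(0)
features_net1 = (rng.normal(0.0, 1.0, 200), rng.normal(3.0, 1.0, 200))
features_net2 = (rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200))
print(fisher_discriminant_ratio(*features_net1))  # larger -> more discriminative
print(fisher_discriminant_ratio(*features_net2))
```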

  17. Extracting drug mechanism and pharmacodynamic information from clinical electroencephalographic data using generalised semi-linear canonical correlation analysis

    International Nuclear Information System (INIS)

    Brain, P; Strimenopoulou, F; Ivarsson, M; Wilson, F J; Diukova, A; Wise, R G; Berry, E; Jolly, A; Hall, J E

    2014-01-01

    Conventional analysis of clinical resting electroencephalography (EEG) recordings typically involves assessment of spectral power in pre-defined frequency bands at specific electrodes. EEG is a potentially useful technique in drug development for measuring the pharmacodynamic (PD) effects of a centrally acting compound and hence to assess the likelihood of success of a novel drug based on pharmacokinetic–pharmacodynamic (PK–PD) principles. However, the need to define the electrodes and spectral bands to be analysed a priori is limiting where the nature of the drug-induced EEG effects is initially not known. We describe the extension to human EEG data of a generalised semi-linear canonical correlation analysis (GSLCCA), developed for small animal data. GSLCCA uses data from the whole spectrum, the entire recording duration and multiple electrodes. It provides interpretable information on the mechanism of drug action and a PD measure suitable for use in PK–PD modelling. Data from a study with low (analgesic) doses of the μ-opioid agonist, remifentanil, in 12 healthy subjects were analysed using conventional spectral edge analysis and GSLCCA. At this low dose, the conventional analysis was unsuccessful but plausible results consistent with previous observations were obtained using GSLCCA, confirming that GSLCCA can be successfully applied to clinical EEG data. (paper)

  18. Extracting the Beat: An Experience-dependent Complex Integration of Multisensory Information Involving Multiple Levels of the Nervous System

    Directory of Open Access Journals (Sweden)

    Laurel J. Trainor

    2009-04-01

    Full Text Available In a series of studies we have shown that movement (or vestibular stimulation) that is synchronized to every second or every third beat of a metrically ambiguous rhythm pattern biases people to perceive the meter as a march or as a waltz, respectively. Riggle (this volume) claims that we postulate an "innate", "specialized brain unit" for beat perception that is "directly" influenced by vestibular input. In fact, to the contrary, we argue that experience likely plays a large role in the development of rhythmic auditory-movement interactions, and that rhythmic processing in the brain is widely distributed and includes subcortical and cortical areas involved in sound processing and movement. Further, we argue that vestibular and auditory information are integrated at various subcortical and cortical levels along with input from other sensory modalities, and it is not clear which levels are most important for rhythm processing or, indeed, what a "direct" influence of vestibular input would mean. Finally, we argue that vestibular input to sound location mechanisms may be involved, but likely cannot explain the influence of vestibular input on the perception of auditory rhythm. This remains an empirical question for future research.

  19. Improving the extraction of crisis information in the context of flood, fire, and landslide rapid mapping using SAR and optical remote sensing data

    Science.gov (United States)

    Martinis, Sandro; Clandillon, Stephen; Twele, André; Huber, Claire; Plank, Simon; Maxant, Jérôme; Cao, Wenxi; Caspard, Mathilde; May, Stéphane

    2016-04-01

    Optical and radar satellite remote sensing have proven to provide essential crisis information in case of natural disasters, humanitarian relief activities and civil security issues in a growing number of cases through mechanisms such as the Copernicus Emergency Management Service (EMS) of the European Commission or the International Charter 'Space and Major Disasters'. The aforementioned programs and initiatives make use of satellite-based rapid mapping services aimed at delivering reliable and accurate crisis information after natural hazards. Although these services are increasingly operational, they need to be continuously updated and improved through research and development (R&D) activities. The principal objective of ASAPTERRA (Advancing SAR and Optical Methods for Rapid Mapping), the ESA-funded R&D project being described here, is to improve, automate and, hence, speed up geo-information extraction procedures in the context of natural hazards response. This is performed through the development, implementation, testing and validation of novel image processing methods using optical and Synthetic Aperture Radar (SAR) data. The methods are mainly developed based on data from the German radar satellites TerraSAR-X and TanDEM-X, the French satellite missions Pléiades-1A/1B as well as the ESA missions Sentinel-1/2, with the aim to better characterize the potential and limitations of these sensors and their synergy. The resulting algorithms and techniques are evaluated in real case applications during rapid mapping activities. The project is focussed on three types of natural hazards: floods, landslides and fires. Within this presentation an overview of the main methodological developments in each topic is given and demonstrated in selected test areas. The following developments are presented in the context of flood mapping: a fully automated Sentinel-1 based processing chain for detecting open flood surfaces, a method for the improved detection of flooded vegetation
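
    As an illustration of the simplest building block of open-water detection from SAR backscatter, the sketch below thresholds a calibrated sigma-nought image in dB. The threshold and the single exclusion mask are placeholders; they do not represent the project's actual Sentinel-1 processing chain, which derives thresholds automatically and adds further refinements.

```python
import numpy as np

def map_open_water(sigma0_db, threshold_db=-18.0, exclude_mask=None):
    """Minimal threshold-based water mask from calibrated SAR backscatter.

    sigma0_db: 2-D array of sigma-nought values in dB (smooth open water
    backscatters weakly, so low values suggest water). threshold_db is a
    placeholder; operational chains estimate it from the image histogram and
    refine the result with terrain, reference-water and fuzzy-logic layers.
    exclude_mask: optional boolean array of pixels to ignore (e.g. radar shadow).
    """
    water = sigma0_db < threshold_db
    if exclude_mask is not None:
        water &= ~exclude_mask
    return water
```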

  20. The duality of gaze: Eyes extract and signal social information during sustained cooperative and competitive dyadic gaze

    Directory of Open Access Journals (Sweden)

    Michelle eJarick

    2015-09-01

    Full Text Available In contrast to nonhuman primate eyes, which have a dark sclera surrounding a dark iris, human eyes have a white sclera that surrounds a dark iris. This high contrast morphology allows humans to determine quickly and easily where others are looking and infer what they are attending to. In recent years an enormous body of work has used photos and schematic images of faces to study these aspects of social attention, e.g., the selection of the eyes of others and the shift of attention to where those eyes are directed. However, evolutionary theory holds that humans did not develop a high contrast morphology simply to use the eyes of others as attentional cues; rather they sacrificed camouflage for communication, that is, to signal their thoughts and intentions to others. In the present study we demonstrate the importance of this by taking as our starting point the hypothesis that a cornerstone of nonverbal communication is the eye contact between individuals and the time that it is held. In a single simple study we show experimentally that the effect of eye contact can be quickly and profoundly altered merely by having participants, who had never met before, play a game in a cooperative or competitive manner. After the game participants were asked to make eye contact for a prolonged period of time (10 minutes. Those who had played the game cooperatively found this terribly difficult to do, repeatedly talking and breaking gaze. In contrast, those who had played the game competitively were able to stare quietly at each other for a sustained period. Collectively these data demonstrate that when looking at the eyes of a real person one both acquires and signals information to the other person. This duality of gaze is critical to nonverbal communication, with the nature of that communication shaped by the relationship between individuals, e.g., cooperative or competitive.

  1. Monitoring Strategies of Earth Dams by Ground-Based Radar Interferometry: How to Extract Useful Information for Seismic Risk Assessment.

    Science.gov (United States)

    Di Pasquale, Andrea; Nico, Giovanni; Pitullo, Alfredo; Prezioso, Giuseppina

    2018-01-16

    The aim of this paper is to describe how ground-based radar interferometry can provide displacement measurements of earth dam surfaces and of the vibration frequencies of their main concrete infrastructures. In many cases, dams were built many decades ago and were not equipped with in situ sensors embedded in the structure at the time of construction. Earth dams have scattering properties similar to landslides, for which the Ground-Based Synthetic Aperture Radar (GBSAR) technique has so far been extensively applied to study ground displacements. In this work, SAR and Real Aperture Radar (RAR) configurations are used for the measurement of earth dam surface displacements and of the vibration frequencies of concrete structures, respectively. A methodology for the acquisition of SAR data and the rendering of results is described. The geometrical correction factor, needed to transform the Line-of-Sight (LoS) displacement measurements of GBSAR into an estimate of the horizontal displacement vector of the dam surface, is derived. Furthermore, a methodology for the acquisition of RAR data and the representation of displacement temporal profiles and vibration frequency spectra of dam concrete structures is presented. For this study a Ku-band ground-based radar, equipped with horn antennas having different radiation patterns, has been used. Four case studies, using different radar acquisition strategies specifically developed for the monitoring of earth dams, are examined. The results of this work show the information that a Ku-band ground-based radar can provide to structural engineers for a non-destructive seismic assessment of earth dams.
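
    The geometrical correction mentioned above projects the line-of-sight measurement onto an assumed horizontal displacement direction. A generic sketch of such a projection; the angle names and the validity check are illustrative assumptions, not the specific factor derived in the paper.

```python
import math

def horizontal_from_los(d_los, elevation_deg, azimuth_offset_deg):
    """Project a radar line-of-sight displacement onto an assumed horizontal direction.

    elevation_deg: angle of the LoS above the horizontal plane.
    azimuth_offset_deg: horizontal angle between the LoS and the assumed
    displacement direction (e.g. perpendicular to the dam axis).
    Only the component of motion along that direction is recoverable, and the
    factor blows up as the LoS becomes orthogonal to it.
    """
    projection = math.cos(math.radians(elevation_deg)) * math.cos(math.radians(azimuth_offset_deg))
    if abs(projection) < 0.1:
        raise ValueError("LoS nearly orthogonal to the displacement direction")
    return d_los / projection

print(horizontal_from_los(2.0, elevation_deg=15.0, azimuth_offset_deg=20.0))  # mm, illustrative
```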

  2. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    NARCIS (Netherlands)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai; Popescu, Elvira; Rehm, Matthias; Mealha, Oscar

    2017-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method

  3. Quality assurance of MSWI bottom ash. Environmental properties; Kvalitetssaekring av slaggrus. Miljoemaessiga egenskaper

    Energy Technology Data Exchange (ETDEWEB)

    Flyhammar, Peter [Lund Univ. (Sweden). Dept. of Engineering Geology

    2006-04-15

    In Sweden several hundred tonnes of MSWI bottom ash are generated annually at 29 incineration plants for municipal solid waste. So far bottom ash has mainly been disposed in to landfills or used as cover material in landfills or in other construction works at landfills. A few applications of bottom ash in construction works outside landfills have been reported. A large problem for the market of bottom ash and other secondary materials outside Swedish waste treatment plants is the lack of roles and regulations for a non-polluting use. During 2002 Hartlen and Groenholm (HG) presented a proposal to a system to assure the quality of bottom ash after homogenization and stabilization. A quality assurance of environmental properties should be based on leaching tests. The aim of this project was to study how the control of environmental properties of bottom ash earlier described in e.g. a product information sheet should be worked out. The starting-point has been a control system for bottom ash developed by the Sysav company. Different leaching tests illustrate however different aspects of the environmental properties, e.g. short-term and long-term leaching. Limit and target values for different variables could affect both the possibilities to use bottom ash as well as the sampling from storage heaps. We have chosen to investigate: pH, availability and leached amount and the connection between these variables; the possibilities to use pH or the availability to assess both short-term and long term leaching properties; how the number of subsamples that should be collected from a storage heap is affected by different control variables and quality requirements; how bottom ash is stabilized by today's storage technology and how the technology could be improved. Our sample test of bottom ash from Swedish incineration plants indicates that the availability of elements such as Cd, Cu, Cr, Ni, Pb and Zn in bottom ash usually is below Sysav's target values. Extreme values

  4. Extracting central places from the link structure in Wikipedia

    DEFF Research Database (Denmark)

    Kessler, Carsten

    2017-01-01

    of the German language edition of Wikipedia. The official upper and middle centers declared, based on German spatial laws, are used as a reference dataset. The characteristics of the link structure around their Wikipedia pages, which link to each other or mention each other, and how often, are used to develop...... a bottom-up method for extracting central places from Wikipedia. The method relies solely on the structure and number of links and mentions between the corresponding Wikipedia pages; no spatial information is used in the extraction process. The output of this method shows significant overlap...... with the official central place structure, especially for the upper centers. The results indicate that real-world relationships are in fact reflected in the link structure on the web in the case of Wikipedia....
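
    A toy sketch of the underlying idea, ranking place pages by how often other place pages link to or mention them. The scoring and the example counts are invented and far simpler than the published method.

```python
from collections import defaultdict

def centrality_scores(pages):
    """Rank place pages by mutual links/mentions (rough sketch of the idea).

    pages: dict mapping a place name to a dict of other pages it links to or
    mentions, with counts, e.g. {"Hamburg": {"Kiel": 3, "Lübeck": 2}, ...}.
    The score here is simply the weighted in-degree from other place pages;
    the published method uses a richer combination of link and mention counts.
    """
    scores = defaultdict(int)
    for source, targets in pages.items():
        for target, count in targets.items():
            if target in pages and target != source:
                scores[target] += count
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Tiny illustrative graph (counts are invented):
pages = {
    "Hamburg": {"Kiel": 3, "Lübeck": 2},
    "Kiel": {"Hamburg": 5, "Lübeck": 1},
    "Lübeck": {"Hamburg": 4, "Kiel": 2},
}
print(centrality_scores(pages))  # Hamburg ranks first
```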

  5. Informe (Report)

    Directory of Open Access Journals (Sweden)

    Egon Lichetenberger

    1950-10-01

    Full Text Available Report by Dr. Egon Lichetenberger to the Faculty's Board of Directors on the specialization course in Pathological Anatomy sponsored by the Kellogg Foundation (Department of Pathology

  6. Operating history report for the Peach Bottom HTGR. Volume I. Reactor operating history

    International Nuclear Information System (INIS)

    Scheffel, W.J.; Baldwin, N.L.; Tomlin, R.W.

    1976-01-01

    The operating history for the Peach Bottom-1 Reactor is presented for the years 1966 through 1975. Information concerning general chemistry data, general physics data, location of sensing elements in the primary helium circuit, and postirradiation examination and testing of reactor components is presented

  7. Novel compact tiltmeter for ocean bottom and other frontier observations

    International Nuclear Information System (INIS)

    Takamori, Akiteru; Araya, Akito; Kanazawa, Toshihiko; Shinohara, Masanao; Bertolini, Alessandro; DeSalvo, Riccardo

    2011-01-01

    Long-term observations conducted with large arrays of tiltmeters deployed in ocean-bottom boreholes, on the seafloor and in other hazardous regions are expected to provide rich information useful in geosciences, social engineering, resource monitoring and other applications. To facilitate such observations, we designed and built a compact, highly sensitive tiltmeter with sufficient performance, comparable to that of much larger instruments that are difficult to operate in the target locations. The new tiltmeter is suitable for observations requiring multiple instruments because its design is optimized for low-cost mass production. This paper describes its key technologies, including a very compact folded pendulum and an optical fiber readout. Preliminary results of test observations conducted using a prototype tiltmeter are compared with a conventional water-tube tiltmeter

  8. Estimates of bottom roughness length and bottom shear stress in South San Francisco Bay, California

    Science.gov (United States)

    Cheng, R.T.; Ling, C.-H.; Gartner, J.W.; Wang, P.-F.

    1999-01-01

    A field investigation of the hydrodynamics and the resuspension and transport of particulate matter in a bottom boundary layer was carried out in South San Francisco Bay (South Bay), California, during March-April 1995. Using broadband acoustic Doppler current profilers, detailed measurements of the turbulent mean velocity distribution within 1.5 m above the bed have been obtained. A global method of data analysis was used for estimating bottom roughness length zo and bottom shear stress (or friction velocities u*). Field data have been examined by dividing the time series of velocity profiles into 24-hour periods and independently analyzing the velocity profile time series by flooding and ebbing periods. The global method of solution gives consistent properties of bottom roughness length zo and bottom shear stress values (or friction velocities u*) in South Bay. Estimated mean values of zo and u* for flooding and ebbing cycles are different. The differences in mean zo and u* are shown to be caused by tidal current flood-ebb inequality, rather than the flooding or ebbing of tidal currents. The bed shear stress correlates well with a reference velocity; the slope of the correlation defines a drag coefficient. Forty-three days of field data in South Bay show two regimes of zo (and drag coefficient) as a function of a reference velocity. When the mean velocity is >25-30 cm s^-1, the ln zo (and thus the drag coefficient) is inversely proportional to the reference velocity. The cause for the reduction of roughness length is hypothesized as sediment erosion due to intensifying tidal currents, thereby reducing bed roughness. When the mean velocity is <25-30 cm s^-1, the correlation between zo and the reference velocity is less clear. A plausible explanation of scattered values of zo under this condition may be sediment deposition. Measured sediment data were inadequate to support this hypothesis, but the proposed hypothesis warrants further field investigation.
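
    The roughness length and friction velocity above come from fitting the logarithmic law of the wall, u(z) = (u*/κ) ln(z/z0), to the near-bed velocity profile. A single-profile sketch of that fit; the study's global method pools many profiles, and the synthetic data below are only for illustration.

```python
import numpy as np

KAPPA = 0.41  # von Karman constant

def fit_log_law(z, u):
    """Fit u(z) = (u*/kappa) * ln(z/z0) to near-bed velocity measurements.

    z: heights above the bed (m); u: mean velocities (m/s) at those heights.
    Returns (u_star, z0); the bed shear stress then follows as tau = rho * u_star**2.
    """
    slope, intercept = np.polyfit(np.log(z), u, 1)
    u_star = KAPPA * slope
    z0 = np.exp(-intercept / slope)
    return u_star, z0

# Synthetic profile generated with u* = 0.02 m/s and z0 = 0.002 m:
z = np.array([0.25, 0.5, 0.75, 1.0, 1.25, 1.5])
u = 0.02 / KAPPA * np.log(z / 0.002)
print(fit_log_law(z, u))  # recovers (~0.02, ~0.002)
```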

  9. Phosphorus availability from bottom sediments of lakes using a nuclear technique

    International Nuclear Information System (INIS)

    Flores, F.; Facetti, J.F.

    1991-01-01

    Availability of phosphorus from the bottom sediments of a lake plays an important role in the development of aquatic biota and in the enhancement of the eutrophication process. In this work the 31P↔32P isotopic exchange (E values) technique was applied to assess the potential influence of this phosphorus reservoir on the water quality of the Acaray and Yguazu Dams, in the Eastern Region of Paraguay. Samples analyzed were taken from the bottom sediments of the water bodies at different sites as well as from the shores. The method is reliable and yields information of ecological significance

  10. Phosphorus availability from bottom sediments of lakes using a nuclear technique

    International Nuclear Information System (INIS)

    Flores, F.; Facetti, J.F.

    1992-01-01

    Availability of phosphorus from the bottom sediments of a lake plays an important role in the development of aquatic biota and in the enhancement of the eutrophication process. In this work, the 31P-32P isotopic exchange (E values) technique was applied to assess the potential influence of this phosphorus 'reservoir' on the water quality of the Acaray and Yguazu Dams in the Eastern Region of Paraguay. Samples analyzed were taken from the bottom sediments of the water body at different sites as well as from the shores. The method is reliable and yields information of potential ecological significance. (author) 14 refs.; 2 tabs
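
    The E value rests on the isotopic dilution principle: at exchange equilibrium the specific activity of 32P in solution equals that of the whole exchangeable pool. A hedged sketch of that standard relation; the numerical inputs are invented, not measurements from the dams studied.

```python
def e_value(p_solution_mg_per_l, volume_l, mass_kg, r_added_bq, r_solution_bq):
    """Isotopically exchangeable phosphorus (E value) by isotopic dilution.

    E = (Cp * V / m) * (R / r), where Cp is the 31P concentration in solution,
    V the solution volume, m the sediment mass, R the 32P activity added and
    r the 32P activity remaining in solution at equilibrium.
    Result in mg P per kg sediment; the example numbers are purely illustrative.
    """
    return (p_solution_mg_per_l * volume_l / mass_kg) * (r_added_bq / r_solution_bq)

print(e_value(0.10, 0.05, 0.002, 50_000, 4_000))  # = 31.25 mg P/kg
```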

  11. Control Properties of Bottom Fired Marine Boilers

    DEFF Research Database (Denmark)

    Solberg, Brian; Andersen, Palle; Karstensen, Claus M. S.

    2005-01-01

    This paper focuses on model analysis of a dynamic model of a bottom fired one-pass smoke tube boiler. Linearised versions of the model are analysed to determine how gain, time constants and right half plane zeros (caused by the shrink-and-swell phenomenon) depend on the steam flow load. Furthermore...... the interactions in the system are inspected to analyse potential benefit from using a multivariable control strategy in favour of the current strategy based on single loop theory. An analysis of the nonlinear model is carried out to further determine the nonlinear characteristics of the boiler system...... and to verify whether nonlinear control is needed. Finally a controller based on single loop theory is used to analyse if input constraints become active when rejecting transient behaviour from the disturbance steam flow. The model analysis shows large variations in system gains at steady state as function...
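
    The linear analysis described above amounts to inspecting the gains, poles and zeros of a linearised state-space model, with right-half-plane zeros signalling the non-minimum-phase shrink-and-swell behaviour. A minimal sketch of that check; the matrices are invented for illustration and are not the paper's boiler model.

```python
import numpy as np
from scipy import signal

# Hypothetical linearised two-state model of a feedwater-flow -> water-level
# channel; the numbers are invented solely to illustrate the non-minimum-phase
# check and are not taken from the boiler model in the paper.
A = np.array([[-0.02,  0.00],
              [ 0.01, -0.05]])
B = np.array([[0.4],
              [0.1]])
C = np.array([[0.0, 1.0]])
D = np.array([[-0.2]])          # direct "shrink" term makes the response invert

zeros, poles, gain = signal.ss2zpk(A, B, C, D)
print("zeros:", zeros)
print("right-half-plane zero present:", bool(np.any(zeros.real > 0)))
```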

  12. Bottom loaded filter for radioactive liquid

    International Nuclear Information System (INIS)

    Wendland, W.G.

    1980-01-01

    A specification is given for a bottom loaded filter assembly for filtering radioactive liquids through a replaceable cartridge filter, which includes a lead-filled jacket enveloping a housing having a chamber therein for the filter cartridge. A track arrangement carries a hatch for sealing the chamber. A spacer plug supports the cartridge within guide means associated with the inlet conduit in the chamber. The plug and cartridge drop out of the chamber when the hatch is unbolted and moved laterally of the chamber along the track. During cartridge replacement a new plug and cartridge are supported in the guide means by a spacer bar inserted across the track means under the chamber. The hatch is then slid under the chamber and bolted to a flange on the housing, engaging an O-ring to seal the chamber. (author)

  13. Station blackout calculations for Peach Bottom

    International Nuclear Information System (INIS)

    Hodge, S.A.

    1985-01-01

    A calculational procedure for the Station Blackout Severe Accident Sequence at Browns Ferry Unit One has been repeated with plant-specific application to one of the Peach Bottom Units. The only changes required in code input are with regard to the primary containment concrete, the existence of sprays in the secondary containment, and the size of the refueling bay. Combustible gas mole fractions in the secondary containment of each plant during the accident sequence are determined. It is demonstrated why the current state-of-the-art corium/concrete interaction code is inadequate for application to the study of Severe Accident Sequences in plants with the BWR MK I or MK II containment design

  14. Discovering bottom squark coannihilation at the ILC

    International Nuclear Information System (INIS)

    Belyaev, Alexander; Lastovicka, Tomas; Nomerotski, Andrei; Lastovicka-Medin, Gordana

    2010-01-01

    We study the potential of the International Linear Collider (ILC) at √(s)=500 GeV to probe a new dark matter motivated scenario in which the bottom squark (sbottom) is the next-to-lightest supersymmetric particle. For this scenario, which is virtually impossible for the LHC to test, the ILC has the potential to cover a large fraction of the parameter space. The challenge is due to the very low energy of the jets, below 20-30 GeV, which pushes the jet clustering and flavor tagging algorithms to their limits. The process of sbottom pair production was studied within the SiD detector concept. We demonstrate that the ILC offers a unique opportunity to test the supersymmetry parameter space motivated by the sbottom-neutralino coannihilation scenario in cases when sbottom production is kinematically accessible. The study was done with the full SiD simulation and reconstruction chain including all standard model and beam backgrounds.

  15. Scraping the bottom of the barrel

    Energy Technology Data Exchange (ETDEWEB)

    Leite, L.F. [PETROBRAS (Brazil)

    2001-03-01

    This article focuses on technologies for upgrading residual streams to improve refiners' margins, and reports on the refining technology programme (PROTER) set up by the Brazilian company PETROBRAS. Details are given of fluid catalytic cracking (FCC) pilot units at PETROBRAS's CENPES Research and Development Centre in Rio de Janeiro State, the development of new proprietary closed cyclone technology, the Ultramist feedstock injection device, the feed nozzle, and the high accessibility catalyst. FCC units at PETROBRAS, ongoing FCC projects, and the use of delayed coking to convert low value residues into higher value products are described, along with other bottom-of-barrel projects such as residue hydrocracking, hydropyrolysis, and the production of a stable fuel emulsion from an asphalt residue stream.

  16. Bottom loaded filter for radioactive liquids

    International Nuclear Information System (INIS)

    Wendland, W.G.

    1980-01-01

    A bottom loaded filter assembly for filtering radioactive liquids through a replaceable cartridge filter is disclosed. The filter assembly includes a lead-filled jacket enveloping a housing having a chamber therein for the filter cartridge. A track arrangement carries a hatch for sealing the chamber. A spacer plug supports the cartridge within guide means associated with the inlet conduit in the chamber. The plug and cartridge drop out of the chamber when the hatch is unbolted and moved laterally of the chamber. During cartridge replacement, a new plug and cartridge are supported in the guide means by a spacer bar inserted across the track means under the chamber. The hatch is then slid under the chamber and bolted to the vessel, engaging an O-ring to seal the chamber

  17. Accumulation and potential dissolution of Chernobyl-derived radionuclides in river bottom sediment

    International Nuclear Information System (INIS)

    Sanada, Yukihisa; Matsunaga, Takeshi; Yanase, Nobuyuki; Nagao, Seiya; Amano, Hikaru; Takada, Hideshige; Tkachenko, Yuri

    2002-01-01

    Areas contaminated with radionuclides from the Chernobyl nuclear accident have been identified in the Pripyat River near the Chernobyl Nuclear Power Plant. The river bottom sediment cores contained 137Cs (10^5-10^6 Bq/m^2) within 0-30 cm depth, whose concentration is comparable to that in the ground soil in the vicinity of the nuclear power plant (the Exclusion Zone). The sediment cores also accumulated 90Sr (10^5 Bq/m^2), 239,240Pu (10^4 Bq/m^2) and 241Am (10^4 Bq/m^2) derived from the accident. Several nuclear fuel particles have been preserved at 20-25 cm depth, which is the peak area of the concentrations of the radionuclides. These inventories in the bottom sediments were compared with those of the radionuclides released during the accident. An analysis using a selective sequential extraction technique was applied to the radionuclides in the sediments. Results suggest that the possibility of release of 137Cs and 239,240Pu from the bottom sediment was low compared with 90Sr. The potential dissolution and subsequent transport of 90Sr from the river bottom sediment should be taken into account with respect to the long-term radiological influence on the aquatic environment

  18. BC Hydro triple bottom line report 2002

    International Nuclear Information System (INIS)

    Anon

    2002-08-01

    British Columbia Hydro (BC Hydro) published this document which measures the environmental, social and economic performance of the company. It is a complement to BC Hydro's 2002 Annual Report. The report was prepared to better understand the company's business in terms of its commitment to being an environmentally, socially, and economically responsible company (the three bottom lines). BC Hydro proved its ability to integrate the three bottom lines in decision making processes by carefully examining the environmental, social and economic impacts of programs such as Power Smart, Green and Alternative Energy, and Water Use Planning. All indicators point to BC Hydro achieving its commitment of providing a minimum of 10 per cent of new demand through 2010 with new green energy sources. Water Use Plans were developed for hydroelectric generating stations, and they should all be in place by 2003. Efficiencies realised through the Power Smart program offset the increases in greenhouse gas associated with increased energy demand. Juvenile sturgeon raised in a hatchery were released into the Columbia River in May 2002. The completion of a 40-kilometre trail on the Sunshine Coast was helped by a financial contribution from BC Hydro in the amount of 23,000 dollars. Safety improvements were implemented at eight facilities, such as dam remediation, dam surveillance and instrumentation updates. Scholarships were awarded across the province, along with additional donations to non-profit organizations. Co-op positions were provided for 150 students. Internal energy efficiency programs were successful. Planning is under way for significant maintenance work and equipment replacement projects as the transmission and distribution infrastructure ages. The number of reported indicators was expanded this year. In turn, they were aligned with the revised Global Reporting Initiative (GRI) guidelines. tabs

  19. Peach Bottom HTGR decommissioning and component removal

    International Nuclear Information System (INIS)

    Kohler, E.J.; Steward, K.P.; Iacono, J.V.

    1977-07-01

    The prime objective of the Peach Bottom End-of-Life Program was to validate specific HTGR design codes and predictions by comparison of actual and predicted physics, thermal, fission product, and materials behavior in Peach Bottom. Three consecutive phases of the program provide input to the HTGR design methods verifications: (1) Nondestructive fuel and circuit gamma scanning; (2) removal of steam generator and primary circuit components; and (3) Laboratory examinations of removed components. Component removal site work commenced with establishment of restricted access areas and installation of controlled atmosphere tents to retain relative humidity at <30%. A mock-up room was established to test and develop the tooling and to train operators under simulated working conditions. Primary circuit ducting samples were removed by trepanning, and steam generator access was achieved by a combination of arc gouging and grinding. Tubing samples were removed using internal cutters and external grinding. Throughout the component removal phase, strict health physics, safety, and quality assurance programs were implemented. A total of 148 samples of primary circuit ducting and steam generator tubing were removed with no significant health physics or safety incidents. Additionally, component removal served to provide access for determination of cesium plateout distribution by gamma scanning inside the ducts and for macroexamination of the steam generator from both the water and helium sides. Evaluations are continuing and indicate excellent performance of the steam generator and other materials, together with close correlation of observed and predicted fission product plateout distributions. It is concluded that such a program of end-of-life research, when appropriately coordinated with decommissioning activities, can significantly advance nuclear plant and fuel technology development

  20. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    International Nuclear Information System (INIS)

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE

  1. Collection and preparation of bottom sediment samples for analysis of radionuclides and trace elements

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The publication is the first in a series of TECDOCs on sampling and sample handling as part of the IAEA support to improve reliability of nuclear analytical techniques (NATs) in Member State laboratories. The purpose of the document is to provide information on the methods for collecting sediments, the equipment used, and the sample preparation techniques for radionuclide and elemental analysis. The most appropriate procedures for defining the strategies and criteria for selecting sampling locations, for sample storage and transportation are also given. Elements of QA/QC and documentation needs for sampling and sediment analysis are discussed. Collection and preparation of stream and river bottom sediments, lake bottom sediments, estuary bottom sediments, and marine (shallow) bottom sediments are covered. The document is intended to be a comprehensive manual for the collection and preparation of bottom sediments as a prerequisite to obtain representative and meaningful results using NATs. Quality assurance and quality control (QA/QC) is emphasized as an important aspect to ensure proper collection, transportation, preservation, and analysis since it forms the basis for interpretation and legislation. Although there are many approaches and methods available for sediment analyses, the scope of the report is limited to sample preparation for (1) analysis of radionuclides (including sediment dating using radionuclides such as Pb-210 and Cs-137) and (2) analysis of trace, minor and major elements using nuclear and related analytical techniques such as NAA, XRF and PIXE.

  2. BioCreative V track 4: a shared task for the extraction of causal network information using the Biological Expression Language.

    Science.gov (United States)

    Rinaldi, Fabio; Ellendorff, Tilia Renate; Madan, Sumit; Clematide, Simon; van der Lek, Adrian; Mevissen, Theo; Fluck, Juliane

    2016-01-01

    Automatic extraction of biological network information is one of the most desired and most complex tasks in biological and medical text mining. Track 4 at BioCreative V attempts to approach this complexity using fragments of large-scale manually curated biological networks, represented in Biological Expression Language (BEL), as training and test data. BEL is an advanced knowledge representation format which has been designed to be both human readable and machine processable. The specific goal of track 4 was to evaluate text mining systems capable of automatically constructing BEL statements from given evidence text, and of retrieving evidence text for given BEL statements. Given the complexity of the task, we designed an evaluation methodology which gives credit to partially correct statements. We identified various levels of information expressed by BEL statements, such as entities, functions, relations, and introduced an evaluation framework which rewards systems capable of delivering useful BEL fragments at each of these levels. The aim of this evaluation method is to help identify the characteristics of the systems which, if combined, would be most useful for achieving the overall goal of automatically constructing causal biological networks from text. © The Author(s) 2016. Published by Oxford University Press.
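
    The level-wise partial-credit idea described above can be sketched roughly as follows. The levels, weights, and example BEL fragments below are assumptions chosen for illustration; they do not reproduce the official track 4 scoring.

```python
def partial_credit(predicted, gold, weights=None):
    """Score a predicted statement against a gold statement by rewarding each
    correctly recovered level (entities, functions, relations) separately.
    The weighting scheme here is hypothetical, not the official track metric."""
    weights = weights or {"entities": 0.4, "functions": 0.3, "relations": 0.3}
    score = 0.0
    for level, weight in weights.items():
        gold_items = gold.get(level, set())
        if not gold_items:
            continue  # nothing to recover at this level
        overlap = len(predicted.get(level, set()) & gold_items)
        score += weight * overlap / len(gold_items)
    return score

# Hypothetical example: both entities recovered, one of two relations recovered.
pred = {"entities": {"MAPK1", "TP53"}, "functions": set(),
        "relations": {("MAPK1", "increases", "TP53")}}
gold = {"entities": {"MAPK1", "TP53"}, "functions": set(),
        "relations": {("MAPK1", "increases", "TP53"), ("TP53", "decreases", "BCL2")}}
print(partial_credit(pred, gold))  # 0.4 + 0.3 * 0.5 = 0.55
```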

  3. Reducing Heavy Metal Element from Coal Bottom Ash by Using Citric Acid Leaching Treatment

    Directory of Open Access Journals (Sweden)

    Yahya Ahmad Asyari

    2017-01-01

    Full Text Available Coal ash is the residue produced during coal combustion, for instance fly ash, bottom ash or boiler slag. With the growth in coal-burning power stations, huge amounts of coal bottom ash (CBA), considered a hazardous material, are normally disposed of in on-site disposal systems without any commercial use. Previous researchers have studied the extraction of silica from agricultural wastes such as palm ash and rice husk ash (RHA), and from CBA, using a leaching treatment method. In this study, a weaker acid, citric acid solution, was used to replace the strong acid in the leaching treatment process. Results showed that the content of heavy metals such as copper (Cu), zinc (Zn) and lead (Pb) can be decreased. Meanwhile, silica can be extracted up to 44% from coal bottom ash using citric acid leaching treatment under the optimum reaction time of 60 minutes, a solution temperature of 60°C and a citric acid concentration of more than 2%.

  4. Innovation and Creativity at the Bottom of the Pyramid

    Directory of Open Access Journals (Sweden)

    Lauri Erik Lehikoinen

    2018-01-01

    Full Text Available Purpose: The purpose of this study is to illustrate how innovative and creative companies develop products and services at the bottom of the economic pyramid (B.o.P) markets. This paper attempts to gain further insight regarding the usage of the 4A perspective developed by Anderson and Billou (2007) and the Triple Bottom Line (TBL) framework developed by Elkington (1999) as guidelines to achieve success in BoP markets. Design/methodology/approach: The authors of this paper come from three different countries (Sweden, Norway and Belgium), which gave the possibility to gather qualitative data from companies located or founded in these three countries. The 4A’s perspective and the TBL framework are used as a theoretical foundation to further investigate how western companies act on B.o.P markets. Thus, this paper attempts to answer the following research questions: How can (social) entrepreneurs (or any companies) adapt the 4A perspective to introduce disruptive innovations and still, with the help of the TBL framework, maintain their sustainable, responsible and ethical approach? Additionally, how can the mind-set of innovation and creativity at the bottom of the pyramid in developing markets be transferred to social entrepreneurs in developed markets? Primary data was gathered through interviews with Solvatten (Sweden), MicroStart (Belgium) and Easypaisa (Norway). Findings: The 4A perspective was proven to be an effective tool when approaching B.o.P markets. Companies must think outside the box of traditional marketing and be creative to achieve their goals. In dynamic markets, a company will struggle to keep up with all constraints. The case companies struggled most with acting sustainably while achieving profitability. Research limitations/implications: To further validate the results, the sample size should be bigger, including several different companies and informants. Originality/value: This paper contributes to the

  5. Information extraction from airborne cameras

    NARCIS (Netherlands)

    Dijk, J.; Elands, P.J.M.; Burghouts, G.; Van Eekeren, A.W.M.

    2015-01-01

    In this paper we evaluate the added value of image interpretation techniques for EO sensors mounted on a UAV for operators in an operational setting. We start with evaluating the support by technology for strategic and tactical purposes in a real-time scenario. We discuss different variations

  6. A hybrid approach for robust multilingual toponym extraction and disambiguation

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    Toponym extraction and disambiguation are key topics recently addressed by the fields of Information Extraction and Geographical Information Retrieval. Toponym extraction and disambiguation are highly dependent processes: not only does toponym extraction effectiveness affect disambiguation, but also

  7. Mapping Robinia Pseudoacacia Forest Health Conditions by Using Combined Spectral, Spatial, and Textural Information Extracted from IKONOS Imagery and Random Forest Classifier

    Directory of Open Access Journals (Sweden)

    Hong Wang

    2015-07-01

    Full Text Available The textural and spatial information extracted from very high resolution (VHR) remote sensing imagery provides complementary information for applications in which the spectral information is not sufficient for identification of spectrally similar landscape features. In this study grey-level co-occurrence matrix (GLCM) textures and a local statistical analysis, the Getis statistic (Gi), computed from IKONOS multispectral (MS) imagery acquired over the Yellow River Delta in China, along with a random forest (RF) classifier, were used to discriminate Robinia pseudoacacia tree health levels. Specifically, eight GLCM texture features (mean, variance, homogeneity, dissimilarity, contrast, entropy, angular second moment, and correlation) were first calculated from the IKONOS NIR band (Band 4) to determine an optimal window size (13 × 13) and an optimal direction (45°). Then, the optimal window size and direction were applied to the three other IKONOS MS bands (blue, green, and red) for calculating the eight GLCM textures. Next, an optimal distance value (5) and an optimal neighborhood rule (Queen’s case) were determined for calculating the four Gi features from the four IKONOS MS bands. Finally, different RF classification results for the three forest health conditions were created: (1) an overall accuracy (OA) of 79.5% produced using the four MS band reflectances only; (2) an OA of 97.1% created with the eight GLCM features calculated from IKONOS Band 4 with the optimal window size of 13 × 13 and direction 45°; (3) an OA of 93.3% created with all 32 GLCM features calculated from the four IKONOS MS bands with a window size of 13 × 13 and direction of 45°; (4) an OA of 94.0% created using the four Gi features calculated from the four IKONOS MS bands with the optimal distance value of 5 and the Queen’s neighborhood rule; and (5) an OA of 96.9% created with the combined 16 spectral (four), spatial (four), and textural (eight) features. The most important feature ranked by RF
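
    A minimal sketch of the texture-plus-classifier pipeline described above, using scikit-image GLCM features and a scikit-learn random forest, is given below. The toy imagery, the three health classes, and the subset of GLCM properties are placeholders; this is not the authors' implementation or data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.ensemble import RandomForestClassifier

# Subset of the eight GLCM features that graycoprops supports directly.
PROPS = ["contrast", "dissimilarity", "homogeneity", "ASM", "correlation"]

def glcm_features(window, angle=np.pi / 4, levels=64):
    """GLCM texture features for one window (e.g. 13 x 13 pixels),
    with a 1-pixel offset in the 45-degree direction."""
    glcm = graycomatrix(window, distances=[1], angles=[angle],
                        levels=levels, symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0] for p in PROPS]

# Hypothetical training data: one quantised NIR window per labelled tree crown.
rng = np.random.default_rng(0)
windows = rng.integers(0, 64, size=(40, 13, 13), dtype=np.uint8)  # placeholder imagery
labels = rng.integers(0, 3, size=40)                              # three health classes

X = np.array([glcm_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```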

  8. Bottom friction models for shallow water equations: Manning’s roughness coefficient and small-scale bottom heterogeneity

    Science.gov (United States)

    Dyakonova, Tatyana; Khoperskov, Alexander

    2018-03-01

    The correct description of surface water dynamics in a shallow water model requires accounting for friction. To simulate channel flow with the Chezy friction model, a constant Manning roughness coefficient is frequently used. The Manning coefficient n_M is an integral parameter which accounts for a large number of physical factors determining flow braking. We used computational simulations in a shallow water model to determine the relationship between the Manning coefficient and the parameters of small-scale perturbations of the bottom in a long channel. By comparing the transverse water velocity profiles obtained in models with a perturbed bottom and no bottom friction against those with bottom friction on a smooth bottom, we constructed the dependence of n_M on the amplitude and spatial scale of the bottom-relief perturbation.
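
    For reference, the bottom-friction term that the Chezy/Manning treatment adds to the shallow-water momentum equation can be written as a deceleration -g * n_M^2 * u * |u| / h^(4/3). A minimal sketch follows; the flow values are hypothetical, and the exact formulation used in the cited study may differ.

```python
GRAVITY = 9.81  # m/s^2

def manning_friction_deceleration(u, h, n_manning):
    """Flow deceleration (m/s^2) from the standard Manning bottom-friction term,
    -g * n^2 * u * |u| / h^(4/3), as commonly used in shallow-water momentum equations."""
    return -GRAVITY * n_manning**2 * u * abs(u) / h**(4.0 / 3.0)

# Hypothetical channel flow: 1 m/s over 0.5 m depth with n_M = 0.03 s/m^(1/3).
print(manning_friction_deceleration(u=1.0, h=0.5, n_manning=0.03))  # ~ -0.022 m/s^2
```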

  9. Properties and Leachability of Self-Compacting Concrete Incorporated with Fly Ash and Bottom Ash

    Science.gov (United States)

    Kadir, Aeslina Abdul; Ikhmal Haqeem Hassan, Mohd; Jamaluddin, Norwati; Bakri Abdullah, Mohd Mustafa Al

    2016-06-01

    The process of combustion in coal-fired power plant generates ashes, namely fly ash and bottom ash. Besides, coal ash produced from coal combustion contains heavy metals within their compositions. These metals are toxic to the environment as well as to human health. Fortunately, treatment methods are available for these ashes, and the use of fly ash and bottom ash in the concrete mix is one of the few. Therefore, an experimental program was carried out to study the properties and determine the leachability of selfcompacting concrete incorporated with fly ash and bottom ash. For experimental study, self-compacting concrete was produced with fly ash as a replacement for Ordinary Portland Cement and bottom ash as a replacement for sand with the ratios of 10%, 20%, and 30% respectively. The fresh properties tests conducted were slump flow, t500, sieve segregation and J-ring. Meanwhile for the hardened properties, density, compressive strength and water absorption test were performed. The samples were then crushed to be extracted using Toxicity Characteristic Leaching Procedure and heavy metals content within the samples were identified accordingly using Atomic Absorption Spectrometry. The results demonstrated that both fresh and hardened properties were qualified to categorize as self-compacting concrete. Improvements in compressive strength were observed, and densities for all the samples were identified as a normal weight concrete with ranges between 2000 kg/m3 to 2600 kg/m3. Other than that, it was found that incorporation up to 30% of the ashes was safe as the leached heavy metals concentration did not exceed the regulatory levels, except for arsenic. In conclusion, this study will serve as a reference which suggests that fly ash and bottom ash are widely applicable in concrete technology, and its incorporation in self-compacting concrete constitutes a potential means of adding value to appropriate mix and design.

  10. Mathematical model of whole-process calculation for bottom-blowing copper smelting

    Science.gov (United States)

    Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song

    2017-11-01

    The distribution law of materials in smelting products is key to cost accounting and contaminant control. Regardless, the distribution law is difficult to determine quickly and accurately by mere sampling and analysis. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraint in copper bottom-blowing smelting. Simulation of the entire process of bottom-blowing copper smelting was established using a self-developed MetCal software platform. A whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition information of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison of production data revealed that the model can basically reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
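
    The element-conservation backbone of such a material-balance model can be illustrated with a tiny linear system in which known stream assays constrain unknown product masses. The stream names and assays below are invented for illustration and are unrelated to the MetCal implementation.

```python
import numpy as np

# Hypothetical two-product split of a copper-bearing feed: 100 t at 20% Cu and 30% Fe
# divides into matte (50% Cu, 10% Fe) and slag (2% Cu, 42% Fe).
# Unknowns: matte mass m and slag mass s. Element conservation gives:
#   Cu:  0.50*m + 0.02*s = 0.20*100
#   Fe:  0.10*m + 0.42*s = 0.30*100
A = np.array([[0.50, 0.02],
              [0.10, 0.42]])
b = np.array([20.0, 30.0])
matte, slag = np.linalg.solve(A, b)
print(f"matte = {matte:.1f} t, slag = {slag:.1f} t")  # 37.5 t and 62.5 t, summing to the 100 t feed
```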

  11. Identification of "pathologs" (disease-related genes from the RIKEN mouse cDNA dataset using human curation plus FACTS, a new biological information extraction system

    Directory of Open Access Journals (Sweden)

    Socha Luis A

    2004-04-01

    Full Text Available Abstract Background A major goal in the post-genomic era is to identify and characterise disease susceptibility genes and to apply this knowledge to disease prevention and treatment. Rodents and humans have remarkably similar genomes and share closely related biochemical, physiological and pathological pathways. In this work we utilised the latest information on the mouse transcriptome as revealed by the RIKEN FANTOM2 project to identify novel human disease-related candidate genes. We define a new term "patholog" to mean a homolog of a human disease-related gene encoding a product (transcript, anti-sense or protein) potentially relevant to disease. Rather than just focus on Mendelian inheritance, we applied the analysis to all potential pathologs regardless of their inheritance pattern. Results Bioinformatic analysis and human curation of 60,770 RIKEN full-length mouse cDNA clones produced 2,578 sequences that showed similarity (70–85% identity) to known human-disease genes. Using a newly developed biological information extraction and annotation tool (FACTS) in parallel with human expert analysis of 17,051 MEDLINE scientific abstracts we identified 182 novel potential pathologs. Of these, 36 were identified by computational tools only, 49 by human expert analysis only and 97 by both methods. These pathologs were related to neoplastic (53%), hereditary (24%), immunological (5%), cardio-vascular (4%), or other (14%) disorders. Conclusions Large scale genome projects continue to produce a vast amount of data with potential application to the study of human disease. For this potential to be realised we need intelligent strategies for data categorisation and the ability to link sequence data with relevant literature. This paper demonstrates the power of combining human expert annotation with FACTS, a newly developed bioinformatics tool, to identify novel pathologs from within large-scale mouse transcript datasets.

  12. Extraction of compositional and hydration information of sulfates from laser-induced plasma spectra recorded under Mars atmospheric conditions - Implications for ChemCam investigations on Curiosity rover

    Energy Technology Data Exchange (ETDEWEB)

    Sobron, Pablo, E-mail: pablo.sobron@asc-csa.gc.ca [Department of Earth and Planetary Sciences and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Wang, Alian [Department of Earth and Planetary Sciences and McDonnell Center for the Space Sciences, Washington University, St. Louis, MO 63130 (United States); Sobron, Francisco [Unidad Asociada UVa-CSIC a traves del Centro de Astrobiologia, Parque Tecnologico de Boecillo, Parcela 203, Boecillo (Valladolid), 47151 (Spain)

    2012-02-15

    Given the volume of spectral data required for providing accurate compositional information, and thereby insight into mineralogy and petrology, from laser-induced breakdown spectroscopy (LIBS) measurements, fast data processing tools are a must. This is particularly true during the tactical operations of rover-based planetary exploration missions such as the Mars Science Laboratory rover, Curiosity, which will carry a remote LIBS spectrometer in its science payload. We have developed: an automated fast pre-processing sequence of algorithms for converting a series of LIBS spectra (typically 125) recorded from a single target into a reliable SNR-enhanced spectrum; a dedicated routine to quantify its spectral features; and a set of calibration curves using standard hydrous and multi-cation sulfates. These calibration curves allow deriving the elemental compositions and the degrees of hydration of various hydrous sulfates, one of the two major types of secondary minerals found on Mars. Our quantitative tools are built upon calibration-curve modeling, through the correlation of the elemental concentrations and the peak areas of the atomic emission lines observed in the LIBS spectra of standard samples. At present, we can derive the elemental concentrations of K, Na, Ca, Mg, Fe, Al, S, O, and H in sulfates, as well as the hydration degrees of Ca- and Mg-sulfates, from LIBS spectra obtained in both Earth atmosphere and Mars atmospheric conditions in a Planetary Environment and Analysis Chamber (PEACh). In addition, structural information can be potentially obtained for various Fe-sulfates. - Highlights: Routines for fast automated processing of LIBS spectral data. Identification of elements and determination of the elemental composition. Calibration curves for sulfate samples in Earth and Mars atmospheric conditions. Fe curves probably related to the crystalline
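
    Calibration-curve modelling of the kind described, correlating emission-line peak areas with known concentrations in standards and inverting the fit for unknowns, can be sketched as follows; the element, standards, and numbers are invented for illustration.

```python
import numpy as np

# Hypothetical standards: known Mg concentrations (wt%) and measured peak areas (a.u.).
concentration = np.array([1.0, 2.0, 4.0, 8.0])
peak_area = np.array([210.0, 405.0, 820.0, 1630.0])

# Linear calibration curve: area = slope * concentration + intercept
slope, intercept = np.polyfit(concentration, peak_area, deg=1)

def predict_concentration(area):
    """Invert the calibration curve to estimate concentration from a peak area."""
    return (area - intercept) / slope

print(round(predict_concentration(600.0), 2))  # estimated wt% for an unknown target
```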

  13. Learning affects top down and bottom up modulation of eye movements in decision making

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Bagger, Martin; Mueller Loose, Simone

    2013-01-01

    Repeated decision making is subject to changes over time such as decreases in decision time and information use and increases in decision accuracy. We show that a traditional strategy selection view of decision making cannot account for these temporal dynamics without relaxing main assumptions...... about what defines a decision strategy. As an alternative view we suggest that temporal dynamics in decision making are driven by attentional and perceptual processes and that this view has been expressed in the information reduction hypothesis. We test the information reduction hypothesis by integrating...... it in a broader framework of top down and bottom up processes and derive the predictions that repeated decisions increase top down control of attention capture which in turn leads to a reduction in bottom up attention capture. To test our hypotheses we conducted a repeated discrete choice experiment with three...

  14. 12 Trace Metals Distribution in Fish Tissues, Bottom Sediments and ...

    African Journals Online (AJOL)

    Abstract. Water samples, bottom sediments, Tilapia, and Cat Fish from Okumeshi River in Delta State of Nigeria were analysed ... Keywords: Trace metals, Fish Tissues, Water, Bottom sediments, Okumeshi River.

  15. Monitoring of metals in Tilapia nilotica tissues, bottom sediments ...

    African Journals Online (AJOL)

    Tilapia (Tilapia nilotica), bottom sediments and water were collected from Nworie River and Oguta Lake. The muscle, liver and gills of the fish as well as the bottom sediments and water were analysed for Al, Cr, Cd, Pb, As, Zn, Mn, Co, Se, Cu, Ni and Fe using atomic absorption spectrophotometer to highlight the importance ...

  16. Bottom-feeding for blockbuster businesses.

    Science.gov (United States)

    Rosenblum, David; Tomlinson, Doug; Scott, Larry

    2003-03-01

    Marketing experts tell companies to analyze their customer portfolios and weed out buyer segments that don't generate attractive returns. Loyalty experts stress the need to aim retention programs at "good" customers--profitable ones- and encourage the "bad" ones to buy from competitors. And customer-relationship-management software provides ever more sophisticated ways to identify and eliminate poorly performing customers. On the surface, the movement to banish unprofitable customers seems reasonable. But writing off a customer relationship simply because it is currently unprofitable is at best rash and at worst counterproductive. Executives shouldn't be asking themselves, How can we shun unprofitable customers? They need to ask, How can we make money off the customers that everyone else is shunning? When you look at apparently unattractive segments through this lens, you often see opportunities to serve those segments in ways that fundamentally change customer economics. Consider Paychex, a payroll-processing company that built a nearly billion-dollar business by serving small companies. Established players had ignored these customers on the assumption that small companies couldn't afford the service. When founder Tom Golisano couldn't convince his bosses at Electronic Accounting Systems that they were missing a major opportunity, he started a company that now serves 390,000 U.S. customers, each employing around 14 people. In this article, the authors look closely at bottom-feeders--companies that assessed the needs of supposedly unattractive customers and redesigned their business models to turn a profit by fulfilling those needs. And they offer lessons other executives can use to do the same.

  17. The Interaction of Top-Down and Bottom-Up Statistics in the Resolution of Syntactic Category Ambiguity

    Science.gov (United States)

    Gibson, Edward

    2006-01-01

    This paper investigates how people resolve syntactic category ambiguities when comprehending sentences. It is proposed that people combine: (a) context-dependent syntactic expectations (top-down statistical information) and (b) context-independent lexical-category frequencies of words (bottom-up statistical information) in order to resolve…

  18. The Curvelet Transform in the analysis of 2-D GPR data: Signal enhancement and extraction of orientation-and-scale-dependent information

    Science.gov (United States)

    Tzanis, Andreas

    2013-04-01

    wavelet transform: whereas discrete wavelets are designed to provide sparse representations of functions with point singularities, curvelets are designed to provide sparse representations of functions with singularities on curves. This work investigates the utility of the CT in processing noisy GPR data from geotechnical and archaeometric surveys. The analysis has been performed with the Fast Discrete CT (FDCT - Candès et al., 2006) available from http://www.curvelet.org/ and adapted for use by the matGPR software (Tzanis, 2010). The adaptation comprises a set of driver functions that compute and display the curvelet decomposition of the input GPR section and then allow for the interactive exclusion/inclusion of data (wavefront) components at different scales and angles by cancelation/restoration of the corresponding curvelet coefficients. In this way it is possible to selectively reconstruct the data so as to abstract/retain information of given scales and orientations. It is demonstrated that the CT can be used to: (a) Enhance the S/N ratio by cancelling directional noise wavefronts of any angle of emergence, with particular reference to clutter. (b) Extract geometric information for further scrutiny, e.g. distinguish signals from small and large aperture fractures, faults, bedding etc. (c) Investigate the characteristics of signal propagation (hence material properties), albeit indirectly. This is possible because signal attenuation and temporal localization are closely associated, so that scale and spatio-temporal localization are also closely related. Thus, interfaces embedded in low attenuation domains will tend to produce sharp reflections rich in high frequencies and fine-scale localization. Conversely, interfaces in high attenuation domains will tend to produce dull reflections rich in low frequencies and broad localization. At a single scale and with respect to points (a) and (b) above, the results of the CT processor are comparable to those of the tuneable
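
    The interactive exclusion/inclusion of wavefront components can be pictured as masking curvelet coefficients by scale and angle before inverse transformation. In the sketch below, `fdct` and `ifdct` are hypothetical placeholders for whatever forward/inverse FDCT wrapper is available (e.g. CurveLab bindings), and the nested-list coefficient layout and wedge-selection rule are assumptions, not the matGPR implementation.

```python
import numpy as np

def mask_curvelet_coefficients(coeffs, keep):
    """Zero out curvelet wedges not selected by keep(scale, angle_index).
    coeffs is assumed to be a nested list, coeffs[scale][angle] -> 2-D array,
    a layout typical of FDCT implementations (hypothetical here)."""
    filtered = []
    for s, wedges in enumerate(coeffs):
        filtered.append([w if keep(s, a) else np.zeros_like(w)
                         for a, w in enumerate(wedges)])
    return filtered

# Sketch of use, assuming hypothetical fdct/ifdct wrappers around a GPR section:
#   coeffs = fdct(section)
#   coeffs = mask_curvelet_coefficients(coeffs, keep=lambda s, a: not is_clutter_wedge(s, a))
#   section_denoised = ifdct(coeffs)
```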

  19. Anthropopression markers in lake bottom sediments

    Science.gov (United States)

    Nadolna, Anna; Nowicka, Barbara

    2014-05-01

    Lakes are vulnerable to various types of anthropogenic disturbances. Responses of lake ecosystems to environmental stressors are varied and depend not only on the type of factor but also on the lake's natural resistance to degradation. Within the EULAKES project an evaluation of the extent of anthropogenic stress in a flow-through, postglacial, ribbon lake (Lake Charzykowskie) was carried out. It was assumed that this impact manifests unevenly, depending on the type and degree of pressure on the shore zones, the water quality of tributaries, the lake basin shape and the dynamics of water movement. It was stated that anthropogenic markers are substances accumulated in bottom sediments as a result of the inflow of allochthonous substances from the catchment and atmosphere. Along the selected transects 105 samples from the top layer of sediments (about 20 cm) were collected, representing contemporary accumulation (about 15 years). The content of selected chemical elements and compounds was examined, including nutrients (TN and TP), heavy metals (arsenic, cadmium, lead, chromium, nickel, copper, zinc, mercury, iron, and manganese) and pesticides (DDT, DDD, DDE, DMDT, γ-HCH). The research was conducted at the deepest points of each lake basin and along the research transects; while choosing the spots, the increased intensity of anthropogenic impact (ports, roads with heavy traffic, wastewater discharge zones, built-up areas) was taken into consideration. The river outlets to the lake, where there are ecotonal zones between the limnic and fluvial environments, were also taken into account. Analysis of the markers' distribution was carried out against the diversity of chemical characteristics of the limnic sediments. The ribbon shape of the lake basin and the dominant wind direction provide an opportunity for easy water mixing to a considerable depth. Intensive waving processes cause removal of the matter from the littoral zone towards lake hollows (separated by the underwater thresholds), where the

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Bahia Honda Bridge, 2005 - 2007 (NODC Accession 0039226)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract, at the M/V ELPIS Restoration Site, 2006 - 2007 (NODC Accession 0039899)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 1990 - 2005 (NODC Accession 0012769)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the M/V Alec Owen Maitland restoration site, 2006 - 2007 (NODC Accession 0039987)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 2005 - 2007 (NODC Accession 0039986)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2004-2006 (NODC Accession 0014184)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Pillar Coral Forest site, 2006 (NODC accession 0040039)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the M/V Alec Owen Maitland restoration site, 2004 - 2006 (NODC Accession 0010585)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2007 - 2008 and 2012 (NODC Accession 0093019)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Card Sound Bridge, 2004 - 2006 (NODC Accession 0014266)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 2005 - 2007 (NODC Accession 0040080)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  11. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 9-Ft Shoal, 2007-2010 (NODC Accession 0092549)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  12. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Ball Buoy Reef, 1990 - 1998 (NODC Accession 0002781)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  13. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract, at the M/V ELPIS Restoration Site, 2004 - 2006 (NODC Accession 0010576)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 1998 - 2006 (NODC Accession 0039481)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Broad Creek site, 2006 - 2007 (NODC Accession 0039880)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Iselin, 2004-2006 (NODC Accession 0014271)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  17. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the M/V Alec Owen Maitland Restoration site, 2006 - 2007 (NODC Accession 0039987)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 2007 - 2010 and 2011 - 2012 (NODC Accession 0093028)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bahia Honda Bridge, 2007-2011 (NODC Accession 0093018)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the New Ground Shoal site, 1992 - 2006 (NODC Accession 0012845)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Key West Channel, 1991 - 2005 (NODC Accession 0012739)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Snake Creek Bridge, 1989 - 2005 (NODC accession 0013148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Harbor Key Bank, 1992 - 1997 (NODC Accession 0013553)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 2006 - 2007 (NODC Accession 0020554)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 1989 - 2006 (NODC Accession 0011144)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Pillar Coral Forest site, 1996 - 2006 (NODC accession 0013096)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sand Key Lighthouse, 2007 - 2010 (NODC Accession 0093065)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge, 2005 - 2007 (NODC Accession 0039469)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 7-mile Bridge, 2007-2010 (NODC Accession 0092548)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  10. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Carysfort Reef, 2006-2010 (NODC Accession 0093022)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  11. Continuous bottom temperature measurements in strategic areas of the Western Sambo Reef, Western Sambo Reef Buoy 16 and Jacquelyn L Grounding Site, 1990 - 2005 (NODC Accession 0014120)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  12. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the New Ground Shoal site, 1992 - 2006 (NODC Accession 0012845)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  13. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Bahia Honda Bridge, 1990 - 2004 (NODC Accession 0002772)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  14. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at M/V ELPIS Restoration Site, 2007 - 2011 (NODC Accession 0093024)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  15. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Key-Back Reef, 2008 and 2011 - 2012 (NODC Accession 0093064)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  16. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Alligator Reef, 2007-2010 (NODC Accession 0093017)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  17. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at 9-FT Shoal, 2005-2007 (NODC Accession 0039533)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  18. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Tennessee Reef, 2004-2006 (NODC Accession 0014272)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  19. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 2006 - 2007 (NODC Accession 0039817)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  20. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Boca Grande Channel, 2006 - 2007 (NODC Accession 0039818)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  1. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Snake Creek Bridge, 1989 - 2005 (NODC Accession 0013148)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  2. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Long Key, 2005-2006 (NODC Accession 0014269)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  3. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Bicentennial Coral Head, 2007-2009 (NODC Accession 0090835)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  4. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Broad Creek site, 2007-2009 (NODC Accession 0093020)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  5. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Sombrero Reef Lighthouse, 1991-2005 (NODC Accession 0013726)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  6. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Looe Key Back Reef, 2004-2006 (NODC Accession 0014270)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  7. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at Hen and Chickens Reef, 2007 - 2011 (NODC Accession 0093027)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  8. Continuous bottom temperature measurements in strategic areas of the Florida Reef Tract at the Pillar Coral Forest site, 1996 - 2006 (NODC Accession 0013096)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The purpose of this project is to document bottom seawater temperature in strategic areas of the Florida Reef Tract on a continuing basis and make that information...

  9. Pretreatment and utilization of waste incineration bottom ashes

    DEFF Research Database (Denmark)

    Astrup, Thomas

    2007-01-01

    Within recent years, researchers and authorities have had increasing focus on leaching properties from waste incineration bottom ashes. Researchers have investigated processes such as those related to carbonation, weathering, metal complexation, and leaching control. Most of these investigations......, however, have had a strong emphasis on lab experiments with little focus on full scale bottom ash upgrading methods. The introduction of regulatory limit values restricting leaching from utilized bottom ashes, has created a need for a better understanding of how lab scale experiences can be utilized...

  10. Peabody Western Coal cuts costs with bottom-dump haulers

    Energy Technology Data Exchange (ETDEWEB)

    Perla, S.; Baecker, G.; Morgan, W. [Empire Machinery, Mesa, AZ (United States)

    1995-04-01

    A new hauling concept has been introduced at the Black Mesa and Kayenta coal mines of the Peabody Western Coal Co. in northern Arizona, USA. The article describes the switch from Caterpillar 992 wheel loaders with 136 t bottom-dump trucks to 272 t bottom-dump trucks. Cat 789 off-highway trucks were modified to pull bottom-dump trucks. Haulage costs per ton of coal and cost per ton-mile have fallen significantly since the introduction of the new large hauling method. 7 figs., 2 photos.

  11. Acoustic Profiling of Bottom Sediments in Large Oil Storage Tanks

    Science.gov (United States)

    Svet, V. D.; Tsysar', S. A.

    2018-01-01

    Characteristic features of acoustic profiling of bottom sediments in large oil storage tanks are considered. Basic acoustic parameters of crude oil and bottom sediments are presented. It is shown that, because of the presence of both transition layers in the crude oil and strong reverberation effects in oil tanks, the volume of bottom sediments calculated from an acoustic surface image is generally overestimated. To reduce the error, additional post-processing of the acoustic profilometry data is proposed, in combination with additional measurements of viscosity and of the vertical density distribution at several points of the tank.
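
    The raw volume figure that the authors note is generally overestimated can be pictured as a simple integration of the imaged sediment-surface height map over the tank floor, as in the sketch below; the grid, cell size, and heights are hypothetical, and the proposed post-processing corrections are not modelled.

```python
import numpy as np

def sediment_volume(height_map_m, cell_area_m2):
    """Naive volume (m^3) from an acoustically imaged sediment-surface height map:
    sum of per-cell heights times the horizontal cell area. This is the raw figure
    that tends to overestimate the true sediment volume."""
    return float(np.sum(height_map_m) * cell_area_m2)

# Hypothetical 4 x 4 grid of sediment heights (m) on 0.5 m x 0.5 m cells.
heights = np.array([[0.2, 0.3, 0.3, 0.2],
                    [0.3, 0.5, 0.6, 0.3],
                    [0.3, 0.6, 0.7, 0.4],
                    [0.2, 0.3, 0.4, 0.2]])
print(sediment_volume(heights, cell_area_m2=0.25))  # 1.45 m^3
```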

  12. Refinement of the bottom boundary of the INES scale

    International Nuclear Information System (INIS)

    Ferjencik, Milos

    2010-01-01

    No existing edition of the International Nuclear Events Scale (INES) Manual addresses in depth the determination of the bottom boundary of the Scale, although a need for such a definition is felt. The article introduces a method for determining the INES bottom boundary applicable to pressurized water reactors. This bottom boundary is taken to be identical with the threshold of degradation of the installation's nuclear safety assurance. A comprehensive flowchart has been developed as the main outcome of the analysis of the nuclear safety assurance violation issue. The use of this flowchart in INES classification, in place of the introductory question in the General INES Rating Procedure of the INES Manual, is recommended. (orig.)

  13. Spreading of Antarctic Bottom Water in the Atlantic Ocean

    OpenAIRE

    Morozov, E.; Tarakanov, R. Y.; Zenk, Walter

    2012-01-01

    This paper describes the transport of bottom water from its source region in the Weddell Sea through the abyssal channels of the Atlantic Ocean. The research brings together the recent observations and historical data. A strong flow of Antarctic Bottom Water through the Vema Channel is analyzed. The mean speed of the flow is 30 cm/s. A temperature increase was found in the deep Vema Channel, which has been observed for 30 years already. The flow of bottom water in the northern part of the Bra...

  14. Milk bottom-up proteomics: method optimisation.

    Directory of Open Access Journals (Sweden)

    Delphine eVincent

    2016-01-01

    Full Text Available Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance such as caseins and medium to low abundance whey proteins such as β-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones and enzymes. A sample preparation method that enables high reproducibility and throughput is key in reliably identifying proteins present or proteins responding to conditions such as a diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows’ milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionisation-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best resolved SDS patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible of all with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows.

  15. Feature extraction from high resolution satellite imagery as an input to the development and rapid update of a METRANS geographic information system (GIS).

    Science.gov (United States)

    2011-06-01

    This report describes an accuracy assessment of extracted features derived from three : subsets of Quickbird pan-sharpened high resolution satellite image for the area of the : Port of Los Angeles, CA. Visual Learning Systems Feature Analyst and D...

  16. Extraction process

    International Nuclear Information System (INIS)

    Rendall, J.S.; Cahalan, M.J.

    1979-01-01

    A process is described for extracting at least two desired constituents from a mineral, using a liquid reagent which produces the constituents, or compounds thereof, in separable form and independently extracting those constituents, or compounds. The process is especially valuable for the extraction of phosphoric acid and metal values from acidulated phosphate rock, the slurry being contacted with selective extractants for phosphoric acid and metal (e.g. uranium) values. In an example, uranium values are oxidized to uranyl form and extracted using an ion exchange resin. (U.K.)

  17. Bottom trawl assessment of Lake Ontario prey fishes

    Science.gov (United States)

    Weidel, Brian C.; Connerton, Michael J.; Holden, Jeremy

    2018-01-01

    Managing Lake Ontario fisheries in an ecosystem-context requires prey fish community and population data. Since 1978, multiple annual bottom trawl surveys have quantified prey fish dynamics to inform management relative to published Fish Community Objectives. In 2017, two whole-lake surveys collected 341 bottom trawls (spring: 204, fall: 137), at depths from 8-225m, and captured 751,350 fish from 29 species. Alewife were 90% of the total fish catch while Deepwater Sculpin, Round Goby, and Rainbow Smelt comprised the majority of the remaining total catch (3.8, 3.1, and 1.1% respectively). The adult Alewife abundance index for US waters increased in 2017 relative to 2016, however the index for Canadian waters declined. Adult Alewife condition, assessed by the predicted weight of a 165 mm fish (6.5 inches), declined in 2017 from record high values observed in spring 2016. Spring 2017 Alewife condition was slightly less than the 10-year average, but the fall value was well below the 10-year average, likely due to increased Age-1 Alewife abundance. The Age-1 Alewife abundance index was the highest observed in 40 years, and 8-times higher than the previous year. The Age-1 index estimates Alewife reproductive success the preceding year. The warm summer and winter of 2016 likely contributed to the large year class. In contrast the relatively cool 2017 spring and cold winter may result in a lower than average 2017 year class. Abundance indices for Rainbow Smelt, Cisco, and Emerald Shiner either declined or remained at low levels in 2017. Pelagic prey fish diversity continues to be low since a single species, Alewife, dominates the catch. Deepwater Sculpin were the most abundant benthic prey fish in 2017 because Round Goby abundance declined sharply from 2016. Slimy Sculpin density continued to decline and the 2017 biomass index for US waters was the lowest ever observed. Prior to Round Goby proliferation, juvenile Slimy Sculpin comprised ~10% of the Slimy Sculpin catch, but
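
    Condition indices of the kind reported here (the predicted weight of a 165 mm Alewife) are typically obtained from a log-log length-weight regression; the sketch below shows that standard calculation with invented measurements, not the survey's data or code.

```python
import numpy as np

# Hypothetical Alewife length (mm) and weight (g) measurements from one survey.
length_mm = np.array([120, 135, 150, 160, 170, 180])
weight_g = np.array([12.0, 17.5, 24.5, 30.0, 36.5, 44.0])

# Standard length-weight model: log10(W) = a + b * log10(L)
b, a = np.polyfit(np.log10(length_mm), np.log10(weight_g), deg=1)

def predicted_weight(length_mm_ref):
    """Condition index: predicted weight (g) at a reference length (e.g. 165 mm)."""
    return 10 ** (a + b * np.log10(length_mm_ref))

print(round(predicted_weight(165.0), 1))  # predicted weight of a 165 mm fish
```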

  18. Solvent extraction

    Energy Technology Data Exchange (ETDEWEB)

    Coombs, D.M.; Latimer, E.G.

    1988-01-05

    It is an object of this invention to provide for the demetallization and general upgrading of heavy oil via a solvent extraction process, and to improve the efficiency of solvent extraction operations. The yield and demetallization of product oil from heavy, high-metal-content oil are maximized by solvent extractions which employ either or all of the following techniques: premixing of a minor amount of the solvent with the feed and using countercurrent flow for the remaining solvent; use of certain solvent/feed ratios; and use of segmental baffle tray extraction column internals and the proper extraction column residence time. The solvent premix/countercurrent flow feature of the invention substantially improves extractions where temperatures and pressures above the critical point of the solvent are used. By using this technique, a greater yield of extract oil can be obtained at the same metals content, or a lower metals-containing extract oil product can be obtained at the same yield. Furthermore, premixing part of the solvent with the feed before countercurrent extraction gives high extract oil yields and high-quality demetallization. The solvent/feed ratio features of the invention substantially lower the capital and operating costs for such processes while not suffering a loss in selectivity for metals rejection. The column internals and residence time features of the invention further improve the extractor metals rejection at a constant yield or allow for an increase in extract oil yield at a constant extract oil metals content. 13 figs., 3 tabs.

  19. Assessment of heavy metals pollution in bottom sediments of the Arabian Gulf after the Gulf War oil spill 1991

    International Nuclear Information System (INIS)

    Nasr, S.M.; Ahmed, M.H.; El-Raey, M.; Frihy, O.E.; Abdel Motti, A.

    1999-01-01

    The major objective of this study was to carry out a sequential geochemical extraction scheme for the partitioning of Fe, Mn, Co, Cu, Zn, Ni, Cr and Pb in the bottom sediments of the Arabian Gulf, to detect any potential pollution impact on the Gulf sediments following the 1991 Gulf War oil spill, and to differentiate between anthropogenic inputs and the natural background of heavy metals.

  20. NMFS Bottom Longline Analytical Dataset Provided to NRDA

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Southeast Fisheries Science Center Mississippi Laboratories has conducted standardized bottom longline surveys in the Gulf of Mexico and South Atlantic since...