WorldWideScience

Sample records for extract quantitative information

  1. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) over the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on the coordinated and synergistic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances over existing techniques. The methodology spans several fundamental areas of research, including the analysis of high-resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These analysis, modelling and simulation techniques allow the control and functionality of devices developed from the materials under study to be optimized, and they have been tested using data obtained from experimental samples.

  2. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, an atomistic understanding of the octahedral distortion, which requires accurate measurement of atomic column positions, will greatly help in engineering their properties. Here, we report the development of a software tool to extract quantitative information on the lattice and on BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate the positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
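The center-of-mass method named above can be illustrated with a minimal, self-contained sketch (plain Python on a synthetic patch; the function names and the 15×15 grid are illustrative, not part of the published tool):

```python
import math

def gaussian_peak(nx, ny, x0, y0, sigma, amp=1.0):
    """Synthetic atom column: a sampled 2D Gaussian intensity patch."""
    return [[amp * math.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
             for x in range(nx)] for y in range(ny)]

def center_of_mass(img):
    """Intensity-weighted centroid (x, y) of an image patch."""
    total = sx = sy = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

# Sub-pixel column position recovered from pixelated intensities
img = gaussian_peak(15, 15, x0=7.3, y0=6.8, sigma=2.0)
cx, cy = center_of_mass(img)
```

The published tool additionally implements 2D Gaussian fitting, which is more robust than the raw centroid under noise and overlapping columns; the sketch shows only the simpler of the two estimators.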

  3. Optimized protein extraction for quantitative proteomics of yeasts.

    Directory of Open Access Journals (Sweden)

    Tobias von der Haar

    2007-10-01

    The absolute quantification of intracellular protein levels is technically demanding, but has recently become more prominent because novel approaches like systems biology and metabolic control analysis require knowledge of these parameters. Current protocols for the extraction of proteins from yeast cells are likely to introduce artifacts into quantification procedures because of incomplete or selective extraction. We have developed a novel procedure for protein extraction from S. cerevisiae based on chemical lysis and simultaneous solubilization in SDS and urea, which can extract the great majority of proteins to apparent completeness. The procedure can be used for different Saccharomyces yeast species and varying growth conditions, is suitable for high-throughput extraction in a 96-well format, and the resulting extracts can easily be post-processed for use in non-SDS-compatible procedures like 2D gel electrophoresis. An improved method for quantitative protein extraction has thus been developed that removes some of the sources of artifacts in quantitative proteomics experiments, while at the same time allowing novel types of applications.

  4. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities (people, organizations, and locations), as well as relationships and events, from text documents is described herein.

  5. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    and A G DE WET. Department of Mathematical Statistics, University of Port Elizabeth. Accepted: May 1978. ABSTRACT. Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both.

  6. Toward 3D structural information from quantitative electron exit wave analysis

    International Nuclear Information System (INIS)

    Borisenko, Konstantin B; Moldovan, Grigore; Kirkland, Angus I; Wang, Amy; Van Dyck, Dirk; Chen, Fu-Rong

    2012-01-01

    Simulations show that using a new direct imaging detector and accurate exit wave restoration algorithms allows nearly quantitative restoration of the electron exit wave phase, which can be regarded as only qualitative for conventional indirect imaging cameras. This opens up the possibility of extracting accurate information on the 3D atomic structure of the sample even from a single projection.

  7. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also for information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance. While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and video…

  8. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  9. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, the creation of geographical information system (GIS) databases, and urban city models. Traditional man-made feature extraction methods are expensive in terms of equipment, labour intensive, require well-trained personnel, and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas from high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects negatively affects their extracted shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different characteristics. The first area was Tungi in Dar es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are established illegally within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance…

  10. Amines as extracting agents for the quantitative determinations of actinides in biological samples

    International Nuclear Information System (INIS)

    Singh, N.P.

    1987-01-01

    The use of amines (primary, secondary and tertiary amines and quaternary ammonium salts) as extracting agents for the quantitative determination of actinides in biological samples is reviewed. Among the primary amines, only Primene JM-T has been used to determine Pu in urine and bone. The possibility of using secondary amines to quantitatively extract actinides from biological samples has not been investigated. Among the tertiary amines, tri-n-octylamine, tri-iso-octylamine, tricaprylamine (Alamine) and trilaurylamine (tridodecylamine) are used extensively to extract and separate the actinides from biological samples. Only one quaternary ammonium salt, methyltricapryl ammonium chloride (Aliquat-336), has been used to extract Pu from biological samples. (author) 28 refs

  11. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing thematic information depends on this extraction. Using WorldView-2 high-resolution data and an optimal segmentation parameter method for object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and a combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762

  12. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. 
We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine-learning information extraction method…

  13. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequencies content and several others. A few case studies including isotropic and heter…

  14. Quantitative measurements in laser-induced plasmas using optical probing. Final report

    International Nuclear Information System (INIS)

    Sweeney, D.W.

    1981-01-01

    Optical probing of laser-induced plasmas can be used to quantitatively reconstruct electron number densities and magnetic fields. Numerical techniques for extracting quantitative information from the experimental data are described. A computer simulation of optical probing is used to determine the quantitative information that can reasonably be extracted from real experimental interferometric systems to reconstruct electron number density distributions. An example of a reconstructed interferogram shows a steepened electron distribution due to radiation-pressure effects.
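The report does not spell out its reconstruction algorithm; as a hedged illustration, a common discrete route from a line-integrated interferometric phase profile to a radial electron-density profile is onion peeling under an axisymmetry assumption (the grid, function names, and profile values below are illustrative):

```python
import math

def chord_lengths(n, dr=1.0):
    """L[i][j]: path length of the chord at offset y_i through annulus j (j >= i)."""
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            L[i][j] = 2.0 * dr * (math.sqrt((j + 1) ** 2 - i ** 2)
                                  - math.sqrt(j ** 2 - i ** 2))
    return L

def forward_abel(profile, L):
    """Line-integrated projection of a radial profile (e.g. phase shift)."""
    n = len(profile)
    return [sum(L[i][j] * profile[j] for j in range(i, n)) for i in range(n)]

def onion_peel(projection, L):
    """Invert the projection annulus by annulus, outermost first."""
    n = len(projection)
    f = [0.0] * n
    for i in range(n - 1, -1, -1):
        outer = sum(L[i][j] * f[j] for j in range(i + 1, n))
        f[i] = (projection[i] - outer) / L[i][i]
    return f

# A steepened density profile: flat core with a sharp outer drop
n_e = [1.0] * 6 + [0.5, 0.2, 0.05, 0.0]
L = chord_lengths(len(n_e))
phase = forward_abel(n_e, L)       # what the interferogram encodes
recovered = onion_peel(phase, L)   # reconstructed electron density
```

Because the discrete forward model is triangular, the back-substitution recovers the profile exactly here; with real, noisy interferograms regularization would be needed.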

  15. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    Energy Technology Data Exchange (ETDEWEB)

    Omata, Noriyasu; Shirayama, Susumu, E-mail: omata@nakl.t.u-tokyo.ac.jp, E-mail: sirayama@sys.t.u-tokyo.ac.jp [Department of Systems Innovation, School of Engineering, The University of Tokyo, Hongo 7-3-1, Bunkyo-ku, Tokyo, 113-8656 (Japan)

    2017-10-15

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)
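As a rough sketch of the stereo step (not the authors' calibration pipeline; the focal length and baseline below are hypothetical), triangulating a matched tuft point from a rectified image pair reduces to the classic disparity relation Z = f·B/d:

```python
def project(point, f, baseline):
    """Project a 3D point (X, Y, Z) into a rectified stereo pair.

    Left camera at the origin, right camera offset by `baseline` along X;
    returns (x_left, x_right, y) in pixel units for focal length f.
    """
    X, Y, Z = point
    return f * X / Z, f * (X - baseline) / Z, f * Y / Z

def triangulate(x_left, x_right, y, f, baseline):
    """Recover (X, Y, Z) from a matched point in the two images."""
    disparity = x_left - x_right       # larger for closer points
    Z = f * baseline / disparity
    return x_left * Z / f, y * Z / f, Z

f, B = 800.0, 0.1                      # hypothetical camera parameters
tuft_tip = (0.25, -0.05, 2.0)          # metres, in the left-camera frame
xl, xr, y = project(tuft_tip, f, B)
recovered = triangulate(xl, xr, y, f, B)
```

Applying this to both endpoints of a tuft segment yields the quantitative 3D flow direction as the normalized difference of the two recovered points.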

  16. Extracting quantitative three-dimensional unsteady flow direction from tuft flow visualizations

    International Nuclear Information System (INIS)

    Omata, Noriyasu; Shirayama, Susumu

    2017-01-01

    We focus on the qualitative but widely used method of tuft flow visualization, and propose a method for quantifying it using information technology. By applying stereo image processing and computer vision, the three-dimensional (3D) flow direction in a real environment can be obtained quantitatively. In addition, we show that the flow can be divided temporally by performing appropriate machine learning on the data. Acquisition of flow information in real environments is important for design development, but it is generally considered difficult to apply simulations or quantitative experiments to such environments. Hence, qualitative methods including the tuft method are still in use today. Although attempts have been made previously to quantify such methods, it has not been possible to acquire 3D information. Furthermore, even if quantitative data could be acquired, analysis was often performed empirically or qualitatively. In contrast, we show that our method can acquire 3D information and analyze the measured data quantitatively. (paper)

  17. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    In this article we show how Channel Theory (Barwise and Seligman, 1997) can be used to model the process of information extraction performed by audiences of audio-visual content. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can undertake the extraction of information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process proceeds for each channel, we propose a method for representing all the informative values an agent can obtain from a content, using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation reflects the evolution of the informative items through the evolution of the audio-visual content.

  18. Quantitative analysis of perfumes in talcum powder by using headspace sorptive extraction.

    Science.gov (United States)

    Ng, Khim Hui; Heng, Audrey; Osborne, Murray

    2012-03-01

    Quantitative analysis of perfume dosage in talcum powder has been a challenge due to interference of the matrix and has so far not been widely reported. In this study, headspace sorptive extraction (HSSE) was validated as a solventless sample preparation method for the extraction and enrichment of perfume raw materials from talcum powder. Sample enrichment is performed on a thick film of poly(dimethylsiloxane) (PDMS) coated onto a magnetic stir bar incorporated in a glass jacket. Sampling is done by placing the PDMS stir bar in the headspace vial by using a holder. The stir bar is then thermally desorbed online with capillary gas chromatography-mass spectrometry. The HSSE method is based on the same principles as headspace solid-phase microextraction (HS-SPME). Nevertheless, a relatively larger amount of extracting phase is coated on the stir bar as compared to SPME. Sample amount and extraction time were optimized in this study. The method has shown good repeatability (with relative standard deviation no higher than 12.5%) and excellent linearity with correlation coefficients above 0.99 for all analytes. The method was also successfully applied in the quantitative analysis of talcum powder spiked with perfume at different dosages. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
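The reported linearity check (correlation coefficients above 0.99) corresponds to an ordinary least-squares calibration line relating spiked dosage to detector response; a self-contained sketch with made-up dosage and peak-area values:

```python
import math

def linear_fit(x, y):
    """Ordinary least squares y = a*x + b, plus the correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx, sxy / math.sqrt(sxx * syy)

# Hypothetical calibration: perfume dosage (ppm) vs. GC-MS peak area
dosage = [0.0, 5.0, 10.0, 20.0, 40.0]
area = [1.0, 52.0, 99.0, 203.0, 398.0]   # made-up detector response
slope, intercept, r = linear_fit(dosage, area)
```

An unknown sample's dosage would then be read off as `(measured_area - intercept) / slope`, valid only within the calibrated range.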

  19. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  20. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current IE programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential, based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs. © The Author(s) 2016. Published by Oxford University Press.

  1. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text…

  2. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality.

  3. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self-composition.
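One standard quantitative-information-flow fact that underlies such counting arguments (a textbook result, not this paper's hyperproperty formalism): for a deterministic program with a uniformly distributed secret, the min-entropy leakage equals log2 of the number of distinct observable outputs. A sketch:

```python
import math

def leakage_bits(program, secrets):
    """Min-entropy leakage of a deterministic program over uniform secrets:
    log2 of the number of distinct observable outputs."""
    observables = {program(s) for s in secrets}
    return math.log2(len(observables))

secrets = range(16)                       # a 4-bit secret

# A checker revealing only equality with one guess: 2 observables -> 1 bit
guess_check = lambda s: s == 7

# A checker that leaks the low two bits: 4 observables -> 2 bits
low_bits = lambda s: s % 4

one_bit = leakage_bits(guess_check, secrets)
two_bits = leakage_bits(low_bits, secrets)
```

Deciding whether leakage stays below a bound k is precisely the kind of property the paper recasts as a hyperproperty checkable via self-composition.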

  4. Extracting quantitative structural parameters for disordered polymers from neutron scattering data

    International Nuclear Information System (INIS)

    Rosi-Schwartz, B.; Mitchell, G.R.

    1995-01-01

    The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure in terms of the configuration and conformation of the polymer chain and of the packing characteristics of the chain in the bulk material represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information that they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data analysis problem simultaneously from several fronts.
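Of the modelling routes named above, reverse Monte Carlo is the easiest to sketch: propose random moves of a structural model and accept them only when they do not worsen the misfit to the measured data. The toy below fits a 1D pair-distance histogram with greedy acceptance (real RMC compares against scattering data and uses a Metropolis criterion with an effective temperature; all parameters here are illustrative):

```python
import random

def histogram(positions, bins, width):
    """Pair-distance histogram of a 1D configuration."""
    h = [0] * bins
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            b = int(abs(positions[i] - positions[j]) / width)
            if b < bins:
                h[b] += 1
    return h

def chi2(h, target):
    return sum((a - b) ** 2 for a, b in zip(h, target))

random.seed(0)
bins, width, box = 10, 1.0, 10.0

# "Experimental" data from a near-regular chain of 10 particles
target_pos = [i + 0.05 * random.random() for i in range(10)]
target = histogram(target_pos, bins, width)

pos = [random.uniform(0, box) for _ in range(10)]   # random start
cost = chi2(histogram(pos, bins, width), target)
start_cost = cost
for _ in range(2000):
    i = random.randrange(len(pos))
    old = pos[i]
    pos[i] = min(box, max(0.0, old + random.gauss(0.0, 0.3)))
    new_cost = chi2(histogram(pos, bins, width), target)
    if new_cost <= cost:        # greedy RMC acceptance
        cost = new_cost
    else:
        pos[i] = old            # reject: restore the configuration
```

By construction the misfit never increases, mirroring how RMC drives a structural model toward consistency with the data without assuming an interaction potential.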

  5. Novel mathematic models for quantitative transitivity of quality-markers in extraction process of the Buyanghuanwu decoction.

    Science.gov (United States)

    Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long

    2018-06-01

    Nowadays, in researching and formulating efficient extraction systems for Chinese herbal medicine, scientists face a great challenge in quality management, and so the transitivity of Q-markers in the quantitative analysis of TCM was recently proposed by Prof. Liu. In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in the quantitative analysis of TCM was established. Buyanghuanwu decoction (BYHWD) is a common TCM prescription used to prevent and treat ischemic heart and brain diseases. In this paper, we selected BYHWD as the extraction experimental subject to study the quantitative transitivity of TCM. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. The kinetic equations of the extraction models were fitted, and the inherent parameters of the plant material and the Q-marker quantitative transfer coefficients were calculated; these served as indexes to evaluate the transitivity of Q-markers in the quantitative analysis of the extraction process of BYHWD. HPLC was applied to screen and analyze the potential Q-markers in the extraction process. Fick's law and the Noyes-Whitney equation were adopted for mathematically modelling the extraction process. Kinetic parameters were fitted and calculated with the Statistical Program for Social Sciences 20.0 software. The transfer efficiency was described and evaluated via the potential Q-marker transfer trajectory, using the transitivity availability AUC, extraction ratio P, and decomposition ratio D, respectively. Q-markers were identified by AUC, P and D. Astragaloside IV, laetrile, paeoniflorin, and ferulic acid were studied as potential Q-markers from BYHWD. The relevant technological parameters were presented by the mathematical models, which could adequately illustrate the inherent properties of the raw materials…
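The Noyes-Whitney equation implies a first-order approach of the extract concentration to its saturation plateau, C(t) = C_sat(1 − e^(−kt)), so the rate constant can be recovered by log-linearization. A sketch on synthetic data (the rate and plateau values are invented, not taken from the paper):

```python
import math

k_true, c_sat = 0.25, 100.0      # hypothetical rate constant and plateau
times = [1.0, 2.0, 4.0, 6.0, 8.0, 12.0]
conc = [c_sat * (1 - math.exp(-k_true * t)) for t in times]

# ln(1 - C/C_sat) = -k*t, so k is the negated least-squares slope
# of the log-transformed data through the origin
ys = [math.log(1 - c / c_sat) for c in conc]
k_fit = -sum(t * y for t, y in zip(times, ys)) / sum(t * t for t in times)
```

With real extraction data the plateau C_sat would itself be estimated (e.g. from the last sampling points), and the fitted k becomes one of the inherent kinetic parameters the paper feeds into its transfer-coefficient indexes.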

  6. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of Fourier information from OAS (optical aperture synthesis) objects. Translation-invariant wavelet denoising, based on Donoho's wavelet soft-threshold denoising, is investigated to remove pseudo-Gibbs artifacts from wavelet soft-threshold images. Information extraction for OAS objects based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of object information extracted from an interferogram, and that information extraction with translation-invariant wavelet denoising outperforms that with plain soft-threshold wavelet denoising.
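A minimal version of the scheme described above — Donoho soft thresholding of wavelet detail coefficients, made approximately translation-invariant by cycle spinning — can be sketched with a single-level Haar transform (the 8-sample signal and averaging over a single shift are illustrative simplifications; full cycle spinning averages over all shifts):

```python
def haar(x):
    """Single-level Haar transform: (approximation, detail) coefficients."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return a, d

def ihaar(a, d):
    """Inverse single-level Haar transform."""
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def soft(v, t):
    """Donoho soft threshold: shrink toward zero by t."""
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

def denoise(x, t):
    a, d = haar(x)
    return ihaar(a, [soft(v, t) for v in d])

def denoise_ti(x, t):
    """Translation-invariant variant: average over circular shifts."""
    y0 = denoise(x, t)
    shifted = x[1:] + x[:1]
    y1 = denoise(shifted, t)
    y1 = y1[-1:] + y1[:-1]       # undo the shift
    return [(u + v) / 2 for u, v in zip(y0, y1)]

signal = [0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0]
```

The shift-and-average step is exactly what suppresses the pseudo-Gibbs oscillations near edges that plain soft thresholding leaves behind.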

  7. The development of quantitative determination method of organic acids in complex poly herbal extraction

    Directory of Open Access Journals (Sweden)

    I. L. Dyachok

    2016-08-01

    Aim. The development of a sensitive, economical and rapid method for the quantitative determination of organic acids in a complex polyherbal extract, calculated as isovaleric acid, with the use of digital technologies. Materials and methods. A model complex polyherbal extract with sedative action was chosen as the research object. The extract is composed of the following medicinal plants: Valeriana officinalis L., Crataegus, Melissa officinalis L., Hypericum, Mentha piperita L., Humulus lupulus, Viburnum. Based on the chemical composition of the plant components, we consider that the main pharmacologically active compounds of the complex polyherbal extract are: polyphenolic substances (flavonoids), contained in Crataegus, Viburnum, Hypericum, Mentha piperita L. and Humulus lupulus; organic acids, including isovaleric acid, contained in Valeriana officinalis L., Mentha piperita L., Melissa officinalis L. and Viburnum; and amino acids, contained in Valeriana officinalis L. For the determination of organic acids at low concentration we applied an instrumental method of analysis, namely conductometric titration, based on the dependence of the conductivity of an aqueous solution of the complex polyherbal extract on its organic acid content. Result. The obtained analytical dependences, which describe the tangent lines to the conductometric curve before and after the equivalence point, allow the volume of solution expended on titration to be determined and the quantitative determination of organic acids to be carried out digitally. Conclusion. The proposed method enables the equivalence point to be determined and the quantitative determination of organic acids, calculated as isovaleric acid, to be carried out with the use of digital technologies, which allows the method to be fully computerized.
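The described procedure — straight tangent lines fitted to the conductometric curve before and after the equivalence point, then intersected to find the titrant volume — can be sketched as follows (the conductivity readings are synthetic, not the paper's data):

```python
def fit_line(pts):
    """Least-squares line y = a*x + b through (x, y) points."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxx = sum((p[0] - mx) ** 2 for p in pts)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pts)
    a = sxy / sxx
    return a, my - a * mx

# Synthetic conductometric titration: conductivity falls until the
# equivalence volume (3.0 mL here), then rises with excess titrant
before = [(0.0, 9.0), (1.0, 7.0), (2.0, 5.0)]   # (volume mL, conductivity)
after = [(4.0, 4.5), (5.0, 6.0), (6.0, 7.5)]
a1, b1 = fit_line(before)
a2, b2 = fit_line(after)
v_eq = (b2 - b1) / (a1 - a2)   # intersection = equivalence volume
```

The acid content then follows from v_eq, the titrant concentration, and the molar mass of the reference acid (isovaleric acid in the paper).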

  8. Extraction and quantitation of total cholesterol, dolichol and dolichyl phosphate from mammalian liver

    International Nuclear Information System (INIS)

    Crick, D.C.; Carroll, K.K.

    1987-01-01

    A procedure is described for the determination of total cholesterol, dolichol and dolichyl phosphate (Dol-P) in mammalian liver. It is based on extraction of these compounds into diethyl ether after alkaline saponification of the tissue. Extractability is affected by the length of saponification and the concentration of potassium hydroxide (KOH) in the saponification mixture. After extraction, total cholesterol and dolichol are quantitated directly by reverse-phase high pressure liquid chromatography (HPLC) on C18. Dol-P requires further purification before quantitation by HPLC; this is accomplished by chromatography on silicic acid. These methods gave recoveries of over 90% for cholesterol and dolichol and about 60% for Dol-P, using [4-¹⁴C]cholesterol, a polyprenol containing 15 isoprene units, and [1-¹⁴C]Dol-P as recovery standards. Concentrations of total cholesterol, dolichol and Dol-P in livers from one-month-old CBA mice were found to be 5.7 ± 0.7 mg/g, 66.3 ± 1.2 micrograms/g and 3.7 ± 0.3 micrograms/g, respectively.
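The recovery-standard correction implied by the ~60% Dol-P recovery is simple arithmetic: divide the measured amount by the fraction of the co-extracted radiolabelled standard that survives the workup. A hypothetical sketch (the function name and numbers are illustrative, not from the paper):

```python
def recovery_corrected(measured, recovery_fraction):
    """Correct a measured amount for extraction losses, using the fraction of a
    co-extracted radiolabelled standard that was recovered."""
    if not 0 < recovery_fraction <= 1:
        raise ValueError("recovery fraction must be in (0, 1]")
    return measured / recovery_fraction

# e.g. if Dol-P is recovered at ~60%, a measured 2.2 µg/g implies ~3.7 µg/g present
print(round(recovery_corrected(2.2, 0.60), 1))  # → 3.7
```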

  9. A method to extract quantitative information in analyzer-based x-ray phase contrast imaging

    International Nuclear Information System (INIS)

    Pagot, E.; Cloetens, P.; Fiedler, S.; Bravin, A.; Coan, P.; Baruchel, J.; Haertwig, J.; Thomlinson, W.

    2003-01-01

    Analyzer-based imaging is a powerful phase-sensitive technique that generates improved contrast compared to standard absorption radiography. Combining numerically two images taken on either side at ±1/2 of the full width at half-maximum (FWHM) of the rocking curve provides images of 'pure refraction' and of 'apparent absorption'. In this study, a similar approach is taken by combining symmetrical images with respect to the peak of the analyzer rocking curve but at general positions, ±α·FWHM. Neither approach considers independently the ultra-small-angle scattering produced by the object, which can lead to inconsistent results. An accurate way to separately retrieve the quantitative information intrinsic to the object is proposed. It is based on a statistical analysis of the local rocking curve, and allows one to overcome the problems encountered with the previous approaches.

  10. A chemical profiling strategy for semi-quantitative analysis of flavonoids in Ginkgo extracts.

    Science.gov (United States)

    Yang, Jing; Wang, An-Qi; Li, Xue-Jing; Fan, Xue; Yin, Shan-Shan; Lan, Ke

    2016-05-10

    Flavonoid analysis in herbal products is challenged by their vast chemical diversity. This work aimed to develop a chemical profiling strategy for the semi-quantification of flavonoids, using extracts of Ginkgo biloba L. (EGB) as an example. The strategy was based on the principle that flavonoids in EGB have an almost equivalent molecular absorption coefficient at a fixed wavelength. As a result, the molar contents of flavonoids could be semi-quantitatively determined from the molar-concentration calibration curves of common standards and recalculated as mass contents with the characterized molecular weight (MW). Twenty batches of EGB were subjected to HPLC-UV/DAD/MS fingerprinting analysis to test the feasibility and reliability of this strategy. The flavonoid peaks were distinguished from the other peaks with principal component analysis and Pearson correlation analysis of the normalized UV spectrometric dataset. Each flavonoid peak was then tentatively identified from the MS data to ascertain its MW. Notably, the flavonoid absorption at Band-II (240-280 nm) was more suitable for the semi-quantification purpose because of its smaller variation compared to that at Band-I (300-380 nm). The semi-quantification was therefore conducted at 254 nm. Beyond the qualitative comparisons obtained by common chemical profiling techniques, the semi-quantitative approach provided detailed compositional information on the flavonoids in EGB and demonstrated how the adulteration of one batch was achieved. The developed strategy is believed to be useful for the advanced analysis of herbal extracts with a high flavonoid content, without laborious identification and isolation of individual components. Copyright © 2016 Elsevier B.V. All rights reserved.
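The semi-quantification principle described above (calibrate in molar units against a common standard, then convert to mass units via the MW obtained from MS) can be sketched as follows; the standard's calibration points and the analyte MW below are invented for illustration:

```python
import numpy as np

def semi_quantify(peak_area, std_areas, std_molar_conc, mw):
    """Molar-response calibration against a common standard, then conversion to
    mass units via the analyte's molecular weight (MW from MS identification)."""
    slope, intercept = np.polyfit(std_molar_conc, std_areas, 1)
    molar = (peak_area - intercept) / slope   # mol/L of the unknown flavonoid
    return molar * mw                          # g/L

# hypothetical calibration of a common standard, applied to an unknown of MW 464.4
stds = np.array([10e-6, 20e-6, 40e-6])        # mol/L
areas = np.array([120.0, 240.0, 480.0])       # arbitrary UV area units at 254 nm
print(round(semi_quantify(300.0, areas, stds, 464.4) * 1e3, 2))  # → 11.61 (mg/L)
```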

  11. [Quantitative variation of polysaccharides and alcohol-soluble extracts in F1 generation of Dendrobium officinale].

    Science.gov (United States)

    Zhang, Xiao-Ling; Liu, Jing-Jing; Wu, Ling-Shang; Si, Jin-Ping; Guo, Ying-Ying; Yu, Jie; Wang, Lin-Hua

    2013-11-01

    Using the phenol-sulfuric acid method for polysaccharides and the hot-dip method for alcohol-soluble extracts, the contents of polysaccharides and alcohol-soluble extracts in 11 F1 families of Dendrobium officinale were determined. The results showed that the polysaccharide contents of samples collected in May and February were 32.89%-43.07% and 25.77%-35.25%, respectively, while the extract contents were 2.81%-4.85% and 7.90%-17.40%, respectively; both differed significantly among families. The polysaccharide content of the offspring could be significantly improved by hybridization between parents with low and high polysaccharide contents, and the hybrid vigor was obvious; cross breeding is thus an effective way to breed new varieties with higher polysaccharide contents. Harvest time significantly affected the contents of polysaccharides and alcohol-soluble extracts: the polysaccharide contents of families collected in May were higher than those collected in February, while the extract contents showed the opposite variation. The extent of quantitative variation of polysaccharides and alcohol-soluble extracts differed among families, each family following its own pattern. Studying the quantitative accumulation patterns of polysaccharides and alcohol-soluble extracts in superior families (varieties) of D. officinale to determine the best harvesting time would be significant for making full use of these excellent varieties and for increasing their value.

  12. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information, e.g. “自動車の売上が好調 (zidousya no uriage ga koutyou: Sales of cars were good)”. Cause information is useful for investors in selecting companies in which to invest. Our method extracts cause information in the form of causal expressions, automatically, by using statistical information and initial clue expressions. It can extract causal expressions without predetermined patterns or complex hand-crafted rules, and is expected to be applicable to other tasks that acquire phrases with a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that the new method outperforms the previous one.

  13. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some

  14. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  15. Visible light scatter as quantitative information source on milk constituents

    DEFF Research Database (Denmark)

    Melentiyeva, Anastasiya; Kucheryavskiy, Sergey; Bogomolov, Andrey

    2012-01-01

    Fat and protein are two major milk nutrients that are routinely analyzed in the dairy industry. Growing food quality requirements promote the dissemination of spectroscopic analysis, enabling real-time monitoring. The main task here is to extract individual quantitative information on milk fat and total protein content from spectral data. This is a particularly challenging problem in the case of raw natural milk, where the fat globule sizes may essentially differ depending on the source. A designed set of raw milk samples with simultaneously varying fat, total protein and particle size distribution has been analyzed in the Vis spectral region. The feasibility of raw milk analysis by PLS regression on spectral data has been proved, with root mean-square errors below 0.10% and 0.04% for fat and total protein, respectively.

  16. Simultaneous extraction and quantitation of several bioactive amines in cheese and chocolate.

    Science.gov (United States)

    Baker, G B; Wong, J T; Coutts, R T; Pasutto, F M

    1987-04-17

    A method is described for simultaneous extraction and quantitation of the amines 2-phenylethylamine, tele-methylhistamine, histamine, tryptamine, m- and p-tyramine, 3-methoxytyramine, 5-hydroxytryptamine, cadaverine, putrescine, spermidine and spermine. This method is based on extractive derivatization of the amines with a perfluoroacylating agent, pentafluorobenzoyl chloride, under basic aqueous conditions. Analysis was done on a gas chromatograph equipped with an electron-capture detector and a capillary column system. The procedure is relatively rapid and provides derivatives with good chromatographic properties. Its application to analysis of the above amines in cheese and chocolate products is described.

  17. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    Science.gov (United States)

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantification of a large variety of analytes in HR-full scan mode shows that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations, as virtually all ionized compounds are detected with high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion chromatograms. We propose MA parameters, graphs, and equations to calculate a rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
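The relationship between a ppm-based MEW and the m/z bounds of an extracted-ion chromatogram is straightforward to state. The sketch below treats the MEW as the total window width centred on the target m/z (an assumption: some software instead interprets the value as a ± half-width):

```python
def mew_bounds(target_mz, mew_ppm):
    """m/z bounds of a symmetric extraction window of total width `mew_ppm`."""
    half = target_mz * mew_ppm / 2 / 1e6
    return target_mz - half, target_mz + half

def within_window(observed_mz, target_mz, mew_ppm):
    """Would this observed m/z fall inside the extracted-ion chromatogram?"""
    lo, hi = mew_bounds(target_mz, mew_ppm)
    return lo <= observed_mz <= hi

lo, hi = mew_bounds(500.0, 10.0)            # 10 ppm total window at m/z 500
print(f"{lo:.4f}-{hi:.4f}")                 # → 499.9975-500.0025
print(within_window(500.004, 500.0, 10.0))  # → False (a "missed" detection)
```

Too narrow a window produces false negatives like the last line; too wide a window risks false positives from neighbouring ions, which is why the paper derives fit-for-purpose widths.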

  18. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes in recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, with varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in reality. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information. Besides word and punctuation indices, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify different attributes of fact information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.

  19. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described to deal with the problems of extracting Internet information caused by the non-standard and heterogeneous structure of Chinese websites. The agent comprises three modules, each responsible for one stage of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to structure the extracted information expressed in natural language is also discussed.

  20. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that get structured representations out of unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide this process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables the recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90% of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with F1 = .989 (micro average) and F1 = .963 (macro average). As a result of keyword matching and restrained concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and on documents with uncommon layout. The developed terminology and the proposed information extraction system allow the extraction of fine-grained information from German semi-structured transthoracic echocardiography reports.

  1. Quantitative measurements in laser induced plasmas using optical probing. Progress report, October 1, 1977--April 30, 1978

    International Nuclear Information System (INIS)

    Sweeney, D.W.

    1978-06-01

    Optical probing of laser-induced plasmas can be used to quantitatively reconstruct electron number densities and magnetic fields. Numerical techniques for extracting quantitative information from the experimental data are described and four Abel inversion codes are provided. A computer simulation of optical probing is used to determine the quantitative information that can reasonably be extracted from real experimental systems. Examples of reconstructed electron number densities from interferograms of laser plasmas show steepened electron distributions.
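Abel inversion recovers a radial profile f(r) from line-integrated measurements P(y) of an axisymmetric object. One of the simplest discretizations is onion peeling, sketched below on synthetic data; this is a generic illustration, not one of the report's four codes:

```python
import numpy as np

def abel_invert_onion(projection, dr=1.0):
    """Onion-peeling inverse Abel transform: recover the radial profile f(r)
    from a half-profile of line-integrated data P(y) on a uniform grid."""
    n = len(projection)
    r = np.arange(n + 1) * dr                      # shell boundaries
    # A[i, j] = chord length of the ray at y_i = r[i] through shell j (j >= i)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            A[i, j] = 2.0 * (np.sqrt(r[j + 1]**2 - r[i]**2)
                             - np.sqrt(max(r[j]**2 - r[i]**2, 0.0)))
    # upper-triangular system: peel from the outermost shell inwards
    return np.linalg.solve(A, np.asarray(projection, dtype=float))

# synthetic test: a uniform disk f = 1 for r < R projects to P(y) = 2*sqrt(R^2 - y^2)
R, n = 10.0, 10
y = np.arange(n)
P = 2.0 * np.sqrt(R**2 - y**2)
f = abel_invert_onion(P)
print(np.allclose(f, 1.0, atol=1e-6))  # → True
```

Real interferometric data are noisy, and the derivative-like behaviour of the inversion amplifies that noise, which is why production codes add smoothing or regularization.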

  2. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai [Bioinformatics and Molecular Design Research Center, Seoul (Korea, Republic of); Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun [Yonsei University, Seoul (Korea, Republic of); Chu, Young Hwan [Sangji University, Wonju (Korea, Republic of); Cho, Kwang-Hwi [Soongsil University, Seoul (Korea, Republic of)

    2016-04-15

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  3. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    International Nuclear Information System (INIS)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai; Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun; Chu, Young Hwan; Cho, Kwang-Hwi

    2016-01-01

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  4. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  5. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, by the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and by the challenges that the ext...

  6. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, P; Weeke, B; Loewenstein, H [Rigshospitalet, Copenhagen (Denmark)

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the major allergens obtained by means of gel chromatography were 2.4 × 10⁴, 2 × 10⁴ and 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A.

  7. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the major allergens obtained by means of gel chromatography were 2.4 × 10⁴, 2 × 10⁴ and 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  8. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Full Text Available Biomedical text mining targets the extraction of significant information from biomedical archives. BioTM encompasses Information Retrieval (IR) and Information Extraction (IE). Information retrieval retrieves the relevant biomedical literature documents from various repositories, such as PubMed, MEDLINE etc., based on a search query. The IR process ends with the generation of a corpus of the relevant documents retrieved from the publication databases. The IE task includes preprocessing of the documents, Named Entity Recognition (NER) and relationship extraction, and draws on natural language processing, data mining techniques and machine-learning algorithms. The preprocessing task includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins or cell lines. This phase leads to the next, the extraction of relationships (IE). The work was based on the Conditional Random Field (CRF) machine-learning algorithm.
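Tagging with a linear-chain CRF as mentioned above ultimately reduces to Viterbi decoding over the label set. A toy decoder with hand-set scores (the BIO labels, tokens and score matrices below are invented for illustration, not a trained model):

```python
import numpy as np

def viterbi(emissions, transitions):
    """Best label sequence for a linear-chain CRF.
    emissions: (T, L) per-token label scores; transitions: (L, L) label-pair scores."""
    T, L = emissions.shape
    score = emissions[0].copy()
    back = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        # cand[i, j] = best score ending in label i at t-1, then label j at t
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):          # follow backpointers
        path.append(int(back[t][path[-1]]))
    return path[::-1]

labels = ["O", "B-PROTEIN", "I-PROTEIN"]
emis = np.array([[0.1, 2.0, 0.0],    # "p53"    favours B-PROTEIN
                 [0.2, 0.0, 1.5],    # "kinase" favours I-PROTEIN
                 [2.0, 0.0, 0.0]])   # "binds"  favours O
trans = np.array([[0.5, 0.5, -2.0],  # O cannot plausibly precede I
                  [0.0, 0.0, 1.0],   # B -> I favoured
                  [0.5, 0.0, 0.5]])
print([labels[i] for i in viterbi(emis, trans)])  # → ['B-PROTEIN', 'I-PROTEIN', 'O']
```

A real CRF learns the emission and transition scores from labelled corpora; only the decoding step is shown here.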

  9. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    Full Text Available OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected, relative to its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSRs in nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
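The fixed-effect meta-analysis used to compute the SES pools per-study effect sizes on the log scale with inverse-variance weights. A minimal sketch with invented study data (not the review's):

```python
import math

def fixed_effect_pool(log_es, variances):
    """Inverse-variance weighted summary effect size with a 95% CI (fixed-effect model)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, log_es)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci

# three hypothetical studies: relative risks 0.8, 0.9, 0.7 with differing precision
log_rr = [math.log(0.8), math.log(0.9), math.log(0.7)]
var = [0.04, 0.02, 0.08]
pooled, (lo, hi) = fixed_effect_pool(log_rr, var)
print(round(math.exp(pooled), 3))  # pooled relative risk, back-transformed
```

The ICM versus HLM choice in the abstract changes which per-study ES and variances feed this pooling step, not the pooling arithmetic itself.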

  10. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, the semantic information of lanes is very important. This paper proposes a method for the automatic detection of lanes and the extraction of semantic information from onboard camera videos. The proposed method first detects the edges of lanes from the grayscale gradient direction and fits them with an improved Probabilistic Hough transform; it then uses the vanishing-point principle to calculate the geometrical position of the lane, and uses lane characteristics to extract lane semantic information through decision-tree classification. In the experiment, 216 road video images captured by a camera mounted on a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
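The Hough-fitting step can be illustrated with a bare-bones (θ, ρ) accumulator over synthetic edge points. The paper's improved Probabilistic Hough transform samples edge points and recovers line segments; this plain sketch only finds the dominant line:

```python
import math

def hough_peak(edge_points, thetas_deg=range(0, 180)):
    """Vote (theta, rho) for each edge pixel; return the dominant line's bin."""
    votes = {}
    for x, y in edge_points:
        for t in thetas_deg:
            th = math.radians(t)
            # normal form of a line: rho = x*cos(theta) + y*sin(theta)
            rho = round(x * math.cos(th) + y * math.sin(th))
            votes[(t, rho)] = votes.get((t, rho), 0) + 1
    return max(votes, key=votes.get)   # (theta in degrees, rho in pixels)

# synthetic lane edge along y = x: all points agree only at theta = 135°, rho = 0
pts = [(i, i) for i in range(50)]
print(hough_peak(pts))  # → (135, 0)
```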

  11. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be worthwhile to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  12. Extraction of fish body oil from Sardinella longiceps by employing direct steaming method and its quantitative and qualitative assessment

    Directory of Open Access Journals (Sweden)

    Moorthy Pravinkumar

    2015-12-01

Objective: To analyze the quantitative and qualitative properties of fish oil extracted from Sardinella longiceps (S. longiceps). Methods: Four size groups of S. longiceps were examined for the extraction of fish oil based on length: Group I (7.1–10.0 cm), Group II (10.1–13.0 cm), Group III (13.1–16.0 cm) and Group IV (16.1–19.0 cm). Fish oil was extracted from the tissues of S. longiceps by the direct steaming method. The oil was then subjected to determination of specific gravity, refractive index, moisture content, free fatty acids, iodine value, peroxide value, saponification value and colour. Results: The four groups showed different oil yields: Group IV recorded the highest value (165.00 ± 1.00 mL/kg), followed by Group III (145.66 ± 1.15 mL/kg) and Group II (129.33 ± 0.58 mL/kg), whereas Group I recorded the lowest value (78.33 ± 0.58 mL/kg) in the monsoon season; the average yield was 180.0 ± 4.9 mL/kg of fish tissue. These analytical values of the crude oil were well within the acceptable standard values for both fresh and stocked samples. Conclusions: The information generated in the present study on the quantitative and qualitative analysis of fish oil will serve as a baseline reference for entrepreneurs and industrialists in the future for the successful commercial production of fish oil from oil sardines.

  13. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. Most existing methods extract important points at a fixed scale, yet the geometric features of a 3D object arise at multiple geometric scales. We propose a multi-scale construction method based on radial basis functions. At each scale, important points are extracted from the point cloud according to their importance, and the perceptual metric Just-Noticeable-Difference is applied to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments evaluating the effectiveness of the proposed method suggest a reliable solution for optimal information extraction from scanned objects.
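As a rough illustration of importance-based point selection (one ingredient of the method above), the sketch below scores each point by its distance from the centroid of its k nearest neighbours, a crude curvature proxy: flat regions score low, edges and spikes score high. The scoring rule and parameters are assumptions for this sketch, not the paper's RBF construction:

```python
import numpy as np

def reduce_by_importance(points, keep_ratio=0.5, k=8):
    """Keep the most 'important' points of an (n, 3) array, scoring each
    point by how far it lies from the centroid of its k nearest neighbours
    (brute-force O(n^2) neighbour search, fine for small clouds)."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    order = np.argsort(d2, axis=1)[:, 1:k + 1]      # k nearest, excluding self
    centroids = points[order].mean(axis=1)
    score = np.linalg.norm(points - centroids, axis=1)
    keep = np.argsort(score)[::-1][:max(1, int(keep_ratio * len(points)))]
    return points[np.sort(keep)]
```

On a flat grid with one raised point, the raised point survives even aggressive reduction, since its neighbourhood centroid stays on the plane.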

  15. Application of crown ethers to selective extraction and quantitative analysis of technetium 99, iodine 129 and cesium 135 in effluents

    International Nuclear Information System (INIS)

    Paviet, P.

    1992-01-01

Properties of crown ethers are first recalled. The extraction of technetium-99 is then studied in actual radioactive effluents; quantitative analysis is carried out by liquid scintillation counting, with correction for tritium interference. Iodine-129 is extracted from radioactive effluents and determined by gamma spectrometry. Finally, cesium-135 is extracted and determined by thermal ionization mass spectrometry.

  16. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

Information extraction is an early stage in the analysis of textual data, and is required to obtain information that can be used in subsequent analyses such as classification and categorization. Textual data are strongly influenced by language. Arabic is gaining significant attention in many studies because it differs markedly from other languages and, in contrast to other languages, tools and research for Arabic are still lacking. The information extracted using a knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert, which takes a long time and is specific to a single problem. This paper proposes a method for automatically building a knowledge dictionary: the dictionary is formed by clustering sentences that share the same concept, on the assumption that such sentences will have a high similarity value. The extracted concepts can then be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction results were tested with a decision-tree classifier; the highest precision obtained was 71.0% and the highest recall was 75.0%.
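The underlying assumption (sentences sharing a concept have high similarity) can be sketched with bag-of-words cosine similarity and greedy grouping. The 0.5 threshold and the choice of a cluster's first member as its representative are illustrative assumptions, not the paper's actual procedure:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    num = sum(a[t] * b.get(t, 0) for t in a)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def cluster_sentences(sentences, threshold=0.5):
    """Greedily group sentences whose similarity to a cluster's first member
    exceeds the threshold; each cluster approximates one 'concept' entry."""
    clusters = []
    for s in sentences:
        vec = Counter(s.lower().split())
        for c in clusters:
            if cosine(vec, c["rep"]) >= threshold:
                c["members"].append(s)
                break
        else:
            clusters.append({"rep": vec, "members": [s]})
    return [c["members"] for c in clusters]
```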

  17. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that captures the information flow. The uncertainty of the information is then quantified using Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction performed by operators under input information overload.
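The basic information-theoretic quantity underlying such models, the mutual information actually transmitted from the input of a processing stage to its output, can be estimated from observed samples. This is a generic sketch of that quantity, not Conant's specific decomposition:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits, estimated from observed (input, output) samples:
    the amount of input information transmitted through a stage."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

A perfectly reliable binary stage transmits 1 bit per sample; a stage whose output is independent of its input transmits none, which is the overload regime the paper is concerned with.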

  18. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  19. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have only qualitatively mapped whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task, and even then, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession-averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  20. Phytochrome quantitation in crude extracts of Avena by enzyme-linked immunosorbent assay with monoclonal antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Shimazaki, Y; Cordonnier, M M; Pratt, L H

    1983-01-01

    An enzyme-linked immunosorbent assay (ELISA), which uses both rabbit polyclonal and mouse monoclonal antibodies to phytochrome, has been adapted for quantitation of phytochrome in crude plant extracts. The assay has a detection limit of about 100 pg phytochrome and can be completed within 10 h. Quantitation of phytochrome in crude extracts of etiolated oat seedlings by ELISA gave values that agreed well with those obtained by spectrophotometric assay. When etiolated oat seedlings were irradiated continuously for 24 h, the amount of phytochrome detected by ELISA and by spectrophotometric assay decreased by more than 1000-fold and about 100-fold, respectively. This discrepancy indicates that phytochrome in light-treated plants may be antigenically distinct from that found in fully etiolated plants. When these light-grown oat seedlings were kept in darkness for 48 h, phytochrome content detected by ELISA increased by 50-fold in crude extracts of green oat shoots, but only about 12-fold in extracts of herbicide-treated oat shoots. Phytochrome reaccumulation in green oat shoots was initially more rapid in the more mature cells of the primary leaf tip than near the basal part of the shoot. The inhibitory effect of Norflurazon on phytochrome accumulation was much more evident near the leaf tip than the shoot base. A 5-min red irradiation of oat seedlings at the end of a 48-h dark period resulted in a subsequent, massive decrease in phytochrome content in crude extracts from both green and Norflurazon-bleached oat shoots. These observations eliminate the possibility that substantial accumulation of chromophore-free phytochrome was being detected and indicate that Norflurazon has a substantial effect on phytochrome accumulation during a prolonged dark period. 25 references, 9 figures, 3 tables.

  1. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

At present, common information extraction methods cannot accurately extract structured emergency-event information, general information retrieval tools cannot completely identify emergency geographic information, and neither provides an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technique based on an event framework, designed to solve the problem of emergency information extraction. It mainly comprises an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm to join toponym pieces into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-frame technique can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that occurred in the past and make arrangements in advance for defense and disaster reduction. The technique decreases casualties and property damage, which is of great significance to the state and society.

  2. Extracting quantitative measures from EAP: a small clinical study using BFOR.

    Science.gov (United States)

    Hosseinbor, A Pasha; Chung, Moo K; Wu, Yu-Chien; Fleming, John O; Field, Aaron S; Alexander, Andrew L

    2012-01-01

    The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents, and hence providing rich information about complex tissue microstructure properties. Bessel Fourier orientation reconstruction (BFOR) is one of several analytical, non-Cartesian EAP reconstruction schemes employing multiple shell acquisitions that have recently been proposed. Such modeling bases have not yet been fully exploited in the extraction of rotationally invariant q-space indices that describe the degree of diffusion anisotropy/restrictivity. Such quantitative measures include the zero-displacement probability (P(o)), mean squared displacement (MSD), q-space inverse variance (QIV), and generalized fractional anisotropy (GFA), and all are simply scalar features of the EAP. In this study, a general relationship between MSD and q-space diffusion signal is derived and an EAP-based definition of GFA is introduced. A significant part of the paper is dedicated to utilizing BFOR in a clinical dataset, comprised of 5 multiple sclerosis (MS) patients and 4 healthy controls, to estimate P(o), MSD, QIV, and GFA of corpus callosum, and specifically, to see if such indices can detect changes between normal appearing white matter (NAWM) and healthy white matter (WM). Although the sample size is small, this study is a proof of concept that can be extended to larger sample sizes in the future.
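Of the scalar indices listed, GFA is the simplest to sketch. The following uses the common definition std(psi)/rms(psi) over sampled ODF values, which is assumed here as a stand-in for the paper's EAP-based definition:

```python
import numpy as np

def gfa(odf):
    """Generalized fractional anisotropy of sampled ODF values:
    std(psi) / rms(psi). Isotropic profiles give 0; a single sharp
    peak approaches 1 as the number of samples grows."""
    odf = np.asarray(odf, dtype=float)
    rms = np.sqrt((odf ** 2).mean())
    return odf.std() / rms if rms else 0.0
```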

  3. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and attracting close attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing images, with their rich texture and geometric information, can effectively improve the accuracy of information extraction. It is therefore feasible to extract information on earthquake-triggered landslides, which cause serious surface damage on a large scale. Taking Wenchuan County as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the Estimation of Scale Parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, textural, geometric and landform features of the images, we establish extraction rules to derive landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information from high-resolution remote sensing, providing important technical support for post-disaster emergency investigation and disaster assessment.

  4. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  5. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available on pre-existing data to learn how to associate segments in the input string with attributes of a given domain relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn from test data structure-based features, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  6. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract, from the structured elements of the DICOM metadata in these files, the information relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
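A minimal sketch of walking a nested dose-report structure is shown below. The dicts with 'name'/'value'/'content' keys are a simplified stand-in for the structured content sequences that a DICOM library such as pydicom would expose, so the field names here are assumptions, not actual DICOM attribute names:

```python
def extract_dose_values(item, wanted=("CTDIvol", "DLP")):
    """Recursively walk a dose-report-like nested structure, collecting
    dose-relevant fields wherever they appear in the tree.  Each node is a
    dict with optional 'name', 'value' and 'content' (children) keys."""
    found = {}
    if item.get("name") in wanted:
        found[item["name"]] = item.get("value")
    for child in item.get("content", []):
        found.update(extract_dose_values(child, wanted))
    return found
```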

  7. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses. · Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  8. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

Many methods are used to extract and process query results from the deep Web, relying on the different structures of Web pages and the various design modes of databases; however, some semantic meanings and relations are ignored. In this paper, we present an approach for post-processing deep Web query results based on a domain ontology, which can exploit these semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks relevant to a specific domain after reducing noisy nodes. Feature vectors of domain books are obtained by a result-set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds a domain ontology of books that not only removes the limits imposed by Web page structures when extracting data information, but also makes use of the semantic meanings of the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer users an ordered list of data records. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately; in addition, relevant data records and basic information are extracted and ranked. The precision and recall achieved show that the proposed method is feasible and efficient.

  9. Information Management Processes for Extraction of Student Dropout Indicators in Courses in Distance Mode

    Directory of Open Access Journals (Sweden)

    Renata Maria Abrantes Baracho

    2016-04-01

This research addresses the use of information management processes to extract student dropout indicators in distance-mode courses. Distance education in Brazil aims to facilitate access to information. The MEC (Ministry of Education) announced, in the second semester of 2013, that the main obstacles faced by institutions offering courses in this mode were student dropout and the resistance of both educators and students to the mode. The research used a mixed methodology, qualitative and quantitative, to obtain student dropout indicators. The factors found and validated in this research were: lack of interest from students, insufficient student training in the use of the virtual learning environment, structural problems in the schools chosen to offer the course, students without e-mail, incoherent answers to course activities, and students' lack of knowledge in using the computer tool. The scenario considered was a distance-mode course called Aluno Integrado (Integrated Student).

  10. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  11. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognitions of clinical events and temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.
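A rule-based normalization stage of the kind described can be sketched as a single regex rule over relative expressions. The patterns and the 30-day month approximation are illustrative assumptions for this sketch, not MedTime's actual rules:

```python
import re
from datetime import date, timedelta

def normalize_relative(expr, anchor):
    """Normalize simple relative time expressions ('3 days ago',
    '2 weeks ago') to ISO dates against an anchor date; returns None
    when no rule matches."""
    m = re.fullmatch(r"(\d+)\s+(day|week|month)s?\s+ago", expr.strip().lower())
    if not m:
        return None
    n, unit = int(m.group(1)), m.group(2)
    days = {"day": 1, "week": 7, "month": 30}[unit]   # coarse month length
    return (anchor - timedelta(days=n * days)).isoformat()
```

Real systems cascade many such rules and fall back to machine-learned recognizers for expressions the rules miss, as the abstract describes.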

  12. A Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

Synthetic Aperture Radar (SAR) has become one of the most important tools for extracting post-disaster collapsed-building information, owing to its extreme versatility and almost all-weather, day-and-night working capability. Since the inherent statistical distribution of speckle in SAR images had not been exploited for this purpose, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution is used to reflect the uniformity of the target. The feature not only takes the statistical distribution of SAR images into account, providing a more accurate description of object texture, but can also be applied to extract collapsed-building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake acquired on April 21, 2010 are used to demonstrate and analyze the performance of the proposed method. In addition, the applicability of the feature to SAR data with different polarizations is analysed, which provides decision support for data selection in collapsed-building information extraction.

  13. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  14. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  15. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source-language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
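The edit-distance half of the approach can be sketched as follows (the word-alignment pivot step is omitted). Greedy clustering against a first-seen canonical spelling, and the distance threshold of 2, are assumptions made for this sketch:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming (one row kept)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[-1] + 1,            # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def normalize_variants(names, max_dist=2):
    """Map each name to a canonical form: it joins the first cluster whose
    representative spelling is within max_dist edits, else starts a new one."""
    canon, reps = {}, []
    for n in names:
        for r in reps:
            if edit_distance(n.lower(), r.lower()) <= max_dist:
                canon[n] = r
                break
        else:
            reps.append(n)
            canon[n] = n
    return canon
```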

  16. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  17. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    Science.gov (United States)

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  18. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

Information extraction (IE); text mining; text repositories; knowledge discovery from ... Evaluation in terms of precision and recall requires extensive experimentation, owing to the lack of public tagged corpora.

  19. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

The subject of information quantity is approached in this paper, with the specific domain of nonconforming-product management as the information source. This work is a case study: raw data were gathered from a heavy industrial works company, and information extraction and knowledge formation are considered herein. The method used for estimating information quantity is based on Shannon's entropy formula. The information and entropy spectra are decomposed and analysed to extract specific information and form knowledge. The result of the entropy analysis points out the information that the organisation involved needs to acquire, presented as a specific type of knowledge.
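The Shannon entropy formula referred to above, applied to observed category frequencies, is a one-liner; treating nonconformity records as a simple categorical stream is an assumption of this sketch:

```python
import math
from collections import Counter

def shannon_entropy(observations):
    """H = -sum p_i * log2(p_i) over observed category frequencies, in bits;
    here a gauge of how much information a stream of nonconformity
    records carries."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A stream dominated by one defect category carries little information per record; a uniform mix over 2^k categories carries k bits per record.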

  20. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  1. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  2. Quantitative approaches to information recovery from black holes

    Energy Technology Data Exchange (ETDEWEB)

    Balasubramanian, Vijay [David Rittenhouse Laboratory, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Czech, Bartlomiej, E-mail: vijay@physics.upenn.edu, E-mail: czech@phas.ubc.ca [Department of Physics and Astronomy, University of British Columbia, 6224 Agricultural Road, Vancouver, BC V6T 1Z1 (Canada)

    2011-08-21

    The evaporation of black holes into apparently thermal radiation poses a serious conundrum for theoretical physics: at face value, it appears that in the presence of a black hole, quantum evolution is non-unitary and destroys information. This information loss paradox has its seed in the presence of a horizon causally separating the interior and asymptotic regions in a black hole spacetime. A quantitative resolution of the paradox could take several forms: (a) a precise argument that the underlying quantum theory is unitary, and that information loss must be an artifact of approximations in the derivation of black hole evaporation, (b) an explicit construction showing how information can be recovered by the asymptotic observer, (c) a demonstration that the causal disconnection of the black hole interior from infinity is an artifact of the semiclassical approximation. This review summarizes progress on all these fronts. (topical review)

  3. DNA agarose gel electrophoresis for antioxidant analysis: Development of a quantitative approach for phenolic extracts.

    Science.gov (United States)

    Silva, Sara; Costa, Eduardo M; Vicente, Sandra; Veiga, Mariana; Calhau, Conceição; Morais, Rui M; Pintado, Manuela E

    2017-10-15

    Most of the fast in vitro assays proposed to determine the antioxidant capacity of a compound/extract lack either biological context or employ complex protocols. Therefore, the present work proposes the improvement of an agarose gel DNA electrophoresis in order to allow for a quantitative estimation of the antioxidant capacity of pure phenolic compounds as well as of a phenolic rich extract, while also considering their possible pro-oxidant effects. The results obtained demonstrated that the proposed method allowed for the evaluation of the protection of DNA oxidation [in the presence of hydrogen peroxide (H2O2) and of an H2O2/iron(III) chloride (FeCl3) system] as well as for the observation of pro-oxidant activities, with the measurements registering interclass correlation coefficients above 0.9. Moreover, this method allowed for the characterization of the antioxidant capacity of a blueberry extract while demonstrating that it had no perceived pro-oxidant effect. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Augmenting Amyloid PET Interpretations With Quantitative Information Improves Consistency of Early Amyloid Detection.

    Science.gov (United States)

    Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M

    2017-08-01

    Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five ¹⁸F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using a standardized quantitative analysis software (MIMneuro) to generate whole brain and region of interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations demonstrated lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
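
The 2-of-6 decision rule used for the Quantitative Read can be sketched as follows; only the 1.1 threshold and the 2-of-6 rule come from the abstract, while the region names and SUV ratios below are hypothetical:

```python
def quantitative_read(suvr_by_region, threshold=1.1, min_regions=2):
    """'elevated' if at least `min_regions` regional SUV ratios exceed `threshold`."""
    n_high = sum(1 for v in suvr_by_region.values() if v > threshold)
    return "elevated" if n_high >= min_regions else "nonelevated"

# Hypothetical scan: two of six regions exceed 1.1, so the read is elevated.
scan = {"precuneus": 1.18, "anterior_cingulate": 1.12, "frontal": 1.05,
        "parietal": 1.02, "temporal": 0.98, "occipital": 1.00}
result = quantitative_read(scan)
```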

  5. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that the detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit

  6. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the connotation of aeroradiometric weak information extraction, discusses the formation theories of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. Models for weak information extraction are realized on a GIS software platform, and application tests are completed in known uranium mineralized areas. Research results prove that the prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  7. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single

  8. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    Full Text Available The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from a real business environment.
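
The abstract does not give the model's formulas, but the kind of economic metric it alludes to can be illustrated with the conventional annual-loss-expectancy and return-on-security-investment calculation; all figures below are invented for illustration:

```python
def annual_loss_expectancy(asset_value, exposure_factor, occurrence_rate):
    """ALE = single-loss expectancy (asset value x exposure factor) x annual rate."""
    return asset_value * exposure_factor * occurrence_rate

def rosi(ale_before, ale_after, annual_cost):
    """Return on security investment: net risk reduction per unit of cost."""
    return (ale_before - ale_after - annual_cost) / annual_cost

before = annual_loss_expectancy(500_000, 0.4, 0.5)  # 100,000 per year unprotected
after = annual_loss_expectancy(500_000, 0.1, 0.5)   # 25,000 with the measure in place
roi = rosi(before, after, annual_cost=30_000)       # positive => investment pays off
```

Ranking candidate security measures by such a metric is one way to make different protection technologies directly comparable, as the abstract describes.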

  9. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received vast attention in the last two decades. Many researchers have tackled those areas individually. However, information extraction systems should be integrated with data integration

  10. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for the detection and 3D imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss the methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative simple technique based on counting high-scattered muons in the voxels seems to be efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping power measurement, and detection of muonic atom emission.
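
The voxel-counting technique mentioned above can be sketched in a few lines; the angle cut and the event data are illustrative values, not ones from the paper:

```python
from collections import defaultdict

def high_scatter_counts(events, angle_cut_mrad=20.0):
    """Count muons per voxel whose scattering angle exceeds the cut.
    `events` yields (voxel_index, scattering_angle_mrad) pairs, where the
    voxel is the point of closest approach of the incoming/outgoing tracks."""
    counts = defaultdict(int)
    for voxel, angle_mrad in events:
        if angle_mrad > angle_cut_mrad:
            counts[voxel] += 1
    return dict(counts)

# Voxels accumulating many high-angle scatters are candidates for dense high-Z material.
events = [((0, 0, 0), 2.1), ((1, 0, 0), 35.0), ((1, 0, 0), 28.4), ((2, 1, 0), 5.0)]
hot = high_scatter_counts(events)
```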

  11. Preservation of information in Fourier theory based deconvolved nuclear spectra

    International Nuclear Information System (INIS)

    Madan, V.K.; Gopalakrishnan, K.R.; Sharma, R.C.; Rattan, S.S.

    1995-01-01

    Nuclear spectroscopy is extremely useful to internal radiation dosimetry for the estimation of body burden due to gamma emitters. Analysis of nuclear spectra is concerned with the extraction of the qualitative and quantitative information embedded in the spectra. A spectral deconvolution method based on Fourier theory is probably the simplest method of deconvolving nuclear spectra. It is proved mathematically that the deconvolution method preserves the qualitative information. It is shown, by using simulated spectra and an observed gamma-ray spectrum, that the method also preserves the quantitative information. This may provide a novel approach to information extraction from a deconvolved spectrum. The paper discusses the methodology, the mathematical analysis, and the results obtained by deconvolving spectra. (author). 6 refs., 2 tabs
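
A minimal sketch of Fourier-division deconvolution on a simulated spectrum, illustrating preservation of both the qualitative information (peak position) and the quantitative information (total counts in the line); the Gaussian response and the regularisation constant are assumptions, not details from the paper:

```python
import numpy as np

def fft_deconvolve(observed, kernel, eps=1e-6):
    """Deconvolve by Fourier division; `kernel` must have unit area so that
    the zero-frequency component (total counts) is preserved."""
    return np.real(np.fft.ifft(np.fft.fft(observed) / (np.fft.fft(kernel) + eps)))

n = 256
x = np.arange(n)
true = np.zeros(n)
true[100] = 1000.0                               # one spectral line, 1000 counts
resp = np.exp(-0.5 * ((x - n // 2) / 4.0) ** 2)  # Gaussian detector response
kernel = np.fft.ifftshift(resp / resp.sum())     # unit area, centred at index 0
observed = np.real(np.fft.ifft(np.fft.fft(true) * np.fft.fft(kernel)))
recovered = fft_deconvolve(observed, kernel)
# Peak position and peak area survive the deconvolution.
```

With noisy spectra the plain Fourier division amplifies high-frequency noise, which is why practical implementations add filtering; the noiseless simulation above only demonstrates the information-preservation property.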

  12. Quantitation of promethazine and metabolites in urine samples using on-line solid-phase extraction and column-switching

    Science.gov (United States)

    Song, Q.; Putcha, L.; Harm, D. L. (Principal Investigator)

    2001-01-01

    A chromatographic method for the quantitation of promethazine (PMZ) and its three metabolites in urine employing on-line solid-phase extraction and column-switching has been developed. The column-switching system described here uses an extraction column for the purification of PMZ and its metabolites from a urine matrix. The extraneous matrix interference was removed by flushing the extraction column with a gradient elution. The analytes of interest were then eluted onto an analytical column for further chromatographic separation using a mobile phase of greater solvent strength. This method is specific and sensitive with a range of 3.75-1400 ng/ml for PMZ and 2.5-1400 ng/ml for the metabolites promethazine sulfoxide, monodesmethyl promethazine sulfoxide and monodesmethyl promethazine. The lower limits of quantitation (LLOQ) were 3.75 ng/ml with less than 6.2% C.V. for PMZ and 2.50 ng/ml with less than 11.5% C.V. for metabolites based on a signal-to-noise ratio of 10:1 or greater. The accuracy and precision were within +/- 11.8% in bias and not greater than 5.5% C.V. in intra- and inter-assay precision for PMZ and metabolites. Method robustness was investigated using a Plackett-Burman experimental design. The applicability of the analytical method for pharmacokinetic studies in humans is illustrated.

  13. High-resolution gas chromatography/mass spectrometry method for characterization and quantitative analysis of ginkgolic acids in Ginkgo biloba plants, extracts, and dietary supplements

    Science.gov (United States)

    A high resolution GC/MS with Selected Ion Monitor (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...

  14. Quantitative analysis of semivolatile organic compounds in selected fractions of air sample extracts by GC/MI-IR spectrometry

    International Nuclear Information System (INIS)

    Childers, J.W.; Wilson, N.K.; Barbour, R.K.

    1990-01-01

    The authors are currently investigating the capabilities of gas chromatography/matrix isolation infrared (GC/MI-IR) spectrometry for the determination of semivolatile organic compounds (SVOCs) in environmental air sample extracts. Their efforts are focused on the determination of SVOCs such as alkylbenzene positional isomers, which are difficult to separate chromatographically and to distinguish by conventional electron-impact ionization GC/mass spectrometry. They have performed a series of systematic experiments to identify sources of error in quantitative GC/MI-IR analyses. These experiments were designed to distinguish between errors due to instrument design or performance and errors that arise from some characteristic inherent to the GC/MI-IR technique, such as matrix effects. They have investigated repeatability as a function of several aspects of GC/MI-IR spectrometry, including sample injection, spectral acquisition, cryogenic disk movement, and matrix deposition. The precision, linearity, dynamic range, and detection limits of a commercial GC/MI-IR system for target SVOCs were determined and compared to those obtained with the system's flame ionization detector. The use of deuterated internal standards in the quantitative GC/MI-IR analysis of selected fractions of ambient air sample extracts will be demonstrated. They will also discuss the current limitations of the technique in quantitative analyses and suggest improvements for future consideration

  15. Extraction of bioliquid and quantitative determination of saturates ...

    African Journals Online (AJOL)

    Bioliquid generated from the leaves of banana (Musa sapientum) through anaerobic fungal degradation was obtained by Soxhlet extraction using absolute methanol as solvent at 60 °C for 72 hours. The bioliquid extracted was recovered from the extracting solvent by evaporation using a rotary evaporator. The extract obtained ...

  16. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are more and more massively employed, the demand driven also by the new norms sanctioning the legal value of digital documents, provided they are stored on supports that are physically unalterable. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine supported to a large degree with evident advantages both in the organization of the work, and in extracting information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  17. Stability Test and Quantitative and Qualitative Analyses of the Amino Acids in Pharmacopuncture Extracted from Scolopendra subspinipes mutilans

    Science.gov (United States)

    Cho, GyeYoon; Han, KyuChul; Yoon, JinYoung

    2015-01-01

    Objectives: Scolopendra subspinipes mutilans (S. subspinipes mutilans) is known as a traditional medicine and includes various amino acids, peptides and proteins. The amino acids in the pharmacopuncture extracted from S. subspinipes mutilans by using derivatization methods were analyzed quantitatively and qualitatively by using high performance liquid chromatography (HPLC) over a 12 month period to confirm its stability. Methods: Amino acids of pharmacopuncture extracted from S. subspinipes mutilans were derived by using O-phthaldialdehyde (OPA) & 9-fluorenyl methoxy carbonyl chloride (FMOC) reagent and were analyzed using HPLC. The amino acids were detected by using a diode array detector (DAD) and a fluorescence detector (FLD) to compare a mixed amino acid standard (STD) to the pharmacopuncture from centipedes. The stability tests on the pharmacopuncture from centipedes were done using HPLC for three conditions: a room temperature test chamber, an acceleration test chamber, and a cold test chamber. Results: The pharmacopuncture from centipedes was prepared by using the method of the Korean Pharmacopuncture Institute (KPI) and through quantitative analyses was shown to contain 9 amino acids of the 16 amino acids in the mixed amino acid STD. The amounts of the amino acids in the pharmacopuncture from centipedes were 34.37 ppm of aspartate, 123.72 ppm of arginine, 170.63 ppm of alanine, 59.55 ppm of leucine and 57 ppm of lysine. The relative standard deviation (RSD %) results for the pharmacopuncture from centipedes had a maximum value of 14.95% and minimum value of 1.795% on the room temperature test chamber, the acceleration test chamber and the cold test chamber stability tests. Conclusion: Stability tests on and quantitative and qualitative analyses of the amino acids in the pharmacopuncture extracted from centipedes by using derivatization methods were performed by using HPLC. Through research, we hope to determine the relationship between time and the

  18. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    Science.gov (United States)

    Hosseini-Golgoo, S. M.; Bozorgi, H.; Saberkari, A.

    2015-06-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively.
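
Fisher's discriminant ratio used in the comparison above can be sketched, for a one-dimensional feature, as the between-class scatter divided by the within-class scatter; the toy data below are invented for illustration:

```python
import numpy as np

def fisher_ratio(x, y):
    """Between-class scatter / within-class scatter of 1-D feature `x`
    with class labels `y`; larger values mean better class separability."""
    x, y = np.asarray(x, float), np.asarray(y)
    mu = x.mean()
    classes = np.unique(y)
    between = sum(len(x[y == c]) * (x[y == c].mean() - mu) ** 2 for c in classes)
    within = sum(((x[y == c] - x[y == c].mean()) ** 2).sum() for c in classes)
    return between / within

# A well-separated feature scores far higher than an overlapping one.
good = fisher_ratio([0.0, 1.0, 0.0, 1.0, 10.0, 11.0, 10.0, 11.0], [0, 0, 0, 0, 1, 1, 1, 1])
poor = fisher_ratio([0.0, 5.0, 2.0, 7.0, 1.0, 6.0, 3.0, 8.0], [0, 0, 0, 0, 1, 1, 1, 1])
```

In the study this criterion is evaluated on the 3D feature space produced by linear discriminant analysis, giving a single number to compare the feature sets extracted by the different network structures.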

  19. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    International Nuclear Information System (INIS)

    Hosseini-Golgoo, S M; Bozorgi, H; Saberkari, A

    2015-01-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively. (paper)

  20. Rate phenomena in uranium extraction by amines

    International Nuclear Information System (INIS)

    Coleman, C.F.; McDowell, W.J.

    1979-01-01

    Kinetics studies and other rate measurements are reviewed in the amine extraction of uranium and of some other related and associated metal ions. Equilibration is relatively fast in the uranium sulfate systems most important to uranium hydrometallurgy. Significantly slow equilibration has been encountered in some other systems. Most of the recorded rate information, both qualitative and quantitative, has come from exploratory and process-development work, while some kinetics studies have been directed specifically toward elucidation of extraction mechanisms. 71 references

  1. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides. A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 proteins and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE PXD003520; Progenesis and Maxquant output are presented in the supporting information. The generated list of proteins under different regimes of fractionation allows assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allows defining a proper number of replicates for future quantitative analysis. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  2. Quantitative digital radiography with two dimensional flat panels

    International Nuclear Information System (INIS)

    Dinten, J.M.; Robert-Coutant, C.; Darboux, M.

    2003-01-01

    Purpose: The attenuation law relates radiographic images to the thickness and chemical composition of the irradiated object. Film radiography exploits this property qualitatively for diagnosis. Digital radiographic flat panels present the large dynamic range, reproducibility and linearity properties which open the gate to quantification. We will present, through two applications (mammography and bone densitometry), an approach to extract quantitative information from digital 2D radiographs. Material and method: The main difficulty for quantification is X-ray scatter, which is superimposed on the acquisition data. Because of multiple scattering and the 3D geometry dependence, it cannot be described directly by an exact analytical model. We have therefore developed an approach for its estimation and subtraction from medical radiographs, based on approximations and derivations of analytical models of scatter formation in human tissues. Results: In digital mammography, the objective is to build a map of the glandular tissue thickness. Its separation from fat tissue is based on two equations: the height of compression and the attenuation. This last equation needs X-ray scatter correction. In bone densitometry, physicians look for quantitative bone mineral density. Today, clinical DEXA systems use collimated single or linear detectors to eliminate scatter. This scanning technology induces poor image quality. By applying our scatter correction approach, we have developed a bone densitometer using a digital flat panel (Lexxos, DMS). It provides accurate and reproducible measurements while presenting radiological image quality. Conclusion: These applications show how information processing, and especially X-ray scatter processing, enables the extraction of quantitative information from digital radiographs. This approach, associated with Computer Aided Diagnosis algorithms or reconstruction algorithms, gives access to useful information for diagnosis. (author)
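
The two-equation system mentioned for mammography (compression height plus attenuation) can be solved in closed form once scatter has been subtracted; the attenuation coefficients below are placeholders for illustration, not calibrated values:

```python
import math

def glandular_thickness(I, I0, H, mu_g, mu_f):
    """Solve  h_g + h_f = H  and  I = I0*exp(-(mu_g*h_g + mu_f*h_f))  for the
    glandular thickness h_g, assuming scatter has already been removed."""
    return (math.log(I0 / I) - mu_f * H) / (mu_g - mu_f)

# Round trip: 2 cm glandular + 3 cm fat under 5 cm of compression.
mu_g, mu_f, H = 0.8, 0.5, 5.0                         # placeholder coefficients, 1/cm
I = 1000.0 * math.exp(-(mu_g * 2.0 + mu_f * 3.0))     # simulated primary signal
h_g = glandular_thickness(I, 1000.0, H, mu_g, mu_f)   # recovers ~2.0 cm
```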

  3. Quantitative Indicators for Behaviour Drift Detection from Home Automation Data.

    Science.gov (United States)

    Veronese, Fabio; Masciadri, Andrea; Comai, Sara; Matteucci, Matteo; Salice, Fabio

    2017-01-01

    The diffusion of Smart Homes provides an opportunity to implement elderly monitoring, extending seniors' independence and avoiding unnecessary assistance costs. Information concerning the inhabitant's behaviour is contained in home automation data, and can be extracted by means of quantitative indicators. The application of such an approach proves that it can evidence behaviour changes.
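
One plausible shape of such an indicator check is sketched below; the specific indicator, window length, and threshold are assumptions for illustration, not the paper's definitions:

```python
import statistics

def behaviour_drift(indicator, window=7, threshold=2.0):
    """Flag drift when the mean of the last `window` daily values deviates
    from the historical mean by more than `threshold` standard deviations."""
    history, recent = indicator[:-window], indicator[-window:]
    mu, sigma = statistics.mean(history), statistics.stdev(history)
    return abs(statistics.mean(recent) - mu) > threshold * sigma

# Hypothetical daily hours of sleep derived from home automation logs.
stable = [8.0, 8.2, 7.9, 8.1, 8.0, 7.8, 8.2, 8.1, 7.9, 8.0]
drifted = stable + [5.1, 5.0, 4.9, 5.2, 5.0, 4.8, 5.1]  # a week of short nights
```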

  4. Comparison of mentha extracts obtained by different extraction methods

    Directory of Open Access Journals (Sweden)

    Milić Slavica

    2006-01-01

    Full Text Available The different methods of mentha extraction, such as steam distillation, extraction by methylene chloride (Soxhlet extraction) and supercritical fluid extraction (SFE) by carbon dioxide (CO2), were investigated. SFE by CO2 was performed at a pressure of 100 bar and a temperature of 40 °C. The extraction yield, as well as the qualitative and quantitative composition of the obtained extracts, determined by the GC-MS method, were compared.

  5. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available Northern Tibet belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people, and the geological working conditions are very poor. However, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information has typical significance and good application prospects. Based on object-oriented classification in northern Tibet, using Worldview2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various geological information are excavated. By setting thresholds, based on hierarchical classification, eight kinds of geological information were classified and extracted. Compared with the existing geological maps, the accuracy analysis shows that the overall accuracy reached 87.8561 %, indicating that the object-oriented method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  6. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

The extraction of true terrain points from unstructured laser point cloud data is an important process for producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some terrestrial laser scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape at Universiti Teknologi Malaysia (UTM), Skudai campus, using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points falling within the corresponding preset spectral threshold are identified as belonging to that specific feature class. This terrain extraction process was implemented in Matlab. Results demonstrate that a passive image of higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor leads to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
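The preset-spectral-threshold step described above amounts to a per-channel range test on each coloured point. A minimal sketch, assuming a simple `(x, y, z, r, g, b)` point format; the thresholds and sample values are hypothetical, not taken from the study:

```python
# Sketch of spectral-threshold point filtering: keep points whose RGB
# values all fall inside the preset range of the target class.
# Thresholds and sample data are invented for illustration.

def in_range(value, lo, hi):
    """True if a channel value lies within the preset spectral threshold."""
    return lo <= value <= hi

def extract_terrain(points, thresholds):
    """Keep coloured points whose R, G, B all fall inside the class thresholds.

    points     : list of (x, y, z, r, g, b) tuples
    thresholds : dict channel -> (lo, hi)
    """
    terrain = []
    for x, y, z, r, g, b in points:
        if (in_range(r, *thresholds["r"]) and
                in_range(g, *thresholds["g"]) and
                in_range(b, *thresholds["b"])):
            terrain.append((x, y, z))
    return terrain

# Hypothetical brownish "bare soil" thresholds from sampled training classes.
soil = {"r": (90, 160), "g": (60, 120), "b": (40, 100)}
cloud = [
    (0.0, 0.0, 0.1, 120, 90, 70),   # terrain-like colour
    (0.5, 0.2, 1.4, 40, 160, 60),   # vegetation-like colour
    (1.0, 0.7, 0.2, 150, 110, 90),  # terrain-like colour
]
print(extract_terrain(cloud, soil))  # the two terrain-coloured points survive
```

In practice the thresholds would be derived from the spectral analysis of the sample classes rather than set by hand.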

  7. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  8. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. 
Information extraction results were evaluated
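The ontology-driven semantic typing and template filling described above can be illustrated with a toy lookup: recognized surface forms are mapped through a synonyms-to-concept table, and each concept's semantic type selects the template slot it fills. The mini-ontology and slot names below are invented stand-ins, not the TRAK ontology or KneeTex's actual rules:

```python
# Toy sketch of ontology-driven template filling: surface form -> (concept,
# semantic type), with the type driving slot assignment. Invented data.

ONTOLOGY = {
    "acl": ("anterior cruciate ligament", "anatomy"),
    "anterior cruciate ligament": ("anterior cruciate ligament", "anatomy"),
    "tear": ("tear", "finding"),
    "grade ii": ("grade 2", "qualifier"),
}

def fill_template(entities):
    """Map recognized entity strings into a {slot: concept} template."""
    template = {"anatomy": None, "finding": None, "qualifier": None}
    for surface in entities:
        concept, sem_type = ONTOLOGY.get(surface.lower(), (None, None))
        if sem_type:
            template[sem_type] = concept
    return template

# Entities as a named-entity recognizer might emit them from a report.
print(fill_template(["Grade II", "tear", "ACL"]))
```

The real system resolves coordination, enumeration and co-reference before this step; the sketch only shows how ontology lookup normalizes synonyms and routes entities to slots.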

  9. A Validated Reverse Phase HPLC Analytical Method for Quantitation of Glycoalkaloids in Solanum lycocarpum and Its Extracts

    Directory of Open Access Journals (Sweden)

    Renata Fabiane Jorge Tiossi

    2012-01-01

Full Text Available Solanum lycocarpum (Solanaceae) is native to the Brazilian Cerrado. Fruits of this species contain the glycoalkaloids solasonine (SN) and solamargine (SM), which display antiparasitic and anticancer properties. A method has been developed for the extraction and HPLC-UV analysis of SN and SM in different parts of S. lycocarpum, mainly comprising ripe and unripe fruits, leaf, and stem. This analytical method was validated and gave a good detection response, with linearity over a dynamic range of 0.77–1000.00 μg mL−1 and recovery in the range of 80.92–91.71%, allowing reliable quantitation of the target compounds. Unripe fruits displayed higher concentrations of glycoalkaloids (1.04% ± 0.01 of SN and 0.69% ± 0.00 of SM) than the ripe fruits (0.83% ± 0.02 of SN and 0.60% ± 0.01 of SM). Quantitation of glycoalkaloids in the alkaloidic extract gave 45.09% ± 1.14 of SN and 44.37% ± 0.60 of SM, respectively.
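The workflow behind such a validated external-standard method is a linear calibration (peak area vs. concentration) inverted for unknowns, plus a recovery check on spiked samples. A minimal sketch; the calibration points and spike values below are illustrative, not the paper's data:

```python
# Least-squares calibration line for an external-standard HPLC assay,
# then inversion for an unknown and a spike-recovery calculation.
# All numbers are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration standards (µg/mL) and measured peak areas.
conc = [1.0, 10.0, 100.0, 500.0, 1000.0]
area = [12.0, 118.0, 1205.0, 6010.0, 11990.0]
slope, intercept = fit_line(conc, area)

unknown_area = 2400.0
unknown_conc = (unknown_area - intercept) / slope   # back-calculated µg/mL

# Recovery: amount found vs. amount added to a spiked sample
# (the paper reports recoveries of 80.92-91.71%).
added, found = 50.0, 44.1
recovery_pct = 100.0 * found / added
print(round(unknown_conc, 1), round(recovery_pct, 2))
```

Validation would additionally check the residuals over the whole 0.77–1000 µg/mL range rather than assume linearity.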

  10. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

Full Text Available Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.

  11. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.
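The hybrid time-aware plus topology-aware idea can be caricatured as scoring each user-object link by recency and by object popularity and keeping only the top fraction as a backbone. The weights, scoring rule and toy network below are invented; the paper's actual removal algorithms are not reproduced here:

```python
# Hedged sketch of backbone extraction from a user-object bipartite
# network: rank links by a mix of recency (time-aware) and object degree
# (topology-aware), keep the top fraction. All parameters are invented.

from collections import Counter

# (user, object, timestamp) links of a toy bipartite network.
links = [
    ("u1", "o1", 1), ("u1", "o2", 9), ("u2", "o1", 2),
    ("u2", "o3", 8), ("u3", "o1", 3), ("u3", "o2", 7),
]

def backbone(links, keep=0.5, w_time=0.5):
    deg = Counter(obj for _, obj, _ in links)       # object degree
    t_max = max(t for _, _, t in links)
    max_deg = max(deg.values())
    def score(link):
        _, obj, t = link
        recency = t / t_max                         # time-aware part
        popularity = deg[obj] / max_deg             # topology-aware part
        return w_time * recency + (1 - w_time) * popularity
    ranked = sorted(links, key=score, reverse=True)
    return ranked[: max(1, int(keep * len(links)))]

for link in backbone(links):
    print(link)
```

A recommender would then be run on the reduced link set, which is where the reported speed and accuracy gains come from.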

  13. Simultaneous HPLC quantitative analysis of mangostin derivatives in Tetragonula pagdeni propolis extracts

    Directory of Open Access Journals (Sweden)

    Sumet Kongkiatpaiboon

    2016-04-01

Full Text Available Propolis has been used as an indigenous medicine for curing numerous maladies. One of ethnopharmacological use is the stingless bee propolis from Tetragonula pagdeni. A simultaneous high-performance liquid chromatography (HPLC) method was developed and validated to determine the contents of the bioactive compounds 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin. HPLC analysis was effectively performed using a Hypersil BDS C18 column, with gradient elution of methanol–0.2% formic acid at a flow rate of 1 ml/min, at 25 °C, with detection at 245 nm. Parameters for the validation included accuracy, precision, linearity, and limits of quantitation and detection. The developed HPLC technique was precise, with lower than 2% relative standard deviation. The recovery values of 3-isomangostin, gamma-mangostin, beta-mangostin, and alpha-mangostin in the extracts were 99.98%, 99.97%, 98.98% and 99.19%, respectively. The average contents of these compounds in the propolis extracts collected from different seasons were 0.127%, 1.008%, 0.323% and 2.703% (w/w), respectively. The developed HPLC technique is suitable and practical for the simultaneous analysis of these mangostin derivatives in T. pagdeni propolis and would be valuable guidance for the standardization of its pharmaceutical products.

  14. Perceived relevance and information needs regarding food topics and preferred information sources among Dutch adults: results of a quantitative consumer study

    NARCIS (Netherlands)

    Dillen, van S.M.E.; Hiddink, G.J.; Koelen, M.A.; Graaf, de C.; Woerkum, van C.M.J.

    2004-01-01

    Objective: For more effective nutrition communication, it is crucial to identify sources from which consumers seek information. Our purpose was to assess perceived relevance and information needs regarding food topics, and preferred information sources by means of quantitative consumer research.

  15. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing
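The precision-assessment step behind figures like 90.38% vs. 76.84% reduces to computing overall accuracy from a confusion matrix (correct classifications over total samples). The counts below are hypothetical, not the study's validation data:

```python
# Overall accuracy from a confusion matrix: trace over total.
# The matrix counts are invented for illustration.

def overall_accuracy(confusion):
    """confusion[i][j] = samples of true class i labelled as class j."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return 100.0 * correct / total

# Hypothetical counts: rows = true {collapsed, intact}, cols = predicted.
cm = [[85, 15],
      [10, 90]]
print(overall_accuracy(cm))  # 87.5
```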

  16. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  17. The Limitations of Quantitative Social Science for Informing Public Policy

    Science.gov (United States)

    Jerrim, John; de Vries, Robert

    2017-01-01

    Quantitative social science (QSS) has the potential to make an important contribution to public policy. However it also has a number of limitations. The aim of this paper is to explain these limitations to a non-specialist audience and to identify a number of ways in which QSS research could be improved to better inform public policy.

  18. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

Full Text Available Objective: The objective is to develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, the hydroalcoholic extract was subjected to preliminary phytochemical analysis, antimicrobial activity tests against certain pathogenic microorganisms, a solubility test, loss on drying and pH determination. The extract was also subjected to quantitative analysis, including total phenol, flavonoid and heavy metal analysis. Quantitative analysis was performed through HPTLC methods using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids and glycosides in the extract. The total flavonoid and phenol contents of the Strychnos nux-vomica L. extract were found to be 0.40% and 0.43%. Results showed that the levels of heavy metals (lead, arsenic, mercury and cadmium) complied with the standard limits. Total bacterial count, yeast and mould contents were found to be under the limits, whereas E. coli and Salmonella were absent from the extract. The contents of strychnine and brucine were found to be 4.75% and 3.91%. Conclusions: These studies provide valuable information for correct identification and selection of the drug among various adulterants. In the future this study will be helpful for the quantitative analysis as well as standardization of Strychnos nux-vomica L.

  19. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  20. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptographic key. Estimates of local entropy and mutual information identified the segments of the iris most suitable for this purpose. To optimize parameters, corresponding wavelet transforms were applied so as to obtain the highest possible entropy and the lowest possible mutual information in the transform domain, which sets the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
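The two descriptors named above, entropy and mutual information, can be estimated directly from empirical frequencies. A minimal sketch on toy binary iris-code segments (the data are illustrative; real estimates would run over local image regions):

```python
# Histogram-based Shannon entropy and mutual information of two paired
# discrete sequences, using H(A) + H(B) - H(A,B). Toy data.

from math import log2
from collections import Counter

def entropy(seq):
    """Shannon entropy in bits of a discrete sequence."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for paired sequences."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

a = [0, 1, 0, 1, 0, 1, 0, 1]      # maximally mixed segment: H = 1 bit
b = [0, 1, 0, 1, 1, 0, 1, 0]      # agrees with a on half the positions
print(entropy(a), mutual_information(a, b))  # 1.0 and (near) 0.0
```

High-entropy, low-mutual-information segments are exactly the ones the abstract identifies as good sources of key material.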

  1. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in a foreign and/or their native language) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or in document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, which aims to detect the role of terminology in online resources, was conducted on English and Croatian manuals and Croatian online texts, and divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; and iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates were evaluated by comparison with three types of reference lists: a list created by a medical professional, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts, and on the comparison of statistical and hybrid extraction methods on English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
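Evaluation against a reference list of the kind described reduces to precision and recall over term sets. A small sketch; the term lists are made up for illustration:

```python
# Precision and recall of extracted term candidates against a reference
# terminology list. The lists below are invented examples.

def precision_recall(extracted, reference):
    ext, ref = set(extracted), set(reference)
    tp = len(ext & ref)                 # candidates confirmed by the reference
    return tp / len(ext), tp / len(ref)

extracted = ["insulin pump", "blood glucose", "dietetic food", "web page"]
reference = ["insulin pump", "blood glucose", "self-care", "dietetic food"]
p, r = precision_recall(extracted, reference)
print(p, r)  # 0.75 0.75
```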

  2. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other fields. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information about urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software has the advantages of a low demand for professional expertise, broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level can be achieved if the digital elevation model (DEM) and sensor orientation model have sufficiently high precision and the off-nadir view angle is suitable.

  3. Optimisation of information influences on problems of consequences of Chernobyl accident and quantitative criteria for estimation of information actions

    International Nuclear Information System (INIS)

    Sobaleu, A.

    2004-01-01

The consequences of the Chernobyl NPP accident are still very important for Belarus. About 2 million Belarusians live in districts polluted by Chernobyl radionuclides. Modern approaches to solving post-Chernobyl problems in Belarus assume a more active use of information and educational actions to foster a new radiological culture. This would allow the internal radiation dose to be reduced without spending much money or other resources. Experience of information work with the affected population from 1986 to 2004 has shown that information and educational influences do not always reach the final aim: application of the acquired radiation-safety knowledge in practice and a change of lifestyle. Given limited funds and facilities, information work should be optimized. Optimization can be achieved on the basis of quantitative estimates of the effectiveness of information actions. Two parameters can be used for these quantitative estimates: 1) the increase in knowledge of the population and experts on radiation safety, calculated by a new method based on applied information theory (the Mathematical Theory of Communication of Claude E. Shannon), and 2) the reduction of the internal radiation dose, calculated on the basis of measurements with a human irradiation counter (HIC) before and after an information or educational influence. (author)
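The first, Shannon-style criterion can be illustrated as the entropy drop of quiz-answer distributions before and after an information action: the more the population converges on the correct answer, the larger the information gain. The distributions below are invented, not survey data:

```python
# Knowledge increase as entropy reduction of answer distributions on a
# radiation-safety question, in the spirit of Shannon's theory.
# The before/after distributions are hypothetical.

from math import log2

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Share of respondents choosing each of four answers to one question.
before = [0.25, 0.25, 0.25, 0.25]   # no idea: maximal uncertainty, 2 bits
after = [0.85, 0.05, 0.05, 0.05]    # most now pick the correct answer

gain_bits = entropy(before) - entropy(after)
print(round(gain_bits, 3))  # about 1.15 bits gained per question
```

Summed over a questionnaire, such gains give a single number that can be compared across information campaigns, alongside the measured dose reduction.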

  4. Portable microwave assisted extraction: An original concept for green analytical chemistry.

    Science.gov (United States)

    Perino, Sandrine; Petitcolas, Emmanuel; de la Guardia, Miguel; Chemat, Farid

    2013-11-08

This paper describes a portable microwave-assisted extraction apparatus (PMAE) for the extraction of bioactive compounds, especially essential oils and aromas, directly in a crop or in a forest. The developed procedure, based on the concept of green analytical chemistry, is appropriate for obtaining direct in-field information about the level of essential oils in natural samples and for illustrating green chemistry teaching and research. The efficiency of this apparatus was validated for the extraction of essential oil of rosemary directly in a crop, giving quantitative information on the essential oil content similar to that obtained by conventional methods in the laboratory. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review

    Science.gov (United States)

    Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael

    2014-01-01

Background: A systematic literature review was conducted in the mixed-methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…

  6. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge; the road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) with the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
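The first step, adaptive interpolation so that consecutive tracking points are never too far apart, might look like the sketch below. Planar coordinates and the 50 m spacing are assumptions for illustration, not the paper's parameters:

```python
# Densify a GPS trace: split every segment longer than max_gap into
# evenly spaced pieces so downstream triangulation has enough points.
# Coordinates are assumed planar (e.g., projected); values are invented.

from math import hypot, ceil

def densify(trace, max_gap=50.0):
    """Insert evenly spaced points into segments longer than max_gap."""
    dense = [trace[0]]
    for (x0, y0), (x1, y1) in zip(trace, trace[1:]):
        d = hypot(x1 - x0, y1 - y0)
        n = max(1, ceil(d / max_gap))          # pieces this segment needs
        for i in range(1, n + 1):
            t = i / n
            dense.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return dense

trace = [(0.0, 0.0), (120.0, 0.0), (120.0, 40.0)]
dense = densify(trace)
print(len(dense))  # 5: the 120 m leg is split in three, the 40 m leg kept
```

The densified lines would then feed the Delaunay/Voronoi construction for the boundary descriptors.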

  7. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

Full Text Available Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively to ensure there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors using the area of the Voronoi cell and the length of the triangle edge; the road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) with the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multi-type road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.

  8. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

Full Text Available Background: Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods: We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results: The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions: This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.

  9. Quantitative nanometer-scale mapping of dielectric tunability

    Energy Technology Data Exchange (ETDEWEB)

    Tselev, Alexander [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Klein, Andreas [Technische Univ. Darmstadt (Germany); Gassmann, Juergen [Technische Univ. Darmstadt (Germany); Jesse, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Li, Qian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kalinin, Sergei V. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wisinger, Nina Balke [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-08-21

Two scanning probe microscopy techniques, near-field scanning microwave microscopy (SMM) and piezoresponse force microscopy (PFM), are used to characterize and image tunability in a thin (Ba,Sr)TiO3 film with nanometer-scale spatial resolution. While SMM allows direct probing of tunability by measuring the change in the dielectric constant, in PFM tunability can be extracted via the electrostrictive response. Near-field microwave imaging and PFM provide similar information about dielectric tunability, with PFM capable of delivering quantitative information on tunability at a higher spatial resolution, close to 15 nm. This is the first time that information about dielectric tunability has been available on such length scales.

  10. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA; Cowley,; E, Wendy [Richland, WA; Crow, Vernon L [Richland, WA; Cramer, Nicholas O [Richland, WA

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
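
    The scoring scheme described above, word scores built from co-occurrence degree and frequency, then summed into keyword scores, is the RAKE procedure. A minimal Python sketch follows; the stop-word list and sample sentence are chosen purely for illustration:

```python
import re
from collections import defaultdict

# Minimal stop-word list for illustration; RAKE normally uses a fuller list.
STOP_WORDS = {"a", "an", "and", "for", "in", "is", "of", "the", "to"}

def rake_keywords(text):
    """Score candidate keywords by the degree/frequency of their member words."""
    words = re.split(r"[^a-zA-Z]+", text.lower())
    phrases, current = [], []
    for w in words:                      # split into candidates at stop words
        if not w or w in STOP_WORDS:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in phrases:               # accumulate co-occurrence statistics
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)     # degree counts co-occurring words
    word_score = {w: degree[w] / freq[w] for w in freq}
    # Keyword score = sum of member word scores, highest first.
    return sorted(((" ".join(p), sum(word_score[w] for w in p)) for p in phrases),
                  key=lambda kv: -kv[1])

keywords = rake_keywords(
    "rapid automatic keyword extraction for information retrieval and analysis of documents")
```

    Longer multi-word candidates accumulate higher degrees, so the top-scoring keyword here is the four-word phrase rather than any single word.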

  11. QUANTITATIVE ION-PAIR EXTRACTION OF 4(5)-METHYLIMIDAZOLE FROM CARAMEL COLOR AND ITS DETERMINATION BY REVERSED-PHASE ION-PAIR LIQUID-CHROMATOGRAPHY

    DEFF Research Database (Denmark)

    Thomsen, Mogens; Willumsen, Dorthe

    1981-01-01

    A procedure for quantitative ion-pair extraction of 4(5)-methylimidazole from caramel colour using bis(2-ethylhexyl)phosphoric acid as ion-pairing agent has been developed. Furthermore, a reversed-phase ion-pair liquid chromatographic separation method has been established to analyse the content...

  12. A convenient method for the quantitative determination of elemental sulfur in coal by HPLC analysis of perchloroethylene extracts

    Science.gov (United States)

    Buchanan, D.H.; Coombs, K.J.; Murphy, P.M.; Chaven, C.

    1993-01-01

    A convenient method for the quantitative determination of elemental sulfur in coal is described. Elemental sulfur is extracted from the coal with hot perchloroethylene (PCE) (tetrachloroethene, C2Cl4) and quantitatively determined by HPLC analysis on a C18 reverse-phase column using UV detection. Calibration solutions were prepared from sublimed sulfur. Results of quantitative HPLC analyses agreed with those of a chemical/spectroscopic analysis. The HPLC method was found to be linear over the concentration range of 6 × 10⁻⁴ to 2 × 10⁻² g/L. The lower detection limit was 4 × 10⁻⁴ g/L, which for a coal sample of 20 g is equivalent to 0.0006% by weight of coal. Since elemental sulfur is known to react slowly with hydrocarbons at the temperature of boiling PCE, standard solutions of sulfur in PCE were heated with coals from the Argonne Premium Coal Sample program. Pseudo-first-order uptake of sulfur by the coals was observed over several weeks of heating. For the Illinois No. 6 premium coal, the rate constant for sulfur uptake was 9.7 × 10⁻⁷ s⁻¹, too small for retrograde reactions between solubilized sulfur and coal to cause a significant loss in elemental sulfur isolated during the analytical extraction. No elemental sulfur was produced when the following pure compounds were heated to reflux in PCE for up to 1 week: benzyl sulfide, octyl sulfide, thiane, thiophene, benzothiophene, dibenzothiophene, sulfuric acid, or ferrous sulfate. A slurry of mineral pyrite in PCE contained elemental sulfur which increased in concentration with heating time. © 1993 American Chemical Society.
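
    The "too small to matter" claim can be checked from the reported rate constant: under pseudo-first-order kinetics the fraction of dissolved sulfur lost is 1 - exp(-kt). A short sketch (the 1 h extraction time is an assumption for illustration, not a figure from the paper):

```python
import math

# Pseudo-first-order loss of dissolved elemental sulfur to coal: C(t) = C0 * exp(-k t).
def fraction_lost(k_per_s, t_seconds):
    """Fraction of dissolved sulfur consumed after t seconds at rate constant k."""
    return 1.0 - math.exp(-k_per_s * t_seconds)

k = 9.7e-7             # s^-1, Illinois No. 6 coal (value from the abstract)
t_extraction = 3600.0  # s; a 1 h hot-PCE extraction is an assumed illustration
loss = fraction_lost(k, t_extraction)
```

    For these numbers the loss works out to roughly a third of a percent, consistent with the authors' conclusion that retrograde reactions do not bias the analysis.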

  13. Providing Quantitative Information and a Nudge to Undergo Stool Testing in a Colorectal Cancer Screening Decision Aid: A Randomized Clinical Trial.

    Science.gov (United States)

    Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M

    2017-08-01

    Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.

  14. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times for a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.

  15. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    Science.gov (United States)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information on tumour enhancement after the administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm⁻² has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thickness found in clinical investigations. However, scattered radiation introduces significant deviations from a slope of unity when the measured iodine projected thickness is compared with the actual one. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm⁻² within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information on iodinated contrast media can be used to indirectly measure tumour microvessel density and to determine contrast uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
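
    At its core, a dual-energy iodine measurement reduces to solving a 2×2 linear system per pixel: the log-attenuation at each beam energy is a weighted sum of the iodine and tissue projected thicknesses. A sketch under assumed attenuation coefficients (the numbers below are illustrative, not the paper's calibration):

```python
import numpy as np

# Mass attenuation coefficients (cm^2/g); illustrative values, not the paper's calibration.
MU = np.array([[26.0, 0.25],    # low-energy beam:  [iodine, soft tissue]
               [ 8.0, 0.21]])   # high-energy beam: [iodine, soft tissue]

def iodine_projected_thickness(log_attn_low, log_attn_high):
    """Solve ln(I0/I)_E = mu_I(E)*t_I + mu_t(E)*t_t for the two thicknesses.

    Returns the iodine projected thickness in mg/cm^2.
    """
    t_iodine, _t_tissue = np.linalg.solve(MU, [log_attn_low, log_attn_high])
    return t_iodine * 1000.0

# Forward-simulate a phantom with 2 mg/cm^2 iodine in 4 g/cm^2 of tissue, then invert.
logs = MU @ np.array([2e-3, 4.0])
estimate = iodine_projected_thickness(logs[0], logs[1])
```

    In this idealized round trip the iodine thickness is recovered exactly; in practice scatter and beam hardening perturb the log signals, which is why the paper applies scatter correction before the decomposition.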

  16. Preliminary Phytochemical Screening, Quantitative Analysis of Alkaloids, and Antioxidant Activity of Crude Plant Extracts from Ephedra intermedia Indigenous to Balochistan.

    Science.gov (United States)

    Gul, Rahman; Jan, Syed Umer; Faridullah, Syed; Sherani, Samiullah; Jahan, Nusrat

    2017-01-01

    The aim of this study was to evaluate the antioxidant activity, screen the phytochemical compounds, and assess the alkaloids present in E. intermedia in order to support its use in Pakistani folk medicine for the treatment of asthma and bronchitis. Antioxidant activity was analyzed by using the 2,2-diphenyl-1-picryl-hydrazyl-hydrate assay. Standard methods were used for the identification of cardiac glycosides, phenolic compounds, flavonoids, anthraquinones, and alkaloids. High performance liquid chromatography (HPLC) was used for the quantitative determination of ephedrine alkaloids in E. intermedia. Separation was performed on a Shimadzu 10AVP (Shampack) column of 3.0 mm internal diameter (id) and 50 mm length, at a flow rate of 1 ml/min with detection at the wavelength 210 nm. The methanolic extract showed antioxidant activity and powerful oxygen free radical scavenging, and the IC50 for the E. intermedia extract was close to that of the reference standard, ascorbic acid. The HPLC method was used for the quantitative determination of ephedrine (E) and pseudoephedrine (PE) in 45 samples of one species collected from the central habitat in three districts (Ziarat, Shairani, and Kalat) of Balochistan. Results showed that the average alkaloid content in E. intermedia was as follows: PE (0.209%, 0.238%, and 0.22%) and E (0.0538%, 0.0666%, and 0.0514%).

  17. Understanding the information needs of people with haematological cancers. A meta-ethnography of quantitative and qualitative research.

    Science.gov (United States)

    Atherton, K; Young, B; Salmon, P

    2017-11-01

    Clinical practice in haematological oncology often involves difficult diagnostic and treatment decisions. In this context, understanding patients' information needs and the functions that information serves for them is particularly important. We systematically reviewed qualitative and quantitative evidence on haematological oncology patients' information needs to inform how these needs can best be addressed in clinical practice. PsycINFO, Medline and CINAHL Plus electronic databases were searched for relevant empirical papers published from January 2003 to July 2016. Synthesis of the findings drew on meta-ethnography and meta-study. Most quantitative studies used a survey design and indicated that patients are largely content with the information they receive from physicians, however much or little they actually receive, although a minority of patients are not content with information. Qualitative studies suggest that a sense of being in a caring relationship with a physician allows patients to feel content with the information they have been given, whereas patients who lack such a relationship want more information. The qualitative evidence can help explain the lack of association between the amount of information received and contentment with it in the quantitative research. Trusting relationships are integral to helping patients feel that their information needs have been met. © 2017 John Wiley & Sons Ltd.

  18. Quantitative radiomic profiling of glioblastoma represents transcriptomic expression.

    Science.gov (United States)

    Kong, Doo-Sik; Kim, Junhyung; Ryu, Gyuha; You, Hye-Jin; Sung, Joon Kyung; Han, Yong Hee; Shin, Hye-Mi; Lee, In-Hee; Kim, Sung-Tae; Park, Chul-Kee; Choi, Seung Hong; Choi, Jeong Won; Seol, Ho Jun; Lee, Jung-Il; Nam, Do-Hyun

    2018-01-19

    Quantitative imaging biomarkers have increasingly emerged in research utilizing available imaging modalities. We aimed to identify good surrogate radiomic features that can represent genetic changes of tumors, thereby establishing a noninvasive means for predicting treatment outcome. From May 2012 to June 2014, we retrospectively identified 65 patients with treatment-naïve glioblastoma with available clinical information from the Samsung Medical Center data registry. Preoperative MR imaging data were obtained for all 65 patients with primary glioblastoma. A total of 82 imaging features, including first-order statistics, volume, and size features, were semi-automatically extracted from structural and physiologic images such as apparent diffusion coefficient and perfusion images. Using commercially available software, NordicICE, we performed quantitative imaging analysis and collected a dataset composed of radiophenotypic parameters. Unsupervised clustering methods revealed that the radiophenotypic dataset was composed of three clusters, each representing a distinct molecular classification of glioblastoma: classical type, proneural and neural types, and mesenchymal type. These clusters also reflected differential clinical outcomes. We found that the extracted imaging signatures do not represent copy number variation or somatic mutation. Quantitative radiomic features provide potential evidence to predict molecular phenotype and treatment outcome; radiomic profiles thus represent transcriptomic phenotypes well.
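
    The unsupervised clustering step can be illustrated with a plain k-means partition of a patients-by-features matrix into three groups. The abstract does not name the clustering algorithm used, so this is a generic sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Plain k-means, initialized on k evenly spaced rows of X."""
    centers = X[:: len(X) // k][:k].copy()
    for _ in range(iters):
        # assign each sample to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its members (keep it if the cluster empties)
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

# Synthetic "radiomic" matrix: 60 patients x 5 features forming 3 blobs (illustration only).
X = np.vstack([rng.normal(loc, 0.3, size=(20, 5)) for loc in (0.0, 2.0, 4.0)])
labels, centers = kmeans(X, k=3)
```

    With well-separated feature profiles, each group of patients lands in its own cluster; the study's contribution is showing that such clusters line up with molecular subtypes.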

  19. A writer's guide to education scholarship: Quantitative methodologies for medical education research (part 1).

    Science.gov (United States)

    Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan

    2018-01-01

    Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
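
    Cohen's kappa, used above to quantify inter-rater agreement on inclusion decisions, compares observed agreement with the agreement expected by chance from each rater's marginal frequencies. A minimal sketch (the ratings below are made-up):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical decisions."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # chance agreement from each rater's marginal category frequencies
    p_expected = sum(count_a[c] * count_b[c] for c in count_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Made-up include/exclude decisions (1 = include) on ten abstracts.
rater_a = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 0, 1, 0, 1, 1, 1, 1]
kappa = cohens_kappa(rater_a, rater_b)
```

    Here the raters agree on 8 of 10 abstracts, but after discounting the 52% agreement expected by chance, kappa is about 0.58 rather than 0.8.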

  20. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in applications of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as ignoring spectral information or trading off extraction rate against extraction accuracy. The purpose of this research is to develop an effective method to detect building information in Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image, and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to obtain the candidate building objects. Furthermore, in order to refine the building objects and further remove false objects, post-processing (e.g., shape features, the vegetation index, and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), overall accuracy (OA), and Kappa are used. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%, while the Kappa increases by 16.09%. In the experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.
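
    The accuracy measures named above (OE, CE, OA, and Kappa) all derive from the confusion matrix of the classified map against reference data. A sketch with a hypothetical two-class matrix (the pixel counts are invented for illustration):

```python
import numpy as np

def accuracy_metrics(confusion):
    """OE, CE, OA and Kappa from a confusion matrix (rows = reference, cols = classified)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    diag = np.diag(confusion)
    ref_totals = confusion.sum(axis=1)
    cls_totals = confusion.sum(axis=0)
    omission = 1.0 - diag / ref_totals      # reference pixels missed, per class
    commission = 1.0 - diag / cls_totals    # classified pixels mislabelled, per class
    oa = diag.sum() / n
    p_chance = (ref_totals * cls_totals).sum() / n ** 2
    kappa = (oa - p_chance) / (1.0 - p_chance)
    return omission, commission, oa, kappa

# Hypothetical two-class assessment (buildings vs. background), in pixel counts.
om, com, oa, kappa = accuracy_metrics([[90, 10],
                                       [20, 80]])
```

    Reporting OA together with Kappa, as the paper does, guards against a high OA that is mostly attributable to chance agreement between large classes.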

  1. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  2. Identifying Contributors of DNA Mixtures by Means of Quantitative Information of STR Typing

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2012-01-01

    Estimating the weight of evidence in forensic genetics is often done in terms of a likelihood ratio, LR. The LR evaluates the probability of the observed evidence under competing hypotheses. Most often, probabilities used in the LR only consider the evidence from the genomic variation identified using polymorphic genetic markers. However, modern typing techniques supply additional quantitative data, which contain very important information about the observed evidence. This is particularly true for cases of DNA mixtures, where more than one individual has contributed to the observed biological stain. This article presents a method for including the quantitative information of short tandem repeat (STR) DNA mixtures in the LR. Also, an efficient algorithmic method for finding the best matching combination of DNA mixture profiles is derived and implemented in an on-line tool for two...

  3. Quantitative Study of Emotional Intelligence and Communication Levels in Information Technology Professionals

    Science.gov (United States)

    Hendon, Michalina

    2016-01-01

    This quantitative non-experimental correlational research analyzes the relationship between emotional intelligence and communication, addressing the lack of such research on information technology professionals in the U.S. One hundred and eleven (111) participants completed a survey that measures both the emotional intelligence and communication…

  4. Solvent Front Position Extraction procedure with thin-layer chromatography as a mode of multicomponent sample preparation for quantitative analysis by instrumental technique.

    Science.gov (United States)

    Klimek-Turek, A; Sikora, E; Dzido, T H

    2017-12-29

    A concept of using thin-layer chromatography for multicomponent sample preparation prior to the quantitative determination of solutes by an instrumental technique is presented. Thin-layer chromatography (TLC) is used to separate the chosen substances and their internal standard from the other components (matrix) and to form a single spot/zone containing them at the solvent front position. The location of the analytes and internal standard in the solvent front zone allows their easy extraction followed by quantitation by HPLC. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Evaluation of sample extraction methods for proteomics analysis of green algae Chlorella vulgaris.

    Science.gov (United States)

    Gao, Yan; Lim, Teck Kwang; Lin, Qingsong; Li, Sam Fong Yau

    2016-05-01

    Many protein extraction methods have been developed for plant proteome analysis, but information is limited on the optimal protein extraction method for algae species. This study evaluated four protein extraction methods, i.e. the direct lysis buffer method, the TCA-acetone method, the phenol method, and the phenol/TCA-acetone method, using the green alga Chlorella vulgaris for proteome analysis. The data presented show that the phenol/TCA-acetone method was superior to the other three tested methods with regard to shotgun proteomics. Proteins identified using shotgun proteomics were validated using the sequential window acquisition of all theoretical fragment-ion spectra (SWATH) technique. Additionally, SWATH provides protein quantitation information, and protein abundance obtained using the different extraction methods was evaluated. These results highlight the importance of the green algae protein extraction method for subsequent MS analysis and identification. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm was applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land, and other information) were extracted by the spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% by the confusion matrix.

  7. Quantitative data extraction from transmission electron micrographs

    International Nuclear Information System (INIS)

    Sprague, J.A.

    1982-01-01

    The discussion will cover an overview of quantitative TEM, the digital image analysis process, coherent optical processing, and finally a summary of the author's views on potentially useful advances in TEM image processing

  8. Quantitative Detection of Trace Level Cloxacillin in Food Samples Using Magnetic Molecularly Imprinted Polymer Extraction and Surface-Enhanced Raman Spectroscopy Nanopillars

    DEFF Research Database (Denmark)

    Ashley, Jon; Wu, Kaiyu; Hansen, Mikkel Fougt

    2017-01-01

    There is an increasing demand for rapid, sensitive, and low cost analytical methods to routinely screen antibiotic residues in food products. Conventional detection of antibiotics involves sample preparation by liquid-liquid or solid-phase extraction, followed by analysis using liquid...... with surface-enhanced Raman spectroscopy (SERS)-based detection for quantitative analysis of cloxacillin in pig serum. MMIP microspheres were synthesized using a core-shell technique. The large loading capacity and high selectivity of the MMIP microspheres enabled efficient extraction of cloxacillin, while...... using an internal standard. By coherently combining MMIP extraction and silicon nanopillar-based SERS biosensor, good sensitivity toward cloxacillin was achieved. The detection limit was 7.8 pmol. Cloxacillin recoveries from spiked pig plasma samples were found to be more than 80%....

  9. Qualitative and Quantitative Data on the Use of the Internet for Archaeological Information

    Directory of Open Access Journals (Sweden)

    Lorna-Jane Richardson

    2015-04-01

    These survey results are from an online survey of 577 UK-based archaeological volunteers, professional archaeologists and archaeological organisations. These data cover a variety of topics related to how and why people access the Internet for information about archaeology, including demographic information, activity relating to accessing information on archaeological topics, archaeological sharing and networking, and the use of mobile phone apps and QR codes for public engagement. There is wide scope for further qualitative and quantitative analysis of these data.

  10. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous since they describe social, financial, engineering, and biological networks. Extending our ability to analyze complex networks to multiplex network structures greatly increases the level of information that can be extracted from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks, and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
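
    The idea behind Multiplex PageRank is that centrality earned in one layer can bias the random walk on another layer. A toy sketch of that coupling follows; it uses the rank from layer A as a personalization vector for the walk on layer B, which is only an illustrative variant, not the exact formulation studied in the paper:

```python
import numpy as np

def pagerank(adj, damping=0.85, personalization=None, iters=200):
    """Power-iteration PageRank; `personalization` biases the teleport vector."""
    adj = np.asarray(adj, dtype=float)
    n = adj.shape[0]
    out = adj.sum(axis=1)
    # Column-stochastic transition matrix; dangling nodes teleport uniformly.
    P = np.where(out[:, None] > 0, adj / np.maximum(out, 1.0)[:, None], 1.0 / n).T
    v = (np.ones(n) / n if personalization is None
         else np.asarray(personalization) / np.sum(personalization))
    x = np.ones(n) / n
    for _ in range(iters):
        x = damping * (P @ x) + (1 - damping) * v
    return x

# Two layers over the same three nodes (toy data).
layer_a = [[0, 1, 1], [0, 0, 1], [1, 0, 0]]
layer_b = [[0, 1, 0], [1, 0, 1], [1, 1, 0]]
rank_a = pagerank(layer_a)
multiplex_rank = pagerank(layer_b, personalization=rank_a)  # layer A biases layer B
```

    A node that is central in layer A receives more teleport mass in layer B, so its layer-B centrality is lifted relative to a single-layer PageRank.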

  11. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    For the established telenanomanipulation system, the method of extracting location information and the strategies of probe operation were studied in this paper. First, the machine learning algorithms of OpenCV were used to extract location information from SEM images, so that nanowires and the probe in SEM images can be automatically tracked and the region of interest (ROI) can be marked quickly. The locations of the nanowire and the probe can then be extracted from the ROI. To study the probe operation strategy, the van der Waals force between the probe and a nanowire was computed, from which the relevant operating parameters can be obtained. With these operating parameters, the nanowire can be preoperated in a 3D virtual environment and an optimal path for the probe can be obtained. The actual probe runs automatically under the telenanomanipulation system's control. Finally, experiments were carried out to verify the above methods, and the results show the designed methods achieve the expected effect.

  12. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, j.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts, some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  13. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L⁻¹ range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L⁻¹ in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L⁻¹ for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015

  14. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    Directory of Open Access Journals (Sweden)

    Upasna Saluja

    2012-06-01

    Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm, most organizations end up choosing the more flexible, easier to deploy and customize qualitative models of risk assessment. In practice, one sees that such models often call upon the practitioners to make qualitative judgments on a relative rating scale, which brings in considerable room for errors, biases, and subjectivity. Under the quantitative risk analysis approach, on the other hand, the estimation of risk is connected with the application of numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for Information Security Risk Management: we can use this considerably developed understanding of risk management from the medical field, especially Survival Analysis, towards handling the risks that information infrastructures face. Similarly, the financial risk management discipline prides itself on perhaps the most quantifiable of models in risk management, for Market Risk and Credit Risk. Information Security Risk Management can make risk measurement more objective and quantitative by referring to the approach of Credit Risk. During the recent financial crisis, many investors and financial institutions lost money or went bankrupt because they did not apply the basic principles of risk management. Learning from the financial crisis provides some valuable lessons for information risk management.

  15. Quantitative measurement of cerebral oxygen extraction fraction using MRI in patients with MELAS.

    Directory of Open Access Journals (Sweden)

    Lei Yu

    Full Text Available OBJECTIVE: To quantify the cerebral OEF at different phases of stroke-like episodes in patients with mitochondrial myopathy, encephalopathy, lactic acidosis, and stroke-like episodes (MELAS) by using MRI. METHODS: We recruited 32 patients with MELAS confirmed by gene analysis. Conventional MRI scanning, as well as functional MRI including arterial spin labeling and oxygen extraction fraction imaging, was undertaken to obtain the pathological and metabolic information of the brains at different stages of stroke-like episodes in patients. A total of 16 MRI examinations at the acute and subacute phase and 19 examinations at the interictal phase were performed. In addition, 24 healthy volunteers were recruited as control subjects. Six regions of interest were placed in the anterior, middle, and posterior parts of the bilateral hemispheres to measure the OEF of the brain or the lesions. RESULTS: OEF was reduced significantly in brains of patients at both the acute and subacute phase (0.266 ± 0.026) and at the interictal phase (0.295 ± 0.009), compared with normal controls (0.316 ± 0.025). In the brains at the acute and subacute phase of the episode, 13 ROIs were prescribed on the stroke-like lesions, which showed decreased OEF compared with the contralateral spared brain regions. Increased blood flow was revealed in the stroke-like lesions at the acute and subacute phase, which was confined to the lesions. CONCLUSION: MRI can quantitatively show changes in OEF at different phases of stroke-like episodes. The utilization of oxygen in the brain seems to be reduced more severely after the onset of episodes in MELAS, especially for those brain tissues involved in the episodes.

  16. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to carry out the quantification and spatial characterization of land use/cover changes (LUCC) in the 1990s. To maintain the reliability of the database, it needs to be updated regularly, but it is difficult to obtain remote sensing data for extracting land cover change information at a large scale. Optical remote sensing data are especially hard to acquire over the Chengdu Plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu Plain, for example: crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for extracting land cover change information.

  17. Quantitative phase imaging with scanning holographic microscopy: an experimental assessment

    Directory of Open Access Journals (Sweden)

    Tada Yoshitaka

    2006-11-01

    Full Text Available Abstract This paper demonstrates experimentally how quantitative phase information can be obtained in scanning holographic microscopy. Scanning holography can operate in both coherent and incoherent modes, simultaneously if desired, with different detector geometries. A spatially integrating detector provides an incoherent hologram of the object's intensity distribution (absorption and/or fluorescence, for example), while a point detector in a conjugate plane of the pupil provides a coherent hologram of the object's complex amplitude, from which a quantitative measure of its phase distribution can be extracted. The possibility of capturing simultaneously holograms of three-dimensional specimens, leading to three-dimensional reconstructions with absorption contrast, reflectance contrast, fluorescence contrast, as was previously demonstrated, and quantitative phase contrast, as shown here for the first time, opens up new avenues for multimodal imaging in biological studies.

  18. Extraction and Quantitative HPLC Analysis of Coumarin in Hydroalcoholic Extracts of Mikania glomerata Spreng. ("guaco") Leaves

    Directory of Open Access Journals (Sweden)

    Celeghini Renata M. S.

    2001-01-01

    Full Text Available Methods for preparation of hydroalcoholic extracts of "guaco" (Mikania glomerata Spreng.) leaves were compared: maceration, maceration under sonication, infusion and supercritical fluid extraction. Evaluation of these methods showed that maceration under sonication had the best results, when considering the ratio extraction yield/extraction time. A high performance liquid chromatography (HPLC) procedure for the determination of coumarin in these hydroalcoholic extracts of "guaco" leaves is described. The HPLC method is shown to be sensitive and reproducible.

  19. Qualitative and quantitative analysis of Dibenzofuran, Alkyldibenzofurans, and Benzo[b]naphthofurans in crude oils and source rock extracts

    Science.gov (United States)

    Li, Meijun; Ellis, Geoffrey S.

    2015-01-01

    Dibenzofuran (DBF), its alkylated homologues, and benzo[b]naphthofurans (BNFs) are common oxygen-heterocyclic aromatic compounds in crude oils and source rock extracts. A series of positional isomers of alkyldibenzofuran and benzo[b]naphthofuran were identified in mass chromatograms by comparison with internal standards and standard retention indices. The response factors of dibenzofuran in relation to internal standards were obtained by gas chromatography-mass spectrometry analyses of a set of mixed solutions with different concentration ratios. Perdeuterated dibenzofuran and dibenzothiophene are optimal internal standards for quantitative analyses of furan compounds in crude oils and source rock extracts. The average concentration of the total DBFs in oils derived from siliciclastic lacustrine rock extracts from the Beibuwan Basin, South China Sea, was 518 μg/g, which is about 5 times that observed in the oils from carbonate source rocks in the Tarim Basin, Northwest China. The BNFs occur ubiquitously in source rock extracts and related oils of various origins. The results of this work suggest that the relative abundance of benzo[b]naphthofuran isomers, that is, the benzo[b]naphtho[2,1-d]furan/{benzo[b]naphtho[2,1-d]furan + benzo[b]naphtho[1,2-d]furan} ratio, may be a potential molecular geochemical parameter to indicate oil migration pathways and distances.

  20. Quantitative detection of nitric oxide in exhaled human breath by extractive electrospray ionization mass spectrometry

    Science.gov (United States)

    Pan, Susu; Tian, Yong; Li, Ming; Zhao, Jiuyan; Zhu, Lanlan; Zhang, Wei; Gu, Haiwei; Wang, Haidong; Shi, Jianbo; Fang, Xiang; Li, Penghui; Chen, Huanwen

    2015-03-01

    Exhaled nitric oxide (eNO) is a useful biomarker of various physiological conditions, including asthma and other pulmonary diseases. Herein a fast and sensitive analytical method has been developed for the quantitative detection of eNO based on extractive electrospray ionization mass spectrometry (EESI-MS). Exhaled NO molecules selectively reacted with the 2-phenyl-4,4,5,5-tetramethylimidazoline-1-oxyl-3-oxide (PTIO) reagent, and the eNO concentration was derived from the EESI-MS response of the 1-oxyl-2-phenyl-4,4,5,5-tetramethylimidazoline (PTI) product. The method allowed quantification of eNO below the ppb level (~0.02 ppbv) with a relative standard deviation (RSD) of 11.6%. In addition, eNO levels of 20 volunteers were monitored by EESI-MS over a time period of 10 hrs. The long-term eNO response to smoking a cigarette was recorded, and the observed time-dependent profile was discussed. This work extends the application of EESI-MS to small molecules. Long-term quantitative profiling of eNO by EESI-MS opens new possibilities for the research of human metabolism and clinical diagnosis.

  1. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features, improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.
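The evaluation of an evolved spectral index against ground truth (the kappa values quoted above) can be sketched as follows. The threshold rule and band names are illustrative assumptions, and the genetic programming machinery that generates the candidate expressions is omitted:

```python
import numpy as np

def cohen_kappa(y_true, y_pred):
    """Cohen's kappa for binary masks: agreement corrected for chance."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    po = np.mean(y_true == y_pred)                      # observed agreement
    pe = (np.mean(y_true) * np.mean(y_pred)
          + np.mean(~y_true) * np.mean(~y_pred))        # chance agreement
    return (po - pe) / (1.0 - pe)

def index_fitness(index_fn, bands, truth, threshold=0.0):
    """Score one evolved candidate: apply the nonlinear spectral index to
    the image bands, threshold the result into a binary map, and measure
    kappa against the ground-truth mask (hypothetical fitness function)."""
    prediction = index_fn(bands) > threshold
    return cohen_kappa(truth, prediction)
```

For instance, a candidate expression equivalent to NDVI, `lambda b: (b["nir"] - b["red"]) / (b["nir"] + b["red"])`, can be scored against a rooftop mask exactly like any other evolved index.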

  2. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  3. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    Science.gov (United States)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess the attitudes towards science, basic scientific knowledge, and scientific analysis skills of students who will not become scientists, which may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes towards science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends in each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  4. Membrane-based microchannel device for continuous quantitative extraction of dissolved free sulfide from water and from oil.

    Science.gov (United States)

    Toda, Kei; Ebisu, Yuki; Hirota, Kazutoshi; Ohira, Shin-Ichi

    2012-09-05

    Underground fluids are important natural sources of drinking water, geothermal energy, and oil-based fuels. To facilitate the surveying of such underground fluids, a novel microchannel extraction device was investigated for in-line continuous analysis and flow injection analysis of sulfide levels in water and in oil. Of the four designs investigated, the honeycomb-patterned microchannel extraction (HMCE) device was found to offer the most effective liquid-liquid extraction. In the HMCE device, a thin silicone membrane was sandwiched between two polydimethylsiloxane plates in which honeycomb-patterned microchannels had been fabricated. The identical patterns on the two plates were accurately aligned. The extracted sulfide was detected by monitoring the fluorescence quenching of fluorescein mercuric acetate (FMA). The sulfide extraction efficiencies from water and oil samples of the HMCE device and of three other designs (two annular and one rectangular channel) were examined theoretically and experimentally. The best performance was obtained with the HMCE device because of its thin sample layer (small diffusion distance) and large interface area. Quantitative extraction from both water and oil could be obtained using the HMCE device. The estimated limit of detection for continuous monitoring was 0.05 μM, and sulfide concentrations in the range of 0.15-10 μM could be determined when the acceptor was 5 μM FMA alkaline solution. The method was applied to natural water analysis using flow injection mode, and the data agreed with those obtained using headspace gas chromatography-flame photometric detection. The analysis of hydrogen sulfide levels in prepared oil samples was also performed. The proposed device is expected to be used for real-time surveying of oil wells and groundwater wells.

  5. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction was proposed. This method took full advantage of the self-adaptive filter characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. This research combined the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of the four indexes were combined into a feature vector. Then, the connotative characteristic components in the vibration signal were accurately extracted by a Euclidean distance search, and the desired integral signals were precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is effectively suppressed. The great cumulative error of the traditional time-domain integral is overcome, and the large low-frequency error of the traditional frequency-domain integral is avoided. Compared with traditional integral methods, this method is outstanding at removing noise while retaining useful feature information, and shows higher accuracy.
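The feature-vector matching step described above can be sketched as follows: four indexes (kurtosis, mean square error, energy, and a largest singular value standing in for the SVD index) are combined into a vector, and the decomposed component closest to a reference signal is selected by Euclidean distance. The exact feature definitions here are assumptions, not the paper's implementation:

```python
import numpy as np

def feature_vector(x, reference):
    """Four features of a 1-D signal segment (hypothetical stand-ins for
    the paper's kurtosis / MSE / energy / SVD indexes)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    kurtosis = np.mean(((x - mu) / sigma) ** 4)      # 4th standardized moment
    mse = np.mean((x - reference) ** 2)              # error vs. reference signal
    energy = np.sum(x ** 2)
    # largest singular value of the signal arranged as a trajectory matrix
    n = len(x) // 2
    trajectory = np.lib.stride_tricks.sliding_window_view(x, n)
    sv_max = np.linalg.svd(trajectory, compute_uv=False)[0]
    return np.array([kurtosis, mse, energy, sv_max])

def nearest_component(components, reference):
    """Euclidean-distance search: index of the decomposed component whose
    feature vector lies closest to that of the reference signal."""
    ref_feat = feature_vector(reference, reference)
    dists = [np.linalg.norm(feature_vector(c, reference) - ref_feat)
             for c in components]
    return int(np.argmin(dists))
```

In an EEMD setting, `components` would be the intrinsic mode functions of the measured vibration signal, and the selected components would then be summed and integrated.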

  6. Extract of Acanthospermum hispidum

    African Journals Online (AJOL)

    Administrator

    quantitatively. Acute toxicity study of the extract was conducted, and diabetic rats induced using alloxan (80 mg/kg ... Type 2 diabetes is one of the leading causes of mortality and ..... (2011): Phytochemical screening and extraction - A review.

  7. Biologically active extracts with kidney affections applications

    Science.gov (United States)

    Pascu (Neagu), Mihaela; Pascu, Daniela-Elena; Cozea, Andreea; Bunaciu, Andrei A.; Miron, Alexandra Raluca; Nechifor, Cristina Aurelia

    2015-12-01

    This paper aims to select plant materials rich in bioflavonoid compounds, from herbs known for their performance in the prevention and therapy of renal diseases, namely kidney stones and urinary infections (renal lithiasis, nephritis, urethritis, cystitis, etc.). It presents a comparative study of the composition of medicinal plant extracts belonging to the Ericaceae: cranberry (fruit and leaves), Vaccinium vitis-idaea L., and bilberry (fruit), Vaccinium myrtillus L. The concentrated extracts obtained from the medicinal plants used in this work were analyzed from structural, morphological and compositional points of view using different techniques: chromatographic methods (HPLC), scanning electron microscopy, infrared and UV spectrophotometry, and kinetic modelling. Liquid chromatography was able to identify the specific compound of the Ericaceae family present in all three extracts, arbutoside, as well as specific components of each species, mostly from the class of polyphenols. The identification and quantitative determination of the active ingredients of these extracts can give information related to their therapeutic effects.

  8. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools that will expedite their extracting useable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  9. An integrated enhancement and reconstruction strategy for the quantitative extraction of actin stress fibers from fluorescence micrographs.

    Science.gov (United States)

    Zhang, Zhen; Xia, Shumin; Kanchanawong, Pakorn

    2017-05-22

    Stress fibers are prominent organizations of actin filaments that perform important functions in cellular processes such as migration, polarization, and traction force generation, and whose collective organization reflects the physiological and mechanical activities of the cells. Easily visualized by fluorescence microscopy, stress fibers are widely used as qualitative descriptors of cell phenotypes. However, due to the complexity of the stress fibers and the presence of other actin-containing cellular features, images of stress fibers are relatively challenging to quantitatively analyze using previously developed approaches, requiring significant user intervention. This poses a challenge for the automation of their detection, segmentation, and quantitative analysis. Here we describe an open-source software package, SFEX (Stress Fiber Extractor), which is geared for efficient enhancement, segmentation, and analysis of actin stress fibers in adherent tissue culture cells. Our method made use of a carefully chosen image filtering technique to enhance filamentous structures, effectively facilitating the detection and segmentation of stress fibers by binary thresholding. We subdivided the skeletons of stress fiber traces into piecewise-linear fragments, and used a set of geometric criteria to reconstruct the stress fiber networks by pairing appropriate fiber fragments. Our strategy enables the trajectory of a majority of stress fibers within the cells to be comprehensively extracted. We also present a method for quantifying the dimensions of the stress fibers using an image gradient-based approach. We determine the optimal parameter space using sensitivity analysis, and demonstrate the utility of our approach by analyzing actin stress fibers in cells cultured on various micropattern substrates. We present an open-source graphically-interfaced computational tool for the extraction and quantification of stress fibers in adherent cells with minimal user input.

  10. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
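The multireference-function idea can be sketched as follows: correlate a pixel's time course against a family of reference functions shifted in 100-ms steps, and take the best-correlating shift as that pixel's time marker. The Gaussian-shaped hrf below is a toy stand-in for a real haemodynamic response model:

```python
import numpy as np

def hrf(t, delay=6.0, width=3.0):
    """Toy Gaussian-shaped haemodynamic response (assumption; real studies
    typically use a gamma-variate hrf)."""
    return np.exp(-0.5 * ((t - delay) / width) ** 2)

def pixel_onset(time_course, t, shifts):
    """Correlate one pixel's time course against reference functions
    shifted by each value in `shifts` (seconds) and return the shift with
    the highest Pearson correlation - the pixel's time marker."""
    return max(shifts,
               key=lambda s: np.corrcoef(time_course, hrf(t - s))[0, 1])
```

With `shifts = np.arange(0.0, 1.01, 0.1)` this reproduces the 100-ms grid of reference functions described in the abstract; the achievable precision then depends on the SNR of the time course, as the paper's Monte Carlo analysis shows.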

  11. Quantitative extraction of the bedrock exposure rate based on unmanned aerial vehicle data and Landsat-8 OLI image in a karst environment

    Science.gov (United States)

    Wang, Hongyan; Li, Qiangzi; Du, Xin; Zhao, Longcai

    2017-12-01

    In the karst regions of southwest China, rocky desertification is one of the most serious problems in land degradation. The bedrock exposure rate is an important index to assess the degree of rocky desertification in karst regions. Because of the inherent merits of macro-scale, frequency, efficiency, and synthesis, remote sensing is a promising method to monitor and assess karst rocky desertification on a large scale. However, actual measurement of the bedrock exposure rate is difficult and existing remote-sensing methods cannot directly be exploited to extract the bedrock exposure rate owing to the high complexity and heterogeneity of karst environments. Therefore, using unmanned aerial vehicle (UAV) and Landsat-8 Operational Land Imager (OLI) data for Xingren County, Guizhou Province, quantitative extraction of the bedrock exposure rate based on multi-scale remote-sensing data was developed. Firstly, we used an object-oriented method to carry out accurate classification of UAV images. From the results of rock extraction, the bedrock exposure rate was calculated at the 30 m grid scale. Parts of the calculated samples were used as training data; other data were used for model validation. Secondly, in each grid the band reflectivity of Landsat-8 OLI data was extracted and a variety of rock and vegetation indexes (e.g., NDVI and SAVI) were calculated. Finally, a network model was established to extract the bedrock exposure rate. The correlation coefficient of the network model was 0.855, that of the validation model was 0.677 and the root mean square error of the validation model was 0.073. This method is valuable for wide-scale estimation of the bedrock exposure rate in karst environments. Using the quantitative inversion model, a distribution map of the bedrock exposure rate in Xingren County was obtained.
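The vegetation indexes computed from the OLI band reflectance (e.g., NDVI and SAVI) follow standard formulas; a minimal sketch, not tied to this paper's exact band processing:

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from red and
    near-infrared band reflectance (scalars or numpy arrays)."""
    return (nir - red) / (nir + red)

def savi(red, nir, L=0.5):
    """Soil-adjusted vegetation index; the soil-brightness correction
    factor L = 0.5 is the common default."""
    return (1 + L) * (nir - red) / (nir + red + L)
```

Per-grid values of such indexes, together with the raw band reflectances, would form the input features of the regression (network) model, with the UAV-derived bedrock exposure rates as training targets.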

  12. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
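The quantitation step described above, taking ratios of the light- and heavy-isotope signals, can be sketched as follows; the glycan names in the example are illustrative only:

```python
def xic_ratios(peak_areas):
    """Relative quantitation from extracted-ion-chromatogram peak areas.

    peak_areas : {glycan_id: (light_area, heavy_area)} measured in the
    merged LC- or CE-MS run, where the two compared samples were labelled
    with the light and heavy isotope variants of the same fluorescent tag.
    Returns {glycan_id: light/heavy ratio}; 1.0 means equal abundance.
    """
    ratios = {}
    for glycan, (light, heavy) in peak_areas.items():
        if heavy <= 0:
            raise ValueError(f"non-positive heavy peak area for {glycan}")
        ratios[glycan] = light / heavy
    return ratios
```

Because both labelled forms co-migrate and are detected in the same run, chromatographic and ionization variability largely cancel out of the ratio, which is the main advantage of the stable-isotope approach over optical peak-area comparison across runs.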

  13. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Full Text Available Hyperspectral data are characterized by many continuous bands, large data volumes, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the images on the bottom of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, three methods are used to extract the symbols at the bottom of the ancient tomb. Finally, morphological operations are used to connect the symbols, yielding fifteen reference images. The results show that information extraction based on hyperspectral data provides a better visual experience, which benefits the study of ancient tombs by researchers and provides references for archaeological research findings.

  14. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data are characterized by many continuous bands, large data volumes, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the images on the bottom of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest. For convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, three methods are used to extract the symbols at the bottom of the ancient tomb. Finally, morphological operations are used to connect the symbols, yielding fifteen reference images. The results show that information extraction based on hyperspectral data provides a better visual experience, which benefits the study of ancient tombs by researchers and provides references for archaeological research findings.

  15. In vitro prebiotic effects and quantitative analysis of Bulnesia sarmienti extract

    Directory of Open Access Journals (Sweden)

    Md Ahsanur Reza

    2016-10-01

    Full Text Available Prebiotics are used to influence the growth, colonization, survival, and activity of probiotics, and enhance the innate immunity, thus improving the health status of the host. The survival, growth, and activity of probiotics are often interfered with by intrinsic factors and indigenous microbes in the gastrointestinal tract. In this study, Bulnesia sarmienti aqueous extract (BSAE) was evaluated for the growth-promoting activity of different strains of Lactobacillus acidophilus, and a simple, precise, cost-effective high-performance liquid chromatography (HPLC) method was developed and validated for the determination of active prebiotic ingredients in the extract. Different strains of L. acidophilus (probiotic) were incubated in de Man, Rogosa, and Sharpe (MRS) medium with the supplementation of BSAE in a final concentration of 0.0%, 1.0%, and 3.0% (w/v) as the sole carbon source. Growth of the probiotics was determined by measuring the pH changes and colony-forming units (CFU/mL) using the microdilution method for a period of 24 hours. The HPLC method was designed by optimizing mobile-phase composition, flow rate, column temperature, and detection wavelength. The method was validated according to the requirements of a new method, including accuracy, precision, linearity, limit of detection, limit of quantitation, and specificity. The major prebiotic active ingredients in BSAE were determined using the validated HPLC method. A rapid growth rate of the different strains of L. acidophilus was observed in growth media with BSAE, whereas the decline of pH values of the cultures varied among strains of probiotics depending on the time of culture. (+)-Catechin and (−)-epicatechin were identified on the basis of their retention times, absorbance spectra, and mass spectrometry fragmentation patterns. The developed method met the limits of all validation parameters. The prebiotic active components, (+)-catechin and (−)-epicatechin, were quantified as 1.27% and 0

  16. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.
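    The subtraction procedures the abstract refers to rest on standard phase-shifting interferometry. Below is a minimal single-wavelength sketch of that building block (the dual-wavelength scheme layers wavelength-specific shifts on top of it); the unit-amplitude reference wave and the pixel value are illustrative assumptions, not the paper's experimental parameters.

    ```python
    # Four-step phase-shifting recovery of a complex object wave.
    # Assumes a unit-amplitude reference wave; values are illustrative.
    import cmath

    def intensity(obj, ref_phase):
        """Recorded hologram intensity for object wave obj and a unit reference."""
        return abs(obj + cmath.exp(1j * ref_phase)) ** 2

    def recover_object_wave(i0, i1, i2, i3):
        """Four-step phase shifting with reference shifts 0, pi/2, pi, 3pi/2."""
        return ((i0 - i2) + 1j * (i1 - i3)) / 4

    obj = 0.3 + 0.4j                      # unknown object wave at one pixel
    frames = [intensity(obj, k * cmath.pi / 2) for k in range(4)]
    rec = recover_object_wave(*frames)
    print(round(rec.real, 6), round(rec.imag, 6))  # recovers 0.3 0.4
    ```

    The paper's crosstalk-removal step generalises this kind of subtraction so that, with five wavelength-multiplexed holograms and suitably chosen shifts per wavelength, each object wave can be isolated in turn.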

  17. A highly sensitive quantitative cytosensor technique for the identification of receptor ligands in tissue extracts.

    Science.gov (United States)

    Lenkei, Z; Beaudet, A; Chartrel, N; De Mota, N; Irinopoulou, T; Braun, B; Vaudry, H; Llorens-Cortes, C

    2000-11-01

    Because G-protein-coupled receptors (GPCRs) constitute excellent putative therapeutic targets, functional characterization of orphan GPCRs through identification of their endogenous ligands has great potential for drug discovery. We propose here a novel single cell-based assay for identification of these ligands. This assay involves (a) fluorescent tagging of the GPCR, (b) expression of the tagged receptor in a heterologous expression system, (c) incubation of the transfected cells with fractions purified from tissue extracts, and (d) imaging of ligand-induced receptor internalization by confocal microscopy coupled to digital image quantification. We tested this approach in CHO cells stably expressing the NT1 neurotensin receptor fused to EGFP (enhanced green fluorescent protein), in which neurotensin promoted internalization of the NT1-EGFP receptor in a dose-dependent fashion (EC(50) = 0.98 nM). Similarly, four of 120 consecutive reversed-phase HPLC fractions of frog brain extracts promoted internalization of the NT1-EGFP receptor. The same four fractions selectively contained neurotensin, an endogenous ligand of the NT1 receptor, as detected by radioimmunoassay and inositol phosphate production. The present internalization assay provides a highly specific quantitative cytosensor technique with sensitivity in the nanomolar range that should prove useful for the identification of putative natural and synthetic ligands for GPCRs.
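    A dose-response readout like the EC(50) = 0.98 nM above is typically obtained by fitting a one-site binding model to the quantified internalization signal. A minimal sketch, assuming a Hill coefficient of 1 and a simple grid-search fit; the model and fitting procedure are illustrative, not the authors' actual analysis software.

    ```python
    # Quantifying a dose-response internalization assay (illustrative).

    def internalization(conc_nM, ec50_nM, max_response=100.0):
        """One-site (Hill coefficient = 1) dose-response model."""
        return max_response * conc_nM / (conc_nM + ec50_nM)

    def fit_ec50(concs, responses, max_response=100.0):
        """Recover EC50 by least-squares grid search over 0.001-10 nM."""
        best_ec50, best_err = None, float("inf")
        for i in range(1, 10001):
            ec50 = i * 0.001
            err = sum((internalization(c, ec50, max_response) - r) ** 2
                      for c, r in zip(concs, responses))
            if err < best_err:
                best_ec50, best_err = ec50, err
        return best_ec50

    concs = [0.01, 0.1, 0.5, 1.0, 5.0, 10.0]               # ligand doses (nM)
    responses = [internalization(c, 0.98) for c in concs]  # true EC50 = 0.98 nM
    print(round(fit_ec50(concs, responses), 2))
    ```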

  18. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data, and extraction of relevant information from these data demands a cautious and time-consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary...
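    The PLS step described above can be sketched, in much-reduced form, as a single-component PLS1 fit whose weight vector is then used to rank spots. The synthetic gel data below are an illustrative assumption, not the study's measurements, and real analyses would use several components and cross-validation.

    ```python
    # One-component PLS1 (NIPALS form) with a crude |w|-based spot ranking,
    # a simplified stand-in for PLS regression plus variable selection.

    def pls1_one_component(X, y):
        n, p = len(X), len(X[0])
        # centre X and y
        xm = [sum(row[j] for row in X) / n for j in range(p)]
        ym = sum(y) / n
        Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
        yc = [v - ym for v in y]
        # weight vector w proportional to X'y, normalised
        w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
        # scores t = Xw and inner regression coefficient b = t'y / t't
        t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
        tt = sum(v * v for v in t)
        b = sum(t[i] * yc[i] for i in range(n)) / tt
        return w, b

    # Spot volumes for 6 gels x 4 protein spots; only spot 0 tracks y.
    X = [[1.0, 5.1, 0.2, 3.3],
         [2.0, 4.9, 0.1, 3.1],
         [3.0, 5.0, 0.3, 3.2],
         [4.0, 5.2, 0.2, 3.0],
         [5.0, 4.8, 0.1, 3.3],
         [6.0, 5.0, 0.2, 3.1]]
    y = [2.1, 4.0, 6.2, 8.1, 9.9, 12.0]   # trait measured per gel

    w, b = pls1_one_component(X, y)
    ranked = max(range(len(w)), key=lambda j: abs(w[j]))
    print(ranked)  # spot 0 dominates the PLS weight vector
    ```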

  19. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    Science.gov (United States)

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford between April 2013 and April 2014. The resulting order was evaluated empirically and produced acceptable orders. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
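    One plausible reading of the optimisation above is a greedy ordering over a ward-transfer graph that minimises patient flow from converted (new-system) wards back into unconverted (old-system) wards. The flow counts and the greedy rule below are illustrative assumptions, not the authors' published model.

    ```python
    # Greedy roll-out ordering over a ward-transfer graph (illustrative).

    def rollout_order(flow):
        """flow[a][b] = patients transferred from ward a to ward b."""
        remaining = set(flow)
        order = []
        while remaining:
            # cost of converting x now = transfers from x into still-old wards
            nxt = min(remaining,
                      key=lambda x: (sum(flow[x].get(u, 0)
                                         for u in remaining if u != x), x))
            order.append(nxt)
            remaining.remove(nxt)
        return order

    # Patients mostly move Admissions -> Surgery -> Recovery.
    flow = {
        "Admissions": {"Surgery": 120},
        "Surgery": {"Recovery": 90},
        "Recovery": {},
    }
    print(rollout_order(flow))  # downstream wards convert first
    ```

    On this toy graph the greedy rule converts the discharge-end ward first, matching the clinical intuition that downstream areas should go live before the areas that feed them.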

  20. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasi-continuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs
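    Fourier transforming a limited region of the spectrum, as described above, turns a periodic modulation of the spectral intensity into a peak at the corresponding recurrence frequency. A toy sketch with a synthetic spectrum (the 8-cycle modulation is an illustrative assumption, not real atomic data):

    ```python
    # DFT of a synthetic spectrum segment to expose a periodic modulation,
    # in the spirit of extracting periodic-orbit recurrences.
    import cmath, math

    def dft_magnitudes(signal):
        """Plain O(n^2) discrete Fourier transform magnitudes."""
        n = len(signal)
        return [abs(sum(signal[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                        for k in range(n))) for j in range(n)]

    n = 64
    spectrum = [math.cos(2 * math.pi * 8 * k / n) for k in range(n)]
    mags = dft_magnitudes(spectrum)
    peak = max(range(1, n // 2), key=lambda j: mags[j])
    print(peak)  # the dominant modulation: 8 cycles across the window
    ```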

  1. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, PropBank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge and applied to a wide breadth of substance use free-text clinical notes.
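    The F-scores quoted above combine precision and recall in the usual way; a minimal sketch with made-up detection counts (the counts are illustrative, not the study's):

    ```python
    # F1 score from true positives, false positives and false negatives.

    def f_score(tp, fp, fn):
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    # e.g. 90 correct detections, 10 spurious, 11 missed
    print(round(f_score(90, 10, 11), 3))
    ```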

  2. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.

  3. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To more efficiently use GaoFen-1 (GF-1) satellite images for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are mergers of segmented image objects that have similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) from GF-1 and the pre-disaster DEM from GDEM V2. A high mountain landslide that occurred in Wenchuan County, Sichuan Province is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their respective accuracies are 0.89, 0.87 and 0.95. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
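    The displaced-volume figure above comes from differencing the pre- and post-event elevation models over the landslide area. A minimal sketch, assuming a tiny grid and a hypothetical 10 m cell size (100 m2 per cell); the elevations are illustrative, not the study's DEMs.

    ```python
    # Displaced earth volume from DEM differencing (illustrative values).

    def displaced_volume(dem_pre, dem_post, cell_area_m2):
        """Sum of |elevation change| x cell area over the landslide mask."""
        total = 0.0
        for row_pre, row_post in zip(dem_pre, dem_post):
            for z_pre, z_post in zip(row_pre, row_post):
                total += abs(z_post - z_pre) * cell_area_m2
        return total

    dem_pre  = [[852.0, 850.0], [848.0, 846.0]]   # pre-disaster elevations (m)
    dem_post = [[849.5, 848.0], [850.0, 846.5]]   # post-disaster elevations (m)
    print(displaced_volume(dem_pre, dem_post, cell_area_m2=100.0))  # m3
    ```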

  4. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  5. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  6. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  7. Strategy for Extracting DNA from Clay Soil and Detecting a Specific Target Sequence via Selective Enrichment and Real-Time (Quantitative) PCR Amplification

    Science.gov (United States)

    Yankson, Kweku K.; Steck, Todd R.

    2009-01-01

    We present a simple strategy for isolating and accurately enumerating target DNA from high-clay-content soils: desorption with buffers, an optional magnetic capture hybridization step, and quantitation via real-time PCR. With the developed technique, μg quantities of DNA were extracted from mg samples of pure kaolinite and a field clay soil. PMID:19633108
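    Enumeration by real-time PCR, as used above, typically inverts a log-linear standard curve relating threshold cycle (Ct) to starting copy number. The slope and intercept below are illustrative assumptions (a near-ideal 10-fold dilution slope), not the study's calibration.

    ```python
    # Inverting a qPCR standard curve: Ct = slope * log10(copies) + intercept.

    def copies_from_ct(ct, slope=-3.32, intercept=38.0):
        """Estimated starting copy number for an observed threshold cycle."""
        return 10 ** ((ct - intercept) / slope)

    print(round(copies_from_ct(31.36)))  # a Ct of 31.36 -> about 100 copies
    ```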

  8. Extracting of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with the phonetic and lexical-morphological language means that participate in the process of extracting implicit information from English-language advertising texts addressed to men and women. Phonetic means of the English language do not by themselves provide a basis for the implication of information in advertising texts. Lexical and morphological means act as markers of relevant information, playing the role of activators of implicit information in the texts of advertising.

  9. Investigation of UO2 as an accelerator for quantitative extraction of F- and Cl- in ThO2 and sintered ThO2

    International Nuclear Information System (INIS)

    Pandey, Ashish; Fulzele, Ajit; Das, D.K.; Prakash, Amrit; Behere, P.G.; Afzal, Mohd

    2013-01-01

    This paper presents UO2 as an effective accelerator for the quantitative extraction of F- and Cl- from ThO2 and sintered ThO2. Thoria requires a higher temperature to lose its structural integrity and release halides. Samples composed of UO2 and ThO2, or UO2 and sintered ThO2, give quantitative yields of F- and Cl- even at lower temperature. The accelerator amount and pyrohydrolysis conditions were optimized. The pyrohydrolyzate was analyzed for F- and Cl- by ISE. The limit of detection was 1 μg/g in the samples, with good recovery (95%) and relative standard deviation less than 5%. (author)

  10. Optimization and automation of quantitative NMR data extraction.

    Science.gov (United States)

    Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos

    2013-06-18

    NMR is routinely used to quantitate chemical species. The necessary experimental procedures to acquire quantitative data are well known, but relatively little attention has been paid to data processing and analysis. We describe here a robust expert system that can be used to automatically choose the best signals in a sample for overall concentration determination and to determine analyte concentration using all accepted methods. The algorithm is based on the complete deconvolution of the spectrum, which makes it tolerant of cases where signals are very close to one another, and includes robust methods for the automatic classification of NMR resonances and molecule-to-spectrum multiplet assignments. With the functionality in place and optimized, it is then a relatively simple matter to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and applicability to NMR data acquired for very large sample sets.

  11. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been complemented by the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point-cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on `Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.
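    Before intensity can be used for classification as described above, returns are commonly normalised for range. A minimal sketch of a spherical-loss (range-squared) correction; the reference range and the sample returns are illustrative assumptions, and operational pipelines add further atmospheric and incidence-angle terms.

    ```python
    # Range normalisation of LiDAR intensity (illustrative values).

    def range_corrected_intensity(intensity, range_m, ref_range_m=1000.0):
        """Scale raw intensity to a common reference range (R^2 loss model)."""
        return intensity * (range_m / ref_range_m) ** 2

    returns = [(1200.0, 500.0), (1200.0, 1000.0), (1200.0, 2000.0)]
    for raw, rng in returns:
        print(rng, range_corrected_intensity(raw, rng))
    ```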

  12. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been complemented by the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point-cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented, and the data sets delivered so far are described.

  13. Quantitative radiomics studies for tissue characterization: a review of technology and methodological procedures.

    Science.gov (United States)

    Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter

    2017-02-01

    Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
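    Radiomics feature extraction of the kind discussed above starts from simple first-order statistics of the segmented region. The toy voxel intensities and the 4-bin histogram below are illustrative assumptions; they also show why binning choices (a feature-extraction setting) change entropy values and need calibration.

    ```python
    # Two first-order radiomics features: mean and histogram entropy.
    import math
    from collections import Counter

    def first_order_features(voxels, n_bins=4, lo=0, hi=16):
        mean = sum(voxels) / len(voxels)
        width = (hi - lo) / n_bins
        counts = Counter(min(int((v - lo) / width), n_bins - 1) for v in voxels)
        n = len(voxels)
        entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
        return mean, entropy

    roi = [1, 2, 2, 3, 7, 8, 9, 14]   # segmented tumour voxel intensities
    mean, entropy = first_order_features(roi)
    print(round(mean, 2), round(entropy, 3))
    ```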

  14. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract. Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be run independently in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  15. Extraction of Trivalent Actinides and Lanthanides from Californium Campaign Rework Solution Using TODGA-based Solvent Extraction System

    Energy Technology Data Exchange (ETDEWEB)

    Benker, Dennis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Delmau, Laetitia Helene [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dryman, Joshua Cory [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-07-01

    This report presents the studies carried out to demonstrate the possibility of quantitatively extracting trivalent actinides and lanthanides from highly acidic solutions using a neutral ligand-based solvent extraction system. These studies stemmed from the perceived advantage of such systems over cation-exchange-based solvent extraction systems, which require an extensive feed adjustment to make a low-acid feed. The targeted feed solutions are highly acidic aqueous phases obtained after the dissolution of curium targets during a californium (Cf) campaign. Results obtained with actual Cf campaign solutions, but highly diluted to be manageable in a glove box, are presented, followed by results of tests run in the hot cells with Cf campaign rework solutions. It was demonstrated that a solvent extraction system based on the tetraoctyl diglycolamide molecule is capable of quantitatively extracting trivalent actinides from highly acidic solutions. This system was validated using actual feeds from a Cf campaign.

  16. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages.

    Directory of Open Access Journals (Sweden)

    Jian Yuan

    Full Text Available Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction while preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell-covering rigidities. We found that the bead-beating method enhanced the final yield of DNA (up to 2-fold) in comparison with the non-bead-beating method, while also preserving DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method can obtain intact DNA from organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction from phytoplankton and for environmental surveys of their diversity and abundance.

  17. A study on the quantitative model of human response time using the amount and the similarity of information

    International Nuclear Information System (INIS)

    Lee, Sung Jin

    2006-02-01

    The mental capacity to retain or recall information, or memory, is related to human performance during information processing. Although a large number of studies have been carried out on human performance, little is known about the similarity effect. The purpose of this study was to propose and validate a quantitative and predictive model of human response time in the user interface, built on the basic concepts of information amount, similarity, and degree of practice. Human performance is difficult to explain by similarity or information amount alone. There were two challenges: constructing a quantitative model of human response time, and validating the proposed model experimentally. A quantitative model based on Hick's law, the law of practice, and similarity theory was developed. The model was validated under various experimental conditions by measuring the participants' response times in a computer-based display environment. Human performance in the user interface improved with degree of similarity and practice. We also found an age-related effect: performance degraded with increasing age. The proposed model may be useful for training operators who will handle such interfaces and for predicting human performance under changes in system design.
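    The Hick's-law ingredient of such a response-time model can be sketched directly: response time grows with the information content log2(n + 1) of n equally likely alternatives. The coefficients a and b below are illustrative assumptions, not the study's fitted values.

    ```python
    # Hick's law: RT = a + b * log2(n + 1) for n equally likely choices.
    import math

    def hick_response_time(n_alternatives, a=0.2, b=0.15):
        """Predicted response time in seconds (a, b are assumed coefficients)."""
        return a + b * math.log2(n_alternatives + 1)

    for n in (1, 3, 7):
        print(n, round(hick_response_time(n), 3))
    ```

    A full model along the lines described above would add a practice term (response time falling as a power law of trials) and a similarity term on top of this baseline.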

  18. Counter current extraction for the partitioning of actinides from PFBR-SHLW using TODGA

    International Nuclear Information System (INIS)

    Ansari, S.A.; Gujar, R.B.; Kumar, Mithilesh; Seshagiri, T.K.; Godbole, S.V.; Manchanda, V.K.; Rajeswari, S.; Antony, M.P.; Srinivasan, T.G.

    2009-01-01

    Counter-current extraction for the partitioning of actinides from simulated HLW of PFBR origin was demonstrated with 0.1M TODGA + 0.5M DHOA dissolved in NPH using a 16-stage mixer-settler unit. Results demonstrated that all lanthanides could be quantitatively extracted from the feed solution and quantitatively stripped from the loaded organic phase with 0.2M HNO3. The extracted lanthanides are not scrubbed by 0.2M oxalic acid at 5M HNO3 in the scrubbing cycle. Elements such as Ba, Cd and Sn were not extracted. However, Pd was partially extracted but was scrubbed quantitatively. (author)
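    The near-quantitative recovery across 16 stages can be rationalised with a simple stage-wise model. For brevity this sketch treats the contacts as repeated cross-current equilibrations, each leaving a fraction 1/(1 + E) in the aqueous phase (the mixer-settler is actually counter-current); the distribution ratio and phase ratio are illustrative assumptions, not measured TODGA values.

    ```python
    # Stage-wise extraction model (cross-current simplification).

    def fraction_remaining(distribution_ratio, phase_ratio, n_stages):
        e = distribution_ratio * phase_ratio     # extraction factor
        return (1.0 / (1.0 + e)) ** n_stages

    # Assumed D = 5, organic/aqueous phase ratio = 1, 16 stages
    left = fraction_remaining(5.0, 1.0, 16)
    print(f"{(1 - left) * 100:.6f}% extracted")
    ```

    Even with a modest per-stage extraction factor, sixteen stages leave a vanishingly small fraction in the aqueous phase, which is why multi-stage contactors achieve quantitative extraction.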

  19. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  20. Quantitative analysis of ultrasound B-mode images of carotid atherosclerotic plaque: correlation with visual classification and histological examination

    DEFF Research Database (Denmark)

    Wilhjelm, Jens E.; Grønholdt, Marie-Louise; Wiebe, Brit

    1998-01-01

    This paper presents a quantitative comparison of three types of information available for 52 patients scheduled for carotid endarterectomy: subjective classification of the ultrasound images obtained during scanning before operation, first- and second-order statistical features extracted from regions of the plaque in still ultrasound images from three orthogonal scan planes, and finally a histological analysis of the surgically removed plaque. The quantitative comparison was made with the linear model and with separation of the available data into training and test sets. The comparison...

  1. From remote sensing data about information extraction for 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements, aimed at the implementation of environmental or conservation targets, management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency if large volumes of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, considering the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  2. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. Discussion: With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.

  3. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the Deutsche Forschungsgemeinschaft (DFG) approved the Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program. Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance. Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges. Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  4. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Indoor Spatial Data Infrastructure (indoor-SDI) is an important SDI for geospatial analysis and location-based services. Building Information Models (BIM) carry a high degree of geometric and semantic detail for buildings. This study proposed direct conversion schemes to extract indoor building information from BIM to OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model; and (2) generation of IndoorGML. The topological conversion is a major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents every space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions of IndoorGML are the same as in the indoor network. Therefore, we can extract the necessary data in the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicated that the 3D indoor model (i.e. the IndoorGML model) can be automatically imported from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor-SDI.
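The node/edge mapping described above can be sketched in a few lines (a minimal sketch with a hypothetical data model, not the actual IFC or IndoorGML schemas): each space becomes a node and each door becomes an edge between the two spaces it connects.

```python
def build_indoor_network(spaces, doors):
    """Derive a dual-space network: spaces -> nodes, doors -> edges.

    spaces: iterable of space identifiers (standing in for IfcSpace).
    doors: iterable of (space_a, space_b) pairs (standing in for IfcDoor).
    """
    nodes = set(spaces)
    # Keep only edges whose endpoints are known spaces.
    edges = [(a, b) for a, b in doors if a in nodes and b in nodes]
    return nodes, edges

# Hypothetical three-space building: two rooms joined by a corridor.
nodes, edges = build_indoor_network(
    ["Room101", "Room102", "Corridor"],
    [("Room101", "Corridor"), ("Room102", "Corridor")],
)
```

Serializing `nodes` and `edges` into IndoorGML elements would then be a direct mapping, since IndoorGML's dual-space topology is itself defined as a set of nodes and edges.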

  5. Current extraction and separation of uranium, thorium and rare earths elements from monazite leach solution using organophosphorous extractants

    International Nuclear Information System (INIS)

    Biswas, Sujoy; Rupawate, V.H.; Hareendran, K.N.; Roy, S.B.

    2014-01-01

    A new process based on solvent extraction has been developed for the separation of uranium, thorium and rare earths from monazite leach solution using organophosphorous extractants. The thorium cake coming from the monazite source was dissolved in HNO3 medium in the presence of a trace amount of HF for feed preparation. The separation of U(VI) was carried out by liquid-liquid extraction using tris-2-ethylhexyl phosphoric acid (TEHP) in dodecane, leaving thorium and rare earth elements in the raffinate. The thorium in the raffinate was then selectively extracted into the organic phase using 1 M tri-iso-amyl phosphate (TiAP) in dodecane, leaving all rare earth elements in the aqueous solution. Uranium and thorium were quantitatively stripped from the organic medium using 0.05 M HNO3 in counter-current mode. Results indicate the quantitative separation of uranium, thorium and rare earths from thorium cake (monazite source) using organophosphorous extractants in counter-current mode

  6. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora.

  7. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
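As a toy illustration of the kind of signature information MedEx extracts (this is not the MedEx implementation; the pattern and field names are our own simplification, and a real system handles far messier text), a single regular expression can pull drug name, strength, route, and frequency from a well-formed medication sentence:

```python
import re

# Hypothetical simplified signature pattern: drug, strength, route, frequency.
# Real clinical text needs a full NLP pipeline, as the abstract describes.
SIG = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+(?P<strength>\d+\s?mg)\s+"
    r"(?P<route>po|iv|im)\s+(?P<freq>bid|tid|qd|daily)",
    re.IGNORECASE,
)

def extract_sig(text):
    """Return the first medication signature found, or None."""
    m = SIG.search(text)
    return m.groupdict() if m else None

sig = extract_sig("Start lisinopril 10 mg po daily for hypertension.")
```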

  8. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, it will make it possible to collect not only atomic and molecular data but also broader scientific information over a wide range of research fields. (author)

  9. Mapping face recognition information use across cultures

    Directory of Open Access Journals (Sweden)

    Sébastien eMiellet

    2013-02-01

    Face recognition is not rooted in a universal eye movement information-gathering strategy. Western observers favor a local facial feature sampling strategy, whereas Eastern observers prefer sampling face information from a global, central fixation strategy. Yet the precise qualitative (the diagnostic) and quantitative (the amount of) information underlying these cultural perceptual biases in face recognition remains undetermined. To this end, we monitored the eye movements of Western and Eastern observers during a face recognition task with a novel gaze-contingent technique: the Expanding Spotlight. We used 2° Gaussian apertures centered on the observers' fixations, expanding dynamically at a rate of 1° every 25 ms at each fixation; the longer the fixation duration, the larger the aperture size. Identity-specific face information was only displayed within the Gaussian aperture; outside the aperture, an average face template was displayed to facilitate saccade planning. Thus, the Expanding Spotlight simultaneously maps out the facial information span at each fixation location. Data obtained with the Expanding Spotlight technique confirmed that Westerners extract more information from the eye region, whereas Easterners extract more information from the nose region. Interestingly, this quantitative difference was paired with a qualitative disparity. Retinal filters based on spatial frequency decomposition built from the fixation maps revealed that Westerners used local high-spatial-frequency information sampling, covering all the features critical for effective face recognition (the eyes and the mouth). In contrast, Easterners achieved a similar result by using global low-spatial-frequency information from those facial features. Our data show that the face system flexibly engages in local or global eye movement strategies across cultures, relying on distinct facial information spans and culturally tuned spatially filtered information. Overall, our
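The aperture-growth rule of the Expanding Spotlight (a 2° base expanding 1° every 25 ms of fixation) can be written as a one-line function; the linear form and the function name are our own sketch of the rule as described:

```python
def aperture_diameter(fixation_ms, base_deg=2.0, rate_deg=1.0, step_ms=25.0):
    """Aperture size in degrees of visual angle after a fixation of
    fixation_ms milliseconds: 2 deg base plus 1 deg per 25 ms elapsed."""
    return base_deg + rate_deg * (fixation_ms / step_ms)

# Under this rule, a 250 ms fixation yields a 12 deg aperture.
d = aperture_diameter(250)
```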

  10. Extraction and quantitation of furanic compounds dissolved in oils

    International Nuclear Information System (INIS)

    Koreh, O.; Torkos, K.; Mahara, M.B.; Borossay, J.

    1998-01-01

    Furans are amongst the decomposition products generated by the degradation of the cellulose in paper. Paper insulation is used in capacitors, cables and transformers. These furans dissolve in the impregnating mineral oil, and a method, involving liquid/liquid extraction, solid phase extraction and high performance liquid chromatography, has been developed to determine the concentration of 2-furfural, the most stable of these compounds, in oil. The degradation of paper is being examined in order to find a correlation between the change in dielectric and mechanical properties and the increase in the concentration of 2-furfural in the oil. (author)

  11. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
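The quantitation step rests on the Beer-Lambert-Bouguer additivity the abstract invokes: a mixture spectrum is a concentration-weighted sum of component spectra. As a simplified stand-in for the ICA resolution (here the component spectra are assumed to be already resolved, and all numbers are invented), recovering concentrations becomes a least-squares problem:

```python
import numpy as np

# Columns of S: spectra of two resolved components sampled at three
# wavelengths (hypothetical values, not real UV-vis data).
S = np.array([[0.9, 0.1],
              [0.5, 0.6],
              [0.1, 0.8]])
c_true = np.array([2.0, 3.0])        # "unknown" concentrations
mixture = S @ c_true                 # Beer-Lambert additivity
# Recover concentrations without reference solutions for the mixture.
c_est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
```

In the actual method, ICA supplies both the component profiles and the relative contributions directly from the measured profiles; this sketch only shows why overlapping signals are not an obstacle once the pure profiles are known.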

  12. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.

  13. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network-collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial ... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore the extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...

  14. MetaFluxNet: the management of metabolic reaction information and quantitative metabolic flux analysis.

    Science.gov (United States)

    Lee, Dong-Yup; Yun, Hongsoek; Park, Sunwon; Lee, Sang Yup

    2003-11-01

    MetaFluxNet is a program package for managing information on metabolic reaction networks and for quantitatively analyzing metabolic fluxes in an interactive and customized way. It allows users to interpret and examine metabolic behavior in response to genetic and/or environmental modifications. As a result, quantitative in silico simulations of metabolic pathways can be carried out to understand the metabolic status and to design metabolic engineering strategies. The main features of the program include a well-developed model construction environment, a user-friendly interface for metabolic flux analysis (MFA), comparative MFA of strains having different genotypes under various environmental conditions, and automated pathway layout creation. MetaFluxNet is available at http://mbel.kaist.ac.kr/; a manual is available as a PDF file.
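The flux computation at the heart of MFA can be sketched as a steady-state balance S·v = 0 combined with measured fluxes (the toy network and numbers below are hypothetical and unrelated to MetaFluxNet's own file formats or interface):

```python
import numpy as np

# Toy network: v1 (uptake of A), v2 (A -> B), v3 (secretion of B).
# At steady state each internal metabolite balance is zero.
S = np.array([[1, -1, 0],     # balance on metabolite A
              [0, 1, -1]])    # balance on metabolite B
measured = np.array([[1, 0, 0]])     # the uptake flux v1 is measured
A = np.vstack([S, measured])
b = np.array([0.0, 0.0, 5.0])        # v1 = 5 (e.g. mmol/gDW/h, invented)
# Solve the combined system for all fluxes by least squares.
v, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With one measured flux, this linear chain is fully determined; real networks are larger and often underdetermined, which is where the comparative and interactive analysis described above comes in.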

  15. Pungency Quantitation of Hot Pepper Sauces Using HPLC

    Science.gov (United States)

    Betts, Thomas A.

    1999-02-01

    A class of compounds known as capsaicinoids is responsible for the "heat" of hot peppers. To determine the pungency of a particular pepper or pepper product, one may quantify the capsaicinoids and relate those concentrations to the perceived heat. The format of the laboratory described here allows students to collectively develop an HPLC method for the quantitation of the two predominant capsaicinoids (capsaicin and dihydrocapsaicin) in hot-pepper products. Each small group of students investigated one of the following aspects of the method: detector wavelength, mobile-phase composition, extraction of capsaicinoids, calibration, and quantitation. The format of the lab forced students to communicate and cooperate to develop the method. The resulting HPLC method involves extraction with acetonitrile followed by solid-phase extraction clean-up, an isocratic 80:20 methanol-water mobile phase, a 4.6 mm by 25 cm C-18 column, and UV absorbance detection at 284 nm. The method developed by the students was then applied to the quantitation of capsaicinoids in a variety of hot pepper sauces. An Editor's Note on Hazards in our April 2000 issue addresses the above.
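The calibration and quantitation steps the students developed amount to external-standard calibration: fit detector response against standard concentrations, then invert the fit for unknowns. A minimal sketch (all numbers are illustrative, not the published values):

```python
import numpy as np

# Hypothetical capsaicin standards (ug/mL) and their HPLC peak areas.
conc = np.array([5.0, 10.0, 20.0, 40.0])
area = np.array([50.0, 100.0, 200.0, 400.0])
slope, intercept = np.polyfit(conc, area, 1)   # linear calibration curve

def quantify(peak_area):
    """Invert the calibration line to get concentration from peak area."""
    return (peak_area - intercept) / slope

# An unknown sauce extract with a peak area of 150 units.
c_unknown = quantify(150.0)
```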

  16. Quantitative determination of 1,4-dioxane and tetrahydrofuran in groundwater by solid phase extraction GC/MS/MS.

    Science.gov (United States)

    Isaacson, Carl; Mohr, Thomas K G; Field, Jennifer A

    2006-12-01

    Groundwater contamination by the cyclic ethers 1,4-dioxane (dioxane), a probable human carcinogen, and tetrahydrofuran (THF), a co-contaminant at many chlorinated solvent release sites, is a growing concern. Cyclic ethers are readily transported in groundwater, yet little is known about their fate in environmental systems. High water solubility coupled with low Henry's law constants and octanol-water partition coefficients makes their removal from groundwater problematic for both remedial and analytical purposes. A solid-phase extraction (SPE) method based on activated carbon disks was developed for the quantitative determination of dioxane and THF. The method requires 80 mL samples and a total of 1.2 mL of solvent (acetone). The number of steps is minimized due to the "in-vial" elution of the disks. Average recoveries for dioxane and THF were 98% and 95%, respectively, with precision, as indicated by the relative standard deviation, of <2% to 6%. The method quantitation limits are 0.31 microg/L for dioxane and 3.1 microg/L for THF. The method was demonstrated by analyzing groundwater samples for dioxane and THF collected during a single sampling campaign at a TCA-impacted site. Dioxane concentrations and the areal extent of dioxane in groundwater were greater than those of either TCA or THF.
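The recovery and precision figures quoted above are computed in the standard way; a minimal sketch with invented replicate values (the paper reports ~98% recovery and <2% to 6% RSD for dioxane):

```python
import statistics

spiked = 10.0                          # ug/L dioxane added (hypothetical)
measured = [9.7, 9.9, 9.8, 9.6, 10.0]  # replicate results (hypothetical)

# Percent recovery: mean measured concentration over the spiked amount.
recovery = 100.0 * statistics.mean(measured) / spiked
# Relative standard deviation: sample stdev as a percent of the mean.
rsd = 100.0 * statistics.stdev(measured) / statistics.mean(measured)
```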

  17. Basil (Ocimum basilicum L. essential oil and extracts obtained by supercritical fluid extraction

    Directory of Open Access Journals (Sweden)

    Zeković Zoran P.

    2015-01-01

    The extracts obtained from sweet basil (Ocimum basilicum L.) by hydrodistillation and supercritical fluid extraction (SFE) were qualitatively and quantitatively analyzed by GC-MS and GC-FID. The essential oil (EO) content of the basil sample, determined by an official method, was 0.565% (V/w). The yields of basil obtained by SFE were from 0.719 to 1.483% (w/w), depending on the supercritical fluid (carbon dioxide) density (from 0.378 to 0.929 g mL-1). The dominant compounds detected in all investigated samples (the EO obtained by hydrodistillation and the different SFE extracts) were linalool, the major compound of the basil EO (content from 10.14 to 49.79%, w/w), eugenol (from 3.74 to 9.78%) and δ-cadinene (from 3.94 to 8.07%). The quantitative results of GC-MS from peak areas and of GC-FID using the external standard method involving main standards were compared and discussed. [Project of the Ministry of Science of the Republic of Serbia, No. TR 31013]

  18. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  19. Separation of thorium from lanthanides by solvent extraction with ionizable crown ethers.

    Science.gov (United States)

    Du, H S; Wood, D J; Elshani, S; Wai, C M

    1993-02-01

    Thorium and the lanthanides are extracted by alpha-(sym-dibenzo-16-crown-5-oxy)acetic acid and its analogues in different pH ranges. At pH 4.5, Th is quantitatively extracted by the crown ether carboxylic acids into chloroform whereas the extraction of the lanthanides is negligible. Separation of Th from the lanthanides can be achieved by solvent extraction under this condition. The extraction does not require specific counteranions and is reversible with respect to pH. Trace amounts of Th in water can be quantitatively recovered using this extraction system for neutron activation analysis. The nature of the extracted Th complex and the mechanism of extraction are discussed.

  20. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders to better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal that was recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by a simple time domain filtering and frequency domain Fourier analysis.
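The frequency-domain step described above can be sketched with a plain FFT on a synthetic respiration-modulated signal (the real pipeline first applies time-domain filtering to the photoplethysmogram; the signal parameters here are invented):

```python
import numpy as np

fs = 50.0                                  # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)               # 60 s recording window
# Synthetic respiratory modulation at 0.25 Hz (15 breaths per minute).
ppg = np.sin(2 * np.pi * 0.25 * t)

spectrum = np.abs(np.fft.rfft(ppg))
freqs = np.fft.rfftfreq(len(ppg), 1 / fs)
# Dominant non-DC spectral peak, converted to breaths per minute.
breaths_per_min = 60.0 * freqs[np.argmax(spectrum[1:]) + 1]
```

On real forehead-sensor data, the respiratory component rides on the much stronger cardiac pulsation, so the band of interest must be isolated before peak-picking.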

  1. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels; 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (identifier PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allow defining a proper number of replicates for future quantitative analyses.

  2. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying

  3. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from a time-frequency representation: spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws' masks to characterize the emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB) to evaluate cross-corpus performance. The results of the proposed ESS system are presented using a support vector machine (SVM) as the classifier. Experimental results show that the proposed TII-based feature extraction, inspired by visual perception, can provide significant classification performance for ESS systems. The two-dimensional (2-D) TII feature can discriminate between different emotions beyond what pitch and formant tracks convey. In addition, de-noising in 2-D images can be completed more easily than de-noising in 1-D speech.
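Laws' masks, used above to extract the texture image information, are built as outer products of short 1-D level/edge/spot vectors. A minimal sketch of one mask and a texture-energy measure (the input image here is a synthetic placeholder, not a spectrogram):

```python
import numpy as np

# Standard 5-tap Laws vectors: L5 (level) and E5 (edge).
L5 = np.array([1, 4, 6, 4, 1])
E5 = np.array([-1, -2, 0, 2, 1])
L5E5 = np.outer(L5, E5)           # one of the 5x5 Laws masks

def texture_energy(image, mask):
    """Mean absolute response of a valid (no-padding) 2-D correlation."""
    h, w = mask.shape
    H, W = image.shape
    out = np.empty((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * mask)
    return np.mean(np.abs(out))

# A perfectly flat image carries no edge texture, so its E5-based
# energy is zero (E5 sums to zero, so the mask rejects constants).
energy = texture_energy(np.ones((8, 8)), L5E5)
```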

  4. Separation of thorium from lanthanides by solvent extraction with ionizable crown ethers

    International Nuclear Information System (INIS)

    Du, H.S.; Wood, D.J.; Elshani, Sadik; Wai, C.M.

    1993-01-01

    Thorium and the lanthanides are extracted by α-(sym-dibenzo-16-crown-5-oxy)acetic acid and its analogues in different pH ranges. At pH 4.5, Th is quantitatively extracted by the crown ether carboxylic acids into chloroform whereas the extraction of the lanthanides is negligible. Separation of Th from the lanthanides can be achieved by solvent extraction under this condition. The extraction does not require specific counteranions and is reversible with respect to pH. Trace amounts of Th in water can be quantitatively recovered using this extraction system for neutron activation analysis. The nature of the extracted Th complex and the mechanism of extraction are discussed. (author)

  5. Parent experiences and information needs relating to procedural pain in children: a systematic review protocol.

    Science.gov (United States)

    Gates, Allison; Shave, Kassi; Featherstone, Robin; Buckreus, Kelli; Ali, Samina; Scott, Shannon; Hartling, Lisa

    2017-06-06

    There exist many evidence-based interventions available to manage procedural pain in children and neonates, yet they are severely underutilized. Parents play an important role in the management of their child's pain; however, many do not possess adequate knowledge of how to effectively do so. The purpose of the planned study is to systematically review and synthesize current knowledge of the experiences and information needs of parents with regard to the management of their child's pain and distress related to medical procedures in the emergency department. We will conduct a systematic review using rigorous methods and reporting based on the PRISMA statement. We will conduct a comprehensive search of literature published between 2000 and 2016 reporting on parents' experiences and information needs with regard to helping their child manage procedural pain and distress. Ovid MEDLINE, Ovid PsycINFO, CINAHL, and PubMed will be searched. We will also search reference lists of key studies and gray literature sources. Two reviewers will screen the articles following inclusion criteria defined a priori. One reviewer will then extract the data from each article following a data extraction form developed by the study team. The second reviewer will check the data extraction for accuracy and completeness. Any disagreements with regard to study inclusion or data extraction will be resolved via discussion. Data from qualitative studies will be summarized thematically, while those from quantitative studies will be summarized narratively. The second reviewer will confirm the overarching themes resulting from the qualitative and quantitative data syntheses. The Critical Appraisal Skills Programme Qualitative Research Checklist and the Quality Assessment Tool for Quantitative Studies will be used to assess the quality of the evidence from each included study. 
To our knowledge, no published review exists that comprehensively reports on the experiences and information needs of parents in this context.

  6. Quantitative ultrasound and photoacoustic imaging for the assessment of vascular parameters

    CERN Document Server

    Meiburger, Kristen M

    2017-01-01

    This book describes the development of quantitative techniques for ultrasound and photoacoustic imaging in the assessment of architectural and vascular parameters. It presents morphological vascular research based on the development of quantitative imaging techniques for the use of clinical B-mode ultrasound images, and preclinical architectural vascular investigations on quantitative imaging techniques for ultrasounds and photoacoustics. The book is divided into two main parts, the first of which focuses on the development and validation of quantitative techniques for the assessment of vascular morphological parameters that can be extracted from B-mode ultrasound longitudinal images of the common carotid artery. In turn, the second part highlights quantitative imaging techniques for assessing the architectural parameters of vasculature that can be extracted from 3D volumes, using both contrast-enhanced ultrasound (CEUS) imaging and photoacoustic imaging without the addition of any contrast agent. Sharing and...

  7. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
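YAdumper itself is a Java tool driven by an XML template and an external DTD; as a rough illustration of the underlying task only — serializing relational rows into a structured flat file — here is a minimal stdlib-only Python sketch (the table and column names are invented):

```python
import sqlite3
import xml.etree.ElementTree as ET

def dump_table_to_xml(conn, table, root_tag="records", row_tag="record"):
    """Stream the rows of `table` into an XML tree; column names become child tags."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    root = ET.Element(root_tag)
    for row in cur:
        rec = ET.SubElement(root, row_tag)
        for col, val in zip(cols, row):
            ET.SubElement(rec, col).text = "" if val is None else str(val)
    return ET.tostring(root, encoding="unicode")

# Invented toy schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE protein (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO protein VALUES (?, ?)", [(1, "p53"), (2, "BRCA1")])
xml_doc = dump_table_to_xml(conn, "protein")
print(xml_doc)
```

A real dumper would stream to disk rather than build the whole tree in memory — which is precisely the writing-performance concern the abstract raises.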

  8. Historical Origins of Information Behavior Research by Reference Publication Year Spectroscopy

    Directory of Open Access Journals (Sweden)

    Faramarz Soheili

    2015-12-01

    Background: Using a quantitative method named reference publication year spectroscopy (RPYS), this research tried to determine the historical roots of information behavior research. RPYS paves the way for determining the significant years and works in information behavior. Methodology: Using a scientometric method, the initial data of this study were extracted from the Web of Science. Using RPYS software, the revised data were analyzed and visualized in Excel. Findings: The distribution of cited references in information behavior revealed three peaks within the 19th century. Moreover, our analysis identified six peaks between 1900 and 1969 in the field of information behavior, occurring in 1948, 1954, 1957, 1960, 1965, and 1967. Results: Based on the study findings, it seems that information behavior research has been shaped intellectually by fields such as psychology and by quantitative and qualitative methodologies. Additionally, it has been influenced by some theories and theoretical works.
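The core of RPYS can be sketched in a few lines: count cited references per publication year, then look for years whose counts stand out against their neighbourhood. The peak rule below (count above the median of nearby years) is a simplification of the deviation-from-median approach used in RPYS tools, and the citation data are invented:

```python
from collections import Counter

def rpys_spectrum(cited_years):
    """Count cited references per publication year (the RPYS spectrum)."""
    return Counter(cited_years)

def find_peaks(spectrum, window=2):
    """A year is a peak if its count exceeds the median count of its neighbours.
    Simplified stand-in for the deviation-from-median rule of RPYS tools."""
    peaks = []
    for y in sorted(spectrum):
        neighbours = [spectrum.get(y + d, 0)
                      for d in range(-window, window + 1) if d != 0]
        med = sorted(neighbours)[len(neighbours) // 2]
        if spectrum[y] > med and spectrum[y] >= 2:
            peaks.append(y)
    return peaks

# Invented citation years with bursts at 1948 and 1957.
cited = [1948] * 8 + [1947, 1949, 1950] + [1957] * 6 + [1956, 1958] + list(range(1960, 1970))
spec = rpys_spectrum(cited)
print(find_peaks(spec))  # → [1948, 1957]
```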

  9. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important that the National Council of Teachers of Mathematics has created guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of... graphical perception.

  11. Wires in the soup: quantitative models of cell signaling

    Science.gov (United States)

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and their unraveling requires sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  12. New generation quantitative x-ray microscopy encompassing phase-contrast

    International Nuclear Information System (INIS)

    Wilkins, S.W.; Mayo, S.C.; Gureyev, T.E.; Miller, P.R.; Pogany, A.; Stevenson, A.W.; Gao, D.; Davis, T.J.; Parry, D.J.; Paganin, D.

    2000-01-01

    We briefly outline a new approach to X-ray ultramicroscopy using projection imaging in a scanning electron microscope (SEM). Compared to earlier approaches, the new approach offers spatial resolution of ≤0.1 micron and includes novel features such as: i) phase contrast to give additional sample information over a wide energy range, and ii) rapid phase/amplitude extraction algorithms to enable new real-time modes of microscopic imaging. Widespread applications are envisaged in fields such as materials science, biomedical research, and microelectronics device inspection. Some illustrative examples are presented. The quantitative methods described here are also very relevant to X-ray projection microscopy using synchrotron sources

  13. Quantitative nanoscale surface voltage measurement on organic semiconductor blends

    International Nuclear Information System (INIS)

    Cuenat, Alexandre; Muñiz-Piniella, Andrés; Muñoz-Rojo, Miguel; Murphy, Craig E; Tsoi, Wing C

    2012-01-01

    We report on the validation of a method based on Kelvin probe force microscopy (KPFM) able to measure the different phases and the relative work function of polymer blend heterojunctions at the nanoscale. The method does not require a complex ultra-high-vacuum setup. The quantitative information that can be extracted from the topography and the Kelvin probe measurements is critically analysed. Surface voltage differences can be observed at the nanoscale on poly(3-hexyl-thiophene):[6,6]-phenyl-C61-butyric acid methyl ester (P3HT:PCBM) blends, and a dependence on the annealing conditions and the regio-regularity of P3HT is observed.

  14. Quantitative and Qualitative Analysis of Nutrition and Food Safety Information in School Science Textbooks of India

    Science.gov (United States)

    Subba Rao, G. M.; Vijayapushapm, T.; Venkaiah, K.; Pavarala, V.

    2012-01-01

    Objective: To assess quantity and quality of nutrition and food safety information in science textbooks prescribed by the Central Board of Secondary Education (CBSE), India for grades I through X. Design: Content analysis. Methods: A coding scheme was developed for quantitative and qualitative analyses. Two investigators independently coded the…

  15. Quantitative analysis of gender stereotypes and information aggregation in a national election.

    Directory of Open Access Journals (Sweden)

    Michele Tumminello

    By analyzing a database of a questionnaire answered by a large majority of the candidates and elected members in a parliamentary election, we quantitatively verify that (i) female candidates on average present political profiles which are more compassionate and more concerned with social welfare issues than male candidates, and (ii) the voting procedure acts as a process of information aggregation. Our results show that information aggregation proceeds along at least two distinct paths. In the first case, candidates characterize themselves with a political profile aiming to describe the profile of the majority of voters. This is typically the case for candidates of political parties competing for the center of the various political dimensions. In the second case, candidates choose a political profile manifesting a clear difference from the opposite political profiles endorsed by candidates of a political party positioned at the opposite extreme of some political dimension.

  16. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties with applications ranging from fundamental physics to applied engineering research. Neutron radiography or tomography, on the other hand, are useful tools in the non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to any quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This, together with the unprecedented intensities available at spallation sources and improvements in computing power, allows for a re-assessment of the transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fractions. Spatial and time resolution (for dynamic experiments) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition

  17. Patent Keyword Extraction Algorithm Based on Distributed Representation for Patent Classification

    Directory of Open Access Journals (Sweden)

    Jie Hu

    2018-02-01

    Many text mining tasks such as text retrieval, text summarization, and text comparison depend on the extraction of representative keywords from the main text. Most existing keyword extraction algorithms are based on a discrete bag-of-words representation of the text. In this paper, we propose a patent keyword extraction algorithm (PKEA) based on the distributed Skip-gram model for patent classification. We also develop a set of quantitative performance measures for keyword extraction evaluation based on information gain and cross-validation with Support Vector Machine (SVM) classification, which are valuable when human-annotated keywords are not available. We used a standard benchmark dataset and a homemade patent dataset to evaluate the performance of PKEA. Our patent dataset includes 2500 patents from five distinct technological fields related to autonomous cars (GPS systems, lidar systems, object recognition systems, radar systems, and vehicle control systems). We compared our method with Frequency, Term Frequency-Inverse Document Frequency (TF-IDF), TextRank and Rapid Automatic Keyword Extraction (RAKE). The experimental results show that our proposed algorithm provides a promising way to extract keywords from patent texts for patent classification.
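As a point of reference, the TF-IDF baseline that PKEA is compared against can be sketched in pure Python; the toy corpus below is invented for illustration, and the paper's actual pipeline uses Skip-gram embeddings rather than this bag-of-words scoring:

```python
import math
from collections import Counter

def tfidf_keywords(docs, doc_index, top_k=3):
    """Rank the words of docs[doc_index] by TF-IDF against the whole corpus."""
    n = len(docs)
    df = Counter()                       # document frequency of each word
    for doc in docs:
        df.update(set(doc))
    tf = Counter(docs[doc_index])        # term frequency in the target document
    scores = {w: (tf[w] / len(docs[doc_index])) * math.log(n / df[w]) for w in tf}
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]

# Invented mini-corpus of patent-like snippets.
corpus = [
    "lidar sensor measures distance using laser pulses".split(),
    "radar sensor measures distance using radio waves".split(),
    "gps receiver estimates position from satellite signals".split(),
]
print(tfidf_keywords(corpus, 0))
```

Words shared across documents ("sensor", "measures") get a low IDF and drop out, leaving the document-specific terms as keywords.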

  18. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool, and recent methodological and technological developments have also made possible the extraction of quantitative data on protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and

  19. Validation of a quantitative NMR method for suspected counterfeit products exemplified on determination of benzethonium chloride in grapefruit seed extracts.

    Science.gov (United States)

    Bekiroglu, Somer; Myrberg, Olle; Ostman, Kristina; Ek, Marianne; Arvidsson, Torbjörn; Rundlöf, Torgny; Hakkarainen, Birgit

    2008-08-05

    A 1H-nuclear magnetic resonance (NMR) spectroscopy method for the quantitative determination of benzethonium chloride (BTC) as a constituent of grapefruit seed extract was developed. The method was validated, assessing its specificity, linearity, range, and precision, as well as accuracy, limit of quantification and robustness. The method includes quantification using an internal reference standard, 1,3,5-trimethoxybenzene, and is regarded as simple, rapid, and easy to implement. A commercial grapefruit seed extract was studied and the experiments were performed on spectrometers operating at two different fields, 300 and 600 MHz for proton frequencies, the former with a broadband (BB) probe and the latter equipped with both a BB probe and a CryoProbe. The average concentration for the product sample was 78.0, 77.8 and 78.4 mg/ml using the 300 MHz BB probe, the 600 MHz BB probe and the CryoProbe, respectively. The standard deviations (relative standard deviations, R.S.D., in parentheses) for the average concentrations were 0.2 (0.3%), 0.3 (0.4%) and 0.3 mg/ml (0.4%), respectively.
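The internal-standard calculation behind such qNMR measurements follows the standard relation: the analyte concentration is obtained from the integral ratio, the numbers of protons behind each signal, the molar masses, and the weighed-in amount of standard. A sketch with illustrative numbers (not taken from the study):

```python
def qnmr_concentration(I_x, I_std, N_x, N_std, M_x, M_std, m_std_mg, V_ml):
    """Standard qNMR internal-standard relation:
    C_x = (I_x/I_std) * (N_std/N_x) * (M_x/M_std) * (m_std/V),
    with integrals I, proton counts per signal N, molar masses M."""
    return (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std_mg / V_ml)

# Illustrative numbers only: BTC (M ≈ 448.1 g/mol) against
# 1,3,5-trimethoxybenzene (M ≈ 168.2 g/mol, 3 equivalent aromatic protons).
c = qnmr_concentration(I_x=2.10, I_std=1.00, N_x=4, N_std=3,
                       M_x=448.1, M_std=168.2, m_std_mg=10.0, V_ml=1.0)
print(round(c, 1), "mg/ml")
```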

  20. A New Variable Selection Method Based on Mutual Information Maximization by Replacing Collinear Variables for Nonlinear Quantitative Structure-Property Relationship Models

    Energy Technology Data Exchange (ETDEWEB)

    Ghasemi, Jahan B.; Zolfonoun, Ehsan [Toosi University of Technology, Tehran (Iran, Islamic Republic of)

    2012-05-15

    Selection of the most informative molecular descriptors from the original data set is a key step for the development of quantitative structure activity/property relationship models. Recently, mutual information (MI) has gained increasing attention in feature selection problems. This paper presents an effective mutual information-based feature selection approach, named mutual information maximization by replacing collinear variables (MIMRCV), for nonlinear quantitative structure-property relationship models. The proposed variable selection method was applied to three different QSPR datasets: soil degradation half-life of 47 organophosphorus pesticides, GC-MS retention times of 85 volatile organic compounds, and water-to-micellar cetyltrimethylammonium bromide partition coefficients of 62 organic compounds. The obtained results revealed that using MIMRCV as the feature selection method improves the predictive quality of the developed models compared to conventional MI-based variable selection algorithms.
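The mutual-information ranking at the heart of such methods can be sketched for discrete descriptors as follows; the collinearity-replacement step that distinguishes MIMRCV is omitted, and the data are invented:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """MI (in nats) between two discrete feature vectors of equal length."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def select_features(features, target, k=2):
    """Greedy MI ranking: pick the k descriptors most informative about the target.
    (MIMRCV additionally replaces collinear picks; that step is not shown here.)"""
    ranked = sorted(features, key=lambda name: -mutual_information(features[name], target))
    return ranked[:k]

# Invented binary descriptors: one perfectly informative, one noise, one constant.
target = [0, 0, 1, 1, 0, 1, 0, 1]
features = {
    "informative": [0, 0, 1, 1, 0, 1, 0, 1],   # identical to the target
    "noisy":       [0, 1, 1, 0, 0, 1, 1, 0],
    "constant":    [0, 0, 0, 0, 0, 0, 0, 0],   # MI = 0 by construction
}
print(select_features(features, target, k=1))  # → ['informative']
```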

  1. A New Variable Selection Method Based on Mutual Information Maximization by Replacing Collinear Variables for Nonlinear Quantitative Structure-Property Relationship Models

    International Nuclear Information System (INIS)

    Ghasemi, Jahan B.; Zolfonoun, Ehsan

    2012-01-01

    Selection of the most informative molecular descriptors from the original data set is a key step for the development of quantitative structure activity/property relationship models. Recently, mutual information (MI) has gained increasing attention in feature selection problems. This paper presents an effective mutual information-based feature selection approach, named mutual information maximization by replacing collinear variables (MIMRCV), for nonlinear quantitative structure-property relationship models. The proposed variable selection method was applied to three different QSPR datasets: soil degradation half-life of 47 organophosphorus pesticides, GC-MS retention times of 85 volatile organic compounds, and water-to-micellar cetyltrimethylammonium bromide partition coefficients of 62 organic compounds. The obtained results revealed that using MIMRCV as the feature selection method improves the predictive quality of the developed models compared to conventional MI-based variable selection algorithms.

  2. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    Science.gov (United States)

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  3. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI), and a number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in subjects who are healthy with respect to spinal problems.
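First-order statistical indices of this kind can be sketched directly; the asymmetry index below (absolute mean-temperature difference between two symmetric ROIs) is one simple variant, since exact index definitions differ between studies, and the temperature samples are invented:

```python
import statistics

def roi_indices(temps):
    """First-order statistical indices of a ROI temperature sample (°C)."""
    return {
        "mean": statistics.mean(temps),
        "std": statistics.pstdev(temps),
        "min": min(temps),
        "max": max(temps),
    }

def asymmetry_index(left, right):
    """Absolute mean-temperature difference between two symmetric ROIs."""
    return abs(statistics.mean(left) - statistics.mean(right))

# Invented temperature samples for two symmetric lower-back ROIs.
left_roi = [33.1, 33.4, 33.2, 33.5, 33.3]
right_roi = [34.0, 34.2, 34.1, 34.4, 34.3]
print(roi_indices(left_roi)["mean"], round(asymmetry_index(left_roi, right_roi), 2))
```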

  4. Potential of mangrove Avicennia rumphiana extract as an antioxidant agent using multilevel extraction

    Science.gov (United States)

    Sulmartiwi, L.; Pujiastuti, D. Y.; Tjahjaningsih, W.; Jariyah

    2018-04-01

    Avicennia rumphiana is one of the abundant mangroves found in Indonesia. Multilevel extraction was conducted to screen the antioxidant activity of this mangrove: the leaves, fruits and barks were sequentially extracted using n-hexane, ethyl acetate and ethanol. The presence of phenolic, flavonoid and tannin compounds was characterized by quantitative and qualitative phytochemical assays, and the antioxidant activity was examined using the DPPH free-radical scavenging assay. The phytochemical tests revealed that all of the extracts gave positive results. The fruit extracts exhibited the highest phenolic, flavonoid and tannin contents (23.86 mg/g, 13.77 mg/g and 74.63 mg/g, respectively). The extracts were further assessed for antioxidant activity using IC50 values, which revealed that the ethyl acetate extract has better antioxidant activity than the n-hexane and ethanol extracts. Furthermore, this study indicated that the mangrove Avicennia rumphiana could be further explored for other biological activities owing to its potential secondary metabolites.
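The DPPH quantities mentioned reduce to two small formulas: percent inhibition from absorbances, and IC50 by interpolating where inhibition crosses 50%. A sketch with invented absorbance data (not the study's measurements):

```python
def inhibition(abs_control, abs_sample):
    """DPPH radical-scavenging activity (%) at one extract concentration."""
    return 100.0 * (abs_control - abs_sample) / abs_control

def ic50(concs, inhibitions):
    """IC50 by linear interpolation between the points bracketing 50% inhibition."""
    points = list(zip(concs, inhibitions))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50.0 <= i2:
            return c1 + (50.0 - i1) * (c2 - c1) / (i2 - i1)
    raise ValueError("50% inhibition not bracketed by the data")

concs = [25, 50, 100, 200]                                # µg/ml, invented
inh = [inhibition(0.80, a) for a in [0.68, 0.52, 0.30, 0.14]]
print(round(ic50(concs, inh), 1), "µg/ml")
```

A lower IC50 means less extract is needed to quench half of the DPPH radicals, i.e. stronger antioxidant activity.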

  5. Relating interesting quantitative time series patterns with text events and text features

    Science.gov (United States)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other
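The first step, heuristic extraction of interesting intervals from a price series, might look like the following sketch; the paper's actual heuristics and parameters are more elaborate, and the fixed-window relative-change rule here is an assumption:

```python
def interesting_intervals(prices, window=3, threshold=0.05):
    """Flag [start, end) windows whose relative price change exceeds the threshold.
    Simple stand-in for the interval-extraction heuristics described in the paper."""
    hits = []
    for start in range(len(prices) - window):
        p0, p1 = prices[start], prices[start + window]
        if abs(p1 - p0) / p0 >= threshold:
            hits.append((start, start + window))
    return hits

# Invented price series with a jump around index 4.
prices = [100, 100.5, 101, 100.8, 107, 108, 108.2, 108.1]
print(interesting_intervals(prices))  # → [(1, 4), (2, 5), (3, 6)]
```

Each flagged interval would then be cross-referenced with news published in that time range for the frequent-pattern step.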

  6. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who, did what, when, where, why, and how on Indonesian news articles were extracted by combining Convolutional Neural Network and Bidirectional Long Short-Term Memory. Convolutional Neural Network can learn semantically meaningful representations of sentences. Bidirectional LSTM can analyze the relations among words in the sequence. We also use word embedding word2vec for word representation. By combining these algorithms, we obtained F-measure 0.808. Our experiments show that CNN-BLSTM outperforms other shallow methods, namely IBk, C4.5, and Naïve Bayes with the F-measure 0.655, 0.645, and 0.595, respectively.
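The reported F-measure combines precision and recall over the extracted slots; a minimal sketch over hypothetical 5W1H (slot, value) pairs, with the gold annotations invented:

```python
def f_measure(predicted, gold):
    """Micro-averaged F1 over sets of extracted (slot, value) pairs."""
    tp = len(predicted & gold)          # correctly extracted pairs
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Invented example: three of four gold slots extracted, one of them wrong.
gold = {("who", "the governor"), ("what", "opened the bridge"),
        ("when", "Monday"), ("where", "Jakarta")}
predicted = {("who", "the governor"), ("what", "opened the bridge"),
             ("when", "Tuesday")}
print(round(f_measure(predicted, gold), 3))  # → 0.571
```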

  7. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients’ comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respirational disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respirational disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleeping laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
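One common family of such methods derives a respiratory signal from the breathing-induced modulation of R-peak amplitudes (ECG-derived respiration). The sketch below uses a naive threshold peak detector on synthetic data; the thesis's actual estimators of lung volume and pressure are more involved:

```python
def r_peak_indices(ecg, threshold=0.5):
    """Naive R-peak detector: local maxima above a fixed threshold.
    (Real ECG pipelines band-pass filter first, e.g. Pan-Tompkins.)"""
    return [i for i in range(1, len(ecg) - 1)
            if ecg[i] > threshold and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]]

def edr_signal(ecg, peaks):
    """ECG-derived respiration: beat-to-beat R-peak amplitudes, which rise
    and fall with the respiratory cycle."""
    return [ecg[i] for i in peaks]

# Synthetic ECG: flat baseline with R peaks whose amplitude is modulated by breathing.
ecg = [0.0] * 40
for k, amp in enumerate([1.0, 1.1, 1.2, 1.1, 1.0]):
    ecg[5 + 8 * k] = amp
peaks = r_peak_indices(ecg)
print(peaks, edr_signal(ecg, peaks))
```

The amplitude sequence [1.0, 1.1, 1.2, 1.1, 1.0] traces one synthetic breath; interpolating it over time gives the respiration estimate.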

  8. Quantitative study of FORC diagrams in thermally corrected Stoner–Wohlfarth nanoparticle systems

    International Nuclear Information System (INIS)

    De Biasi, E.; Curiale, J.; Zysler, R.D.

    2016-01-01

    The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to obtain quantitative information, requires an appropriate model of the studied system; for that reason most FORC studies are used for qualitative analysis. In magnetic systems, thermal fluctuations 'blur' the signatures of the anisotropy, volume and particle interaction distributions, so thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner–Wohlfarth (easy axes along the external field orientation) nanoparticle systems. In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypical cases of magnetic nanoparticles. Our results show that from the quantitative information obtained from the diagrams, under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy is improved at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, the mean value and its deviation being the only important parameters. Therefore it is possible to obtain an accurate result for the inversion and interaction fields despite the features of the volume distribution. - Highlights: • Quantifies the degree of accuracy of the information obtained using FORC diagrams.
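A FORC diagram is computed from a family of first-order reversal curves M(Hr, H) as the mixed second derivative ρ = -½ ∂²M/∂H∂Hr. A minimal finite-difference sketch (grid spacing and the polynomial test surface are invented; production codes fit local polynomials instead of raw differences):

```python
def forc_distribution(M, dH, dHr):
    """FORC distribution rho = -(1/2) * d2M/(dH dHr) via central differences.
    M[i][j] is the magnetization at reversal field Hr_i and applied field H_j."""
    n, m = len(M), len(M[0])
    rho = [[0.0] * m for _ in range(n)]
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            mixed = (M[i + 1][j + 1] - M[i + 1][j - 1]
                     - M[i - 1][j + 1] + M[i - 1][j - 1]) / (4 * dH * dHr)
            rho[i][j] = -0.5 * mixed
    return rho

# Test surface M = Hr * H (unit grid spacing): mixed derivative is exactly 1,
# so rho should be -0.5 at every interior point.
M = [[i * j for j in range(5)] for i in range(5)]
rho = forc_distribution(M, dH=1.0, dHr=1.0)
print(rho[2][2])  # → -0.5
```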

  9. Extracting additional risk managers information from a risk assessment of Listeria monocytogenes in deli meats

    NARCIS (Netherlands)

    Pérez-Rodríguez, F.; Asselt, van E.D.; García-Gimeno, R.M.; Zurera, G.; Zwietering, M.H.

    2007-01-01

    The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and

  10. Proceedings of the International Symposium on quantitative description of metal extraction processes

    International Nuclear Information System (INIS)

    Themelis, N.J.

    1991-01-01

    This book contains the proceedings of the H.H. Kellogg International Symposium. Topics include: Extractive metallurgy; Thermochemical phenomena in metallurgy; Thermodynamic modeling of metallurgical processes; and Transport and rate phenomena in metallurgical extraction

  11. Cationic dyes as extraction and spectrophotometric reagents: extraction of thiocyanate complex of mercury (II) in association with malachite green

    Energy Technology Data Exchange (ETDEWEB)

    Iyer, N V; Murthy, T K.S.

    1975-01-01

    An extraction spectrophotometric method for the determination of Hg(II) is described. It is based on the extraction into benzene of the Hg(CNS)₃⁻ complex in association with the cation of malachite green. The benzene extract has a lambda max at 640 nm. Maximal extraction takes place from an aqueous solution of pH 4.5. Although four extractions are needed for quantitative recovery of Hg(II), a single extraction with an aqueous:organic ratio of 2.5:1 is recommended for analysis, and the apparent molar absorptivity is 65,000. The interference from a number of anions and cations has also been studied.

  12. Investigation of cultivated lavender (Lavandula officinalis L.) extraction and its extracts

    Directory of Open Access Journals (Sweden)

    Nađalin Vesna

    2014-01-01

    In this study, essential oil content was determined in lavender flowers and leaves by hydrodistillation, and physical and chemical characteristics of the isolated oils were determined. Using CO2 in the supercritical state, the extraction of lavender flowers was performed at a selected solvent flow rate under isothermal and isobaric conditions. Gas chromatography in combination with mass spectrometry (GC/MS) and gas chromatography with flame ionisation detection (GC/FID) were used for the qualitative and quantitative analysis of the obtained essential oil and supercritical extracts (SFE); individual SFE extracts obtained at different extraction times were also analysed. The main components of the analysed samples were linalool, linalool acetate, lavandulol, caryophyllene oxide, lavandulyl acetate, terpinen-4-ol and others. Two proposed models were used to model the lavender flower - supercritical CO2 extraction system on the basis of experimental results obtained by examining the extraction kinetics of this system. The applied models fitted the experimental results well.

  13. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there are limited works on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining the generic and geology terms from geology dictionaries to train Chinese word segmentation rules of the Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to get a corpus constituted of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected the chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
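
    The bigram step of the workflow above can be pictured with a stdlib-only sketch (not the authors' implementation; the already-segmented "document" below is invented): content-words become weighted nodes and adjacent co-occurrences become weighted edges.

    ```python
    from collections import Counter

    # Toy corpus: one document already segmented into content-words
    # (stop-words removed), as after steps 1-2 of the workflow.
    content_words = ["granite", "intrusion", "granite", "porphyry",
                     "intrusion", "granite", "porphyry", "copper"]

    # Nodes weighted by term frequency; edges weighted by bigram
    # (adjacent co-occurrence) counts between content-words.
    nodes = Counter(content_words)
    edges = Counter(zip(content_words, content_words[1:]))

    print(nodes["granite"], edges[("granite", "porphyry")])  # 3 2
    ```

    A graph library (or a chord-diagram renderer) would then draw `nodes` and `edges` directly; the counts alone already give the "key information overview" the abstract describes.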

  14. Comparative evaluation of three automated systems for DNA extraction in conjunction with three commercially available real-time PCR assays for quantitation of plasma Cytomegalovirus DNAemia in allogeneic stem cell transplant recipients.

    Science.gov (United States)

    Bravo, Dayana; Clari, María Ángeles; Costa, Elisa; Muñoz-Cobo, Beatriz; Solano, Carlos; José Remigia, María; Navarro, David

    2011-08-01

    Limited data are available on the performance of different automated extraction platforms and commercially available quantitative real-time PCR (QRT-PCR) methods for the quantitation of cytomegalovirus (CMV) DNA in plasma. We compared the performance characteristics of the Abbott mSample preparation system DNA kit on the m24 SP instrument (Abbott), the High Pure viral nucleic acid kit on the COBAS AmpliPrep system (Roche), and the EZ1 Virus 2.0 kit on the BioRobot EZ1 extraction platform (Qiagen) coupled with the Abbott CMV PCR kit, the LightCycler CMV Quant kit (Roche), and the Q-CMV complete kit (Nanogen), for both plasma specimens from allogeneic stem cell transplant (Allo-SCT) recipients (n = 42) and the OptiQuant CMV DNA panel (AcroMetrix). The EZ1 system displayed the highest extraction efficiency over a wide range of CMV plasma DNA loads, followed by the m24 and the AmpliPrep methods. The Nanogen PCR assay yielded higher mean CMV plasma DNA values than the Abbott and the Roche PCR assays, regardless of the platform used for DNA extraction. Overall, the effects of the extraction method and the QRT-PCR used on CMV plasma DNA load measurements were less pronounced for specimens with high CMV DNA content (>10,000 copies/ml). The performance characteristics of the extraction methods and QRT-PCR assays evaluated herein for clinical samples were extensible to cell-based standards from AcroMetrix. In conclusion, different automated systems are not equally efficient for CMV DNA extraction from plasma specimens, and the plasma CMV DNA loads measured by commercially available QRT-PCRs can differ significantly. The above findings should be taken into consideration for the establishment of cutoff values for the initiation or cessation of preemptive antiviral therapies and for the interpretation of data from clinical studies in the Allo-SCT setting.

  15. Quantitative Detection of Trace Malachite Green in Aquiculture Water Samples by Extractive Electrospray Ionization Mass Spectrometry.

    Science.gov (United States)

    Fang, Xiaowei; Yang, Shuiping; Chingin, Konstantin; Zhu, Liang; Zhang, Xinglei; Zhou, Zhiquan; Zhao, Zhanfeng

    2016-08-11

    Exposure to malachite green (MG) may pose great health risks to humans; thus, it is of prime importance to develop fast and robust methods to quantitatively screen for the presence of malachite green in water. Herein the application of extractive electrospray ionization mass spectrometry (EESI-MS) has been extended to the trace detection of MG within lake water and aquiculture water, due to the intensive use of MG as a biocide in fisheries. This method has the advantage of obviating offline liquid-liquid extraction or tedious matrix separation prior to the measurement of malachite green in its native aqueous medium. The experimental results indicate that the extrapolated detection limit for MG was ~3.8 μg·L⁻¹ (S/N = 3) in lake water samples and ~0.5 μg·L⁻¹ in ultrapure water under optimized experimental conditions. The signal intensity of MG showed good linearity over the concentration range of 10-1000 μg·L⁻¹. Measurement of practical water samples fortified with MG at 0.01, 0.1 and 1.0 mg·L⁻¹ gave a good validation of the established calibration curve. The average recoveries and relative standard deviations (RSD) of malachite green in lake water and Carassius carassius fish farm effluent water were 115% (6.64% RSD), 85.4% (9.17% RSD) and 96.0% (7.44% RSD), respectively. Overall, the established EESI-MS/MS method has been demonstrated to be suitable for sensitive and rapid detection of malachite green in various aqueous media, indicating its potential for online real-time monitoring of real-life samples.
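
    The recovery and RSD figures quoted above are simple statistics over replicate measurements of a fortified sample. A generic illustration (the replicate values below are invented, not the paper's raw data):

    ```python
    from statistics import mean, stdev

    def recovery_and_rsd(measured, spiked):
        """Mean recovery (%) and relative standard deviation (%) for
        replicate measurements of a sample spiked at a known level."""
        recoveries = [100.0 * m / spiked for m in measured]
        avg = mean(recoveries)
        rsd = 100.0 * stdev(recoveries) / avg
        return avg, rsd

    # Hypothetical replicates (mg/L) for a sample fortified at 0.10 mg/L:
    avg, rsd = recovery_and_rsd([0.112, 0.118, 0.115], 0.10)
    print(round(avg, 1))  # 115.0
    ```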

  16. Quantitative Analysis of Tenofovir by Titrimetric, Extractive Ion-pair ...

    African Journals Online (AJOL)

    Methods: Tenofovir disoproxil forms a complex of 1:1 molar ratio with fumaric acid that was employed in its aqueous titration with sodium hydroxide. Non-aqueous titration was also employed for its determination. Extractive ion-pair spectrophotometric technique using methyl orange was similarly employed to evaluate ...

  17. Extraction efficiency of hydrophilic and lipophilic antioxidants from lyophilized foods using pressurized liquid extraction and manual extraction.

    Science.gov (United States)

    Watanabe, Jun; Oki, Tomoyuki; Takebayashi, Jun; Takano-Ishikawa, Yuko

    2014-09-01

    The efficient extraction of antioxidants from food samples is necessary in order to accurately measure their antioxidant capacities. α-Tocopherol and gallic acid were spiked into samples of 5 lyophilized and pulverized vegetables and fruits (onion, cabbage, Satsuma mandarin orange, pumpkin, and spinach). The lipophilic and hydrophilic antioxidants in the samples were sequentially extracted with a mixed solvent of n-hexane and dichloromethane, and then with acetic acid-acidified aqueous methanol. Duplicate samples were extracted: one set was extracted using an automated pressurized liquid extraction apparatus, and the other set was extracted manually. Spiked α-tocopherol and gallic acid were recovered almost quantitatively in the extracted lipophilic and hydrophilic fractions, respectively, especially when pressurized liquid extraction was used. The expected increase in lipophilic oxygen radical absorbance capacity (L-ORAC) due to spiking with α-tocopherol, and the expected increase in 2,2-diphenyl-1-picrylhydrazyl radical scavenging activities and total polyphenol content due to spiking with gallic acid, were all recovered in high yield. Relatively low recoveries, as reflected in the hydrophilic ORAC (H-ORAC) value, were obtained following spiking with gallic acid, suggesting an interaction between gallic acid and endogenous antioxidants. The H-ORAC values of gallic acid-spiked samples were almost the same as those of postadded (spiked) samples. These results clearly indicate that lipophilic and hydrophilic antioxidants are effectively extracted from lyophilized food, especially when pressurized liquid extraction is used. © 2014 Institute of Food Technologists®

  18. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-sphere data which was previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping is employed which provides a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided, using the philosophy of fuzzy logic, into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data were taken using a PuBe source in two different environments, and the analyzed data are presented for these cases as slow, moderate, and fast neutron fluences. (author)
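
    The least-squares unfolding step can be pictured with a toy example (not the authors' code, and the response matrix and count rates below are invented): given sphere count rates c = R·φ, with R a known response matrix over the slow/moderate/fast regimes, the fluence vector φ is recovered by solving the normal equations (RᵀR)φ = Rᵀc.

    ```python
    def solve(A, b):
        """Gaussian elimination with partial pivoting for a small dense system."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for k in range(n):
            p = max(range(k, n), key=lambda r: abs(M[r][k]))
            M[k], M[p] = M[p], M[k]
            for r in range(k + 1, n):
                f = M[r][k] / M[k][k]
                for c in range(k, n + 1):
                    M[r][c] -= f * M[k][c]
        x = [0.0] * n
        for k in range(n - 1, -1, -1):
            x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
        return x

    def unfold(R, counts):
        """Least-squares fluences: solve (R^T R) phi = R^T c."""
        n, m = len(R), len(R[0])
        RtR = [[sum(R[k][i] * R[k][j] for k in range(n)) for j in range(m)]
               for i in range(m)]
        Rtc = [sum(R[k][i] * counts[k] for k in range(n)) for i in range(m)]
        return solve(RtR, Rtc)

    # Invented 4-sphere response matrix over (slow, moderate, fast) regimes;
    # count rates are generated from a known fluence vector for verification.
    R = [[0.9, 0.3, 0.1],
         [0.5, 0.8, 0.3],
         [0.2, 0.6, 0.7],
         [0.1, 0.2, 0.9]]
    true_phi = [2.0, 1.0, 3.0]
    counts = [sum(R[i][j] * true_phi[j] for j in range(3)) for i in range(4)]
    phi = unfold(R, counts)
    print([round(x, 6) for x in phi])  # ≈ [2.0, 1.0, 3.0]
    ```

    With noisy real count rates the same normal equations give the least-squares (rather than exact) fluences, which is the regime the abstract's method operates in.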

  19. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
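
    A drastically simplified sketch of the concept-finder/negation-detector idea (not the paper's semantic grammar; the mini-lexicon, cue list and look-back window below are invented for illustration):

    ```python
    import re

    # Toy BI-RADS-style lexicon and a naive negation rule: a concept counts
    # as negated if a cue word appears within a few tokens before it.
    LEXICON = {"mass", "calcification", "asymmetry"}
    NEGATION_CUES = {"no", "without", "absent"}
    WINDOW = 3  # tokens to look back for a negation cue (arbitrary choice)

    def extract_concepts(sentence):
        """Return (concept, negated?) pairs found in one sentence."""
        tokens = re.findall(r"[a-z]+", sentence.lower())
        found = []
        for i, tok in enumerate(tokens):
            if tok in LEXICON:
                negated = any(t in NEGATION_CUES
                              for t in tokens[max(0, i - WINDOW):i])
                found.append((tok, negated))
        return found

    print(extract_concepts("There is no suspicious mass; scattered calcification noted."))
    # [('mass', True), ('calcification', False)]
    ```

    A real system needs sentence splitting, scope rules and a full lexicon, but the pipeline shape (tokenize, match concepts, check a negation window) is the same.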

  20. Tool for the quantitative evaluation of a Facebook app-based informal training process

    Directory of Open Access Journals (Sweden)

    Adolfo Calle-Gómez

    2017-02-01

    The study of the impact of Facebook in academia has mainly been based on qualitative evaluation of the academic performance and motivation of students. This work takes as its starting point the use of the Facebook app Sigma at the Universidad Técnica de Ambato, where students share educative resources through Sigma; this constitutes an informal learning process. We propose Gamma, a tool for generating statistics and charts that illustrate the impact of the social network on the resulting learning process. This paper presents the results of a study of how Gamma is valued by those who like to do informal learning. It was verified that (1) Gamma gives feedback about the value of educative resources and social actions, and (2) it allows quantitative measurement of the impact of using Facebook on the informal learning process. As an added value, Gamma supports the communication between supporters and detractors of the use of Facebook in academia.

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles), and duplicates were removed, leaving 1967 articles. Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template; differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies Tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
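
    The summary measures named above follow directly from pooled confusion counts. A generic sketch (the counts below are invented, chosen only so the result matches the segment-based sensitivity/specificity quoted in the abstract):

    ```python
    def diagnostic_accuracy(tp, fp, tn, fn):
        """Sensitivity and specificity from a 2x2 confusion table."""
        sensitivity = tp / (tp + fn)  # true-positive rate
        specificity = tn / (tn + fp)  # true-negative rate
        return sensitivity, specificity

    # Invented counts reproducing the segment-based figures above:
    sens, spec = diagnostic_accuracy(tp=88, fp=28, tn=72, fn=12)
    print(sens, spec)  # 0.88 0.72
    ```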

  2. Analyse géochimique de la matière organique extraite des roches sédimentaires. IV. Extraction des roches en faibles quantités / Geochemical Analysis of Organic Matter Extracted from Sedimentary Rocks. IV. Extraction from Small Amounts of Rock

    Directory of Open Access Journals (Sweden)

    Monin J. C.

    2006-11-01

    Soxhlet extraction cannot be used when the rock samples are too small. In developing the corresponding extraction protocol, the influence of a number of operating conditions on extraction yield was examined: temperature, duration, nature and amount of solvent, presence of light, presence of air, and extraction process. For hydrocarbons, both saturated and aromatic, the essential factor is the stirring of the extraction medium; the nature of the solvent is not critical, provided a very poor hydrocarbon solvent is not chosen: extractability depends more on the desorbing power with respect to the rock than on the solvent power proper. For resins and asphaltenes, interpretation of the results is delicate, because there is no clear boundary between products that are simply dissolved, products of solvolysis, and products newly formed by solvent/organic-matter/mineral-matter interactions. There is therefore no extraction protocol that can be recommended in absolute terms; everything depends on the analytical and practical requirements of the laboratory. At the Institut Français du Pétrole (IFP), the protocol adopted is extraction in a beaker with magnetic stirring for 20 min in chloroform at approximately 50 °C. The protocol for evaporating the solvent and recovering the extract is also given; it must be designed carefully given the small quantities involved.

  3. Selective extraction of hydrocarbons, phosphonates and phosphonic acids from soils by successive supercritical fluid and pressurized liquid extractions.

    Science.gov (United States)

    Chaudot, X; Tambuté, A; Caude, M

    2000-01-14

    Hydrocarbons, dialkyl alkylphosphonates and alkyl alkylphosphonic acids are selectively extracted from spiked soils by successive implementation of supercritical carbon dioxide, supercritical methanol-modified carbon dioxide and pressurized water. More than 95% of hydrocarbons are extracted during the first step (pure supercritical carbon dioxide extraction), whereas no organophosphorus compound is evidenced in this first extract. A quantitative extraction of phosphonates is achieved during the second step (methanol-modified supercritical carbon dioxide extraction). Polar phosphonic acids are extracted during a third step (pressurized water extraction) and analyzed by gas chromatography as methylated derivatives (diazomethane derivatization). Global recoveries for these compounds are close to 80%, a loss of about 20% occurring during the derivatization process (co-evaporation with solvent). The developed selective extraction method was successfully applied to a soil sample during an international collaborative exercise.

  4. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  5. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  6. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are expounded. QuickBird high-spatial-resolution satellite data are used to extract the prospecting information of a uranium deposit in the Bashibulake area in the north of the Tarim Basin. Using pertinent image-processing methods, information on the ore-bearing bed, ore-controlling structures and mineralized alteration has been extracted. The results show high consistency with the field survey. The aim of this study is to explore the practicability of high-spatial-resolution satellite data for mineral prospecting, and to broaden thinking about prospecting in similar areas. (authors)

  7. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering the background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, and this model can serve as an ideal starting model for subsequent inversion. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages; the difference is that its ultralow-frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency-domain inversion is also very convenient. However, because broad frequency bands are often used in pure time-domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to traditional time-domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result with a linear initial model and records without low-frequency information.
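
    The time-attenuation idea, multiplying the recorded residuals by a decaying exponential e^(-σt) so that their spectrum is weighted toward low frequencies (the Laplace-style damping referred to above), can be sketched as follows; the residual trace and damping constant are invented, not taken from the paper:

    ```python
    import math

    def damp(trace, dt, sigma):
        """Apply e^{-sigma * t} time attenuation to a sampled residual trace."""
        return [a * math.exp(-sigma * i * dt) for i, a in enumerate(trace)]

    # Invented residual: a 30 Hz sinusoid sampled at dt = 4 ms. A larger
    # sigma suppresses late arrivals more strongly, shifting spectral
    # weight toward low frequencies.
    dt, sigma = 0.004, 5.0
    trace = [math.sin(2 * math.pi * 30 * i * dt) for i in range(250)]
    damped = damp(trace, dt, sigma)
    # the sample near t = 0.212 s is scaled by e^{-5 * 0.212} = e^{-1.06}
    print(round(damped[53] / trace[53], 3))  # ≈ 0.346
    ```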

  8. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Web mining research is becoming more essential these days because a large amount of information is managed through the web, and web usage is expanding in an uncontrolled way; a dedicated framework is required for controlling such an extensive amount of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology with the aid of Bayesian Networks (BN), learning to separate web data and discover characteristics based on the Bayesian approach. Motivated by that investigation, we propose a web content mining methodology based on a deep learning algorithm. Deep learning is preferred over BN because BN does not fit the learning architecture planned for the proposed system. The main objective of this investigation is web document extraction using different classification algorithms and their analysis; the work extracts the data from web URLs. Three classification algorithms are compared: deep learning, naïve Bayes and back-propagation neural networks (BPNN). Deep learning is a powerful set of techniques for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometrics; it is a simple classification technique that also requires less time for classification. Naïve Bayes classifiers are a family of basic probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then used for classification. Initially the training and testing datasets contain many URLs, from which we extract the content. The
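
    For context on one of the baselines, here is a generic multinomial naïve Bayes text classifier with add-one smoothing (a standard textbook construction, not the paper's system; the toy documents and labels are invented):

    ```python
    import math
    from collections import Counter, defaultdict

    class NaiveBayes:
        """Multinomial naive Bayes with add-one (Laplace) smoothing."""

        def fit(self, docs, labels):
            self.prior = Counter(labels)        # class frequencies
            self.counts = defaultdict(Counter)  # per-class word counts
            self.vocab = set()
            for doc, y in zip(docs, labels):
                words = doc.split()
                self.counts[y].update(words)
                self.vocab.update(words)
            self.n = len(labels)
            return self

        def predict(self, doc):
            best, best_lp = None, -math.inf
            for c in self.prior:
                lp = math.log(self.prior[c] / self.n)
                denom = sum(self.counts[c].values()) + len(self.vocab)
                for w in doc.split():
                    lp += math.log((self.counts[c][w] + 1) / denom)
                if lp > best_lp:
                    best, best_lp = c, lp
            return best

    # Invented toy corpus:
    docs = ["cheap pills buy now", "meeting agenda attached",
            "buy cheap watches", "project meeting notes"]
    labels = ["spam", "ham", "spam", "ham"]
    clf = NaiveBayes().fit(docs, labels)
    print(clf.predict("buy cheap now"))  # spam
    ```

    The independence assumption the abstract mentions is visible in the code: per-word log-probabilities are simply summed, with no modeling of word interactions.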

  9. The Quantitative Theory of Information

    DEFF Research Database (Denmark)

    Topsøe, Flemming; Harremoës, Peter

    2008-01-01

    Information Theory as developed by Shannon and followers is becoming more and more important in a number of sciences. The concepts appear to be just the right ones with intuitively appealing operational interpretations. Furthermore, the information theoretical quantities are connected by powerful...

  10. Elastography as a hybrid imaging technique : coupling with photoacoustics and quantitative imaging

    International Nuclear Information System (INIS)

    Widlak, T.G.

    2015-01-01

    While classical imaging methods, such as ultrasound, computed tomography or magnetic resonance imaging, are well known and mathematically understood, a host of physiological parameters relevant for diagnostic purposes cannot be obtained by them. This gap is recently being closed by the introduction of hybrid, or coupled-physics, imaging methods. They connect more than one physical modality, and aim to provide quantitative information on optical, electrical or mechanical parameters with high resolution. Central to this thesis is the mechanical contrast of elastic tissue, especially Young’s modulus or the shear modulus. Different methods of qualitative elastography provide interior information on the mechanical displacement field. From this interior data the nonlinear inverse problem of quantitative elastography aims to reconstruct the shear modulus. In this thesis, the elastography problem is seen from a hybrid imaging perspective; methods from the coupled-physics-inspired literature and regularization theory have been employed to recover displacement and shear modulus information. The overdetermined systems approach by G. Bal is applied to the quantitative problem, and ellipticity criteria are deduced, for one and several measurements, as well as injectivity results. Together with the geometric theory of G. Chavent, the results are used for analyzing convergence of Tikhonov regularization. Also, a convergence analysis for the Levenberg-Marquardt method is provided. As a second mainstream project in this thesis, elastography imaging is developed for extracting displacements from photoacoustic images. A novel method is provided for texturizing the images, and the optical flow problem for motion estimation is shown to be regularized by this texture generation. The results are tested in cooperation with the Medical University of Vienna, and the methods for quantitative determination of the shear modulus are evaluated in first experiments.
In summary, the overdetermined systems

  11. Organization of extracting molecules of the diamide type: link with the extracting properties?

    International Nuclear Information System (INIS)

    Meridiano, Y.

    2009-02-01

    The aim of these studies is to establish a link between the different organizations of diamide extractants (used in the DIAMEX process) and their extracting properties. The effects of the key parameters governing liquid-liquid extraction (concentration of extractant, nature of solute, activity of the aqueous phase, nature of the diluent and temperature) are studied: 1) at the supramolecular scale, with the characterization of the extractant organizations by vapor-pressure osmometry (VPO) and small-angle neutron and X-ray scattering (SANS/SAXS) experiments; 2) at the molecular scale, with the quantification of the extracted solutes (water, nitric acid, metal nitrate) and the determination of extracted-complex stoichiometries by electrospray ionization mass spectrometry (ESI-MS) experiments. The DMDOHEMA molecule acts as a classical surfactant and forms aggregates of the reverse-micelle type. Taking into account the established supramolecular diagrams, a quantitative link between the extractant structures and their extracting properties has been brought to light. To model europium nitrate extraction, two approaches have been developed: - an approach based on mass action laws, in which extraction equilibria have been proposed taking into account the supramolecular speciation; - an innovative approach considering the extracted ions as adsorbed on a specific surface of the extractant molecule which depends on the extractant organization state. The ion extraction can then be considered as a sum of isotherms corresponding to the different states of organization. This approach allows the extraction efficiency of an extracting molecule to be compared as a function of its organization state. (author)

  12. Evaluation of needle trap micro-extraction and solid-phase micro-extraction: Obtaining comprehensive information on volatile emissions from in vitro cultures.

    Science.gov (United States)

    Oertel, Peter; Bergmann, Andreas; Fischer, Sina; Trefz, Phillip; Küntzel, Anne; Reinhold, Petra; Köhler, Heike; Schubert, Jochen K; Miekisch, Wolfram

    2018-05-14

    Volatile organic compounds (VOCs) emitted from in vitro cultures may reveal information on species and metabolism. Owing to low nmol L-1 concentration ranges, pre-concentration techniques are required for gas chromatography-mass spectrometry (GC-MS) based analyses. This study was intended to compare the efficiency of established micro-extraction techniques - solid-phase micro-extraction (SPME) and needle-trap micro-extraction (NTME) - for the analysis of complex VOC patterns. For SPME, a 75 μm Carboxen®/polydimethylsiloxane fiber was used. The NTME needle was packed with divinylbenzene, Carbopack X and Carboxen 1000. The headspace was sampled bi-directionally. Seventy-two VOCs were calibrated by reference standard mixtures in the range of 0.041-62.24 nmol L-1 by means of GC-MS. Both pre-concentration methods were applied to profile VOCs from cultures of Mycobacterium avium ssp. paratuberculosis. Limits of detection ranged from 0.004 to 3.93 nmol L-1 (median = 0.030 nmol L-1) for NTME and from 0.001 to 5.684 nmol L-1 (median = 0.043 nmol L-1) for SPME. NTME showed advantages in assessing polar compounds such as alcohols. SPME showed advantages in reproducibility but disadvantages in sensitivity for N-containing compounds. Micro-extraction techniques such as SPME and NTME are well suited for trace VOC profiling over cultures if the limitations of each technique are taken into account. Copyright © 2018 John Wiley & Sons, Ltd.

  13. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    Science.gov (United States)

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to GitHub, (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
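
    The object-level parallelization benchmarked above can be sketched as follows; the feature function, its outputs, and the use of a thread pool are illustrative stand-ins, not the QIFE's actual MATLAB implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def extract_features(tumor_id):
    # Stand-in for the per-object 3D feature computation on one segmented tumor
    return tumor_id, {"volume_mm3": 1000.0 + tumor_id, "sphericity": 0.8}

tumor_ids = range(108)                     # size of the benchmark cohort
with ThreadPoolExecutor(max_workers=4) as pool:
    # Objects are independent, so they can be processed concurrently;
    # memory use grows with the number of concurrent workers.
    features = dict(pool.map(extract_features, tumor_ids))

print(len(features))                       # -> 108
```

    This mirrors the trade-off noted in the abstract: more workers shorten wall-clock time but raise peak memory, since several objects are held in flight at once.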

  14. Development of Total Reflection X-ray fluorescence spectrometry quantitative methodologies for elemental characterization of building materials and their degradation products

    Science.gov (United States)

    García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel

    2018-05-01

    In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early 20th-century building of high historic and cultural value in Getxo (Basque Country, North of Spain). This quantification strategy can be considered a faster methodology compared with traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to try to avoid the acid extraction step of the materials and their degradation products, direct TXRF measurement of powdered solids suspended in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to other more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. 
To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of

  15. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    Full Text Available This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four packages (Urllib, Selenium, BeautifulSoup, Scrapy) in Python can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
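
    As a minimal, standard-library-only illustration of the extraction step (the packages named above offer far richer APIs), the following collects link targets and anchor text from an HTML page; the warrant URLs are invented.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        # Capture the text that follows an opening <a> tag
        if self._href is not None and data.strip():
            self.links.append((self._href, data.strip()))
            self._href = None

html = """<html><body>
<a href="/warrants/2018-001">Warrant 2018-001</a>
<a href="/warrants/2018-002">Warrant 2018-002</a>
</body></html>"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)
```

    In practice the HTML would come from an HTTP fetch (e.g. urllib.request) or a browser driven by Selenium, and the extracted pairs would be accumulated into a dataset.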

  16. Solvent-extraction methods applied to the chemical analysis of uranium. III. Study of the extraction with inert solvents

    International Nuclear Information System (INIS)

    Vera Palomino, J.; Palomares Delgado, F.; Petrement Eguiluz, J. C.

    1964-01-01

    The extraction of uranium at the trace level is studied using tributylphosphate as the active agent, under conditions aimed at achieving quantitative extraction in a single-step process with a number of salting-out agents, following the general approach of two preceding papers. Two inert solvents, benzene and cyclohexane, were investigated, allowing the corresponding empirical equations describing the extraction process to be derived. The results were compared with those previously reported for solvents that, like ethyl acetate and methyl isobutyl ketone, favour the extraction of uranium to a greater or lesser extent. (Author) 4 refs

  17. Biologically active extracts with kidney affections applications

    International Nuclear Information System (INIS)

    Pascu, Mihaela; Pascu, Daniela-Elena; Cozea, Andreea; Bunaciu, Andrei A.; Miron, Alexandra Raluca; Nechifor, Cristina Aurelia

    2015-01-01

    Highlights: • The paper highlighted the compositional similarities and differences between the three extracts of bilberry and cranberry fruit derived from the same Ericaceae family. • A method of antioxidant activity, different cellulose membranes, a Whatman filter and a Langmuir kinetic model were used. • Arbutoside presence in all three extracts of bilberry and cranberry fruit explains their use in urinary infections – cystitis and colibacillosis. • Following these research studies, it was established that the fruits of bilberry and cranberry (fruit and leaves) significantly reduce the risk of urinary infections, and work effectively to protect against free radicals and inflammation. - Abstract: This paper aims to select plant materials rich in bioflavonoid compounds, made from herbs known for their performance in the prevention and therapy of renal diseases, namely kidney stones and urinary infections (renal lithiasis, nephritis, urethritis, cystitis, etc.). This paper presents a comparative study of the composition of medicinal plant extracts belonging to the Ericaceae family: Cranberry (fruit and leaves) – Vaccinium vitis-idaea L. and Bilberry (fruit) – Vaccinium myrtillus L. Concentrated extracts obtained from the medicinal plants used in this work were analyzed from structural, morphological and compositional points of view using different techniques: chromatographic methods (HPLC), scanning electron microscopy, infrared and UV spectrophotometry, as well as a kinetic model. Liquid chromatography was able to identify the specific compound of the Ericaceae family, arbutoside, present in all three extracts, as well as specific components of each species, mostly from the class of polyphenols. The identification and quantitative determination of the active ingredients from these extracts can give information related to their therapeutic effects.

  18. Biologically active extracts with kidney affections applications

    Energy Technology Data Exchange (ETDEWEB)

    Pascu, Mihaela, E-mail: mihhaela_neagu@yahoo.com [SC HOFIGAL S.A., Analytical Research Department, 2 Intr. Serelor, Bucharest-4 042124 (Romania); Politehnica University of Bucharest, Faculty of Applied Chemistry and Material Science, 1-5 Polizu Street, 11061 Bucharest (Romania); Pascu, Daniela-Elena [Politehnica University of Bucharest, Faculty of Applied Chemistry and Material Science, 1-5 Polizu Street, 11061 Bucharest (Romania); Cozea, Andreea [SC HOFIGAL S.A., Analytical Research Department, 2 Intr. Serelor, Bucharest-4 042124 (Romania); Transilvania University of Brasov, Faculty of Food and Tourism, 148 Castle Street, 500036 Brasov (Romania); Bunaciu, Andrei A. [SCIENT – Research Center for Instrumental Analysis, S.C. CROMATEC-PLUS S.R.L., 18 Sos. Cotroceni, Bucharest 060114 (Romania); Miron, Alexandra Raluca; Nechifor, Cristina Aurelia [Politehnica University of Bucharest, Faculty of Applied Chemistry and Material Science, 1-5 Polizu Street, 11061 Bucharest (Romania)

    2015-12-15

    Highlights: • The paper highlighted the compositional similarities and differences between the three extracts of bilberry and cranberry fruit derived from the same Ericaceae family. • A method of antioxidant activity, different cellulose membranes, a Whatman filter and a Langmuir kinetic model were used. • Arbutoside presence in all three extracts of bilberry and cranberry fruit explains their use in urinary infections – cystitis and colibacillosis. • Following these research studies, it was established that the fruits of bilberry and cranberry (fruit and leaves) significantly reduce the risk of urinary infections, and work effectively to protect against free radicals and inflammation. - Abstract: This paper aims to select plant materials rich in bioflavonoid compounds, made from herbs known for their performance in the prevention and therapy of renal diseases, namely kidney stones and urinary infections (renal lithiasis, nephritis, urethritis, cystitis, etc.). This paper presents a comparative study of the composition of medicinal plant extracts belonging to the Ericaceae family: Cranberry (fruit and leaves) – Vaccinium vitis-idaea L. and Bilberry (fruit) – Vaccinium myrtillus L. Concentrated extracts obtained from the medicinal plants used in this work were analyzed from structural, morphological and compositional points of view using different techniques: chromatographic methods (HPLC), scanning electron microscopy, infrared and UV spectrophotometry, as well as a kinetic model. Liquid chromatography was able to identify the specific compound of the Ericaceae family, arbutoside, present in all three extracts, as well as specific components of each species, mostly from the class of polyphenols. The identification and quantitative determination of the active ingredients from these extracts can give information related to their therapeutic effects.

  19. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Science.gov (United States)

    Braun, Jean; Gemignani, Lorenzo; van der Beek, Peter

    2018-03-01

    One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on an a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or an a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo-Siang-Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo-Siang-Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of any given catchment to the

  20. Extracting information on the spatial variability in erosion rate stored in detrital cooling age distributions in river sands

    Directory of Open Access Journals (Sweden)

    J. Braun

    2018-03-01

    Full Text Available One of the main purposes of detrital thermochronology is to provide constraints on the regional-scale exhumation rate and its spatial variability in actively eroding mountain ranges. Procedures that use cooling age distributions coupled with hypsometry and thermal models have been developed in order to extract quantitative estimates of erosion rate and its spatial distribution, assuming steady state between tectonic uplift and erosion. This hypothesis precludes the use of these procedures to assess the likely transient response of mountain belts to changes in tectonic or climatic forcing. Other methods are based on an a priori knowledge of the in situ distribution of ages to interpret the detrital age distributions. In this paper, we describe a simple method that, using the observed detrital mineral age distributions collected along a river, allows us to extract information about the relative distribution of erosion rates in an eroding catchment without relying on a steady-state assumption, the value of thermal parameters or an a priori knowledge of in situ age distributions. The model is based on a relatively low number of parameters describing lithological variability among the various sub-catchments and their sizes and only uses the raw ages. The method we propose is tested against synthetic age distributions to demonstrate its accuracy and the optimum conditions for its use. In order to illustrate the method, we invert age distributions collected along the main trunk of the Tsangpo–Siang–Brahmaputra river system in the eastern Himalaya. From the inversion of the cooling age distributions we predict present-day erosion rates of the catchments along the Tsangpo–Siang–Brahmaputra river system, as well as some of its tributaries. We show that detrital age distributions contain dual information about present-day erosion rate, i.e., from the predicted distribution of surface ages within each catchment and from the relative contribution of
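
    The mixing relation underlying such inversions can be illustrated with a toy forward model and inversion. Unlike the method described above, this sketch assumes the sub-catchment age distributions are known, purely to show the forward model; all numbers are invented.

```python
import numpy as np

# Five cooling-age bins (Ma); p1 and p2 are hypothetical age histograms
# for two sub-catchments, and the detrital sample d is their mixture,
# weighted by each catchment's sediment flux (erosion rate x area).
p1 = np.array([0.6, 0.3, 0.1, 0.0, 0.0])   # young ages: fast-eroding catchment
p2 = np.array([0.0, 0.1, 0.2, 0.3, 0.4])   # old ages: slowly eroding catchment
true_w = np.array([0.7, 0.3])              # true relative sediment fluxes
d = true_w[0] * p1 + true_w[1] * p2        # observed detrital distribution

# Recover the mixing weights by least squares, then clip and renormalize
# (adequate for this noise-free toy case).
A = np.column_stack([p1, p2])
w, *_ = np.linalg.lstsq(A, d, rcond=None)
w = np.clip(w, 0, None)
w /= w.sum()
print(w.round(3))   # ~ [0.7 0.3]
```

    The recovered weights are proportional to erosion rate times catchment area, which is the "relative contribution" information the abstract refers to.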

  1. Research in health sciences library and information science: a quantitative analysis.

    Science.gov (United States)

    Dimitroff, A

    1992-10-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas.

  2. Extracting and Using Photon Polarization Information in Radiative B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, Yuval

    2000-05-09

    The authors discuss the uses of conversion electron pairs for extracting photon polarization information in weak radiative B decays. Both cases of leptons produced through a virtual and a real photon are considered. Measurements of the angular correlation between the (Kπ) and (e⁺e⁻) decay planes in B → K*(→ Kπ)γ(*)(→ e⁺e⁻) decays can be used to determine the helicity amplitudes in the radiative B → K*γ decays. A large right-handed helicity amplitude in B-bar decays is a signal of new physics. The time-dependent CP asymmetry in the B⁰ decay angular correlation is shown to measure sin 2β and cos 2β with little hadronic uncertainty.

  3. SAR matrices: automated extraction of information-rich SAR tables from large compound data sets.

    Science.gov (United States)

    Wassermann, Anne Mai; Haebel, Peter; Weskamp, Nils; Bajorath, Jürgen

    2012-07-23

    We introduce the SAR matrix data structure that is designed to elucidate SAR patterns produced by groups of structurally related active compounds, which are extracted from large data sets. SAR matrices are systematically generated and sorted on the basis of SAR information content. Matrix generation is computationally efficient and enables processing of large compound sets. The matrix format is reminiscent of SAR tables, and SAR patterns revealed by different categories of matrices are easily interpretable. The structural organization underlying matrix formation is more flexible than standard R-group decomposition schemes. Hence, the resulting matrices capture SAR information in a comprehensive manner.

  4. Text mining facilitates database curation - extraction of mutation-disease associations from Bio-medical literature.

    Science.gov (United States)

    Ravikumar, Komandur Elayavilli; Wagholikar, Kavishwar B; Li, Dingcheng; Kocher, Jean-Pierre; Liu, Hongfang

    2015-06-06

    Advances in next-generation sequencing technology have accelerated the pace of individualized medicine (IM), which aims to incorporate genetic/genomic information into medicine. One immediate need in interpreting sequencing data is the assembly of information about genetic variants and their corresponding associations with other entities (e.g., diseases or medications). Even with dedicated effort to capture such information in biological databases, much of it remains 'locked' in the unstructured text of biomedical publications. There is a substantial lag between publication and the subsequent abstraction of such information into databases. Multiple text mining systems have been developed, but most of them focus on sentence-level association extraction, with performance evaluation based on gold-standard text annotations specifically prepared for text mining systems. We developed and evaluated a text mining system, MutD, which extracts protein mutation-disease associations from MEDLINE abstracts by incorporating discourse-level analysis, using a benchmark data set extracted from curated database records. MutD achieves an F-measure of 64.3% for reconstructing protein mutation disease associations in curated database records. The discourse-level analysis component of MutD contributed a gain of more than 10% in F-measure when compared against sentence-level association extraction. Our error analysis indicates that 23 of the 64 precision errors are true associations that were not captured by database curators and 68 of the 113 recall errors are caused by the absence of associated disease entities in the abstract. After adjusting for the defects in the curated database, the revised F-measure of MutD in association detection reaches 81.5%. Our quantitative analysis reveals that MutD can effectively extract protein mutation disease associations when benchmarking based on curated database records. 
The analysis also demonstrates that incorporating
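
    The reported adjustment can be made concrete with the standard F-measure definition; the true-positive count below is an assumption chosen for illustration, since the abstract reports only the error counts and the resulting F-measures.

```python
def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall (F1)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Assumed tp = 160 (not stated in the abstract); fp = 64 precision errors,
# fn = 113 recall errors, giving an F-measure near the reported 64.3%.
print(round(f_measure(160, 64, 113), 3))                  # -> 0.644

# Reclassify the 23 precision errors that were true associations (fp -> tp)
# and drop the 68 recall errors caused by absent disease entities.
print(round(f_measure(160 + 23, 64 - 23, 113 - 68), 3))   # -> 0.81
```

    With these assumed counts the adjusted score lands near the paper's revised 81.5%, showing how correcting database defects moves both precision and recall.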

  5. Simplified and rapid method for extraction of ergosterol from natural samples and detection with quantitative and semi-quantitative methods using thin-layer chromatography

    OpenAIRE

    Larsen, Cand.scient Thomas; Ravn, Senior scientist Helle; Axelsen, Senior Scientist Jørgen

    2004-01-01

    A new and simplified method for extraction of ergosterol (ergoste-5,7,22-trien-3-beta-ol) from fungi in soil and litter was developed using pre-soaking extraction and paraffin oil for recovery. Recoveries of ergosterol were in the range of 94 - 100% depending on the solvent to oil ratio. Extraction efficiencies equal to heat-assisted extraction treatments were obtained with pre-soaked extraction. Ergosterol was detected with thin-layer chromatography (TLC) using fluorodensitometry with a quan...

  6. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is given. The methodologies are also applied for Mn oxidation state characterization of double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
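
    The statistical parameters mentioned above, obtained by treating the spectrum as a probability distribution, can be sketched as follows; the two-component spectrum and its energies are invented, not Mn Kβ data.

```python
import numpy as np

E = np.linspace(6470, 6500, 601)                 # energy axis (eV), invented
def gauss(mu, sigma):
    return np.exp(-0.5 * ((E - mu) / sigma) ** 2)

# Hypothetical two-component emission line: a main peak plus a weaker,
# broader satellite at lower energy.
spectrum = 1.0 * gauss(6490, 2.0) + 0.3 * gauss(6476, 3.0)

# Normalize and treat the spectrum as a probability distribution, then
# compute moment-based shape parameters (mean, variance, skewness).
p = spectrum / spectrum.sum()
mean = float(np.sum(E * p))
var = float(np.sum((E - mean) ** 2 * p))
skew = float(np.sum((E - mean) ** 3 * p) / var ** 1.5)
print(round(mean, 2), round(var, 1), round(skew, 2))
```

    Shifts in a satellite's weight, as happen with oxidation-state changes, move these moments, which is why such parameters can track chemical environment without a full spectral fit.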

  7. Feature extraction and learning using context cue and Rényi entropy based mutual information

    DEFF Research Database (Denmark)

    Pan, Hong; Olsen, Søren Ingvor; Zhu, Yaping

    2015-01-01

    information. In particular, for feature extraction, we develop a new set of kernel descriptors, Context Kernel Descriptors (CKD), which enhance the original KDES by embedding the spatial context into the descriptors. Context cues contained in the context kernel enforce some degree of spatial consistency, thus improving the robustness of CKD. For feature learning and reduction, we propose a novel codebook learning method, based on a Rényi quadratic entropy based mutual information measure called Cauchy-Schwarz Quadratic Mutual Information (CSQMI), to learn a compact and discriminative CKD codebook. Projecting … as the information about the underlying labels of the CKD using CSQMI. Thus the resulting codebook and reduced CKD are discriminative. We verify the effectiveness of our method on several public image benchmark datasets such as YaleB, Caltech-101 and CIFAR-10, as well as a challenging chicken feet dataset of our own...

  8. Relating Maxwell’s demon and quantitative analysis of information leakage for practical imperative programs

    International Nuclear Information System (INIS)

    Anjaria, Kushal; Mishra, Arun

    2017-01-01

    Shannon observed the relation between information entropy and the Maxwell’s demon experiment to arrive at his information entropy formula. Since then, Shannon’s entropy formula has been widely used to measure information leakage in imperative programs. In the present work, our aim is to go in the reverse direction and find possible Maxwell’s demon experimental setups for contemporary practical imperative programs in which variations of Shannon’s entropy formula have been applied to measure the information leakage. To establish the relation between the second principle of thermodynamics and quantitative analysis of information leakage, the present work models contemporary variations of imperative programs in terms of Maxwell’s demon experimental setup. Five contemporary variations of imperative programs related to information quantification are identified: (i) information leakage in an imperative program, (ii) an imperative multithreaded program, (iii) point-to-point leakage in an imperative program, (iv) an imperative program with infinite observation, and (v) an imperative program in an SOA-based environment. For these variations, the minimal work required by an attacker to gain the secret is also calculated using the historical Maxwell’s demon experiment. To model the experimental setup of Maxwell’s demon, a non-interference security policy is used. In the present work, imperative programs with one bit of secret information have been considered to avoid complexity. The findings of the present work from the history of physics can be utilized in many areas related to information flow of physical computing, nano-computing, quantum computing, biological computing, energy dissipation in computing, and computing power analysis. (paper)
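
    The quantitative notion of leakage for a one-bit secret can be sketched with Shannon's entropy formula; the posterior distribution below is a hypothetical observation outcome, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# One-bit secret: the attacker's prior uncertainty is exactly 1 bit.
prior = shannon_entropy([0.5, 0.5])            # = 1.0

# Suppose an observable program output lets the attacker revise the
# distribution over the secret to (0.9, 0.1) -- invented numbers.
posterior = shannon_entropy([0.9, 0.1])

# Leakage = reduction in the attacker's uncertainty, in bits.
leakage = prior - posterior
print(round(leakage, 3))                       # -> 0.531
```

    In the Maxwell's demon analogy, this reduction in uncertainty is what bounds the minimal work the demon (attacker) must expend to acquire the secret.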

  9. Robust real-time extraction of respiratory signals from PET list-mode data.

    Science.gov (United States)

    Salomon, Andre; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas

    2018-05-01

    Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions' detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting ("binning") of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method "Combined Local Motion Detection" (CLMD). It uses the Time-of-Flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using 7 measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similar high correlation coefficients which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU
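
    A toy version of the center-of-mass idea behind CLMD can be sketched as follows; the event counts, geometry and breathing model are all invented, and only the axial (z) component of back-positioned events is simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated respiratory motion of a "hot" feature along the scanner axis
# (invented numbers): 0.25 Hz breathing, 5 mm amplitude, 60 s scan.
frame_rate = 10.0                                # short frames per second
t = np.arange(0, 60, 1 / frame_rate)             # frame times (s)
true_motion = 5.0 * np.sin(2 * np.pi * 0.25 * t)

# Per frame, back-positioned TOF events scatter around the moving feature;
# the axial center of mass of each event set gives one signal sample.
signal = np.array([
    rng.normal(loc=100.0 + shift, scale=15.0, size=200).mean()
    for shift in true_motion
])

# Agreement between the extracted and the true respiratory signal
r = np.corrcoef(signal, true_motion)[0, 1]
print(round(r, 2))
```

    Even with broad event scatter, averaging ~200 events per frame recovers the breathing trace well, consistent with the high correlation coefficients reported above.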

  10. Robust real-time extraction of respiratory signals from PET list-mode data

    Science.gov (United States)

    Salomon, André; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas

    2018-06-01

    Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions’ detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting (‘binning’) of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method ‘combined local motion detection’ (CLMD). It uses the time-of-flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using seven measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similar high correlation coefficients which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard

  11. Selection of Suitable DNA Extraction Methods for Genetically Modified Maize 3272, and Development and Evaluation of an Event-Specific Quantitative PCR Method for 3272.

    Science.gov (United States)

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. No such reduction in yield was observed with the GM quicker kit or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI 7900) and the Applied Biosystems 7500 (ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for the detection and quantification of 3272.
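The role of the conversion factor can be shown with a minimal sketch. Only the Cf value of 0.60 (ABI 7900) comes from the abstract; the copy numbers below are invented for illustration:

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO content (%) from real-time PCR copy numbers.
    Cf converts the measured copy-number ratio into a weight ratio."""
    return (event_copies / endogenous_copies) / cf * 100.0

# hypothetical measurement with the Cf reported for the ABI 7900
content = gmo_percent(event_copies=120.0, endogenous_copies=20000.0, cf=0.60)
```

A copy-number ratio of 0.006 divided by Cf = 0.60 corresponds to a GMO content of about 1.0%, comfortably above the estimated 0.5% limit of quantitation.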

  12. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  13. Evaluation of ultrasound-assisted extraction as sample pre-treatment for quantitative determination of rare earth elements in marine biological tissues by inductively coupled plasma-mass spectrometry

    International Nuclear Information System (INIS)

    Costas, M.; Lavilla, I.; Gil, S.; Pena, F.; Calle, I.; Cabaleiro, N. de la; Bendicho, C.

    2010-01-01

    In this work, the determination of rare earth elements (REEs), i.e. Y, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu, in marine biological tissues by inductively coupled plasma-mass spectrometry (ICP-MS) after a sample preparation method based on ultrasound-assisted extraction (UAE) is described. The suitability of the extracts for ICP-MS measurements was evaluated. To this end, studies focused on the following issues: (i) clean-up of extracts with a C18 cartridge for non-polar solid-phase extraction; (ii) use of different internal standards; (iii) signal drift caused by changes in the nebulization efficiency and salt deposition on the cones during the analysis. The signal drift produced by direct introduction of biological extracts into the instrument was evaluated using a calibration verification standard for bracketing (standard-sample bracketing, SSB) and cumulative sum (CUSUM) control charts. Parameters influencing extraction, such as extractant composition, mass-to-volume ratio, particle size, sonication time and sonication amplitude, were optimized. Diluted single acids (HNO3 and HCl) and mixtures (HNO3 + HCl) were evaluated for improving the extraction efficiency. Quantitative recoveries for REEs were achieved using 5 mL of 3% (v/v) HNO3 + 2% (v/v) HCl, particle size <200 μm, 3 min of sonication time and 50% sonication amplitude. Precision, expressed as the relative standard deviation of three independent extractions, ranged from 0.1 to 8%. In general, LODs were improved by a factor of 5 in comparison with those obtained after microwave-assisted digestion (MAD). The accuracy of the method was evaluated using the CRM BCR-668 (mussel tissue). Different seafood samples of common consumption were analyzed by ICP-MS after UAE and MAD.
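The CUSUM control chart mentioned above for monitoring ICP-MS signal drift can be sketched generically. This is a standard tabular CUSUM, not the study's implementation; the target, sigma and the k/h design parameters below are illustrative assumptions:

```python
def cusum_alarms(values, target, sigma, k=0.5, h=5.0):
    """Tabular CUSUM on standardized deviations: returns the indices at which
    an upward or downward drift alarm fires (sums reset after each alarm)."""
    hi = lo = 0.0
    alarms = []
    for i, v in enumerate(values):
        z = (v - target) / sigma
        hi = max(0.0, hi + z - k)   # accumulates upward drift
        lo = max(0.0, lo - z - k)   # accumulates downward drift
        if hi > h or lo > h:
            alarms.append(i)
            hi = lo = 0.0
    return alarms

# hypothetical internal-standard signal: stable at 100, then drifting low
signal = [100.0] * 10 + [97.0] * 8
alarms = cusum_alarms(signal, target=100.0, sigma=2.0)
```

Because the CUSUM accumulates small systematic deviations, it flags the sustained 1.5-sigma drop after a few drifted points, while the stable leading segment produces no alarm.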

  14. Label-free quantitative mass spectrometry for analysis of protein antigens in a meningococcal group B outer membrane vesicle vaccine.

    Science.gov (United States)

    Dick, Lawrence W; Mehl, John T; Loughney, John W; Mach, Anna; Rustandi, Richard R; Ha, Sha; Zhang, Lan; Przysiecki, Craig T; Dieter, Lance; Hoang, Van M

    2015-01-01

    The development of a multivalent outer membrane vesicle (OMV) vaccine where each strain contributes multiple key protein antigens presents numerous analytical challenges. One major difficulty is the ability to accurately and specifically quantitate each antigen, especially during early development and process optimization when immunoreagents are limited or unavailable. To overcome this problem, quantitative mass spectrometry methods can be used. In place of traditional immunoassays such as enzyme-linked immunosorbent assays (ELISAs), quantitative LC-MS/MS using multiple reaction monitoring (MRM) can be used during early-phase process development to measure key protein components in complex vaccines in the absence of specific immunoreagents. Multiplexed, label-free quantitative mass spectrometry methods using protein extraction by either detergent or 2-phase solvent were developed to quantitate levels of several meningococcal serogroup B protein antigens in an OMV vaccine candidate. Precision was demonstrated to be less than 15% RSD for the 2-phase extraction method and less than 10% RSD for the detergent extraction method. Accuracy was 70-130% for the 2-phase extraction method and 90-110% for the detergent extraction method. The viability of MS-based protein quantification as a vaccine characterization method was demonstrated, and its advantages over traditional quantitative methods were evaluated. Implementation of these MS-based quantification methods can help to decrease the development time for complex vaccines and can provide orthogonal confirmation of results from existing antigen quantification techniques.

  15. Anti-oxidative and antimicrobial activities of Hieracium pilosella L. extracts

    Directory of Open Access Journals (Sweden)

    LJILJANA P. STANOJEVIC

    2008-05-01

    Full Text Available The anti-oxidative and antimicrobial activities of different extracts from the whole plant of Hieracium pilosella L. (Asteraceae) were investigated. Total dry extract yields were determined for all the investigated solvents: methanol, dichloromethane, ethyl acetate and dichloromethane:methanol (9:1). It was found that the highest yield was obtained by extraction with methanol (12.9 g/100 g of dry plant material). Qualitative and quantitative analyses were performed by HPLC, using external standards. Chlorogenic acid, apigenin-7-O-glucoside and umbelliferone were detected in the highest quantities in the extracts. The qualitative and quantitative composition of the extracts depends on the solvent used. The 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging effect of the extracts was determined spectrophotometrically. The highest radical scavenging effect was observed for the methanolic extract, both with and without incubation (EC50 = 0.012 and 0.015 mg/ml, respectively). The antimicrobial activities of the extracts towards the bacteria (Escherichia coli, Pseudomonas aeruginosa, Staphylococcus aureus, Bacillus subtilis, Salmonella enteritidis and Klebsiella pneumoniae) and the fungi (Aspergillus niger and Candida albicans) were determined by the disc diffusion method. The minimal inhibitory concentrations were determined for all the investigated extracts against all the mentioned microorganisms.
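EC50 values like those reported are commonly read off the DPPH dose-response curve. A minimal sketch with an invented concentration/inhibition series (not the study's data), using linear interpolation at 50% scavenging:

```python
import numpy as np

def ec50(concs, inhibition_pct):
    """EC50 by linear interpolation of a DPPH scavenging curve.
    Requires inhibition_pct to be monotonically increasing with concs."""
    return float(np.interp(50.0, inhibition_pct, concs))

# hypothetical dose-response data (concentrations in mg/ml)
concs = np.array([0.005, 0.010, 0.015, 0.020])
inh = np.array([20.0, 45.0, 62.0, 80.0])
value = ec50(concs, inh)
```

Here 50% scavenging falls between the 0.010 and 0.015 mg/ml points, so the interpolated EC50 lands near 0.0115 mg/ml, the same order of magnitude as the reported values.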

  16. Algorithm of pulmonary emphysema extraction using thoracic 3D CT images

    Science.gov (United States)

    Saita, Shinsuke; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Nakano, Yasutaka; Ohmatsu, Hironobu; Tominaga, Keigo; Eguchi, Kenji; Moriyama, Noriyuki

    2007-03-01

    Recently, owing to aging populations and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desirable. We describe an algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to initial and follow-up thoracic 3-D CT images, we demonstrate its potential effectiveness in assisting radiologists and physicians to quantitatively evaluate the distribution of emphysematous lesions and their evolution over time.
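The LAA extraction step reduces, at its core, to thresholding attenuation values inside a lung mask. A minimal sketch; the -950 HU cutoff is a commonly used emphysema threshold assumed here, not a value stated in the abstract:

```python
import numpy as np

def laa_percent(hu_volume, lung_mask, threshold=-950.0):
    """Percentage of lung voxels classified as low attenuation area (LAA)."""
    lung = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung < threshold) / lung.size

# toy CT volume: normal lung parenchyma around -800 HU,
# with two voxels below the emphysema threshold
vol = np.full((4, 4, 4), -800.0)
mask = np.ones_like(vol, dtype=bool)
vol[0, 0, :2] = -980.0
laa = laa_percent(vol, mask)
```

Comparing this percentage between an initial and a follow-up scan gives the kind of time-interval change the abstract describes.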

  17. Chemometric study of Andalusian extra virgin olive oils Raman spectra: Qualitative and quantitative information.

    Science.gov (United States)

    Sánchez-López, E; Sánchez-Rodríguez, M I; Marinas, A; Marinas, J M; Urbano, F J; Caridad, J M; Moalem, M

    2016-08-15

    Authentication of extra virgin olive oil (EVOO) is an important topic for the olive oil industry. Fraudulent practices in this sector are a major problem affecting both producers and consumers. This study analyzes the capability of FT-Raman spectroscopy combined with chemometric treatments to predict fatty acid contents (quantitative information), using gas chromatography as the reference technique, and to classify diverse EVOOs as a function of harvest year, olive variety, geographical origin and Andalusian PDO (qualitative information). The optimal number of PLS components summarizing the spectral information was introduced progressively. For the estimation of the fatty acid composition, the lowest error (both in fitting and prediction) corresponded to MUFA, followed by SAFA and PUFA, though such errors were close to zero in all cases. As regards the qualitative variables, discriminant analysis allowed a correct classification of 94.3%, 84.0%, 89.0% and 86.6% of samples for harvest year, olive variety, geographical origin and PDO, respectively.
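The PLS step that links spectra to fatty-acid contents can be sketched with a minimal NIPALS PLS1 implementation. This is a generic textbook sketch, not the authors' chemometric pipeline; the random matrix stands in for Raman spectra and the noise-free linear response stands in for a measured fatty-acid content:

```python
import numpy as np

def pls1_fit_predict(X, y, n_components):
    """Minimal NIPALS PLS1: fit on centered data, return training predictions."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    W, P, Q = [], [], []
    Xd, yd = Xc.copy(), yc.copy()
    for _ in range(n_components):
        w = Xd.T @ yd                       # weight vector for this component
        nw = np.linalg.norm(w)
        if nw < 1e-12:                      # residual already fully explained
            break
        w /= nw
        t = Xd @ w                          # scores
        tt = t @ t
        p = Xd.T @ t / tt                   # X loadings
        q = (yd @ t) / tt                   # y loading
        Xd -= np.outer(t, p)                # deflate
        yd -= q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)     # regression vector
    return Xc @ B + y.mean()

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))                # stand-in "spectra"
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta                                # noise-free "fatty acid content"
pred = pls1_fit_predict(X, y, n_components=5)
```

With as many components as (full-rank) predictors and a noise-free linear response, PLS1 coincides with ordinary least squares, so the fitted values reproduce y; in practice far fewer components are retained, which is the point of introducing them progressively.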

  18. Chromatographic analysis of wheatgrass extracts

    Directory of Open Access Journals (Sweden)

    Masood Shah Khan

    2015-01-01

    Full Text Available Aim: Wheatgrass (WG) is the shoot of Triticum aestivum Linn., belonging to the family Gramineae, and possesses a high chlorophyll content along with essential vitamins, minerals, vital enzymes, amino acids and dietary fibers. It has been shown to possess anti-cancer, anti-ulcer, antioxidant, and anti-arthritic activity due to the presence of biologically active compounds and minerals. Therefore, in the present study, high-performance thin layer chromatography (HPTLC) and high-performance liquid chromatography (HPLC) methods for qualitative and quantitative analysis are proposed, which will help in the quality evaluation of wheatgrass extract. Materials and Methods: Samples for analysis were prepared in methanol and water simply by sonication. These were applied to a pre-coated silica plate, and chromatograms were developed using toluene:ethyl acetate:formic acid. HPLC analysis was done on a Waters HPLC system with a Merck C18 column, using water, methanol, and acetonitrile as the mobile phase. Results: HPTLC fingerprinting of the alcoholic extracts of WG revealed 10–11 spots at the wavelengths 254, 366, and 435 nm. HPLC fingerprinting produced 22 peaks at 256 nm. Quantitative HPTLC analysis determined the gallic acid content to be 0.077% w/w in the aqueous extract. By HPLC, the contents of gallic acid and rutin were found to be 0.07% and 0.04% w/w, respectively, in the aqueous extract of WG. Conclusion: The developed HPLC and HPTLC fingerprinting methods can be used for the quality control and standardization of WG and its extracts used as a nutritional supplement.

  19. Extractive Spectrophotometric Determination of Omeprazole in ...

    African Journals Online (AJOL)

    Erah

    Abstract. Purpose: To develop a simple, rapid and selective method for the extractive spectrophotometric determination of .... The colour intensity of the organic layer had shown a very .... considerable attention for quantitative analyses of many ...

  20. EXTRACTION AND QUANTITATIVE DETERMINATION OF ASCORBIC ACID FROM BANANA PEEL MUSA ACUMINATA ‘KEPOK’

    Directory of Open Access Journals (Sweden)

    Khairul Anwar Mohamad Said

    2016-04-01

    Full Text Available This paper discusses the extraction of an antioxidant compound, ascorbic acid (vitamin C), from banana peel using an ultrasound-assisted extraction (UAE) method. The type of banana used was Musa acuminata, also known as “Pisang Kepok” in Malaysia. The investigation covers the effect of solvent-to-solid ratio (4.5, 5 and 10 ml/g), sonication time (15, 30 and 45 min) and temperature (30, 45 and 60 °C) on the extraction of ascorbic acid from the banana peel, in order to determine the optimum operating conditions. Of all the extract samples analyzed by redox titration with iodine solution, the highest yield was 0.04939 ± 0.00080 mg, obtained from extraction at 30 °C for 15 min with a 5 ml/g solvent-to-solid ratio. KEYWORDS: Musa acuminata; ultrasound-assisted extraction; vitamin C; redox titration
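The iodine redox titration quantifies vitamin C through the 1:1 reaction C6H8O6 + I2 -> C6H6O6 + 2 HI. A minimal sketch of the back-calculation; the titrant molarity and titre volume below are invented to land near the reported yield scale, not taken from the paper:

```python
ASCORBIC_ACID_MOLAR_MASS = 176.12  # g/mol

def ascorbic_acid_mg(iodine_molarity, titre_ml):
    """Mass of ascorbic acid (mg) titrated by a given volume of iodine,
    assuming 1:1 stoichiometry between I2 and ascorbic acid."""
    moles_i2 = iodine_molarity * titre_ml / 1000.0   # mol of I2 consumed
    return moles_i2 * ASCORBIC_ACID_MOLAR_MASS * 1000.0  # g -> mg

# hypothetical titration: 0.56 ml of 0.0005 M iodine consumed by an aliquot
mass_mg = ascorbic_acid_mg(iodine_molarity=0.0005, titre_ml=0.56)
```

Each mole of iodine consumed corresponds to one mole (176.12 g) of ascorbic acid, so sub-milligram yields like those reported follow directly from sub-micromole titres.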

  1. Scholarly Information Extraction Is Going to Make a Quantum Leap with PubMed Central (PMC).

    Science.gov (United States)

    Matthies, Franz; Hahn, Udo

    2017-01-01

    With the increasing availability of complete full texts (journal articles), rather than their surrogates (titles, abstracts), as resources for text analytics, entirely new opportunities arise for information extraction and text mining from scholarly publications. Yet, we gathered evidence that a range of problems is encountered in full-text processing when biomedical text analytics simply reuse existing NLP pipelines which were developed on the basis of abstracts (rather than full texts). We conducted experiments with four different relation extraction engines, all of which were top performers in previous BioNLP Event Extraction Challenges. We found that abstract-trained engines lose up to 6.6% F-score points when run on full-text data. Hence, the reuse of existing abstract-based NLP software in a full-text scenario is considered harmful because of heavy performance losses. Given the current lack of annotated full-text resources to train on, our study quantifies the price paid for this shortcut.

  2. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a
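The skill metrics used above (bias, correlation, ubRMSE) are standard in soil moisture validation; ubRMSE is the RMSE computed after removing each series' mean, so that a constant offset between model and in situ data does not inflate it. A minimal sketch with invented numbers:

```python
import numpy as np

def skill_metrics(model, insitu):
    """Bias, RMSE and unbiased RMSE (ubRMSE) of a soil moisture series
    against in situ measurements (units: m^3 m^-3)."""
    bias = float(np.mean(model - insitu))
    rmse = float(np.sqrt(np.mean((model - insitu) ** 2)))
    ubrmse = float(np.sqrt(np.mean(
        ((model - model.mean()) - (insitu - insitu.mean())) ** 2)))
    return bias, rmse, ubrmse

# hypothetical series: the model carries only a constant 0.02 offset
obs = np.array([0.20, 0.25, 0.30, 0.28, 0.22])
mod = obs + 0.02
bias, rmse, ubrmse = skill_metrics(mod, obs)
```

For a pure offset the bias and RMSE both equal 0.02 m³ m⁻³ while the ubRMSE vanishes, which is exactly why bias and ubRMSE are reported separately in the abstract.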

  3. Comparison of accelerated solvent extraction and standard shaking extraction for determination of dioxins in foods

    Energy Technology Data Exchange (ETDEWEB)

    Hori, T.; Tobiishi, K.; Ashizuka, Y.; Nakagawa, R.; Iida, T. [Fukuoka Institute of Health and Environmental Sciences, Fukuoka (Japan); Tsutsumi, T.; Sasaki, K. [National Institute of Health Sciences, Tokyo (Japan)

    2004-09-15

    We previously developed a highly sensitive method for determining dioxin content in food using a solvent cut large volume (SCLV) injection system coupled to a cyanopropyl-phase capillary column. The SCLV injection system coupled to a 40 m Rtx-2330 column showed sufficient separation of the 2,3,7,8-chlorine-substituted isomers, and had at least five times higher sensitivity than the conventional injection technique. In the current method, a large sample amount (generally 100 g) must be treated collectively in order to attain the desirable limits of detection (LODs) at low ppt levels, namely 0.01 pg/g for tetra-CDD and -CDF. The present method allowed the reduction of the sample amount from 100 g to 20 g when such usual LODs are demanded. The SCLV injection technique is expected to improve the efficiency of laboratory performance, especially when it is coupled to an automated extraction method, such as accelerated solvent extraction (ASE). In order to examine the applicability of ASE to the determination of dioxins in food samples, it is important to verify its extraction efficacy against that of the conventional technique. In the present study we examined the applicability of ASE to the determination of dioxins in food samples, and its performance was compared with that of standard conventional shaking extraction (separatory funnel extraction) regarding recovery rates and quantitative determination. Homogeneous material, such as dried seaweed powder or dried milk powder, is considered suitable for quantitative validation of the method.

  5. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars, a total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned into 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar, although nonsignificant, concomitant increase in pulse values was observed on completion of both versions. Completion of an overdisclosed informed consent document is thus associated with changes in physiological parameters. The results suggest that overdetailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  6. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)

  7. Ammonium chloride salting out extraction/cleanup for trace-level quantitative analysis in food and biological matrices by flow injection tandem mass spectrometry.

    Science.gov (United States)

    Nanita, Sergio C; Padivitage, Nilusha L T

    2013-03-20

    A sample extraction and purification procedure that uses ammonium-salt-induced acetonitrile/water phase separation was developed and demonstrated to be compatible with the recently reported method for pesticide residue analysis based on fast extraction and dilution flow injection mass spectrometry (FED-FI-MS). The ammonium salts evaluated were chloride, acetate, formate, carbonate, and sulfate. A mixture of NaCl and MgSO4, the salts used in the well-known QuEChERS method, was also tested for comparison. The low thermal decomposition/evaporation temperatures of the ammonium salts resulted in negligible ion source residue under typical electrospray conditions, leading to consistent method performance and less instrument cleaning. Although all ammonium salts tested induced acetonitrile/water phase separation, NH4Cl yielded the best performance and was therefore the preferred salting-out agent. The NH4Cl salting-out method was successfully coupled with FI/MS/MS and tested for fourteen pesticide active ingredients: chlorantraniliprole, cyantraniliprole, chlorimuron ethyl, oxamyl, methomyl, sulfometuron methyl, chlorsulfuron, triflusulfuron methyl, azimsulfuron, flupyrsulfuron methyl, aminocyclopyrachlor, aminocyclopyrachlor methyl, diuron and hexazinone. A validation study was conducted with nine complex matrices: sorghum, rice, grapefruit, canola, milk, eggs, beef, urine and blood plasma. The method is applicable to all analytes except aminocyclopyrachlor. The method was deemed appropriate for quantitative analysis in 114 out of 126 analyte/matrix cases tested (applicability rate = 0.90). The NH4Cl salting-out extraction/cleanup allowed expansion of FI/MS/MS to the analysis of food of plant and animal origin, and of body fluids, with increased ruggedness and sensitivity, while maintaining high throughput (run time = 30 s/sample). Limits of quantitation (LOQs) of 0.01 mg kg⁻¹ (ppm), the 'well-accepted standard' in pesticide residue analysis, were achieved in >80% of cases tested; while limits of detection

  8. New microwave-integrated Soxhlet extraction. An advantageous tool for the extraction of lipids from food products.

    Science.gov (United States)

    Virot, Matthieu; Tomao, Valérie; Colnagui, Giulio; Visinoni, Franco; Chemat, Farid

    2007-12-07

    A new process of Soxhlet extraction assisted by microwave was designed and developed. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. A second-order central composite design (CCD) has been used to investigate the performance of the new device. The results provided by analysis of variance and Pareto chart, indicated that the extraction time was the most important factor followed by the leaching time. The response surface methodology allowed us to determine optimal conditions for olive oil extraction: 13 min of extraction time, 17 min of leaching time, and 720 W of irradiation power. The proposed process is suitable for lipids determination from food. Microwave-integrated Soxhlet (MIS) extraction has been compared with a conventional technique, Soxhlet extraction, for the extraction of oil from olives (Aglandau, Vaucluse, France). The oils extracted by MIS for 32 min were quantitatively (yield) and qualitatively (fatty acid composition) similar to those obtained by conventional Soxhlet extraction for 8 h. MIS is a green technology and appears as a good alternative for the extraction of fat and oils from food products.
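The central composite design and response surface methodology mentioned above amount to fitting a second-order polynomial to the measured yields and locating its stationary point. A generic sketch; the synthetic yield surface and its optimum at (13, 17) are chosen to echo the reported optimal extraction and leaching times, purely for illustration:

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2
    + b12*x1*x2, the second-order model behind a central composite design."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# synthetic noise-free "oil yield" with a known maximum at (13, 17)
rng = np.random.default_rng(2)
x1 = rng.uniform(5, 20, 30)    # stand-in "extraction time" (min)
x2 = rng.uniform(10, 25, 30)   # stand-in "leaching time" (min)
y = 50 - 0.5 * (x1 - 13) ** 2 - 0.3 * (x2 - 17) ** 2

coef = fit_quadratic_surface(x1, x2, y)
# stationary point: solve grad = 0, i.e. H @ x = -[b1, b2]
H = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
opt = np.linalg.solve(H, -coef[1:3])
```

Solving the gradient equations of the fitted surface recovers the optimum settings, which is how response surface methodology turns a handful of designed runs into optimal operating conditions.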

  9. Assessment of the ion-trap mass spectrometer for routine qualitative and quantitative analysis of drugs of abuse extracted from urine.

    Science.gov (United States)

    Vorce, S P; Sklerov, J H; Kalasinsky, K S

    2000-10-01

    The ion-trap mass spectrometer (MS) has been available as a detector for gas chromatography (GC) for nearly two decades. However, it still occupies a minor role in forensic toxicology drug-testing laboratories. Quadrupole MS instruments make up the majority of GC detectors used in drug confirmation. This work addresses the use of these two MS detectors, comparing the ion ratio precision and quantitative accuracy for the analysis of different classes of abused drugs extracted from urine. Urine specimens were prepared at five concentrations each for amphetamine (AMP), methamphetamine (METH), benzoylecgonine (BZE), delta9-carboxy-tetrahydrocannabinol (delta9-THCCOOH), phencyclidine (PCP), morphine (MOR), codeine (COD), and 6-acetylmorphine (6-AM). Concentration ranges for AMP, METH, BZE, delta9-THCCOOH, PCP, MOR, COD, and 6-AM were 50-2500, 50-5000, 15-800, 1.5-65, 1-250, 500-32000, 250-21000, and 1.5-118 ng/mL, respectively. Sample extracts were injected into a GC-quadrupole MS operating in selected ion monitoring (SIM) mode and a GC-ion-trap MS operating in either selected ion storage (SIS) or full scan (FS) mode. Precision was assessed by the evaluation of five ion ratios for n = 15 injections at each concentration using a single-point calibration. Precision measurements for SIM ion ratios provided coefficients of variation (CV) between 2.6 and 9.8% for all drugs. By comparison, the SIS and FS data yielded CV ranges of 4.0-12.8% and 4.0-11.2%, respectively. The total ion ratio failure rates were 0.2% (SIM), 0.7% (SIS), and 1.2% (FS) for the eight drugs analyzed. Overall, the SIS mode produced stable, comparable mean ratios over the concentration ranges examined, but had greater variance within batch runs. Examination of postmortem and quality-control samples produced forensically accurate quantitation by SIS when compared to SIM. Furthermore, sensitivity of FS was equivalent to SIM for all compounds examined except for 6-AM.
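The precision figures above are coefficients of variation of qualifier/quantifier ion ratios across replicate injections. A minimal sketch of that computation; the peak areas below are invented, not data from the study:

```python
import statistics

def ion_ratio_cv(quant_areas, qual_areas):
    """Coefficient of variation (%) of the qualifier/quantifier ion ratio
    across replicate injections."""
    ratios = [qual / quant for quant, qual in zip(quant_areas, qual_areas)]
    return 100.0 * statistics.stdev(ratios) / statistics.fmean(ratios)

# hypothetical peak areas from five replicate injections
quant = [1000.0, 1020.0, 980.0, 1010.0, 990.0]
qual = [350.0, 360.0, 340.0, 352.0, 348.0]
cv = ion_ratio_cv(quant, qual)
```

Comparing such CVs between acquisition modes (SIM vs. SIS vs. full scan) is exactly how the abstract's 2.6-12.8% precision ranges were obtained.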

  10. Microwave-assisted extraction and accelerated solvent extraction with ethyl acetate-cyclohexane before determination of organochlorines in fish tissue by gas chromatography with electron-capture detection.

    Science.gov (United States)

    Weichbrodt, M; Vetter, W; Luckas, B

    2000-01-01

    Focused open-vessel microwave-assisted extraction (FOV-MAE), closed-vessel microwave-assisted extraction (CV-MAE), and accelerated solvent extraction (ASE) were used for extraction before determination of organochlorine compounds (polychlorinated biphenyls, DDT, toxaphene, chlordane, hexachlorobenzene, hexachlorocyclohexanes, and dieldrin) in cod liver and fish fillets. Wet samples were extracted without the time-consuming step of lyophilization or other sample-drying procedures. Extractions were performed with the solvent mixture ethyl acetate-cyclohexane (1 + 1, v/v), which allowed direct use of gel-permeation chromatography without solvent exchange. For FOV-MAE, the solvent mixture removed water from the sample matrix via azeotropic distillation. The status of water removal was controlled during extraction by measuring the temperature of the distillate. After water removal, the temperature of the distillate increased and the solvent mixture became less polar. Only the pure extraction solvent allowed quantitative extraction of the organochlorine compounds. For CV-MAE, water could not be separated during the extraction. For this reason, the extraction procedure for wet fish tissue required 2 extraction steps: the first for manual removal of coextracted water, and the second for quantitative extraction of the organochlorine compounds with the pure solvent. Therefore, CV-MAE is less convenient for samples with high water content. For ASE, water in the sample was bound with Na2SO4. The reproducibility for each technique was very good (relative standard deviation was typically <10%); the slightly varying levels were attributed to deviations during sample cleanup and the generally low levels.

  11. Lab-on-capillary: a rapid, simple and quantitative genetic analysis platform integrating nucleic acid extraction, amplification and detection.

    Science.gov (United States)

    Fu, Yu; Zhou, Xiaoming; Xing, Da

    2017-12-05

    In this work, we describe for the first time a genetic diagnosis platform employing a polydiallyldimethylammonium chloride (PDDA)-modified capillary and a liquid-based thermalization system for rapid, simple and quantitative DNA analysis with minimal user interaction. Positively charged PDDA is modified on the inner surface of the silicon dioxide capillary by using an electrostatic self-assembly approach that allows the negatively charged DNA to be separated from the lysate in less than 20 seconds. The capillary loaded with the PCR mix is incorporated in the thermalization system, which can achieve on-site real-time PCR. This system is based on the circulation of pre-heated liquids in the chamber, allowing for high-speed thermalization of the capillary and fast amplification. Multiple targets can be simultaneously analysed with multiplex spatial melting. Starting with live Escherichia coli (E. coli) cells in milk, as a realistic sample, the current method can achieve DNA extraction, amplification, and detection within 40 min.

  12. Radiometric determination of 90Sr in the dissolver solution of the spent PHWR fuel after its separation with solvent extraction and extraction chromatography

    International Nuclear Information System (INIS)

    Kulkarni, P.G.; Gupta, K.K.; Pant, D.K.; Bhalerao, B.A.; Gurba, P.B.; Janardan, P.; Changrani, R.D.; Dey, P.K.; Pathak, P.N.; Mohapatra, P.K.; Manchanda, V.K.

    2010-01-01

    A simple radiometric method for 90Sr determination in the dissolver solution of PHWR spent fuel has been developed. The method involves the quantitative separation of Sr from the associated actinides and other fission products by solvent extraction with 30% trialkylphosphine oxide (TRPO) in n-dodecane, followed by extraction chromatography with XAD-7-dibutylcyclohexano-18-crown-6 resin. The separation scheme yields quantitative recovery of 90Sr, and the separated 90Sr was found to be radiochemically pure. 90Sr was estimated by β-radiometry, and the precision of the method at the 5 mCi/mL level was 2% (RSD). (author)

  13. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features including sensor bias data, each tessera in the high-density point cloud from the 3D captured complex mosaics of Germigny-des-Prés (France) is segmented via a colour-based multi-scale abstraction that extracts connectivity. A 2D surface and an outline polygon of each tessera are generated by RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
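
The two geometric steps named in the abstract, RANSAC plane extraction and convex hull fitting of each tessera outline, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, distance threshold and iteration count are assumptions.

```python
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.01, seed=None):
    """Fit a plane to (N, 3) points by RANSAC; returns (normal, d, inlier_mask)."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        normal = normal / norm
        d = -normal @ p0
        inliers = np.abs(points @ normal + d) < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model[0], best_model[1], best_inliers

def convex_hull_2d(pts):
    """Andrew's monotone chain on (N, 2) points; returns hull vertices."""
    pts = sorted(map(tuple, np.asarray(pts)))
    def cross(o, a, b):  # z-component of (a-o) x (b-o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = half(pts), half(pts[::-1])
    return np.array(lower[:-1] + upper[:-1])
```

In a pipeline like the one described, the inlier points of each segmented tessera would be projected onto the fitted plane before taking the 2D hull.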

  14. Combined Extraction Processes of Lipid from Chlorella vulgaris Microalgae: Microwave Prior to Supercritical Carbon Dioxide Extraction

    Science.gov (United States)

    Dejoye, Céline; Vian, Maryline Abert; Lumia, Guy; Bouscarle, Christian; Charton, Frederic; Chemat, Farid

    2011-01-01

    Extraction yields and fatty acid profiles from freeze-dried Chlorella vulgaris obtained by microwave pretreatment followed by supercritical carbon dioxide extraction (MW-SCCO2) were compared with those obtained by supercritical carbon dioxide extraction alone (SCCO2). Work performed over a pressure range of 20–28 MPa and a temperature interval of 40–70 °C gave the highest extraction yield (w/w dry weight) at 28 MPa/40 °C. MW-SCCO2 gave the highest extraction yield (4.73%) compared to SCCO2 extraction alone (1.81%). Qualitative and quantitative analyses of the microalgae oil showed that palmitic, oleic, linoleic and α-linolenic acid were the most abundant of the identified fatty acids. Oils obtained by MW-SCCO2 extraction had higher concentrations of fatty acids than those from SCCO2 extraction without pretreatment. Native-form, microwave-pretreated and untreated microalgae were observed by scanning electron microscopy (SEM). SEM micrographs of pretreated microalgae show torn cell-wall agglomerates. After SCCO2, microwave-pretreated microalgae presented several microcracks, while the cell walls of native-form microalgae were only slightly damaged. PMID:22272135

  15. Combined Extraction Processes of Lipid from Chlorella vulgaris Microalgae: Microwave Prior to Supercritical Carbon Dioxide Extraction

    Directory of Open Access Journals (Sweden)

    Farid Chemat

    2011-12-01

    Full Text Available Extraction yields and fatty acid profiles from freeze-dried Chlorella vulgaris obtained by microwave pretreatment followed by supercritical carbon dioxide extraction (MW-SCCO2) were compared with those obtained by supercritical carbon dioxide extraction alone (SCCO2). Work performed over a pressure range of 20–28 MPa and a temperature interval of 40–70 °C gave the highest extraction yield (w/w dry weight) at 28 MPa/40 °C. MW-SCCO2 gave the highest extraction yield (4.73%) compared to SCCO2 extraction alone (1.81%). Qualitative and quantitative analyses of the microalgae oil showed that palmitic, oleic, linoleic and α-linolenic acid were the most abundant of the identified fatty acids. Oils obtained by MW-SCCO2 extraction had higher concentrations of fatty acids than those from SCCO2 extraction without pretreatment. Native-form, microwave-pretreated and untreated microalgae were observed by scanning electron microscopy (SEM). SEM micrographs of pretreated microalgae show torn cell-wall agglomerates. After SCCO2, microwave-pretreated microalgae presented several microcracks, while the cell walls of native-form microalgae were only slightly damaged.

  16. Application of information and complexity theories to public opinion polls. The case of Greece (2004-2007)

    OpenAIRE

    Panos, C. P.; Chatzisavvas, K. Ch.

    2007-01-01

    A general methodology to study public opinion, inspired by information and complexity theories, is outlined. It is based on probabilistic data extracted from opinion polls. It gives a quantitative information-theoretic explanation of the high job approval of the Greek Prime Minister Mr. Constantinos Karamanlis (2004-2007), while the same time series of polls conducted by the company Metron Analysis showed that his party New Democracy (abbr. ND) was slightly higher than the opposition party of PASOK -...

  17. Quantitative Analysis of First-Pass Contrast-Enhanced Myocardial Perfusion Multidetector CT Using a Patlak Plot Method and Extraction Fraction Correction During Adenosine Stress

    Science.gov (United States)

    Ichihara, Takashi; George, Richard T.; Silva, Caterina; Lima, Joao A. C.; Lardo, Albert C.

    2011-02-01

    The purpose of this study was to develop a quantitative method for myocardial blood flow (MBF) measurement that can be used to derive accurate myocardial perfusion measurements from dynamic multidetector computed tomography (MDCT) images by using a compartment model for calculating the first-order transfer constant (K1) with correction for the capillary transit extraction fraction (E). Six canine models of left anterior descending (LAD) artery stenosis were prepared and underwent first-pass contrast-enhanced MDCT perfusion imaging during adenosine infusion (0.14-0.21 mg/kg/min). K1, which is the first-order transfer constant from left ventricular (LV) blood to myocardium, was measured using the Patlak plot method applied to time-attenuation curve data of the LV blood pool and myocardium. The results were compared against microsphere MBF measurements, and the extraction fraction of the contrast agent was calculated. K1 is related to the regional MBF as K1 = EF, E = (1 - exp(-PS/F)), where PS is the permeability-surface area product and F is myocardial flow. Based on the above relationship, a look-up table from K1 to MBF can be generated, and Patlak plot-derived K1 values can be converted to the calculated MBF. The calculated MBF and microsphere MBF showed a strong linear association. The extraction fraction in dogs as a function of flow (F) was E = (1 - exp(-(0.2532F + 0.7871)/F)). Regional MBF can be measured accurately using the Patlak plot method based on a compartment model and a look-up table with extraction fraction correction from K1 to MBF.
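
The K1-to-MBF conversion described above can be sketched numerically: compute K1 = E(F)·F on a flow grid using the dog-specific extraction fraction reported in the abstract, then invert by table lookup. This is a minimal illustration; the flow grid range, units (ml/min/g) and function names are assumptions.

```python
import numpy as np

def extraction_fraction(flow):
    """Dog-specific E(F) from the study: E = 1 - exp(-(0.2532*F + 0.7871)/F)."""
    return 1.0 - np.exp(-(0.2532 * flow + 0.7871) / flow)

def k1_from_flow(flow):
    """Crone-Renkin relation K1 = E * F."""
    return extraction_fraction(flow) * flow

def mbf_from_k1(k1, f_grid=None):
    """Invert K1 = E(F)*F via a precomputed look-up table (K1 is monotone in F)."""
    if f_grid is None:
        f_grid = np.linspace(0.1, 8.0, 2000)  # assumed flow range, ml/min/g
    return np.interp(k1, k1_from_flow(f_grid), f_grid)
```

A round trip (flow → K1 → flow) recovers the input to within the lookup-table resolution.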

  18. Historical Patterns Based on Automatically Extracted Data: the Case of Classical Composers

    DEFF Research Database (Denmark)

    Borowiecki, Karol; O'Hagan, John

    2012-01-01

    The purpose of this paper is to demonstrate the potential for generating interesting aggregate data on certain aspects of the lives of thousands of composers, and indeed other creative groups, from large on-line dictionaries, and to be able to do so relatively quickly. A purpose-built Java application that automatically extracts and processes information was developed to generate data on the birth location, occupations and importance (using word count methods) of over 12,000 composers over six centuries. Quantitative measures of the relative importance of different types of music and of the different music instruments over the centuries were also generated. Finally, quantitative indicators of the importance of different cities over the different centuries in the lives of these composers are constructed. A range of interesting findings emerge in relation to all of these aspects of the lives...

  19. Audio-Visual Speech Recognition Using Lip Information Extracted from Side-Face Images

    Directory of Open Access Journals (Sweden)

    Koji Iwano

    2007-03-01

    Full Text Available This paper proposes an audio-visual speech recognition method using lip information extracted from side-face images as an attempt to increase noise robustness in mobile environments. Our proposed method assumes that lip images can be captured using a small camera installed in a handset. Two different kinds of lip features, lip-contour geometric features and lip-motion velocity features, are used individually or jointly, in combination with audio features. Phoneme HMMs modeling the audio and visual features are built based on the multistream HMM technique. Experiments conducted using Japanese connected digit speech contaminated with white noise in various SNR conditions show effectiveness of the proposed method. Recognition accuracy is improved by using the visual information in all SNR conditions. These visual features were confirmed to be effective even when the audio HMM was adapted to noise by the MLLR method.

  20. Quantitative polarized Raman spectroscopy in highly turbid bone tissue.

    Science.gov (United States)

    Raghavan, Mekhala; Sahar, Nadder D; Wilson, Robert H; Mycek, Mary-Ann; Pleshko, Nancy; Kohn, David H; Morris, Michael D

    2010-01-01

    Polarized Raman spectroscopy allows measurement of molecular orientation and composition and is widely used in the study of polymer systems. Here, we extend the technique to the extraction of quantitative orientation information from bone tissue, which is optically thick and highly turbid. We discuss multiple scattering effects in tissue and show that repeated measurements using a series of objectives of differing numerical apertures can be employed to assess the contributions of sample turbidity and depth of field on polarized Raman measurements. A high numerical aperture objective minimizes the systematic errors introduced by multiple scattering. We test and validate the use of polarized Raman spectroscopy using wild-type and genetically modified (oim/oim model of osteogenesis imperfecta) murine bones. Mineral orientation distribution functions show that mineral crystallites are not as well aligned (p<0.05) in oim/oim bones (28±3 deg) compared to wild-type bones (22±3 deg), in agreement with small-angle X-ray scattering results. In wild-type mice, backbone carbonyl orientation is 76±2 deg and in oim/oim mice, it is 72±4 deg (p>0.05). We provide evidence that simultaneous quantitative measurements of mineral and collagen orientations on intact bone specimens are possible using polarized Raman spectroscopy.

  1. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
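
As a rough illustration of the kind of fit described, the reversible (Nernstian) limit of the Laviron surface wave gives a peak-shaped current centred at E0', from which E0' can be recovered by least squares. This sketch is not the authors' fitting code; the one-electron reversible wave shape, the grid search and the names are assumptions.

```python
import numpy as np

F_CONST, R, T = 96485.0, 8.314, 298.15
f = F_CONST / (R * T)  # 1/V, assuming n = 1

def surface_wave(E, E0, i_peak):
    """Reversible surface-confined voltammetric peak (Laviron limit):
    i proportional to exp(x)/(1+exp(x))^2 with x = f*(E - E0), max at E = E0."""
    x = f * (E - E0)
    return 4.0 * i_peak * np.exp(x) / (1.0 + np.exp(x)) ** 2

def fit_E0(E, i, grid=None):
    """Least-squares grid search for the formal potential E0' (volts)."""
    if grid is None:
        grid = np.linspace(E.min(), E.max(), 1000)
    resid = [np.sum((i - surface_wave(E, e0, i.max())) ** 2) for e0 in grid]
    return grid[int(np.argmin(resid))]
```

On a synthetic wave the fitted E0' matches the true value to within the grid resolution.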

  2. A lighting metric for quantitative evaluation of accent lighting systems

    Science.gov (United States)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting for stage, film and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid state lighting. In this work, we propose an easy-to-apply quantitative measure of the scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
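
The proposed metric, the entropy of the distribution of colors extracted from an image of the scene, can be sketched as follows. This is a minimal illustration assuming an 8-bit RGB image and a coarse per-channel quantization; the binning choice and the function name are assumptions, not the authors' exact formulation.

```python
import numpy as np

def color_entropy(image, bins_per_channel=8):
    """Shannon entropy (bits) of the quantized colour distribution of an
    (H, W, 3) RGB array with values in [0, 255]; higher = richer colour info."""
    step = 256 // bins_per_channel
    q = (image.reshape(-1, 3) // step).astype(int)          # quantize channels
    codes = (q[:, 0] * bins_per_channel + q[:, 1]) * bins_per_channel + q[:, 2]
    counts = np.bincount(codes, minlength=bins_per_channel ** 3)
    p = counts[counts > 0] / counts.sum()                   # colour histogram
    return float(-(p * np.log2(p)).sum())
```

A flat single-colour frame scores zero; a colour-rich frame scores close to the log of the number of occupied colour bins, which is the quantity one would maximize over the illumination configuration.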

  3. Determination of Niacinamide in Lotions and Creams Using Liquid-Liquid Extraction and High-Performance Liquid Chromatography

    Science.gov (United States)

    Usher, Karyn M.; Simmons, Carolyn R.; Keating, Daniel W.; Rossi, Henry F., III

    2015-01-01

    Chemical separations are an important part of an undergraduate chemistry curriculum. Sophomore students often get experience with liquid-liquid extraction in organic chemistry classes, but liquid-liquid extraction is not as often introduced as a quantitative sample preparation method in honors general chemistry or quantitative analysis classes.…

  4. Markovian Processes for Quantitative Information Leakage

    DEFF Research Database (Denmark)

    Biondi, Fabrizio

    Quantification of information leakage is a successful approach for evaluating the security of a system. It models the system to be analyzed as a channel with the secret as the input and an output as observable by the attacker as the output, and applies information theory to quantify the amount of information transmitted through such a channel, thus effectively quantifying how many bits of the secret can be inferred by the attacker by analyzing the system's output. Channels are usually encoded as matrices of conditional probabilities, known as channel matrices. Such matrices grow exponentially... We show how to model deterministic and randomized processes with Markovian models and to compute their information leakage for a very general model of attacker. We present the QUAIL tool that automates such analysis and is able to compute the information leakage of an imperative WHILE language. Finally, we show how to use QUAIL to analyze some...
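
For a channel matrix of conditional probabilities P(y|x), the leaked information under a Shannon-style attacker is the mutual information between the secret and the observable. A minimal sketch, assuming a uniform prior over secrets (the thesis's general attacker model is broader, and the function name is an assumption):

```python
import numpy as np

def shannon_leakage(channel, prior=None):
    """Mutual information I(X;Y) in bits for a channel matrix whose rows are
    P(y|x): the expected number of secret bits inferable from the output."""
    channel = np.asarray(channel, dtype=float)
    if prior is None:  # uniform prior over secrets
        prior = np.full(channel.shape[0], 1.0 / channel.shape[0])
    joint = prior[:, None] * channel            # P(x, y)
    p_y = joint.sum(axis=0)                     # marginal P(y)
    indep = prior[:, None] * p_y[None, :]       # P(x)P(y)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / indep[mask])).sum())
```

An identity channel leaks the whole secret (1 bit for a 1-bit secret); a constant channel leaks nothing.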

  5. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    Science.gov (United States)

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  6. The use of semi-structured interviews for collection of qualitative and quantitative data in hydrological studies

    Science.gov (United States)

    O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozovic, Nicholas

    2015-04-01

    To build an accurate, robust understanding of the environment, it is important to collect not only information describing its physical characteristics, but also the drivers which influence it. As environmental change, from increasing CO2 levels to decreasing water levels, is often heavily influenced by human activity, gathering information on anthropogenic as well as environmental variables is extremely important. This can mean collecting qualitative as well as quantitative information. In reality, studies are often bound by financial and time constraints, limiting the depth and detail of the research. It is up to the researcher to determine the best methodology to answer the research questions. Here we present a methodology for collecting qualitative and quantitative information in tandem for hydrological studies through the use of semi-structured interviews. This is applied to a case study in two districts of Uttar Pradesh, North India, one of the most intensely irrigated areas of the world. Here, decreasing water levels, exacerbated by unchecked water abstraction, an expanding population and government subsidies, have put the long-term resilience of the farming population in doubt. Through random selection of study locations, combined with convenience sampling of the participants therein, we show how the data collected can provide valuable insight into the drivers which have led to the current water scenario. We also show how reliable quantitative information can, using the same methodology, be effectively and efficiently extracted for modelling purposes, which, along with developing an understanding of the characteristics of the environment, is vital in arriving at realistic and sustainable solutions for water resource management in the future.

  7. Architecture and data processing alternatives for the TSE computer. Volume 2: Extraction of topological information from an image by the Tse computer

    Science.gov (United States)

    Jones, J. R.; Bodenheimer, R. E.

    1976-01-01

    A simple programmable Tse processor organization and the arithmetic operations necessary for extraction of the desired topological information are described. Hardware additions to this organization are discussed along with trade-offs peculiar to the Tse computing concept. An improved organization is presented along with the complementary software for the various arithmetic operations. The performance of the two organizations is compared in terms of speed, power, and cost. Software routines developed to extract the desired information from an image are included.

  8. Information Extraction for Social Media

    NARCIS (Netherlands)

    Habib, M. B.; Keulen, M. van

    2014-01-01

    The rapid growth in IT over the last two decades has led to a growth in the amount of information available online. A new style of sharing information is social media, a continuously and instantly updated source of information. In this position paper, we propose a framework for

  9. A construction scheme of web page comment information extraction system based on frequent subtree mining

    Science.gov (United States)

    Zhang, Xiaowen; Chen, Bingfeng

    2017-08-01

    Based on a frequent subtree mining algorithm, this paper proposes a construction scheme for a web page comment information extraction system, referred to as the FSM system. The overall system architecture and its modules are briefly introduced, the core of the system is then described in detail, and finally a system prototype is given.

  10. Extraction chromatography of fission products

    International Nuclear Information System (INIS)

    Bonnevie-Svendsen, M.; Goon, K.

    1978-01-01

    Various applications of extraction chromatography to the analysis of fission products are reviewed. The method is used in the analysis of reprocessed nuclear fuel products for quantitative radiochemical analysis, for control of fission-product and actinide separation during extraction, and for determining their chemical state in process solutions. It is also used to obtain pure fractions of typical burnup monitors (neodymium, molybdenum, cerium, cesium, europium, lanthanides) when determining the degree of nuclear fuel burnup. In studies of nuclear reactions, the method serves to rapidly separate short-lived isotopes, to purify β-emitter fractions before measuring their half-lives, and to enrich isotopes formed with low yield during fission. Examples are given of the use of extraction chromatography to separate long-lived or stable fission products from spent solutions and to monitor contamination of environmental objects.

  11. Supercritical fluid extraction of hops

    Directory of Open Access Journals (Sweden)

    ZORAN ZEKOVIC

    2007-01-01

    Full Text Available Five cultivars of hop were extracted by supercritical fluid extraction using carbon dioxide (SFE–CO2) as the extractant. The extraction (50 g of hop sample, CO2 flow rate of 97.725 L/h) was done in two steps: 1) extraction at 150 bar and 40 °C for 2.5 h (yielding the sample of series A), after which the same hop sample was extracted in the second step; 2) extraction at 300 bar and 40 °C for 2.5 h (yielding the sample of series B). The Magnum cultivar was chosen for the investigation of the extraction kinetics. The GC-MS method was used for the qualitative and quantitative analysis of the obtained hop extracts. Two of the four most common compounds of hop aroma (α-humulene and β-caryophyllene) were detected in the samples of series A; in addition, isomerized α-acids and a high content of β-acids were detected. The α-acids content in the samples of series B was highest in the extract of the Magnum cultivar (a bitter variety of hop). The low α-acids contents of all the other hop samples resulted in extracts with low α-acids content, i.e., contents below the prescribed α-acids level.

  12. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  13. Quantitative stem cell biology: the threat and the glory.

    Science.gov (United States)

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  14. A simplified edge illumination set-up for quantitative phase contrast mammography with synchrotron radiation at clinical doses

    International Nuclear Information System (INIS)

    Longo, Mariaconcetta; Rigon, Luigi; Lopez, Frances C M; Longo, Renata; Chen, Rongchang; Dreossi, Diego; Zanconati, Fabrizio

    2015-01-01

    This work presents the first study of x-ray phase contrast imaging based on a simple implementation of the edge illumination method (EIXPCi) in the field of mammography with synchrotron radiation. A simplified EIXPCi set-up was utilized to study a possible application in mammography at clinical doses. Moreover, through a novel algorithm capable of separating and quantifying absorption and phase perturbations of images acquired in EIXPCi modality, it is possible to extract quantitative information on breast images, allowing an accurate tissue identification. The study was carried out at the SYRMEP beamline of Elettra synchrotron radiation facility (Trieste, Italy), where a mastectomy specimen was investigated with the EIXPCi technique. The sample was exposed at three different energies suitable for mammography with synchrotron radiation in order to test the validity of the novel algorithm in extracting values of linear attenuation coefficients integrated over the sample thickness. It is demonstrated that the quantitative data are in good agreement with the theoretical values of linear attenuation coefficients calculated on the hypothesis of the breast with a given composition. The results are promising and encourage the current efforts to apply the method in mammography with synchrotron radiation. (note)

  15. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    Full Text Available This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert time series of the digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., with no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
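
A minimal sketch of the two building blocks, amplitude partitioning into symbols and estimation of PFSA state-transition probabilities, is given below. It uses simple equal-frequency (quantile) partitioning rather than the paper's mutual-information-maximizing partitioning; the function names and the order-1 state model are assumptions.

```python
import numpy as np

def symbolize(series, n_symbols=4):
    """Partition the signal's amplitude range at quantiles and map each sample
    to a symbol in {0, ..., n_symbols-1} (equal-frequency partitioning)."""
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def transition_matrix(symbols, n_symbols):
    """Maximum-likelihood estimate of order-1 PFSA transition probabilities
    from a symbol string; rows are states, columns next symbols."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
```

For a slowly varying signal the estimated matrix is strongly diagonal-dominant, reflecting the temporal patterns the PFSA is meant to capture.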

  16. PHENOLIC COMPOUNDS OF WATER-ETHANOLIC EXTRACT OF MENTHA LONGIFOLIA L

    Directory of Open Access Journals (Sweden)

    O. A. Grebennikova

    2014-01-01

    Full Text Available The article presents data on the qualitative and quantitative composition of phenolic compounds in a water-ethanol extract of a promising clone of Mentha longifolia L. of NBE-NSC selection. The phenolic content of the water-ethanol extract amounted to 3003.3 mg/100 g, with 13 components determined in the extract. The extract contains caffeic acid, chlorogenic acid isomers, rosmarinic acid and glycosides of luteolin. Rosmarinic acid (50.2%) prevails among the phenolic substances of the Mentha longifolia extract. It is concluded that this extract can be used to create products with high biological value.

  17. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    Science.gov (United States)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, and minor and trace elements can be mapped very accurately, due to the larger detector area and higher-count-rate detectors. Live X-ray imaging can now be performed with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to arise in the images, and are especially helpful for examining possible interface artefacts. The post-processing techniques to improve the quantitation of X-ray map data and the development of post-processing techniques for improved characterisation are covered in this paper.
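
As a rough illustration, a scatter diagram is simply a 2D histogram of the per-pixel intensities of two elemental maps, and an elemental ratio map is a per-pixel division; a minimal sketch under assumed array inputs (not tied to any particular EDS vendor software, and the function names are assumptions):

```python
import numpy as np

def scatter_diagram(map_a, map_b, bins=64):
    """2D histogram of per-pixel intensities of two elemental maps; clusters
    in this diagram correspond to chemical phases."""
    hist, x_edges, y_edges = np.histogram2d(map_a.ravel(), map_b.ravel(),
                                            bins=bins)
    return hist, x_edges, y_edges

def ratio_map(map_a, map_b, eps=1e-9):
    """Per-pixel elemental ratio map A/B (eps avoids division by zero)."""
    return map_a / (map_b + eps)
```

Thresholding the populated clusters of the scatter diagram and mapping pixels back to image coordinates is the usual route from a scatter diagram to a chemical phase map.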

  18. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    International Nuclear Information System (INIS)

    Wuhrer, R; Moran, K

    2014-01-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has been reduced considerably, and minor and trace elements can be mapped very accurately, due to the larger detector area and higher-count-rate detectors. Live X-ray imaging can now be performed with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to arise in the images, and are especially helpful for examining possible interface artefacts. The post-processing techniques to improve the quantitation of X-ray map data and the development of post-processing techniques for improved characterisation are covered in this paper.

  19. Selective Liquid-Liquid Extraction of Lead Ions Using Newly Synthesized Extractant 2-(Dibutylcarbamoyl)benzoic Acid

    Directory of Open Access Journals (Sweden)

    Hossein Soltani

    2015-12-01

    Full Text Available A new carboxylic acid extractant, named 2-(dibutylcarbamoyl)benzoic acid, was prepared and its potential for selective solvent extraction and recovery of lead ions from industrial samples was investigated. The slope analysis indicated that lead ions are extracted by formation of 1:2 metal-to-ligand complexes. The parameters influencing the extraction efficiency, including the kind of organic diluent, extractant concentration, type of salt used for ionic strength adjustment, contact time and temperature, were evaluated and discussed. Under optimized conditions (aqueous phase: 5 ml, initial lead concentration 1 × 10-4 M, pH 4, sodium chloride 0.1 M; organic phase: 5 ml dichloromethane, ligand concentration 0.05 M), a quantitative (75.2 ± 0.8%) and highly selective extraction of lead ions in the presence of zinc, nickel, cobalt and cadmium ions (each 1 × 10-4 M) was achieved after 20 min of magnetic stirring of the phases at 25 °C. The extracted lead ions were stripped from the organic phase by a dilute nitric acid (0.1 M) solution. The proposed method was successfully applied for the separation of lead from industrial samples. The study of the effect of temperature allowed evaluation of the thermodynamic parameters of the extraction of lead ions by the studied extractant into dichloromethane.
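
The slope analysis used above to establish the 1:2 metal-to-ligand stoichiometry can be sketched numerically: the logarithm of the distribution ratio D is regressed against the logarithm of the extractant concentration, and the slope estimates the number of ligands in the extracted complex. The data below are synthetic, chosen to mimic a 1:2 complex; they are not values from the study:

```python
import numpy as np

def slope_analysis(ligand_conc, dist_ratio):
    """Fit log10(D) against log10([ligand]); the slope estimates the
    number of ligand molecules in the extracted complex."""
    slope, intercept = np.polyfit(np.log10(ligand_conc),
                                  np.log10(dist_ratio), 1)
    return slope, intercept

# Synthetic data for a 1:2 metal:ligand complex (D proportional to [L]^2)
L = np.array([0.01, 0.02, 0.05, 0.1])
D = 40.0 * L**2
slope, _ = slope_analysis(L, D)
print(f"slope = {slope:.2f}")  # slope = 2.00
```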

  20. Art or Science? An Evidence-Based Approach to Human Facial Beauty: A Quantitative Analysis Towards an Informed Clinical Aesthetic Practice.

    Science.gov (United States)

    Harrar, Harpal; Myers, Simon; Ghanem, Ali M

    2018-02-01

    Patients often seek guidance from aesthetic practitioners regarding treatments to enhance their 'beauty'. Is there a science behind the art of assessment, and if so, is it measurable? Through the centuries, this question has challenged scholars, artists and surgeons. This study undertakes a review of the evidence behind quantitative facial measurements in assessing beauty, to help the practitioner in everyday aesthetic practice. A Medline and Embase search for beauty, facial features and quantitative analysis was undertaken. Inclusion criteria were studies on adults; exclusions included studies undertaken for dental, cleft lip, oncology, burns or reconstructive surgeries. The abstracts and papers were appraised, and further studies that were considered inappropriate were excluded. The data were extracted using a standardised table. The final dataset was appraised in accordance with the PRISMA checklist and Holland and Rees' critique tools. Of the 1253 studies screened, 1139 were excluded from abstracts and a further 70 were excluded from full-text articles. The remaining 44 were assessed qualitatively and quantitatively. It became evident that the datasets were not comparable. Nevertheless, common themes were obvious, and these were summarised. Despite measures of the beauty of individual components to the sum of all the parts, such as symmetry and the golden ratio, we are still far from establishing what truly constitutes quantitative beauty. Perhaps beauty is truly in the 'eyes of the beholder' (and perhaps in the eyes of the subject too). This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  1. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography we developed, using wavepath correction algorithms and compensation procedures, is presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.

  2. Extraction chromatographic method of uranium(VI) with high molecular mass amine (ALIQUAT - 336)

    International Nuclear Information System (INIS)

    Roy, Uday Sankar; Dutta, Keshab Kumar

    1999-01-01

    A selective method has been developed for reversed-phase extraction chromatographic studies of uranium(VI) with Aliquat-336 (a liquid anion exchanger) coated on silica gel as the stationary phase. Quantitative extraction of uranium has been achieved in HCl media from 1.25 M to 4 M. The effects of different acids at various concentrations, of stripping agents, and of flow rate on extraction and elution have been investigated. The exchange capacity of the prepared exchanger has been determined. Uranium(VI) has been separated quantitatively from Th, Ce, Zr, Pb, Ga, Hg, Fe, La, Pr, Nd, Sm and Cr in binary mixtures by controlling the extraction and elution conditions. The separation of U(VI) from ternary and quaternary mixtures of various metal ions has also been achieved. (author)

  3. Addressing Risk Assessment for Patient Safety in Hospitals through Information Extraction in Medical Reports

    Science.gov (United States)

    Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène

    Hospital Acquired Infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare costs are very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in computational intelligence techniques, such as information extraction, risk pattern detection in documents and decision support systems, now make it possible to address this problem.

  4. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    Science.gov (United States)

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also reviewed the chromatographic theory which we utilized in order to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.

  5. Remote sensing information acquisition of paleo-channel sandstone-type uranium deposit in Nuheting area

    International Nuclear Information System (INIS)

    Liu Jianjun

    2000-01-01

    The author briefly describes the genesis and ore-formation mechanism of paleo-channel sandstone-type uranium deposit in Nuheting area. Techniques such as remote sensing digital image data processing and data enhancement, as well as 3-dimension quantitative analysis of drill hole data are applied to extract information on metallogenic environment of paleo-channel sandstone-type uranium deposit and the distribution of paleo-channel

  6. Label-Free Quantitative Analysis of Mitochondrial Proteomes Using the Multienzyme Digestion-Filter Aided Sample Preparation (MED-FASP) and "Total Protein Approach".

    Science.gov (United States)

    Wiśniewski, Jacek R

    2017-01-01

    Determination of proteome composition and measurement of changes in protein titers provide important information with substantial value for studying mitochondria. This chapter describes a workflow for the quantitative analysis of the mitochondrial proteome, with a focus on sample preparation and quantitative analysis of the data. The workflow involves the multienzyme digestion-filter aided sample preparation (MED-FASP) protocol, enabling efficient extraction of proteins and a high rate of protein-to-peptide conversion. Consecutive protein digestion with Lys C and trypsin enables generation of peptide fractions with minimal overlap, largely increases the number of identified proteins, and extends their sequence coverage. Abundances of proteins identified by multiple peptides can be assessed by the "Total Protein Approach."
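
The closing "Total Protein Approach" step can be illustrated with a minimal sketch, assuming summed peptide intensities per protein are already available from the search-engine output; the protein names, intensities and molecular weights below are invented for illustration:

```python
def total_protein_approach(intensities, molecular_weights):
    """Estimate protein abundances with the 'Total Protein Approach':
    mass fraction = protein MS intensity / total MS intensity;
    molar abundance = mass fraction / molecular weight (mol per g protein)."""
    total = sum(intensities.values())
    mass_fraction = {p: i / total for p, i in intensities.items()}
    molar = {p: mass_fraction[p] / molecular_weights[p]
             for p in intensities}
    return mass_fraction, molar

# Hypothetical summed peptide intensities and molecular weights (g/mol)
intensities = {"ATP5A1": 6.0e9, "VDAC1": 3.0e9, "CS": 1.0e9}
mw = {"ATP5A1": 59750.0, "VDAC1": 30770.0, "CS": 51710.0}
frac, molar = total_protein_approach(intensities, mw)
```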

  7. Strategies for the extraction and analysis of non-extractable polyphenols from plants.

    Science.gov (United States)

    Domínguez-Rodríguez, Gloria; Marina, María Luisa; Plaza, Merichel

    2017-09-08

    The majority of studies of phenolic compounds from plants focus on the extractable fraction derived from an aqueous or aqueous-organic extraction. However, an important fraction of polyphenols is ignored because it remains retained in the extraction residue. These are the so-called non-extractable polyphenols (NEPs): high molecular weight polymeric polyphenols, or individual low molecular weight phenolics associated with macromolecules. The scarce information available about NEPs shows that these compounds possess interesting biological activities, which is why interest in their study has been increasing in recent years. Furthermore, the extraction and characterization of NEPs are considered a challenge because the analytical methodologies developed so far present some limitations. Thus, the present literature review summarizes current knowledge of NEPs and the different methodologies for their extraction, with a particular focus on hydrolysis treatments. In addition, this review provides information on the most recent developments in the purification, separation, identification and quantification of NEPs from plants. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Algorithm of pulmonary emphysema extraction using low dose thoracic 3D CT images

    Science.gov (United States)

    Saita, S.; Kubo, M.; Kawata, Y.; Niki, N.; Nakano, Y.; Omatsu, H.; Tominaga, K.; Eguchi, K.; Moriyama, N.

    2006-03-01

    Recently, due to aging and smoking, the number of emphysema patients has been increasing. Restoration of alveoli destroyed by emphysema is not possible; thus, early detection of emphysema is desired. We describe a quantitative algorithm for extracting emphysematous lesions and for evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to 100 thoracic 3-D CT images and to follow-up 3-D CT images, we demonstrate its potential effectiveness in assisting radiologists and physicians to quantitatively evaluate the distribution of emphysematous lesions and their evolution over time.
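
The low attenuation area (LAA) step can be sketched as a simple threshold on CT attenuation inside the lung mask. The -950 HU cutoff below is a commonly used emphysema threshold, assumed here for illustration; the abstract does not state the authors' exact criterion:

```python
import numpy as np

def laa_percent(ct_hu, lung_mask, threshold=-950):
    """Percentage of lung voxels below the attenuation threshold (LAA%)."""
    lung = ct_hu[lung_mask]
    return 100.0 * np.count_nonzero(lung < threshold) / lung.size

# Toy volume: lung tissue at -850 HU with an emphysematous pocket at -980 HU
vol = np.full((10, 10, 10), -850.0)
mask = np.ones_like(vol, dtype=bool)
vol[:5, :5, :5] = -980.0          # 125 of 1000 voxels below threshold
print(laa_percent(vol, mask))     # 12.5
```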

  9. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied, and the method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from valence 6 to valence 4 was examined. The decomposition was faster in alkaline solution: 30 minutes in alkaline solution and 40 minutes in acid solution were needed to reach constant absorption. A method for analyzing samples containing less than 5 ppm tellurium was also studied. The experiments revealed that samples containing very small amounts of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into portions of one gram each, because it is difficult to treat several grams of sample at one time. This method is also suitable for the quantitative analysis of selenium. It showed a good addition effect and reproducibility within a relative error of 5%. Comparison between the calibration curve of the standard tellurium(IV) solution reacted with bismuthiol-2 and the calibration curve obtained after extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results of the bismuthiol-2 method and of the atomic absorption method coincided well for the same samples. (Iwakiri, K.)

  10. Analytical Methods Development in Support of the Caustic Side Solvent Extraction System

    International Nuclear Information System (INIS)

    Maskarinec, M.P.

    2001-01-01

    The goal of the project reported herein was to develop and apply methods for the analysis of the major components of the solvent system used in the Caustic-Side Solvent Extraction Process (CSSX). These include the calix(4)arene, the modifier 1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, and tri-n-octylamine. In addition, it was an objective to develop methods that would allow visualization of other components under process conditions. These analyses would include quantitative laboratory methods for each of the components, quantitative analysis of expected breakdown products (4-sec-butylphenol and di-n-octylamine), and qualitative investigations of possible additional breakdown products under a variety of process extremes. These methods would also provide a framework for process analysis should a pilot facility be developed. Two methods were implemented for sample preparation of aqueous phases. The first involves solid-phase extraction and produces quantitative recovery of the solvent components and degradation products from the various aqueous streams. This method can be automated and is suitable for use in radiation-shielded facilities. The second is a variation of an established EPA liquid-liquid extraction procedure. This method is also quantitative and results in a final extract amenable to virtually any instrumental analysis. Two HPLC methods were developed for quantitative analysis. The first is a reversed-phase system with variable-wavelength UV detection. This method is excellent from a quantitative point of view. The second method is a size-exclusion method coupled with dual UV and evaporative light scattering detectors. This method is much faster than the reversed-phase method and allows for qualitative analysis of other components of the waste. For tri-n-octylamine and other degradation products, a GC method was developed and subsequently extended to GC-MS. All methods have precision better than 5%. The combination of these methods

  11. Quantitative non-invasive intracellular imaging of Plasmodium falciparum infected human erythrocytes

    International Nuclear Information System (INIS)

    Edward, Kert; Farahi, Faramarz

    2014-01-01

    Malaria is a virulent pathological condition which results in over a million annual deaths. The parasitic agent Plasmodium falciparum has been extensively studied in connection with this epidemic but much remains unknown about its development inside the red blood cell host. Optical and fluorescence imaging are among the two most common procedures for investigating infected erythrocytes but both require the introduction of exogenous contrast agents. In this letter, we present a procedure for the non-invasive in situ imaging of malaria infected red blood cells. The procedure is based on the utilization of simultaneously acquired quantitative phase and independent topography data to extract intracellular information. Our method allows for the identification of the developmental stages of the parasite and facilitates in situ analysis of the morphological changes associated with the progression of this disease. This information may assist in the development of efficacious treatment therapies for this condition. (letters)

  12. Extraction and separation of U(VI) and Th(IV) from hydrobromic acid media using Cyanex-923 extractant

    Directory of Open Access Journals (Sweden)

    Ghag Snehal M.

    2010-01-01

    Full Text Available A systematic study of the solvent extraction of uranium(VI) and thorium(IV) from hydrobromic acid media was performed using the neutral phosphine oxide extractant Cyanex-923 in toluene. These metal ions were found to be quantitatively extracted with Cyanex-923 in toluene in the acidity ranges 5 × 10-5-1 × 10-4 M and 5 × 10-5-5 × 10-3 M, respectively, and they are stripped from the organic phase with 7.0 M HClO4 and 2.0-4.0 M HCl, respectively. The effect of the equilibrium period, diluents, diverse ions and stripping agent on the extraction of U(VI) and Th(IV) was studied. The stoichiometry of the extracted species of these metal ions was determined based on the slope analysis method. The extraction reactions proceed by solvation, and the probable extracted species found in the organic phase were UO2Br2•2Cyanex-923 and ThBr4•2Cyanex-923. Based on these results, a sequential procedure for their separation from each other was developed.

  13. Optimized Clinical Use of RNALater and FFPE Samples for Quantitative Proteomics

    DEFF Research Database (Denmark)

    Bennike, Tue Bjerg; Kastaniegaard, Kenneth; Padurariu, Simona

    2015-01-01

    Introduction and Objectives The availability of patient samples is essential for clinical proteomic research. Biobanks worldwide store mainly samples stabilized in RNAlater as well as formalin-fixed and paraffin-embedded (FFPE) biopsies. Biobank material is a potential source for clinical...... we compare to FFPE and frozen samples being the control. Methods Twenty-four biopsies were extracted endoscopically from the sigmoideum of two healthy participants. The biopsies were stabilized either by direct freezing, RNAlater, FFPE, or incubation for 30 min at room temperature prior to FFPE...... information. Conclusion We have demonstrated that quantitative proteome analysis and pathway mapping of samples stabilized in RNAlater as well as by FFPE is feasible with minimal impact on the quality of protein quantification and post-translational modifications....

  14. Modelling dental implant extraction by pullout and torque procedures.

    Science.gov (United States)

    Rittel, D; Dorogoy, A; Shemtov-Yona, K

    2017-07-01

    Dental implant extraction, achieved either by applying torque or pullout force, is used to estimate the bone-implant interfacial strength. A detailed description of the mechanical and physical aspects of the extraction process is still missing in the literature. This paper presents 3D nonlinear dynamic finite element simulations of the extraction of a commercial implant from the mandible bone. Emphasis is put on the typical load-displacement and torque-angle relationships for various types of cortical and trabecular bone strengths. The simulations also study the influence of the osseointegration level on those relationships. This is done by simulating implant extraction right after insertion, when interfacial frictional contact exists between the implant and bone, and long after insertion, assuming that the implant is fully bonded to the bone. The model does not include a separate representation of the interfacial layer, for which available data are limited. The obtained relationships show that the higher the strength of the trabecular bone, the higher the peak extraction force, while for application of torque it is the cortical bone that might dictate the peak torque value. Information on the relative strength contrast of the cortical and trabecular components, as well as on the progressive nature of the damage evolution, can be revealed from the obtained relations. It is shown that full osseointegration might multiply the peak and average load values by a factor of 3-12, although the calculated work of extraction varies only by a factor of 1.5. From a quantitative point of view, it is suggested that, as an alternative to reporting peak load or torque values, an average value derived from the extraction work be used to better characterize the bone-implant interfacial strength. Copyright © 2017 Elsevier Ltd. All rights reserved.
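
The suggested work-derived average value can be sketched as follows: integrate the load-displacement curve to get the work of extraction, then divide by the displacement range. The pullout curve below is invented for illustration:

```python
import numpy as np

def extraction_metrics(displacement_mm, load_n):
    """Work of extraction (trapezoidal area under the load-displacement
    curve) and the work-derived average load proposed as a strength metric."""
    segments = (load_n[1:] + load_n[:-1]) * np.diff(displacement_mm) / 2.0
    work = float(np.sum(segments))                              # N*mm
    avg_load = work / (displacement_mm[-1] - displacement_mm[0])
    return work, avg_load

# Illustrative pullout curve: load rises to a ~500 N peak, then decays
d = np.linspace(0.0, 2.0, 201)
f = 500.0 * d * np.exp(1.0 - d)
work, avg = extraction_metrics(d, f)
```

Averaging over the whole curve captures the post-peak damage evolution that a single peak-load number ignores.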

  15. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  16. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative graph-theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  17. Particle-size distribution (PSD) of pulverized hair: A quantitative approach of milling efficiency and its correlation with drug extraction efficiency.

    Science.gov (United States)

    Chagas, Aline Garcia da Rosa; Spinelli, Eliani; Fiaux, Sorele Batista; Barreto, Adriana da Silva; Rodrigues, Silvana Vianna

    2017-08-01

    Different types of hair were submitted to different milling procedures and the resulting powders were analyzed by scanning electron microscopy (SEM) and laser diffraction (LD). SEM results were qualitative, whereas LD results were quantitative and accurately characterized the hair powders through their particle size distribution (PSD). Different types of hair submitted to the optimized milling conditions showed quite similar PSDs. A good correlation was obtained between PSD results and the ketamine concentration in a hair sample analyzed by LC-MS/MS. Hair samples were frozen in liquid nitrogen for 5 min and pulverized at 25 Hz for 10 min, resulting in 61% of fine particles; more analyte was extracted after pulverization compared with the same sample cut into 1 mm fragments. When milling time was extended to 25 min, >90% of the particles were fine, which favours sample retesting and quality control procedures. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, in complement to cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive as to the mechanism of action of compounds, and are thus capable of informing the appropriate selection of further time-consuming and more expensive biological evaluations required to elucidate a mechanism

  19. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    Science.gov (United States)

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice followed by freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with a wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl(-1) DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
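
The calibration-curve quantification described above follows the usual qPCR standard-curve arithmetic: Cq is regressed against log10 of input copies, and unknowns are read off the fitted line. The dilution-series numbers below are idealized (a slope of about -3.32 corresponds to ~100% amplification efficiency), not data from the study:

```python
import numpy as np

def fit_standard_curve(copies, cq):
    """Linear regression of Cq against log10(input copies)."""
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 means 100% efficient PCR
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Read an unknown sample's copy number off the fitted standard curve."""
    return 10.0 ** ((cq - intercept) / slope)

# Idealized 10-fold dilution series (Cq rises by 3.32 per 10-fold dilution)
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
cq = np.array([15.0, 18.32, 21.64, 24.96, 28.28])
slope, intercept, eff = fit_standard_curve(copies, cq)
```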

  20. Quantitative spatial analysis of the mouse brain lipidome by pressurized liquid extraction surface analysis

    DEFF Research Database (Denmark)

    Almeida, Reinaldo; Berzina, Zane; Christensen, Eva Arnspang

    2015-01-01

    extracted directly from tissue sections. PLESA uses a sealed and pressurized sampling probe that enables the use of chloroform-containing extraction solvents for efficient in situ lipid microextraction with a spatial resolution of 400 μm. Quantification of lipid species is achieved by the inclusion...

  1. Using the DOM Tree for Content Extraction

    Directory of Open Access Journals (Sweden)

    David Insa

    2012-10-01

    Full Text Available The main information of a webpage is usually mixed in with menus, advertisements, panels, and other not necessarily related information, and it is often difficult to isolate this information automatically. This is precisely the objective of content extraction, a research area of wide interest due to its many applications. Content extraction is useful not only for the final human user; it is also frequently used as a preprocessing stage of systems that need to extract the main content of a web document in order to avoid the treatment and processing of other, useless information. Another interesting application where content extraction is particularly useful is displaying webpages on small screens such as mobile phones or PDAs. In this work we present a new technique for content extraction that uses the DOM tree of the webpage to analyze the hierarchical relations of the elements in the webpage. Thanks to this information, the technique achieves considerable recall and precision. Using the DOM structure for content extraction gives us the benefits of other approaches based on the syntax of the webpage (such as characters, words and tags), but it also gives us very precise information regarding the related components in a block, thus producing very cohesive blocks.
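
A minimal version of the DOM-based idea can be sketched with Python's standard html.parser: build a small DOM tree, then score block elements by text-to-tag density, so that tag-heavy menus lose to the cohesive main block. This toy heuristic only illustrates the general approach; it is not the authors' algorithm:

```python
from html.parser import HTMLParser

BLOCK_TAGS = {"div", "article", "section", "main", "td"}

class Node:
    def __init__(self, tag, attrs=(), parent=None):
        self.tag, self.attrs, self.parent = tag, dict(attrs), parent
        self.children, self.text = [], ""

class TreeBuilder(HTMLParser):
    """Build a minimal DOM tree from HTML."""
    def __init__(self):
        super().__init__()
        self.root = self.cur = Node("root")
    def handle_starttag(self, tag, attrs):
        node = Node(tag, attrs, self.cur)
        self.cur.children.append(node)
        self.cur = node
    def handle_endtag(self, tag):
        if self.cur.parent is not None:
            self.cur = self.cur.parent
    def handle_data(self, data):
        self.cur.text += data.strip()

def text_len(n):
    return len(n.text) + sum(text_len(c) for c in n.children)

def tag_count(n):
    return 1 + sum(tag_count(c) for c in n.children)

def main_content(root):
    """Return the block element with the highest text-to-tag density;
    menus and link lists score low because they are tag-heavy."""
    best, best_score = None, -1.0
    stack = [root]
    while stack:
        n = stack.pop()
        stack.extend(n.children)
        if n.tag in BLOCK_TAGS:
            score = text_len(n) / tag_count(n)
            if score > best_score:
                best, best_score = n, score
    return best

page = ('<html><body>'
        '<div id="menu"><a>Home</a><a>About</a><a>Contact</a></div>'
        '<div id="content"><p>Content extraction isolates the main text '
        'of a page from menus, advertisements and other boilerplate '
        'surrounding it.</p></div>'
        '</body></html>')
builder = TreeBuilder()
builder.feed(page)
best = main_content(builder.root)
print(best.attrs.get("id"))  # → content
```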

  2. Extraspectral Imaging for Improving the Perceived Information Presented in Retinal Prosthesis

    Directory of Open Access Journals (Sweden)

    Walid Al-Atabany

    2018-01-01

    Full Text Available Retinal prosthesis is steadily improving as a clinical treatment for blindness caused by retinitis pigmentosa. However, despite continued exciting progress, the level of visual return is still very poor, and it is unlikely that users of these devices will stop being legally blind in the near future. It is therefore important to develop methods that maximise the transfer of useful information extracted from the visual scene. Such an approach can be achieved by digitally suppressing less important visual features and textures within the scene; the result can be interpreted as a cartoon-like image of the scene. Furthermore, utilising extravisual wavelengths such as infrared can be useful in the decision process to determine the optimal information to present. In this paper, we therefore present a processing methodology that utilises information extracted from the infrared spectrum to assist in the preprocessing of the visual image prior to conversion to retinal information. We demonstrate how this allows for enhanced recognition and how it could be implemented for optogenetic forms of retinal prosthesis. The new approach has been quantitatively evaluated on volunteers, showing a 112% enhancement in object recognition over normal approaches.
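
The feature-suppression step that yields a cartoon-like image can be illustrated with a crude toy: quantise intensities to a few levels and darken pixels with a strong gradient. This is only a sketch of the general idea, not the authors' processing pipeline:

```python
import numpy as np

def cartoonise(img, levels=4, edge_thresh=0.2):
    """Crude cartoon-style simplification: quantise intensities to a few
    levels, then darken pixels whose gradient magnitude is strong."""
    norm = (img - img.min()) / (np.ptp(img) + 1e-12)
    quant = np.round(norm * (levels - 1)) / (levels - 1)
    gy, gx = np.gradient(norm)              # rows, then columns
    edges = np.hypot(gx, gy) > edge_thresh
    out = quant.copy()
    out[edges] = 0.0                        # dark edges outline structure
    return out

# Toy image: dark left half, bright right half, edge down the middle
img = np.zeros((16, 16))
img[:, 8:] = 1.0
out = cartoonise(img)
```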

  3. Incineration of organic solar cells: Efficient end of life management by quantitative silver recovery

    DEFF Research Database (Denmark)

    Søndergaard, Roar R.; Zimmermann, Yannick Serge; Espinosa Martinez, Nieves

    2016-01-01

    Recovery of silver from the electrodes of roll-to-roll processed organic solar cells after incineration has been performed quantitatively by extraction with nitric acid. This procedure is more than 10 times faster than previous reports and the amount of acid needed for the extraction is reduced...

  4. Three-dimensional Hessian matrix-based quantitative vascular imaging of rat iris with optical-resolution photoacoustic microscopy in vivo

    Science.gov (United States)

    Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo

    2018-04-01

    For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo 3-D vascular structures of a rat iris with a label-free imaging method and accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
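    The 3-D Hessian approach described above is closely related to the classical Frangi vesselness filter. The abstract does not give the paper's exact algorithm, so the following is only a minimal sketch of how Hessian-eigenvalue vesselness can be computed with NumPy/SciPy; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def hessian_vesselness(volume, sigma=2.0, alpha=0.5, beta=0.5, c=15.0):
    """Frangi-style vesselness from the eigenvalues of the 3-D Hessian (sketch)."""
    # Scale-normalized Hessian via second-order Gaussian derivatives.
    H = np.empty(volume.shape + (3, 3))
    for i in range(3):
        for j in range(3):
            order = [0, 0, 0]
            order[i] += 1
            order[j] += 1
            H[..., i, j] = sigma**2 * ndimage.gaussian_filter(volume, sigma, order=order)
    # Eigenvalues sorted by absolute magnitude: |l1| <= |l2| <= |l3|.
    eig = np.linalg.eigvalsh(H)
    idx = np.argsort(np.abs(eig), axis=-1)
    l1, l2, l3 = np.take_along_axis(eig, idx, axis=-1).transpose(3, 0, 1, 2)
    eps = 1e-10
    Ra = np.abs(l2) / (np.abs(l3) + eps)                 # plate vs. line
    Rb = np.abs(l1) / (np.sqrt(np.abs(l2 * l3)) + eps)   # blob vs. line
    S = np.sqrt(l1**2 + l2**2 + l3**2)                   # second-order structure
    v = (1 - np.exp(-Ra**2 / (2 * alpha**2))) \
        * np.exp(-Rb**2 / (2 * beta**2)) \
        * (1 - np.exp(-S**2 / (2 * c**2)))
    # Bright tubular structures require both transverse eigenvalues negative.
    v[(l2 > 0) | (l3 > 0)] = 0
    return v
```

    Vessel diameter, density and tortuosity would then be measured on the vesselness-enhanced (or thresholded and skeletonized) volume; those steps are omitted here.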

  5. Quantitative Analysis of Bioactive Compounds from Aromatic Plants by Means of Dynamic Headspace Extraction and Multiple Headspace Extraction-Gas Chromatography-Mass Spectrometry

    NARCIS (Netherlands)

    Omar, Jone; Olivares, Maitane; Alonso, Ibone; Vallejo, Asier; Aizpurua-Olaizola, Oier; Etxebarria, Nestor

    2016-01-01

    Seven monoterpenes in 4 aromatic plants (sage, cardamom, lavender, and rosemary) were quantified in liquid extracts and directly in solid samples by means of dynamic headspace-gas chromatography-mass spectrometry (DHS-GC-MS) and multiple headspace extraction-gas chromatography-mass spectrometry

  6. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  7. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Mineralization-alteration and structure information extraction plays an important role in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Taking copper and gold mines in Burma as an example, the authors adopt band ratios, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. A digital elevation model (DEM) (30 m spatial resolution) and ETM+ data were used to extract linear and circular faults that are associated with copper and gold mineralization. Combining geological data with the above information, the weights-of-evidence method and the C-A fractal model were used to integrate the data and identify ore-forming favourable zones in this area. The results show that the high-grade potential targets coincide with the known copper and gold deposits, and the integrated information can be used for decision-making in the next stage of mineral resource exploration
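    The weights-of-evidence step can be illustrated with a toy computation. For one binary evidence layer B and a deposit layer D over unit cells, W+ = ln[P(B|D)/P(B|~D)], W- = ln[P(~B|D)/P(~B|~D)], and the contrast C = W+ - W- measures spatial association. The sketch below uses invented data and a hypothetical function name, not the paper's actual layers.

```python
import numpy as np

def weights_of_evidence(evidence, deposits):
    """W+, W- and contrast C for one binary evidence layer.

    evidence, deposits: boolean arrays over the same set of unit cells.
    """
    b_d   = np.sum(evidence & deposits)      # evidence present, deposit present
    b_nd  = np.sum(evidence & ~deposits)     # evidence present, no deposit
    nb_d  = np.sum(~evidence & deposits)     # evidence absent, deposit present
    nb_nd = np.sum(~evidence & ~deposits)    # evidence absent, no deposit
    w_plus  = np.log((b_d / (b_d + nb_d)) / (b_nd / (b_nd + nb_nd)))
    w_minus = np.log((nb_d / (b_d + nb_d)) / (nb_nd / (b_nd + nb_nd)))
    return w_plus, w_minus, w_plus - w_minus
```

    A positive contrast indicates that the evidence layer (e.g. hydroxyl alteration) is spatially associated with the known deposits; layers are then combined cell-by-cell under a conditional-independence assumption.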

  8. Extraction of fission product rhodium from nitric acid solutions. 1

    International Nuclear Information System (INIS)

    Gorski, B.; Beer, M.; Russ, L.

    1988-01-01

    The extraction of noble metals from nitric acid solutions represents one problem in separating valuable substances from nuclear wastes during nuclear fuel reprocessing. Results of distribution experiments demonstrate the possibility of solvent extraction of rhodium using tertiary amines in the presence of nitrite. Even short mixing times yield high distribution coefficients, allowing quantitative separation from aqueous solutions. (author)

  9. Online recovery of radiocesium from soil, cellulose and plant samples by supercritical fluid extraction employing crown ethers and calix-crown derivatives as extractants

    International Nuclear Information System (INIS)

    Kanekar, A.S.; Pathak, P.N.; Mohapatra, P.K.

    2014-01-01

    Two crown ethers (CEs), viz. dibenzo-18-crown-6 and dibenzo-12-crown-7, and three calix-crown derivatives, viz. (octyloxy)calix[4]arene-mono-crown-6 (CMC), calix[4]arene-bis(o-benzocrown-6) (CBC), and calix[4]arene-bis(naphthocrown-6) (CNC), were evaluated for the recovery of 137Cs from synthetic soil, cellulose (tissue paper), and plant samples by the supercritical fluid extraction (SFE) route. The CEs showed poor extraction of 137Cs from the soil matrix. SFE experiments using 1 × 10^-3 M solutions of CMC, CBC and CNC in acetonitrile with 3 M HNO3 as modifiers displayed better extraction of 137Cs, viz. 21(±2)% (CMC), 16.5(±3)% (CBC), and 4(±1)% (CNC). It was not possible to recover 137Cs quantitatively from the soil matrix; this inefficient extraction was attributed to incorporation of 137Cs into the interstitial sites. Experiments on tissue paper using CMC showed near-quantitative 137Cs recovery. On the other hand, recovery from plant samples varied between 50(±5)% (for stems) and 75(±5)% (for leaves). (author)

  10. Solvent extraction of Sb(III) with malachite green into chloroform

    International Nuclear Information System (INIS)

    Shanbhag, B.S.; Turel, Z.R.

    2002-01-01

    A rapid and selective method for the solvent extraction of Sb(III) using malachite green (C.I. Basic Green 4) is described. The effect of different parameters on the extraction coefficient of Sb(III), such as acidity, equilibration time, KI concentration, solvents and anions, has been studied. Separation factors have been evaluated for various elements. The stoichiometry of the extracted species has been determined by the method of substoichiometric extraction, and the decontamination factor for some elements using substoichiometric quantities of the extracting agent has been evaluated. Radiotracers were employed for the extraction studies. The method has been applied to the quantitative determination of antimony in normal, benign and cancerous tissues of the human brain. (author)

  11. Direct thermal desorption in the analysis of cheese volatiles by gas chromatography and gas chromatography-mass spectrometry: comparison with simultaneous distillation-extraction and dynamic headspace.

    Science.gov (United States)

    Valero, E; Sanz, J; Martínez-Castro, I

    2001-06-01

    Direct thermal desorption (DTD) has been used as a technique for extracting volatile components of cheese as a preliminary step to their gas chromatographic (GC) analysis. In this study, it is applied to different cheese varieties: Camembert, blue, Chaumes, and La Serena. Volatiles are also extracted using other techniques such as simultaneous distillation-extraction and dynamic headspace. Separation and identification of the cheese components are carried out by GC-mass spectrometry. Approximately 100 compounds are detected in the examined cheeses. The described results show that DTD is fast, simple, and easy to automate; requires only a small amount of sample (approximately 50 mg); and affords quantitative information about the main groups of compounds present in cheeses.

  12. Simultaneous Distillation Extraction of Some Volatile Flavor Components from Pu-erh Tea Samples—Comparison with Steam Distillation-Liquid/Liquid Extraction and Soxhlet Extraction

    Directory of Open Access Journals (Sweden)

    Xungang Gu

    2009-01-01

    Full Text Available A simultaneous distillation extraction (SDE) combined GC method was developed for the determination of volatile flavor components in Pu-erh tea samples. Dichloromethane and ethyl decylate were employed as the organic phase in SDE and as the internal standard in determination, respectively. A weakly polar DB-5 column was used to separate the volatile flavor components in GC; 10 of the components were quantitatively analyzed and further confirmed by GC-MS. Recoveries ranged from 66.4% to 109%, and repeatability expressed as RSD was in the range of 1.44%–12.6%. SDE was the most suitable technique for extraction of the analytes compared with steam distillation-liquid/liquid extraction and Soxhlet extraction. Commercially available Pu-erh tea samples, including Pu-erh raw tea and ripe tea, were analyzed by the developed method. Highly volatile components, such as benzyl alcohol, linalool oxide, and linalool, were abundant in Pu-erh raw teas, while the contents of 1,2,3-trimethoxybenzene and 1,2,4-trimethoxybenzene were much higher in Pu-erh ripe teas.

  13. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    Science.gov (United States)

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease of adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10^2 to 10^8, with detection limits of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.
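    Quantitation over a log range such as the one described is conventionally done against a standard curve of quantification cycle (Cq) versus log10 copy number. The sketch below is a generic illustration, not the paper's validated protocol; the slope/intercept values in the test are invented.

```python
import numpy as np

def fit_standard_curve(log10_copies, cq):
    """Fit Cq = slope * log10(copies) + intercept over a dilution series.

    Also returns amplification efficiency: an ideal slope of about -3.32
    corresponds to ~100 % efficiency (doubling each cycle).
    """
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to estimate copy number from a measured Cq."""
    return 10 ** ((cq - intercept) / slope)
```

    In practice each dilution point is run in replicate and the curve's linearity (R²) and efficiency are checked before unknowns are read off it.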

  14. Terpenes as Green Solvents for Extraction of Oil from Microalgae

    Directory of Open Access Journals (Sweden)

    Celine Dejoye Tanzi

    2012-07-01

    Full Text Available Herein is described a green and original alternative procedure for the extraction of oil from microalgae. Extractions were carried out using terpenes obtained from renewable feedstocks as alternative solvents instead of hazardous petroleum solvents such as n-hexane. The method comprises two steps: Soxhlet extraction, followed by elimination of the solvent from the medium by Clevenger distillation. Oils extracted from microalgae were compared in terms of qualitative and quantitative determination. No significant difference was found between the extracts, allowing us to conclude that the proposed method is green, clean and efficient.

  15. Preparation of Biological Samples Containing Metoprolol and Bisoprolol for Applying Methods for Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Corina Mahu Ştefania

    2015-12-01

    Full Text Available Arterial hypertension is a complex disease with many serious complications, representing a leading cause of mortality. Selective beta-blockers such as metoprolol and bisoprolol are frequently used in the management of hypertension. Numerous analytical methods have been developed for the determination of these substances in biological fluids, such as liquid chromatography coupled with mass spectrometry, gas chromatography coupled with mass spectrometry, and high performance liquid chromatography. Due to the complex composition of biological fluids, pre-treatment of the biological sample is required before quantitative determination in order to remove proteins and potential interferences. The methods most commonly used for processing biological samples containing metoprolol and bisoprolol were identified through a thorough literature search using the PubMed, ScienceDirect, and Wiley Journals databases. Articles published between 2005 and 2015 were reviewed. Protein precipitation, liquid-liquid extraction and solid phase extraction are the main techniques for the extraction of these drugs from plasma, serum, whole blood and urine samples. In addition, numerous other techniques have been developed for the preparation of biological samples, such as dispersive liquid-liquid microextraction, carrier-mediated liquid phase microextraction, hollow fiber-protected liquid phase microextraction, and on-line molecularly imprinted solid phase extraction. The analysis of metoprolol and bisoprolol in human plasma, urine and other biological fluids provides important information in clinical and toxicological trials, thus requiring the application of appropriate extraction techniques for the detection of these antihypertensive substances at nanogram and picogram levels.

  16. Comparison of supercritical fluid and Soxhlet extractions for the quantification of hydrocarbons from Euphorbia macroclada.

    Science.gov (United States)

    Ozcan, Adnan; Ozcan, Asiye Safa

    2004-10-08

    This study compares conventional Soxhlet extraction and analytical-scale supercritical fluid extraction (SFE) in terms of their yields in extracting hydrocarbons from the arid-land plant Euphorbia macroclada. The plant material was first sequentially extracted with supercritical carbon dioxide modified with 10% methanol (v/v) under optimum conditions (a pressure of 400 atm and a temperature of 50 degrees C), and then sonicated in methylene chloride for an additional 4 h. E. macroclada was also extracted using a Soxhlet apparatus at 30 degrees C for 8 h in methylene chloride. The SFE yield was then compared with the Soxhlet extraction yield using Student's t-test at the 95% confidence level. All extracts were fractionated on silica gel in a glass column to obtain better hydrocarbon yields. The highest hydrocarbon yield from E. macroclada was achieved with SFE (5.8%) compared with Soxhlet extraction (1.1%). Gas chromatography (GC) analysis was performed to quantify the hydrocarbons from the plant material. The greatest quantitative hydrocarbon recovery by GC was obtained with the supercritical carbon dioxide extract (0.6 mg g(-1)).
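    A yield comparison at the 95% confidence level, as performed in this study, can be reproduced generically with SciPy. The replicate values below are hypothetical, since the abstract reports only the mean yields (5.8% vs. 1.1%).

```python
import numpy as np
from scipy import stats

# Hypothetical replicate extraction yields (%, n = 5 each); the paper's
# raw replicates are not given in the abstract.
sfe_yield     = np.array([5.6, 5.9, 5.7, 6.0, 5.8])
soxhlet_yield = np.array([1.0, 1.2, 1.1, 1.0, 1.2])

# Welch's t-test (no equal-variance assumption between the two methods).
t_stat, p_value = stats.ttest_ind(sfe_yield, soxhlet_yield, equal_var=False)
significant = p_value < 0.05   # reject equality at the 95 % confidence level
```

    With such well-separated means the difference is significant at any reasonable replicate noise; the point of the test is to formalize that the SFE gain is not an artifact of run-to-run variability.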

  17. Using Local Grammar for Entity Extraction from Clinical Reports

    Directory of Open Access Journals (Sweden)

    Aicha Ghoulam

    2015-06-01

    Full Text Available Information extraction (IE) is a natural language processing (NLP) task whose aim is to analyze texts written in natural language to extract structured and useful information, such as named entities and the semantic relations linking these entities. Information extraction is an important task for many applications, such as biomedical literature mining, customer care, community websites, and personal information management. The increasing information available in patient clinical reports is difficult to access; as it is often in unstructured text form, doctors need tools that give them access to this information and the ability to search it. Hence, a system for extracting this information in a structured form can benefit healthcare professionals. The work presented in this paper uses a local grammar approach to extract medical named entities from French patient clinical reports. Experimental results show that the proposed approach achieved an F-measure of 90.06%.

  18. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. The tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested its impact on the task of protein-protein interaction (PPI) extraction: it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.

  19. Sulphur containing novel extractants for extraction-separation of palladium (II)

    International Nuclear Information System (INIS)

    Shukla, J.P.; Sawant, S.R.; Anil Kumar; Singh, R.K.

    1995-01-01

    The extraction performance of palladium (II) with sulphur-containing extractants has unequivocally established their strong extraction ability toward this thiophilic soft metal. Hence a comprehensive investigative study was initiated by us to examine the selective, reversible extraction-separation of trace and macro amounts of palladium (II) from both aqueous nitric acid and hydrochloric acid media into 1,2-dichloroethane by 1,10-dithia-18-crown-6 (1,10-DT18C6), S6-pentano-36 (S6-P-36) and bis(2-ethylhexyl) sulphoxide (BESO) dissolved in toluene. From studies of aqueous phase acidity, reagent concentration, equilibration period, diluent, strippant and diverse ions, conditions are established for its quantitative and reversible extraction. Recovery of Pd(II) from the loaded thiacrown and sulphoxide phases is easily accomplished using sodium thiocyanate, ammonium thiocyanate, thiourea, sodium thiosulphate or a mixture of (2 M Na2CO3 + 0.5 M NH4OH) (only for BESO) as the strippants. The lack of interference from even appreciable amounts of contaminants such as 137Cs, 106Ru, 233U and 239Pu may be considered one of the outstanding advantages of the method. These extractants have been successfully tested for the recovery of palladium from a high active waste matrix. The extracted complex from both thiacrowns has been characterized by elemental analyses and UV-Visible spectra and confirmed to be PdA2.T (A = NO3-, Cl-) from dilute (pH ~2) acid solutions, while the composition of the organic palladium species for the sulphoxide has been confirmed to be a disolvate of the type Pd(NO3)2.2BESO. (author). 52 refs., 6 tabs., 6 figs

  20. Quantitative Evaluation of Defect Based on Ultrasonic Guided Wave and CHMM

    Directory of Open Access Journals (Sweden)

    Chen Le

    2016-01-01

    Full Text Available The axial length of pipe defects is not linearly related to the reflection coefficient, which makes it difficult to identify the axial length of a defect by the reflection-coefficient method. A continuous hidden Markov model (CHMM) is proposed to accurately classify the axial length of defects, achieving a preliminary quantitative evaluation. Firstly, the wavelet packet decomposition method is used to extract the characteristic information of the guided wave signal, and Kernel Sliced Inverse Regression (KSIR) is used to reduce the dimension of the feature set. Then, several CHMM models are trained for classification. Finally, the trained models are used to identify artificial corrosion defects on the outer surface of a pipe. The results show that the CHMM model has good robustness and can accurately identify axial defects.
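    The wavelet-packet feature-extraction step can be sketched in pure NumPy using the Haar wavelet; the paper does not state which wavelet or decomposition depth was used, so both are illustrative assumptions here. Each guided-wave signal is decomposed into a full packet tree and the relative energy of each terminal node forms the feature vector fed to dimension reduction and classification.

```python
import numpy as np

def haar_step(x):
    """One Haar analysis step: approximation and detail halves of a signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_packet_energies(signal, levels=3):
    """Relative energy of each terminal node of a full Haar wavelet packet tree.

    signal length must be divisible by 2**levels; returns 2**levels features.
    """
    nodes = [np.asarray(signal, dtype=float)]
    for _ in range(levels):
        nxt = []
        for node in nodes:
            a, d = haar_step(node)
            nxt.extend([a, d])
        nodes = nxt
    e = np.array([np.sum(n**2) for n in nodes])
    return e / e.sum()   # normalized energy feature vector
```

    Because the Haar transform is orthogonal, the node energies sum to the signal energy, so the normalized vector characterizes how a defect echo distributes energy across frequency bands.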

  1. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples 1, 5, 7 and 10 g in weight with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by infusion of six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. The total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate, and further spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  2. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  3. Rapid Quantitation of Ascorbic and Folic Acids in SRM 3280 Multivitamin/Multielement Tablets using Flow-Injection Tandem Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Bhandari, Deepak [ORNL; Kertesz, Vilmos [ORNL; Van Berkel, Gary J [ORNL

    2013-01-01

    RATIONALE: Ascorbic acid (AA) and folic acid (FA) are water-soluble vitamins that are usually fortified in food and dietary supplements. For the safety of human health, proper intake of these vitamins is recommended, and improvement in the analysis time required for their quantitative determination in food and nutritional formulations is desired. METHODS: A simple and fast (~5 min) in-tube sample preparation was performed independently for FA and AA by mixing extraction solvent with a powdered sample aliquot, followed by agitation, centrifugation, and filtration to recover an extract for analysis. Quantitative detection was achieved by flow-injection (1 µL injection volume) electrospray ionization tandem mass spectrometry (ESI-MS/MS) in negative ion mode using the method of standard addition. RESULTS: The method of standard addition was employed for the quantitative estimation of each vitamin in a sample extract. At least 2 spiked and 1 non-spiked sample extracts were injected in triplicate for each quantitative analysis. Given an injection-to-injection interval of approximately 2 min, about 18 min was required to complete the quantitative estimation of each vitamin. The concentration values obtained for the respective vitamins in standard reference material (SRM) 3280 using this approach were within the statistical range of the certified values provided in the NIST Certificate of Analysis. The estimated limits of detection for FA and AA were 13 and 5.9 ng/g, respectively. CONCLUSIONS: Flow-injection ESI-MS/MS was successfully applied for the rapid quantitation of FA and AA in SRM 3280 multivitamin/multielement tablets.
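    The method of standard addition used above determines the unknown concentration from the x-intercept of the response versus added-standard line: spiked and unspiked extracts are measured, a line is fitted, and the magnitude of the x-intercept is the analyte concentration in the unspiked extract. A generic sketch with invented numbers (not the paper's data):

```python
import numpy as np

def standard_addition(added, response):
    """Unknown concentration = |x-intercept| of the response vs. added-standard line."""
    slope, intercept = np.polyfit(added, response, 1)
    return intercept / slope   # same units as `added`

# Hypothetical data: one unspiked and two spiked extracts, linear response.
added    = np.array([0.0, 10.0, 20.0])     # ng/g of standard added
response = np.array([50.0, 150.0, 250.0])  # instrument signal (arbitrary units)
c0 = standard_addition(added, response)    # → 5.0 ng/g in the unspiked extract
```

    Standard addition is chosen here because the complex tablet matrix suppresses or enhances ESI signal; calibrating within the same matrix cancels that effect.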

  4. Standardization of solvent extraction procedure for determination of uranium in seawater

    International Nuclear Information System (INIS)

    Sukanta Maity; Sahu, S.K.; Pandit, G.G.

    2015-01-01

    A solvent extraction procedure using ammonium pyrrolidine dithiocarbamate as complexing agent in a methyl isobutyl ketone organic phase, with acid-exchange back-extraction, is described for the simultaneous quantitative pre-concentration of uranium in seawater, followed by its determination by differential pulse adsorptive stripping voltammetry. The solvent extraction time was optimized for extraction of uranium from seawater, and the extraction efficiency was evaluated at different pH values. The method gives a recovery of 98 ± 2% for a 400 mL sample at pH 3.0 ± 0.02, facilitating rapid and interference-free analysis of seawater samples. (author)

  5. A hybrid approach for robust multilingual toponym extraction and disambiguation

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    Toponym extraction and disambiguation are key topics recently addressed by the fields of Information Extraction and Geographical Information Retrieval. Toponym extraction and disambiguation are highly dependent processes. Not only does toponym extraction effectiveness affect disambiguation, but also

  6. Multiplexed Colorimetric Solid-Phase Extraction

    Science.gov (United States)

    Gazda, Daniel B.; Fritz, James S.; Porter, Marc D.

    2009-01-01

    Multiplexed colorimetric solid-phase extraction (MC-SPE) is an extension of colorimetric solid-phase extraction (C-SPE), an analytical platform that combines colorimetric reagents, solid-phase extraction, and diffuse reflectance spectroscopy to quantify trace analytes in water. In C-SPE, analytes are extracted and complexed on the surface of an extraction membrane impregnated with a colorimetric reagent. The analytes are then quantified directly on the membrane surface using a handheld diffuse reflectance spectrophotometer. Importantly, the use of solid-phase extraction membranes as the matrix for impregnation of the colorimetric reagents creates a concentration factor that enables the detection of low concentrations of analytes in small sample volumes. In extending C-SPE to a multiplexed format, a filter holder that incorporates discrete analysis channels and a jig that facilitates the concurrent operation of multiple sample syringes have been designed, enabling the simultaneous determination of multiple analytes. Separate, single-analyte membranes, placed in a readout cartridge, create unique, analyte-specific addresses at the exit of each channel. Following sample exposure, the diffuse reflectance spectrum of each address is collected serially, and the Kubelka-Munk function is used to quantify each water quality parameter via calibration curves. In a demonstration, MC-SPE was used to measure the pH of a sample and quantitate Ag(I) and Ni(II).
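    The Kubelka-Munk function F(R) = (1 - R)^2 / (2R) converts diffuse reflectance R into a quantity approximately proportional to the absorber amount on the membrane, which is what makes the calibration curves mentioned above linear. The sketch below uses hypothetical calibration data, not values from the study.

```python
import numpy as np

def kubelka_munk(R):
    """F(R) = (1 - R)^2 / (2 R) for diffuse reflectance R in (0, 1]."""
    R = np.asarray(R, dtype=float)
    return (1 - R) ** 2 / (2 * R)

# Hypothetical calibration: F(R) assumed linear in analyte mass on the membrane.
conc = np.array([0.0, 0.5, 1.0, 2.0])      # µg analyte on the membrane
refl = np.array([0.90, 0.72, 0.60, 0.45])  # measured diffuse reflectance
slope, intercept = np.polyfit(conc, kubelka_munk(refl), 1)

def quantify(R):
    """Read an unknown off the F(R)-vs-concentration calibration line."""
    return (kubelka_munk(R) - intercept) / slope
```

    Lower reflectance (darker spot) gives larger F(R) and hence a higher reported concentration, matching the intuition that more complexed analyte absorbs more light.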

  7. Extraction and evaluation of bioactive compounds with antioxidant potential from green arabica coffee extract

    Directory of Open Access Journals (Sweden)

    Simona PATRICHE

    2015-12-01

    Full Text Available During the last decade, research concerning the essential role of coffee in health and disease prevention has developed considerably. In the present study we obtained extracts from three green Arabica coffee varieties which demonstrated a significant antioxidant potential due to the presence in their composition of two classes of bioactive compounds, caffeine and chlorogenic acids. The content and antioxidant activity of the bioactive compounds were evaluated by qualitative and quantitative analyses using spectrophotometric and chromatographic methods. Chlorogenic acid was found in high concentrations, followed by gallic, p-coumaric and ferulic acids. The highest caffeine contents were found in the green coffee extracts of the Supremo-Columbia and Top Quality-Kenya products.

  8. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    OpenAIRE

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboa...

  9. Quantitative determination, Metal analysis and Antiulcer evaluation of Methanol seeds extract of Citrullus lanatus Thunb (Cucurbitaceae in Rats

    Directory of Open Access Journals (Sweden)

    Okunrobo O. Lucky

    2012-10-01

    Full Text Available Objective: The use of herbs in the treatment of diseases is gradually becoming universally accepted, especially in non-industrialized societies. Citrullus lanatus Thunb (Cucurbitaceae), commonly called watermelon, is widely consumed in this part of the world as food and medicine. This work was conducted to investigate the phytochemical composition, proximate and metal content of the seeds of Citrullus lanatus and to determine the antiulcer action of the methanol seed extract. Methods: Phytochemical screening and proximate and metal content analyses were done using standard procedures, and the antiulcer activity was evaluated against acetylsalicylic acid-induced ulcers. Results: The results revealed the presence of the following phytochemicals: flavonoids, saponins, tannins, alkaloids and glycosides. Proximate analysis indicated high concentrations of carbohydrate, protein and fat, while metal analysis showed the presence of sodium, calcium, zinc and magnesium at levels within the recommended dietary intake. The antiulcer potential of the extract against acetylsalicylic acid-induced ulceration of the gastric mucosa of Wistar rats was evaluated at three doses (200 mg/kg, 400 mg/kg, and 800 mg/kg). The ulcer parameters investigated included ulcer number, ulcer severity, ulcer index and percentage ulcer protection. The antiulcer activity was compared against ranitidine at 20 mg/kg. The extract exhibited a dose-related antiulcer activity with maximum activity at 800 mg/kg (P<0.001). Conclusions: Proximate and metal content analysis of the seeds indicates that their consumption is safe. This study also provides preliminary data, for the first time, that the seeds of Citrullus lanatus possess antiulcer activity in an animal model.

  10. Development of the extraction method for the simultaneous determination of butyl-, phenyl- and octyltin compounds in sewage sludge.

    Science.gov (United States)

    Zuliani, Tea; Lespes, Gaetane; Milacic, Radmila; Scancar, Janez

    2010-03-15

    The toxicity and bioaccumulation of organotin compounds (OTCs) have led to the development of sensitive and selective analytical methods for their determination. In the past, much attention was devoted to the study of OTCs in biological samples, water and sediments, mostly from the marine environment. Little information is available about OTC pollution of terrestrial ecosystems. In order to optimise the extraction method for the simultaneous determination of butyl-, phenyl- and octyltin compounds in sewage sludge, five different extractants (tetramethylammonium hydroxide, HCl in methanol, glacial acetic acid, a mixture of acetic acid and methanol (3:1), and a mixture of acetic acid, methanol and water (1:1:1)), the presence or absence of a complexing agent (tropolone), and different modes of extraction (mechanical stirring, microwave- and ultrasound-assisted extraction) were tested. Extracted OTCs were derivatised with sodium tetraethylborate and determined by gas chromatography coupled to a mass spectrometer. Quantitative extraction of butyl-, phenyl- and octyltin compounds was obtained by the use of glacial acetic acid as extractant and mechanical stirring for 16 h or sonication for 30 min. The limits of detection and quantification for the OTCs investigated in sewage sludge were in the ng Sn g(-1) range. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  11. Studying Facebook via data extraction: the Netvizz application

    NARCIS (Netherlands)

    Rieder, B.

    2013-01-01

    This paper describes Netvizz, a data collection and extraction application that allows researchers to export data in standard file formats from different sections of the Facebook social networking service. Friendship networks, groups, and pages can thus be analyzed quantitatively and qualitatively.

  12. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation is given of the language often employed in communicating climate model output: a language which accurately states that models are “better”, have “improved”, and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach is outlined for evaluating the relevance of quantitative climate model output.

  13. Ingenious Snake: An Adaptive Multi-Class Contours Extraction

    Science.gov (United States)

    Li, Baolin; Zhou, Shoujun

    2018-04-01

    Active contour models (ACMs) play an important role in computer vision and medical image applications. Traditional ACMs were used to extract a single class of object contours, while the simultaneous extraction of multiple classes of contours of interest (i.e., various closed- or open-ended contours) has not been solved so far. Therefore, a novel ACM named “Ingenious Snake” is proposed to adaptively extract these contours. First, ridge points are extracted based on local phase measurement of the gradient vector flow field, and the consequent ridgeline initialization is automated at high speed. Second, contour deformation and evolution are implemented with the ingenious snake. In the experiments, the results of initialization, deformation and evolution are compared with those of existing methods. The quantitative evaluation of the structure extraction is satisfying with respect to effectiveness and accuracy.
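As a concrete illustration of how a contour is deformed by energy minimization, here is a minimal greedy-snake iteration in the spirit of classical ACMs. This is a generic textbook scheme, not the paper's "Ingenious Snake"; the energy weights and the 3x3 neighbourhood search are illustrative choices.

```python
import numpy as np

def greedy_snake_step(contour, edge_map, alpha=1.0, beta=1.0, gamma=2.0):
    """One greedy iteration of a classic snake: each contour point moves to
    the 3x3 neighbour minimising continuity + curvature - edge energy."""
    n = len(contour)
    # average inter-point spacing, used by the continuity term
    mean_d = np.mean(np.linalg.norm(contour - np.roll(contour, 1, axis=0), axis=1))
    new = contour.copy()
    for i in range(n):
        prev_pt, next_pt = new[(i - 1) % n], contour[(i + 1) % n]
        best_pt, best_e = contour[i], np.inf
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                cand = contour[i] + np.array([dy, dx], dtype=float)
                y, x = int(cand[0]), int(cand[1])
                if not (0 <= y < edge_map.shape[0] and 0 <= x < edge_map.shape[1]):
                    continue  # keep candidates inside the image
                e_cont = (np.linalg.norm(cand - prev_pt) - mean_d) ** 2
                e_curv = np.linalg.norm(prev_pt - 2 * cand + next_pt) ** 2
                e = alpha * e_cont + beta * e_curv - gamma * edge_map[y, x]
                if e < best_e:
                    best_e, best_pt = e, cand
        new[i] = best_pt
    return new
```

With a zero edge map the internal curvature term dominates, so a circular contour contracts smoothly; on a real image the `gamma * edge_map` term attracts points to strong edges.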

  14. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations.

    Science.gov (United States)

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Ostergaard, Pia; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Jiang, Xiaoyi; Mortimer, Peter S; Kiefer, Friedemann

    2017-08-17

    Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders and to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). VIPAR is a volume-based tissue reconstruction, data extraction, and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular system or skin. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003.
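The vessel-shape parameters quantified above (segment length and straightness) can be sketched for a polyline centreline as follows. Defining straightness as end-to-end distance divided by path length is an assumption consistent with the reported percentages, not the published VIPAR code.

```python
import numpy as np

def segment_metrics(points):
    """Length and straightness of a vessel centreline given as an ordered
    (N, 3) array of points.  Straightness = end-to-end distance / path
    length, so a perfectly straight segment scores 1.0."""
    pts = np.asarray(points, dtype=float)
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # per-step lengths
    length = steps.sum()
    chord = np.linalg.norm(pts[-1] - pts[0])              # end-to-end distance
    return length, chord / length
```

For example, an L-shaped segment of two unit steps has length 2 and straightness sqrt(2)/2, whereas a straight segment of the same length has straightness 1.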

  15. Protective role of Tinospora cordifolia extract against radiation-induced qualitative, quantitative and biochemical alterations in testes

    International Nuclear Information System (INIS)

    Sharma, Priyanka; Parmar, Jyoti; Sharma, Priyanka; Verma, Preeti; Goyal, P.K.

    2012-01-01

    restoring an almost normal structure at the end of the experiment. Furthermore, TCE administration inhibited the radiation-induced elevation of lipid peroxidation (LPO) and reduction of glutathione (GSH) and catalase (CAT) levels in the testes. These observations signify that Tinospora cordifolia root extract can be used as an efficient radioprotector against radiation-mediated qualitative, quantitative and biochemical alterations in the testes. (author)

  16. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and the subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical-flow matching techniques from computer vision; an a priori estimate of sensor attitude is then computed based on the GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure-from-motion algorithm; and a weighted least squares adjustment of all a priori metadata across the frames is then performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g., surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
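The weighted least squares adjustment step can be sketched with the standard normal-equations solution. This is the generic formulation, not the FMV-GTB implementation; in practice the weights would come from the a priori metadata accuracies (e.g., inverse variances of the GPS positions).

```python
import numpy as np

def weighted_least_squares(A, b, w):
    """Solve the weighted least-squares adjustment
    min_x || W^(1/2) (A x - b) ||^2  via the normal equations
    (A^T W A) x = A^T W b, with w the per-observation weights."""
    W = np.diag(np.asarray(w, dtype=float))
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
```

With unit weights this reduces to ordinary least squares; raising the weight of an observation pulls the solution toward satisfying that observation exactly.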

  17. A quantitative method to analyze the quality of EIA information in wind energy development and avian/bat assessments

    International Nuclear Information System (INIS)

    Chang, Tony; Nielsen, Erik; Auberle, William; Solop, Frederic I.

    2013-01-01

    The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High quality information within assessments is vital for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and on wildlife species. Low quality information has been a major cause of litigation and economic loss. Since 1999, wind energy development has seen exponential growth, with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantifying the information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species, based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information compared to recommended factors from state guidelines. This suggests the need for greater consistency in recommended guidelines across states, and mandatory compliance by EIA preparers to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits.
- Highlights: ► We developed, validated, and applied a quantitative index to review avian/bat assessment quality

  18. A quantitative method to analyze the quality of EIA information in wind energy development and avian/bat assessments

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Tony, E-mail: tc282@nau.edu [Environmental Science and Policy Program, School of Earth Science and Environmental Sustainability, Northern Arizona University, 602 S Humphreys P.O. Box 5694, Flagstaff, AZ, 86011 (United States); Nielsen, Erik, E-mail: erik.nielsen@nau.edu [Environmental Science and Policy Program, School of Earth Science and Environmental Sustainability, Northern Arizona University, 602 S Humphreys P.O. Box 5694, Flagstaff, AZ, 86011 (United States); Auberle, William, E-mail: william.auberle@nau.edu [Civil and Environmental Engineering Program, Department of Civil and Environmental Engineering, Northern Arizona University, 2112 S Huffer Ln P.O. Box 15600, Flagstaff, AZ, 860011 (United States); Solop, Frederic I., E-mail: fred.solop@nau.edu [Political Science Program, Department of Politics and International Affairs, Northern Arizona University, P.O. Box 15036, Flagstaff, AZ 86001 (United States)

    2013-01-15

    The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High quality information within assessments is vital for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and on wildlife species. Low quality information has been a major cause of litigation and economic loss. Since 1999, wind energy development has seen exponential growth, with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantifying the information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species, based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information compared to recommended factors from state guidelines. This suggests the need for greater consistency in recommended guidelines across states, and mandatory compliance by EIA preparers to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits. - Highlights: ► We developed, validated, and applied a quantitative index to review

  19. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data.

    Science.gov (United States)

    Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas

    2016-04-26

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a free and efficient tool that approximates the surface and volume of a root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are the surface area (in mm²) and the volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. XCT combined with open-source software proved powerful for noninvasively imaging plant root samples, segmenting root data, and extracting quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and difficulties arise with segmentation.
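The surface-area and volume computation that a tool like imeshJ performs on an isosurface mesh can be sketched as follows. This is a generic mesh formula (triangle cross products for area, the divergence theorem for volume), not the published imeshJ code, and it assumes a closed, consistently oriented triangle mesh.

```python
import numpy as np

def mesh_surface_and_volume(vertices, faces):
    """Surface area (mm^2) and enclosed volume (mm^3) of a closed,
    consistently oriented triangle mesh."""
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    a, b, c = v[f[:, 0]], v[f[:, 1]], v[f[:, 2]]
    # each triangle's area is half the norm of the edge cross product
    cross = np.cross(b - a, c - a)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    # signed tetrahedron volumes against the origin sum to the mesh volume
    volume = abs(np.einsum('ij,ij->i', a, np.cross(b, c)).sum()) / 6.0
    return area, volume
```

On the unit tetrahedron the sketch returns an area of 1.5 + sqrt(3)/2 and a volume of 1/6, matching the closed-form values.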

  20. [Studies on preparative technology and quantitative determination for extracts of total saponin in root of Panax japonicus].

    Science.gov (United States)

    He, Yu-min; Lu, Ke-ming; Yuan, Ding; Zhang, Chang-cheng

    2008-11-01

    To explore the optimum extraction and purification conditions for the total saponins in the root of Panax japonicus (RPJ), and to establish quality control methods for them. An L16 (4^5) orthogonal test was designed, with the extraction rate of total saponins as the index, to determine a rational extraction process, and water-saturated n-butanol extraction and acetone precipitation were applied to purify the alcohol extract of RPJ. Total saponins were determined by spectrophotometry and the triterpenoid sapogenin oleanolic acid by HPLC. The optimum conditions for extracting total saponins from RPJ were as follows: the material was pulverized, soaked in 10 volumes of 60% aqueous ethanol as the extraction solvent, and refluxed three times for 3 h each time. Extraction with water-saturated n-butanol three times and precipitation with 4-5 volumes of acetone were included in the purification process, which yielded quality products. The content of total saponins reached 83.48%, and that of oleanolic acid 38.30%. The optimized preparative technology is stable, convenient and practical. The extraction rate from RPJ was high and reproducible with this technology, providing new evidence for the industrialized production of this plant and the development of new drugs.

  1. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under the AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of initial declarations and their annual updates under the AP. In order to verify the consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency, and specifies what kinds of difficulties arise during evaluation with respect to cross-linking international projects and finding gaps in reporting. In addition, the paper discusses how the reporting quality of AP information on R&D activities and the assessment process for R&D information could be improved. (author)
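Identifying interrelationships between States across declared R&D areas can be sketched as a simple set intersection over per-State declarations. This is a toy illustration only; the area labels and data structure are invented, not the Agency's actual evaluation tooling.

```python
from itertools import combinations

def shared_rd_areas(declarations):
    """Given per-State declared R&D areas (dict: state -> set of area
    labels), return the state pairs that declare at least one common
    area, together with the shared labels."""
    links = {}
    for a, b in combinations(sorted(declarations), 2):
        common = declarations[a] & declarations[b]
        if common:
            links[(a, b)] = sorted(common)
    return links
```

Pairs that share no area are simply absent from the result, which makes gaps in reporting as visible as the links themselves.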

  2. Metabolite extraction from adherently growing mammalian cells for metabolomics studies: optimization of harvesting and extraction protocols.

    Science.gov (United States)

    Dettmer, Katja; Nürnberger, Nadine; Kaspar, Hannelore; Gruber, Michael A; Almstetter, Martin F; Oefner, Peter J

    2011-01-01

    Trypsin/ethylenediaminetetraacetic acid (EDTA) treatment and cell scraping in a buffer solution were compared for harvesting adherently growing mammalian SW480 cells for metabolomics studies. In addition, direct scraping with a solvent was tested. Trypsinized and scraped cell pellets were extracted using seven different extraction protocols including pure methanol, methanol/water, pure acetone, acetone/water, methanol/chloroform/water, methanol/isopropanol/water, and acid-base methanol. The extracts were analyzed by GC-MS after methoximation/silylation and derivatization with propyl chloroformate, respectively. The metabolic fingerprints were compared, and 25 selected metabolites including amino acids and intermediates of energy metabolism were quantitatively determined. Moreover, the influence of freeze/thaw cycles, ultrasonication and homogenization using ceramic beads on extraction yield was tested. Pure acetone yielded the lowest extraction efficiency, while methanol, methanol/water, methanol/isopropanol/water, and acid-base methanol recovered similar metabolite amounts with good reproducibility. Based on overall performance, methanol/water was chosen as a suitable extraction solvent. Repeated freeze/thaw cycles, ultrasonication and homogenization did not improve the overall metabolite yield of the methanol/water extraction. Trypsin/EDTA treatment caused substantial metabolite leakage, proving it inadequate for metabolomics studies. Gentle scraping of the cells in a buffer solution and subsequent extraction with methanol/water resulted on average in a sevenfold lower recovery of quantified metabolites compared with direct scraping using methanol/water, making the latter the method of choice for harvesting and extracting metabolites from adherently growing mammalian SW480 cells.

  3. Content of polyphenol compound in mangrove and macroalga extracts

    Science.gov (United States)

    Takarina, N. D.; Patria, M. P.

    2017-07-01

    Polyphenols, or phenolics, are compounds containing one or more hydroxyl groups on an aromatic ring [1]. These compounds have activities such as antibacterial, antiseptic and antioxidant effects. Natural resources such as mangroves and macroalgae are known to contain these compounds. The purpose of this research was to investigate the polyphenol content of a mangrove and a macroalga. The materials used in this research were mangrove (Avicennia sp.) leaves and the whole macroalga (Caulerpa racemosa). Samples were dried for 5 days and then macerated to obtain an extract. Maceration was done in methanol for 48 hours (first step) and 24 hours (second step) consecutively. Polyphenol content was determined by phytochemical screening of both extracts. A quantitative test was carried out to determine catechin and tannin as polyphenol compounds. The results showed that catechin was observed in both extracts, while tannin was found in the mangrove extract only. According to the quantitative test, the mangrove had a higher content of catechin and tannin (12.37-13.44%) than the macroalga (2.57-4.58%). This indicates that both materials can be sources of polyphenol compounds, with the higher content in the mangrove. These resources can therefore be utilized for further studies and for human needs such as medicinal drugs.

  4. Quantitative gas chromatography-olfactometry carried out at different dilutions of an extract. Key differences in the odor profiles of four high-quality Spanish aged red wines.

    Science.gov (United States)

    Ferreira, V; Aznar, M; López, R; Cacho, J

    2001-10-01

    Four Spanish aged red wines made in different wine-making areas have been extracted, and the extracts and their 1:5, 1:50, and 1:500 dilutions have been analyzed by a gas chromatography-olfactometry (GC-O) approach in which three judges evaluated odor intensity on a four-point scale. Sixty-nine different odor regions were detected in the GC-O profiles of the wines, 63 of which could be identified. The GC-O data have been processed to calculate averaged flavor dilution factors (FD). Different ANOVA strategies have been further applied to FD and to intensity data to check for significant differences among wines and to assess the effects of dilution and of the judge. The data show that FD and the average intensity of the odorants are strongly correlated (r² = 0.892). However, the measurement of intensity represents a quantitative advantage in terms of detecting differences. For some odorants, dilution exerts a critical role in the detection of differences. Significant differences among wines have been found in 30 of the 69 odorants detected in the experiment. Most of these differences are introduced by grape compounds such as methyl benzoate and terpenols, by compounds released by the wood, such as furfural, (Z)-whiskey lactone, Furaneol, 4-propylguaiacol, eugenol, 4-ethylphenol, 2,6-dimethoxyphenol, isoeugenol, and ethyl vanillate, by compounds formed by lactic acid bacteria, such as 2,3-butanedione and acetoin, or by compounds formed during the oxidative storage of wines, such as methional, sotolon, o-aminoacetophenone, and phenylacetic acid.
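One plausible reading of an averaged flavor dilution factor can be sketched as follows. It assumes each judge's FD is the largest dilution factor at which the odorant is still detected and that judges are averaged geometrically; both are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def averaged_fd(detections, dilution_factors=(1, 5, 50, 500)):
    """Averaged flavour-dilution (FD) factor across judges.
    `detections` holds, per judge, one boolean per dilution step
    (True = odour still detected at that dilution).  Per judge the FD
    is the largest dilution factor still detected; judges are combined
    with a geometric mean."""
    per_judge = []
    for judge in detections:
        detected = [f for f, d in zip(dilution_factors, judge) if d]
        per_judge.append(max(detected) if detected else 1)
    return float(np.exp(np.mean(np.log(per_judge))))
```

For instance, if one judge loses an odorant after the 1:5 dilution and another after the 1:50 dilution, the averaged FD is the geometric mean of 5 and 50.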

  5. Automated extraction of pleural effusion in three-dimensional thoracic CT images

    Science.gov (United States)

    Kido, Shoji; Tsunomori, Akinori

    2009-02-01

    It is important for the diagnosis of pulmonary diseases to quantitatively measure the volume of accumulating pleural effusion in three-dimensional thoracic CT images. However, correct automated extraction of pleural effusion is difficult. A conventional extraction algorithm using a gray-level based threshold cannot separate pleural effusion from the thoracic wall or mediastinum correctly, because the density of pleural effusion in CT images is similar to that of the thoracic wall or mediastinum. We have therefore developed an automated extraction method for pleural effusion that extracts the lung area together with the pleural effusion. Our method uses a template of a lung, obtained from a normal lung, for segmentation of lungs with pleural effusions. The registration process consists of two steps. The first step is global matching between normal and abnormal lungs of organs such as bronchi, bones (ribs, sternum and vertebrae) and the upper surfaces of the livers, which are extracted using a region-growing algorithm. The second step is local matching between the normal lung and the abnormal lung deformed by the parameters obtained from the global matching. Finally, we segment a lung with pleural effusion using the template deformed by the two sets of parameters obtained from the global and local matching. We compared our method with a conventional extraction method using a gray-level based threshold and with two published methods. The extraction rates of pleural effusions obtained by our method were much higher than those obtained by the other methods. This automated extraction method, based on extracting the lung area together with the pleural effusion, is promising for the diagnosis of pulmonary diseases by providing a quantitative volume of accumulating pleural effusion.

  6. Multi-Paradigm and Multi-Lingual Information Extraction as Support for Medical Web Labelling Authorities

    Directory of Open Access Journals (Sweden)

    Martin Labsky

    2010-10-01

    Full Text Available Until recently, quality labelling of medical web content has been a predominantly manual activity. However, advances in automated text processing have opened the way to computerised support of this activity. The core enabling technology is information extraction (IE). However, the heterogeneity of websites offering medical content imposes particular requirements on the IE techniques to be applied. In the paper we discuss these requirements and describe a multi-paradigm approach to IE that addresses them. Experiments on multi-lingual data are reported. The research has been carried out within the EU MedIEQ project.

  7. Solvent extraction of platinum with thiobenzanilide. Separation of platinum from copper

    International Nuclear Information System (INIS)

    Shkil', A.N.; Zolotov, Yu.A.

    1989-01-01

    The solvent extraction of micro concentrations of platinum from hydrochloric acid media has been investigated using thiobenzanilide in the presence of SnCl₂ and KI. In the presence of SnCl₂, platinum is extracted rapidly and almost completely. Conditions have been developed for the quantitative extraction of platinum. The authors have also examined the solvent extraction of copper(II) using thiobenzanilide, the interference of copper(II) and iron(III) in the solvent extraction of platinum, and methods to suppress this interference. A procedure has also been developed for the separation of platinum from copper. The solvent extraction of the metals was studied using the radioactive isotopes ¹⁹⁷Pt, ⁶⁴Cu, ⁵⁹Fe, ¹⁹⁸Au, ¹⁰⁹Pd and ¹¹⁰ᵐAg.

  8. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...

  9. Puffed cereals with added chamomile – quantitative analysis of polyphenols and optimization of their extraction method

    Directory of Open Access Journals (Sweden)

    Tomasz Blicharski

    2017-05-01

    For most of the analyzed compounds, the highest yields were obtained by ultrasound-assisted extraction (UAE). The highest temperature during the ultrasonication process (60 °C) increased the efficiency of extraction without degradation of the polyphenols. UAE easily reaches extraction equilibrium and therefore permits shorter extraction times, reducing the energy input. Furthermore, UAE meets the requirements of ‘Green Chemistry’.

  10. Evaluation of RNA extraction methods and identification of putative reference genes for real-time quantitative polymerase chain reaction expression studies on olive (Olea europaea L.) fruits.

    Science.gov (United States)

    Nonis, Alberto; Vezzaro, Alice; Ruperti, Benedetto

    2012-07-11

    Genome-wide transcriptomic surveys together with targeted molecular studies are uncovering an ever increasing number of differentially expressed genes in relation to agriculturally relevant processes in olive (Olea europaea L.). These data need to be supported by quantitative approaches enabling the precise estimation of transcript abundance. Since qPCR is the most widely adopted technique for mRNA quantification, preliminary work needs to be done to set up robust methods for the extraction of fully functional RNA and for the identification of the best reference genes to obtain reliable quantification of transcripts. In this work, we have assessed different methods for their suitability for RNA extraction from olive fruits and leaves, and we have evaluated thirteen potential candidate reference genes on 21 RNA samples belonging to fruit developmental/ripening series and to leaves subjected to wounding. By using two different algorithms, GAPDH2 and PP2A1 were identified as the best reference genes for olive fruit development and ripening, and their effectiveness for normalization of the expression of two ripening marker genes was demonstrated.
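Normalisation against reference genes such as GAPDH2 and PP2A1 is commonly done with the comparative Ct (ddCt) method; the sketch below assumes ideal doubling per PCR cycle and averages the reference-gene Ct values, which is a simplification of the algorithms the study used to rank candidates.

```python
def relative_expression(ct_target, ct_refs, ct_target_cal, ct_refs_cal):
    """Relative expression by the comparative Ct (ddCt) method:
    normalise the target's Ct against the mean Ct of the reference
    genes (e.g. GAPDH2 and PP2A1) in both the sample and a calibrator,
    then fold change = 2^-(dCt_sample - dCt_calibrator).
    Assumes ~100% amplification efficiency (doubling per cycle)."""
    d_sample = ct_target - sum(ct_refs) / len(ct_refs)
    d_cal = ct_target_cal - sum(ct_refs_cal) / len(ct_refs_cal)
    return 2.0 ** -(d_sample - d_cal)
```

A sample whose normalised Ct is two cycles lower than the calibrator's thus reports a fourfold higher expression.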

  11. An image-processing strategy to extract important information suitable for a low-size stimulus pattern in a retinal prosthesis.

    Science.gov (United States)

    Chen, Yili; Fu, Jixiang; Chu, Dawei; Li, Rongmao; Xie, Yaoqin

    2017-11-27

    A retinal prosthesis is designed to help the blind to obtain some sight. It consists of an external part and an internal part. The external part is made up of a camera, an image processor and an RF transmitter. The internal part is made up of an RF receiver, an implant chip and a microelectrode array. Currently, the number of microelectrodes is in the hundreds, and the mechanism by which an electrode stimulates the optic nerve is not fully understood. A simple hypothesis is that the pixels in an image correspond to the electrodes. The images captured by the camera should therefore be processed by suitable strategies to correspond to stimulation from the electrodes. Thus, the question is how to obtain the important information from the captured image. Here, we use a region of interest (ROI) extraction algorithm to retain the important information and to remove the redundant information. This paper explains the details of the principles and functions of the ROI. Because we are investigating a real-time system, we need a fast ROI-extraction algorithm. Thus, we simplified the ROI algorithm and used it in the external image-processing digital signal processing (DSP) system of the retinal prosthesis. The results show that our image-processing strategies are suitable for a real-time retinal prosthesis and can eliminate redundant information and provide useful information for expression in a low-size image.
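As a rough illustration of the kind of processing described, the sketch below crops a synthetic camera frame to a bright region of interest and block-averages it onto a small grid, one value per electrode. The thresholding rule, grid size and data are assumptions for illustration, not the paper's actual ROI algorithm:

```python
import numpy as np

def extract_roi(image, threshold=0.5):
    """Crop the image to the bounding box of pixels above `threshold`
    (a crude stand-in for a region-of-interest detector)."""
    mask = image > threshold
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return image[r0:r1 + 1, c0:c1 + 1]

def downsample(roi, grid=(10, 10)):
    """Block-average the ROI onto a small grid, one value per electrode."""
    gh, gw = grid
    h, w = roi.shape
    out = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            block = roi[i * h // gh:(i + 1) * h // gh,
                        j * w // gw:(j + 1) * w // gw]
            out[i, j] = block.mean() if block.size else 0.0
    return out

# Synthetic 64x64 camera frame with a bright object off-center.
img = np.zeros((64, 64))
img[10:30, 20:45] = 1.0
roi = extract_roi(img)
stim = downsample(roi, grid=(10, 10))
print(roi.shape, stim.shape)
```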

  12. An Informed Approach to Improving Quantitative Literacy and Mitigating Math Anxiety in Undergraduates Through Introductory Science Courses

    Science.gov (United States)

    Follette, K.; McCarthy, D.

    2012-08-01

    Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.

  13. Extractive Summarisation of Medical Documents

    OpenAIRE

    Abeed Sarker; Diego Molla; Cecile Paris

    2012-01-01

    Background Evidence Based Medicine (EBM) practice requires practitioners to extract evidence from published medical research when answering clinical queries. Due to the time-consuming nature of this practice, there is a strong motivation for systems that can automatically summarise medical documents and help practitioners find relevant information. Aim The aim of this work is to propose an automatic query-focused, extractive summarisation approach that selects informative sentences from medic...

  14. Automated prostate cancer localization without the need for peripheral zone extraction using multiparametric MRI.

    Science.gov (United States)

    Liu, Xin; Yetik, Imam Samil

    2011-06-01

    Multiparametric magnetic resonance imaging (MRI) has been shown to have higher localization accuracy than transrectal ultrasound (TRUS) for prostate cancer. Therefore, automated cancer segmentation using multiparametric MRI is receiving growing interest, since MRI can provide both morphological and functional images for tissue of interest. However, all automated methods to date are applicable to a single zone of the prostate, and the peripheral zone (PZ) of the prostate needs to be extracted manually, which is a tedious and time-consuming task. In this paper, our goal is to remove the need for PZ extraction by incorporating the spatial and geometric information of prostate tumors with multiparametric MRI derived from T2-weighted MRI, diffusion-weighted imaging (DWI) and dynamic contrast-enhanced MRI (DCE-MRI). To remove the need for PZ extraction, the authors propose a new method to incorporate the spatial information of the cancer. This is done by introducing a new feature called the location map. This new feature is constructed by applying a nonlinear transformation to the spatial position coordinates of each pixel, so that the location map implicitly represents the geometric position of each pixel with respect to the prostate region. This new feature is then combined with multiparametric MR images to perform tumor localization. The proposed algorithm is applied to multiparametric prostate MRI data obtained from 20 patients with biopsy-confirmed prostate cancer. The proposed method, which does not need masks of the PZ, was found to have a prostate cancer detection specificity of 0.84, a sensitivity of 0.80 and a Dice coefficient of 0.42. The authors have found that fusing the spatial information allows tumor outlines to be obtained without PZ extraction with considerable success (better or similar performance to methods that require manual PZ extraction). Our experimental results quantitatively demonstrate the effectiveness of the proposed method.
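The "location map" idea, a nonlinear transformation of pixel coordinates encoding position relative to the prostate region, can be sketched as follows. The exact transform used by the authors is not reproduced here; an exponential decay of the distance to the mask centroid stands in for it, purely as an assumption:

```python
import numpy as np

def location_map(mask, alpha=0.1):
    """Illustrative 'location map' feature: for each pixel, a nonlinear
    function of its distance to the centroid of the prostate mask.
    (The exact transform in the paper is not reproduced here.)"""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    yy, xx = np.mgrid[0:mask.shape[0], 0:mask.shape[1]]
    dist = np.hypot(yy - cy, xx - cx)
    return np.exp(-alpha * dist)   # 1 near the centroid, decaying outward

# Toy prostate mask on a 32x32 grid.
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 8:24] = True
lm = location_map(mask)
print(lm.shape)
```

In the paper this feature is stacked with the multiparametric MR channels before classification, so that position information enters the classifier without an explicit PZ mask.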

  15. Fuel reprocessing of the fast molten salt reactor: actinides and lanthanides extraction

    International Nuclear Information System (INIS)

    Jaskierowicz, S.

    2012-01-01

    The fuel reprocessing of the molten salt reactor (Gen IV concept) is a multi-step process in which actinide and lanthanide extraction is performed by a reductive extraction technique. The development of an analytic model showed that contacting the liquid fuel LiF-ThF4 with a metallic phase constituted of Bi-Li provides, firstly, a selective and quantitative extraction of actinides and, secondly, a quantitative extraction of lanthanides. The control of this process requires knowledge of the saline phase properties. Studies of the physico-chemical properties of fluoride salts led to the development of a technique based on potentiometric measurements to evaluate the fluoro-acidity of the salts. An acidity scale was established in order to classify the different fluoride salts considered. Another electrochemical method was also developed to determine the solvation properties of solutes in a fluoride (F-) environment (particularly the solvation of ThF4 by F-). In the reductive extraction technique, a metallic phase is also involved. A method to prepare this phase was developed by electro-reduction of lithium on a liquid bismuth cathode in a LiCl-LiF melt. This technique allows accurate control of the molar fraction of lithium introduced into the liquid bismuth, which is a key parameter for an efficient extraction. (author)

  16. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    textabstractAs today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  17. Quantify Water Extraction by TBP/Dodecane via Molecular Dynamics Simulations

    International Nuclear Information System (INIS)

    Khomami, Bamin; Cui, Shengting; De Almeida, Valmor F.

    2013-01-01

    The purpose of this project is to quantify the interfacial transport of water into the most prevalent nuclear reprocessing solvent extractant mixture, namely tri-butyl-phosphate (TBP) and dodecane, via massively parallel molecular dynamics simulations on the most powerful machines available for open research. Specifically, we will accomplish this objective by evolving the water/TBP/dodecane system up to 1 ms elapsed time, and validate the simulation results by direct comparison with experimentally measured water solubility in the organic phase. The significance of this effort is to demonstrate for the first time that the combination of emerging simulation tools and state-of-the-art supercomputers can provide quantitative information on par with experimental measurements for solvent extraction systems of relevance to the nuclear fuel cycle. Results: Initially, the isolated single-component and single-phase systems were studied, followed by the two-phase, multicomponent counterpart. Specifically, the systems we studied were: pure TBP; pure n-dodecane; the TBP/n-dodecane mixture; and the complete extraction system: the water-TBP/n-dodecane two-phase system, to gain deep insight into the water extraction process. We have completely achieved our goal of simulating the molecular extraction of water molecules into the TBP/n-dodecane mixture up to the saturation point, and obtained favorable comparison with experimental data. Many insights into fundamental molecular-level processes and physics were obtained. Most importantly, we found that the dipole moment of the extracting agent is crucially important in affecting the interface roughness and the extraction rate of water molecules into the organic phase. In addition, we have identified shortcomings in the existing OPLS-AA force field potential for long-chain alkanes. The significance of this force field is that it is supposed to be optimized for molecular liquid simulations. We found that it failed for dodecane and

  18. Lung region extraction based on the model information and the inversed MIP method by using chest CT images

    International Nuclear Information System (INIS)

    Tomita, Toshihiro; Miguchi, Ryosuke; Okumura, Toshiaki; Yamamoto, Shinji; Matsumoto, Mitsuomi; Tateno, Yukio; Iinuma, Takeshi; Matsumoto, Toru.

    1997-01-01

    We developed a lung region extraction method based on model information and the inversed MIP method for the Lung Cancer Screening CT (LSCT). The original model is composed of typical 3-D lung contour lines, a body axis, an apical point, and a convex hull. First, the body axis, the apical point, and the convex hull are automatically extracted from the input image. Next, the model is transformed by an affine transformation to fit these features of the input image. Using the same affine transformation coefficients, the typical lung contour lines are also transformed, yielding rough contour lines for the input image. Experimental results for 68 samples showed this method to be quite promising. (author)
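The affine fitting step described above (transforming the model so its features match those extracted from the input image) can be sketched as a least-squares problem. The landmarks and the pure scale-plus-shift target below are made up for illustration:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src landmarks onto dst.
    src, dst: (N, 2) arrays; returns the 2x3 matrix [A | t]."""
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])     # (N, 3): coords + constant
    coeffs, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return coeffs.T                            # (2, 3)

# Model landmarks (e.g. apical point plus contour points) and their
# hypothetical counterparts found in the input image.
model = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 20.0], [10.0, 20.0]])
target = model * 1.5 + np.array([3.0, -2.0])  # pure scale + shift
M = fit_affine(model, target)

# The same coefficients then map every model contour point.
mapped = model @ M[:, :2].T + M[:, 2]
print(np.allclose(mapped, target))
```

The same 2×3 matrix is reused for all the model contour lines, mirroring how the paper applies one set of affine coefficients to the whole model.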

  19. Extraction-spectrophotometric determination of purine alkaloids in water solutions using aliphatic alcohols

    Directory of Open Access Journals (Sweden)

    Y. I. Korenman

    2012-01-01

    Full Text Available Aliphatic alcohols C3–C9 were applied for the extraction of caffeine, theobromine and theophylline from aqueous solutions. The aqueous concentrates were analyzed by UV spectrophotometry. Distribution coefficients and degrees of extraction were calculated. The influence of the length of the hydrocarbon radical of the solvent and of the nature of the salting-out agent on the interphase distribution of the alkaloids was studied. The dependence of the quantitative extraction characteristics on the number of active groups in the alkaloid structure was established.
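The quantitative characteristics mentioned (distribution coefficient and degree of extraction) follow from standard liquid-liquid extraction formulas; the concentrations below are hypothetical numbers for illustration:

```python
def distribution_coefficient(c_org, c_aq):
    """D = concentration in the organic phase / concentration in the
    aqueous phase, at equilibrium."""
    return c_org / c_aq

def extraction_degree(D, v_aq=1.0, v_org=1.0):
    """Percent of solute transferred to the organic phase in one contact:
    R% = 100 * D / (D + V_aq / V_org)."""
    return 100.0 * D / (D + v_aq / v_org)

# Hypothetical numbers: an alkaloid partitioning into an aliphatic alcohol.
D = distribution_coefficient(c_org=8.0, c_aq=2.0)   # D = 4
print(D, extraction_degree(D))                       # equal volumes
```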

  20. Comparison of extraction fluids used with contaminated soils

    International Nuclear Information System (INIS)

    Erickson, D.C.; White, E.; Loehr, R.C.

    1991-01-01

    Five separate solutions were evaluated for use as leaching fluids with soils containing petroleum refining waste residues. The extraction fluids were: (a) water, (b) dilute hydrochloric acid, (c) 0.05 molar EDTA, (d) acetate buffer and (e) a dilute sulfuric/nitric acid mixture. The soils were collected from former refinery land treatment sites which had been used to treat petroleum refining wastes. The extractions were performed using a rotary tumbler (30 RPM, 18 hours) and the resulting solutions were analyzed for polynuclear aromatic hydrocarbons (PAHs) and metals. Concentrations of the PAHs in each of the five solutions were near or below the analytical quantitation limits. Metal concentrations were highest in the HCl and EDTA extracts, although only a small fraction of the total available metal present in the soils was extracted by the solutions evaluated.

  1. Automated quantitative assessment of proteins' biological function in protein knowledge bases.

    Science.gov (United States)

    Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter

    2008-01-01

    Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In the absence of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report on a computer-assisted method which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence yet highlights the comprehension of functional data. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.

  2. Automated Quantitative Assessment of Proteins' Biological Function in Protein Knowledge Bases

    Directory of Open Access Journals (Sweden)

    Gabriele Mayr

    2008-01-01

    Full Text Available Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and further evaluate those on a subjective basis. In the absence of a standardized procedure for scoring the existing knowledge regarding individual proteins, we here report on a computer-assisted method which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence yet highlights the comprehension of functional data. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.

  3. Rapid liquid–liquid extraction of thallium(III) from succinate media with 2-octylaminopyridine in chloroform as the extractant

    Directory of Open Access Journals (Sweden)

    SANDIP V. MAHAMUNI

    2008-04-01

    Full Text Available A simple solvent extraction study of thallium(III) was conducted. Selective and quantitative extraction of thallium(III) by 2-octylaminopyridine (2-OAP) in chloroform occurred from aqueous sodium succinate medium (0.0075 M) at pH 3.0. Thallium(III) was back extracted with acetate buffer (pH 4.63). The effect of the concentration of succinate and 2-OAP, the role of various diluents, stripping agents, loading capacity of 2-OAP, equilibrium time and aqueous:organic volume ratio on the extraction of thallium(III) was studied. The stoichiometry of the extracted species was determined based on the slope analysis method and found to be 1:2:1 (metal:acid:extractant). The temperature dependence of the extraction equilibrium constant was also examined to estimate the apparent thermodynamic functions ∆H, ∆G and ∆S for the extraction reaction. The method is free from interference of a large number of cations and anions. The method was used for the selective extraction of thallium(III) from its binary mixture with Zn(II), Cd(II), Hg(II), Bi(III), Pb(II), Se(IV), Te(IV), Sb(III), Ga(III), In(III), Al(III), Tl(I) and Fe(III). The proposed method was applied to the synthetic mixtures and alloys. It is simple, selective, rapid and eco-friendly.

  4. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Full Text Available Abstract Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization based on the use of the MagNA Pure total nucleic acid large volume isolation kit for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results The newly developed automated protocol was shown to be free from carry-over contamination and have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  5. The profile of quantitative risk indicators in Krsko NPP

    International Nuclear Information System (INIS)

    Vrbanic, I.; Basic, I.; Bilic-Zabric, T.; Spiler, J.

    2004-01-01

    During the past decade a strong initiative was observed aimed at incorporating information on risk into various aspects of the operation of nuclear power plants. The initiative was observable in activities carried out by regulators as well as utilities and industry. It resulted in establishing the process, or procedure, which is often referred to as integrated decision making or risk-informed decision making. In this process, engineering analyses and evaluations that are usually termed traditional and that rely on considerations of safety margins and defense in depth are supplemented by quantitative indicators of risk. Throughout the process, the plant risk was most commonly expressed in terms of the likelihood of events involving damage to the reactor core and events with radiological releases to the environment. These became two commonly used quantitative indicators or metrics of plant risk (or, reciprocally, plant safety). They were evaluated for their magnitude (e.g. the expected number of events per specified time interval), as well as their profile (e.g. the types of contributing events). The information for quantitative risk indicators (to be used in the risk-informing process) is obtained from the plant's probabilistic safety analyses or analyses of hazards. It depends on issues such as the availability of input data and the quality of the model or analysis. Nuclear power plant Krsko has recently performed a Periodic Safety Review, which was a good opportunity to evaluate and integrate plant-specific information on quantitative plant risk indicators and their profile. The paper discusses some aspects of the quantitative plant risk profile and its perception. (author)

  6. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson Svärd, Staffan, E-mail: staffan.jacobsson_svard@physics.uu.se; Holcombe, Scott; Grape, Sophie

    2015-05-21

    A fuel assembly operated in a nuclear power plant typically contains 100–300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to experimentally deduce rod-wise information from the fuel, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application; (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution, based on the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are accounted for. As compared to application of SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and the deduction of conclusive data requires detailed modelling of the attenuation to be introduced in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies are described and discussed in terms of their quantitative performance for two applications; verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a request not to base the former assessment on any a priori information constrains which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative
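As a toy illustration of the reconstruction step (recovering a rod-wise source distribution from recorded flux data), the sketch below runs a basic Kaczmarz/ART solver on a 2×2 "rod" grid with a hand-made system matrix. The geometry and unit path weights are assumptions; real SPECT on fuel assemblies requires the detailed attenuation modelling discussed in the abstract, which is not attempted here:

```python
import numpy as np

def kaczmarz(A, b, n_iter=2000):
    """Algebraic reconstruction (Kaczmarz sweeps): solve A x ≈ b by
    successive projections onto each measurement's hyperplane."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a
    return x

# Toy 2x2 rod grid observed along rows, columns and one diagonal.
# Each row of A contains the path contribution of every rod to one
# detector reading; real weights would encode attenuation.
A = np.array([
    [1.0, 1.0, 0.0, 0.0],   # top row
    [0.0, 0.0, 1.0, 1.0],   # bottom row
    [1.0, 0.0, 1.0, 0.0],   # left column
    [0.0, 1.0, 0.0, 1.0],   # right column
    [1.0, 0.0, 0.0, 1.0],   # diagonal
])
true_sources = np.array([1.0, 0.8, 0.9, 0.0])   # one 'missing' rod
b = A @ true_sources                             # simulated readings
x = kaczmarz(A, b)
print(np.round(x, 3))
```

A reconstruction close to `true_sources`, including the zero entry, is the kind of result needed for completeness verification in safeguards.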

  7. Multineuronal vectorization is more efficient than time-segmental vectorization for information extraction from neuronal activities in the inferior temporal cortex.

    Science.gov (United States)

    Kaneko, Hidekazu; Tamura, Hiroshi; Tate, Shunta; Kawashima, Takahiro; Suzuki, Shinya S; Fujita, Ichiro

    2010-08-01

    In order for patients with disabilities to control assistive devices with their own neural activity, multineuronal spike trains must be efficiently decoded because only limited computational resources can be used to generate prosthetic control signals in portable real-time applications. In this study, we compare the abilities of two vectorizing procedures (multineuronal and time-segmental) to extract information from spike trains during the same total neuron-seconds. In the multineuronal vectorizing procedure, we defined a response vector whose components represented the spike counts of one to five neurons. In the time-segmental vectorizing procedure, a response vector consisted of components representing a neuron's spike counts for one to five time-segment(s) of a response period of 1 s. Spike trains were recorded from neurons in the inferior temporal cortex of monkeys presented with visual stimuli. We examined whether the amount of information of the visual stimuli carried by these neurons differed between the two vectorizing procedures. The amount of information calculated with the multineuronal vectorizing procedure, but not the time-segmental vectorizing procedure, significantly increased with the dimensions of the response vector. We conclude that the multineuronal vectorizing procedure is superior to the time-segmental vectorizing procedure in efficiently extracting information from neuronal signals. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
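The two vectorizing procedures can be sketched directly from their definitions: one spike count per neuron, versus spike counts per time segment of a single neuron's 1 s response. The synthetic spike trains below are random stand-ins for the recorded data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spike times (s) within a 1 s response period for 5 neurons.
spike_trains = [np.sort(rng.uniform(0.0, 1.0, size=rng.integers(5, 30)))
                for _ in range(5)]

def multineuronal_vector(trains):
    """One spike count per neuron over the whole response period."""
    return np.array([len(t) for t in trains])

def time_segmental_vector(train, n_segments=5, period=1.0):
    """Spike counts of a single neuron in equal time segments."""
    edges = np.linspace(0.0, period, n_segments + 1)
    counts, _ = np.histogram(train, bins=edges)
    return counts

mv = multineuronal_vector(spike_trains)            # 5 neurons x 1 s
tv = time_segmental_vector(spike_trains[0], n_segments=5)
print(mv, tv)
```

Both vectors use the same total neuron-seconds (five), matching the comparison made in the study.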

  8. Extraction of plutonium and uranium from oxalate bearing solutions using phosphonic acid

    International Nuclear Information System (INIS)

    Godbole, A.G.; Mapara, P.M.; Swarup, Rajendra

    1995-01-01

    A feasibility study on the solvent extraction of plutonium and uranium from solutions containing oxalic and nitric acids using a phosphonic acid extractant (PC88A) was made to explore the possibility of recovering Pu from these solutions. Batch experiments on the extraction of Pu(IV) and U(VI) under different parameters were carried out using PC88A in dodecane. The results indicated that Pu could be extracted quantitatively by PC88A from these solutions. A good separation of Pu from U could be achieved at higher temperatures. (author). 6 refs., 3 tabs

  9. Analytical and preparative separation of organic acids from water by extraction with trioctylamine

    International Nuclear Information System (INIS)

    Eberle, S.H.; Hoyer, O.; Knobel, K.P.; Hodenberg, S. von

    1977-12-01

    The extraction of pure organic acids and of humic and ligninsulfonic acid from water by a solution of trioctylamine in chloroform was investigated (technical grade amine = ALAMINE). Quantitative separation is achieved by double extraction with 5% ALAMINE at pH 3,5 - 4. The acids may be back-extracted with dilute sodium hydroxide solution. Procedures are described for the analytical extraction of water samples of 200 to 2.000 ml and for the flow-through processing of large water volumes. (orig.) [de

  10. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  11. LC-MS/MS quantitative analysis of reducing carbohydrates in soil solutions extracted from crop rhizospheres.

    Science.gov (United States)

    McRae, G; Monreal, C M

    2011-06-01

    A simple, sensitive, and specific analytical method has been developed for the quantitative determination of 15 reducing carbohydrates in the soil solution of crop rhizosphere. Reducing carbohydrates were derivatized with 1-phenyl-3-methyl-5-pyrazolone, separated by reversed-phase high-performance liquid chromatography and detected by electrospray ionization tandem mass spectrometry. Lower limits of quantitation of 2 ng/mL were achieved for all carbohydrates. Quantitation was performed using peak area ratios (analyte/internal standard) and a calibration curve spiked in water with glucose-d(2) as the internal standard. Calibration curves showed excellent linearity over the range 2-100 ng/mL (10-1,000 ng/mL for glucose). The method has been tested with quality control samples spiked in water and soil solution samples obtained from the rhizosphere of wheat and canola and has been found to provide accurate and precise results.
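The quantitation scheme described (peak-area ratios against an internal standard, with a calibration curve spiked in water) reduces to a linear fit and a back-calculation. The area ratios below are invented for illustration:

```python
import numpy as np

# Calibration standards: known analyte concentrations (ng/mL) spiked in
# water, with glucose-d2 as internal standard. The peak-area ratios
# (analyte/IS) are hypothetical numbers.
conc = np.array([2.0, 10.0, 25.0, 50.0, 100.0])
area_ratio = np.array([0.041, 0.198, 0.502, 1.005, 1.990])

# Fit the calibration line: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area_ratio, 1)

def quantify(ratio):
    """Back-calculate a sample concentration from its peak-area ratio."""
    return (ratio - intercept) / slope

sample = quantify(0.40)   # a rhizosphere sample's measured ratio
print(round(sample, 1))
```

Because the internal standard corrects for extraction and ionization variability, the ratio-based curve stays linear over the stated 2–100 ng/mL range.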

  12. Quantitative determination of sotolon, maltol and free furaneol in wine by solid-phase extraction and gas chromatography-ion-trap mass spectrometry.

    Science.gov (United States)

    Ferreira, Vicente; Jarauta, Idoia; López, Ricardo; Cacho, Juan

    2003-08-22

    A method for the analytical determination of sotolon [4,5-dimethyl-3-hydroxy-2(5H)-furanone], maltol [3-hydroxy-2-methyl-4H-pyran-4-one] and free furaneol [2,5-dimethyl-4-hydroxy-3(2H)-furanone] in wine has been developed. The analytes are extracted from 50 ml of wine in a solid-phase extraction cartridge filled with 800 mg of LiChrolut EN resins. Interferences are removed with 15 ml of a pentane-dichloromethane (20:1) solution, and analytes are recovered with 6 ml of dichloromethane. The extract is concentrated up to 0.1 ml and analyzed by GC-ion trap MS. Maltol and sotolon were determined by selected ion storage of ions in the m/z ranges 120-153 and 79-95, using the ions m/z 126 and 83 for quantitation, respectively. Furaneol was determined by non-resonant fragmentation of the m/z 128 mother ion and subsequent analysis of the m/z 81 ion. The detection limits of the method are in all cases between 0.5 and 1 microg l(-1), well below the olfactory thresholds of the compounds. The precision of the method is in the 4-5% range for levels in wine around 20 microg l(-1). Linearity holds at least up to 400 microg l(-1), and is satisfactory in all cases. The recoveries of maltol and sotolon are constant (70 and 64%, respectively) and do not depend on the type of wine. On the contrary, in the case of furaneol, red wines show constant and high recoveries (97%), while the recoveries on white wines range between 30 and 80%. Different experiments showed that this behavior is probably due to the existence of complexes formed between furaneol and sulphur dioxide or catechols. Sensory experiments confirmed that the complexed forms found in white wines are not perceived by orthonasal olfaction, and that the furaneol determined by the method can be considered as the free and odor-active fraction.

  13. Efficacy of Pitfall Trapping, Winkler and Berlese Extraction Methods for Measuring Ground-Dwelling Arthropods in Moist-Deciduous Forests in the Western Ghats

    Science.gov (United States)

    Sabu, Thomas K.; Shiju, Raj T.

    2010-01-01

    The present study provides data for deciding on the most appropriate method for sampling ground-dwelling arthropods in a moist-deciduous forest in the Western Ghats in South India. The abundance of ground-dwelling arthropods was compared among large numbers of samples obtained using pitfall trapping and the Berlese and Winkler extraction methods. The highest abundance and frequency of most of the represented taxa indicated pitfall trapping as the ideal method for sampling ground-dwelling arthropods. However, with its possible bias towards surface-active taxa, pitfall-trapping data is inappropriate for quantitative studies, and Berlese extraction is the better alternative: Berlese extraction is the better method for quantitative measurements, whereas pitfall trapping would be appropriate for qualitative measurements. A comparison of the Berlese and Winkler extraction data shows that in a quantitative multigroup approach, Winkler extraction was inferior to Berlese extraction because the total number of arthropods caught was the lowest, and many of the taxa that were caught from an identical sample via the Berlese extraction method were not caught. A significantly greater frequency and higher abundance of arthropods belonging to Orthoptera, Blattaria, and Diptera occurred in pitfall-trapped samples, and of Psocoptera and Acariformes in Berlese-extracted samples, than were obtained with the other two methods, indicating that both methods are useful, one complementing the other, eliminating the chance of under-representation of taxa in quantitative studies. PMID:20673122

  14. Laboratory Assessment of Bio-efficacies of Phytochemical Extracts ...

    African Journals Online (AJOL)

    They were ground into powder and stored in air-tight glass bottles. The volatile phytochemical oil ... degrees of mosquitocidal activity. These observed variations in the bio-efficacies of the different extracts could be attributed to the corresponding variations in their qualitative and quantitative bioactive compound contents.

  15. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the web sites of national and local governments, has enabled discussion of macroscopic financial trends. However, it is still difficult to grasp nationwide how each spot was changed by public works. The purpose of our research is to collect, at reasonable cost, the road update information that various road managers provide, in order to realize efficient updating of maps such as car navigation maps. In particular, we develop a system that extracts the public works concerned and automatically registers summaries, including position information, to a database from the public work order outlooks released by each local government, combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan, and confirm the feasibility of our method.

  16. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by two indices, for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object"-oriented verification methods, as the latter tend to exhibit strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards the goal of identifying a weak physical process contributing to forecast error, are also pointed out.
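
The scale-plus-localization indexing described above can be illustrated with a minimal 2D Haar decomposition. This is a sketch on synthetic fields, not the COAMPS data, and the Haar wavelet and the per-scale error measure are illustrative choices, not necessarily the authors':

```python
import numpy as np

def haar_step(x):
    """One 2D Haar analysis step: returns (approximation, (H, V, D) details)."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, (h, v, d)

def scale_errors(fcst, obs, levels=4):
    """Mean-squared detail-coefficient error per scale (fine to coarse).
    Each coefficient carries a scale index (the level) and a localization
    index (its grid position), which is what makes a per-scale,
    per-location verification possible."""
    errs = []
    fa, oa = fcst, obs
    for _ in range(levels):
        fa, fd = haar_step(fa)
        oa, od = haar_step(oa)
        errs.append(float(np.mean([(x - y) ** 2 for x, y in zip(fd, od)])))
    return errs

rng = np.random.default_rng(0)
obs = rng.random((64, 64))                       # stand-in "observed" field
fcst = obs + 0.1 * rng.standard_normal(obs.shape)  # stand-in "forecast"
errs = scale_errors(fcst, obs)
print(errs)
```

Summing squared coefficient differences over all scales and locations would recover the total field error, so the per-scale curve is a decomposition of the ordinary error score rather than a new one.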

  17. Academic Activities Transaction Extraction Based on Deep Belief Network

    Directory of Open Access Journals (Sweden)

    Xiangqian Wang

    2017-01-01

    Full Text Available Extracting information about academic activity transactions from unstructured documents is a key problem in the analysis of researchers' academic behavior. An academic activity transaction includes five elements: person, activity, object, attribute, and time phrase. The traditional approach to information extraction is to extract shallow text features and then recognize advanced features from the text with supervision. Since the information processing of the different levels is completed in separate steps, the errors generated in the various steps accumulate and affect the accuracy of the final results. Because the Deep Belief Network (DBN) model has the ability to learn advanced features from shallow text features automatically and without supervision, the model is employed here to extract academic activity transactions. In addition, we use character-based features to describe the raw features of the named entities of academic activities, so as to improve the accuracy of named entity recognition. In this paper, the accuracy of academic activity extraction is compared using character-based and word-based feature vectors to express the text features, and against traditional text information extraction based on Conditional Random Fields. The results show that the DBN model is more effective for the extraction of academic activity transaction information.
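
The abstract does not specify the character-based feature design; one common form such raw features take is a bag of character n-grams with boundary markers, sketched here as a hypothetical illustration:

```python
def char_features(token, n=2):
    """Bag of character n-grams for one token, with ^/$ boundary markers.
    A hypothetical stand-in for the character-based raw features the
    paper feeds into named entity recognition."""
    padded = f"^{token}$"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

print(char_features("prof"))  # ['^p', 'pr', 'ro', 'of', 'f$']
```

Character n-grams let morphologically similar entity mentions (e.g. abbreviations, inflected forms) share features even when whole-word vectors treat them as unrelated.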

  18. Extract the Relational Information of Static Features and Motion Features for Human Activities Recognition in Videos

    Directory of Open Access Journals (Sweden)

    Li Yao

    2016-01-01

    Full Text Available Both static features and motion features have shown promising performance in human activity recognition tasks. However, the information included in these features is insufficient for complex human activities. In this paper, we propose extracting the relational information of static features and motion features for human activity recognition. The videos are represented by a classical Bag-of-Words (BoW) model, which has proved useful in many works. To get a compact and discriminative codebook of small dimension, we employ a divisive algorithm based on KL-divergence to reconstruct the codebook. After that, to further capture strong relational information, we construct a bipartite graph to model the relationship between words of the different feature sets. Then we use a k-way partition to create a new codebook in which similar words are grouped together. With this new codebook, videos can be represented by a new BoW vector with strong relational information. Moreover, we propose a method to compute new clusters from the divisive algorithm's projective function. We test our work on several datasets and obtain very promising results.

  19. Solvent extraction of some metal ions by dithiocarbamate types of chemically modified lipophilic chitosan

    International Nuclear Information System (INIS)

    Inoue, K.; Nakagawa, H.; Naganawa, H.; Tachimori, S.

    2001-01-01

    Chitosan is a basic polysaccharide containing primary amino groups of high reactivity. We prepared O,O'-decanoyl chitosan and dithiocarbamate O,O'-decanoyl chitosan; the former was soluble in chloroform and toluene, while the latter was soluble not only in these diluents but also in some aliphatic diluents, such as hexane and kerosene, which are employed in commercial-scale solvent extraction. Solvent extraction by dithiocarbamate O,O'-decanoyl chitosan in kerosene was tested for some base metal ions from sulfuric acid solution. The sequence of selectivity for these metal ions was found to be as follows: Cu(II) >> Ni(II) > Cd(II) ∼ Fe(III) > Co(II) ∼ Zn(II). Copper(II) was quantitatively extracted at pH > 1 and quantitatively stripped with 2 M sulfuric acid solution. Solvent extraction of silver(I) and gold(III) from hydrochloric acid, as well as of lanthanides and americium(III) from nitrate solution, was also tested. Americium was selectively extracted over the trivalent lanthanides, suggesting a high potential for the final treatment of high-level radioactive wastes. (authors)

  20. Ion-pair extraction of [3H]stobadine from biological fluids

    International Nuclear Information System (INIS)

    Scasnar, V.

    1997-01-01

    A simple and specific radiometric assay was developed for the determination of stobadine, a cardioprotective drug, in the serum of experimental animals. It is based on a single-step extraction of the radioactively labeled drug from serum into a benzene solution of cobalt dicarbollide, followed by quantitation of the extracted radioactivity using liquid scintillation counting. The extraction mechanism involves ion-pair formation between the protonated molecule of stobadine and the hydrophobic, negatively charged molecule of cobalt dicarbollide. The extraction yield of stobadine from 1 ml of serum was 95% in the concentration range from 1 to 6000 ng/ml. The co-extraction of metabolites was less than 5%. The assay was applied to the determination of stobadine in the serum of dogs, and the data obtained were in good agreement with those obtained by high-performance liquid chromatography. (author)

  1. Identification of ginseng root using quantitative X-ray microtomography.

    Science.gov (United States)

    Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao

    2017-07-01

    The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. X-ray phase-contrast microtomography was used to investigate the microstructure of ginseng, and a phase-retrieval method was employed to process the experimental data. Four different ginseng samples were collected and investigated; these were classified according to their species, production area, and growth pattern. The quantitative internal characteristic microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and amount of the COCCs in the different species of ginseng were obtained by quantitative analysis of the three-dimensional microstructures, which shows obvious differences among the four species. This study is the first to provide evidence of the distribution characteristics of COCCs for identifying the four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.

  2. Pressurized Hot Water Extraction of anthocyanins from red onion: A study on extraction and degradation rates

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Erik V.; Liu Jiayin; Sjoeberg, Per J.R.; Danielsson, Rolf [Uppsala University, Department of Physical and Analytical Chemistry, P.O. Box 599, SE-751 24, Uppsala (Sweden); Turner, Charlotta, E-mail: Charlotta.Turner@kemi.uu.se [Uppsala University, Department of Physical and Analytical Chemistry, P.O. Box 599, SE-751 24, Uppsala (Sweden)

    2010-03-17

    Pressurized Hot Water Extraction (PHWE) is a quick, efficient and environmentally friendly technique for extractions. However, when using PHWE to extract thermally unstable analytes, extraction and degradation effects occur at the same time and thereby compete. At first the extraction effect dominates, but degradation effects soon take over. In this paper, extraction and degradation rates of anthocyanins from red onion were studied with experiments in a static batch reactor at 110 °C. A total extraction curve was calculated from the data of the actual extraction and degradation curves, showing that more anthocyanins, 21-36% depending on the species, could be extracted if no degradation occurred, but then longer extraction times would be required than those needed to reach the peak level in the apparent extraction curves. The results give information about the different kinetic processes competing during an extraction procedure.

  3. Phytochemicals and heavy metals analysis of methanolic extract of edible mushrooms collected from Karak District, Khyber Pakhtunkhwa, Pakistan

    Directory of Open Access Journals (Sweden)

    Farhan

    2016-09-01

    Full Text Available Objective: To qualitatively evaluate the phytochemicals and quantitatively determine the heavy metals in three species of edible mushrooms collected from the Karak area of Khyber Pakhtunkhwa, Pakistan. Methods: The samples were subjected to methanolic extraction, and the extract was concentrated using a rotary evaporator. The methanolic extract was screened in a qualitative study of various phytochemicals and a quantitative measurement of heavy metals. Results: The phytochemicals were confirmed by carrying out different tests. Among them, alkaloids, flavonoids, proteins and carbohydrates were found to be present in the extracts, while saponins and glycosides were not detected. A quantitative study of heavy metals was also conducted on the same extracts of the edible mushrooms. All the metals examined were present, with iron at the highest concentration and nickel at the lowest. Conclusions: The concentrations of heavy metals differed among the samples. The presence of different phytochemicals in the mushrooms is the key to their active biological profile.

  4. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  5. How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.

    Science.gov (United States)

    Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G

    2014-10-01

    Of the more than 100 liver diseases described, many of those with high incidence rates manifest themselves through histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis, and, in its later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogeneses are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of interactions as well as the involvement of processes at many different time and length scales constrains the possibility to condense disease processes in illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and mathematical models, opens up a promising new approach towards a quantitative understanding of pathologies and of disease processes. This strategy is discussed for two examples: ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass as well as architecture during the subsequent regeneration process. This interdisciplinary approach permits integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations to promote unravelling the relation between architecture and function, as illustrated below for liver regeneration, and bridging from the in vitro situation and animal models to humans. In the near future novel mechanisms will usually not be directly elucidated by modelling. However, models will falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  6. The extraction behaviour of As(III) and As(V) in APDC-CCl4 system

    International Nuclear Information System (INIS)

    Yang Ruiying; Zhu Xuping

    1997-01-01

    The extraction and back-extraction behaviour of As(III) and As(V) in the APDC-CCl4 system has been studied using the ⁷⁶As tracer technique. As(III) can be extracted quantitatively by the APDC-CCl4 system at pH 1-3. As(V) can be extracted after being reduced to As(III) by Na2S2O3. Water of high pH can be used for back-extraction. The method can be applied to the separation of inorganic As(III) and As(V) in water quality inspection

  7. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    Science.gov (United States)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase of resolution, remote sensing images carry a greater information load, more noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a high-resolution remote sensing image building extraction method based on a Markov model. The method introduces Contourlet-domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the built-up area. Through multi-scale segmentation and extraction of image features, fine extraction from the built-up area down to individual buildings is realized. Experiments show that this method can restrain the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture information, and remove shadow, vegetation and other pseudo-building information; compared with traditional pixel-level image information extraction, it performs better in building extraction precision, accuracy and completeness.

  8. On effect of diluent nature on synergistic extraction

    International Nuclear Information System (INIS)

    Shmidt, V.S.; Rybakov, K.A.; Rubisov, V.N.

    1982-01-01

    Published experimental data on the effect of diluent nature on the extraction of metals by mixtures of acidic (HA) and neutral (B) extractants are analysed using correlations based on linear free-energy relationships. It is found that the logarithm of the equilibrium constant of formation of the MA_nB_m adduct in the organic phase, which causes the synergism, decreases linearly as the tabulated diluent parameter BP* increases, according to lg K_s = lg K_os − a·BP*, while the sensitivity coefficient a grows roughly in proportion to the solvation number m, and lg K_os increases with the extraction ability of B. The logarithms of the metal extraction constants for mixtures of extractants (K_ex) likewise decrease linearly as the diluent BP* increases, the sensitivity coefficient of this dependence being connected with the physical distribution constant of HA and its hydrophobicity. The regularities found permit forecasting, on the BP* scale, the effect of diluent nature on the synergistic extraction of metal cations by mixtures of acidic extractants of different hydrophobicity with neutral extractants, and describing quantitatively, in compact form, a large body of extraction-constant data for series of such systems within which only the nature of the diluent changes

  9. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short-duration visual cues which draw attention to solution-relevant information, and aid in organizing and integrating it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  10. Liquid-liquid extraction of uranium(VI) using Cyanex 272 in toluene from sodium salicylate medium

    International Nuclear Information System (INIS)

    Madane, Namdev S.; Nikam, Gurunath H.; Jadhav, Deepali V.; Mohite, Baburao S.

    2011-01-01

    Liquid-liquid extraction of U(VI) from sodium salicylate media using Cyanex 272 in toluene has been carried out. Uranium(VI) was quantitatively extracted from 1 × 10⁻³ M sodium salicylate with 5 × 10⁻⁴ M Cyanex 272 in toluene. It was stripped quantitatively from the organic phase with 1 M HCl and determined spectrophotometrically with arsenazo(III) at 660 nm. The effects of the concentrations of sodium salicylate, extractant, diluents, metal ion and strippants have been studied. Separation of uranium(VI) from other elements was achieved from binary as well as multicomponent mixtures. The method was extended to the determination of uranium(VI) in geological samples. The method is simple, rapid and selective, with good reproducibility (approximately ±2%). (author)

  11. Extraction and separation studies of Ga(III), In(III) and Tl(III) using the neutral organophosphorus extractant, Cyanex-923

    Directory of Open Access Journals (Sweden)

    P. M. DHADKE

    2003-07-01

    Full Text Available The neutral extractant Cyanex-923 has been used for the extraction and separation of gallium(III), indium(III) and thallium(III) from acidic solution. These metal ions were found to be quantitatively extracted with Cyanex-923 in toluene in the pH ranges 4.5-5.5, 5.0-6.5 and 1.5-3.0, respectively, and they can be stripped from the organic phase with 2.0 mol dm⁻³ HNO₃, 3.0 mol dm⁻³ HNO₃ and 3.0 mol dm⁻³ HCl, respectively. The effects of pH, equilibration period, diluents, diverse ions and stripping agents on the extraction of Ga(III), In(III) and Tl(III) have been studied. The stoichiometry of the extracted species of these metal ions was determined on the basis of the slope analysis method. The reaction proceeds by solvation, and the probable extracted species were found to be [MCl₃·3Cyanex-923] [where M = Ga(III) or In(III)] and [HTlCl₄·3Cyanex-923]. Based on these results, a sequential procedure for the separation of Ga(III), In(III) and Tl(III) from each other was developed.

  12. A METHOD OF EXTRACTING SHORELINE BASED ON SEMANTIC INFORMATION USING DUAL-WAVELENGTH LiDAR DATA

    Directory of Open Access Journals (Sweden)

    C. Yao

    2017-09-01

    Full Text Available A shoreline is a spatially varying separation between water and land. By utilizing dual-wavelength LiDAR point data together with the semantic information that a shoreline often appears beyond the water surface profile and is observable on the beach, the paper generates the shoreline as follows. (1) Obtain the water surface profile: first we obtain the water surface by roughly selecting water points based on several features of the water body, then apply a least-squares fitting method to get the whole water trend surface. We then get the ground surface connecting to the under-water surface by both a TIN progressive filtering method and a surface interpolation method. After that, the two fitted surfaces are intersected to get the water surface profile of the island. (2) Obtain the sandy beach: we grid all points and select the water-surface-profile grid cells as seeds, then extract sandy beach points based on an eight-neighborhood method and features, thereby obtaining all sandy beaches. (3) Get the island shoreline: first we get the sandy beach shoreline based on intensity information, using a threshold value to distinguish wet and dry areas; this gives the shorelines of several sandy beaches. To some extent the shoreline has the same height within a small area, so all the sandy shoreline points are used to fit a plane P, and the intersection line of the ground surface with the shoreline plane P can be regarded as the island shoreline. Comparison with a surveyed shoreline shows that the proposed method can successfully extract the shoreline.
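
The eight-neighborhood seed growing of step (2) can be sketched as a breadth-first region growing over grid cells. The grid encoding and the beach predicate below are placeholders for illustration, not the authors' actual LiDAR features:

```python
from collections import deque

def grow_region(grid, seeds, predicate):
    """8-neighborhood region growing: starting from seed cells (here,
    standing in for water-surface-profile cells), flood outwards over
    grid cells whose value satisfies a feature predicate (standing in
    for the sandy-beach criteria). `grid` maps (row, col) -> value."""
    region = set(seeds)
    queue = deque(seeds)
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    while queue:
        r, c = queue.popleft()
        for dr, dc in nbrs:
            cell = (r + dr, c + dc)
            if cell in grid and cell not in region and predicate(grid[cell]):
                region.add(cell)
                queue.append(cell)
    return region

# Toy 4x5 grid of "intensity" values; cells >= 0.5 count as beach-like.
grid = {(r, c): (0.9 if c <= 2 else 0.1) for r in range(4) for c in range(5)}
beach = grow_region(grid, [(0, 0)], lambda v: v >= 0.5)
print(len(beach))  # 12 cells (the c <= 2 columns)
```

Using a dict keyed by cell coordinates keeps the sketch simple; a production version on rasterized LiDAR would use an array with the same neighbor offsets.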

  13. Information Extraction From Chemical Patents

    Directory of Open Access Journals (Sweden)

    Sandra Bergmann

    2012-01-01

    Full Text Available The development of new chemicals or pharmaceuticals is preceded by an in-depth analysis of published patents in the field. This information retrieval is a costly and time-inefficient step when done by a human reader, yet it is mandatory for the potential success of an investment. The goal of the research project UIMA-HPC is to automate, and hence speed up, the process of knowledge mining from patents. Multi-threaded analysis engines, developed according to UIMA (Unstructured Information Management Architecture) standards, process texts and images in thousands of documents in parallel. UNICORE (UNiform Interface to COmputing Resources) workflow control structures make it possible to dynamically allocate resources for every given task to gain the best CPU-time/realtime ratios in an HPC environment.

  14. Quantitative Analysis of Tetramethylenedisulfotetramine ("Tetramine") Spiked into Beverages by Liquid Chromatography Tandem Mass Spectrometry with Validation by Gas Chromatography Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Owens, J; Hok, S; Alcaraz, A; Koester, C

    2008-11-13

    Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD50 = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water, cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.

  16. Quantitative analysis of phytosterols in edible oils using APCI liquid chromatography-tandem mass spectrometry

    Science.gov (United States)

    Mo, Shunyan; Dong, Linlin; Hurst, W. Jeffrey; van Breemen, Richard B.

    2014-01-01

    Previous methods for the quantitative analysis of phytosterols have usually used GC-MS and require elaborate sample preparation including chemical derivatization. Other common methods such as HPLC with absorbance detection do not provide information regarding the identity of the analytes. To address the need for an assay that utilizes mass selectivity while avoiding derivatization, a quantitative method based on LC-tandem mass spectrometry (LC-MS-MS) was developed and validated for the measurement of six abundant dietary phytosterols and structurally related triterpene alcohols including brassicasterol, campesterol, cycloartenol, β-sitosterol, stigmasterol, and lupeol in edible oils. Samples were saponified, extracted with hexane and then analyzed using reversed phase HPLC with positive ion atmospheric pressure chemical ionization tandem mass spectrometry and selected reaction monitoring. The utility of the LC-MS-MS method was demonstrated by analyzing 14 edible oils. All six compounds were present in at least some of the edible oils. The most abundant phytosterol in all samples was β-sitosterol, which was highest in corn oil at 4.35 ± 0.03 mg/g, followed by campesterol in canola oil at 1.84 ± 0.01 mg/g. The new LC-MS-MS method for the quantitative analysis of phytosterols provides a combination of speed, selectivity and sensitivity that exceed those of previous assays. PMID:23884629

  17. An innovative method for extracting isotopic information from low-resolution gamma spectra

    International Nuclear Information System (INIS)

    Miko, D.; Estep, R.J.; Rawool-Sullivan, M.W.

    1998-01-01

    A method is described for the extraction of isotopic information from attenuated gamma-ray spectra using the gross-count material basis set (GC-MBS) model. This method solves for the isotopic composition of an unknown mixture of isotopes attenuated through an absorber of unknown material. For binary isotopic combinations the problem is nonlinear in only one variable and is easily solved using standard line optimization techniques. Results are presented for NaI spectrum analyses of various binary combinations of enriched uranium, depleted uranium, low-burnup Pu, ¹³⁷Cs, and ¹³³Ba attenuated through a suite of absorbers ranging in Z from polyethylene through lead. The GC-MBS method results are compared to those computed using ordinary response function fitting and with a simple net peak area method. The GC-MBS method was found to be significantly more accurate than the other methods over the range of absorbers and isotopic blends studied
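
The one-variable line optimization mentioned above can be illustrated with a golden-section search. This is only a sketch of the general idea under the assumption that the binary problem reduces to finding the mixing fraction minimizing a spectral residual; the basis spectra here are synthetic Gaussians, not GC-MBS basis functions, and no attenuation model is included:

```python
import numpy as np

def fit_fraction(spec, basis_a, basis_b, tol=1e-6):
    """Find the fraction f in [0, 1] minimizing the squared residual of
    spec against f*basis_a + (1-f)*basis_b, by golden-section search
    (a standard line optimization for a one-variable unimodal cost)."""
    phi = (np.sqrt(5) - 1) / 2  # golden ratio conjugate, ~0.618
    def cost(f):
        return float(np.sum((spec - (f * basis_a + (1 - f) * basis_b)) ** 2))
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        c, d = hi - phi * (hi - lo), lo + phi * (hi - lo)
        if cost(c) < cost(d):
            hi = d
        else:
            lo = c
    return (lo + hi) / 2

# Synthetic "basis spectra" and a 30/70 blend of them.
x = np.linspace(0, 10, 200)
A = np.exp(-(x - 3) ** 2)
B = np.exp(-(x - 7) ** 2)
blend = 0.3 * A + 0.7 * B
print(round(fit_fraction(blend, A, B), 3))  # ≈ 0.3
```

Because the cost is quadratic in the single fraction, any bracketing one-dimensional minimizer converges quickly, which is what makes the binary case of the method computationally easy.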

  18. Extractive text summarization system to aid data extraction from full text in systematic review development.

    Science.gov (United States)

    Bui, Duy Duc An; Del Fiol, Guilherme; Hurdle, John F; Jonnalagadda, Siddhartha

    2016-12-01

    Extracting data from publication reports is a standard process in systematic review (SR) development. However, the data extraction process still relies too much on manual effort which is slow, costly, and subject to human error. In this study, we developed a text summarization system aimed at enhancing productivity and reducing errors in the traditional data extraction process. We developed a computer system that used machine learning and natural language processing approaches to automatically generate summaries of full-text scientific publications. The summaries at the sentence and fragment levels were evaluated in finding common clinical SR data elements such as sample size, group size, and PICO values. We compared the computer-generated summaries with human written summaries (title and abstract) in terms of the presence of necessary information for the data extraction as presented in the Cochrane review's study characteristics tables. At the sentence level, the computer-generated summaries covered more information than humans do for systematic reviews (recall 91.2% vs. 83.8%, p<0.001). They also had a better density of relevant sentences (precision 59% vs. 39%, p<0.001). At the fragment level, the ensemble approach combining rule-based, concept mapping, and dictionary-based methods performed better than individual methods alone, achieving an 84.7% F-measure. Computer-generated summaries are potential alternative information sources for data extraction in systematic review development. Machine learning and natural language processing are promising approaches to the development of such an extractive summarization system. Copyright © 2016 Elsevier Inc. All rights reserved.
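
    The recall, precision and F-measure figures quoted above follow the standard set-overlap definitions, which can be sketched as follows (the sentence IDs are invented for illustration):

```python
# Standard precision/recall/F-measure arithmetic for comparing a set of
# retrieved items (e.g. summary sentences) against a relevant reference set.

def precision_recall_f1(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    tp = len(retrieved & relevant)                       # true positives
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy example: 4 of 5 retrieved sentences are relevant; 8 relevant ones exist.
p, r, f = precision_recall_f1({1, 2, 3, 4, 5}, {2, 3, 4, 5, 6, 7, 8, 9})
```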

  19. EnvMine: A text-mining system for the automatic extraction of contextual information

    Directory of Open Access Journals (Sweden)

    de Lorenzo Victor

    2010-06-01

    Full Text Available Abstract Background For ecological studies, it is crucial to have adequate descriptions of the environments and samples being studied. Such a description must be done in terms of their physicochemical characteristics, allowing a direct comparison between different environments that would be difficult to make otherwise. The characterization must also include the precise geographical location, to make possible the study of geographical distributions and biogeographical patterns. Currently, there is no schema for annotating these environmental features, and these data have to be extracted from textual sources (published articles). So far, this had to be performed by manual inspection of the corresponding documents. To facilitate this task, we have developed EnvMine, a set of text-mining tools devoted to retrieving contextual information (physicochemical variables and geographical locations) from textual sources of any kind. Results EnvMine is capable of retrieving the physicochemical variables cited in the text by means of the accurate identification of their associated units of measurement. In this task, the system achieves a recall (percentage of items retrieved) of 92% with less than 1% error. A Bayesian classifier was also tested for distinguishing parts of the text describing environmental characteristics from others dealing with, for instance, experimental settings. Regarding the identification of geographical locations, the system takes advantage of existing databases such as GeoNames to achieve 86% recall with 92% precision. The identification of a location also includes the determination of its exact coordinates (latitude and longitude), thus allowing the calculation of distances between the individual locations. Conclusion EnvMine is a very efficient method for extracting contextual information from different text sources, like published articles or web pages. This tool can help in determining the precise location and physicochemical

  20. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science.

  1. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Drawing on huge resources, quantitative proteomics handles a colossal amount of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic methods of isotope labeling.

  2. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings

    Science.gov (United States)

    Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin

    2017-01-01

    Background Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further
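
    The CART analysis mentioned above repeatedly chooses the split that best separates patient risk groups. A minimal sketch of that core step, choosing a binary split threshold by minimising weighted Gini impurity, using invented risk scores and outcome labels rather than the national audit data:

```python
# Core computation inside a CART analysis: pick the threshold on one risk
# factor that minimises the weighted Gini impurity of the two resulting groups.

def gini(labels):
    """Binary Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Return (threshold, weighted impurity) of the best binary split."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[1]:
            best = (t, score)
    return best

scores = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]   # invented risk scores
outcome = [0, 0, 0, 1, 1, 1]              # invented adverse-event labels
threshold, impurity = best_split(scores, outcome)
```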

  3. An Overview of Advanced SILAC-Labeling Strategies for Quantitative Proteomics.

    Science.gov (United States)

    Terzi, F; Cambridge, S

    2017-01-01

    Comparative, quantitative mass spectrometry of proteins provides great insight to protein abundance and function, but some molecular characteristics related to protein dynamics are not so easily obtained. Because the metabolic incorporation of stable amino acid isotopes allows the extraction of distinct temporal and spatial aspects of protein dynamics, the SILAC methodology is uniquely suited to be adapted for advanced labeling strategies. New SILAC strategies have emerged that allow deeper foraging into the complexity of cellular proteomes. Here, we review a few advanced SILAC-labeling strategies that have been published during last the years. Among them, different subsaturating-labeling as well as dual-labeling schemes are most prominent for a range of analyses including those of neuronal proteomes, secretion, or cell-cell-induced stimulations. These recent developments suggest that much more information can be gained from proteomic analyses if the labeling strategies are specifically tailored toward the experimental design. © 2017 Elsevier Inc. All rights reserved.

  4. Mining of the social network extraction

    Science.gov (United States)

    Nasution, M. K. M.; Hardi, M.; Syah, R.

    2017-01-01

    The use of the Web as a social medium is steadily gaining ground in the study of social actor behaviour. However, the information the Web contains can only be interpreted to the extent the chosen method allows, as with superficial methods for extracting social networks. Each such method has its own features and drawbacks: it may not reveal the behaviour of social actors directly, yet it carries hidden information about them. This paper therefore aims to reveal such information through social network mining. Social behaviour can be expressed through a set of words extracted from the list of snippets.
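
    A superficial extraction method of the kind discussed can be illustrated, with invented snippets and actor names, as counting actor co-occurrences across snippets to form network edges:

```python
# Toy illustration of superficial social-network extraction: actors that
# co-occur in the same Web snippet get a weighted edge between them.

from itertools import combinations
from collections import Counter

snippets = [
    "A. Smith and B. Jones discuss semantic web research",
    "B. Jones presents joint work with C. Lee",
    "A. Smith and B. Jones release a survey",
]
actors = ["A. Smith", "B. Jones", "C. Lee"]

edges = Counter()
for s in snippets:
    present = [a for a in actors if a in s]       # actors named in the snippet
    for pair in combinations(sorted(present), 2):  # each co-occurring pair
        edges[pair] += 1
```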

  5. BBN based Quantitative Assessment of Software Design Specification

    International Nuclear Information System (INIS)

    Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol

    2007-01-01

    Probabilistic Safety Assessment (PSA), which is one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSA that includes safety-critical software usually either does not consider the reliability of the software or uses arbitrary values for it. To remedy this situation, this paper proposes a method that can produce quantitative reliability information on safety-critical software for PSA by making use of Bayesian Belief Networks (BBN). BBN has generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method was constructed by utilizing a BBN that can combine the qualitative and the quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software to be embedded in a reactor protection system. The intermediate verification and validation (V&V) results for the software design specification were used as inputs to the BBN model.
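
    The quantitative inference a BBN performs can be illustrated at its smallest scale with Bayes' rule over a two-state quality node; the prior and likelihood values below are invented for illustration, not taken from the paper:

```python
# Tiny, hypothetical sketch of BBN-style inference: combine a prior belief
# about software quality with the likelihood of an observed V&V result to
# obtain a posterior belief (Bayes' rule over discrete states).

def posterior(prior, likelihood, observation):
    """P(state | observation) over discrete states via Bayes' rule."""
    joint = {s: prior[s] * likelihood[s][observation] for s in prior}
    z = sum(joint.values())                      # normalising constant
    return {s: v / z for s, v in joint.items()}

prior = {"high_quality": 0.5, "low_quality": 0.5}   # assumed prior belief
likelihood = {                                       # assumed P(V&V result | quality)
    "high_quality": {"pass": 0.9, "fail": 0.1},
    "low_quality":  {"pass": 0.3, "fail": 0.7},
}
post = posterior(prior, likelihood, "pass")   # belief after a passing review
```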

  6. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes

    International Nuclear Information System (INIS)

    Mousavi, Fatemeh; Pawliszyn, Janusz

    2013-01-01

    Graphical abstract: -- Highlights: •Silica-based 1-vinyl-3-octadecylimidazolium bromide ionic liquid was synthesized and characterized. •The synthesized polymer was immobilized on the stainless steel blade using polyacrylonitrile glue. •SiImC18-PAN 96-blade SPME was applied as an extraction phase for extraction of highly polar compounds in grape matrix. •This system provides high extraction efficiency and reproducibility for up to 50 extractions from tartaric buffer and 20 extractions from grape pulp. -- Abstract: 1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), 13C NMR and 29Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for extraction of aminoacids from grape pulp, and an LC–MS–MS method was developed for separation of model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole process of sample preparation for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5–13 and 3–10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC–MS/MS system for analysis of analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L−1, respectively. Standard addition calibration was applied for quantitative

  7. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes

    Energy Technology Data Exchange (ETDEWEB)

    Mousavi, Fatemeh; Pawliszyn, Janusz, E-mail: janusz@uwaterloo.ca

    2013-11-25

    Graphical abstract: -- Highlights: •Silica-based 1-vinyl-3-octadecylimidazolium bromide ionic liquid was synthesized and characterized. •The synthesized polymer was immobilized on the stainless steel blade using polyacrylonitrile glue. •SiImC18-PAN 96-blade SPME was applied as an extraction phase for extraction of highly polar compounds in grape matrix. •This system provides high extraction efficiency and reproducibility for up to 50 extractions from tartaric buffer and 20 extractions from grape pulp. -- Abstract: 1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), 13C NMR and 29Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The new proposed extraction phase was applied for extraction of aminoacids from grape pulp, and an LC–MS–MS method was developed for separation of model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole process of sample preparation for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5–13 and 3–10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC–MS/MS system for analysis of analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L−1, respectively. Standard addition calibration was

  8. Developing a Process Model for the Forensic Extraction of Information from Desktop Search Applications

    Directory of Open Access Journals (Sweden)

    Timothy Pavlic

    2008-03-01

    Full Text Available Desktop search applications can contain cached copies of files that were deleted from the file system. Forensic investigators see this as a potential source of evidence, as documents deleted by suspects may still exist in the cache. Whilst there have been attempts at recovering data collected by desktop search applications, there is no methodology governing the process, nor discussion on the most appropriate means to do so. This article seeks to address this issue by developing a process model that can be applied when developing an information extraction application for desktop search applications, discussing preferred methods and the limitations of each. This work represents a more structured approach than other forms of current research.

  9. SIMPLE METHOD FOR THE EXTRACTION OF PHOTOPIGMENTS AND MYCOSPORINE-LIKE AMINO ACIDS (MAAS) FROM SYMBIODINIUM SPP.

    Science.gov (United States)

    Numerous extraction methods have been developed and used in the quantitation of both photopigments and mycosporine-like amino acids (MAAs) found in Symbiodinium sp. and zooxanthellate metazoans. We have developed a simple, mild extraction procedure using methanol, which when coupl...

  10. Smart Extraction and Analysis System for Clinical Research.

    Science.gov (United States)

    Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung

    2017-05-01

    With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.

  11. Sequence complexity and work extraction

    International Nuclear Information System (INIS)

    Merhav, Neri

    2015-01-01

    We consider a simplified version of a solvable model by Mandal and Jarzynski, which constructively demonstrates the interplay between work extraction and the increase of the Shannon entropy of an information reservoir which is in contact with a physical system. We extend Mandal and Jarzynski’s main findings in several directions: first, we allow sequences of correlated bits rather than just independent bits. Secondly, at least for the case of binary information, we show that, in fact, the Shannon entropy is only one measure of complexity of the information that must increase in order for work to be extracted. The extracted work can also be upper bounded in terms of the increase in other quantities that measure complexity, like the predictability of future bits from past ones. Third, we provide an extension to the case of non-binary information (i.e. a larger alphabet), and finally, we extend the scope to the case where the incoming bits (before the interaction) form an individual sequence, rather than a random one. In this case, the entropy before the interaction can be replaced by the Lempel–Ziv (LZ) complexity of the incoming sequence, a fact that gives rise to an entropic meaning of the LZ complexity, not only in information theory, but also in physics. (paper)
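
    The LZ complexity invoked above comes from the LZ78 incremental parsing of an individual sequence. A sketch, in which the phrase count serves as the complexity measure (the test strings are arbitrary):

```python
# Lempel–Ziv (LZ78) incremental parsing as a complexity measure: the number
# of distinct phrases grows slowly for predictable sequences and quickly for
# complex ones.

def lz78_phrase_count(sequence):
    """Count phrases in the LZ78 incremental parsing of a string."""
    phrases = set()
    current = ""
    for symbol in sequence:
        current += symbol
        if current not in phrases:   # novel phrase: record it and restart
            phrases.add(current)
            current = ""
    # An unfinished trailing phrase still contributes to the parse.
    return len(phrases) + (1 if current else 0)

low = lz78_phrase_count("0" * 32)                  # highly predictable
high = lz78_phrase_count("0110100110010110" * 2)   # less predictable pattern
```

A constant sequence of 32 bits parses into far fewer phrases than a patterned one of the same length, matching the intuition that less work can be extracted from an already-complex information reservoir.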

  12. A quantitative method for determination of aflatoxin B in roasted corn.

    Science.gov (United States)

    Shannon, G M; Shotwell, O L

    1975-07-01

    Roasting aflatoxin-contaminated corn will reduce toxin levels. A quantitative analysis for aflatoxin in roasted corn has been developed by modifying a cleanup technique for green coffee extracts approved as official first action by the AOAC. A chloroform extract is partially purified on a Florisil column, and thin layer chromatographic (TLC) plates are developed with methylene chloride-chloroform-isoamyl alcohol-formic acid (81+15+3+1). Recoveries average 101% and the sensitivity limit is 5 ppb aflatoxin B1. A 2-dimensional TLC procedure can also be used to separate the aflatoxins from background interferences.

  13. Fixed kernel regression for voltammogram feature extraction

    International Nuclear Information System (INIS)

    Acevedo Rodriguez, F J; López-Sastre, R J; Gil-Jiménez, P; Maldonado Bascón, S; Ruiz-Reyes, N

    2009-01-01

    Cyclic voltammetry is an electroanalytical technique for obtaining information about substances under analysis without the need for complex flow systems. However, classifying the information in voltammograms obtained using this technique is difficult. In this paper, we propose the use of fixed kernel regression as a method for extracting features from these voltammograms, reducing the information to a few coefficients. The proposed approach has been applied to a wine classification problem with accuracy rates of over 98%. Although the method is described here for extracting voltammogram information, it can be used for other types of signals
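
    The idea of reducing a voltammogram to a few coefficients can be sketched as least-squares fitting of kernels at fixed positions; the Gaussian kernel choice, centres, width, and toy curve below are illustrative assumptions, not the authors' actual settings:

```python
# Sketch of fixed kernel regression as a feature extractor: approximate a
# sampled curve by a small set of Gaussian kernels at fixed centres and keep
# the fitted coefficients as features.

import math

def gaussian(x, centre, width):
    return math.exp(-((x - centre) ** 2) / (2 * width ** 2))

def solve(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            M[r] = [a - factor * c for a, c in zip(M[r], M[col])]
    w = [0.0] * n
    for i in reversed(range(n)):
        w[i] = (M[i][n] - sum(M[i][j] * w[j] for j in range(i + 1, n))) / M[i][i]
    return w

def kernel_features(xs, ys, centres, width):
    """Least-squares coefficients of fixed Gaussian kernels (normal equations)."""
    Phi = [[gaussian(x, c, width) for c in centres] for x in xs]
    k = len(centres)
    A = [[sum(Phi[i][p] * Phi[i][q] for i in range(len(xs))) for q in range(k)]
         for p in range(k)]
    b = [sum(Phi[i][p] * ys[i] for i in range(len(xs))) for p in range(k)]
    return solve(A, b)

# Toy "voltammogram" built from the same kernels, so the three coefficients
# fully summarise the 101-point curve.
centres, width = [0.2, 0.5, 0.8], 0.1
xs = [i / 100 for i in range(101)]
ys = [2.0 * gaussian(x, 0.2, 0.1) + 0.5 * gaussian(x, 0.5, 0.1)
      + 1.0 * gaussian(x, 0.8, 0.1) for x in xs]
coeffs = kernel_features(xs, ys, centres, width)
```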

  14. Cranberry and Grape Seed Extracts Inhibit the Proliferative Phenotype of Oral Squamous Cell Carcinomas

    Directory of Open Access Journals (Sweden)

    Kourt Chatelain

    2011-01-01

    Full Text Available Proanthocyanidins, compounds highly concentrated in dietary fruits, such as cranberries and grapes, demonstrate significant cancer prevention potential against many types of cancer. The objective of this study was to evaluate cranberry and grape seed extracts to quantitate and compare their anti-proliferative effects on the most common type of oral cancer, oral squamous cell carcinoma. Using two well-characterized oral squamous cell carcinoma cell lines, CAL27 and SCC25, assays were performed to evaluate the effects of cranberry and grape seed extract on phenotypic behaviors of these oral cancers. The proliferation of both oral cancer cell lines was significantly inhibited by the administration of cranberry and grape seed extracts, in a dose-dependent manner. In addition, key regulators of apoptosis, caspase-2 and caspase-8, were concomitantly up-regulated by these treatments. However, cranberry and grape seed extracts elicited differential effects on cell adhesion, cell morphology, and cell cycle regulatory pathways. This study represents one of the first comparative investigations of cranberry and grape seed extracts and their anti-proliferative effects on oral cancers. Previous findings using purified proanthocyanidin from grape seed extract demonstrated more prominent growth inhibition, as well as apoptosis-inducing, properties on CAL27 cells. These observations provide evidence that cranberry and grape seed extracts not only inhibit oral cancer proliferation but also that the mechanism of this inhibition may function by triggering key apoptotic regulators in these cell lines. This information will be of benefit to researchers interested in elucidating which dietary components are central to mechanisms involved in the mediation of oral carcinogenesis and progression.

  15. Extraction of indirectly captured information for use in a comparison of offline pH measurement technologies.

    Science.gov (United States)

    Ritchie, Elspeth K; Martin, Elaine B; Racher, Andy; Jaques, Colin

    2017-06-10

    Understanding the causes of discrepancies in pH readings of a sample can allow more robust pH control strategies to be implemented. It was found that 59.4% of differences between two offline pH measurement technologies for an historical dataset lay outside an expected instrument error range of ±0.02 pH. A new variable, Osmo_Res, was created using multiple linear regression (MLR) to extract information indirectly captured in the recorded measurements for osmolality. Principal component analysis and time series analysis were used to validate the expansion of the historical dataset with the new variable Osmo_Res. MLR was used to identify variables strongly correlated (p<0.05) with differences in pH readings by the two offline pH measurement technologies. These included concentrations of specific chemicals (e.g. glucose) and Osmo_Res, indicating culture medium and bolus feed additions as possible causes of discrepancies between the offline pH measurement technologies. Temperature was also identified as statistically significant. It is suggested that this was a result of differences in pH-temperature compensations employed by the pH measurement technologies. In summary, a method for extracting indirectly captured information has been demonstrated, and it has been shown that competing pH measurement technologies were not necessarily interchangeable at the desired level of control (±0.02 pH). Copyright © 2017 Elsevier B.V. All rights reserved.
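
    The construction of a residual variable such as Osmo_Res can be illustrated with a one-regressor simplification of MLR: regress the recorded osmolality on a measurement known to drive it and keep the residual as the "indirectly captured" part. The glucose and osmolality numbers below are invented:

```python
# One-regressor sketch of deriving a residual variable: fit osmolality on
# glucose by ordinary least squares, then take residuals as the part of the
# osmolality signal not explained by the regressor.

def fit_ols(xs, ys):
    """Closed-form OLS for y ≈ b0 + b1*x (single regressor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

glucose = [1.0, 2.0, 3.0, 4.0, 5.0]           # invented concentrations
osmolality = [10.2, 12.1, 13.9, 16.2, 17.8]   # invented readings

b0, b1 = fit_ols(glucose, osmolality)
# Residuals: the information in osmolality not captured by glucose.
osmo_res = [o - (b0 + b1 * g) for g, o in zip(glucose, osmolality)]
```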

  16. Extraction and analysis of reducing alteration information of oil-gas in Bashibulake uranium ore district based on ASTER remote sensing data

    International Nuclear Information System (INIS)

    Ye Fawang; Liu Dechang; Zhao Yingjun; Yang Xu

    2008-01-01

    Beginning with an analysis of the spectral characteristics of sandstone bearing oil-gas reducing alteration in the Bashibulake ore district, a technique for extracting reducing alteration information from ASTER data is presented. Several remote sensing anomaly zones of reducing alteration information, similar to those in the uranium deposit, are interpreted in the study area. On the basis of the above study, this alteration anomaly information is further classified by exploiting the multi-band advantage of ASTER data in the SWIR region, and the geological significance of each class of alteration anomaly is discussed. As a result, alteration anomalies favourable for uranium prospecting are selected, providing important information for uranium exploration in the periphery of the Bashibulake uranium ore area. (authors)

  17. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  18. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  19. An MRM-based workflow for absolute quantitation of lysine-acetylated metabolic enzymes in mouse liver.

    Science.gov (United States)

    Xu, Leilei; Wang, Fang; Xu, Ying; Wang, Yi; Zhang, Cuiping; Qin, Xue; Yu, Hongxiu; Yang, Pengyuan

    2015-12-07

    As a key post-translational modification mechanism, protein acetylation plays critical roles in regulating and/or coordinating cell metabolism. Acetylation is a prevalent modification process in enzymes. Protein acetylation modification occurs in sub-stoichiometric amounts; therefore extracting biologically meaningful information from these acetylation sites requires an adaptable, sensitive, specific, and robust method for their quantification. In this work, we combine immunoassays and multiple reaction monitoring-mass spectrometry (MRM-MS) technology to develop an absolute quantification for acetylation modification. With this hybrid method, we quantified the acetylation level of metabolic enzymes, which could demonstrate the regulatory mechanisms of the studied enzymes. The development of this quantitative workflow is a pivotal step for advancing our knowledge and understanding of the regulatory effects of protein acetylation in physiology and pathophysiology.
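
    A generic sketch of the calibration arithmetic behind absolute quantitation workflows such as MRM: fit a line to standards of known amount, then invert it for the unknown sample. All numbers below are invented for illustration:

```python
# Calibration-curve quantitation: linear fit of observed response vs. known
# amount for a set of standards, then inversion for an unknown sample.

def linear_fit(xs, ys):
    """Return (intercept, slope) of the least-squares line y ≈ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

standards = [0.5, 1.0, 2.0, 4.0]       # known amounts of a spiked standard
responses = [0.11, 0.20, 0.41, 0.80]   # observed signal ratios (invented)

intercept, slope = linear_fit(standards, responses)

# Invert the calibration line for an unknown sample's response.
unknown_response = 0.50
amount = (unknown_response - intercept) / slope
```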

  20. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    CERN Document Server

    Vekemans, B; Somogyi, A; Drakopoulos, M; Kempenaers, L; Simionovici, A; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative u...
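
    The first step of the two-step quantification above can be sketched as a non-linear least-squares deconvolution. The example below is a hedged, synthetic stand-in for an AXIL-style fit: two Gaussian lines on a linear background, with all peak positions and intensities invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch of step one of the quantification: non-linear
# least-squares deconvolution of an XRF spectrum into characteristic
# line intensities. Two Gaussian lines on a linear background stand in
# for a full AXIL-style model; all numbers are synthetic.

def spectrum(e, a1, mu1, a2, mu2, s, b0, b1):
    """Two Gaussian lines with a shared width plus a linear background."""
    g1 = a1 * np.exp(-0.5 * ((e - mu1) / s) ** 2)
    g2 = a2 * np.exp(-0.5 * ((e - mu2) / s) ** 2)
    return g1 + g2 + b0 + b1 * e

energy = np.linspace(5.0, 9.0, 400)            # keV axis
rng = np.random.default_rng(0)
true = (900, 6.40, 400, 7.06, 0.12, 50, 2)     # Fe Ka/Kb-like lines
counts = spectrum(energy, *true) + rng.normal(0, 5, energy.size)

popt, _ = curve_fit(spectrum, energy, counts,
                    p0=(800, 6.3, 300, 7.0, 0.15, 40, 0))
# Net line intensity = peak area = amplitude * sigma * sqrt(2*pi);
# these areas are what a Monte Carlo step would convert to concentrations.
area1 = popt[0] * popt[4] * np.sqrt(2 * np.pi)
```

    In the workflow described above, the extracted line areas (not raw channel counts) are the input to the Monte Carlo simulation that accounts for absorption and enhancement effects.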

  1. Information Extraction of Tourist Geological Resources Based on 3D Visualization Remote Sensing Images

    Science.gov (United States)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for scenic appreciation, scientific research and public education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional remote sensing interpretation, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing images to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images with a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that remote sensing interpretation using this method is highly recognizable, making the interpretation more accurate and comprehensive.

  2. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three-dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high-resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron- and laboratory-based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two-dimensional information.

  3. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    Science.gov (United States)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  4. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  5. Study of total phenol, flavonoids contents and phytochemical screening of various leaves crude extracts of locally grown Thymus vulgaris.

    Science.gov (United States)

    Hossain, Mohammad Amzad; AL-Raqmi, Khulood Ahmed Salim; AL-Mijizy, Zawan Hamood; Weli, Afaf Mohammed; Al-Riyami, Qasim

    2013-09-01

    To prepare various crude extracts using solvents of different polarities and to quantitatively evaluate their total phenol and flavonoid contents, together with phytochemical screening, for Thymus vulgaris collected from Al Jabal Al Akhdar, Nizwa, Sultanate of Oman. The leaf sample was extracted with methanol and evaporated. It was then defatted with water and extracted with organic solvents of increasing polarity. The prepared hexane, chloroform, ethyl acetate, butanol and methanol crude extracts were used for the evaluation of total phenol and flavonoid contents and for the phytochemical screening study. Established conventional methods were used for the quantitative determination of total phenol and flavonoid contents and for phytochemical screening. Phytochemical screening of the various crude extracts gave positive results for flavonoids, saponins and steroid compounds. Total phenol content was highest in the butanol and lowest in the methanol crude extract, whereas total flavonoid content was highest in the methanol and lowest in the hexane crude extract. The crude extracts from locally grown Thymus vulgaris showed a high concentration of flavonoids and could be used as antibiotics for various curable and currently incurable diseases.

  6. Addressing Information Proliferation: Applications of Information Extraction and Text Mining

    Science.gov (United States)

    Li, Jingjing

    2013-01-01

    The advent of the Internet and the ever-increasing capacity of storage media have made it easy to store, deliver, and share enormous volumes of data, leading to a proliferation of information on the Web, in online libraries, on news wires, and almost everywhere in our daily lives. Since our ability to process and absorb this information remains…

  7. Effects of ultrahigh pressure extraction on yield and antioxidant activity of chlorogenic acid and cynaroside extracted from flower buds of Lonicera japonica.

    Science.gov (United States)

    Hu, Wen; Guo, Ting; Jiang, Wen-Jun; Dong, Guang-Li; Chen, Da-Wei; Yang, Shi-Lin; Li, He-Ran

    2015-06-01

    The present study was designed to establish and optimize a new method for extracting chlorogenic acid and cynaroside from Lonicera japonica Thunb. through an orthogonal experimental design. A new ultrahigh pressure extraction (UPE) technology was applied to extract chlorogenic acid and cynaroside from L. japonica. The influential factors, including solvent type, ethanol concentration, extraction pressure, time, and temperature, and the solid/liquid ratio, were studied to optimize the extraction process. The optimal conditions for the UPE were developed by quantitative analysis of the extraction products by HPLC-DAD in comparison with standard samples. In addition, the microstructures of the medicinal materials before and after extraction were studied by scanning electron microscopy (SEM). Furthermore, the extraction efficiency of different extraction methods and the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activities of the extracts were investigated. The optimal conditions for extracting chlorogenic acid and cynaroside were as follows: ethanol concentration, 60%; extraction pressure, 400 MPa; extraction time, 2 min; extraction temperature, 30 °C; and solid/liquid ratio, 1:50. Under these conditions, the yields of chlorogenic acid and cynaroside were raised to 4.863% and 0.080%, respectively. Compared with other extraction methods, such as heat reflux extraction (HRE), ultrasonic extraction (UE), and Soxhlet extraction (SE), the UPE method showed several advantages, including higher extraction yield, shorter extraction time, lower energy consumption, and higher purity of the extracts. This study could help better utilize L. japonica flower buds as a readily accessible source of natural antioxidants in the food and pharmaceutical industries. Copyright © 2015 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.

  8. Effects of Leaf Extracts of Selected Plants on Quality of Stored Citrus sinensis (Sweet Orange) Juice

    Directory of Open Access Journals (Sweden)

    Oluwagbenga O. ADEOGUN

    2017-06-01

    Full Text Available Reduction in the quality of fruits during storage has been a concern to consumers, and its effect can be felt on the economies of developing countries. Leaves of plants such as Canna indica, Megaphrynium macrostachyum and Thaumatococcus daniellii have been documented as food packaging materials in West Africa. On this basis, the quality of stored sweet orange juice was investigated using ethanolic extracts of leaves of C. indica, M. macrostachyum and T. daniellii to enhance the shelf life of the juice. The extracts were used to assess the quality of the juice over 30 days using quantitative parameters such as total soluble solids, browning potential, pH, microbial analysis and turbidity at 4 °C and at room temperature (27-31 °C). The qualitative and quantitative phytochemical constituents of the extracts were determined, and the extracts' toxicity was determined using brine shrimp. The quality assessment revealed that the freshly squeezed orange juice with the extracts possesses tolerable activity to enhance the shelf life of orange juice. The leaf extract of M. macrostachyum had the highest preservation rate on the juice after 30 days. The qualitative phytochemical screening revealed the presence of alkaloids, tannins, saponins, flavonoids, steroids and terpenoids in the three plants tested. The quantitative phytochemical analysis of the most active extracts of the three plants revealed that M. macrostachyum had the highest contents of alkaloids (107.48 mg/g) and flavonoids (56.92 mg/g). The study showed that the extracts were non-lethal to brine shrimp. This study ascertained the potential preservative qualities of the test plants for enhancing the shelf life of orange juice.

  9. Investigation of quantitative separation of thorium, uranium, neptunium and plutonium from complex radiochemical mixtures

    International Nuclear Information System (INIS)

    Ushatskij, V.N.; Preobrazhenskaya, L.D.; Kolychev, V.B.; Gugel', E.S.

    1979-01-01

    Quantitative separation of actinides and their radiochemical purification with the aid of TBP, with subsequent separation of thorium, and quantitative separation of U, Np and Pu with the aid of D2EHPA have been studied. A method has been developed for the quantitative extraction-chromatographic separation and radiochemical purification of nanogram amounts of U and Pu and microgram amounts of Th and Np from complex radiochemical mixtures containing both fission-fragment radioisotopes and non-radioactive macrocomponents (Fe, Al, Mg, Mn, Na and others). The method calls for the application of one extraction-chromatographic column with TBP and one column with D2EHPA. Thorium is separated at the first stage, since it does not form complexes in a chloride solution during washing of the sorption column with 6.0 M HCl. The Np(IV) and Pu(III) states required for separation are stabilized with the aid of a hydrazine and hydroxylamine mixture. The yield of each of the above-cited actinide elements over the complete two-stage separation, including the stage of their mutual separation, varies within the range of 98.5-99.3%.

  10. Quantum measurement information as a key to energy extraction from local vacuums

    International Nuclear Information System (INIS)

    Hotta, Masahiro

    2008-01-01

    In this paper, a protocol is proposed in which energy extraction from local vacuum states is possible by using quantum measurement information for the vacuum state of quantum fields. In the protocol, Alice, who stays at a spatial point, excites the ground state of the fields by a local measurement. Consequently, wave packets generated by Alice's measurement propagate through the vacuum to spatial infinity. Let us assume that Bob stays away from Alice and fails to catch the excitation energy when the wave packets pass in front of him. Next, Alice announces her local measurement result to Bob by classical communication. Bob performs a local unitary operation depending on the measurement result. In this process, positive energy is released from the fields to Bob's apparatus performing the unitary operation. In the field systems, wave packets with negative energy are generated around Bob's location. Soon afterwards, the negative-energy wave packets begin to chase after the positive-energy wave packets generated by Alice and form loosely bound states.

  11. CRAFT (complete reduction to amplitude frequency table)--robust and time-efficient Bayesian approach for quantitative mixture analysis by NMR.

    Science.gov (United States)

    Krishnamurthy, Krish

    2013-12-01

    The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion, thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled into several sub-FIDs; second, these sub-FIDs are modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of the modulation of chemical quantity in a biological study (metabolomics), a process study (reaction monitoring), or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
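
    The core "sum of decaying sinusoids" model can be sketched as follows. This is a minimal illustration, not the CRAFT software: a single component is fitted by ordinary least squares, standing in for the Bayesian multi-component analysis in the paper, and all signal parameters are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch of the CRAFT-style time-domain model: an FID segment
# is described as a decaying sinusoid, and the fitted (amplitude,
# frequency) pair becomes one row of an amplitude-frequency table.

def fid_model(t, amp, freq, r2, phase):
    """One decaying sinusoid: amp * exp(-r2*t) * cos(2*pi*freq*t + phase)."""
    return amp * np.exp(-r2 * t) * np.cos(2 * np.pi * freq * t + phase)

dt = 1e-3                                  # 1 kHz sampling, assumed
t = np.arange(0, 1.0, dt)
rng = np.random.default_rng(1)
fid = fid_model(t, 10.0, 50.0, 3.0, 0.2) + rng.normal(0, 0.1, t.size)

popt, _ = curve_fit(fid_model, t, fid, p0=(5.0, 50.2, 1.0, 0.0))
amp, freq = popt[0], popt[1]
# (amp, freq) is one row of a CRAFT-style frequency-amplitude table
```

    In the actual workflow, the FID is first filtered and downsampled into sub-FIDs so that each narrow band contains only a few such components to model.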

  12. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest land vegetation type and plays an irreplaceable role with its unique value. At the landscape scale, research on forest landscape pattern has become a current hot spot, within which the study of forest canopy structure is especially important: canopy structure determines the process and strength of forest energy flow, which in turn influences, to some extent, the adjustment of the ecosystem to climate and species diversity. The extraction of the factors influencing canopy structure and the analysis of the vegetation distribution pattern are therefore especially important. To address these problems, remote sensing technology, which is superior to other technical means because of its timeliness and large-scale monitoring capability, is applied to the study. Taking Lingkong Mountain as the study area, the paper uses remote sensing images to analyze the forest distribution pattern and obtain the spatial characteristics of canopy structure distribution, with DEM data as the basic data for extracting the factors that influence canopy structure. The pattern of tree distribution is further analyzed using terrain parameters, spatial analysis tools and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and corresponding algorithms are applied to determine surface water flow paths, the river network and basin boundaries. Results show that the distribution of dominant tree species presents a patchy pattern at the landscape scale, with spatial heterogeneity closely related to terrain factors. After overlay analysis of aspect, slope and the forest distribution pattern, the areas most suitable for stand growth and the best growing conditions are obtained.

  13. Quantitative estimation of diacetylmorphine by preparative TLC and UV spectroscopy

    International Nuclear Information System (INIS)

    Khan, L.; Siddiqui, M.T.; Ahmad, N.; Shafi, N.

    2001-01-01

    A simple and efficient method for the quantitative estimation of diacetylmorphine in narcotic products is described. Comparative TLC of narcotic specimens with standards showed the presence of morphine, monoacetylmorphine, diacetylmorphine, papaverine and noscapine. Resolution of the mixtures was achieved by preparative TLC. Bands corresponding to diacetylmorphine were scraped and eluted, the UV absorption of the extracts was measured, and the contents were quantified. (author)
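
    The UV quantitation step rests on the Beer-Lambert law, A = ε·l·c: measuring a standard of known concentration in the same cell lets the unknown be read off by proportionality. The numbers below are illustrative assumptions, not values from the paper.

```python
# Beer-Lambert quantitation sketch: in the linear range,
# c_sample = c_std * A_sample / A_std when both are measured in the
# same cell. All values here are assumed for illustration.

eps_l = 1.2e3          # assumed effective epsilon*l, L/mol
c_std = 2.0e-4         # mol/L diacetylmorphine standard (assumed)
a_std = eps_l * c_std  # absorbance of the standard (= 0.24)
a_sample = 0.312       # measured absorbance of the eluted band (assumed)

c_sample = c_std * a_sample / a_std   # proportionality in the linear range
```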

  14. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities.

    Science.gov (United States)

    Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E

    2012-11-20

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources.

  15. Quantitative Radiomics System Decoding the Tumor Phenotype | Informatics Technology for Cancer Research (ITCR)

    Science.gov (United States)

    Our goal is to construct a publicly available computational radiomics system for the objective and automated extraction of quantitative imaging features that we believe will yield biomarkers of greater prognostic value compared with routinely extracted descriptors of tumor size. We will create a generalized, open, portable, and extensible radiomics platform that is widely applicable across cancer types and imaging modalities and describe how we will use lung and head and neck cancers as models to validate our developments.

  16. Extraction or adsorption? Voltammetric assessment of protamine transfer at ionophore-based polymeric membranes.

    Science.gov (United States)

    Garada, Mohammed B; Kabagambe, Benjamin; Amemiya, Shigeru

    2015-01-01

    Cation-exchange extraction of the polypeptide protamine from water into an ionophore-based polymeric membrane has been hypothesized as the origin of a potentiometric sensor response to this important heparin antidote. Here, we apply ion-transfer voltammetry not only to confirm protamine extraction into ionophore-doped polymeric membranes but also to reveal protamine adsorption at the membrane/water interface. Protamine adsorption is thermodynamically more favorable than protamine extraction, as shown by cyclic voltammetry at plasticized poly(vinyl chloride) membranes containing dinonylnaphthalenesulfonate as a protamine-selective ionophore. Reversible adsorption of protamine at concentrations down to 0.038 μg/mL is demonstrated by stripping voltammetry. Adsorptive preconcentration of protamine at the membrane/water interface is quantitatively modeled using the Frumkin adsorption isotherm. We apply this model to ensure that stripping voltammograms are based on desorption of all protamine molecules that are transferred across the interface during a preconcentration step. In comparison to adsorption, voltammetric extraction of protamine requires ∼0.2 V more negative potentials, where a potentiometric super-Nernstian response to protamine is also observed. This agreement confirms that the potentiometric protamine response is based on protamine extraction. The voltammetrically reversible protamine extraction results in an apparently irreversible potentiometric response to protamine because back-extraction of protamine from the membrane slows dramatically at the mixed potential based on cation-exchange extraction of protamine. Significantly, this study demonstrates the advantages of ion-transfer voltammetry over potentiometry to quantitatively and mechanistically assess protamine transfer at ionophore-based polymeric membranes as a foundation for reversible, selective, and sensitive detection of protamine.
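
    The Frumkin-isotherm modeling mentioned above can be sketched numerically. This is a hedged illustration of the isotherm itself; the adsorption coefficient beta and interaction parameter g below are invented, not the paper's fitted values.

```python
import numpy as np
from scipy.optimize import brentq

# Frumkin adsorption isotherm: beta*c = theta/(1 - theta) * exp(g*theta),
# solved numerically for the fractional surface coverage theta.
# beta and g are illustrative assumptions.

def coverage(c, beta=1e5, g=-1.0):
    """Solve the Frumkin isotherm for coverage theta in (0, 1)."""
    f = lambda th: beta * c - th / (1 - th) * np.exp(g * th)
    return brentq(f, 1e-12, 1 - 1e-12)

# Coverage rises toward saturation as the bulk concentration increases
thetas = [coverage(c) for c in (1e-7, 1e-6, 1e-5)]
```

    In the study, fitting this isotherm to the preconcentration data is what guarantees that stripping peaks account for all transferred protamine.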

  17. Liquid-liquid extraction of uranium from nitric acid solution using di-n-butylsulfoxide in petroleum ether as extractant

    Energy Technology Data Exchange (ETDEWEB)

    Khan, M.H.; Shahida, S. [Dept. of Chemistry, Univ. of Azad Jammu and Kashmir, Muzaffarabad (Pakistan); Ali, A. [Nuclear Chemistry Div., Pakistan Inst. of Nuclear Science and Technology, Nilore, Islamabad (Pakistan)

    2008-07-01

    A simple, efficient and economical liquid-liquid extraction method has been developed for quantitative extraction of uranium from 2 M HNO3 using di-n-butyl sulfoxide in petroleum ether. The dependence of the partition reaction of U(VI) on the concentration of HNO3, the extractant concentration and temperature was studied. The reaction was found to be inversely dependent upon the temperature, and the values of the related thermodynamic functions (ΔH, ΔS, ΔG) for the extraction equilibrium were determined to be -33.6 kJ/mol, -1.29 kJ/mol/degree and -0.11 kJ/mol/degree, respectively. The effect of Al(NO3)3 as salting-out agent and of diverse ions on the extraction was examined. The salting-out agent slightly enhanced the extraction. All cations studied showed a negligible effect on the extraction, whereas phosphate and fluoride interfered seriously. Among others, oxalate, citrate and sulphide ions affect the extraction to a lesser extent. Uranium was successfully extracted from a synthetic mixture of Ti(IV), Zr(IV), Hf(IV) and Th(IV) using EDTA as masking agent. Among strippants, deionized water was found most suitable, and the recovery of uranium was ≥ 96%. The stoichiometric composition of the extracted species was found to be UO2(NO3)2·2DBSO. The extraction mechanism is discussed on the basis of the results obtained. The extractant has a high loading as well as recycling capacity without any degradation. The method was also applied to the Standard Reference Material (NBL-49) and the measured value was found to agree with the reported value within ±2% deviation. (orig.)
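
    The inverse temperature dependence reported above follows directly from the van't Hoff relation for an exothermic extraction. The sketch below checks this qualitatively; only ΔH is taken from the abstract, while K1, ΔS and the temperatures are assumed values for illustration.

```python
import math

# van't Hoff check: for an exothermic extraction (dH < 0), the
# distribution ratio K must fall as temperature rises, and
# dG = dH - T*dS. Only dH comes from the abstract.

R = 8.314                  # J/mol/K
dH = -33.6e3               # J/mol, from the abstract
T1, T2 = 298.0, 318.0      # K, assumed
K1 = 12.0                  # assumed distribution ratio at T1

# ln(K2/K1) = -(dH/R) * (1/T2 - 1/T1)
K2 = K1 * math.exp(-(dH / R) * (1 / T2 - 1 / T1))

dS = -100.0                # J/mol/K, assumed
dG = dH - T1 * dS          # Gibbs energy at T1, J/mol
```

    With these assumed values K2 < K1, consistent with the abstract's observation that extraction decreases as temperature increases.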

  18. DEXTER: Disease-Expression Relation Extraction from Text.

    Science.gov (United States)

    Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K

    2018-01-01

    Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research, and for diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information, obtained mostly from large-scale studies such as microarray and next-generation sequencing technologies, in the context of disease. The scientific literature is another rich source of information on gene expression-disease relationships, capturing not only results from large-scale studies but also observations from thousands of small-scale studies. Expression information obtained from the literature through manual curation can extend expression databases. While many of the existing databases include information from the literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER), to extract information from the literature on gene and microRNA expression in the context of disease. One of the motivations for developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags significantly behind the expression information obtained from large-scale studies and can benefit from our text-mined results. We conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51% and 81.81% for the two evaluations, respectively. Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract information on differential expression for 2024 genes in lung

  19. About increasing informativity of diagnostic system of asynchronous electric motor by extracting additional information from values of consumed current parameter

    Science.gov (United States)

    Zhukovskiy, Y.; Korolev, N.; Koteleva, N.

    2018-05-01

    This article is devoted to expanding the possibilities of assessing the technical state of asynchronous electric drives from their current consumption, and to increasing the information capacity of diagnostic methods under conditions of limited access to equipment and incomplete information. Spectral analysis of the drive current can be supplemented by an analysis of the components of the current Park's vector. The evolution of the hodograph at the moment a defect appears and during its development was investigated using the example of current asymmetry in the phases of an induction motor. The result of the study is a set of new diagnostic parameters for the asynchronous electric drive. It was shown that the proposed diagnostic parameters allow the type and level of a defect to be determined without stopping the equipment and taking it out of service for repair. Modern digital control and monitoring systems can use the proposed parameters, based on the stator current of an electrical machine, to improve the accuracy and reliability of obtaining diagnostic patterns and predicting their changes, in order to improve equipment maintenance systems. This approach can also be used in systems and objects where there are significant parasitic vibrations and unsteady loads. The extraction of useful information can be carried out in electric drive systems whose structure includes a power electric converter.
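
    The Park's-vector idea above can be sketched with synthetic signals: for a balanced machine the (id, iq) hodograph is a circle, and phase-current asymmetry flattens it into an ellipse, which a simple max/min modulus ratio detects. This is a hedged illustration, not the authors' implementation.

```python
import numpy as np

# Park's vector hodograph sketch: a balanced three-phase current set
# traces a circle in the (id, iq) plane; asymmetry in one phase
# produces an ellipse. All signals are synthetic.

def park_vector(ia, ib, ic):
    """Park transform of instantaneous phase currents."""
    i_d = np.sqrt(2 / 3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
    i_q = (ib - ic) / np.sqrt(2)
    return i_d, i_q

t = np.linspace(0, 0.1, 5000)        # 5 periods of a 50 Hz supply
w = 2 * np.pi * 50

def phase_currents(asym):
    ia = (1 + asym) * np.cos(w * t)  # asymmetry only in phase A
    ib = np.cos(w * t - 2 * np.pi / 3)
    ic = np.cos(w * t + 2 * np.pi / 3)
    return ia, ib, ic

results = {}
for label, asym in (("healthy", 0.0), ("asymmetric", 0.3)):
    i_d, i_q = park_vector(*phase_currents(asym))
    mod = np.hypot(i_d, i_q)
    # Hodograph distortion: 1.0 for a perfect circle, >1 for an ellipse
    results[label] = mod.max() / mod.min()
```

    A distortion ratio threshold on such a statistic is one plausible form for the kind of diagnostic parameter the abstract describes.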

  20. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    Science.gov (United States)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min) with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels, covering dynamic scenarios with 1, 2, and 3 s sampling over 30 s at 25, 70, and 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time-attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated, and multiple myocardial blood flow (MBF) estimation methods were applied to each. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE ≈ 1.2 ml/g/min for each). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had a comparable impact on MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
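
    The flow-from-time-attenuation-curve step can be illustrated with a toy maximum-slope calculation (MBF ≈ peak tissue upslope / arterial peak), one of the simplest quantitative estimators. All curves and values below are synthetic assumptions, not outputs of the simulation study.

```python
import numpy as np

# Toy maximum-slope MBF estimation from a simulated time-attenuation
# curve. For an idealized uptake phase, d(tissue)/dt = flow * AIF, so
# the peak tissue slope divided by the AIF peak recovers the flow.

t = np.linspace(0, 30, 301)                  # seconds, 0.1 s sampling
aif = 300 * (t / 4) * np.exp(1 - t / 4)      # gamma-variate-like AIF (HU)

flow_true = 1.5 / 60                         # ml/g/s (i.e. 1.5 ml/g/min)
# Idealized uptake-phase tissue curve: flow times the integrated AIF
tissue = flow_true * np.cumsum(aif) * (t[1] - t[0])

mbf = np.gradient(tissue, t).max() / aif.max()   # recovered flow, ml/g/s
mbf_per_min = mbf * 60                           # back to ml/g/min
```

    Full kinetic modeling, as evaluated in the study, instead fits a compartment model to the whole curve and is what achieved the ~0.6 ml/g/min RMSE quoted above.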