WorldWideScience

Sample records for extracting high-level information

  1. End-to-end information extraction without token-level supervision

    DEFF Research Database (Denmark)

    Palm, Rasmus Berg; Hovy, Dirk; Laws, Florian

    2017-01-01

    Most state-of-the-art information extraction approaches rely on token-level labels to find the areas of interest in text. Unfortunately, these labels are time-consuming and costly to create, and consequently, not available for many real-life IE tasks. To make matters worse, token-level labels...... and output text. We evaluate our model on the ATIS data set, MIT restaurant corpus and the MIT movie corpus and compare to neural baselines that do use token-level labels. We achieve competitive results, within a few percentage points of the baselines, showing the feasibility of E2E information extraction...

  2. [Extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software].

    Science.gov (United States)

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other fields. Today's commercial high-resolution satellite imagery offers the potential to extract the three-dimensional information of urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery and validated the precision of the extraction with Barista software. It was shown that extracting three-dimensional building information from high-resolution satellite imagery with Barista software requires little specialized expertise and offers broad applicability, simple operation, and high precision. Point positioning and height determination accuracy at the one-pixel level could be achieved if the digital elevation model (DEM) and sensor orientation model were sufficiently precise and the off-nadir view angle was favorable.

  3. Information Extraction with Character-level Neural Networks and Free Noisy Supervision

    OpenAIRE

    Meerkamp, Philipp; Zhou, Zhengyi

    2016-01-01

    We present an architecture for information extraction from text that augments an existing parser with a character-level neural network. The network is trained using a measure of consistency of extracted data with existing databases as a form of noisy supervision. Our architecture combines the ability of constraint-based information extraction systems to easily incorporate domain knowledge and constraints with the ability of deep neural networks to leverage large amounts of data to learn compl...

  4. Automated concept-level information extraction to reduce the need for custom software and rules development.

    Science.gov (United States)

    D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
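
    A minimal sketch of the kind of automated configuration search described above: a few off-the-shelf feature extractors are paired with off-the-shelf classifiers, each combination is scored by cross-validated F-measure, and the top performer is kept. The specific feature extractors, classifiers, and toy snippets below are illustrative assumptions, not the authors' actual pipeline.

    ```python
    # Hedged sketch: iterate over (feature extractor, classifier) configurations
    # and rank them by cross-validated F-measure.
    from itertools import product
    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # toy labeled snippets: 1 = mentions a medical problem, 0 = does not
    texts = ["patient denies chest pain", "history of diabetes mellitus",
             "no acute distress", "chronic kidney disease stage 3",
             "follow up in two weeks", "hypertension, poorly controlled"] * 5
    labels = [1, 1, 0, 1, 0, 1] * 5

    features = {"bow": CountVectorizer(ngram_range=(1, 2)),
                "tfidf": TfidfVectorizer(ngram_range=(1, 2))}
    classifiers = {"logreg": LogisticRegression(max_iter=1000),
                   "linsvc": LinearSVC()}

    results = []
    for (fname, feat), (cname, clf) in product(features.items(), classifiers.items()):
        f1 = cross_val_score(make_pipeline(feat, clf), texts, labels,
                             cv=3, scoring="f1").mean()
        results.append((f1, fname, cname))

    print(sorted(results, reverse=True)[0])  # top-performing configuration
    ```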

  5. Extract of mangosteen increases high density lipoprotein levels in rats fed high lipid

    Directory of Open Access Journals (Sweden)

    Dwi Laksono Adiputro

    2013-04-01

    Background In cardiovascular medicine, Garcinia mangostana has been used as an antioxidant to inhibit oxidation of low density lipoproteins and as an antiobesity agent. The effect of Garcinia mangostana on hyperlipidemia is unknown. The aim of this study was to evaluate the effect of an ethanolic extract of Garcinia mangostana pericarp on lipid profile in rats fed a high lipid diet. Methods A total of 40 rats were divided into five groups: control, high lipid diet, and high lipid diet + ethanolic extract of Garcinia mangostana pericarp at dosages of 200, 400, and 800 mg/kg body weight. The control group received a standard diet for 60 days. The high lipid diet groups received the standard diet plus egg yolk, goat fat, cholic acid, and pig fat for 60 days, with or without ethanolic extract of Garcinia mangostana pericarp given by the oral route. After 60 days, rats were anesthetized with ether for collection of blood by cardiac puncture. Analysis of the blood lipid profile comprised colorimetric determination of cholesterol, triglyceride, low density lipoprotein (LDL), and high density lipoprotein (HDL). Results One-way ANOVA showed significant between-group differences in cholesterol, triglyceride, LDL, and HDL levels (p=0.000). Ethanolic extract of Garcinia mangostana pericarp significantly decreased cholesterol, triglyceride, and LDL levels, starting at 400 mg/kg body weight (p=0.000), and significantly increased HDL levels starting at 200 mg/kg body weight (p=0.000). Conclusion Ethanolic extract of Garcinia mangostana pericarp has a beneficial effect on lipid profile in rats on a high lipid diet.

  7. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

    The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation) as well as on the cognitive task (semantic segmentation) at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties, and their definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to…

  8. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing thematic information depends on this extraction. Using WorldView-2 high-resolution data, this study examined an optimal segmentation parameter method for object-oriented image segmentation and high-resolution image information extraction through the following processes. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and a combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
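
    A rough sketch of how an area-weighted mean-variance criterion can drive the choice of a segmentation scale, in the spirit of the method described above. The felzenszwalb segmenter is used only as a readily available stand-in, and the rate-of-change heuristic for picking the scale is an assumption; the paper's own segmentation algorithm and exact formula are not reproduced here.

    ```python
    # Hedged sketch: pick a segmentation scale by tracking an area-weighted
    # within-segment variance across candidate scales.
    import numpy as np
    from skimage.segmentation import felzenszwalb

    def weighted_mean_variance(image, labels):
        """Area-weighted mean of the within-segment variance of pixel values."""
        total = 0.0
        for lab in np.unique(labels):
            mask = labels == lab
            total += mask.sum() * image[mask].var()
        return total / labels.size

    def choose_scale(image, candidate_scales):
        """Return the scale with the largest relative jump in weighted variance,
        a common heuristic for an 'optimal' segmentation scale."""
        wv = [weighted_mean_variance(image, felzenszwalb(image, scale=s))
              for s in candidate_scales]
        rate_of_change = np.abs(np.diff(wv)) / np.array(wv[:-1])
        return candidate_scales[int(np.argmax(rate_of_change)) + 1]

    if __name__ == "__main__":
        img = np.random.rand(128, 128).astype(np.float32)  # toy single-band image
        print(choose_scale(img, [10, 30, 50, 100, 200, 400]))
    ```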

  9. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask layer by layer all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably rapid computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of same object, different spectra and same spectrum, different objects. With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing data.

  10. Solvent extraction in the treatment of acidic high-level liquid waste : where do we stand?

    International Nuclear Information System (INIS)

    Horwitz, E. P.; Schulz, W. W.

    1998-01-01

    During the last 15 years, a number of solvent extraction/recovery processes have been developed for the removal of the transuranic elements, 90Sr and 137Cs from acidic high-level liquid waste. These processes are based on the use of a variety of both acidic and neutral extractants. This chapter will present an overview and analysis of the various extractants and flowsheets developed to treat acidic high-level liquid waste streams. The advantages and disadvantages of each extractant along with comparisons of the individual systems are discussed.

  11. Semi-automatic building extraction in informal settlements from high-resolution satellite imagery

    Science.gov (United States)

    Mayunga, Selassie David

    The extraction of man-made features from digital remotely sensed images is considered an important step underpinning management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, creation of geographical information systems (GIS) databases and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, labour intensive, need well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outlines is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term tailored to preserving the convergence properties of the snake model; applying it to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are illegally established within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance

  12. High-Resolution Remote Sensing Image Building Extraction Based on Markov Model

    Science.gov (United States)

    Zhao, W.; Yan, L.; Chang, Y.; Gong, L.

    2018-04-01

    With the increase in resolution, remote sensing images carry a greater information load, more noise, and more complex feature geometry and texture information, which makes the extraction of building information more difficult. To solve this problem, this paper designs a high-resolution remote sensing image building extraction method based on a Markov model. This method introduces Contourlet domain map clustering and a Markov model, captures and enhances the contour and texture information of high-resolution remote sensing image features in multiple directions, and further designs a spectral feature index that can characterize "pseudo-buildings" in the building area. Through the multi-scale segmentation and extraction of image features, fine extraction from the building area down to individual buildings is realized. Experiments show that this method can restrain the noise of high-resolution remote sensing images, reduce the interference of non-target ground texture information, and remove shadow, vegetation and other pseudo-building information; compared with traditional pixel-level image information extraction, it achieves better building extraction precision, accuracy and completeness.

  13. Research of building information extraction and evaluation based on high-resolution remote-sensing imagery

    Science.gov (United States)

    Cao, Qiong; Gu, Lingjia; Ren, Ruizhi; Wang, Lang

    2016-09-01

    Building extraction is currently important in the application of high-resolution remote sensing imagery. At present, quite a few algorithms are available for detecting building information; however, most of them still have obvious disadvantages, such as ignoring spectral information and trading off extraction rate against extraction accuracy. The purpose of this research is to develop an effective method to detect building information in Chinese GF-1 data. Firstly, image preprocessing is used to normalize the image and image enhancement is used to highlight the useful information in the image. Secondly, multi-spectral information is analyzed. Subsequently, an improved morphological building index (IMBI) based on remote sensing imagery is proposed to obtain the candidate building objects. Furthermore, in order to refine building objects and remove false objects, post-processing (e.g., shape features, the vegetation index and the water index) is employed. To validate the effectiveness of the proposed algorithm, the omission errors (OE), commission errors (CE), overall accuracy (OA) and Kappa are used in the final evaluation. The proposed method can not only effectively use spectral information and other basic features, but also avoid extracting excessive interference details from high-resolution remote sensing images. Compared to the original MBI algorithm, the proposed method reduces the OE by 33.14%; at the same time, the Kappa increases by 16.09%. In experiments, IMBI achieved satisfactory results and outperformed other algorithms in terms of both accuracy and visual inspection.
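
    A deliberately simplified sketch of the morphological-building-index idea referred to above: bright, compact structures are highlighted by averaging white top-hat responses over several structuring-element sizes and then thresholding. The exact IMBI formulation (directional linear structuring elements, differential profiles, post-processing indices) is not reproduced; this is only a stand-in illustration.

    ```python
    # Hedged, simplified stand-in for a morphological building index (MBI/IMBI).
    import numpy as np
    from skimage.morphology import white_tophat

    def simple_building_index(brightness, sizes=(5, 11, 17, 23)):
        """Average white top-hat response over several square footprints."""
        responses = [white_tophat(brightness, np.ones((s, s), dtype=bool))
                     for s in sizes]
        return np.mean(responses, axis=0)

    if __name__ == "__main__":
        band = np.random.rand(200, 200).astype(np.float32)   # toy brightness band
        index = simple_building_index(band)
        candidates = index > index.mean() + 2 * index.std()  # candidate buildings
        print(int(candidates.sum()), "candidate building pixels")
    ```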

  14. Next Generation Extractants for Cesium Separation from High-Level Waste: From Fundamental Concepts to Site Implementation

    International Nuclear Information System (INIS)

    Moyer, Bruce A.; Bazelaire, Eve; Bonnesen, Peter V.; Bryan, Jeffrey C.; Delmau, Laetitia H.; Engle, Nancy L.; Gorbunova, Maryna G.; Keever, Tamara J.; Levitskaia, Tatiana G.; Sachleben, Richard A.; Tomkins, Bruce A.; Bartsch, Richard A.

    2004-01-01

    General project objectives. This project seeks a fundamental understanding and major improvement in cesium separation from high-level waste by cesium-selective calixcrown extractants. Systems of particular interest involve novel solvent-extraction systems containing specific members of the calix[4]arene-crown-6 family, alcohol solvating agents, and alkylamines. Questions being addressed pertain to cesium binding strength, extraction selectivity, cesium stripping, and extractant solubility. Enhanced properties in this regard will specifically benefit cleanup projects funded by the USDOE Office of Environmental Management to treat and dispose of high-level radioactive wastes currently stored in underground tanks at the Savannah River Site (SRS), the Hanford site, and the Idaho National Environmental and Engineering Laboratory [1]. The most direct beneficiary will be the SRS Salt Processing Project, which has recently identified the Caustic-Side Solvent Extraction (CSSX) process employing a calixcrown as its preferred technology for cesium removal from SRS high-level tank waste [2]. This technology owes its development in part to fundamental results obtained in this program

  15. Next Generation Extractants for Cesium Separation from High-Level Waste: From Fundamental Concepts to Site Implementation

    International Nuclear Information System (INIS)

    Moyer, Bruce A; Bazelaire, Eve; Bonnesen, Peter V.; Bryan, Jeffrey C.; Delmau, Laetitia H.; Engle, Nancy L.; Gorbunova, Maryna G.; Keever, Tamara J.; Levitskaia, Tatiana G.; Sachleben, Richard A.; Tomkins, Bruce A.; Bartsch, Richard A.; Talanov, Vladimir S.; Gibson, Harry W.; Jones, Jason W.; Hay, Benjamin P.

    2003-01-01

    This project seeks a fundamental understanding and major improvement in cesium separation from high-level waste by cesium-selective calixcrown extractants. Systems of particular interest involve novel solvent-extraction systems containing specific members of the calix[4]arene-crown-6 family, alcohol solvating agents, and alkylamines. Questions being addressed pertain to cesium binding strength, extraction selectivity, cesium stripping, and extractant solubility. Enhanced properties in this regard will specifically benefit cleanup projects funded by the USDOE Office of Environmental Management to treat and dispose of high-level radioactive wastes currently stored in underground tanks at the Savannah River Site (SRS), the Hanford site, and the Idaho National Environmental and Engineering Laboratory [1]. The most direct beneficiary will be the SRS Salt Processing Project, which has recently identified the Caustic-Side Solvent Extraction (CSSX) process employing a calixcrown as its preferred technology for cesium removal from SRS high-level tank waste [2]. This technology owes its development in part to fundamental results obtained in this program

  16. Optimization of TRPO process parameters for americium extraction from high level waste

    International Nuclear Information System (INIS)

    Chen Jing; Wang Jianchen; Song Chongli

    2001-01-01

    The numerical calculations for Am multistage fractional extraction by trialkyl phosphine oxide (TRPO) were verified by a hot test. High-level waste (HLW) at 1750 L/t-U was used as the feed to the TRPO process. The analysis used a simple objective function to minimize the total waste content in the TRPO process streams. Some process parameters were optimized after other parameters were selected. The optimal process parameters for Am extraction by TRPO are: 10 stages for extraction and 2 stages for scrubbing; a flow rate ratio of 0.931 for extraction and 4.42 for scrubbing; and a nitric acid concentration of 1.35 mol/L for the feed and 0.5 mol/L for the scrubbing solution. Finally, the nitric acid and Am concentration profiles in the optimal TRPO extraction process are given

  17. Recent developments in the extraction separation method for treatment of high-level liquid waste

    International Nuclear Information System (INIS)

    Jiao Rongzhou; Song Chongli; Zhu Yongjun

    2000-01-01

    A description and review of recent developments in the extraction separation method for partitioning transuranium elements from high-level liquid waste (HLLW) is presented. Extraction separation processes such as the TRUEX, DIAMEX, DIDPA, CTH and TRPO processes are briefly discussed

  18. Separation of transuranium elements from high-level waste by extraction with diisodecyl phosphoric acid

    International Nuclear Information System (INIS)

    Morita, Y.; Kubota, M.; Tani, S.

    1991-01-01

    Separation of transuranic elements (TRU) by extraction with diisodecyl phosphoric acid (DIDPA) has been studied to develop a partitioning process for high-level waste (HLW). In the present study, experiments of counter-current continuous extraction and back-extraction using a miniature mixer-settler were carried out to find the optimum process condition for the separation of Np initially in the pentavalent state and to examine the extraction behaviors of fission and corrosion products. (J.P.N.)

  19. Demonstration of Caustic-Side Solvent Extraction with Savannah River Site High Level Waste

    International Nuclear Information System (INIS)

    Walker, D.D.

    2001-01-01

    Researchers successfully demonstrated the chemistry and process equipment of the Caustic-Side Solvent Extraction (CSSX) flowsheet for the decontamination of high level waste using a 33-stage, 2-cm centrifugal contactor apparatus at the Savannah River Technology Center. This represents the first CSSX process demonstration using Savannah River Site (SRS) high level waste. Three tests lasting 6, 12, and 48 hours processed simulated average SRS waste, simulated Tank 37H/44F composite waste, and Tank 37H/44F high level waste, respectively

  20. Extraction of transuranic elements from high-level waste

    International Nuclear Information System (INIS)

    Morita, Y.; Kubota, M.; Tani, S.

    1991-01-01

    The present study on counter-current continuous extraction and back-extraction offered a promising prospect of separating TRU from HLW by the DIDPA extraction process, which consisted of the following three steps: simultaneous extraction of the TRU elements Np, Pu, Am and Cm (and rare earths) with 0.5 M DIDPA - 0.1 M TBP solvent; back-extraction of the trivalent TRU elements, Am and Cm, with 4 M HNO3; and back-extraction of the TRU actinides Np and Pu with oxalic acid. At the extraction step, the temperature should be raised and H2O2 should be added several times. The contact time of the aqueous and organic phases is the most important parameter for Np extraction. Raising the temperature at the first back-extraction step also has a good effect on the recovery of Am and Cm. The back-extraction of Np with oxalic acid is a simple process without a change of the Np oxidation state. A small part of the Ru remained in the used solvent; however, its concentration was low enough that it would not interfere with recycling the solvent several times. (author)

  1. Information extraction system

    Science.gov (United States)

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  2. Fundamental study on the extraction of transuranium elements from high-level liquid waste

    International Nuclear Information System (INIS)

    Kubota, Masumitsu; Morita, Yasuji; Tochiyama, Osamu; Inoue, Yasushi.

    1988-01-01

    A great many extractants have been studied for the separation of transuranium elements. The present study deals with the survey and classification of the extractants appearing in the literature, bearing in mind the relationship between the molecular structure of the extractants and their extractability for the transuranium elements, from the standpoint of their selective separation from high-level liquid waste (HLW) generated from fuel reprocessing. The extractants surveyed were classified into six groups: unidentate neutral organophosphorus compounds, bidentate neutral organophosphorus compounds, acidic organophosphorus compounds, amines and ammonium salts, N,N-disubstituted amides, and other compounds. These extractants are not always applicable to the separation of transuranium elements from HLW because of their limitations in extractability and radiation durability. Only a limited number of extractants belonging to the bidentate neutral organophosphorus compounds and the acidic organophosphorus compounds are considered to be suitable for the present purpose. (author)

  3. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These techniques for the analysis, modelling and simulation allow optimizing the control and functionality of devices developed using materials under study, and have been tested using data obtained from experimental samples

  4. The effect of informed consent on stress levels associated with extraction of impacted mandibular third molars.

    Science.gov (United States)

    Casap, Nardy; Alterman, Michael; Sharon, Guy; Samuni, Yuval

    2008-05-01

    To evaluate the effect of informed consent on stress levels associated with removal of impacted mandibular third molars. A total of 60 patients scheduled for extraction of impacted mandibular third molars participated in this study. The patients were unaware of the study's objectives. Data from 20 patients established the baseline levels of electrodermal activity (EDA). The remaining 40 patients were randomly assigned to 2 equal groups receiving either a detailed informed consent document, disclosing the possible risks involved with the surgery, or a simplified version. Pulse, blood pressure, and EDA were monitored before, during, and after completion of the consent document. Changes in EDA, but not in blood pressure, were measured on completion of either version of the consent document. A greater increase in EDA was associated with the detailed version of the consent document (P = .004). A similar concomitant, although nonsignificant, increase in pulse values was observed on completion of both versions. Completion of an overdisclosed informed consent document is associated with changes in physiological parameters. The results suggest that overdetailed listing and disclosure of risks before extraction of impacted mandibular third molars can increase patient stress.

  5. Development of technical information database for high level waste disposal

    International Nuclear Information System (INIS)

    Kudo, Koji; Takada, Susumu; Kawanishi, Motoi

    2005-01-01

    A concept design of the high-level waste disposal information database and the disposal technologies information database is explained. The high-level waste disposal information database contains information on technologies, waste, management and rules, R and D, each step of disposal site selection, characteristics of sites, demonstration of disposal technology, design of the disposal site, application for a disposal permit, construction of the disposal site, operation, and closure. The construction of the disposal technologies information system and the geological disposal technologies information system is described. A screen image of the geological disposal technologies information system is shown, through which users can perform full-text retrieval and attribute retrieval. (S.Y.)

  6. High Level Information Fusion (HLIF) with nested fusion loops

    Science.gov (United States)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  7. Multimedia Information Extraction

    CERN Document Server

    Maybury, Mark T

    2012-01-01

    The advent of increasingly large consumer collections of audio (e.g., iTunes), imagery (e.g., Flickr), and video (e.g., YouTube) is driving a need not only for multimedia retrieval but also information extraction from and across media. Furthermore, industrial and government collections fuel requirements for stock media access, media preservation, broadcast news retrieval, identity management, and video surveillance.  While significant advances have been made in language processing for information extraction from unstructured multilingual text and extraction of objects from imagery and vid

  8. Licensing information needs for a high-level waste repository

    International Nuclear Information System (INIS)

    Wright, R.J.; Greeves, J.T.; Logsdon, M.J.

    1985-01-01

    The information needs for licensing findings during the development of a repository for high-level waste (HLW) are described. In particular, attention is given to the information needed to demonstrate, for construction authorization purposes: repository constructibility, waste retrievability, waste containment, and waste isolation

  9. Can we replace curation with information extraction software?

    Science.gov (United States)

    Karp, Peter D

    2016-01-01

    Can we use programs for automated or semi-automated information extraction from scientific texts as practical alternatives to professional curation? I show that error rates of current information extraction programs are too high to replace professional curation today. Furthermore, current information extraction programs extract single narrow slivers of information, such as individual protein interactions; they cannot extract the large breadth of information extracted by professional curators for databases such as EcoCyc. They also cannot arbitrate among conflicting statements in the literature as curators can. Therefore, funding agencies should not hobble the curation efforts of existing databases on the assumption that a problem that has stymied Artificial Intelligence researchers for more than 60 years will be solved tomorrow. Semi-automated extraction techniques appear to have significantly more potential based on a review of recent tools that enhance curator productivity. But a full cost-benefit analysis for these tools is lacking. Without such analysis it is possible to expend significant effort developing information-extraction tools that automate small parts of the overall curation workflow without achieving a significant decrease in curation costs.

  10. Challenges in Managing Information Extraction

    Science.gov (United States)

    Shen, Warren H.

    2009-01-01

    This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…

  11. Fine-grained information extraction from German transthoracic echocardiography reports.

    Science.gov (United States)

    Toepfer, Martin; Corovic, Hamo; Fette, Georg; Klügl, Peter; Störk, Stefan; Puppe, Frank

    2015-11-12

    Information extraction techniques that get structured representations out of unstructured data make a large amount of clinically relevant information about patients accessible for semantic applications. These methods typically rely on standardized terminologies that guide this process. Many languages and clinical domains, however, lack appropriate resources and tools, as well as evaluations of their applications, especially if detailed conceptualizations of the domain are required. For instance, German transthoracic echocardiography reports have not been targeted sufficiently before, despite their importance for clinical trials. This work therefore aimed at the development and evaluation of an information extraction component with a fine-grained terminology that enables the recognition of almost all relevant information stated in German transthoracic echocardiography reports at the University Hospital of Würzburg. A domain expert validated and iteratively refined an automatically inferred base terminology. The terminology was used by an ontology-driven information extraction system that outputs attribute-value pairs. The final component has been mapped to the central elements of a standardized terminology, and it has been evaluated on documents with different layouts. The final system achieved state-of-the-art precision (micro average .996) and recall (micro average .961) on 100 test documents that represent more than 90 % of all reports. In particular, principal aspects as defined in a standardized external terminology were recognized with F1 = .989 (micro average) and F1 = .963 (macro average). As a result of keyword matching and restrained concept extraction, the system obtained high precision also on unstructured or exceptionally short documents, and documents with uncommon layout. The developed terminology and the proposed information extraction system allow the extraction of fine-grained information from German semi-structured transthoracic echocardiography reports.
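
    For readers unfamiliar with the micro/macro distinction used in the figures above, the sketch below shows how the two averages are conventionally computed from per-class true-positive/false-positive/false-negative counts. The counts are invented toy numbers, not the paper's data.

    ```python
    # Hedged illustration: micro- vs. macro-averaged precision, recall, and F1.
    from collections import namedtuple

    Counts = namedtuple("Counts", "tp fp fn")

    def prf(tp, fp, fn):
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        return p, r, f1

    def micro_macro(per_class):
        # micro: pool the counts over all classes before scoring
        micro = prf(sum(c.tp for c in per_class),
                    sum(c.fp for c in per_class),
                    sum(c.fn for c in per_class))
        # macro: score each class separately, then average the scores
        per = [prf(c.tp, c.fp, c.fn) for c in per_class]
        macro = tuple(sum(v) / len(per) for v in zip(*per))
        return micro, macro

    if __name__ == "__main__":
        toy = [Counts(95, 2, 3), Counts(40, 1, 5), Counts(10, 0, 4)]
        print(micro_macro(toy))
    ```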

  12. Extraction of prospecting information of uranium deposit based on high spatial resolution satellite data. Taking bashibulake region as an example

    International Nuclear Information System (INIS)

    Yang Xu; Liu Dechang; Zhang Jielin

    2008-01-01

    In this study, the significance and content of prospecting information for uranium deposits are explained. QuickBird high spatial resolution satellite data are used to extract the prospecting information for a uranium deposit in the Bashibulake area in the north of the Tarim Basin. By using pertinent image processing methods, information on the ore-bearing bed, ore-controlling structures and mineralized alteration has been extracted. The results show high consistency with the field survey. The aim of this study is to explore the practicability of high spatial resolution satellite data for mineral prospecting, and to broaden the approach to prospecting in similar areas. (authors)

  13. A rapid extraction of landslide disaster information research based on GF-1 image

    Science.gov (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na

    2015-08-01

    In recent years, landslide disasters have occurred frequently because of seismic activity. They cause great harm to people's lives and have attracted high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been challenging, but high-resolution remote sensing imagery can improve the accuracy of information extraction effectively with its rich texture and geometric information. Therefore, it is feasible to extract information on earthquake-triggered landslides with serious surface damage and large scale. Taking Wenchuan county as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, texture, geometric and landform features of the image, extraction rules are established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.

  14. Extracting chemical information from high-resolution Kβ X-ray emission spectroscopy

    Science.gov (United States)

    Limandri, S.; Robledo, J.; Tirao, G.

    2018-06-01

    High-resolution X-ray emission spectroscopy allows studying the chemical environment of a wide variety of materials. Chemical information can be obtained by fitting the X-ray spectra and observing the behavior of some spectral features. Spectral changes can also be quantified by means of statistical parameters calculated by considering the spectrum as a probability distribution. Another possibility is to perform statistical multivariate analysis, such as principal component analysis. In this work, the performance of these procedures for extracting chemical information from X-ray emission spectra of mixtures of Mn2+ and Mn4+ oxides is studied. A detailed analysis of the parameters obtained, as well as the associated uncertainties, is shown. The methodologies are also applied to Mn oxidation state characterization of the double perovskite oxides Ba1+xLa1-xMnSbO6 (with 0 ≤ x ≤ 0.7). The results show that the statistical parameters and multivariate analysis are the most suitable for the analysis of this kind of spectra.
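
    A hedged sketch of the two procedures mentioned above: treating a spectrum as a probability distribution to extract simple statistical parameters, and running a principal component analysis over a set of spectra. The synthetic Gaussian lines below stand in for fitted experimental Kβ spectra.

    ```python
    # Hedged sketch: statistical moments of a spectrum treated as a probability
    # distribution, plus PCA over a small set of synthetic spectra.
    import numpy as np
    from sklearn.decomposition import PCA

    def spectral_moments(energy, counts):
        p = counts / counts.sum()                  # normalize to a distribution
        mean = np.sum(p * energy)                  # centroid (first moment)
        var = np.sum(p * (energy - mean) ** 2)     # width (second central moment)
        skew = np.sum(p * (energy - mean) ** 3) / var ** 1.5   # asymmetry
        return mean, var, skew

    energy = np.linspace(6470.0, 6500.0, 600)      # toy energy grid (eV)
    rng = np.random.default_rng(0)
    spectra = np.array([np.exp(-0.5 * ((energy - (6490.0 + d)) / 2.0) ** 2)
                        + 0.02 * rng.random(energy.size)
                        for d in rng.normal(0.0, 0.5, 20)])   # 20 synthetic spectra

    print([spectral_moments(energy, s) for s in spectra[:3]])
    scores = PCA(n_components=2).fit_transform(spectra)       # first 2 components
    print(scores[:3])
    ```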

  15. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm is applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land) and other information were extracted using spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% according to the confusion matrix.
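
    The overall accuracy quoted above is the standard confusion-matrix figure of merit; a short sketch with invented class counts is shown below.

    ```python
    # Hedged sketch: overall accuracy from a confusion matrix (values invented).
    import numpy as np

    # rows = reference classes, columns = predicted classes
    # (water, vegetation, roads, residential, bare land)
    cm = np.array([
        [120,   2,   0,   1,   0],
        [  3, 340,   4,   6,   2],
        [  0,   5,  88,   7,   1],
        [  1,   8,   6, 210,   4],
        [  0,   3,   2,   5,  95],
    ])

    overall_accuracy = np.trace(cm) / cm.sum()   # correctly classified / total
    print(f"overall accuracy = {overall_accuracy:.1%}")
    ```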

  16. Information extraction from muon radiography data

    International Nuclear Information System (INIS)

    Borozdin, K.N.; Asaki, T.J.; Chartrand, R.; Hengartner, N.W.; Hogan, G.E.; Morris, C.L.; Priedhorsky, W.C.; Schirato, R.C.; Schultz, L.J.; Sottile, M.J.; Vixie, K.R.; Wohlberg, B.E.; Blanpied, G.

    2004-01-01

    Scattering muon radiography was proposed recently as a technique for detection and 3-D imaging of dense high-Z objects. High-energy cosmic ray muons are deflected in matter in the process of multiple Coulomb scattering. By measuring the deflection angles we are able to reconstruct the configuration of high-Z material in the object. We discuss the methods for information extraction from muon radiography data. Tomographic methods widely used in medical imaging have been applied to this specific muon radiography information source. An alternative simple technique based on counting highly scattered muons in voxels appears to be efficient in many simulated scenes. SVM-based classifiers and clustering algorithms may allow detection of compact high-Z objects without full image reconstruction. The efficiency of muon radiography can be increased using additional information sources, such as momentum estimation, stopping power measurement, and detection of muonic atom emission.
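
    A toy sketch of the voxel-counting idea described above: count, per voxel, the muon tracks whose scattering angle exceeds a threshold. The track positions, angle distribution, and threshold are synthetic assumptions; a real system reconstructs scattering points from incoming/outgoing track pairs.

    ```python
    # Hedged toy sketch: counting highly scattered muons per voxel.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    points = rng.uniform(0.0, 1.0, size=(n, 3))      # scattering points (m)
    angles = rng.exponential(scale=5.0, size=n)      # scattering angles (mrad)

    threshold_mrad = 20.0                            # "highly scattered" cut
    hits = points[angles > threshold_mrad]

    # 10 x 10 x 10 voxel grid over the unit cube
    counts, _ = np.histogramdd(hits, bins=(10, 10, 10),
                               range=[(0, 1), (0, 1), (0, 1)])
    hottest = np.unravel_index(counts.argmax(), counts.shape)
    print(int(counts.max()), "high-scatter tracks in voxel", hottest)
    ```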

  17. High-level fusion of depth and intensity for pedestrian classification

    NARCIS (Netherlands)

    Rohrbach, M.; Enzweiler, M.; Gavrila, D.M.

    2009-01-01

    This paper presents a novel approach to pedestrian classification which involves a high-level fusion of depth and intensity cues. Instead of utilizing depth information only in a pre-processing step, we propose to extract discriminative spatial features (gradient orientation histograms and local

  18. Enzymatic extraction of star gooseberry (Phyllanthus acidus) juice with high antioxidant level

    Science.gov (United States)

    Loan, Do Thi Thanh; Tra, Tran Thi Thu; Nguyet, Ton Nu Minh; Man, Le Van Viet

    2017-09-01

    Ascorbic acid and phenolic compounds are the main antioxidants in star gooseberry (Phyllanthus acidus) fruit. In this study, the Pectinex Ultra SP-L preparation with pectinase activity was used in the extraction of star gooseberry juice. The effects of pectinase concentration and biocatalytic time on the content of ascorbic acid and phenolic compounds and the antioxidant activity of the fruit juice were first investigated. Response surface methodology was then used to optimize the conditions of enzymatic extraction for maximizing the antioxidant activity of the star gooseberry juice. The optimal pectinase concentration and biocatalytic time were 19 polygalacturonase units per 100 g pulp dry weight and 67 min, respectively, under which the maximal antioxidant activity achieved 5595±6 µmol Trolox equivalent per 100 g juice dry weight. On the basis of a second-order extraction kinetic model, the extraction rate constants of ascorbic acid and phenolic compounds in the enzymatic extraction increased by approximately 21% and 157%, respectively, in comparison with the conventional extraction. The application of a pectinase preparation to fruit juice extraction is therefore a promising way to improve the antioxidant level of the product.
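
    The record does not spell out the kinetic model; the conventional second-order solid-liquid extraction model, assumed here for orientation, is

    ```latex
    % Hedged sketch: conventional second-order extraction kinetics (assumed form).
    \begin{align}
      \frac{dC_t}{dt} &= k\,(C_s - C_t)^2, &
      C_t &= \frac{C_s^2 k t}{1 + C_s k t}, &
      \frac{t}{C_t} &= \frac{1}{k C_s^2} + \frac{t}{C_s},
    \end{align}
    ```

    where C_t is the solute concentration in the liquid at time t, C_s the saturation concentration, and k the second-order extraction rate constant, i.e. the quantity whose increase (about 21% for ascorbic acid and 157% for phenolics) is reported above.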

  19. Information extraction from multi-institutional radiology reports.

    Science.gov (United States)

    Hassanpour, Saeed; Langlotz, Curtis P

    2016-01-01

    The radiology report is the most important source of clinical imaging information. It documents critical information about the patient's health and the radiologist's interpretation of medical findings. It also communicates information to the referring physicians and records that information for future clinical and research use. Although efforts to structure some radiology report information through predefined templates are beginning to bear fruit, a large portion of radiology report information is entered in free text. The free text format is a major obstacle for rapid extraction and subsequent use of information by clinicians, researchers, and healthcare information systems. This difficulty is due to the ambiguity and subtlety of natural language, complexity of described images, and variations among different radiologists and healthcare organizations. As a result, radiology reports are used only once by the clinician who ordered the study and rarely are used again for research and data mining. In this work, machine learning techniques and a large multi-institutional radiology report repository are used to extract the semantics of the radiology report and overcome the barriers to the re-use of radiology report information in clinical research and other healthcare applications. We describe a machine learning system to annotate radiology reports and extract report contents according to an information model. This information model covers the majority of clinically significant contents in radiology reports and is applicable to a wide variety of radiology study types. Our automated approach uses discriminative sequence classifiers for named-entity recognition to extract and organize clinically significant terms and phrases consistent with the information model. We evaluated our information extraction system on 150 radiology reports from three major healthcare organizations and compared its results to a commonly used non-machine learning information extraction method. We

  20. Effect of Arctium Lappa Root Extract on Glucose Levels and Insulin Resistance in Rats with High Sucrose Diet

    Directory of Open Access Journals (Sweden)

    A Ahangarpour

    2013-06-01

    Introduction: Diabetes mellitus is a growing health problem all over the world. Arctium lappa has been used therapeutically in Europe, North America and Asia. Antioxidant and antidiabetic compounds have been found in the root of Arctium lappa. This study investigated the effects of Arctium lappa root aqueous extract on glucose and insulin levels and the Fasting Insulin Resistance Index (FIRI) in female rats on a high sucrose diet. Methods: 40 female Wistar rats weighing 150-250 g were used. After a diet induced by 50% sucrose in drinking water for 5 weeks, the animals were randomly divided into a control group, a sucrose-induced group, and three sucrose-induced groups treated with Arctium lappa root aqueous extract (50, 100, 200 mg/kg), with 8 rats in each group. Treatment with the extract was given for 2 weeks (i.p.), and 24 hours after the last treatment, heart blood samples were collected. After the blood samples were centrifuged, fasting (12 h) plasma glucose was determined by kit and fasting insulin concentration was assayed by enzyme-linked immunosorbent assay (ELISA). Results: Glucose levels, insulin and FIRI in the sucrose group significantly increased in comparison with the control group. Glucose levels in the aqueous extract groups at 50 mg/kg (116.14±16.64 mg/dl) and 200 mg/kg (90.66±22.58 mg/dl) significantly decreased in comparison with the sucrose group (140.5±18.73 mg/dl). Insulin levels and FIRI in all of the aqueous extract groups were significantly decreased (P<0.001) in comparison with the sucrose group. Conclusions: Arctium lappa root aqueous extract in an animal model produced a significant decrease in blood glucose and insulin levels.
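
    The record does not state which formulation of the Fasting Insulin Resistance Index was used; one commonly used definition, assumed here only for orientation, is

    ```latex
    % Hedged note: a common definition of FIRI (assumed, not confirmed by the record).
    \[
      \mathrm{FIRI} \;=\; \frac{\text{fasting glucose (mmol/L)} \times \text{fasting insulin (mU/L)}}{25}.
    \]
    ```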

  1. Determination of Np, Pu and Am in high level radioactive waste with extraction-liquid scintillation counting

    International Nuclear Information System (INIS)

    Yang Dazhu; Zhu Yongjun; Jiao Rongzhou

    1994-01-01

    A new method for the determination of transuranium elements, Np, Pu and Am with extraction-liquid scintillation counting has been studied systematically. Procedures for the separation of Pu and Am by HDEHP-TRPO extraction and for the separation of Np by TTA-TiOA extraction have been developed, by which the recovery of Np, Pu and Am is 97%, 99% and 99%, respectively, and the decontamination factors for the major fission products (90Sr, 137Cs, etc.) are 10^4-10^6. The pulse shape discrimination (PSD) technique has been introduced to liquid scintillation counting, by which the counting efficiency of α-activity is >99% and the rejection of β-counts is >99.95%. This new method, combining extraction and pulse shape discrimination with the liquid scintillation technique, has been successfully applied to the assay of Np, Pu and Am in high level radioactive waste. (author) 7 refs.; 7 figs.; 4 tabs

  2. Demonstration of pyropartitioning process by using genuine high-level liquid waste. Reductive-extraction of actinide elements from chlorination product

    International Nuclear Information System (INIS)

    Uozumi, Koichi; Iizuka, Masatoshi; Kurata, Masaki; Ougier, Michel; Malmbeck, Rikard; Winckel, Stefaan van

    2009-01-01

    The pyropartitioning process separates the minor actinide elements (MAs), together with uranium and plutonium, from the high-level liquid waste generated in the Purex reprocessing of spent LWR fuel and introduces them into a metallic fuel cycle. For the demonstration of this technology, a series of experiments using 520 g of genuine high-level liquid waste was started, and the conversion of actinide elements to their chlorides was already demonstrated by denitration and chlorination. In the present study, a reductive extraction experiment in a molten salt/liquid cadmium system was performed to recover actinide elements from the chlorination product of the genuine high-level liquid waste. The results of the experiment are as follows. 1) By the addition of the cadmium-lithium alloy reductant, almost all of the plutonium and MAs in the initial high-level liquid waste were recovered in the cadmium phase, meaning there was no mass loss during denitration, chlorination, and reductive extraction. 2) The separation factor values of plutonium, MAs, and rare-earth fission product elements versus uranium agreed with literature values; therefore, actinide elements will be separated from fission product elements in the actual system. Hence, the pyropartitioning process was successfully demonstrated. (author)

  3. Extraction of Urban Water Bodies from High-Resolution Remote-Sensing Imagery Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2018-05-01

    Accurate information on urban surface water is important for assessing the role it plays in urban ecosystem services in the context of human survival and climate change. The precise extraction of urban water bodies from images is of great significance for urban planning and socioeconomic development. In this paper, a novel deep-learning architecture is proposed for the extraction of urban water bodies from high-resolution remote sensing (HRRS) imagery. First, an adaptive simple linear iterative clustering algorithm is applied to segment the remote-sensing image into high-quality superpixels. Then, a new convolutional neural network (CNN) architecture is designed that can extract useful high-level features of water bodies from input data in a complex urban background and label each superpixel as one of two classes: water or non-water. Finally, a high-resolution image of water-extracted superpixels is generated. Experimental results show that the proposed method achieved higher accuracy for water extraction from high-resolution remote-sensing images than traditional approaches, with an average overall accuracy of 99.14%.
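
    A hedged sketch of the superpixel-plus-CNN pipeline summarized above. The patch size, network layout, and synthetic image are illustrative assumptions (the paper's actual architecture and training data are not given here); the untrained network only demonstrates the data flow from superpixels to a water mask.

    ```python
    # Hedged sketch: SLIC superpixels, then a small CNN labels a patch around
    # each superpixel centroid as water / non-water.
    import numpy as np
    import torch
    import torch.nn as nn
    from skimage.segmentation import slic

    class PatchCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
            self.classifier = nn.Linear(32, 2)   # water / non-water logits

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    def superpixel_patches(image, n_segments=200, size=32):
        """Cut a fixed-size patch around each superpixel centroid."""
        labels = slic(image, n_segments=n_segments, compactness=10, start_label=0)
        half = size // 2
        padded = np.pad(image, ((half, half), (half, half), (0, 0)), mode="reflect")
        patches, ids = [], []
        for lab in np.unique(labels):
            ys, xs = np.nonzero(labels == lab)
            cy, cx = int(ys.mean()) + half, int(xs.mean()) + half
            patches.append(padded[cy - half:cy + half, cx - half:cx + half])
            ids.append(int(lab))
        return labels, np.stack(patches), ids

    if __name__ == "__main__":
        img = np.random.rand(256, 256, 3).astype(np.float32)    # toy RGB tile
        labels, patches, ids = superpixel_patches(img)
        x = torch.from_numpy(patches).permute(0, 3, 1, 2)       # N, C, H, W
        probs = PatchCNN()(x).softmax(dim=1)                    # untrained demo
        water_ids = [i for i, p in zip(ids, probs[:, 1] > 0.5) if p]
        water_mask = np.isin(labels, water_ids)
        print("fraction labeled water:", water_mask.mean())
    ```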

  4. Polarity-specific high-level information propagation in neural networks.

    Science.gov (United States)

    Lin, Yen-Nan; Chang, Po-Yen; Hsiao, Pao-Yueh; Lo, Chung-Chuan

    2014-01-01

    Analyzing the connectome of a nervous system provides valuable information about the functions of its subsystems. Although much has been learned about the architectures of neural networks in various organisms by applying analytical tools developed for general networks, two distinct and functionally important properties of neural networks are often overlooked. First, neural networks are endowed with polarity at the circuit level: Information enters a neural network at input neurons, propagates through interneurons, and leaves via output neurons. Second, many functions of nervous systems are implemented by signal propagation through high-level pathways involving multiple and often recurrent connections rather than by the shortest paths between nodes. In the present study, we analyzed two neural networks: the somatic nervous system of Caenorhabditis elegans (C. elegans) and the partial central complex network of Drosophila, in light of these properties. Specifically, we quantified high-level propagation in the vertical and horizontal directions: the former characterizes how signals propagate from specific input nodes to specific output nodes and the latter characterizes how a signal from a specific input node is shared by all output nodes. We found that the two neural networks are characterized by very efficient vertical and horizontal propagation. In comparison, classic small-world networks show a trade-off between vertical and horizontal propagation; increasing the rewiring probability improves the efficiency of horizontal propagation but worsens the efficiency of vertical propagation. Our result provides insights into how the complex functions of natural neural networks may arise from a design that allows them to efficiently transform and combine input signals.
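    As a loose, toy-level illustration of the vertical/horizontal distinction described above (not the authors' exact metrics), one can compare input-to-output path efficiency with the breadth of output coverage on a directed random graph:

```python
# Toy illustration of "vertical" and "horizontal" signal propagation in a
# directed network with designated input and output nodes, using simple
# shortest-path efficiencies. The graph and node roles are made up.
import networkx as nx

G = nx.gnp_random_graph(60, 0.08, seed=1, directed=True)
inputs, outputs = list(range(0, 5)), list(range(55, 60))

def efficiency(g, u, v):
    """1 / shortest-path length, or 0 if v is unreachable from u."""
    try:
        return 1.0 / nx.shortest_path_length(g, u, v)
    except nx.NetworkXNoPath:
        return 0.0

# Vertical: average efficiency from each input to each specific output.
vertical = sum(efficiency(G, i, o) for i in inputs for o in outputs) / (
    len(inputs) * len(outputs))

# Horizontal: how broadly a single input reaches the whole output set,
# here the fraction of outputs reachable from each input, averaged.
horizontal = sum(
    sum(nx.has_path(G, i, o) for o in outputs) / len(outputs) for i in inputs
) / len(inputs)

print(f"vertical efficiency ≈ {vertical:.3f}, horizontal coverage ≈ {horizontal:.3f}")
```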

  5. Extractions of High Quality RNA from the Seeds of Jerusalem Artichoke and Other Plant Species with High Levels of Starch and Lipid

    Directory of Open Access Journals (Sweden)

    Tanupat Mornkham

    2013-04-01

    Full Text Available Jerusalem artichoke (Helianthus tuberosus L.) is an important tuber crop. However, Jerusalem artichoke seeds contain high levels of starch and lipid, making the extraction of high-quality RNA extremely difficult and gene expression analysis challenging. This study aimed to improve existing methods for extracting total RNA from Jerusalem artichoke dry seeds and to assess the applicability of the improved method to other plant species. Five RNA extraction methods were evaluated on Jerusalem artichoke seeds and two were modified. The modified method showing significant improvement was applied to assay seeds of diverse Jerusalem artichoke accessions, sunflower, rice, maize, peanut and marigold. The effectiveness of the improved method for extracting total RNA from seeds was assessed using qPCR analysis of four selected genes. The improved method of Ma and Yang (2011) yielded maximum RNA solubility and removed most interfering substances. The improved protocol generated 29 to 41 µg RNA/30 mg fresh weight. A260/A280 ratios of 1.79 to 2.22 indicated RNA purity. The extracted RNA was effective for downstream applications such as first-strand cDNA synthesis, cDNA cloning and qPCR. The improved method was also effective for extracting total RNA from seeds of sunflower, rice, maize and peanut, which are rich in polyphenols, lipids and polysaccharides.

  6. Extractions of High Quality RNA from the Seeds of Jerusalem Artichoke and Other Plant Species with High Levels of Starch and Lipid.

    Science.gov (United States)

    Mornkham, Tanupat; Wangsomnuk, Preeya Puangsomlee; Fu, Yong-Bi; Wangsomnuk, Pinich; Jogloy, Sanun; Patanothai, Aran

    2013-04-29

    Jerusalem artichoke (Helianthus tuberosus L.) is an important tuber crop. However, Jerusalem artichoke seeds contain high levels of starch and lipid, making the extraction of high-quality RNA extremely difficult and gene expression analysis challenging. This study aimed to improve existing methods for extracting total RNA from Jerusalem artichoke dry seeds and to assess the applicability of the improved method to other plant species. Five RNA extraction methods were evaluated on Jerusalem artichoke seeds and two were modified. The modified method showing significant improvement was applied to assay seeds of diverse Jerusalem artichoke accessions, sunflower, rice, maize, peanut and marigold. The effectiveness of the improved method for extracting total RNA from seeds was assessed using qPCR analysis of four selected genes. The improved method of Ma and Yang (2011) yielded maximum RNA solubility and removed most interfering substances. The improved protocol generated 29 to 41 µg RNA/30 mg fresh weight. A260/A280 ratios of 1.79 to 2.22 indicated RNA purity. The extracted RNA was effective for downstream applications such as first-strand cDNA synthesis, cDNA cloning and qPCR. The improved method was also effective for extracting total RNA from seeds of sunflower, rice, maize and peanut, which are rich in polyphenols, lipids and polysaccharides.

  7. Actinide separation of high-level waste using solvent extractants on magnetic microparticles

    International Nuclear Information System (INIS)

    Nunez, L.; Buchholz, B.A.; Kaminski, M.; Aase, S.B.; Brown, N.R.; Vandegrift, G.F.

    1994-01-01

    Polymer-coated ferromagnetic particles with an absorbed layer of octyl(phenyl)-N,N-diisobutylcarbamoylmethylphosphine oxide (CMPO) diluted with tributyl phosphate (TBP) are being evaluated for application in the separation and recovery of low concentrations of americium and plutonium from nuclear waste solutions. Due to their chemical nature, these extractants selectively complex americium and plutonium contaminants onto the particles, which can then be recovered from the waste solution using a magnet. The effectiveness of the extractant-absorbed particles at removing transuranics (TRU) from simulated solutions and various nitric acid solutions was measured by gamma and liquid scintillation counting of plutonium and americium. The HNO₃ concentration range was 0.01 M to 6 M. The partition coefficients (Kd) for various actinides at 2 M HNO₃ were determined to be between 3,000 and 30,000. These values are larger than those projected for TRU recovery by traditional liquid/liquid extraction. Results from transmission electron microscopy indicated a large dependence of Kd on the relative magnetite location within the polymer and on the polymer surface area. Energy-dispersive spectroscopy demonstrated homogeneous metal complexation on the polymer surface with no metal clustering. The radiolytic stability of the particles was determined using ⁶⁰Co gamma irradiation under various conditions. The results showed that Kd depends more strongly on the nitric acid dissolution rate of the magnetite than on the gamma irradiation dose. Results of actinide separation from simulated high-level waste representative of that at various DOE sites are also discussed
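    For context, the partition coefficient reported above is conventionally obtained from a batch contact as Kd = [(A0 − Af)/Af]·(V/m). A minimal sketch with hypothetical numbers:

```python
# Hedged sketch: batch-contact partition coefficient Kd (mL/g) for a sorbent,
# computed from initial and final solution activities. Values are hypothetical.

def partition_coefficient(a_initial, a_final, volume_ml, mass_g):
    """Kd = [(A0 - Af)/Af] * (V/m), with activities in any consistent unit."""
    return (a_initial - a_final) / a_final * (volume_ml / mass_g)

# Example: 10 mL of solution contacted with 0.05 g of extractant-coated particles.
kd = partition_coefficient(a_initial=10000.0, a_final=60.0, volume_ml=10.0, mass_g=0.05)
print(f"Kd ≈ {kd:.0f} mL/g")
```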

  8. Extracting useful information from images

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    2011-01-01

    The paper presents an overview of methods for extracting useful information from digital images. It covers various approaches that utilize different properties of images, like intensity distribution, spatial frequencies content and several others. A few case studies including isotropic and heter...

  9. Extraction and Analysis of Autonomous System Level Internet Map of Turkey

    Directory of Open Access Journals (Sweden)

    Hakan Çetin

    2010-01-01

    Full Text Available At the high level, the Internet is a mesh composed of thousands of autonomous systems (ASes) connected together. This mesh is represented as a graph where each autonomous system is considered a node and the connections with Border Gateway Protocol (BGP) neighbouring autonomous systems are considered edges. Analysis of this mesh and visual representation of the graph give us the AS-level topology of the Internet. In recent years there has been an increasing number of studies focused on the structure of the topology of the Internet. It is important to study the Internet infrastructure in Turkey and to provide a way to monitor the changes to it over time. In this study we present the AS-level Internet map of Turkey with an explanation of each step. In order to get the whole AS-level map, we first determined the ASes that geographically reside in Turkey and afterwards determined the interconnections among these ASes, along with international interconnections. We then extracted the relations between connected ASes and analyzed the structural properties of the AS infrastructure, explaining the methods used in each step. Using the extracted data we analyzed the AS-level properties of Turkey, and we provide the AS-level Internet map of Turkey along with web-based software that can monitor and provide information on ASes in Turkey.
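    A minimal sketch of the graph-construction and analysis step described above, using networkx; the AS numbers come from the documentation range (AS64496-64511) and the links are invented, not actual Turkish AS relationships.

```python
# Hedged sketch: building and analysing an AS-level graph with networkx.
# AS numbers and links below are placeholders from the documentation range.
import networkx as nx

as_links = [  # (AS, BGP-neighbour AS) pairs, hypothetical
    ("AS64500", "AS64501"), ("AS64500", "AS64502"), ("AS64501", "AS64502"),
    ("AS64502", "AS64503"), ("AS64503", "AS64510"),  # last one: an international link
]

g = nx.Graph()
g.add_edges_from(as_links)

print("number of ASes:", g.number_of_nodes())
print("number of AS-level links:", g.number_of_edges())
print("degree per AS:", dict(g.degree()))
print("average clustering coefficient:", nx.average_clustering(g))
```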

  10. Knowledge Dictionary for Information Extraction on the Arabic Text Data

    Directory of Open Access Journals (Sweden)

    Wahyu Jauharis Saputra

    2013-04-01

    Full Text Available Information extraction is an early stage of a process of textual data analysis. Information extraction is required to get information from textual data that can be used for analysis processes such as classification and categorization. Textual data are strongly influenced by the language. Arabic is gaining significant attention in many studies because the Arabic language is very different from others, and, in contrast to other languages, tools and research on the Arabic language are still lacking. The information extracted using the knowledge dictionary is a concept of expression. A knowledge dictionary is usually constructed manually by an expert; this takes a long time and is specific to one problem only. This paper proposes a method for automatically building a knowledge dictionary. The knowledge dictionary is formed by clustering sentences having the same concept, assuming that they will have a high similarity value. The concepts that have been extracted can be used as features for subsequent computational processes such as classification or categorization. The dataset used in this paper was an Arabic text dataset. The extraction result was tested using a decision tree classification engine; the highest precision value obtained was 71.0% and the highest recall value was 75.0%.
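    The clustering idea described above (sentences with a high similarity value are grouped into one concept) can be sketched as follows; this is a generic TF-IDF/cosine-similarity stand-in with English example sentences and an assumed threshold, not the paper's Arabic pipeline.

```python
# Hedged sketch: grouping highly similar sentences as "concept" entries for a
# knowledge dictionary. English sentences are used for readability; the same
# pipeline applies to tokenised Arabic text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = [
    "sales of cars rose sharply this quarter",
    "car sales increased strongly",
    "the central bank raised interest rates",
    "interest rates were raised by the central bank",
]

vectors = TfidfVectorizer().fit_transform(sentences)
sim = cosine_similarity(vectors)

# Greedy grouping: put a sentence in an existing concept if it is similar
# enough (the threshold is an assumption) to that concept's first member.
threshold, concepts = 0.4, []
for i, _ in enumerate(sentences):
    for concept in concepts:
        if sim[i, concept[0]] >= threshold:
            concept.append(i)
            break
    else:
        concepts.append([i])

for cid, members in enumerate(concepts):
    print(f"concept {cid}:", [sentences[i] for i in members])
```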

  11. Extraction of Information of Audio-Visual Contents

    Directory of Open Access Journals (Sweden)

    Carlos Aguilar

    2011-10-01

    Full Text Available In this article we show how it is possible to use Channel Theory (Barwise and Seligman, 1997) for modeling the process of information extraction realized by audiences of audio-visual contents. To do this, we rely on the concepts proposed by Channel Theory and, especially, its treatment of representational systems. We then show how the information that an agent is capable of extracting from the content depends on the number of channels he is able to establish between the content and the set of classifications he is able to discriminate. The agent can attempt to extract information through these channels from the totality of the content; however, we discuss the advantages of extracting from its constituents in order to obtain a greater number of informational items that represent it. After showing how the extraction process is carried out for each channel, we propose a method for representing all the informative values an agent can obtain from a content, using a matrix constituted by the channels the agent is able to establish on the content (source classifications) and the ones he can understand as individual (destination classifications). We finally show how this representation allows reflecting the evolution of the informative items through the evolution of the audio-visual content.

  12. MedTime: a temporal information extraction system for clinical narratives.

    Science.gov (United States)

    Lin, Yu-Kai; Chen, Hsinchun; Brown, Randall A

    2013-12-01

    Temporal information extraction from clinical narratives is of critical importance to many clinical applications. We participated in the EVENT/TIMEX3 track of the 2012 i2b2 clinical temporal relations challenge, and presented our temporal information extraction system, MedTime. MedTime comprises a cascade of rule-based and machine-learning pattern recognition procedures. It achieved a micro-averaged f-measure of 0.88 in both the recognitions of clinical events and temporal expressions. We proposed and evaluated three time normalization strategies to normalize relative time expressions in clinical texts. The accuracy was 0.68 in normalizing temporal expressions of dates, times, durations, and frequencies. This study demonstrates and evaluates the integration of rule-based and machine-learning-based approaches for high performance temporal information extraction from clinical narratives. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Scenario Customization for Information Extraction

    National Research Council Canada - National Science Library

    Yangarber, Roman

    2001-01-01

    Information Extraction (IE) is an emerging NLP technology, whose function is to process unstructured, natural language text, to locate specific pieces of information, or facts, in the text, and to use these facts to fill a database...

  14. Transductive Pattern Learning for Information Extraction

    National Research Council Canada - National Science Library

    McLernon, Brian; Kushmerick, Nicholas

    2006-01-01

    .... We present TPLEX, a semi-supervised learning algorithm for information extraction that can acquire extraction patterns from a small amount of labelled text in conjunction with a large amount of unlabelled text...

  15. Caesium extraction from acidic high level liquid wastes with functionalized calixarenes

    International Nuclear Information System (INIS)

    Simon, N.; Eymard, S.; Tournois, B.; Dozol, J.F.

    2000-01-01

    In the framework of the French law programme, studies are under way to selectively remove caesium from acidic high-activity wastes. Calix[4]arene crown derivatives exhibit outstanding efficiency and selectivity for caesium. An optimisation of the formulation of a selective extractant system for Cs, based on crown calixarenes and usable in a liquid-liquid extraction process, is presented. A system involving a monoamide as a modifier is proposed. Besides these improvements, a reference solvent based on a standard 1,3-di-(n-octyl-oxy)2,4-calix(4)arene crown is studied. Flow-sheets related to this system are calculated and are easily transferable to the optimised new system. (authors)

  16. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Full Text Available Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV) rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP) tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source language in Machine Translation settings). Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages). When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
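    The string-distance part of the approach can be sketched as below; the full method additionally relies on word alignment and translation/transliteration probabilities, which are not reproduced, and the name variants shown are illustrative.

```python
# Hedged sketch: clustering name-spelling variants by edit distance and
# normalising each cluster to a canonical form.

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

variants = ["Mohammed", "Muhammad", "Mohamad", "Qaddafi", "Gaddafi", "Kadhafi"]

clusters = []
for name in variants:
    for cluster in clusters:
        if min(edit_distance(name.lower(), m.lower()) for m in cluster) <= 2:
            cluster.append(name)
            break
    else:
        clusters.append([name])

# Normalise each cluster to a canonical form (here simply its first member).
for cluster in clusters:
    print(f"{cluster} -> {cluster[0]}")
```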

  17. Spent nuclear fuel project high-level information management plan

    Energy Technology Data Exchange (ETDEWEB)

    Main, G.C.

    1996-09-13

    This document presents the results of the Spent Nuclear Fuel Project (SNFP) Information Management Planning Project (IMPP), a short-term project that identified information management (IM) issues and opportunities within the SNFP and outlined a high-level plan to address them. This high-level plan for SNFP IM focuses on specific examples from within the SNFP. The plan's recommendations can be characterized in several ways. Some recommendations address specific challenges that the SNFP faces. Others form the basis for making smooth transitions in several important IM areas. Still others identify areas where further study and planning are indicated. The team's knowledge of developments in the IM industry and at the Hanford Site was crucial in deciding where to recommend that the SNFP act and where they should wait for Site plans to be made. Because of the fast pace of the SNFP and demands on SNFP staff, input and interaction were primarily between the IMPP team and members of the SNFP Information Management Steering Committee (IMSC). Key input to the IMPP came from a workshop where IMSC members and their delegates developed a set of draft IM principles. These principles, described in Section 2, became the foundation for the recommendations found in the transition plan outlined in Section 5. Availability of SNFP staff was limited, so project documents were used as a basis for much of the work. The team, realizing that the status of the project and the environment are continually changing, tried to keep abreast of major developments since those documents were generated. To the extent possible, the information contained in this document is current as of the end of fiscal year (FY) 1995. Programs and organizations on the Hanford Site as a whole are trying to maximize their return on IM investments. They are coordinating IM activities and trying to leverage existing capabilities. However, the SNFP cannot just rely on Sitewide activities to meet its IM requirements.

  18. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of Fourier information from OAS (optical aperture synthesis) objects. Translation-invariant wavelet denoising, based on Donoho wavelet soft-threshold denoising, is investigated to remove pseudo-Gibbs artifacts from wavelet soft-threshold images. Extraction of OAS object information based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of object information extraction from the interferogram, and that information extraction with translation-invariant wavelet denoising is better than with soft-threshold wavelet denoising alone.
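    A hedged sketch of the underlying technique follows: Donoho-style soft-threshold wavelet denoising of a 1-D signal, with translation invariance approximated by cycle spinning (averaging over circular shifts). This illustrates the general method, not the paper's exact interferogram processing.

```python
# Hedged sketch: soft-threshold wavelet denoising with cycle spinning.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

def soft_threshold_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thr = sigma * np.sqrt(2 * np.log(x.size))             # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: x.size]

# Cycle spinning: denoise shifted copies and average the un-shifted results.
shifts = range(8)
denoised = np.mean(
    [np.roll(soft_threshold_denoise(np.roll(noisy, s)), -s) for s in shifts], axis=0)

print("rms error, noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)))
print("rms error, denoised:", np.sqrt(np.mean((denoised - clean) ** 2)))
```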

  19. Separation of aromatic precipitates from simulated high level radioactive waste by hydrolysis, evaporation and liquid-liquid extraction

    International Nuclear Information System (INIS)

    Young, S.R.; Shah, H.B.; Carter, J.T.

    1991-01-01

    The Defense Waste Processing Facility (DWPF) at the SRS will be the United States' first facility to process high-level radioactive waste (HLW) into a borosilicate glass matrix. The removal of aromatic precipitates by hydrolysis, evaporation and liquid-liquid extraction will be a key step in the processing of the HLW. This step, titled the Precipitate Hydrolysis Process, has been demonstrated by the Savannah River Laboratory with the Precipitate Hydrolysis Experimental Facility (PHEF). The mission of the PHEF is to demonstrate processing of simulated high-level radioactive waste which contains tetraphenylborate precipitates and nitrite. Reduction of nitrite by hydroxylamine nitrate and hydrolysis of the tetraphenylborate by formic acid are discussed. Gaseous production, which is primarily benzene, nitrous oxide and carbon dioxide, has been quantified. Production of high-boiling organic compounds and the accumulation of these organic compounds within the process are addressed

  20. Cause Information Extraction from Financial Articles Concerning Business Performance

    Science.gov (United States)

    Sakai, Hiroyuki; Masuyama, Shigeru

    We propose a method of extracting cause information from Japanese financial articles concerning business performance. Our method acquires cause information, e.g. “zidousya no uriage ga koutyou” (Sales of cars were good). Cause information is useful for investors in selecting companies in which to invest. Our method extracts cause information in the form of causal expressions by automatically using statistical information and initial clue expressions. Our method can extract causal expressions without predetermined patterns or complex hand-crafted rules, and is expected to be applicable to other tasks for acquiring phrases that have a particular meaning, not limited to cause information. We compared our method with our previous one, originally proposed for extracting phrases concerning traffic accident causes, and experimental results showed that our new method outperforms the previous one.

  1. Sample-based XPath Ranking for Web Information Extraction

    NARCIS (Netherlands)

    Jundt, Oliver; van Keulen, Maurice

    Web information extraction typically relies on a wrapper, i.e., program code or a configuration that specifies how to extract some information from web pages at a specific website. Manually creating and maintaining wrappers is a cumbersome and error-prone task. It may even be prohibitive as some
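    For readers unfamiliar with wrappers, the sketch below shows what a minimal XPath-based wrapper looks like for a hypothetical page layout; the paper's contribution, ranking candidate XPaths from samples, is not reproduced here.

```python
# Hedged sketch of a minimal XPath-based wrapper: the extraction rules are
# hand-written XPath expressions for a made-up page structure.
from lxml import html

page = """
<html><body>
  <div class="product"><span class="name">Widget A</span><span class="price">9.99</span></div>
  <div class="product"><span class="name">Widget B</span><span class="price">14.50</span></div>
</body></html>
"""

tree = html.fromstring(page)
# The wrapper: one XPath per field of interest.
names = tree.xpath('//div[@class="product"]/span[@class="name"]/text()')
prices = tree.xpath('//div[@class="product"]/span[@class="price"]/text()')

for name, price in zip(names, prices):
    print(name, "->", price)
```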

  2. Ontology-Based Information Extraction for Business Intelligence

    Science.gov (United States)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  3. Site characterization information needs for a high-level waste geologic repository

    International Nuclear Information System (INIS)

    Gupta, D.C.; Nataraja, M.S.; Justus, P.S.

    1987-01-01

    At each of the three candidate sites recommended for site characterization for High-Level Waste Geologic Repository development, the DOE has proposed to conduct both surface-based testing and in situ exploration and testing at the depths at which wastes would be emplaced. The basic information needs, and consequently the planned surface-based and in situ testing program, will be governed to a large extent by the amount of credit taken for individual components of the geologic repository in meeting the performance objectives and siting criteria. Therefore, the information identified to be acquired from site characterization activities should be commensurate with DOE's assigned performance goals for the repository system components on a site-specific basis. Because of the uncertainties that are likely to be associated with the initial assignment of performance goals, the information needs should be both reasonably and conservatively identified

  4. Information analysis of iris biometrics for the needs of cryptology key extraction

    Directory of Open Access Journals (Sweden)

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptology key. Estimates of local entropy and mutual information were used to identify the segments of the iris most suitable for this purpose. To optimize parameters, the corresponding wavelet transforms were tuned to obtain the highest possible entropy and the lowest mutual information in the transform domain, which set the framework for the synthesis of systems for the extraction of truly random sequences from iris biometrics without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
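    A minimal sketch of the kind of local-entropy scoring mentioned above, applied to segments of a synthetic normalised iris strip (the mutual-information part and the wavelet-domain optimisation are not reproduced):

```python
# Hedged sketch: Shannon entropy of grey-level distributions in iris-image
# segments, used to rank regions by information content. The "iris" here is a
# synthetic random image, not real biometric data.
import numpy as np

def shannon_entropy(block, bins=256):
    """Entropy (bits) of the grey-level histogram of an image block."""
    hist, _ = np.histogram(block, bins=bins, range=(0, 256), density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
iris = rng.integers(0, 256, size=(128, 512))   # stand-in normalised iris strip

# Split the strip into angular segments and score each by local entropy.
segments = np.array_split(iris, 8, axis=1)
for i, seg in enumerate(segments):
    print(f"segment {i}: entropy ≈ {shannon_entropy(seg):.2f} bits")
```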

  5. A Two-Step Resume Information Extraction Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Chen

    2018-01-01

    Full Text Available With the rapid growth of Internet-based recruiting, there are a great number of personal resumes among recruiting systems. To gain more attention from recruiters, most resumes are written in diverse formats, including varying font sizes, font colours, and table cells. However, this diversity of format is harmful to data mining tasks such as resume information extraction, automatic job matching, and candidate ranking. Supervised methods and rule-based methods have been proposed to extract facts from resumes, but they rely strongly on hierarchical structure information and large amounts of labelled data, which are hard to collect in reality. In this paper, we propose a two-step resume information extraction approach. In the first step, the raw text of a resume is segmented into different resume blocks. To achieve this, we design a novel feature, Writing Style, to model sentence syntax information. Besides word and punctuation indices, word lexical attributes and the prediction results of classifiers are included in Writing Style. In the second step, multiple classifiers are employed to identify the different attributes of factual information in resumes. Experimental results on a real-world dataset show that the algorithm is feasible and effective.

  6. The Agent of extracting Internet Information with Lead Order

    Science.gov (United States)

    Mo, Zan; Huang, Chuliang; Liu, Aijun

    In order to carry out e-commerce better, advanced technologies for accessing business information are urgently needed. An agent is described to deal with the problems of extracting Internet information that are caused by the non-standard and skimble-scamble structure of Chinese websites. The agent comprises three modules that correspond to the separate stages of the extraction process. An HTTP-tree method and a Lead algorithm are proposed to generate a lead order, with which the required web pages can be retrieved easily. How to transform the extracted information into a structured form expressed in natural language is also discussed.

  7. RESEARCH ON REMOTE SENSING GEOLOGICAL INFORMATION EXTRACTION BASED ON OBJECT ORIENTED CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    H. Gao

    2018-04-01

    Full Text Available Northern Tibet belongs to the sub-cold arid climate zone of the plateau. It is rarely visited by people and the geological working conditions are very poor; however, the stratum exposures are good and human interference is very small. Therefore, research on the automatic classification and extraction of remote sensing geological information has particular significance and good application prospects. Based on object-oriented classification in northern Tibet, using WorldView-2 high-resolution remote sensing data combined with tectonic information and image enhancement, the lithological spectral features, shape features, spatial locations and topological relations of various types of geological information are mined. By setting thresholds within a hierarchical classification, eight kinds of geological information were classified and extracted. Compared with existing geological maps, the accuracy analysis shows that the overall accuracy reached 87.8561%, indicating that the object-oriented classification method is effective and feasible for this study area and provides a new idea for the automatic extraction of remote sensing geological information.

  8. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  9. 40 CFR 227.30 - High-level radioactive waste.

    Science.gov (United States)

    2010-07-01

    High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste from...

  10. Optimized ultra-high-pressure-assisted extraction of procyanidins from lychee pericarp improves the antioxidant activity of extracts.

    Science.gov (United States)

    Zhang, Ruifen; Su, Dongxiao; Hou, Fangli; Liu, Lei; Huang, Fei; Dong, Lihong; Deng, Yuanyuan; Zhang, Yan; Wei, Zhencheng; Zhang, Mingwei

    2017-08-01

    To establish optimal ultra-high-pressure (UHP)-assisted extraction conditions for procyanidins from lychee pericarp, a response surface analysis method with four factors and three levels was adopted. The optimum conditions were as follows: 295 MPa pressure, 13 min pressure holding time, 16.0 mL/g liquid-to-solid ratio, and 70% ethanol concentration. Compared with conventional ethanol extraction and ultrasonic-assisted extraction methods, the yields of the total procyanidins, flavonoids, and phenolics extracted using the UHP process were significantly increased; consequently, the oxygen radical absorbance capacity and cellular antioxidant activity of UHP-assisted lychee pericarp extracts were substantially enhanced. LC-MS/MS and high-performance liquid chromatography quantification results for individual phenolic compounds revealed that the yield of procyanidin compounds, including epicatechin, procyanidin A2, and procyanidin B2, from lychee pericarp could be significantly improved by the UHP-assisted extraction process. This UHP-assisted extraction process is thus a practical method for the extraction of procyanidins from lychee pericarp.
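    The response-surface step described above can be sketched generically as fitting a second-order polynomial to coded factor levels and maximising it; the data below are synthetic placeholders, not the study's measurements.

```python
# Hedged sketch: fitting a second-order response-surface model to
# (pressure, time, liquid-to-solid ratio, ethanol) -> yield, on coded levels.
import numpy as np

rng = np.random.default_rng(0)
# Coded factor levels (-1, 0, +1) for a 3-level, 4-factor design; fake yields.
X = rng.choice([-1.0, 0.0, 1.0], size=(29, 4))
y = 10 - (X ** 2).sum(axis=1) + 0.5 * X[:, 0] + rng.normal(0, 0.1, 29)

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared and pairwise-interaction terms."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] + [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

A = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The fitted surface can then be maximised (here on a coarse grid) to locate
# the optimum extraction conditions in coded units.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 11)] * 4)).reshape(4, -1).T
best = grid[np.argmax(quadratic_design_matrix(grid) @ coef)]
print("optimum (coded units):", best)
```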

  11. KneeTex: an ontology-driven system for information extraction from MRI reports.

    Science.gov (United States)

    Spasić, Irena; Zhao, Bo; Jones, Christopher B; Button, Kate

    2015-01-01

    In the realm of knee pathology, magnetic resonance imaging (MRI) has the advantage of visualising all structures within the knee joint, which makes it a valuable tool for increasing diagnostic accuracy and planning surgical treatments. Therefore, clinical narratives found in MRI reports convey valuable diagnostic information. A range of studies have proven the feasibility of natural language processing for information extraction from clinical narratives. However, no study focused specifically on MRI reports in relation to knee pathology, possibly due to the complexity of knee anatomy and a wide range of conditions that may be associated with different anatomical entities. In this paper we describe KneeTex, an information extraction system that operates in this domain. As an ontology-driven information extraction system, KneeTex makes active use of an ontology to strongly guide and constrain text analysis. We used automatic term recognition to facilitate the development of a domain-specific ontology with sufficient detail and coverage for text mining applications. In combination with the ontology, high regularity of the sublanguage used in knee MRI reports allowed us to model its processing by a set of sophisticated lexico-semantic rules with minimal syntactic analysis. The main processing steps involve named entity recognition combined with coordination, enumeration, ambiguity and co-reference resolution, followed by text segmentation. Ontology-based semantic typing is then used to drive the template filling process. We adopted an existing ontology, TRAK (Taxonomy for RehAbilitation of Knee conditions), for use within KneeTex. The original TRAK ontology expanded from 1,292 concepts, 1,720 synonyms and 518 relationship instances to 1,621 concepts, 2,550 synonyms and 560 relationship instances. This provided KneeTex with a very fine-grained lexico-semantic knowledge base, which is highly attuned to the given sublanguage. Information extraction results were evaluated

  12. From GPS tracks to context: Inference of high-level context information through spatial clustering

    OpenAIRE

    Moreira, Adriano; Santos, Maribel Yasmina

    2005-01-01

    Location-aware applications use the location of users to adapt their behaviour and to select the relevant information for users in a particular situation. This location information is obtained through a set of location sensors, or from network-based location services, and is often used directly, without any further processing, as a parameter in a selection process. In this paper we propose a method to infer high-level context information from a series of position records obtained from a GPS r...

  13. Improving IUE High Dispersion Extraction

    Science.gov (United States)

    Lawton, Patricia J.; VanSteenberg, M. E.; Massa, D.

    2007-01-01

    We present a different method to extract high dispersion International Ultraviolet Explorer (IUE) spectra from the New Spectral Image Processing System (NEWSIPS) geometrically and photometrically corrected (SIHI) images of the echellogram. The new algorithm corrects many of the deficiencies that exist in the NEWSIPS high dispersion (SIHI) spectra. Specifically, it does a much better job of accounting for the overlap of the higher echelle orders, it eliminates a significant time dependency in the extracted spectra (which can be traced to the background model used in the NEWSIPS extractions), and it can extract spectra from echellogram images that are more highly distorted than the NEWSIPS extraction routines can handle. Together, these improvements yield a set of IUE high dispersion spectra whose scientific integrity is significantly better than the NEWSIPS products. This work has been supported by NASA ADP grants.

  14. A research of road centerline extraction algorithm from high resolution remote sensing images

    Science.gov (United States)

    Zhang, Yushan; Xu, Tingfa

    2017-09-01

    Satellite remote sensing technology has become one of the most effective methods for land surface monitoring in recent years, due to its advantages such as short period, large scale and rich information. Meanwhile, road extraction is an important field in the applications of high resolution remote sensing images. An intelligent and automatic road extraction algorithm with high precision has great significance for transportation, road network updating and urban planning. Fuzzy c-means (FCM) clustering segmentation algorithms have been used in road extraction, but the traditional algorithms did not consider spatial information. An improved fuzzy c-means clustering algorithm combined with spatial information (SFCM) is proposed in this paper, which is proved to be effective for noisy image segmentation. Firstly, the image is segmented using the SFCM. Secondly, the segmentation result is processed by mathematical morphology to remove the joint regions. Thirdly, the road centerlines are extracted by morphological thinning and burr trimming. The average integrity of the centerline extraction algorithm is 97.98%, the average accuracy is 95.36% and the average quality is 93.59%. Experimental results show that the proposed method is effective for road centerline extraction.
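    A hedged sketch of the segment-then-thin pipeline follows: plain fuzzy c-means on pixel intensities (the paper's spatial-information term and burr trimming are omitted) followed by morphological thinning with scikit-image.

```python
# Hedged sketch: fuzzy c-means segmentation followed by morphological thinning
# to obtain centerlines, on a synthetic road-like image.
import numpy as np
from skimage.morphology import skeletonize

def fuzzy_c_means(x, c=2, m=2.0, iters=50, seed=0):
    """Basic FCM on a 1-D feature vector; returns memberships (n, c) and centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((x.size, c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-9
        inv = 1.0 / d ** (2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return u, centers

# Synthetic image: a bright road-like stripe on a dark background.
img = np.zeros((100, 100))
img[:, 45:55] = 1.0
img += np.random.default_rng(1).normal(0, 0.05, img.shape)

u, centers = fuzzy_c_means(img.ravel())
road_cluster = int(np.argmax(centers))                    # brighter cluster = road
road_mask = (u[:, road_cluster] > 0.5).reshape(img.shape)

centerline = skeletonize(road_mask)                       # morphological thinning
print("road pixels:", road_mask.sum(), "centerline pixels:", centerline.sum())
```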

  15. Preliminary study on the three-dimensional geoscience information system of high-level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Li Peinan; Zhu Hehua; Li Xiaojun; Wang Ju; Zhong Xia

    2010-01-01

    The 3D geoscience information system for high-level radioactive waste geological disposal is an important research direction in the current high-level radioactive waste disposal project; it serves as a platform for information integration and publishing that can support related research through the data and model interfaces it provides. Firstly, this paper introduces the basic features of the HLW disposal project and the functions and requirements of the system, which include the input module, the database management module, the function module, the maintenance module and the output module. Then, the framework of the high-level waste disposal project information system is studied, and the overall system architecture is proposed. Finally, based on a summary and analysis of database management, 3D modeling, spatial analysis, digital-numerical integration and visualization of underground projects, the implementation of the key functional modules and the platform is expounded, and the conclusion is drawn that a component-based software development method should be utilized in system development. (authors)

  16. Extracting information from multiplex networks

    Science.gov (United States)

    Iacovacci, Jacopo; Bianconi, Ginestra

    2016-06-01

    Multiplex networks are generalized network structures that are able to describe networks in which the same set of nodes are connected by links that have different connotations. Multiplex networks are ubiquitous since they describe social, financial, engineering, and biological networks. Extending our ability to analyze complex networks to multiplex network structures greatly increases the level of information that it is possible to extract from big data. For these reasons, characterizing the centrality of nodes in multiplex networks and finding new ways to solve challenging inference problems defined on multiplex networks are fundamental questions of network science. In this paper, we discuss the relevance of the Multiplex PageRank algorithm for measuring the centrality of nodes in multilayer networks and we characterize the utility of the recently introduced indicator function Θ̃^S for describing their mesoscale organization and community structure. As working examples for studying these measures, we consider three multiplex network datasets coming from social science.
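    As a hedged illustration only: one simple way to couple PageRank across layers is to let the centrality computed in one layer bias the teleportation vector of the walk on another layer. This is not the paper's exact Multiplex PageRank formulation or its indicator function; the two-layer toy network below is made up.

```python
# Hedged sketch: layer-A PageRank biases the teleportation distribution of the
# PageRank computation on layer B of a toy two-layer network.
import networkx as nx

nodes = range(8)
layer_a = nx.gnp_random_graph(8, 0.35, seed=1, directed=True)   # e.g. "follows"
layer_b = nx.gnp_random_graph(8, 0.35, seed=2, directed=True)   # e.g. "mentions"

# Centrality in layer A.
pr_a = nx.pagerank(layer_a, alpha=0.85)

# Layer-A centrality used as the personalization (teleportation) vector in layer B.
pr_b_multiplex = nx.pagerank(layer_b, alpha=0.85, personalization=pr_a)
pr_b_single = nx.pagerank(layer_b, alpha=0.85)

for n in nodes:
    print(f"node {n}: single-layer {pr_b_single[n]:.3f}  multiplex-biased {pr_b_multiplex[n]:.3f}")
```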

  17. Technetium Chemistry in High-Level Waste

    International Nuclear Information System (INIS)

    Hess, Nancy J.

    2006-01-01

    Tc contamination is found within the DOE complex at those sites whose mission involved extraction of plutonium from irradiated uranium fuel or isotopic enrichment of uranium. At the Hanford Site, chemical separations and extraction processes generated large amounts of high level and transuranic wastes that are currently stored in underground tanks. The waste from these extraction processes is currently stored in underground High Level Waste (HLW) tanks. However, the chemistry of the HLW in any given tank is greatly complicated by repeated efforts to reduce volume and recover isotopes. These processes ultimately resulted in mixing of waste streams from different processes. As a result, the chemistry and the fate of Tc in HLW tanks are not well understood. This lack of understanding has been made evident in the failed efforts to leach Tc from sludge and to remove Tc from supernatants prior to immobilization. Although recent interest in Tc chemistry has shifted from pretreatment chemistry to waste residuals, both needs are served by a fundamental understanding of Tc chemistry

  18. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  19. Comparison Pore Aggregate Levels After Extraction With Solvents Pertamax Plus And Gasoline

    Science.gov (United States)

    Anggraini, Muthia

    2017-12-01

    Loss of asphalt content in extraction results is a problem in field work for the implementing parties. A solvent with a high octane rating (Pertamax Plus) dissolves more asphalt during extraction than gasoline does. Comparing aggregate pore levels after extraction with Pertamax Plus and with gasoline can confirm that Pertamax Plus dissolves more bitumen than gasoline. This study aims to compare the pore levels of the AC-WC aggregate mix after extraction with Pertamax Plus and with gasoline. The study uses aggregate extracted from asphalt mixtures at production, at the finisher, and after field compaction. The method used is the assay of coarse and fine aggregate pores, with extraction of the bitumen content to separate the aggregate from the bitumen. Total absorption after extraction with Pertamax Plus was 0.80% for the production mixture, versus 0.67% with gasoline, a deviation of 0.13%. At the finisher, the values were 0.77% with Pertamax Plus and 0.67% with gasoline, a deviation of 0.10%. For core samples after compaction, the values were 0.71% with Pertamax Plus and 0.60% with gasoline, a deviation of 0.11%. Total water absorption after extraction is thus greater with Pertamax Plus than with gasoline, indicating that Pertamax Plus dissolves more asphalt than gasoline.

  20. Development of Effective Solvent Modifiers for the Solvent Extraction of Cesium from Alkaline High-Level Tank Waste

    International Nuclear Information System (INIS)

    Bonnesen, Peter V.; Delmau, Laetitia H.; Moyer, Bruce A.; Lumetta, Gregg J.

    2003-01-01

    A series of novel alkylphenoxy fluorinated alcohols were prepared and investigated for their effectiveness as modifiers in solvents containing calix(4)arene-bis-(tert-octylbenzo)-crown-6 for extracting cesium from alkaline nitrate media. A modifier that contained a terminal 1,1,2,2-tetrafluoroethoxy group was found to decompose following long-term exposure to warm alkaline solutions. However, replacement of the tetrafluoroethoxy group with a 2,2,3,3-tetrafluoropropoxy group led to a series of modifiers that possessed the alkaline stability required for a solvent extraction process. Within this series of modifiers, the structure of the alkyl substituent (tert-octyl, tert-butyl, tert-amyl, and sec-butyl) of the alkylphenoxy moiety was found to have a profound impact on the phase behavior of the solvent in liquid-liquid contacting experiments, and hence on the overall suitability of the modifier for a solvent extraction process. The sec-butyl derivative (1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol) (Cs-7SB) was found to possess the best overall balance of properties with respect to third-phase and coalescence behavior, cleanup following degradation, resistance to solids formation, and cesium distribution behavior. Accordingly, this modifier was selected for use as a component of the solvent employed in the Caustic-Side Solvent Extraction (CSSX) process for removing cesium from high level nuclear waste (HLW) at the U.S. Department of Energy's (DOE) Savannah River Site. In batch equilibrium experiments, this solvent has also been successfully shown to extract cesium from both simulated and actual solutions generated from caustic leaching of HLW tank sludge stored in tank B-110 at the DOE's Hanford Site.

  1. High level cognitive information processing in neural networks

    Science.gov (United States)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  2. Highway extraction from high resolution aerial photography using a geometric active contour model

    Science.gov (United States)

    Niu, Xutong

    Highway extraction and vehicle detection are two of the most important steps in traffic-flow analysis from multi-frame aerial photographs. The traditional method of deriving traffic flow trajectories relies on manual vehicle counting from a sequence of aerial photographs, which is tedious and time-consuming. This research presents a new framework for semi-automatic highway extraction. The basis of the new framework is an improved geometric active contour (GAC) model. This novel model seeks to minimize an objective function that transforms a problem of propagation of regular curves into an optimization problem. The implementation of curve propagation is based on level set theory. By using an implicit representation of a two-dimensional curve, a level set approach can be used to deal with topological changes naturally, and the output is unaffected by different initial positions of the curve. However, the original GAC model, on which the new model is based, only incorporates boundary information into the curve propagation process. An error-producing phenomenon called leakage is inevitable wherever there is an uncertain weak edge. In this research, region-based information is added as a constraint into the original GAC model, thereby, giving this proposed method the ability of integrating both boundary and region-based information during the curve propagation. Adding the region-based constraint eliminates the leakage problem. This dissertation applies the proposed augmented GAC model to the problem of highway extraction from high-resolution aerial photography. First, an optimized stopping criterion is designed and used in the implementation of the GAC model. It effectively saves processing time and computations. Second, a seed point propagation framework is designed and implemented. This framework incorporates highway extraction, tracking, and linking into one procedure. A seed point is usually placed at an end node of highway segments close to the boundary of the
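    A hedged sketch of the level-set machinery underlying geometric active contours, using scikit-image's morphological geodesic active contour on a synthetic bright stripe; the dissertation's region-based constraint and seed-point propagation framework are not reproduced.

```python
# Hedged sketch: geodesic active contour (level-set) segmentation of a
# synthetic bright "road" stripe with scikit-image.
import numpy as np
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

# Synthetic aerial-like image: bright highway stripe on a darker background.
rng = np.random.default_rng(0)
image = rng.normal(0.3, 0.05, (120, 120))
image[:, 50:70] += 0.5

# Edge-stopping image: small values near strong gradients (object boundaries).
gimage = inverse_gaussian_gradient(image, alpha=100.0, sigma=2.0)

# Initial level set: a rectangle roughly around the region of interest.
init = np.zeros(image.shape, dtype=np.int8)
init[10:110, 40:80] = 1

# Evolve the contour; it should lock onto the stripe's edges.
segmentation = morphological_geodesic_active_contour(
    gimage, 200, init_level_set=init, smoothing=1, balloon=-1)

print("segmented pixels:", int(segmentation.sum()))
```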

  3. Point Cloud Classification of Tesserae from Terrestrial Laser Data Combined with Dense Image Matching for Archaeological Information Extraction

    Science.gov (United States)

    Poux, F.; Neuville, R.; Billen, R.

    2017-08-01

    Reasoning from information extraction given by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including the sensor's biased data, each tessera in the high-density point cloud from the 3D captured complex mosaics of Germigny-des-Prés (France) is segmented via a colour multi-scale abstraction-based feature extraction exploiting connectivity. A 2D surface and outline polygon of each tessera is generated by RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
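    The per-tessera plane-and-outline step described above can be sketched as follows; a plain least-squares plane fit stands in for the RANSAC extraction, and the point cluster is synthetic.

```python
# Hedged sketch: least-squares plane fitting and 2-D convex-hull outlining for
# a point cluster, the kind of per-tessera step described above.
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic "tessera": noisy points on the plane z = 0.2x - 0.1y + 5.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, (200, 2))
z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + 5 + rng.normal(0, 0.002, 200)
points = np.column_stack([xy, z])

# 1) Fit z = a*x + b*y + c by least squares (a robust RANSAC loop could wrap this).
A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
(a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
print(f"fitted plane: z = {a:.3f}x + {b:.3f}y + {c:.3f}")

# 2) Outline polygon: convex hull of the points projected onto the x-y plane.
hull = ConvexHull(points[:, :2])
outline = points[hull.vertices, :2]
print("outline vertices:", len(outline), "area:", hull.volume)  # 2-D hull: volume == area
```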

  4. A novel quantum information hiding protocol based on entanglement swapping of high-level Bell states

    International Nuclear Information System (INIS)

    Xu Shu-Jiang; Wang Lian-Hai; Chen Xiu-Bo; Niu Xin-Xin; Yang Yi-Xian

    2015-01-01

    Using entanglement swapping of high-level Bell states, we first derive a covert layer between the secret message and the possible output results of the entanglement swapping between any two generalized Bell states, and then propose a novel high-efficiency quantum information hiding protocol based on the covert layer. In the proposed scheme, a covert channel can be built up under the cover of a high-level quantum secure direct communication (QSDC) channel for securely transmitting secret messages without consuming any auxiliary quantum state or any extra communication resource. It is shown that this protocol not only has a high embedding efficiency but also achieves good imperceptibility as well as high security. (paper)

  5. Partitioning of actinide from simulated high level wastes arising from reprocessing of PHWR fuels: counter current extraction studies using CMPO

    International Nuclear Information System (INIS)

    Deshingkar, D.S.; Chitnis, R.R.; Wattal, P.K.; Theyyunni, T.K.; Nair, M.K.T.; Ramanujam, A.; Dhami, P.S.; Gopalakrishnan, V.; Rao, M.K.; Mathur, J.N.; Murali, M.S.; Iyer, R.H.; Badheka, L.P.; Banerji, A.

    1994-01-01

    High level wastes (HLW) arising from reprocessing of pressurised heavy water reactor (PHWR) fuels contain actinides like neptunium, americium and cerium which are not extracted in the Purex process. They also contain small quantities of uranium and plutonium in addition to fission products. Removal of these actinides prior to vitrification of HLW can effectively reduce the active surveillance period of final waste form. Counter current studies using indigenously synthesised octyl (phenyl)-N, N-diisobutylcarbamoylmethylphosphine oxide (CMPO) were taken up as a follow-up of successful runs with simulated sulphate bearing low acid HLW solutions. The simulated HLW arising from reprocessing of PHWR fuel was prepared based on presumed burnup of 6500 MWd/Te of uranium, 3 years cooling period and 800 litres of waste generation per tonne of fuel reprocessed. The alpha activity of the HLW raffinate after extraction with the CMPO-TBP mixture could be brought down to near background level. (author). 13 refs., 2 tabs., 12 figs

  6. Partitioning of actinide from simulated high level wastes arising from reprocessing of PHWR fuels: counter current extraction studies using CMPO

    Energy Technology Data Exchange (ETDEWEB)

    Deshingkar, D S; Chitnis, R R; Wattal, P K; Theyyunni, T K; Nair, M K.T. [Bhabha Atomic Research Centre, Bombay (India). Process Engineering and Systems Development Div.; Ramanujam, A; Dhami, P S; Gopalakrishnan, V; Rao, M K [Bhabha Atomic Research Centre, Bombay (India). Fuel Reprocessing Group; Mathur, J N; Murali, M S; Iyer, R H [Bhabha Atomic Research Centre, Bombay (India). Radiochemistry Div.; Badheka, L P; Banerji, A [Bhabha Atomic Research Centre, Bombay (India). Bio-organic Div.

    1994-12-31

    High level wastes (HLW) arising from reprocessing of pressurised heavy water reactor (PHWR) fuels contain actinides like neptunium, americium and cerium which are not extracted in the Purex process. They also contain small quantities of uranium and plutonium in addition to fission products. Removal of these actinides prior to vitrification of HLW can effectively reduce the active surveillance period of final waste form. Counter current studies using indigenously synthesised octyl (phenyl)-N, N-diisobutylcarbamoylmethylphosphine oxide (CMPO) were taken up as a follow-up of successful runs with simulated sulphate bearing low acid HLW solutions. The simulated HLW arising from reprocessing of PHWR fuel was prepared based on presumed burnup of 6500 MWd/Te of uranium, 3 years cooling period and 800 litres of waste generation per tonne of fuel reprocessed. The alpha activity of the HLW raffinate after extraction with the CMPO-TBP mixture could be brought down to near background level. (author). 13 refs., 2 tabs., 12 figs.

  7. Extracting Information from Multimedia Meeting Collections

    OpenAIRE

    Gatica-Perez, Daniel; Zhang, Dong; Bengio, Samy

    2005-01-01

    Multimedia meeting collections, composed of unedited audio and video streams, handwritten notes, slides, and electronic documents that jointly constitute a raw record of complex human interaction processes in the workplace, have attracted interest due to the increasing feasibility of recording them in large quantities, by the opportunities for information access and retrieval applications derived from the automatic extraction of relevant meeting information, and by the challenges that the ext...

  8. The Application of Chinese High-Spatial Remote Sensing Satellite Image in Land Law Enforcement Information Extraction

    Science.gov (United States)

    Wang, N.; Yang, R.

    2018-04-01

    Chinese high-resolution (HR) remote sensing satellites have made a huge leap in the past decade. Commercial satellite datasets, such as GF-1, GF-2 and ZY-3 images, have emerged in recent years; their panchromatic (PAN) image resolutions are 2 m, 1 m and 2.1 m, and their multispectral (MS) image resolutions are 8 m, 4 m and 5.8 m, respectively. Chinese HR satellite imagery can be downloaded free of charge for public-welfare purposes. Local governments have begun to employ more professional technicians to improve traditional land management technology. This paper focuses on analysing the actual requirements of applications in government land law enforcement in Guangxi Autonomous Region. 66 counties in Guangxi Autonomous Region were selected for illegal land utilization spot extraction with fused Chinese HR images. The procedure contains: A. Defining illegal land utilization spot types. B. Data collection: GF-1, GF-2, and ZY-3 datasets were acquired in the first half of 2016 and other auxiliary data were collected in 2015. C. Batch processing: HR images were collected for batch preprocessing through an ENVI/IDL tool. D. Illegal land utilization spot extraction by visual interpretation. E. Obtaining attribute data with the ArcGIS Geoprocessor (GP) model. F. Thematic mapping and surveying. By analysing the results for 42 counties, law enforcement officials found 1092 illegal land use spots and 16 suspected illegal mining spots. The results show that Chinese HR satellite images have great potential for feature information extraction and that the processing procedure is robust.

  9. Selective extraction of actinides from high level liquid wastes. Study of the possibilities offered by the Redox properties of actinides

    International Nuclear Information System (INIS)

    Adnet, J.M.

    1991-07-01

    Partitioning of high level liquid wastes coming from nuclear fuel reprocessing by the PUREX process consists in the elimination of minor actinides (Np, Am, and traces of Pu and U). Among the possible processes, the selective extraction of actinides with oxidation states higher than three is studied. The first part of this work deals with a preliminary step: the elimination of ruthenium from fission product solutions using electrovolatilization of the RuO4 compound. The second part of this work concerns the complexation and oxidation reactions of the elements U, Np, Pu and Am in the presence of a compound belonging to the unsaturated polyanion family: potassium phosphotungstate. For actinide ions in oxidation state (IV) complexed with the phosphotungstate anion, the extraction mechanism by dioctylamine was studied, and the use of an extraction chromatography technique permitted successful separations between tetravalent and trivalent actinides. Finally, in accordance with the results obtained, the basis of a separation scheme for the management of fission product solutions is proposed

  10. High construal level can help negotiators to reach integrative agreements: The role of information exchange and judgement accuracy.

    Science.gov (United States)

    Wening, Stefanie; Keith, Nina; Abele, Andrea E

    2016-06-01

    In negotiations, a focus on interests (why negotiators want something) is key to integrative agreements. Yet, many negotiators spontaneously focus on positions (what they want), with suboptimal outcomes. Our research applies construal-level theory to negotiations and proposes that a high construal level instigates a focus on interests during negotiations which, in turn, positively affects outcomes. In particular, we tested the notion that the effect of construal level on outcomes was mediated by information exchange and judgement accuracy. Finally, we expected the mere mode of presentation of task material to affect construal levels and manipulated construal levels using concrete versus abstract negotiation tasks. In two experiments, participants negotiated in dyads in either a high- or low-construal-level condition. In Study 1, high-construal-level dyads outperformed dyads in the low-construal-level condition; this main effect was mediated by information exchange. Study 2 replicated both the main and mediation effects using judgement accuracy as mediator and additionally yielded a positive effect of a high construal level on a second, more complex negotiation task. These results not only provide empirical evidence for the theoretically proposed link between construal levels and negotiation outcomes but also shed light on the processes underlying this effect. © 2015 The British Psychological Society.

  11. Collaboration, Automation, and Information Management at Hanford High Level Radioactive Waste (HLW) Tank Farms

    International Nuclear Information System (INIS)

    Aurah, Mirwaise Y.; Roberts, Mark A.

    2013-01-01

    Washington River Protection Solutions (WRPS), operator of High Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is taking an over 20-year leap in technology, replacing systems that were monitored with clipboards and obsolete computer systems, as well as solving major operations and maintenance hurdles in the area of process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed

  12. NAMED ENTITY RECOGNITION FROM BIOMEDICAL TEXT -AN INFORMATION EXTRACTION TASK

    Directory of Open Access Journals (Sweden)

    N. Kanya

    2016-07-01

    Full Text Available Biomedical Text Mining targets the extraction of significant information from biomedical archives. BioTM encompasses Information Retrieval (IR) and Information Extraction (IE). Information Retrieval retrieves the relevant biomedical literature documents from repositories such as PubMed and MedLine based on a search query. The IR process ends with the generation of a corpus of the relevant documents retrieved from the publication databases for the query. The IE task includes preprocessing of the documents, Named Entity Recognition (NER) from the documents, and relationship extraction. This process draws on natural language processing, data mining techniques and machine learning algorithms. The preprocessing task includes tokenization, stop-word removal, shallow parsing, and part-of-speech tagging. The NER phase involves recognition of well-defined objects such as genes, proteins or cell lines. This phase leads to the next phase, the extraction of relationships (IE). The work is based on the Conditional Random Fields (CRF) machine learning algorithm.
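
    As a concrete illustration of the CRF tagging step named above (not the authors' system), the sketch below trains a toy BIO tagger with the sklearn-crfsuite package; the token features, example sentence and labels are invented for the example.

        # Toy CRF-based biomedical NER sketch using sklearn-crfsuite.
        import sklearn_crfsuite

        def token_features(sent, i):
            w = sent[i]
            return {
                "lower": w.lower(),                      # surface form
                "suffix3": w[-3:],                       # crude morphology
                "is_upper": w.isupper(),
                "has_digit": any(c.isdigit() for c in w),
                "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
                "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
            }

        # One tokenised training sentence with BIO tags, for illustration only.
        sents = [["The", "p53", "protein", "binds", "DNA", "."]]
        tags = [["O", "B-PROTEIN", "O", "O", "B-DNA", "O"]]

        X = [[token_features(s, i) for i in range(len(s))] for s in sents]
        crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
        crf.fit(X, tags)
        print(crf.predict(X))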

  13. Information retrieval and terminology extraction in online resources for patients with diabetes.

    Science.gov (United States)

    Seljan, Sanja; Baretić, Maja; Kucis, Vlasta

    2014-06-01

    Terminology use, as a means for information retrieval or document indexing, plays an important role in health literacy. Specific types of users, i.e. patients with diabetes, need access to various online resources (in foreign and/or native languages) when searching for information on self-education in basic diabetic knowledge, on self-care activities regarding the importance of dietetic food, medications and physical exercise, and on self-management of insulin pumps. Automatic extraction of corpus-based terminology from online texts, manuals or professional papers can help in building terminology lists or lists of "browsing phrases" useful in information retrieval or in document indexing. Specific terminology lists represent an intermediate step between free-text search and controlled vocabulary, between users' demands and existing online resources in native and foreign languages. The research, aiming to detect the role of terminology in online resources, is conducted on English and Croatian manuals and Croatian online texts, and is divided into three interrelated parts: i) comparison of professional and popular terminology use; ii) evaluation of automatic statistically-based terminology extraction on English and Croatian texts; iii) comparison and evaluation of extracted terminology performed on an English manual using statistical and hybrid approaches. Extracted terminology candidates are evaluated by comparison with three types of reference lists: a list created by a professional medical person, a list of highly professional vocabulary contained in MeSH, and a list created by non-medical persons, made as the intersection of 15 lists. Results report on the use of popular and professional terminology in online diabetes resources, on the evaluation of automatically extracted terminology candidates in English and Croatian texts and on the comparison of statistical and hybrid extraction methods in English text. Evaluation of automatic and semi-automatic terminology extraction methods is performed by recall
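
    A minimal, purely frequency-based sketch of statistical term-candidate extraction in the spirit of the record above (the paper's actual statistical and hybrid methods are not reproduced); the stopword list and sample text are illustrative.

        # Rank n-gram term candidates by frequency, discarding candidates that
        # begin or end with a stopword; a crude stand-in for statistical
        # terminology extraction.
        from collections import Counter
        import re

        STOP = {"the", "a", "an", "of", "and", "or", "in", "on", "for", "with", "to", "is"}

        def candidate_terms(text, n_max=3):
            tokens = re.findall(r"[a-z]+", text.lower())
            counts = Counter()
            for n in range(1, n_max + 1):
                for i in range(len(tokens) - n + 1):
                    gram = tokens[i:i + n]
                    if gram[0] in STOP or gram[-1] in STOP:
                        continue
                    counts[" ".join(gram)] += 1
            return counts

        text = ("Patients with diabetes use an insulin pump. The insulin pump "
                "delivers basal insulin; blood glucose self-monitoring guides dosing.")
        for term, freq in candidate_terms(text).most_common(5):
            print(freq, term)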

  14. Comparison of clinical parameters and environmental noise levels between regular surgery and piezosurgery for extraction of impacted third molars.

    Science.gov (United States)

    Chang, Hao-Hueng; Lee, Ming-Shu; Hsu, You-Chyun; Tsai, Shang-Jye; Lin, Chun-Pin

    2015-10-01

    Impacted third molars can be extracted by regular surgery or piezosurgery. The aim of this study was to compare clinical parameters and device-produced noise levels between regular surgery and piezosurgery for the extraction of impacted third molars. Twenty patients (18 women and 2 men, 17-29 years of age) with bilateral symmetrical impacted mandibular or maxillary third molars of the same level were included in this randomized crossover clinical trial. The 40 impacted third molars were divided into a control group (n = 20), in which the third molar was extracted by regular surgery using a high-speed handpiece and an elevator, and an experimental group (n = 20), in which the third molar was extracted by piezosurgery using a high-speed handpiece and a piezotome. The clinical parameters were evaluated by a self-reported questionnaire. The noise levels produced by the high-speed handpiece and piezotome were measured and compared between the experimental and control groups. Patients in the experimental group had a better feeling about tooth extraction and force delivery during extraction and less facial swelling than patients in the control group. However, there were no significant differences between the control and experimental groups in noise-related disturbance, extraction period, degree of facial swelling, pain score, pain duration, or any of the noise levels produced by the devices under different circumstances during tooth extraction. The piezosurgery device produced noise levels similar to or lower than those of the high-speed drilling device. However, piezosurgery provides the advantage of increased patient comfort during extraction of impacted third molars. Copyright © 2014. Published by Elsevier B.V.

  15. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
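
    A rough OpenCV sketch of the edge-plus-Hough stage described above (the paper's improved Hough fitting, vanishing-point geometry and decision-tree classification are not reproduced); the input file name is hypothetical.

        # Detect candidate lane segments: Canny edges + probabilistic Hough transform.
        import cv2
        import numpy as np

        frame = cv2.imread("road_frame.jpg")               # hypothetical onboard video frame
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)

        # Probabilistic Hough transform returns line segments (x1, y1, x2, y2).
        segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                                   minLineLength=40, maxLineGap=20)
        if segments is not None:
            for x1, y1, x2, y2 in segments[:, 0]:
                cv2.line(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
        cv2.imwrite("lane_candidates.png", frame)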

  16. Comparison of solvent extraction and extraction chromatography resin techniques for uranium isotopic characterization in high-level radioactive waste and barrier materials.

    Science.gov (United States)

    Hurtado-Bermúdez, Santiago; Villa-Alfageme, María; Mas, José Luis; Alba, María Dolores

    2018-07-01

    The development of Deep Geological Repositories (DGP) for the storage of high-level radioactive waste (HLRW) is mainly focused on systems of multiple barriers based on the use of clays, and particularly bentonites, as natural and engineered barriers in nuclear waste isolation, due to their remarkable properties. Because uranium is the major component of HLRW, an in-depth analysis of the chemistry of the reaction of this element within bentonites is required. The determination of uranium under the conditions of HLRW, including the analysis of silicate matrices before and after the uranium-bentonite reaction, was investigated. The performances of a state-of-the-art and widespread radiochemical method based on chromatographic UTEVA resins, and of a well-known traditional method based on solvent extraction with tri-n-butyl phosphate (TBP), for the analysis of uranium and thorium isotopes in solid matrices with high concentrations of uranium were analysed in detail. In this comparison, both radiochemical approaches showed an overall excellent performance for analysing the uranium concentration in HLRW samples. However, due to the high uranium concentration in the samples, the chromatographic resin is not able to completely avoid uranium contamination of the thorium fraction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Integrating Information Extraction Agents into a Tourism Recommender System

    Science.gov (United States)

    Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente

    Recommender systems face some problems. On the one hand, the information needs to be kept up to date, which can be a costly task if it is not performed automatically. On the other hand, it may be interesting to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques in order to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.

  18. From remote sensing data about information extraction for 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements, aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. Considering the applied data and fields of application, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and secures the accuracy of the extracted information effectively? How can the workflow retain its efficiency if large amounts of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, taking the third dimension into consideration as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information as well as to effectively communicate results can be improved and successfully combined within one workflow. It is shown that (1

  19. Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction

    Science.gov (United States)

    Zang, Y.; Yang, B.

    2018-04-01

    3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve the problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the Just-Noticeable-Difference perception metric to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from objects.
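
    For illustration only, the sketch below scores point importance by local surface variation from a PCA of each point's neighbourhood and keeps the highest-scoring points; this is a simplified stand-in for the paper's radial-basis-function multi-scale construction and JND metric, and the point data are random.

        # Importance-based point cloud reduction via local surface variation.
        import numpy as np
        from scipy.spatial import cKDTree

        def surface_variation(points, k=16):
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)               # k nearest neighbours per point
            scores = np.empty(len(points))
            for i, nbrs in enumerate(idx):
                centred = points[nbrs] - points[nbrs].mean(axis=0)
                eigvals = np.linalg.eigvalsh(centred.T @ centred)   # ascending order
                scores[i] = eigvals[0] / max(eigvals.sum(), 1e-12)  # curvature-like score
            return scores

        points = np.random.rand(2000, 3)                     # stand-in for a laser scan
        keep = np.argsort(surface_variation(points))[-500:]  # retain the top 25%
        reduced = points[keep]
        print(reduced.shape)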

  20. OPTIMAL INFORMATION EXTRACTION OF LASER SCANNING DATASET BY SCALE-ADAPTIVE REDUCTION

    Directory of Open Access Journals (Sweden)

    Y. Zang

    2018-04-01

    Full Text Available 3D laser technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve the problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the Just-Noticeable-Difference perception metric to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from objects.

  1. Low-Level Color and Texture Feature Extraction of Coral Reef Components

    Directory of Open Access Journals (Sweden)

    Ma. Sheila Angeli Marcos

    2003-06-01

    Full Text Available The purpose of this study is to develop a computer-based classifier that automates coral reef assessment from digitized underwater video. We extract low-level color and texture features from coral images to serve as input to a high-level classifier. Low-level features for color were labeled blue, green, yellow/brown/orange, and gray/white, which are described by the normalized chromaticity histograms of these major colors. The color matching capability of these features was determined through a technique called “Histogram Backprojection”. The low-level texture feature marks a region as coarse or fine depending on the gray-level variance of the region.
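
    A minimal OpenCV sketch of the histogram backprojection technique named above, assuming hypothetical image files and a hue/saturation histogram in place of the paper's normalized chromaticity histograms.

        # Backproject the colour histogram of a sample patch onto a video frame.
        import cv2

        frame = cv2.imread("reef_frame.jpg")               # hypothetical video frame
        patch = cv2.imread("coral_patch.jpg")              # sample of one colour class

        frame_hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        patch_hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)

        # Model histogram over hue and saturation, then backproject onto the frame.
        hist = cv2.calcHist([patch_hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
        cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
        likelihood = cv2.calcBackProject([frame_hsv], [0, 1], hist, [0, 180, 0, 256], 1)
        cv2.imwrite("colour_likelihood.png", likelihood)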

  2. Lithium NLP: A System for Rich Information Extraction from Noisy User Generated Text on Social Media

    OpenAIRE

    Bhargava, Preeti; Spasojevic, Nemanja; Hu, Guoning

    2017-01-01

    In this paper, we describe the Lithium Natural Language Processing (NLP) system - a resource-constrained, high-throughput and language-agnostic system for information extraction from noisy user generated text on social media. Lithium NLP extracts a rich set of information including entities, topics, hashtags and sentiment from text. We discuss several real world applications of the system currently incorporated in Lithium products. We also compare our system with existing commercial and acad...

  3. Corn silk extract improves cholesterol metabolism in C57BL/6J mouse fed high-fat diets.

    Science.gov (United States)

    Cha, Jae Hoon; Kim, Sun Rim; Kang, Hyun Joong; Kim, Myung Hwan; Ha, Ae Wha; Kim, Woo Kyoung

    2016-10-01

    Corn silk (CS) extract contains large amounts of maysin, which is a major flavonoid in CS. However, studies regarding the effect of CS extract on cholesterol metabolism are limited. Therefore, the purpose of this study was to determine the effect of CS extract on cholesterol metabolism in C57BL/6J mice fed high-fat diets. The normal-fat group was fed a 7% fat diet, the high-fat (HF) group was fed a 25% fat diet, and the high-fat with corn silk (HFCS) group was fed the 25% fat diet and orally administered CS extract (100 mg/kg body weight) daily. Serum and hepatic levels of total lipids, triglycerides, and total cholesterol as well as serum free fatty acid, glucose, and insulin levels were determined. The mRNA expression levels of acyl-CoA:cholesterol acyltransferase (ACAT), cholesterol 7-alpha hydroxylase (CYP7A1), farnesoid X receptor (FXR), lecithin cholesterol acyltransferase (LCAT), low-density lipoprotein receptor, 3-hydroxy-3-methylglutaryl-coenzyme A reductase (HMG-CoA reductase), adiponectin, leptin, and tumor necrosis factor α were determined. Oral administration of CS extract with HF improved serum glucose and insulin levels and attenuated HF-induced fatty liver. CS extract significantly elevated mRNA expression levels of adipocytokines and reduced mRNA expression levels of HMG-CoA reductase, ACAT, and FXR. The mRNA expression levels of CYP7A1 and LCAT were not statistically different between the HF and HFCS groups. CS extract supplementation with a high-fat diet improves adipocytokine secretion and glucose homeostasis. CS extract is also effective in decreasing the regulatory pool of hepatic cholesterol, in line with decreased blood and hepatic levels of cholesterol through modulation of the mRNA expression levels of HMG-CoA reductase, ACAT, and FXR.

  4. Spmk and Grabcut Based Target Extraction from High Resolution Remote Sensing Images

    Science.gov (United States)

    Cui, Weihong; Wang, Guofeng; Feng, Chenyi; Zheng, Yiwei; Li, Jonathan; Zhang, Yi

    2016-06-01

    Target detection and extraction from high resolution remote sensing images is a basic and widely needed application. In this paper, to improve the efficiency of image interpretation, we propose a combined detection and segmentation method to realize semi-automatic target extraction. We introduce the dense transformed color scale invariant feature transform (TC-SIFT) descriptor and the histogram of oriented gradients (HOG) & HSV descriptor to characterize the spatial structure and color information of the targets. With the k-means clustering method, we get the bag of visual words, and then we adopt a three-level spatial pyramid (SP) to represent the target patch. After gathering many different kinds of target image patches from many high resolution UAV images, and using the TC-SIFT-SP and the multi-scale HOG & HSV features, we construct the SVM classifier to detect the target. In this paper, we take buildings as the targets. Experiment results show that the target detection accuracy for buildings can reach above 90%. The detection results are a series of rectangular regions containing the targets. We select these rectangular regions as foreground candidates and adopt a GrabCut-based, boundary-regularized, semi-automatic interactive segmentation algorithm to get the accurate boundary of the target. Experiment results show its accuracy and efficiency. It can be an effective way to extract certain special targets.
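
    A minimal sketch of the GrabCut refinement step (the SPMK/SVM detection stage that supplies the rectangle is not reproduced); the image path and rectangle are hypothetical.

        # Refine a detected bounding box into a pixel-accurate target mask with GrabCut.
        import cv2
        import numpy as np

        image = cv2.imread("uav_tile.jpg")                 # hypothetical image tile
        rect = (120, 80, 200, 150)                         # detector output: x, y, w, h

        mask = np.zeros(image.shape[:2], np.uint8)
        bgd_model = np.zeros((1, 65), np.float64)          # internal GMM state
        fgd_model = np.zeros((1, 65), np.float64)
        cv2.grabCut(image, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

        # Definite or probable foreground pixels form the extracted target.
        target = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0)
        cv2.imwrite("target_mask.png", target.astype("uint8"))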

  5. L1Track: A fast Level 1 track trigger for the ATLAS high luminosity upgrade

    International Nuclear Information System (INIS)

    Cerri, Alessandro

    2016-01-01

    With the planned high-luminosity upgrade of the LHC (HL-LHC), the ATLAS detector will see its collision rate increase by approximately a factor of 5 with respect to the current LHC operation. The earliest hardware-based ATLAS trigger stage (“Level 1”) will have to provide a higher rejection factor in a more difficult environment: a new improved Level 1 trigger architecture is under study, which includes the possibility of extracting with low latency and high accuracy tracking information in time for the decision taking process. In this context, the feasibility of potential approaches aimed at providing low-latency high-quality tracking at Level 1 is discussed. - Highlights: • The HL-LHC requires highly performing event selection. • ATLAS is studying the implementation of tracking at the very first trigger level. • Low latency and high quality seem to be achievable with dedicated hardware and adequate detector readout architecture.

  6. Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction

    Science.gov (United States)

    Sun, Chong

    2012-01-01

    More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…

  7. Research on Crowdsourcing Emergency Information Extraction of Based on Events' Frame

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information; general information retrieval tools cannot completely identify emergency geographic information; and neither approach offers an accurate assessment of the extracted results. This paper therefore proposes an emergency information collection technology based on an event framework, intended to solve the problem of emergency information extraction. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM extracts emergency information in a structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm that allows toponym pieces to be joined into a full address. AEMEI analyzes the results for an emergency event and summarizes the advantages and disadvantages of the event framework. Experiments show that the event-framework technology can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements ahead of time for defense and disaster reduction. The technology decreases casualties and property damage in the country and the world, and is of great significance to the state and society.

  8. High-speed web attack detection through extracting exemplars from HTTP traffic

    KAUST Repository

    Wang, Wei

    2011-01-01

    In this work, we propose an effective method for high-speed web attack detection by extracting exemplars from HTTP traffic before the detection model is built. The smaller set of exemplars keeps the valuable information of the original traffic while significantly reducing its size, so that the detection remains effective and the detection efficiency improves. Affinity Propagation (AP) is employed to extract the exemplars from the HTTP traffic. K-Nearest Neighbor (K-NN) and one-class Support Vector Machine (SVM) are used for anomaly detection. To facilitate comparison, we also employ information gain to select key attributes (a.k.a. features) from the HTTP traffic for web attack detection. Two large real HTTP traffic data sets are used to validate our methods. The extensive test results show that the AP-based exemplar extraction significantly improves the real-time performance of the detection compared to using all the HTTP traffic and achieves a more robust detection performance than information-gain-based attribute selection for web attack detection. © 2011 ACM.
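
    A minimal scikit-learn sketch of the exemplar-then-detect idea (feature extraction from raw HTTP requests is assumed to have happened upstream, and the data here are random stand-ins):

        # Compress normal traffic into exemplars with Affinity Propagation,
        # then fit a one-class SVM on the exemplars for anomaly detection.
        import numpy as np
        from sklearn.cluster import AffinityPropagation
        from sklearn.svm import OneClassSVM

        normal_traffic = np.random.rand(500, 12)           # stand-in feature vectors

        ap = AffinityPropagation(random_state=0).fit(normal_traffic)
        exemplars = ap.cluster_centers_                    # far fewer rows than the raw set

        detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(exemplars)

        new_requests = np.random.rand(10, 12)
        labels = detector.predict(new_requests)            # +1 = normal, -1 = flagged
        print(labels)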

  9. TRU decontamination of high-level Purex waste by solvent extraction using a mixed octyl(phenyl)-N,N-diisobutyl-carbamoylmethylphosphine oxide/TBP/NPH (TRUEX) solvent

    International Nuclear Information System (INIS)

    Horwitz, E.P.; Kalina, D.G.; Diamond, H.; Kaplan, L.; Vandegrift, G.F.; Leonard, R.A.; Steindler, M.J.; Schulz, W.W.

    1984-01-01

    The TRUEX (transuranium extraction) process was tested on a simulated high-level dissolved sludge waste (DSW). A batch counter-current extraction mode was used for seven extraction and three scrub stages. One additional extraction stage and two scrub stages and all strip stages were performed by batch extraction. The TRUEX solvent consisted of 0.20 M octyl(phenyl)-N,N-diisobutylcarbamoyl-methylphosphine oxide-1.4 M TBP in Conoco (C12-C14). The feed solution was 1.0 M in HNO3, 0.3 M in H2C2O4 and contained mixed (stable) fission products, U, Np, Pu, and Am, and a number of inert constituents, e.g., Fe and Al. The test showed that the process is capable of reducing the TRU concentration in the DSW by a factor of 4 x 10^4 (to <100 nCi/g of disposed form) and reducing the quantity of TRU waste by two orders of magnitude

  10. Unsupervised information extraction by text segmentation

    CERN Document Server

    Cortez, Eli

    2013-01-01

    A new unsupervised approach to the problem of Information Extraction by Text Segmentation (IETS) is proposed, implemented and evaluated herein. The authors' approach relies on information available in pre-existing data to learn how to associate segments in the input string with attributes of a given domain, relying on a very effective set of content-based features. The effectiveness of the content-based features is also exploited to directly learn structure-based features from test data, with no previous human-driven training, a feature unique to the presented approach. Based on the approach, a

  11. A Multi-stage Method to Extract Road from High Resolution Satellite Image

    International Nuclear Information System (INIS)

    Zhijian, Huang; Zhang, Jinfang; Xu, Fanjiang

    2014-01-01

    Extracting road information from high-resolution satellite images is complex and can hardly be achieved by exploiting only one or two modules. This paper presents a multi-stage method consisting of automatic information extraction and semi-automatic post-processing. The Multi-scale Enhancement algorithm enlarges the contrast between human-made structures and the background. Statistical Region Merging segments the images into regions, whose skeletons are extracted and pruned according to geometric shape information. Given start and end skeleton points, the shortest skeleton path is constructed as a road centre line. The Bidirectional Adaptive Smoothing technique smooths the road centre line and adjusts it to the right position. With the smoothed line and its average width, a Buffer algorithm reconstructs the road region easily. As the final results show, the proposed method eliminates redundant non-road regions, repairs incomplete occlusions, jumps over complete occlusions, and preserves accurate road centre lines and neat road regions. During the whole process, only a few interactions are needed
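
    The sketch below illustrates the "shortest skeleton path as road centre line" step on a synthetic mask, using scikit-image; seed-point selection, smoothing and buffering from the paper are omitted, and the mask is invented.

        # Skeletonise a road-candidate mask and trace a minimum-cost path along it.
        import numpy as np
        from skimage.morphology import skeletonize
        from skimage.graph import route_through_array

        road_mask = np.zeros((200, 200), dtype=bool)       # stand-in segmentation result
        road_mask[90:110, :] = True                        # a horizontal road-like region

        skeleton = skeletonize(road_mask)
        cost = np.where(skeleton, 1.0, 1000.0)             # cheap on skeleton pixels only

        start, end = (100, 0), (100, 199)                  # user-supplied seed points
        path, _ = route_through_array(cost, start, end, fully_connected=True)
        centre_line = np.array(path)                       # (row, col) centre-line samples
        print(len(centre_line), "centre-line points")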

  12. Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.

    Science.gov (United States)

    Dave, Jaydev K; Gingold, Eric L

    2013-01-01

    The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract from these files information from the structured elements in the DICOM metadata relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
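
    The record above used Matlab; the sketch below shows the same idea in Python with pydicom, walking the content tree of a CT Radiation Dose Structured Report and collecting numeric items. The file name is hypothetical and the code meanings present ("DLP", "Mean CTDIvol", ...) vary by vendor.

        # Collect numeric items (e.g. DLP, CTDIvol) from a dose structured report.
        import pydicom

        def numeric_items(dataset, found=None):
            found = {} if found is None else found
            for item in getattr(dataset, "ContentSequence", []):
                name = ""
                if "ConceptNameCodeSequence" in item:
                    name = item.ConceptNameCodeSequence[0].CodeMeaning
                if "MeasuredValueSequence" in item:
                    value = float(item.MeasuredValueSequence[0].NumericValue)
                    found.setdefault(name, []).append(value)
                numeric_items(item, found)                 # recurse into nested containers
            return found

        ds = pydicom.dcmread("dose_report.dcm")            # hypothetical RDSR file
        values = numeric_items(ds)
        for name in ("DLP", "Mean CTDIvol"):
            print(name, values.get(name))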

  13. Exploiting graph kernels for high performance biomedical relation extraction.

    Science.gov (United States)

    Panyam, Nagesh C; Verspoor, Karin; Cohn, Trevor; Ramamohanarao, Kotagiri

    2018-01-30

    Relation extraction from biomedical publications is an important task in the area of semantic mining of text. Kernel methods for supervised relation extraction are often preferred over manual feature engineering methods, when classifying highly ordered structures such as trees and graphs obtained from syntactic parsing of a sentence. Tree kernels such as the Subset Tree Kernel and Partial Tree Kernel have been shown to be effective for classifying constituency parse trees and basic dependency parse graphs of a sentence. Graph kernels such as the All Path Graph kernel (APG) and Approximate Subgraph Matching (ASM) kernel have been shown to be suitable for classifying general graphs with cycles, such as the enhanced dependency parse graph of a sentence. In this work, we present a high performance Chemical-Induced Disease (CID) relation extraction system. We present a comparative study of kernel methods for the CID task and also extend our study to the Protein-Protein Interaction (PPI) extraction task, an important biomedical relation extraction task. We discuss novel modifications to the ASM kernel to boost its performance and a method to apply graph kernels for extracting relations expressed in multiple sentences. Our system for CID relation extraction attains an F-score of 60%, without using external knowledge sources or task specific heuristic or rules. In comparison, the state of the art Chemical-Disease Relation Extraction system achieves an F-score of 56% using an ensemble of multiple machine learning methods, which is then boosted to 61% with a rule based system employing task specific post processing rules. For the CID task, graph kernels outperform tree kernels substantially, and the best performance is obtained with APG kernel that attains an F-score of 60%, followed by the ASM kernel at 57%. The performance difference between the ASM and APG kernels for CID sentence level relation extraction is not significant. In our evaluation of ASM for the PPI task, ASM

  14. Extraction of temporal information in functional MRI

    Science.gov (United States)

    Singh, M.; Sungkarat, W.; Jeong, Jeong-Won; Zhou, Yongxia

    2002-10-01

    The temporal resolution of functional MRI (fMRI) is limited by the shape of the haemodynamic response function (hrf) and the vascular architecture underlying the activated regions. Typically, the temporal resolution of fMRI is on the order of 1 s. We have developed a new data processing approach to extract temporal information on a pixel-by-pixel basis at the level of 100 ms from fMRI data. Instead of correlating or fitting the time-course of each pixel to a single reference function, which is the common practice in fMRI, we correlate each pixel's time-course to a series of reference functions that are shifted with respect to each other by 100 ms. The reference function yielding the highest correlation coefficient for a pixel is then used as a time marker for that pixel. A Monte Carlo simulation and experimental study of this approach were performed to estimate the temporal resolution as a function of signal-to-noise ratio (SNR) in the time-course of a pixel. Assuming a known and stationary hrf, the simulation and experimental studies suggest a lower limit in the temporal resolution of approximately 100 ms at an SNR of 3. The multireference function approach was also applied to extract timing information from an event-related motor movement study where the subjects flexed a finger on cue. The event was repeated 19 times with the event's presentation staggered to yield an approximately 100-ms temporal sampling of the haemodynamic response over the entire presentation cycle. The timing differences among different regions of the brain activated by the motor task were clearly visualized and quantified by this method. The results suggest that it is possible to achieve a temporal resolution of ~200 ms in practice with this approach.
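
    A toy numpy sketch of the multi-reference idea: correlate one voxel's time-course with reference functions shifted in 100 ms steps and keep the best shift. The Gaussian "haemodynamic response" used here is a placeholder, not the study's actual regressor.

        # Pick the reference-function shift that maximises the correlation.
        import numpy as np

        tr = 1.0                                           # repetition time, seconds
        t = np.arange(0, 30, tr)
        voxel = np.exp(-(t - 6.4) ** 2 / 8.0)              # toy response peaking near 6.4 s
        voxel += 0.05 * np.random.randn(t.size)            # measurement noise

        def reference(shift):
            return np.exp(-(t - 6.0 - shift) ** 2 / 8.0)   # toy HRF shifted by `shift` s

        shifts = np.arange(0.0, 2.0, 0.1)                  # candidate onsets, 100 ms steps
        corrs = [np.corrcoef(voxel, reference(s))[0, 1] for s in shifts]
        best = shifts[int(np.argmax(corrs))]
        print(f"estimated timing offset: {best * 1000:.0f} ms")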

  15. Advanced applications of natural language processing for performing information extraction

    CERN Document Server

    Rodrigues, Mário

    2015-01-01

    This book explains how to create information extraction (IE) applications that are able to tap the vast amount of relevant information available in natural language sources: Internet pages, official documents such as laws and regulations, books and newspapers, and the social web. Readers are introduced to the problem of IE and its current challenges and limitations, supported with examples. The book discusses the need to fill the gap between documents, data, and people, and provides a broad overview of the technology supporting IE. The authors present a generic architecture for developing systems that are able to learn how to extract relevant information from natural language documents, and illustrate how to implement working systems using state-of-the-art and freely available software tools. The book also discusses concrete applications illustrating IE uses.   ·         Provides an overview of state-of-the-art technology in information extraction (IE), discussing achievements and limitations for t...

  16. Extraction of quantifiable information from complex systems

    CERN Document Server

    Dahmen, Wolfgang; Griebel, Michael; Hackbusch, Wolfgang; Ritter, Klaus; Schneider, Reinhold; Schwab, Christoph; Yserentant, Harry

    2014-01-01

    In April 2007, the  Deutsche Forschungsgemeinschaft (DFG) approved the  Priority Program 1324 “Mathematical Methods for Extracting Quantifiable Information from Complex Systems.” This volume presents a comprehensive overview of the most important results obtained over the course of the program.   Mathematical models of complex systems provide the foundation for further technological developments in science, engineering and computational finance.  Motivated by the trend toward steadily increasing computer power, ever more realistic models have been developed in recent years. These models have also become increasingly complex, and their numerical treatment poses serious challenges.   Recent developments in mathematics suggest that, in the long run, much more powerful numerical solution strategies could be derived if the interconnections between the different fields of research were systematically exploited at a conceptual level. Accordingly, a deeper understanding of the mathematical foundations as w...

  17. Post-processing of Deep Web Information Extraction Based on Domain Ontology

    Directory of Open Access Journals (Sweden)

    PENG, T.

    2013-11-01

    Full Text Available Many methods are utilized to extract and process query results in the deep Web, which rely on the different structures of Web pages and the various design modes of databases. However, some semantic meanings and relations are ignored. In this paper, we present an approach for post-processing deep Web query results based on a domain ontology which can utilize those semantic meanings and relations. A block identification model (BIM) based on node similarity is defined to extract data blocks that are relevant to a specific domain after reducing noisy nodes. The feature vector of domain books is obtained by a result set extraction model (RSEM) based on the vector space model (VSM). RSEM, in combination with BIM, builds the domain ontology on books, which can not only remove the limitations of Web page structures when extracting data information, but also make use of the semantic meanings of the domain ontology. After extracting the basic information of Web pages, a ranking algorithm is adopted to offer an ordered list of data records to users. Experimental results show that BIM and RSEM extract data blocks and build the domain ontology accurately. In addition, relevant data records and basic information are extracted and ranked. The precision and recall results show that our proposed method is feasible and efficient.

  18. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    Science.gov (United States)

    Hosseini-Golgoo, S. M.; Bozorgi, H.; Saberkari, A.

    2015-06-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively.

  19. Comparing success levels of different neural network structures in extracting discriminative information from the response patterns of a temperature-modulated resistive gas sensor

    International Nuclear Information System (INIS)

    Hosseini-Golgoo, S M; Bozorgi, H; Saberkari, A

    2015-01-01

    Performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function, and a neuro-fuzzy network with local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps each with a 20 s plateau is applied to the micro-heater of the sensor, when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors are determined by the calculation of the Fisher’s discriminant ratio, affording quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from local linear neuro-fuzzy and radial-basis-function networks with recognition rates of 96.27% and 90.74%, respectively. (paper)

  20. An Effective Approach to Biomedical Information Extraction with Limited Training Data

    Science.gov (United States)

    Jonnalagadda, Siddhartha

    2011-01-01

    In the current millennium, extensive use of computers and the internet caused an exponential increase in information. Few research areas are as important as information extraction, which primarily involves extracting concepts and the relations between them from free text. Limitations in the size of training data, lack of lexicons and lack of…

  1. a Statistical Texture Feature for Building Collapse Information Extraction of SAR Image

    Science.gov (United States)

    Li, L.; Yang, H.; Chen, Q.; Liu, X.

    2018-04-01

    Synthetic Aperture Radar (SAR) has become one of the most important ways to extract post-disaster collapsed building information, due to its versatility and almost all-weather, day-and-night working capability. Because the inherent statistical distribution of speckle in SAR images is usually not exploited to extract collapsed building information, this paper proposes a novel texture feature based on statistical models of SAR images to extract collapsed buildings. In the proposed feature, the texture parameter of the G0 distribution estimated from SAR images is used to reflect the uniformity of the target and thereby extract the collapsed buildings. This feature not only considers the statistical distribution of SAR images, providing a more accurate description of the object texture, but can also be applied to extract collapsed building information from single-, dual- or full-polarization SAR data. RADARSAT-2 data of the Yushu earthquake, acquired on April 21, 2010, are used to present and analyse the performance of the proposed method. In addition, the applicability of this feature to SAR data with different polarizations is also analysed, which provides decision support for data selection in collapsed building information extraction.

  2. DKIE: Open Source Information Extraction for Danish

    DEFF Research Database (Denmark)

    Derczynski, Leon; Field, Camilla Vilhelmsen; Bøgh, Kenneth Sejdenfaden

    2014-01-01

    Danish is a major Scandinavian language spoken daily by around six million people. However, it lacks a unified, open set of NLP tools. This demonstration will introduce DKIE, an extensible open-source toolkit for processing Danish text. We implement an information extraction architecture for Danish...

  3. High-level radioactive waste disposal: Key geochemical issues and information needs for site characterization

    International Nuclear Information System (INIS)

    Brooks, D.J.; Bembia, P.J.; Bradbury, J.W.; Jackson, K.C.; Kelly, W.R.; Kovach, L.A.; Mo, T.; Tesoriero, J.A.

    1986-01-01

    Geochemistry plays a key role in determining the potential of a high-level radioactive waste disposal site for long-term radionuclide containment and isolation. The Nuclear Regulatory Commission (NRC) has developed a set of issues and information needs important for characterizing geochemistry at the potential sites being investigated by the Department of Energy Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigations project, and Salt Repository Project. The NRC site issues and information needs consider (1) the geochemical environment of the repository, (2) changes to the initial geochemical environment caused by construction and waste emplacement, and (3) interactions that affect the transport of waste radionuclides to the accessible environment. The development of these issues and information needs supports the ongoing effort of the NRC to identify and address areas of geochemical data uncertainty during prelicensing interactions

  4. Addition of Garlic Extract in Ration to Reduce Cholesterol Level of Broiler

    Science.gov (United States)

    Utami, M. M. D.; Pantaya, D.; Agus, A.

    2018-01-01

    The purpose of this research is to determine the effect of garlic extract (GE) in reducing the cholesterol level of broiler chickens by analyzing blood cholesterol levels. Two hundred one-day-old broilers were used in this study for 35 days. The chickens were randomly divided into four treatments, each treatment consisting of five replications and each replication of ten chickens. The research used a completely randomized design with the treatments T0: 0% GE, T1: 2%, T2: 4% and T3: 6%. At 35 days of age, blood was taken from each chicken to analyze the levels of total cholesterol, low-density lipoprotein (LDL) and high-density lipoprotein (HDL) and to calculate the ratio of LDL to HDL levels. The data obtained were analyzed using the Statistical Product and Service Solution software (SPSS 16.0), and significant results were further analyzed with Duncan's New Multiple Range Test. Addition of GE from the 2% level onward decreased (P < 0.05) LDL and total cholesterol, and increased HDL and the HDL-LDL ratio. It is concluded that garlic extract plays an important role in lowering the cholesterol levels of broiler meat.

  5. A New Approach to Urban Road Extraction Using High-Resolution Aerial Image

    Directory of Open Access Journals (Sweden)

    Jianhua Wang

    2016-07-01

    Full Text Available Road information is fundamental not only in the military field but also in common daily life. Automatic road extraction from remote sensing images can provide references for city planning as well as for transportation database and map updating. However, owing to the spectral similarity between roads and impervious structures, current methods using only spectral characteristics are often ineffective. By contrast, the detailed information discernible from high-resolution aerial images enables road extraction with spatial texture features. In this study, a knowledge-based method is established and proposed; this method incorporates the spatial texture feature into urban road extraction. The spatial texture feature is initially extracted by the local Moran's I, and the derived texture is added to the spectral bands of the image for image segmentation. Subsequently, features such as brightness, standard deviation, rectangularity, aspect ratio, and area are selected to form the hypothesis and verification model based on road knowledge. Finally, roads are extracted by applying the hypothesis and verification model and are post-processed based on mathematical morphology. The newly proposed method is evaluated by conducting two experiments. Results show that the completeness, correctness, and quality of the results can reach approximately 94%, 90% and 86% respectively, indicating that the proposed method is effective for urban road extraction.
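
    A small numpy/scipy sketch of a local Moran's I texture band of the kind the method stacks with the spectral bands before segmentation; the 3x3 queen neighbourhood and the random input band are illustrative choices, not the paper's exact settings.

        # Local Moran's I as a texture layer: deviation times spatially lagged deviation.
        import numpy as np
        from scipy.ndimage import convolve

        def local_morans_i(band):
            z = band - band.mean()
            m2 = (z ** 2).mean()
            # 3 x 3 queen neighbourhood, centre excluded, row-standardised weights.
            w = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float) / 8.0
            lag = convolve(z, w, mode="nearest")           # spatially lagged deviations
            return z * lag / m2

        band = np.random.rand(256, 256)                    # stand-in for one aerial band
        texture = local_morans_i(band)
        stack = np.dstack([band, texture])                 # texture appended as an extra band
        print(stack.shape)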

  6. Development of a test system for high level liquid waste partitioning

    Directory of Open Access Journals (Sweden)

    Duan Wu H.

    2015-01-01

    Full Text Available The partitioning and transmutation strategy has increasingly attracted interest for the safe treatment and disposal of high level liquid waste, in which the partitioning of high level liquid waste is one of the critical technical issues. An improved total partitioning process, including a tri-alkylphosphine oxide process for the removal of actinides, a crown ether strontium extraction process for the removal of strontium, and a calixcrown ether cesium extraction process for the removal of cesium, has been developed to treat Chinese high level liquid waste. A test system containing 72-stage 10-mm-diam annular centrifugal contactors, a remote sampling system, a rotor speed acquisition-monitoring system, a feeding system, and a video camera-surveillance system was successfully developed to carry out the hot test for verifying the improved total partitioning process. The test system has been successfully used in a 160 hour hot test using genuine high level liquid waste. During the hot test, the test system was stable, which demonstrated it was reliable for the hot test of the high level liquid waste partitioning.

  7. Three-dimensional information extraction from GaoFen-1 satellite images for landslide monitoring

    Science.gov (United States)

    Wang, Shixin; Yang, Baolin; Zhou, Yi; Wang, Futao; Zhang, Rui; Zhao, Qing

    2018-05-01

    To use GaoFen-1 (GF-1) satellite images more efficiently for landslide emergency monitoring, a Digital Surface Model (DSM) can be generated from GF-1 across-track stereo image pairs to build a terrain dataset. This study proposes a landslide 3D information extraction method based on the terrain changes of slope objects. The slope objects are merged from segmented image objects that have similar aspects, and the terrain changes are calculated from the post-disaster Digital Elevation Model (DEM) derived from GF-1 and the pre-disaster DEM from GDEM V2. A high-mountain landslide that occurred in Wenchuan County, Sichuan Province is used to conduct a 3D information extraction test. The extracted total area of the landslide is 22.58 ha; the displaced earth volume is 652,100 m3; and the average sliding direction is 263.83°. Their accuracies are 0.89, 0.87 and 0.95, respectively. Thus, the proposed method expands the application of GF-1 satellite images to the field of landslide emergency monitoring.
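
    A back-of-the-envelope numpy sketch of the terrain-change step: difference a post- and pre-event DEM on a common grid and report affected area and displaced volume (the object-based merging of slope units and the sliding-direction estimate are not reproduced); the DEMs, cell size and threshold below are invented.

        # DEM differencing: affected area and displaced volume from terrain change.
        import numpy as np

        cell = 10.0                                        # DEM cell size in metres
        pre = np.random.rand(300, 300) * 50 + 1000         # stand-in pre-event DEM
        post = pre.copy()
        post[100:150, 100:160] -= 8.0                      # simulated depletion zone

        dh = post - pre                                    # elevation change per cell
        changed = np.abs(dh) > 2.0                         # vertical-change threshold (m)
        area_ha = changed.sum() * cell * cell / 1e4
        volume_m3 = -dh[dh < -2.0].sum() * cell * cell     # material removed

        print(f"affected area: {area_ha:.1f} ha, displaced volume: {volume_m3:.0f} m3")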

  8. Dietary supplementation of extracts from a halophyte affects the level of the circulating enzymes in irradiated rats

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. G.; Lee, B. H. [KAERI, Taejon (Korea, Republic of); Kim, J. H.; Youn, Y. D. [Hanyang Univ., Seoul (Korea, Republic of)

    2003-10-01

    Extracts from Salicornia herbacea prepared with two extraction methods (using water or ethanol) were examined for their potential as a radioprotector. This plant accumulates a great amount of salt, Mg, Ca, Fe, and K and thus contains high levels of minerals in its body. It is known in folk medicine as a remedy for constipation and glycosuria. The present study was designed to explore the in vivo antioxidant effects of water and ethanol extracts of S. herbacea. Both extracts of the plant were tested for their free radical scavenging activity with the DPPH assay. For the in vivo studies, male F344 rats (3 weeks old) received oral administration of both extracts at 0.5 mg/ml for 5 days before whole-body irradiation. Six hours after irradiation, we measured the body and organ weights and collected blood. The levels of serum aspartate aminotransferase (AST), alanine aminotransferase (ALT), lactate dehydrogenase (LDH) and alkaline phosphatase (ALP) showed a similar pattern six hours after irradiation. In the water-extract dietary group, the levels of all the enzymes tended to decrease toward the base level after irradiation. The results therefore reflect the antioxidant activity of S. herbacea extracts and their potential to protect against radiation damage.

  9. Mining knowledge from text repositories using information extraction ...

    Indian Academy of Sciences (India)

    Information extraction (IE); text mining; text repositories; knowledge discovery from .... general purpose English words. However ... of precision and recall, as extensive experimentation is required due to lack of public tagged corpora. 4. Mining ...

  10. Support the Design of Improved IUE NEWSIPS High Dispersion Extraction Algorithms: Improved IUE High Dispersion Extraction Algorithms

    Science.gov (United States)

    Lawton, Pat

    2004-01-01

    The objective of this work was to support the design of improved IUE NEWSIPS high dispersion extraction algorithms. The purpose of this work was to evaluate the use of the Linearized Image (LIHI) file versus the Re-Sampled Image (SIHI) file, to evaluate various extractions, and to design algorithms for the evaluation of IUE High Dispersion spectra. It was concluded that the use of the Re-Sampled Image (SIHI) file was acceptable. Since the Gaussian profile worked well for the core and the Lorentzian profile worked well for the wings, the Voigt profile was chosen for use in the extraction algorithm. It was found that the gamma and sigma parameters varied significantly across the detector, so gamma and sigma masks for the SWP detector were developed. Extraction code was written.
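
    For illustration, the sketch below evaluates a Voigt cross-dispersion profile with SciPy (scipy.special.voigt_profile, available in recent SciPy releases) and uses it as profile weights for a simple extraction; the sigma and gamma values and the toy data are placeholders, not the SWP masks developed in the work.

        # Voigt profile = Gaussian core (sigma) convolved with Lorentzian wings (gamma).
        import numpy as np
        from scipy.special import voigt_profile

        pix = np.linspace(-10, 10, 201)                    # cross-dispersion offset, pixels
        sigma, gamma = 1.2, 0.6                            # placeholder width parameters
        profile = voigt_profile(pix, sigma, gamma)
        weights = profile / profile.sum()                  # normalised extraction weights

        data = np.exp(-pix ** 2 / 8.0) * 500.0 + 20.0      # toy cross-order cut of an image
        # Profile-weighted (Horne-style) flux estimate, uniform variance assumed.
        flux = np.sum(weights * data) / np.sum(weights ** 2)
        print(f"extracted flux estimate: {flux:.1f}")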

  11. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Full Text Available The subject of information quantity is approached in this paper, considering the specific domain of nonconforming product management as the information source. This work represents a case study. Raw data were gathered from a heavy industrial works company, with information extraction and knowledge formation being considered herein. The method used for information quantity estimation is based on the Shannon entropy formula. The information and entropy spectrum are decomposed and analysed for the extraction of specific information and the formation of knowledge. The results of the entropy analysis point out the information that needs to be acquired by the involved organisation, this being presented as a specific knowledge type.
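
    A minimal illustration of the entropy measure invoked above: the Shannon entropy of nonconformity-category frequencies; the categories and counts are invented for the example.

        # Shannon entropy of a categorical nonconformity log, in bits.
        import math
        from collections import Counter

        defect_log = ["porosity", "crack", "porosity", "dimensional", "porosity", "crack"]
        counts = Counter(defect_log)
        total = sum(counts.values())

        entropy_bits = -sum((n / total) * math.log2(n / total) for n in counts.values())
        print(f"H = {entropy_bits:.3f} bits per recorded nonconformity")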

  12. Medicaid Analytic eXtract (MAX) General Information

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract (MAX) data is a set of person-level data files on Medicaid eligibility, service utilization, and payments. The MAX data are created to...

  13. Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes

    Science.gov (United States)

    Finch, Dezon Kile

    2012-01-01

    Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…

  14. Chaotic spectra: How to extract dynamic information

    International Nuclear Information System (INIS)

    Taylor, H.S.; Gomez Llorente, J.M.; Zakrzewski, J.; Kulander, K.C.

    1988-10-01

    Nonlinear dynamics is applied to chaotic unassignable atomic and molecular spectra with the aim of extracting detailed information about regular dynamic motions that exist over short intervals of time. It is shown how this motion can be extracted from high resolution spectra by doing low resolution studies or by Fourier transforming limited regions of the spectrum. These motions mimic those of periodic orbits (PO) and are inserts into the dominant chaotic motion. Considering these inserts and the PO as a dynamically decoupled region of space, resonant scattering theory and stabilization methods enable us to compute ladders of resonant states which interact with the chaotic quasi-continuum computed in principle from basis sets placed off the PO. The interaction of the resonances with the quasicontinuum explains the low resolution spectra seen in such experiments. It also allows one to associate low resolution features with a particular PO. The motion on the PO thereby supplies the molecular movements whose quantization causes the low resolution spectra. Characteristic properties of the periodic orbit based resonances are discussed. The method is illustrated on the photoabsorption spectrum of the hydrogen atom in a strong magnetic field and on the photodissociation spectrum of H3+. Other molecular systems which are currently under investigation using this formalism are also mentioned. 53 refs., 10 figs., 2 tabs

  15. Mars Target Encyclopedia: Information Extraction for Planetary Science

    Science.gov (United States)

    Wagstaff, K. L.; Francis, R.; Gowda, T.; Lu, Y.; Riloff, E.; Singh, K.

    2017-06-01

    Mars surface targets / and published compositions / Seek and ye will find. We used text mining methods to extract information from LPSC abstracts about the composition of Mars surface targets. Users can search by element, mineral, or target.

  16. Comparison of low-level polycyclic aromatic hydrocarbons in sediment revealed by Soxhlet extraction, microwave-assisted extraction, and pressurized liquid extraction

    International Nuclear Information System (INIS)

    Itoh, Nobuyasu; Numata, Masahiko; Aoyagi, Yoshie; Yarita, Takashi

    2008-01-01

    We analyzed polycyclic aromatic hydrocarbons (PAHs) present in lake sediment at low levels by using Soxhlet extraction (Soxhlet), microwave-assisted extraction (MAE), and pressurized liquid extraction (PLE) in combination with gas chromatography and isotope-dilution mass spectrometry. Although all extraction techniques showed good repeatability for the five target PAHs (relative standard deviation …), the observed results were in the order PLE > MAE > Soxhlet. Differences in the results originated mainly from differences in the extraction efficiencies of the techniques for native PAHs, because all techniques gave comparable recovery yields of the corresponding ¹³C-labeled PAHs (¹³C-PAHs) (51-84%). Since non-negligible amounts of both native PAHs and ¹³C-PAHs were re-adsorbed on the matrix in MAE, not only the recovery yields of ¹³C-PAHs but also the extraction efficiencies for native PAHs should be examined to evaluate the appropriateness of any analytical procedure

  17. Individual Learning Route as a Way of Highly Qualified Specialists Training for Extraction of Solid Commercial Minerals Enterprises

    Science.gov (United States)

    Oschepkova, Elena; Vasinskaya, Irina; Sockoluck, Irina

    2017-11-01

    In view of the changing educational paradigm (the adoption of the two-tier higher-education concept of undergraduate and graduate programmes), a need arises to use modern learning and information and communication technologies and to put learner-centred approaches into practice when training highly qualified specialists for enterprises that extract and process solid commercial minerals. Facing unstable market demand and a changeable institutional environment on one side, and the need to balance work, supply conditions and product quality as mining and geological parameters change on the other, mining enterprises have to introduce and develop integrated management of product, information and logistic flows under a unified management system. One of the main limitations holding back this development at Russian mining enterprises is staff incompetence at all levels of logistics management. Under present-day conditions, enterprises that extract and process solid commercial minerals need highly qualified specialists who can conduct self-directed research and develop new, or improve existing, technologies for arranging, planning and managing the technical operation and commercial exploitation of transport, transportation and processing facilities based on logistics. The learner-centred approach and the individualisation of the learning process necessitate the design of an individual learning route (ILR), which can help students realise their professional potential in accordance with the requirements placed on specialists for enterprises that extract and process solid commercial minerals.

  18. High-Level Antimicrobial Efficacy of Representative Mediterranean Natural Plant Extracts against Oral Microorganisms

    Directory of Open Access Journals (Sweden)

    Lamprini Karygianni

    2014-01-01

    Full Text Available Nature is an unexplored reservoir of novel phytopharmaceuticals. Since biofilm-related oral diseases often correlate with antibiotic resistance, plant-derived antimicrobial agents could enhance existing treatment options. Therefore, the rationale of the present report was to examine the antimicrobial impact of Mediterranean natural extracts on oral microorganisms. Five different extracts from Olea europaea, mastic gum, and Inula viscosa were tested against ten bacteria and one Candida albicans strain. The extraction protocols were conducted according to established experimental procedures. Two antimicrobial assays, the minimum inhibitory concentration (MIC) assay and the minimum bactericidal concentration (MBC) assay, were applied. The screened extracts were found to be active against each of the tested microorganisms. O. europaea presented MIC and MBC ranges of 0.07–10.00 mg mL−1 and 0.60–10.00 mg mL−1, respectively. The mean MBC values for mastic gum and I. viscosa were 0.07–10.00 mg mL−1 and 0.15–10.00 mg mL−1, respectively. Extracts were less effective against C. albicans and exerted bactericidal effects at a concentration range of 0.07–5.00 mg mL−1 on strict anaerobic bacteria (Porphyromonas gingivalis, Prevotella intermedia, Fusobacterium nucleatum, and Parvimonas micra). Ethyl acetate I. viscosa extract and total mastic extract showed considerable antimicrobial activity against oral microorganisms and could therefore be considered as alternative natural anti-infectious agents.

  19. Inexperienced clinicians can extract pathoanatomic information from MRI narrative reports with high reproducibility for use in research/quality assurance

    DEFF Research Database (Denmark)

    Kent, Peter; Briggs, Andrew M; Albert, Hanne Birgit

    2011-01-01

    Background Although reproducibility in reading MRI images amongst radiologists and clinicians has been studied previously, no studies have examined the reproducibility of inexperienced clinicians in extracting pathoanatomic information from magnetic resonance imaging (MRI) narrative reports and t...

  20. Optimum detection for extracting maximum information from symmetric qubit sets

    International Nuclear Information System (INIS)

    Mizuno, Jun; Fujiwara, Mikio; Sasaki, Masahide; Akiba, Makoto; Kawanishi, Tetsuya; Barnett, Stephen M.

    2002-01-01

    We demonstrate a class of optimum detection strategies for extracting the maximum information from sets of equiprobable real symmetric qubit states of a single photon. These optimum strategies have been predicted by Sasaki et al. [Phys. Rev. A 59, 3325 (1999)]. The peculiar aspect is that the detections with at least three outputs suffice for optimum extraction of information regardless of the number of signal elements. The cases of ternary (or trine), quinary, and septenary polarization signals are studied where a standard von Neumann detection (a projection onto a binary orthogonal basis) fails to access the maximum information. Our experiments demonstrate that it is possible with present technologies to attain about 96% of the theoretical limit

  1. Study on methods and techniques of aeroradiometric weak information extraction for sandstone-hosted uranium deposits based on GIS

    International Nuclear Information System (INIS)

    Han Shaoyang; Ke Dan; Hou Huiqun

    2005-01-01

    Weak information extraction is one of the important research topics in current sandstone-type uranium prospecting in China. This paper introduces the concept of aeroradiometric weak information extraction, discusses the theories behind the formation of aeroradiometric weak information, and establishes some effective mathematical models for weak information extraction. The models are realized on a GIS software platform, and application tests of weak information extraction are completed in known uranium mineralized areas. The research results prove that prospective areas of sandstone-type uranium deposits can be rapidly delineated by extracting aeroradiometric weak information. (authors)

  2. Driver drowsiness classification using fuzzy wavelet-packet-based feature-extraction algorithm.

    Science.gov (United States)

    Khushaba, Rami N; Kodagoda, Sarath; Lal, Sara; Dissanayake, Gamini

    2011-01-01

    Driver drowsiness and loss of vigilance are a major cause of road accidents. Monitoring physiological signals while driving provides the possibility of detecting and warning of drowsiness and fatigue. The aim of this paper is to maximize the amount of drowsiness-related information extracted from a set of electroencephalogram (EEG), electrooculogram (EOG), and electrocardiogram (ECG) signals during a simulation driving test. Specifically, we develop an efficient fuzzy mutual-information (MI)-based wavelet packet transform (FMIWPT) feature-extraction method for classifying the driver drowsiness state into one of predefined drowsiness levels. The proposed method estimates the required MI using a novel approach based on fuzzy memberships providing an accurate information-content estimation measure. The quality of the extracted features was assessed on datasets collected from 31 drivers on a simulation test. The experimental results proved the significance of FMIWPT in extracting features that highly correlate with the different drowsiness levels, achieving a classification accuracy of 95%–97% on average across all subjects.
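
    A much-simplified sketch of the wavelet-packet feature-extraction step only (log-energy of the terminal nodes), not the fuzzy mutual-information selection described in the abstract; the EEG segment is synthetic and the PyWavelets package is assumed to be available:

        import numpy as np
        import pywt  # PyWavelets, assumed available

        def wavelet_packet_features(signal, wavelet="db4", level=4):
            """Log-energy of each terminal wavelet-packet node (frequency-ordered)."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="freq")
            return np.array([np.log(np.sum(np.square(n.data)) + 1e-12) for n in nodes])

        # Synthetic 2-second EEG-like segment sampled at 256 Hz (illustration only).
        t = np.arange(0, 2, 1 / 256)
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # alpha-like + noise
        features = wavelet_packet_features(eeg)
        print(features.shape)  # one feature per terminal node, e.g. (16,) for level 4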

  3. Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.

    Science.gov (United States)

    Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia

    2015-01-01

    Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single

  4. Robust space-time extraction of ventricular surface evolution using multiphase level sets

    Science.gov (United States)

    Drapaca, Corina S.; Cardenas, Valerie; Studholme, Colin

    2004-05-01

    This paper focuses on the problem of accurately extracting the CSF-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging problem because of the common occurrence of peri-ventricular lesions which locally alter the appearance of white matter. We examine a level set approach which evolves a four dimensional description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal dimension, improving the consistency of the extracted object over time. We follow the approach proposed by Chan and Vese which is based on the Mumford and Shah model and implemented using the Osher and Sethian level set method. We have extended this to the 4 dimensional case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than the gradient of the image. This is adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.
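
    A hedged, much-simplified 2D illustration of the Chan-Vese region-based level-set model that the paper extends to 4D, using scikit-image on a synthetic slice; it reproduces neither the 4D/5D formulation nor the temporal constraints described above:

        import numpy as np
        from skimage.segmentation import chan_vese

        # Synthetic 2D "slice": a bright ellipse (ventricle-like region) on a noisy background.
        yy, xx = np.mgrid[0:128, 0:128]
        image = ((xx - 64) ** 2 / 900 + (yy - 64) ** 2 / 400 < 1).astype(float)
        image += 0.2 * np.random.randn(*image.shape)

        # Region-based (intensity-driven) Chan-Vese segmentation; no image gradient is used,
        # mirroring the region-based convergence criterion mentioned in the abstract.
        segmentation = chan_vese(image, mu=0.25, lambda1=1.0, lambda2=1.0,
                                 tol=1e-3, dt=0.5, init_level_set="checkerboard")
        print(segmentation.shape, segmentation.dtype)  # boolean mask of the extracted region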

  5. Fusion of Pixel-based and Object-based Features for Road Centerline Extraction from High-resolution Satellite Imagery

    Directory of Open Access Journals (Sweden)

    CAO Yungang

    2016-10-01

    Full Text Available A novel approach for road centerline extraction from high spatial resolution satellite imagery is proposed by fusing both pixel-based and object-based features. Firstly, texture and shape features are extracted at the pixel level, and spectral features are extracted at the object level based on multi-scale image segmentation maps. Then, the extracted features are utilized in the fusion framework of Dempster-Shafer evidence theory to roughly identify the road network regions. Finally, an automatic noise-removing algorithm combined with a tensor voting strategy is presented to accurately extract the road centerline. Experimental results using high-resolution satellite images with different scenes and spatial resolutions showed that the proposed approach compares favorably with traditional methods, particularly in eliminating salt noise and the conglutination phenomenon.
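
    A minimal sketch of Dempster's rule of combination for two evidence sources over the hypotheses {road, non-road}, in the spirit of the fusion framework described above; the mass values are invented, and the full pipeline (segmentation, tensor voting, centerline tracing) is not reproduced:

        def dempster_combine(m1, m2, frame=("road", "nonroad", "theta")):
            """Combine two mass functions over {road}, {nonroad} and theta (ignorance)."""
            def intersect(a, b):
                if a == "theta":
                    return b
                if b == "theta":
                    return a
                return a if a == b else None   # conflicting singletons -> empty set

            combined = {h: 0.0 for h in frame}
            conflict = 0.0
            for a, ma in m1.items():
                for b, mb in m2.items():
                    inter = intersect(a, b)
                    if inter is None:
                        conflict += ma * mb
                    else:
                        combined[inter] += ma * mb
            return {h: v / (1.0 - conflict) for h, v in combined.items()}

        # Hypothetical per-pixel evidence from a texture feature and a spectral feature.
        m_texture = {"road": 0.6, "nonroad": 0.1, "theta": 0.3}
        m_spectral = {"road": 0.5, "nonroad": 0.2, "theta": 0.3}
        print(dempster_combine(m_texture, m_spectral))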

  6. Information Extraction, Data Integration, and Uncertain Data Management: The State of The Art

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    2011-01-01

    Information extraction, data integration, and uncertain data management are different areas of research that have received considerable attention over the last two decades. Much research has tackled these areas individually. However, information extraction systems should be integrated with data integration…

  7. Novel bitter melon extracts highly yielded from supercritical extraction reduce the adiposity through the enhanced lipid metabolism in mice fed a high fat diet

    Directory of Open Access Journals (Sweden)

    Li Xu

    2016-12-01

    Full Text Available Bitter melon (Momordica charantia) is a species of edible plant known for its medicinal value against diabetes and obesity. Because of the varied compositions of bitter melon extracts (BME), comprehensive knowledge concerning their anti-obesity effects has been insufficient. Here we first introduced supercritical extraction to BME preparation (supercritical extraction is a relatively advanced extraction method with better efficiency and selectivity that is expected to be used extensively in future applications), and the resultant extracts were subjected to HPLC analysis, validating the presence of 42.60% conjugated linolenic acid (CLnA, cis9, trans11, trans13-18:3) and 13.17% conjugated linoleic acid (CLA, cis9, trans11-18:2). The BMSO (bitter melon seed oil) was then administered to HFD mice, an obesity model established by feeding C57BL/6J mice a high fat diet. Owing to BMSO supplementation, the HFD mice showed significantly decreased body weight, Lee's index, fat index and adipose size, whereas the liver weight stayed unchanged. Meanwhile, the serum FFA (free fatty acid) levels returned to normal at a dosage of 10 g/kg, and the elevated serum leptin levels were also recovered by BMSO supplementation at moderate and high doses. These findings suggested that BMSO restored the balance between lipid intake and metabolism, probably mediated by the variation in leptin. In summary, a detailed anti-obesity effect was described with regard to a potent CFA (conjugated fatty acid) combination offered by BME. A potential mechanism underlying BME's beneficial effects was proposed, paving the way for better use of BME's pharmaceutical function in the treatment of obesity.

  8. A Method of Road Extraction from High-resolution Remote Sensing Images Based on Shape Features

    Directory of Open Access Journals (Sweden)

    LEI Xiaoqi

    2016-02-01

    Full Text Available Road extraction from high-resolution remote sensing images is an important and difficult task. Since remote sensing images contain complicated information, methods that extract roads using spectral, texture and linear features have certain limitations. In addition, many methods need human intervention to obtain road seeds (semi-automatic extraction), which makes them strongly human-dependent and inefficient. A road-extraction method that uses image segmentation based on the principle of local gray consistency and integrates shape features is proposed in this paper. Firstly, the image is segmented, and linear and curved roads are obtained by using several object shape features, so that methods which extract only linear roads are rectified. Secondly, road extraction is carried out based on region growing: the road seeds are selected automatically and the road network is extracted. Finally, the extracted roads are regularized by combining edge information. In the experiments, images containing roads with relatively uniform gray levels as well as poorly illuminated road surfaces were chosen, and the results prove that the method of this study is promising.

  9. Translation of a High-Level Temporal Model into Lower Level Models: Impact of Modelling at Different Description Levels

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    2001-01-01

    given types of properties, and examine how descriptions on higher levels translate into descriptions on lower levels. Our example looks at temporal properties where the information is concerned with the existence in time. In a high level temporal model with information kept in a three-dimensional space...... the existences in time can be mapped precisely and consistently securing a consistent handling of the temporal properties. We translate the high level temporal model into an entity-relationship model, with the information in a two-dimensional graph, and finally we look at the translations into relational...... and other textual models. We also consider the aptness of models that include procedural mechanisms such as active and object databases...

  10. A Two-Level Cache for Distributed Information Retrieval in Search Engines

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users’ logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.

  11. A two-level cache for distributed information retrieval in search engines.

    Science.gov (United States)

    Zhang, Weizhe; He, Hui; Ye, Jianwei

    2013-01-01

    To improve the performance of distributed information retrieval in search engines, we propose a two-level cache structure based on the queries of the users' logs. We extract the highest rank queries of users from the static cache, in which the queries are the most popular. We adopt the dynamic cache as an auxiliary to optimize the distribution of the cache data. We propose a distribution strategy of the cache data. The experiments prove that the hit rate, the efficiency, and the time consumption of the two-level cache have advantages compared with other structures of cache.
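
    A hedged sketch of the general idea, a static cache seeded with historically popular queries backed by a dynamic LRU cache; the capacities and the distribution strategy across search nodes described in the paper are not reproduced:

        from collections import OrderedDict

        class TwoLevelQueryCache:
            """Static cache for the most popular queries plus a dynamic LRU cache."""

            def __init__(self, popular_queries, dynamic_capacity=1000):
                self.static = dict.fromkeys(popular_queries)   # filled from historical logs
                self.dynamic = OrderedDict()                   # LRU for everything else
                self.capacity = dynamic_capacity

            def get(self, query):
                if query in self.static and self.static[query] is not None:
                    return self.static[query]                  # static hit
                if query in self.dynamic:
                    self.dynamic.move_to_end(query)            # refresh LRU position
                    return self.dynamic[query]                 # dynamic hit
                return None                                    # miss: caller queries the index

            def put(self, query, results):
                if query in self.static:
                    self.static[query] = results
                    return
                self.dynamic[query] = results
                self.dynamic.move_to_end(query)
                if len(self.dynamic) > self.capacity:
                    self.dynamic.popitem(last=False)           # evict least recently used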

  12. Recognition techniques for extracting information from semistructured documents

    Science.gov (United States)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are more and more massively employed, the demand driven also by the new norms sanctioning the legal value of digital documents, provided they are stored on supports that are physically unalterable. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those for magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine supported to a large degree with evident advantages both in the organization of the work, and in extracting information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. This information, in our prototype application, is distributed among the database fields of sender, addressee, subject, date, and body of the document.

  13. Acute Toxicity of Castor Oil Bean Extract and Tolerance Level of ...

    African Journals Online (AJOL)

    The experiment was carried out to determine the acute toxicity of raw castor oil bean (Ricinus communis) extract and the tolerance level of raw castor oil bean by broilers. The seeds were ground, defatted with petroleum ether and the residue was subjected to extraction with phosphate-buffered saline. The extract volume ...

  14. Protective effects of Arctium lappa L. root extracts (AREs) on high fat diet induced quail atherosclerosis.

    Science.gov (United States)

    Wang, Zhi; Li, Ping; Wang, Chenjing; Jiang, Qixiao; Zhang, Lei; Cao, Yu; Zhong, Weizhen; Wang, Chunbo

    2016-01-08

    This study was designed to evaluate the protective effects of Arctium lappa L. root extracts (AREs) from different extraction methods (aqueous, ethanol, chloroform and flavone) on atherosclerosis. Quails (Coturnix coturnix) were subjected to a high fat diet, with or without one of the four different AREs or the positive control simvastatin. Blood samples were collected before treatment and after 4.5 weeks or ten weeks to assess the lipid profile (levels of total cholesterol (TC), triacylglycerol (TG), low-density lipoprotein (LDL) and high-density lipoprotein (HDL)). After ten weeks, the serum levels of nitric oxide (NO) as well as the antioxidant and pro-oxidative status (levels of malondialdehyde (MDA), superoxide dismutase (SOD), catalase (CAT), glutathione (GSH), nicotinamide adenine dinucleotide phosphate (NADPH) and glutathione peroxidase (GSH-Px)) were measured. Furthermore, aortas were collected after ten weeks of treatment, aorta lipid contents (TC, TG and LDL) were assessed, and histology was used to confirm atherosclerotic changes. The results indicated that the high fat diet significantly deteriorated the lipid profile and antioxidant status in quail serum, while all the extracts significantly reverted the changes, similar to simvastatin. Aorta lipid profile assessment revealed similar results. Histology on aortas from quails treated for ten weeks confirmed atherosclerotic changes in the high fat diet group, while the extracts significantly alleviated the atherosclerotic changes, similar to simvastatin. Among the different extracts, the flavone fraction exerted the best protective effects. Our data suggest that the protective effects of AREs were mediated via hypolipidemic and anti-oxidant effects. The underlying molecular mechanisms are under investigation.

  15. High-level Petri Nets

    DEFF Research Database (Denmark)

    various journals and collections. As a result, much of this knowledge is not readily available to people who may be interested in using high-level nets. Within the Petri net community this problem has been discussed many times, and as an outcome this book has been compiled. The book contains reprints...... of some of the most important papers on the application and theory of high-level Petri nets. In this way it makes the relevant literature more available. It is our hope that the book will be a useful source of information and that, e.g., it can be used in the organization of Petri net courses. To make......High-level Petri nets are now widely used in both theoretical analysis and practical modelling of concurrent systems. The main reason for the success of this class of net models is that they make it possible to obtain much more succinct and manageable descriptions than can be obtained by means...

  16. Mapping Entomological Dengue Risk Levels in Martinique Using High-Resolution Remote-Sensing Environmental Data

    Directory of Open Access Journals (Sweden)

    Vanessa Machault

    2014-12-01

    Full Text Available Controlling dengue virus transmission mainly involves integrated vector management. Risk maps at appropriate scales can provide valuable information for assessing entomological risk levels. Here, results from a spatio-temporal model of dwellings potentially harboring Aedes aegypti larvae from 2009 to 2011 in Tartane (Martinique, French Antilles) using high spatial resolution remote-sensing environmental data and field entomological and meteorological information are presented. This tele-epidemiology methodology allows monitoring the dynamics of diseases closely related to weather/climate and environment variability. A Geoeye-1 image was processed to extract landscape elements that could surrogate societal or biological information related to the life cycle of Aedes vectors. These elements were subsequently included into statistical models with random effect. Various environmental and meteorological conditions have indeed been identified as risk/protective factors for the presence of Aedes aegypti immature stages in dwellings at a given date. These conditions were used to produce dynamic high spatio-temporal resolution maps from the presence of most containers harboring larvae. The produced risk maps are examples of modeled entomological maps at the housing level with daily temporal resolution. This finding is an important contribution to the development of targeted operational control systems for dengue and other vector-borne diseases, such as chikungunya, which is also present in Martinique.

  17. Batch extractive distillation for high purity methanol

    International Nuclear Information System (INIS)

    Zhang Weijiang; Ma Sisi

    2006-01-01

    In this paper, the applications of high purity methanol in the chemical and microelectronic industries, its market status, and the present state of its production at home and abroad are first introduced. Purification of industrial methanol to high purity methanol is feasible in China. Batch extractive distillation is the best separation technique for the purification of industrial methanol, and dimethyl sulfoxide proved to be the better extractant. (authors)

  18. Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information

    Science.gov (United States)

    Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.

    2015-10-01

    The extraction of true terrain points from unstructured laser point cloud data is an important process for producing an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometric data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some Terrestrial Laser Scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape on the Universiti Teknologi Malaysia (UTM) Skudai campus using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points falling within the corresponding preset spectral threshold were identified as belonging to that specific feature class. This terrain extraction process was implemented in Matlab. The results demonstrate that a passive image with higher spectral resolution is required to improve the output, because the low quality of the colour images captured by the sensor contributes to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
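
    A minimal sketch of the spectral-threshold step described above: points from a coloured TLS cloud are kept as candidates for a class when their RGB values fall inside a preset interval. The (x, y, z, r, g, b) array layout and the threshold values are assumptions for illustration, not those used in the study:

        import numpy as np

        def filter_by_spectral_threshold(points, rgb_min, rgb_max):
            """Keep points whose RGB values fall inside the preset spectral interval.

            points  : (N, 6) array with columns x, y, z, r, g, b
            rgb_min : (3,) lower bounds for r, g, b
            rgb_max : (3,) upper bounds for r, g, b
            """
            rgb = points[:, 3:6]
            mask = np.all((rgb >= rgb_min) & (rgb <= rgb_max), axis=1)
            return points[mask]

        # Hypothetical cloud and a brownish "bare ground" interval (0-255 colour values).
        cloud = np.random.rand(10000, 6) * [10, 10, 2, 255, 255, 255]
        terrain = filter_by_spectral_threshold(cloud,
                                               rgb_min=np.array([90, 60, 30]),
                                               rgb_max=np.array([180, 140, 110]))
        print(terrain.shape)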

  19. Extracting Semantic Information from Visual Data: A Survey

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2016-03-01

    Full Text Available The traditional environment maps built by mobile robots include both metric ones and topological ones. These maps are navigation-oriented and not adequate for service robots to interact with or serve human users who normally rely on the conceptual knowledge or semantic contents of the environment. Therefore, the construction of semantic maps becomes necessary for building an effective human-robot interface for service robots. This paper reviews recent research and development in the field of visual-based semantic mapping. The main focus is placed on how to extract semantic information from visual data in terms of feature extraction, object/place recognition and semantic representation methods.

  20. Hierarchical graph-based segmentation for extracting road networks from high-resolution satellite images

    Science.gov (United States)

    Alshehhi, Rasha; Marpu, Prashanth Reddy

    2017-04-01

    Extraction of road networks in urban areas from remotely sensed imagery plays an important role in many urban applications (e.g. road navigation, geometric correction of urban remote sensing images, updating geographic information systems, etc.). It is normally difficult to accurately differentiate road from its background due to the complex geometry of the buildings and the acquisition geometry of the sensor. In this paper, we present a new method for extracting roads from high-resolution imagery based on hierarchical graph-based image segmentation. The proposed method consists of: 1. Extracting features (e.g., using Gabor and morphological filtering) to enhance the contrast between road and non-road pixels, 2. Graph-based segmentation consisting of (i) Constructing a graph representation of the image based on initial segmentation and (ii) Hierarchical merging and splitting of image segments based on color and shape features, and 3. Post-processing to remove irregularities in the extracted road segments. Experiments are conducted on three challenging datasets of high-resolution images to demonstrate the proposed method and compare with other similar approaches. The results demonstrate the validity and superior performance of the proposed method for road extraction in urban areas.

  1. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed. But their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper aims to provide critical reviews for these systems and also report our recent development of ChemReader – a fully automated tool for extracting chemical structure diagrams in research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface-and the algorithm parameters can be readily changed-to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy on extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  2. Overview of image processing tools to extract physical information from JET videos

    Science.gov (United States)

    Craciunescu, T.; Murari, A.; Gelfusa, M.; Tiseanu, I.; Zoita, V.; EFDA Contributors, JET

    2014-11-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the

  3. Overview of image processing tools to extract physical information from JET videos

    International Nuclear Information System (INIS)

    Craciunescu, T; Tiseanu, I; Zoita, V; Murari, A; Gelfusa, M

    2014-01-01

    In magnetic confinement nuclear fusion devices such as JET, the last few years have witnessed a significant increase in the use of digital imagery, not only for the surveying and control of experiments, but also for the physical interpretation of results. More than 25 cameras are routinely used for imaging on JET in the infrared (IR) and visible spectral regions. These cameras can produce up to tens of Gbytes per shot and their information content can be very different, depending on the experimental conditions. However, the relevant information about the underlying physical processes is generally of much reduced dimensionality compared to the recorded data. The extraction of this information, which allows full exploitation of these diagnostics, is a challenging task. The image analysis consists, in most cases, of inverse problems which are typically ill-posed mathematically. The typology of objects to be analysed is very wide, and usually the images are affected by noise, low levels of contrast, low grey-level in-depth resolution, reshaping of moving objects, etc. Moreover, the plasma events have time constants of ms or tens of ms, which imposes tough conditions for real-time applications. On JET, in the last few years new tools and methods have been developed for physical information retrieval. The methodology of optical flow has allowed, under certain assumptions, the derivation of information about the dynamics of video objects associated with different physical phenomena, such as instabilities, pellets and filaments. The approach has been extended in order to approximate the optical flow within the MPEG compressed domain, allowing the manipulation of the large JET video databases and, in specific cases, even real-time data processing. The fast visible camera may provide new information that is potentially useful for disruption prediction. A set of methods, based on the extraction of structural information from the visual scene, have been developed for the
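
    A hedged sketch of the optical-flow step on two consecutive camera frames, using OpenCV's dense Farneback method on synthetic frames; the MPEG-compressed-domain approximation and the JET-specific processing mentioned above are not reproduced:

        import numpy as np
        import cv2  # OpenCV, assumed available

        # Two synthetic 8-bit frames: a bright blob shifted by a few pixels between frames,
        # standing in for a moving object (e.g. a pellet or filament) in a camera view.
        frame1 = np.zeros((120, 160), dtype=np.uint8)
        frame2 = np.zeros_like(frame1)
        cv2.circle(frame1, (60, 60), 12, 255, -1)
        cv2.circle(frame2, (66, 63), 12, 255, -1)

        # Dense Farneback optical flow; flow[..., 0] is dx and flow[..., 1] is dy per pixel.
        # Positional arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
        flow = cv2.calcOpticalFlowFarneback(frame1, frame2, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        magnitude = np.linalg.norm(flow, axis=2)
        print("mean displacement of moving pixels:", magnitude[magnitude > 1].mean())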

  4. THE EXTRACTION OF INDOOR BUILDING INFORMATION FROM BIM TO OGC INDOORGML

    Directory of Open Access Journals (Sweden)

    T.-A. Teo

    2017-07-01

    Full Text Available Indoor Spatial Data Infrastructure (indoor-SDI) is an important SDI for geospatial analysis and location-based services. A Building Information Model (BIM) has a high degree of detail in the geometric and semantic information of a building. This study proposed direct conversion schemes to extract indoor building information from BIM to OGC IndoorGML. The major steps of the research include (1) topological conversion from the building model into an indoor network model and (2) generation of IndoorGML. The topological conversion is the major process of generating and mapping nodes and edges from IFC to IndoorGML. A node represents every space (e.g. IfcSpace) and object (e.g. IfcDoor) in the building, while an edge shows the relationships between nodes. According to the definition of IndoorGML, the topological model in the dual space is also represented as a set of nodes and edges. These definitions of IndoorGML are the same as those in the indoor network. Therefore, we can extract the necessary data from the indoor network and easily convert them into IndoorGML based on the IndoorGML schema. The experiment utilized a real BIM model to examine the proposed method. The experimental results indicated that the 3D indoor model (i.e. the IndoorGML model) can be automatically derived from the IFC model by the proposed procedure. In addition, the geometry and attributes of building elements are completely and correctly converted from BIM to indoor-SDI.
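
    A minimal sketch of the node/edge mapping described above, using networkx and a hand-made list of spaces and doors in place of a parsed IFC model; a real conversion would read IfcSpace and IfcDoor entities from the BIM file (e.g. with an IFC parsing library) before building the dual-space graph:

        import networkx as nx

        # Hypothetical building data standing in for parsed IfcSpace / IfcDoor entities.
        spaces = ["Room101", "Room102", "Corridor1"]
        doors = [("Door-A", "Room101", "Corridor1"),
                 ("Door-B", "Room102", "Corridor1")]

        # Dual-space topology: every space and door becomes a node; edges express adjacency,
        # mirroring the node/edge structure of an IndoorGML network model.
        graph = nx.Graph()
        for space in spaces:
            graph.add_node(space, kind="space")
        for door, a, b in doors:
            graph.add_node(door, kind="door")
            graph.add_edge(a, door)
            graph.add_edge(door, b)

        # A navigable route between two spaces through the connecting doors.
        print(nx.shortest_path(graph, "Room101", "Room102"))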

  5. R and D Activities on high-level nuclear waste management

    International Nuclear Information System (INIS)

    Watanabe, Shosuke

    1985-01-01

    High-level liquid waste (HLLW) at the Tokai Reprocessing Plant has been generated from the reprocessing of spent fuels from light water reactors and has been successfully managed since 1977. As of 1984, about 154 m³ of HLLW from 170 tons of spent fuel were stored in three high-integrity stainless steel tanks (90 m³ each) as a nitric acid aqueous solution. The HLLW arises mainly from the first-cycle solvent extraction phase; the alkaline solution used to scrub the extraction solvent is another source of HLLW. The Advisory Committee on Radioactive Waste Management presented the concept for the disposal of high-level waste (HLW) in Japan in its 1980 report: the waste is to be solidified into borosilicate glass and then disposed of in a deep geologic formation so as to minimize the influence of the waste on the human environment, with the aid of a multibarrier system combining a natural barrier and an engineered barrier.

  6. Extracting the information backbone in online system.

    Directory of Open Access Journals (Sweden)

    Qian-Ming Zhang

    Full Text Available Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.

  7. Extracting the information backbone in online system.

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such "less can be more" feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency.

  8. Extracting the Information Backbone in Online System

    Science.gov (United States)

    Zhang, Qian-Ming; Zeng, An; Shang, Ming-Sheng

    2013-01-01

    Information overload is a serious problem in modern society and many solutions such as recommender system have been proposed to filter out irrelevant information. In the literature, researchers have been mainly dedicated to improving the recommendation performance (accuracy and diversity) of the algorithms while they have overlooked the influence of topology of the online user-object bipartite networks. In this paper, we find that some information provided by the bipartite networks is not only redundant but also misleading. With such “less can be more” feature, we design some algorithms to improve the recommendation performance by eliminating some links from the original networks. Moreover, we propose a hybrid method combining the time-aware and topology-aware link removal algorithms to extract the backbone which contains the essential information for the recommender systems. From the practical point of view, our method can improve the performance and reduce the computational time of the recommendation system, thus improving both of their effectiveness and efficiency. PMID:23690946
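
    A hedged sketch of one topology-aware link-removal variant in the spirit of the papers above: links attached to the most popular objects are treated as the most redundant and a fraction of them is dropped, leaving a backbone for the recommender. The bipartite edge list is invented, and this is not the authors' exact algorithm:

        import random
        from collections import Counter

        def remove_popular_object_links(edges, fraction=0.2):
            """Drop a fraction of the links attached to the most popular objects."""
            degree = Counter(obj for _, obj in edges)
            ranked = sorted(edges, key=lambda e: degree[e[1]], reverse=True)  # most popular first
            n_remove = int(len(edges) * fraction)
            return ranked[n_remove:]   # backbone: the links that survive the removal

        # Hypothetical user-object bipartite edges (user_id, object_id).
        random.seed(1)
        edges = [(u, o) for u in range(50) for o in random.sample(range(20), 5)]
        backbone = remove_popular_object_links(edges, fraction=0.2)
        print(len(edges), "->", len(backbone))   # e.g. 250 -> 200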

  9. Evaluation of urinary cortisol excretion by radioimmunoassay through two methods (extracted and non-extracted)

    International Nuclear Information System (INIS)

    Fonte Kohek, M.B. da; Mendonca, B.B. de; Nicolau, W.

    1993-01-01

    The objective of this paper is to compare the feasibility, sensitivity and specificity of both methods (extracted versus non-extracted) in the diagnosis of hypercortisolism. The Gamma Coat ¹²⁵I cortisol kit provided by Clinical Assays, Incstar, USA, was used for both methods, with methylene chloride extraction performed in order to measure the extracted cortisol. Thirty-two assays were performed, with sensitivities ranging from 0.1 to 0.47 µg/dl. The intra-run precision varied from 8.29 ± 3.38% and 8.19 ± 4.72% for high and low levels, respectively, for non-extracted cortisol, and 9.72 ± 1.94% and 9.54 ± 44% for high and low levels, respectively, for extracted cortisol. The inter-run precision was 15.98% and 16.15% for the high level of non-extracted and extracted cortisol, respectively; for the low level, it was 17.25% and 18.59% for non-extracted and extracted cortisol, respectively. Basal 24-hour urine samples from 43 normal subjects, 53 obese subjects (body mass index > 30) and 53 Cushing's syndrome patients were evaluated. The sensitivity of the methods was similar (100% and 98.1% for the non-extracted and extracted methods, respectively) and the specificity was the same for both methods (100%). A positive correlation between the two methods was noticed in all the groups studied (p …) … Cushing's syndrome. (author)

  10. Data Assimilation to Extract Soil Moisture Information from SMAP Observations

    Directory of Open Access Journals (Sweden)

    Jana Kolassa

    2017-11-01

    Full Text Available This study compares different methods to extract soil moisture information through the assimilation of Soil Moisture Active Passive (SMAP) observations. Neural network (NN) and physically-based SMAP soil moisture retrievals were assimilated into the National Aeronautics and Space Administration (NASA) Catchment model over the contiguous United States for April 2015 to March 2017. By construction, the NN retrievals are consistent with the global climatology of the Catchment model soil moisture. Assimilating the NN retrievals without further bias correction improved the surface and root zone correlations against in situ measurements from 14 SMAP core validation sites (CVS) by 0.12 and 0.16, respectively, over the model-only skill, and reduced the surface and root zone unbiased root-mean-square error (ubRMSE) by 0.005 m³ m⁻³ and 0.001 m³ m⁻³, respectively. The assimilation reduced the average absolute surface bias against the CVS measurements by 0.009 m³ m⁻³, but increased the root zone bias by 0.014 m³ m⁻³. Assimilating the NN retrievals after a localized bias correction yielded slightly lower surface correlation and ubRMSE improvements, but generally the skill differences were small. The assimilation of the physically-based SMAP Level-2 passive soil moisture retrievals using a global bias correction yielded similar skill improvements, as did the direct assimilation of locally bias-corrected SMAP brightness temperatures within the SMAP Level-4 soil moisture algorithm. The results show that global bias correction methods may be able to extract more independent information from SMAP observations compared to local bias correction methods, but without accurate quality control and observation error characterization they are also more vulnerable to adverse effects from retrieval errors related to uncertainties in the retrieval inputs and algorithm. Furthermore, the results show that using global bias correction approaches without a…
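
    As a hedged illustration of the global bias-correction idea discussed above, the sketch below rescales a retrieval time series to the model's climatological mean and standard deviation before assimilation (simple first- and second-moment matching on synthetic data; this is not the exact SMAP Level-4 procedure):

        import numpy as np

        def rescale_to_model_climatology(retrieval, model):
            """Match the mean and standard deviation of retrievals to the model climatology."""
            r_mean, r_std = np.nanmean(retrieval), np.nanstd(retrieval)
            m_mean, m_std = np.nanmean(model), np.nanstd(model)
            return (retrieval - r_mean) * (m_std / r_std) + m_mean

        # Synthetic soil moisture series (m3/m3): a biased, noisier retrieval vs. the model.
        t = np.arange(730)
        model = 0.25 + 0.05 * np.sin(2 * np.pi * t / 365)
        retrieval = 0.30 + 0.08 * np.sin(2 * np.pi * t / 365) + 0.01 * np.random.randn(t.size)

        corrected = rescale_to_model_climatology(retrieval, model)
        print(f"bias before: {np.mean(retrieval - model):+.3f}  after: {np.mean(corrected - model):+.3f}")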

  11. The application of quadtree algorithm for information integration in the high-level radioactive waste geological disposal

    International Nuclear Information System (INIS)

    Gao Min; Zhong Xia; Huang Shutao

    2008-01-01

    A multi-source database for high-level radioactive waste geological disposal aims to promote the information process of the geological disposal of HLW. Covering multi-dimensional, multi-source data and the integration of information and applications, and also touching on computer software and hardware, the paper gives a preliminary analysis of the data resources of the Beishan area, Gansu Province. The paper introduces an approach based on GIS technology and methods and on the open source GDAL library, and discusses the technical methods for applying the quadtree algorithm to an information resources management system supporting full sharing, rapid retrieval and so on. A more detailed description of the characteristics of the existing data resources, the theory of the spatial data retrieval algorithm, and the program design and implementation ideas is presented in the paper. (authors)
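
    A compact sketch of a point-region quadtree for spatial indexing and window retrieval, illustrating the kind of structure the paper applies for rapid retrieval of spatial data resources; the coordinates and node capacity are placeholders:

        class QuadTree:
            """Point-region quadtree: insert (x, y) points and query a rectangular window."""

            def __init__(self, x0, y0, x1, y1, capacity=4):
                self.bounds = (x0, y0, x1, y1)
                self.capacity = capacity
                self.points = []
                self.children = None

            def _contains(self, x, y):
                x0, y0, x1, y1 = self.bounds
                return x0 <= x <= x1 and y0 <= y <= y1

            def _subdivide(self):
                x0, y0, x1, y1 = self.bounds
                mx, my = (x0 + x1) / 2, (y0 + y1) / 2
                self.children = [QuadTree(x0, y0, mx, my, self.capacity),
                                 QuadTree(mx, y0, x1, my, self.capacity),
                                 QuadTree(x0, my, mx, y1, self.capacity),
                                 QuadTree(mx, my, x1, y1, self.capacity)]

            def insert(self, x, y):
                if not self._contains(x, y):
                    return False
                if self.children is None:
                    if len(self.points) < self.capacity:
                        self.points.append((x, y))
                        return True
                    self._subdivide()
                    for px, py in self.points:          # push existing points down
                        for child in self.children:
                            if child.insert(px, py):
                                break
                    self.points = []
                return any(child.insert(x, y) for child in self.children)

            def query(self, qx0, qy0, qx1, qy1):
                x0, y0, x1, y1 = self.bounds
                if qx1 < x0 or qx0 > x1 or qy1 < y0 or qy0 > y1:
                    return []                            # window does not intersect this node
                found = [(x, y) for x, y in self.points
                         if qx0 <= x <= qx1 and qy0 <= y <= qy1]
                if self.children:
                    for child in self.children:
                        found.extend(child.query(qx0, qy0, qx1, qy1))
                return found

        # Hypothetical use: index sample locations and retrieve those inside a map window.
        tree = QuadTree(0, 0, 100, 100)
        for x, y in [(12, 7), (45, 80), (46, 81), (47, 82), (48, 83), (90, 15)]:
            tree.insert(x, y)
        print(tree.query(40, 75, 60, 90))   # points inside the query rectangle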

  12. Maximal exercise and muscle oxygen extraction in acclimatizing lowlanders and high altitude natives

    DEFF Research Database (Denmark)

    Lundby, Carsten; Sander, Mikael; van Hall, Gerrit

    2006-01-01

    , and is the focus of the present study. We have studied six lowlanders during maximal exercise at sea level (SL) and with acute (AH) exposure to 4,100 m altitude, and again after 2 (W2) and 8 weeks (W8) of altitude sojourn, where also eight high altitude native (Nat) Aymaras were studied. Fractional arterial muscle O₂ extraction at maximal exercise was 90.0 ± 1.0% in the Danish lowlanders at sea level, and remained close to this value in all situations. In contrast to this, fractional arterial O₂ extraction was 83.2 ± 2.8% in the high altitude natives, and did not change with the induction of normoxia. The capillary oxygen conductance of the lower extremity, a measure of oxygen diffusing capacity, was decreased in the Danish lowlanders after 8 weeks of acclimatization, but was still higher than the value obtained from the high altitude natives. The values were (in ml min⁻¹ mmHg⁻¹) 55.2 ± 3.7 (SL), 48…

  13. Development of a partitioning method for the management of high-level liquid waste

    International Nuclear Information System (INIS)

    Kubota, M.; Dojiri, S.; Yamaguchi, I.; Morita, Y.; Yamagishi, I.; Kobayashi, T.; Tani, S.

    1989-01-01

    Fundamental studies especially focused on the separation of neptunium and technetium have been carried out to construct the advanced partitioning process of fractioning elements in a high-level liquid waste into four groups: transuranium elements, technetium-noble metals, strontium-cesium, and other elements. For the separation of neptunium by solvent extraction, DIDPA proved excellent for extracting Np(V), and its extraction rate was accelerated by hydrogen peroxide. Np(V) was found to be also separated quantitatively as precipitate with oxalic acid. For the separation of technetium, the denitration with formic acid was effective in precipitating it along with noble metals, and the adsorption with activated carbon was also effective for quantitative separation. Through these fundamental studies, the advanced partitioning process is presented as the candidate to be examined with an actual high-level liquid waste

  14. Development of a test system for high level liquid waste partitioning

    OpenAIRE

    Duan Wu H.; Chen Jing; Wang Jian C.; Wang Shu W.; Wang Xing H.

    2015-01-01

    The partitioning and transmutation strategy has increasingly attracted interest for the safe treatment and disposal of high level liquid waste, in which the partitioning of high level liquid waste is one of the critical technical issues. An improved total partitioning process, including a tri-alkylphosphine oxide process for the removal of actinides, a crown ether strontium extraction process for the removal of strontium, and a calixcrown ether cesium extra...

  15. Automatic Fontanel Extraction from Newborns' CT Images Using Variational Level Set

    Science.gov (United States)

    Kazemi, Kamran; Ghadimi, Sona; Lyaghat, Alireza; Tarighati, Alla; Golshaeyan, Narjes; Abrishami-Moghaddam, Hamid; Grebe, Reinhard; Gondary-Jouet, Catherine; Wallois, Fabrice

    A realistic head model is needed for the source localization methods used to study epilepsy in neonates with electroencephalographic (EEG) measurements from the scalp. The earliest models consider the head as a series of concentric spheres, each layer corresponding to a different tissue whose conductivity is assumed to be homogeneous. The results of the source reconstruction depend highly on the electric conductivities of the tissues forming the head. The most used model is constituted of three layers (scalp, skull, and intracranial). Most of the major bones of the neonates' skull are ossified at birth but can slightly move relative to each other. This is due to the sutures, fibrous membranes that at this stage of development connect the already ossified flat bones of the neurocranium. These weak parts of the neurocranium are called fontanels. Thus it is important to enter the exact geometry of the fontanels and flat bones into a source reconstruction, because they show pronounced differences in conductivity. Computed tomography (CT) imaging provides an excellent tool for non-invasive investigation of the skull, which appears in high contrast to all other tissues, while the fontanels can only be identified as an absence of bone, i.e. gaps between the flat bones of the skull. Therefore, the aim of this paper is to extract the fontanels from CT images by applying a variational level set method. We applied the proposed method to CT images of five different subjects. The automatically extracted fontanels show good agreement with the manually extracted ones.

  16. Airborne LIDAR and high resolution satellite data for rapid 3D feature extraction

    Science.gov (United States)

    Jawak, S. D.; Panditrao, S. N.; Luis, A. J.

    2014-11-01

    , including skyscrapers and bridges, which were confounded and extracted as buildings. This can be attributed to the low point density at building edges and on flat roofs, or to occlusions, due to which LiDAR cannot provide planimetric accuracy as precise as photogrammetric techniques (in segmentation), and to the lack of optimal use of textural as well as contextual information (especially at walls that are away from the roof) in the automatic extraction algorithm. In addition, there were no separate classes for bridges or for features lying inside the water, and multiple water height levels were also not considered. Based on these inferences, we conclude that LiDAR-based 3D feature extraction supplemented by high resolution satellite data is a potential application that can be used for understanding and characterization of an urban setup.

  17. Ginger extract and aerobic training reduces lipid profile in high-fat fed diet rats.

    Science.gov (United States)

    Khosravani, M; Azarbayjani, M A; Abolmaesoomi, M; Yusof, A; Zainal Abidin, N; Rahimi, E; Feizolahi, F; Akbari, M; Seyedjalali, S; Dehghan, F

    2016-04-01

    Obesity, hyperglycemia and dyslipidemia are major risk factors; however, natural therapies, dietary components, and physical activity may address these concerns. The aim of this study was to examine the effect of aerobic exercise and consumption of liquid ginger extract on the lipid profile of male rats fed a high-fat diet. 32 rats were randomly divided into 4 groups: 1) aerobic exercise, 2) ginger extract, 3) combined aerobic exercise and ginger extract, and 4) control. Subjects of the first three groups received ginger extract via gavage feeding of 250 mg/kg. The exercise program consisted of 3 sessions per week on 3 different days over 4 weeks. Total cholesterol (TC), triglyceride (TG), HDL and LDL were measured 24 h before the first session and 24 h after the final training session. The concentration of TG in the control group was significantly higher than in the other groups. In addition, the mean concentration of TG in the aerobic exercise group was significantly lower than in the ginger extract group, but there was no significant difference compared to the combined aerobic exercise and ginger extract group. The combination of aerobic exercise and ginger consumption significantly reduced the TG level compared to the ginger group. TC and LDL concentrations were significantly decreased in all groups compared to control. The combination of aerobic exercise and ginger extract feeding caused a significant increase in HDL levels. The findings of this study suggest that the combination of aerobic exercise and liquid ginger extract consumption might be an effective method of reducing lipid profiles, thereby reducing the risk of cardiovascular disease caused by high-fat diets.

  18. High-pressure extraction of polychlorinated biphenyls from soils and other fine-grained solids

    International Nuclear Information System (INIS)

    Markowz, G.

    1996-12-01

    Four doped and three genuinely contaminated samples were subjected to high-pressure PCB (polychlorinated biphenyl) extraction in a laboratory-scale experimental plant using CO2 (carbon dioxide) as the solvent. The PCB levels (sum of the six key substances) of the real samples were 2.6, 6.8, and 139 mg/kg. The success of the cleaning process was determined by measuring the residual PCB levels in the soil after the extraction. Parameters were varied and samples were taken selectively from various points in the bed (length 270 mm, diameter 14 mm, weighed-in soil 50-60 g) in order to gain an idea of the effects of upscaling. The following parameters were varied: extraction temperature 40-90 °C; extraction pressure 200-300 bar; CO2 flow rate 3.6-14.6 g/min; CO2 quantity 0-328 g; degree of contamination (doped samples) 12-60 mg/kg; soil moisture 0-15%; particle size 0-2000 μm; entraining agents methanol, ethanol, and acetone; proportion of entraining agent 0-7.5% by weight. Furthermore, the influence of moisture at the time of doping on the extraction was examined. (orig./ABI) [de]

  19. Production and properties of solidified high-level waste

    International Nuclear Information System (INIS)

    Brodersen, K.

    1980-08-01

    Available information on the production and properties of solidified high-level waste is presented. The review includes literature up to the end of 1979. The feasibility of producing various types of solidified high-level waste is investigated. The main emphasis is on borosilicate glass, but other options are also mentioned. The expected long-term behaviour of the materials is discussed on the basis of available results from laboratory experiments. Examples of the use of the information in the safety analysis of disposal in salt formations are given. The work has been carried out on behalf of the Danish utilities' investigation of the possibilities for disposal of high-level waste in salt domes in Jutland. (author)

  20. High Level Radioactive Waste Management

    International Nuclear Information System (INIS)

    1991-01-01

    The proceedings of the second annual international conference on High Level Radioactive Waste Management, held April 28-May 3, 1991, in Las Vegas, Nevada, provide information on the current technical issues related to international high-level radioactive waste management activities and how they relate to society as a whole. Besides discussing such technical topics as the best form of the waste, the integrity of storage containers, and the design and construction of a repository, the broader social aspects of these issues are explored in papers on such subjects as conformance to regulations, transportation safety, and public education. By providing this wider perspective on high-level radioactive waste management, it becomes apparent that the various disciplines involved in this field are interrelated and that they should work to integrate their waste management activities. Individual records are processed separately for the databases.

  1. Reconnect on Facebook: The Role of Information Seeking Behavior and Individual- and Relationship-Level Factors.

    Science.gov (United States)

    Ramirez, Artemio; Sumner, Erin M; Hayes, Jameson

    2016-08-01

    Social network sites (SNSs) such as Facebook function as both venues for reconnecting with associates from a user's past and sources of social information about them. Yet, little is known about what factors influence the initial decision to reconnect with a past associate. This oversight is significant given that SNSs and other platforms provide an abundance of social information that may be utilized for reaching such decisions. The present study investigated the links among relational reconnection, information seeking (IS) behavior, and individual- and relationship-level factors in user decisions to reconnect on Facebook. A national survey of 244 Facebook users reported on their most recent experience of receiving a friend request from someone with whom they had been out of contact for an extended period. Results indicated that uncertainty about the potential reconnection partner and forecast about the reconnection's potential reward level significantly predicted IS behavior (passive on both target and mutual friends' SNS pages as well as active). However, the emergence of their two-way interaction revealed that the forecasts moderated the IS-uncertainty link on three of the strategies (extractive, both passive approaches). Moreover, social anxiety, sociability, uncertainty about the partner, the forecast about the reconnection's reward level, and extractive and passive (target SNS pages) strategies significantly predicted user decisions to reconnect. Future directions for research on relational reconnection on SNSs are offered.

  2. The Low-Level Control System for the CERN PS Multi-Turn Extraction Kickers

    CERN Document Server

    Schipper, J; Boucly, C; Carlier, E; Fowler, T; Gaudillet, H; Noulibos, R; Sermeus, L

    2010-01-01

    To reduce the beam losses when preparing the high-intensity proton beam for the CERN Neutrinos to Gran Sasso (CNGS) facility, a new Multi-Turn Extraction (MTE) scheme has been implemented in the PS to replace the present Continuous Transfer (CT) to the SPS. Industrial off-the-shelf components have been used for the low-level part of the MTE kicker control system. National Instruments PXI systems are used to control the high-voltage pulse generators, and a Siemens programmable logic controller (PLC) handles the centralised oil-cooling and gas-insulation sub-systems.

  3. EFFECT OF CONSUMING GUAVA LEAVES (PSIDII FOLIUM) EXTRACT ON THE LEVEL OF BLOOD PROFILE IN TEENAGE GIRLS AT VOCATIONAL HIGH SCHOOL OF PALEBON SEMARANG, INDONESIA

    Directory of Open Access Journals (Sweden)

    Yulaeka

    2017-10-01

    Background: Women are at risk of iron-deficiency anemia, especially teenage girls. One alternative treatment to prevent the occurrence of anemia is to consume guava leaf extract. Objective: To examine the effect of guava leaf extract on changes in blood profile levels in teenage girls. Methods: This study was a quasi-experiment with a pretest-posttest control group design. The research was conducted at SMK Palebon Semarang from December 2016 to January 2017. There were 36 samples selected using purposive sampling, with 18 samples assigned to the experiment group and 18 to the control group. Blood profiles were measured in the Laboratory of Cito Klinik Setiabudi to determine hemoglobin level, hematocrit level, erythrocyte count, and platelet count. Data were analyzed using the independent t-test. Results: There were significant differences in hemoglobin and thrombocyte levels after the intervention between the experiment and control groups (p-value < 0.05). Conclusion: Guava leaf (Psidii folium) extract has a significant effect on changes in hemoglobin and thrombocyte levels in teenage girls, but not in hematocrit and erythrocyte levels. Therefore, guava leaf (Psidii folium) extract may be an alternative treatment for midwives to prevent the occurrence of anemia in teenage girls.

  4. Extraction of UO₂²⁺ by two highly sterically hindered (X₁)(X₂)PO(OH) extractants from an aqueous chloride phase

    International Nuclear Information System (INIS)

    Mason, G.W.; Lewey, S.M.; Gilles, D.M.; Peppard, D.F.

    1978-01-01

    The comparative extraction behaviour of tracer-level UO₂²⁺ into benzene solutions of two highly sterically hindered extractants, di(2,6-di-iso-propylphenyl) phosphoric acid, HD(2,6-i-PPHI)P, and di-tertiary-butyl phosphinic acid, H[Dt-BP], vs. an aqueous 1.0 F (NaCl + HCl) phase was studied. The extraction of UO₂²⁺ in both systems is directly second-power dependent upon extractant concentration and inversely second-power dependent upon hydrogen-ion concentration, the stoichiometry of extraction being UO₂²⁺(aq) + 2(HY)₂(org) = UO₂(HY₂)₂(org) + 2H⁺(aq). The expression for the distribution ratio K is K = K_s·F²/[H⁺]², the general expression for the extraction of any metallic species being K = K_s·Fᵃ/[H⁺]ᵇ, where K_s is a constant characteristic of the system, F is the concentration (in formality units) of extractant in the organic phase, [H⁺] is the concentration of hydrogen ion in the aqueous phase, and a and b are the respective extractant and hydrogen-ion dependencies. Both extractants have a high degree of steric hindrance. HD(2,6-i-PPHI)P is the more highly acidic, its pK_A value, in 75% ethanol, being 3.2. The pK_A previously reported for H[Dt-BP] is 6.26. The K_s for UO₂²⁺ in the system HY in benzene diluent vs. an aqueous 1.0 F (NaCl + HCl) phase is 2 × 10⁴ for H[Dt-BP] and 3 × 10⁻¹ for HD(2,6-i-PPHI)P; the ratio of the K_s values, nearly 7 × 10³, favours the less acidic extractant. For comparative purposes, the K_s values for UO₂²⁺ and for Am³⁺ and Eu³⁺ in other (X₁)(X₂)PO(OH) systems, in benzene diluent, vs. 1.0 F (NaCl + HCl), are presented. The variations are discussed in terms of the pK_A of the extractant and the steric hindrance within the extractant. (author)
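
    As a worked illustration of the distribution-ratio expression quoted above, the sketch below evaluates K = K_s·F²/[H⁺]² for the two extractants using the K_s values given in the abstract; the chosen extractant formality and hydrogen-ion concentration are arbitrary assumptions, not conditions from the study.

```python
# Distribution ratio K = K_s * F**a / [H+]**b, with a = b = 2 as reported in the abstract.
# K_s values are those quoted for the 1.0 F (NaCl + HCl) systems; the extractant formality F
# and the hydrogen-ion concentration below are illustrative assumptions only.

def distribution_ratio(k_s, extractant_formality, h_plus, a=2, b=2):
    return k_s * extractant_formality**a / h_plus**b

for name, k_s in [("H[Dt-BP]", 2e4), ("HD(2,6-i-PPHI)P", 3e-1)]:
    K = distribution_ratio(k_s, extractant_formality=0.1, h_plus=0.01)
    print(f"{name}: K = {K:.3g}")
```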

  5. Evaluation of solidified high-level waste forms

    International Nuclear Information System (INIS)

    1981-01-01

    One of the objectives of the IAEA waste management programme is to coordinate and promote development of improved technology for the safe management of radioactive wastes. The Agency accomplished this objective specifically through sponsoring Coordinated Research Programmes on the "Evaluation of Solidified High Level Waste Products" in 1977. The primary objectives of this programme are to review and disseminate information on the properties of solidified high-level waste forms, to provide a mechanism for analysis and comparison of results from different institutes, and to help coordinate future plans and actions. This report is a summary compilation of the key information disseminated at the second meeting of this programme

  6. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.

    Science.gov (United States)

    Yang, Wei; Ai, Tinghua; Lu, Wei

    2018-04-19

    Crowdsourced trajectory data is an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively so that there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors based on the area of each Voronoi cell and the length of each triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) through the DT. Third, the boundary detection model is applied to detect the road boundary from the DT constructed over the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting road boundaries from low-frequency GPS traces, multiple types of road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
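
    The descriptor model and region-growing step are specific to the paper, but the core geometric idea, triangulating the GPS points and treating unusually long Delaunay edges as candidates that straddle the road boundary, can be sketched as below; the length threshold and the synthetic two-lane point cloud are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import Delaunay

def boundary_edge_candidates(points, length_factor=2.0):
    """Flag Delaunay edges much longer than the median edge as boundary candidates.

    `points` is an (N, 2) array of projected GPS positions in metres. The
    length_factor threshold is an illustrative heuristic, not the paper's model.
    """
    tri = Delaunay(points)
    edges = set()
    for simplex in tri.simplices:               # each simplex is a triangle (3 vertex indices)
        for i in range(3):
            a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
            edges.add((a, b))
    edges = np.array(sorted(edges))
    lengths = np.linalg.norm(points[edges[:, 0]] - points[edges[:, 1]], axis=1)
    threshold = length_factor * np.median(lengths)
    return edges[lengths > threshold]           # long edges tend to straddle the road boundary

# Usage with synthetic traces: two dense "lanes" separated by a gap.
rng = np.random.default_rng(0)
lane1 = np.column_stack([rng.uniform(0, 100, 200), rng.normal(0.0, 0.5, 200)])
lane2 = np.column_stack([rng.uniform(0, 100, 200), rng.normal(6.0, 0.5, 200)])
candidates = boundary_edge_candidates(np.vstack([lane1, lane2]))
```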

  7. A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Wei Yang

    2018-04-01

    Crowdsourced trajectory data is an important source for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively so that there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors based on the area of each Voronoi cell and the length of each triangle edge. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) through the DT. Third, the boundary detection model is applied to detect the road boundary from the DT constructed over the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting road boundaries from low-frequency GPS traces, multiple types of road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.

  8. Handbook of high-level radioactive waste transportation

    International Nuclear Information System (INIS)

    Sattler, L.R.

    1992-10-01

    The High-Level Radioactive Waste Transportation Handbook serves as a reference to which state officials and members of the general public may turn for information on radioactive waste transportation and on the federal government's system for transporting this waste under the Civilian Radioactive Waste Management Program. The Handbook condenses and updates information contained in the Midwestern High-Level Radioactive Waste Transportation Primer. It is intended primarily to assist legislators who, in the future, may be called upon to enact legislation pertaining to the transportation of radioactive waste through their jurisdictions. The Handbook is divided into two sections. The first section places the federal government's program for transporting radioactive waste in context. It provides background information on nuclear waste production in the United States and traces the emergence of federal policy for disposing of radioactive waste. The second section covers the history of radioactive waste transportation; summarizes major pieces of legislation pertaining to the transportation of radioactive waste; and provides an overview of the radioactive waste transportation program developed by the US Department of Energy (DOE). To supplement this information, a summary of pertinent federal and state legislation and a glossary of terms are included as appendices, as is a list of publications produced by the Midwestern Office of The Council of State Governments (CSG-MW) as part of the Midwestern High-Level Radioactive Waste Transportation Project

  9. A cascade of classifiers for extracting medication information from discharge summaries

    Directory of Open Access Journals (Sweden)

    Halgrim Scott

    2011-07-01

    Background: Extracting medication information from clinical records has many potential applications, and recently published research, systems, and competitions reflect an interest therein. Much of the early extraction work involved rules and lexicons, but more recently machine learning has been applied to the task. Methods: We present a hybrid system consisting of two parts. The first part, field detection, uses a cascade of statistical classifiers to identify medication-related named entities. The second part uses simple heuristics to link those entities into medication events. Results: The system achieved performance that is comparable to other approaches to the same task. This performance is further improved by adding features that reference external medication name lists. Conclusions: This study demonstrates that our hybrid approach outperforms purely statistical or rule-based systems. The study also shows that a cascade of classifiers works better than a single classifier in extracting medication information. The system is available as is upon request from the first author.
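
    The record does not give the classifier details, but the two-stage cascade idea, one classifier deciding whether a token is medication-related at all and a second assigning a field type to the survivors, can be sketched with off-the-shelf components; the toy tokens, features, and label set below are assumptions, not the authors' system.

```python
# Minimal two-stage classifier cascade in the spirit of the abstract: stage 1 decides whether
# a token is medication-related, stage 2 assigns a field type to the tokens that pass.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

tokens = ["lisinopril", "10mg", "daily", "patient", "denies", "ibuprofen", "400mg", "walks"]
is_med = [1, 1, 1, 0, 0, 1, 1, 0]                                  # stage-1 labels (toy data)
field  = ["drug", "dose", "freq", "-", "-", "drug", "dose", "-"]   # stage-2 labels (toy data)

vec = CountVectorizer(analyzer="char_wb", ngram_range=(2, 3))
X = vec.fit_transform(tokens)

stage1 = LogisticRegression(max_iter=1000).fit(X, is_med)

med_idx = [i for i, y in enumerate(is_med) if y == 1]              # stage 2 sees only medication tokens
stage2 = LogisticRegression(max_iter=1000).fit(X[med_idx], [field[i] for i in med_idx])

def cascade_predict(token):
    x = vec.transform([token])
    if stage1.predict(x)[0] == 0:
        return None                                                # rejected by the first classifier
    return stage2.predict(x)[0]                                    # field type from the second classifier

print(cascade_predict("20mg"), cascade_predict("walks"))
```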

  10. CMOS-MEMS Test-Key for Extracting Wafer-Level Mechanical Properties

    Directory of Open Access Journals (Sweden)

    Pei-Zen Chang

    2012-12-01

    This paper develops technologies for the mechanical characterization of CMOS-MEMS devices and presents a robust algorithm for extracting mechanical properties, such as Young's modulus and mean stress, from the external electrical circuit behavior of a micro test-key. An approximate analytical solution for the pull-in voltage of a bridge-type test-key subjected to electrostatic load and initial stress is derived based on Euler's beam model and the minimum energy method. One can then use this closed-form solution for the pull-in voltage to extract the Young's modulus and mean stress of the test structures. The test cases include a test-key fabricated in a TSMC 0.18 μm standard CMOS process, and the experimental results refer to Osterberg's work on the pull-in voltage of single-crystal silicon microbridges. The extracted material properties calculated by the present algorithm are valid. In addition, this paper analyzes the robustness of the algorithm with regard to the dimension effects of test-keys. This mechanical-property extraction method is expected to be applicable to wafer-level testing in micro-device manufacturing and is compatible with wafer-level testing in the IC industry, since the test process is non-destructive.
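
    The paper's own closed-form pull-in expression (which accounts for initial stress) is not given in the record, so the sketch below falls back on the standard textbook parallel-plate pull-in approximation with a clamped-clamped beam stiffness to illustrate how a measured pull-in voltage relates to Young's modulus; all dimensions and the neglect of residual stress are assumptions.

```python
# Rough illustration of the extraction idea: relate a pull-in voltage to beam stiffness and
# hence Young's modulus, using the parallel-plate approximation and ignoring residual stress.
# All dimensions below are assumed, not taken from the paper.
import math

EPS0 = 8.854e-12            # vacuum permittivity, F/m

def pull_in_voltage(E, L, w, t, g0):
    """Parallel-plate pull-in estimate: V_PI = sqrt(8*k*g0^3 / (27*eps0*A))."""
    k = 192 * E * (w * t**3 / 12) / L**3   # clamped-clamped beam, centre point-load stiffness
    A = L * w                              # electrode area (full beam footprint assumed)
    return math.sqrt(8 * k * g0**3 / (27 * EPS0 * A))

# Example: a hypothetical 300 um x 20 um x 2 um bridge with a 2 um gap.
V = pull_in_voltage(E=160e9, L=300e-6, w=20e-6, t=2e-6, g0=2e-6)
print(f"estimated pull-in voltage: {V:.1f} V")

# Inverting the same relation for E from a measured V_PI gives the extraction step:
# E = 27 * EPS0 * A * V_PI**2 * L**3 / (8 * 192 * (w * t**3 / 12) * g0**3)
```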

  11. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
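
    The scoring scheme described above can be condensed into a short sketch in the spirit of RAKE: candidate phrases are runs of words between stop words and delimiters, word scores come from co-occurrence degree and frequency, and each candidate's keyword score is the sum of its word scores. The tiny stop-word list is an assumption for illustration, not the embodiment claimed in the record.

```python
import re
from collections import defaultdict

STOP_WORDS = {"a", "an", "and", "are", "as", "by", "for", "in", "is", "of", "on", "or", "the", "to", "with"}

def extract_keywords(text, top_k=5):
    words = re.split(r"[^a-zA-Z0-9]+", text.lower())
    # Split the word sequence into candidate phrases at stop words and delimiters.
    candidates, phrase = [], []
    for w in words:
        if not w or w in STOP_WORDS:
            if phrase:
                candidates.append(phrase)
            phrase = []
        else:
            phrase.append(w)
    if phrase:
        candidates.append(phrase)

    # Word scores: a function of co-occurrence degree and frequency within candidates.
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in candidates:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase) - 1
    word_score = {w: (degree[w] + freq[w]) / freq[w] for w in freq}

    # Keyword score: sum of member word scores; keep the highest-scoring candidates.
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in candidates}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

print(extract_keywords("Rapid automatic keyword extraction for information retrieval and analysis"))
```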

  12. Information Extraction for Clinical Data Mining: A Mammography Case Study.

    Science.gov (United States)

    Nassif, Houssam; Woods, Ryan; Burnside, Elizabeth; Ayvaci, Mehmet; Shavlik, Jude; Page, David

    2009-01-01

    Breast cancer is the leading cause of cancer mortality in women between the ages of 15 and 54. During mammography screening, radiologists use a strict lexicon (BI-RADS) to describe and report their findings. Mammography records are then stored in a well-defined database format (NMD). Lately, researchers have applied data mining and machine learning techniques to these databases. They successfully built breast cancer classifiers that can help in early detection of malignancy. However, the validity of these models depends on the quality of the underlying databases. Unfortunately, most databases suffer from inconsistencies, missing data, inter-observer variability and inappropriate term usage. In addition, many databases are not compliant with the NMD format and/or solely consist of text reports. BI-RADS feature extraction from free text and consistency checks between recorded predictive variables and text reports are crucial to addressing this problem. We describe a general scheme for concept information retrieval from free text given a lexicon, and present a BI-RADS features extraction algorithm for clinical data mining. It consists of a syntax analyzer, a concept finder and a negation detector. The syntax analyzer preprocesses the input into individual sentences. The concept finder uses a semantic grammar based on the BI-RADS lexicon and the experts' input. It parses sentences detecting BI-RADS concepts. Once a concept is located, a lexical scanner checks for negation. Our method can handle multiple latent concepts within the text, filtering out ultrasound concepts. On our dataset, our algorithm achieves 97.7% precision, 95.5% recall and an F1-score of 0.97. It outperforms manual feature extraction at the 5% statistical significance level.
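
    A minimal sketch of the concept finder and negation detector described above: lexicon terms are located in each sentence and flagged as negated when a cue word appears within a small preceding window (a NegEx-like heuristic). The mini-lexicon, cue list, and window size are assumptions, not the paper's BI-RADS semantic grammar.

```python
import re

LEXICON = {"mass": "Mass", "calcification": "Calcification", "asymmetry": "Asymmetry"}
NEGATION_CUES = {"no", "without", "denies", "negative"}
WINDOW = 4   # number of preceding tokens to scan for a negation cue

def extract_concepts(report):
    findings = []
    for sentence in re.split(r"[.!?]", report):          # crude sentence splitter
        tokens = sentence.lower().split()
        for i, tok in enumerate(tokens):
            term = tok.strip(".,;")
            if term in LEXICON:
                negated = any(t in NEGATION_CUES for t in tokens[max(0, i - WINDOW):i])
                findings.append((LEXICON[term], "absent" if negated else "present"))
    return findings

print(extract_concepts("There is a spiculated mass in the upper quadrant. No suspicious calcification."))
# -> [('Mass', 'present'), ('Calcification', 'absent')]
```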

  13. High-resolution extraction of particle size via Fourier Ptychography

    Science.gov (United States)

    Li, Shengfu; Zhao, Yu; Chen, Guanghua; Luo, Zhenxiong; Ye, Yan

    2017-11-01

    This paper proposes a method which can extract particle size information with a resolution beyond λ/NA. This is achieved by applying Fourier Ptychographic (FP) ideas to the present problem. In a typical FP imaging platform, a 2D LED array is used as the light source for angle-varied illumination, and a series of low-resolution images is taken by a full sequential scan of the array of LEDs. Here, we demonstrate that the particle size information can be extracted by turning on each single LED on a circle. The simulated results show that the proposed method can reduce the total number of images without loss of reliability in the results.

  14. Anethum graveolens Linn. (dill) extract enhances the mounting frequency and level of testicular tyrosine protein phosphorylation in rats.

    Science.gov (United States)

    Iamsaard, Sitthichai; Prabsattroo, Thawatchai; Sukhorum, Wannisa; Muchimapura, Supaporn; Srisaard, Panee; Uabundit, Nongnut; Thukhammee, Wipawee; Wattanathorn, Jintanaporn

    2013-03-01

    To investigate the effect of Anethum graveolens (AG) extracts on the mounting frequency, the histology of the testis and epididymis, and sperm physiology. Male rats were induced by cold immobilization before being treated with vehicle or AG extracts [50, 150, and 450 mg/kg body weight (BW)] via gastric tube for 1, 7, and 14 consecutive days, and were then examined for mounting frequency, testicular phosphorylation level by immunoblotting, sperm concentration, sperm acrosome reaction, and the histological structures of the testis and epididymis, respectively. AG (50 mg/kg BW) significantly increased the mounting frequency on Days 1 and 7 compared to the control group. Additionally, rat testes treated with 50 mg/kg BW AG showed high levels of phosphorylated proteins as compared with the control group. In the histological analyses, AG extract did not affect sperm concentration, the acrosome reaction, or the histological structures of the testis and epididymis. AG extract enhances aphrodisiac activity and is not harmful to sperm or the male reproductive organs.

  15. Anethum graveolens Linn. (dill) extract enhances the mounting frequency and level of testicular tyrosine protein phosphorylation in rats*

    Science.gov (United States)

    Iamsaard, Sitthichai; Prabsattroo, Thawatchai; Sukhorum, Wannisa; Muchimapura, Supaporn; Srisaard, Panee; Uabundit, Nongnut; Thukhammee, Wipawee; Wattanathorn, Jintanaporn

    2013-01-01

    Objective: To investigate the effect of Anethum graveolens (AG) extracts on the mounting frequency, the histology of the testis and epididymis, and sperm physiology. Methods: Male rats were induced by cold immobilization before being treated with vehicle or AG extracts [50, 150, and 450 mg/kg body weight (BW)] via gastric tube for 1, 7, and 14 consecutive days, and were then examined for mounting frequency, testicular phosphorylation level by immunoblotting, sperm concentration, sperm acrosome reaction, and the histological structures of the testis and epididymis, respectively. Results: AG (50 mg/kg BW) significantly increased the mounting frequency on Days 1 and 7 compared to the control group. Additionally, rat testes treated with 50 mg/kg BW AG showed high levels of phosphorylated proteins as compared with the control group. In the histological analyses, AG extract did not affect sperm concentration, the acrosome reaction, or the histological structures of the testis and epididymis. Conclusions: AG extract enhances aphrodisiac activity and is not harmful to sperm or the male reproductive organs. PMID:23463768

  16. The design of a fast Level-1 track trigger for the high luminosity upgrade of ATLAS.

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00413032; The ATLAS collaboration

    2016-01-01

    The high-luminosity upgrade of the LHC will increase the rate of proton-proton collisions by approximately a factor of 5 with respect to the initial LHC design. The ATLAS experiment will be upgraded accordingly, increasing its robustness and selectivity in the expected high-radiation environment. In particular, the earliest, hardware-based ATLAS trigger stage ("Level 1") will require higher rejection power while maintaining efficient selection on many different physics signatures. The key ingredient is the possibility of extracting tracking information from the brand-new all-silicon detector and using it in this process. While fascinating, this solution poses a big challenge in the choice of the architecture, due to the reduced latency available at this trigger level (a few tens of microseconds) and the high expected operating rates (order of MHz). In this paper, we review the design possibilities for such a system in a potential new trigger and readout architecture, and present the performance resulting from a d...

  17. Improving extraction technology of level seams. Sovershenstvovanie tekhnologii razrabotki pologikh plastov

    Energy Technology Data Exchange (ETDEWEB)

    Shetser, M G; Spitsyn, Yu G

    1985-01-01

    This report deals with conditions and prospects for intensifying extraction of level and inclined seams and improving extraction technology. Reviews mechanization of excavation of stables with automatic cutter-loaders (KA80 in conjunction with KD80); coal extraction using two cutter-loaders in seams 0.9 - 1.9 m thick and up to 20 degrees inclination (pillar mining); reciprocating method of coal cutting; one-sided method of coal extraction (KMK97 cutter loaders). Discusses strengthening of junctions of faces with gate roads (KSU and KSU3M props); improved types of props (hydraulic props SUG-30, SUG-V and GVD); roof control methods (induced caving, advance torpedoing or using KM87UMP and KMT power supports). Deals in detail with introduction of new extraction technology and strengthening of unstable rock by injecting polyurethane compounds, extraction of seams with wide-web cutter-loaders (Kirovets, IK101) and plowing. (3 refs.)

  18. Superparamagnetic adsorbents for high-gradient magnetic fishing of lectins out of legume extracts

    DEFF Research Database (Denmark)

    Heebøll-Nielsen, Anders; Dalkiær, M.; Hubbuch, Jürgen

    2004-01-01

    This work presents the development, testing, and application in high-gradient magnetic fishing of superparamagnetic supports for adsorption of lectins. Various approaches were examined to produce affinity, mixed mode, and hydrophobic charge induction type adsorbents. In clean monocomponent systems...... affinity supports created by direct attachment of glucose or maltose to amine-terminated iron oxide particles could bind concanavalin A at levels of up to ≈280 mg g⁻¹ support with high affinity (≈1 μM dissociation constants). However, the best performance was delivered......-linked adsorbents supplied sufficient competition to dissolved sugars to selectively bind concanavalin A in an extract of jack beans. The dextran-linked supports were employed in a high-gradient magnetic fishing experiment, in which concanavalin A was purified to near homogeneity from a crude, unclarified extract...

  19. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Science.gov (United States)

    Villalonga, Claudia; Razzaq, Muhammad Asif; Khan, Wajahat Ali; Pomares, Hector; Rojas, Ignacio; Lee, Sungyoung; Banos, Oresti

    2016-01-01

    Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users. PMID:27690050
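
    The published system derives high-level context through an OWL ontology and a reasoner; as a toy stand-in for that inference step, the sketch below combines low-level primitives (activity, location, emotion) with a few hand-written rules. The rule set and labels are assumptions for illustration only, not the open ontology contributed by the paper.

```python
# Toy rule-based stand-in for ontology reasoning over low-level context primitives.
RULES = [
    ({"activity": "running", "location": "park"}, "outdoor exercise"),
    ({"activity": "sitting", "location": "office", "emotion": "stressed"}, "overworked at work"),
    ({"activity": "lying", "location": "home", "emotion": "calm"}, "resting at home"),
]

def infer_high_level_context(observation):
    """Return the high-level contexts whose conditions are all satisfied by the observation."""
    matches = []
    for conditions, context in RULES:
        if all(observation.get(key) == value for key, value in conditions.items()):
            matches.append(context)
    return matches or ["unknown context"]

print(infer_high_level_context({"activity": "running", "location": "park", "emotion": "happy"}))
```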

  20. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Directory of Open Access Journals (Sweden)

    Claudia Villalonga

    2016-09-01

    Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users.

  1. Prediction of isometric motor tasks and effort levels based on high-density EMG in patients with incomplete spinal cord injury

    Science.gov (United States)

    Jordanić, Mislav; Rojas-Martínez, Mónica; Mañanas, Miguel Angel; Francesc Alonso, Joan

    2016-08-01

    Objective. The development of modern assistive and rehabilitation devices requires reliable and easy-to-use methods to extract neural information for control of devices. Group-specific pattern recognition identifiers are influenced by inter-subject variability. Based on high-density EMG (HD-EMG) maps, our research group has already shown that inter-subject muscle activation patterns exist in a population of healthy subjects. The aim of this paper is to analyze muscle activation patterns associated with four tasks (flexion/extension of the elbow, and supination/pronation of the forearm) at three different effort levels in a group of patients with incomplete Spinal Cord Injury (iSCI). Approach. Muscle activation patterns were evaluated by the automatic identification of these four isometric tasks along with the identification of levels of voluntary contractions. Two types of classifiers were considered in the identification: linear discriminant analysis and support vector machine. Main results. Results show that performance of classification increases when combining features extracted from intensity and spatial information of HD-EMG maps (accuracy = 97.5%). Moreover, when compared to a population with injuries at different levels, a lower variability between activation maps was obtained within a group of patients with similar injury suggesting stronger task-specific and effort-level-specific co-activation patterns, which enable better prediction results. Significance. Despite the challenge of identifying both the four tasks and the three effort levels in patients with iSCI, promising results were obtained which support the use of HD-EMG features for providing useful information regarding motion and force intention.
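
    The record names the two classifier types, linear discriminant analysis and a support vector machine, applied to intensity and spatial features of HD-EMG maps; the sketch below wires those classifiers up on a synthetic feature matrix purely to illustrate the evaluation step. The feature dimensions, trial counts, and random data are assumptions, not the study's recordings.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 120, 24          # e.g., map intensity plus spatial centroid features
X = rng.normal(size=(n_trials, n_features))
y_task = rng.integers(0, 4, n_trials)   # flexion, extension, supination, pronation

lda = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

for name, clf in [("LDA", lda), ("SVM", svm)]:
    acc = cross_val_score(clf, X, y_task, cv=5).mean()
    print(f"{name} task-identification accuracy (random data): {acc:.2f}")
```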

  2. Extractive text summarization system to aid data extraction from full text in systematic review development.

    Science.gov (United States)

    Bui, Duy Duc An; Del Fiol, Guilherme; Hurdle, John F; Jonnalagadda, Siddhartha

    2016-12-01

    Extracting data from publication reports is a standard process in systematic review (SR) development. However, the data extraction process still relies too much on manual effort which is slow, costly, and subject to human error. In this study, we developed a text summarization system aimed at enhancing productivity and reducing errors in the traditional data extraction process. We developed a computer system that used machine learning and natural language processing approaches to automatically generate summaries of full-text scientific publications. The summaries at the sentence and fragment levels were evaluated in finding common clinical SR data elements such as sample size, group size, and PICO values. We compared the computer-generated summaries with human written summaries (title and abstract) in terms of the presence of necessary information for the data extraction as presented in the Cochrane review's study characteristics tables. At the sentence level, the computer-generated summaries covered more information than humans do for systematic reviews (recall 91.2% vs. 83.8%, p<0.001). They also had a better density of relevant sentences (precision 59% vs. 39%, p<0.001). At the fragment level, the ensemble approach combining rule-based, concept mapping, and dictionary-based methods performed better than individual methods alone, achieving an 84.7% F-measure. Computer-generated summaries are potential alternative information sources for data extraction in systematic review development. Machine learning and natural language processing are promising approaches to the development of such an extractive summarization system. Copyright © 2016 Elsevier Inc. All rights reserved.
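
    A minimal extractive scorer in the spirit of the system described above: sentences are ranked by how many cue terms for common systematic-review data elements they contain, and the top-ranked sentences form the summary. The cue list and the toy article are assumptions, not the paper's ensemble of rule-based, concept-mapping, and dictionary-based methods.

```python
import re

CUE_TERMS = {"randomized", "randomised", "participants", "patients", "enrolled", "allocated",
             "intervention", "placebo", "control", "outcome", "mean", "aged", "n ="}

def extractive_summary(text, top_k=3):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

    def score(sentence):
        lowered = sentence.lower()
        return sum(term in lowered for term in CUE_TERMS)   # count cue terms present

    return sorted(sentences, key=score, reverse=True)[:top_k]

article = ("Background text about the condition. A total of 120 patients were enrolled and "
           "randomized to intervention or placebo. The primary outcome was mean change at 12 weeks. "
           "Funding was provided by the sponsor.")
print(extractive_summary(article, top_k=2))
```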

  3. Information, Technology and Information Worker Productivity: Task Level Evidence

    OpenAIRE

    Sinan Aral; Erik Brynjolfsson; Marshall Van Alstyne

    2007-01-01

    In an effort to reveal the fine-grained relationships between IT use, patterns of information flows, and individual information-worker productivity, we study task level practices at a midsize executive recruiting firm. We analyze both project-level and individual-level performance using: (1) detailed accounting data on revenues, compensation, project completion rates, and team membership for over 1300 projects spanning 5 years, (2) direct observation of over 125,000 email messages over a peri...

  4. Robust Vehicle and Traffic Information Extraction for Highway Surveillance

    Directory of Open Access Journals (Sweden)

    Yeh Chia-Hung

    2005-01-01

    A robust vision-based traffic monitoring system for vehicle and traffic information extraction is developed in this research. It is challenging to maintain detection robustness at all times for a highway surveillance system. There are three major problems in detecting and tracking a vehicle: (1) the moving cast shadow effect, (2) the occlusion effect, and (3) nighttime detection. For moving cast shadow elimination, a 2D joint vehicle-shadow model is employed. For occlusion detection, a multiple-camera system is used to detect occlusion so as to extract the exact location of each vehicle. For vehicle nighttime detection, a rear-view monitoring technique is proposed to maintain tracking and detection accuracy. Furthermore, we propose a method to improve the accuracy of background extraction, which usually serves as the first step in any vehicle detection processing. Experimental results are given to demonstrate that the proposed techniques are effective and efficient for vision-based highway surveillance.
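
    The full system (shadow model, multi-camera occlusion handling, rear-view night detection) is beyond a short example, but its foundation, background extraction for vehicle detection, can be illustrated with OpenCV's standard MOG2 background subtractor; the video path and area threshold below are placeholders, not values from the paper.

```python
import cv2

cap = cv2.VideoCapture("highway.mp4")                      # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                         # 255 = foreground, 127 = shadow
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]   # drop shadow pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    vehicles = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
    # `vehicles` now holds candidate bounding boxes for further tracking/occlusion logic.

cap.release()
```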

  5. National high-level waste systems analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy.

  6. National high-level waste systems analysis report

    International Nuclear Information System (INIS)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy

  7. Automated road network extraction from high spatial resolution multi-spectral imagery

    Science.gov (United States)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. Finally, the road centerline segments are grouped into a
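
    The first stage of the described pipeline, unsupervised k-means segmentation of the multi-spectral image prior to fuzzy road-cluster identification, can be sketched as below; the band count, number of clusters, and random stand-in image are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_multispectral(image, k=6, seed=0):
    """image: (rows, cols, bands) array of reflectance values; returns a (rows, cols) label map."""
    rows, cols, bands = image.shape
    pixels = image.reshape(-1, bands).astype(float)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(rows, cols)

rng = np.random.default_rng(0)
fake_scene = rng.random((100, 100, 4))       # stand-in for a 4-band high-resolution image
label_map = segment_multispectral(fake_scene)
# A road cluster would next be picked out by comparing each cluster's mean spectrum with
# typical road-pavement signatures, then refined with the Angular Texture Signature step.
```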

  8. A hybrid approach for robust multilingual toponym extraction and disambiguation

    NARCIS (Netherlands)

    Habib, Mena Badieh; van Keulen, Maurice

    Toponym extraction and disambiguation are key topics recently addressed by the fields of Information Extraction and Geographical Information Retrieval. Toponym extraction and disambiguation are highly interdependent processes. Not only does toponym extraction effectiveness affect disambiguation, but also

  9. Detection and Extraction of Roads from High Resolution Satellites Images with Dynamic Programming

    Science.gov (United States)

    Benzouai, Siham; Smara, Youcef

    2010-12-01

    The advent of satellite imagery now allows regular and fast digitizing and updating of geographic data, especially roads, which are very useful for Geographic Information Systems (GIS) applications such as transportation, urban pollution, geomarketing, etc. For this reason, several studies have been conducted to automate road extraction in order to minimize manual processing [4]. In this work, we are interested in road extraction from satellite imagery with high spatial resolution (at best equal to 10 m). The method is semi-automatic and follows a linear approach in which the road is considered a linear object. As road extraction is a pattern recognition problem, it is useful, above all, to characterize roads. We then perform a pre-processing step by applying an Infinite Size Edge Filter (ISEF) and a processing method based on the dynamic programming concept, in particular the F* algorithm designed by Fischler.
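
    The road-following step can be viewed as a minimum-cost path search over a cost image in which road-like pixels are cheap. The sketch below is a generic column-by-column dynamic programme, not the exact F* formulation, and the synthetic cost image is an assumption.

```python
import numpy as np

def min_cost_path(cost):
    """Trace a left-to-right path of minimal accumulated cost through a 2D cost array."""
    rows, cols = cost.shape
    acc = cost.copy()
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - 1), min(rows, r + 2)
            acc[r, c] += acc[lo:hi, c - 1].min()          # best of the 3 predecessors
    # Backtrack from the cheapest cell in the last column.
    path = [int(np.argmin(acc[:, -1]))]
    for c in range(cols - 1, 0, -1):
        r = path[-1]
        lo, hi = max(0, r - 1), min(rows, r + 2)
        path.append(lo + int(np.argmin(acc[lo:hi, c - 1])))
    return path[::-1]                                      # row index of the path in each column

cost = np.ones((50, 80))
cost[25, :] = 0.1                                          # a synthetic low-cost "road" row
print(min_cost_path(cost)[:10])
```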

  10. Pouteria ramiflora extract inhibits salivary amylolytic activity and decreases glycemic level in mice

    Directory of Open Access Journals (Sweden)

    NEIRE M. DE GOUVEIA

    2013-09-01

    In this study, extracts of plant species from the Cerrado biome were assessed in order to find potential inhibitors of human salivary alpha-amylase. The plants were collected and extracts were obtained from leaves, bark, and roots. We performed a preliminary phytochemical analysis and a screening for salivary alpha-amylase inhibitory activity. Only three botanical families (Sapotaceae, Sapindaceae, and Flacourtiaceae) and 16 extracts showed substantial inhibition (>75%) of alpha-amylase. The ethanolic extracts of Pouteria ramiflora obtained from stem barks and root barks decreased amylolytic activity by more than 95% at a final concentration of 20 µg/mL. Adult male Swiss mice were then treated orally with P. ramiflora in acute toxicity and glycemic control studies. Daily administration of 25, 50, and 100 mg/kg of aqueous extract of P. ramiflora for eight days significantly reduced body weight and blood glucose levels in mice. These data suggest that the crude polar extract of P. ramiflora decreases salivary amylolytic activity while lowering blood glucose levels.

  11. The immobilization of High Level Waste Into Glass

    International Nuclear Information System (INIS)

    Aisyah; Martono, H.

    1998-01-01

    High-level liquid waste is generated from the first extraction step in nuclear fuel reprocessing. The waste is immobilized in borosilicate glass. A certain glass composition is needed for a certain type of waste, so that the properties of the waste glass meet the requirements either for further processing or for disposal. The effect of waste loading on density, thermal expansion, softening point, and leaching rate has been studied. The composition of the high-level liquid waste was determined with ORIGEN 2, and the result was used to prepare simulated high-level waste. The waste loading in the waste glass was set to 19.48, 22.32, 25.27, and 26.59 weight percent. The results show that increasing the waste loading resulted in higher density, with no significant change in thermal expansion or softening point. Increasing the waste loading increases the leaching rate. The properties of the waste glass in this research have not shown any deviation from the standard waste glass properties.

  12. A high-throughput platform for low-volume high-temperature/pressure sealed vessel solvent extractions

    Energy Technology Data Exchange (ETDEWEB)

    Damm, Markus [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria); Kappe, C. Oliver, E-mail: oliver.kappe@uni-graz.at [Christian Doppler Laboratory for Microwave Chemistry (CDLMC) and Institute of Chemistry, Karl-Franzens-University Graz, Heinrichstrasse 28, A-8010 Graz (Austria)

    2011-11-30

    Highlights: ► Parallel low-volume coffee extractions in sealed-vessel HPLC/GC vials. ► Extractions are performed at high temperatures and pressures (200 °C/20 bar). ► Rapid caffeine determination from the liquid phase. ► Headspace analysis of volatiles using solid-phase microextraction (SPME). - Abstract: A high-throughput platform for performing parallel solvent extractions in sealed HPLC/GC vials inside a microwave reactor is described. The system consists of a strongly microwave-absorbing silicon carbide plate with 20 cylindrical wells of appropriate dimensions to be fitted with standard HPLC/GC autosampler vials serving as extraction vessels. Due to the possibility of heating up to four heating platforms simultaneously (80 vials), efficient parallel analytical-scale solvent extractions can be performed using volumes of 0.5-1.5 mL at a maximum temperature/pressure limit of 200 °C/20 bar. Since the extraction and subsequent analysis by either gas chromatography or liquid chromatography coupled with mass detection (GC-MS or LC-MS) is performed directly from the autosampler vial, errors caused by sample transfer can be minimized. The platform was evaluated for the extraction and quantification of caffeine from commercial coffee powders, assessing different solvent types, extraction temperatures and times. For example, 141 ± 11 μg caffeine (5 mg coffee powder) were extracted during a single extraction cycle using methanol as extraction solvent, whereas only 90 ± 11 μg were obtained when performing the extraction in methylene chloride, applying the same reaction conditions (90 °C, 10 min). In multiple extraction experiments a total of ≈150 μg caffeine was extracted from 5 mg commercial coffee powder. In addition to the quantitative caffeine determination, a comparative qualitative analysis of the liquid phase coffee

  13. A high-throughput platform for low-volume high-temperature/pressure sealed vessel solvent extractions

    International Nuclear Information System (INIS)

    Damm, Markus; Kappe, C. Oliver

    2011-01-01

    Highlights: ► Parallel low-volume coffee extractions in sealed-vessel HPLC/GC vials. ► Extractions are performed at high temperatures and pressures (200 °C/20 bar). ► Rapid caffeine determination from the liquid phase. ► Headspace analysis of volatiles using solid-phase microextraction (SPME). - Abstract: A high-throughput platform for performing parallel solvent extractions in sealed HPLC/GC vials inside a microwave reactor is described. The system consist of a strongly microwave-absorbing silicon carbide plate with 20 cylindrical wells of appropriate dimensions to be fitted with standard HPLC/GC autosampler vials serving as extraction vessels. Due to the possibility of heating up to four heating platforms simultaneously (80 vials), efficient parallel analytical-scale solvent extractions can be performed using volumes of 0.5–1.5 mL at a maximum temperature/pressure limit of 200 °C/20 bar. Since the extraction and subsequent analysis by either gas chromatography or liquid chromatography coupled with mass detection (GC–MS or LC–MS) is performed directly from the autosampler vial, errors caused by sample transfer can be minimized. The platform was evaluated for the extraction and quantification of caffeine from commercial coffee powders assessing different solvent types, extraction temperatures and times. For example, 141 ± 11 μg caffeine (5 mg coffee powder) were extracted during a single extraction cycle using methanol as extraction solvent, whereas only 90 ± 11 were obtained performing the extraction in methylene chloride, applying the same reaction conditions (90 °C, 10 min). In multiple extraction experiments a total of ∼150 μg caffeine was extracted from 5 mg commercial coffee powder. In addition to the quantitative caffeine determination, a comparative qualitative analysis of the liquid phase coffee extracts and the headspace volatiles was performed, placing special emphasis on headspace analysis using solid-phase microextraction (SPME

  14. Extracting Social Networks and Contact Information From Email and the Web

    National Research Council Canada - National Science Library

    Culotta, Aron; Bekkerman, Ron; McCallum, Andrew

    2005-01-01

    ...-suited for such information extraction tasks. By recursively calling itself on new people discovered on the Web, the system builds a social network with multiple degrees of separation from the user...

  15. Comparison of clinical parameters and environmental noise levels between regular surgery and piezosurgery for extraction of impacted third molars

    Directory of Open Access Journals (Sweden)

    Hao-Hueng Chang

    2015-10-01

    Conclusion: The piezosurgery device produced noise levels similar to or lower than those of the high-speed drilling device. However, piezosurgery provides advantages of increased patient comfort during extraction of impacted third molars.

  16. Zone analysis in biology articles as a basis for information extraction.

    Science.gov (United States)

    Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel

    2006-06-01

    In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.

  17. Optimized digital feature extraction in the FERMI microsystem

    International Nuclear Information System (INIS)

    Alexanian, H.; Appelquist, G.; Bailly, P.

    1995-01-01

    We describe the digital filter section of the FERMI readout microsystem. The filter section, consisting of two separate filter blocks, extracts the pulse amplitude and time information for the first-level trigger process and performs a highly accurate energy measurement for higher-level triggering and data readout purposes. An FIR-order statistic hybrid filter structure is used to improve the amplitude extraction performance. Using a training procedure, the filters are optimized to produce a precise and accurate output in the presence of electronics and pile-up noise, sample timing jitter, and the superposition of high-energy pulses. As the FERMI system resides inside the detector, where accessibility is limited, the filter implementations are presented together with fault-tolerance considerations. The filter section is modelled with the VHDL hardware description language and the subsystems are further optimized to minimize the system latency and circuit area. (orig.)
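
    The record describes an FIR-order statistic hybrid filter for amplitude extraction; the sketch below illustrates that idea with a median (order-statistic) stage followed by an FIR stage whose peak estimates the pulse amplitude and timing. The tap values and the synthetic pulse are assumptions, not the FERMI design parameters.

```python
import numpy as np

def hybrid_filter(samples, fir_taps, median_width=3):
    padded = np.pad(samples, median_width // 2, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, median_width)
    cleaned = np.median(windows, axis=1)                    # order-statistic stage
    return np.convolve(cleaned, fir_taps, mode="same")      # FIR stage

t = np.arange(64)
pulse = np.exp(-0.5 * ((t - 30) / 4.0) ** 2)                # synthetic calorimeter-like pulse
noisy = pulse + 0.05 * np.random.default_rng(1).normal(size=t.size)
noisy[45] += 0.8                                            # a spike the median stage should reject

out = hybrid_filter(noisy, fir_taps=np.ones(5) / 5)
peak_index = int(np.argmax(out))
print(f"estimated amplitude {out[peak_index]:.2f} at sample {peak_index}")
```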

  18. High-level specification of a proposed information architecture for support of a bioterrorism early-warning system.

    Science.gov (United States)

    Berkowitz, Murray R

    2013-01-01

    Current information systems for use in detecting bioterrorist attacks lack a consistent, overarching information architecture. An overview of the use of biological agents as weapons during a bioterrorist attack is presented. Proposed are the design, development, and implementation of a medical informatics system to mine pertinent databases, retrieve relevant data, invoke appropriate biostatistical and epidemiological software packages, and automatically analyze these data. The top-level information architecture is presented. Systems requirements and functional specifications for this level are presented. Finally, future studies are identified.

  19. Metabolite profiling and quantification of phytochemicals in potato extracts using ultra-high-performance liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Chong, Esther Swee Lan; McGhie, Tony K; Heyes, Julian A; Stowell, Kathryn M

    2013-12-01

    Potatoes contain a diverse range of phytochemicals which have been suggested to have health benefits. Metabolite profiling and quantification were conducted on plant extracts made from a white potato cultivar and 'Urenika', a purple potato cultivar traditionally consumed by New Zealand Maori. There is limited published information regarding the metabolite profile of Solanum tuberosum cultivar 'Urenika'. Using ultra-high-performance liquid chromatography-mass spectrometry (UHPLC-MS), a total of 31 compounds were identified and quantified in the potato extracts. The majority of the compounds were identified for the first time in 'Urenika'. These compounds include several types of anthocyanins, hydroxycinnamic acid (HCA) derivatives, and hydroxycinnamic amides (HCAA). Six classes of compounds, namely organic acids, amino acids, HCA, HCAA, flavonols and glycoalkaloids, were present in both extracts but quantities varied between the two extracts. The unknown plant metabolites in both potato extracts were assigned with molecular formulae and identified with high confidence. Quantification of the metabolites was achieved using a number of appropriate standards. High-resolution mass spectrometry data critical for accurate identification of unknown phytochemicals were achieved and could be added to potato or plant metabolomic databases. © 2013 Society of Chemical Industry.

  20. Progress and trends in patients' mindset on dental implants. I: level of information, sources of information and need for patient information.

    Science.gov (United States)

    Pommer, Bernhard; Zechner, Werner; Watzak, Georg; Ulm, Christian; Watzek, Georg; Tepper, Gabor

    2011-02-01

    Little is known about the level of information on implant dentistry in the public. A representative opinion poll on dental implants in the Austrian population was published in 2003 (Clinical Oral Implants Research 14:621-642). Seven years later, the poll was rerun to assess the up-to-date information level and evaluate recent progress and trends in patients' mindset on dental implants. One thousand adults--representative of the Austrian population--were presented with a total of 19 questionnaire items regarding the level and the sources of information about dental implants as well as the subjective and objective need for patient information. Compared with the survey of 2003, the subjective level of patient information about implant dentistry has significantly increased in the Austrian population. The patients' implant awareness rate was 79%. The objective level of general knowledge about dental implants was still far from satisfactory, revealing unrealistic patient expectations. Three-quarters trusted their dentists for information about dental implants, while one-quarter turned to the media. The patients' wish for high-quality implant restorations was significantly higher than in 2003, yet the majority felt that only specialists should perform implant dentistry. This representative survey reveals that dentists are still the main source of patient information, but throws doubt on the quality of their public relations work. Dentists must improve communication strategies to provide their patients with comprehensible, legally tenable information on dental implants and bridge information gaps in the future. © 2010 John Wiley & Sons A/S.

  1. Solid-Phase Extraction Combined with High Performance Liquid ...

    African Journals Online (AJOL)

    Methods: Solid-phase extraction method was employed for the extraction of the estrogen from milk and high performance liquid chromatography-diode array detector (HPLC-DAD) was used for the determination of estrogen. Results: Optimal chromatographic conditions were achieved on an Eclipse XDB-C18 column at a ...

  2. OpenCV-Based Nanomanipulation Information Extraction and the Probe Operation in SEM

    Directory of Open Access Journals (Sweden)

    Dongjie Li

    2015-02-01

    Focusing on an established telenanomanipulation system, this paper studies a method for extracting location information and strategies for probe operation. First, OpenCV machine learning algorithms were used to extract location information from SEM images, so that nanowires and the probe can be automatically tracked and the region of interest (ROI) marked quickly; the locations of the nanowire and the probe are then extracted from the ROI. To develop the probe operation strategy, the Van der Waals force between the probe and a nanowire was computed to obtain the relevant operating parameters. With these parameters, the nanowire can be pre-operated in a 3D virtual environment and an optimal path for the probe can be derived, after which the actual probe runs automatically under the control of the telenanomanipulation system. Finally, experiments were carried out to verify these methods, and the results show that they achieve the expected effect.
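
    As a rough illustration of the OpenCV-based localization step described above, the sketch below locates a probe template in a grayscale SEM frame by normalized cross-correlation and returns an ROI and a centre estimate. It is only a minimal stand-in for the paper's machine-learning tracker; the function names, file paths and the template-matching approach itself are assumptions, not the authors' implementation.

    ```python
    import cv2
    import numpy as np

    def locate_probe(sem_frame_path: str, probe_template_path: str):
        """Locate a probe template in an SEM frame and return its ROI.

        Illustrative sketch only: the original system tracks both nanowires
        and the probe with OpenCV machine learning; here we show the simpler
        template-matching route to an ROI and a centre coordinate.
        """
        frame = cv2.imread(sem_frame_path, cv2.IMREAD_GRAYSCALE)
        template = cv2.imread(probe_template_path, cv2.IMREAD_GRAYSCALE)
        if frame is None or template is None:
            raise FileNotFoundError("SEM frame or probe template not found")

        # Normalised cross-correlation between frame and template.
        response = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(response)

        h, w = template.shape
        x, y = max_loc                      # top-left corner of the best match
        roi = frame[y:y + h, x:x + w]       # region of interest around the probe
        centre = (x + w // 2, y + h // 2)   # probe tip location estimate
        return roi, centre, max_val
    ```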

  3. A method for automating the extraction of specialized information from the web

    NARCIS (Netherlands)

    Lin, L.; Liotta, A.; Hippisley, A.; Hao, Y.; Liu, J.; Wang, Y.; Cheung, Y-M.; Yin, H.; Jiao, L.; Ma, J.; Jiao, Y-C.

    2005-01-01

    The World Wide Web can be viewed as a gigantic distributed database including millions of interconnected hosts some of which publish information via web servers or peer-to-peer systems. We present here a novel method for the extraction of semantically rich information from the web in a fully

  4. The Impact of the Roast Levels of Coffee Extracts on their Potential Anticancer Activities.

    Science.gov (United States)

    Mojica, Benigno E; Fong, Lisa E; Biju, Denny; Muharram, Alfeah; Davis, Isabel M; Vela, Klarisse O; Rios, Diana; Osorio-Camacena, Elena; Kaur, Baljit; Rojas, Sebastian M; Forester, Sarah C

    2018-04-01

    Coffee is one of the most widely consumed beverages in the world and contains numerous phytochemicals that are beneficial to consumer health. The phytochemical profile of coffee, however, can be affected by the roast level. In this study, we compared the effect of roasting level on the growth inhibitory activity of coffee extracts against HT-29 (colon) and SCC-25 (oral) cancer cell lines. The roasting stages selected for this study were green, cinnamon/blonde, city/medium, full city/medium-dark, and full city plus/dark. Cancer cells were treated with various concentrations of coffee extracts for 72 hr. Cell viability was quantified using the thiazolyl blue tetrazolium bromide assay. It was found that the lighter roast extracts, Cinnamon in particular, reduced cell growth more than the darker roast extracts. The Cinnamon extract had the greatest total phenolic content and antioxidant activity. Relative levels of gallic, caffeic, and chlorogenic acid in the extracts were also compared. The Cinnamon coffee extract had the highest levels of gallic and caffeic acids, which have both been widely regarded as bioactive phytochemicals. In conclusion, the consumption of lighter roasted coffee may contribute to the prevention of certain types of cancer, such as oral and colon cancer. Chemical compounds in coffee may reduce the risk for certain types of cancers, and these compounds may be particularly abundant in lighter roasted coffee. Therefore, lighter roasted coffee could contribute to the prevention of cancer as part of a healthy diet. © 2018 Institute of Food Technologists®.

  5. DEXTER: Disease-Expression Relation Extraction from Text.

    Science.gov (United States)

    Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K

    2018-01-01

    Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research, and diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information, obtained mostly from large-scale studies, such as microarray and next-generation sequencing technologies, in the context of disease. The scientific literature is another rich source of information on gene expression-disease relationships that not only have been captured from large-scale studies but have also been observed in thousands of small-scale studies. Expression information obtained from literature through manual curation can extend expression databases. While many of the existing databases include information from literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER), to extract information from literature on gene and microRNA expression in the context of disease. One of the motivations in developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags behind significantly compared to expression information obtained from large-scale studies and can benefit from our text-mined results. We have conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51% and 81.81% for the two evaluations, respectively. Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract differential expression information for 2024 genes in lung
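
    DEXTER itself combines NLP components with curated rules; as a purely illustrative toy (not the authors' code), the pattern-based sketch below pulls (gene, direction, disease) triples from single sentences. The regular expression, entity conventions and example sentence are all assumptions.

    ```python
    import re

    # Toy pattern: "<GENE> is up-/down-regulated ... in <DISEASE>".
    # Real systems such as DEXTER use parsers, named-entity recognizers and
    # curated rules; this regex-only sketch is purely illustrative.
    PATTERN = re.compile(
        r"(?P<gene>[A-Z0-9]{2,})\s+(?:is|was|were)\s+"
        r"(?P<direction>up-?regulated|down-?regulated|overexpressed|underexpressed)"
        r".{0,80}?\bin\s+(?P<disease>[a-z -]+(?:cancer|carcinoma|disease))",
        re.IGNORECASE,
    )

    def extract_triples(sentence: str):
        """Return (gene, direction, disease) triples found in one sentence."""
        return [(m.group("gene"), m.group("direction").lower(), m.group("disease").strip())
                for m in PATTERN.finditer(sentence)]

    print(extract_triples("EGFR was overexpressed in non-small cell lung cancer."))
    # [('EGFR', 'overexpressed', 'non-small cell lung cancer')]
    ```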

  6. Anti-Obesity and Hypoglycemic Effects of Poncirus trifoliata L. Extracts in High-Fat Diet C57BL/6 Mice

    Directory of Open Access Journals (Sweden)

    Sheng Jia

    2016-04-01

    The present study investigated the possible anti-obesity and hypoglycemic effects of Poncirus trifoliata L. extracts. Mature fruit were divided into flavedo (PF) and juice sacs (PJ), and extracts from them were tested on C57BL/6 mice fed a high-fat diet (HFD) for thirteen weeks. Both fruit extracts (40 mg/kg body weight each) showed anti-obesity and hypoglycemic effects. Consumption of the PF and PJ extracts reduced body weight by 9.21% and 20.27%, respectively. Liver and adipose weights, fasting glucose, serum triglyceride (TG), and low density lipoprotein cholesterol (LDL-c) levels decreased significantly, while serum high density lipoprotein cholesterol (HDL-c) and oral glucose tolerance levels increased significantly in response to the two fruit extracts. These effects were due in part to the modulation of serum insulin, leptin, and adiponectin. Furthermore, transcript levels of fatty acid synthase (FAS) and stearoyl-CoA desaturase 1 (SCD1) were reduced while those of carnitine palmitoyltransferase 1α (CPT1α) and insulin receptor substrate 2 (IRS2) were increased in the liver of C57BL/6 mice, which might be an important mechanism affecting lipid and glucose metabolism. Taken together, P. trifoliata fruit can potentially be used to prevent or treat obesity and associated metabolic disorders.

  7. Computer-based control of nuclear power information systems at international level

    International Nuclear Information System (INIS)

    Boniface, Ekechukwu; Okonkwo, Obi

    2011-01-01

    In most highly industrialized countries, information plays a major role in anti-nuclear campaigns. Information and discussions on nuclear power need critical and objective analysis, and a structured presentation to the public, in order to avoid both biased anti-nuclear information on one side and neglect of the real risks of nuclear power on the other. This research develops a computer-based information system for the control of nuclear power information at the international level. The system is intended to provide easy and fast information highways for the following: (1) regulatory dose and activity limits as thresholds of high danger for individuals and the public; and (2) provision of relevant technical and scientific education among information carriers in nuclear power countries. The research is a fact-oriented investigation of radioactivity and a fact-oriented education effort on nuclear accidents and safety. A standard procedure for disseminating the latest findings, drawing on technical and scientific experts in nuclear technology, is developed. The information highway clearly analyzes factual information about radiation risk and nuclear energy. Radiation cannot be removed from our environment, and the necessity of using radiation makes nuclear energy a two-edged sword. It is therefore possible to use a computer-based information system both to project and disseminate expert knowledge about nuclear technology and to guide the public on the safety and control of nuclear energy. The computer-based information highway for nuclear energy technology is intended to assist scientific research and technological development at the international level. (author)

  8. Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain

    Science.gov (United States)

    Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong

    2006-10-01

    Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to quantify and spatially characterize land use/cover changes (LUCC) in the 1990s. To keep the database reliable, it must be updated regularly, but it is difficult to obtain suitable remote sensing data for extracting land cover change information at large scales. In particular, optical remote sensing data are hard to acquire over the Chengdu plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu plain, for example crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for extracting land cover change information.

  9. A novel airport extraction model based on saliency region detection for high spatial resolution remote sensing images

    Science.gov (United States)

    Lv, Wen; Zhang, Libao; Zhu, Yongchun

    2017-06-01

    The airport is one of the most crucial traffic facilities in both military and civil fields. Automatic airport extraction from high spatial resolution remote sensing images has many applications, such as regional planning and military reconnaissance. Traditional airport extraction strategies are usually based on prior knowledge and locate the airport target by template matching and classification, which causes high computational complexity and large computing resource costs for high spatial resolution remote sensing images. In this paper, we propose a novel automatic airport extraction model based on saliency region detection, airport runway extraction and adaptive threshold segmentation. For saliency region detection, we choose the frequency-tuned (FT) model to compute airport saliency using low-level color and luminance features; it is easy and fast to implement and provides full-resolution saliency maps. For airport runway extraction, the Hough transform is adopted to count the number of parallel line segments. For adaptive threshold segmentation, the Otsu threshold segmentation algorithm is used to obtain more accurate airport regions. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in airport extraction.
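
    A hedged sketch of the kind of pipeline the abstract outlines, frequency-tuned saliency followed by Otsu thresholding and a probabilistic Hough check for runway-like line segments, is shown below using OpenCV. The parameter values and helper name are placeholders rather than the authors' settings.

    ```python
    import cv2
    import numpy as np

    def airport_candidates(image_bgr: np.ndarray):
        """Illustrative pipeline loosely following the abstract:
        frequency-tuned (FT) saliency -> Otsu segmentation -> Hough line check.
        Parameter values are placeholders, not the authors' settings.
        """
        # 1. Frequency-tuned saliency: distance of each (blurred) Lab pixel
        #    from the mean Lab colour of the whole image.
        lab = cv2.cvtColor(cv2.GaussianBlur(image_bgr, (5, 5), 0),
                           cv2.COLOR_BGR2LAB).astype(np.float32)
        mean_lab = lab.reshape(-1, 3).mean(axis=0)
        saliency = np.linalg.norm(lab - mean_lab, axis=2)
        saliency = cv2.normalize(saliency, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

        # 2. Adaptive (Otsu) threshold to isolate salient regions.
        _, mask = cv2.threshold(saliency, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        # 3. Probabilistic Hough transform: runways show up as long,
        #    near-parallel line segments inside the salient mask.
        edges = cv2.Canny(cv2.bitwise_and(saliency, saliency, mask=mask), 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                                minLineLength=100, maxLineGap=10)
        return mask, (0 if lines is None else len(lines))
    ```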

  10. Comparison of reduced sugar high quality chocolates sweetened with stevioside and crude stevia 'green' extract.

    Science.gov (United States)

    Torri, Luisa; Frati, Alessandra; Ninfali, Paolino; Mantegna, Stefano; Cravotto, Giancarlo; Morini, Gabriella

    2017-06-01

    The demand for zero- and reduced-sugar food products containing cocoa is expanding continuously. The present study was designed to evaluate the feasibility of producing high-quality chocolate sweetened with a crude extract of Stevia rebaudiana (Bertoni) prepared by a green microwave-assisted water-steam extraction procedure. Seven approximately isosweet chocolate formulations were developed, mixing cocoa paste, sucrose, commercial stevioside, crude green extract and maltitol in different proportions. All samples were analyzed for polyphenol and flavonoid content, antioxidant activity, and sensory acceptability. The use of a crude stevia extract allowed low-sugar, high-quality chocolates to be obtained that were also acceptable to consumers and had significantly increased antioxidant activity. Moreover, consumer segmentation revealed a cluster of consumers showing the same overall liking for the sample with 50% of the sucrose replaced by the crude stevia extract as for the samples made with commercial stevioside and the control sample (without sucrose replacement). The results provide information that can contribute to the development of sweet food products with an improved nutritional value (reduced sugar content and increased antioxidant activity) and a reduced environmental impact of the production process. © 2016 Society of Chemical Industry.

  11. THE EFFECT OF CIPLUKAN (Physalis angulata L.) FRUIT EXTRACT ON SGPT AND SGOT LEVELS IN WHITE MALE MICE (Mus musculus) WITH ALLOXAN-INDUCED HYPERGLYCEMIA, AS A BIOLOGY LEARNING RESOURCE

    Directory of Open Access Journals (Sweden)

    Nur Lailatul Fitri

    2016-07-01

    Ciplukan (Physalis angulata L.) is used by the community as an antidiabetic drug. The antidiabetic effect is attributed to the chemical constituents of the fruit, mainly flavonoids; the flavonoid content of a 300 mg/ml fruit extract was 84%. Flavonoids are antioxidant compounds that neutralize free radicals, so administration of these antioxidants is expected to inhibit damage to body cells and help prevent degenerative diseases. This study is a true experiment using a posttest-only control group design arranged as a completely randomized design (CRD). The data are SGPT and SGOT levels. Data were analyzed by one-way analysis of variance at a significance level of 0.05, followed by Duncan's test at 5%. The results showed that different doses of ciplukan fruit extract affected the SGPT and SGOT levels of the mice, and Duncan's test showed that the most effective dose was 2 ml/kg. The results can be used by teachers as information on an alternative use of medicinal plants for hyperglycemia and as antioxidants in Biology for Grade X of Senior High School, especially on the concept Maintenance and Utilizing of Biological Diversity in Core Competence 4.

  12. High-temperature extraction of rhenium from sulfuric acid solutions with trialkylamines

    International Nuclear Information System (INIS)

    Gladyhev, V.P.; Andreeva, N.N.; Kim, E.M.; Kovaleva, S.V.

    1985-01-01

    This paper examines the possibility of conducting high-temperature extraction of rhenium from sulfuric acid solutions with trialkylamines (TAA), using mixtures of higher hydrocarbons (paraffins) as the diluent of the extraction system. Substituting paraffin for kerosene in the extraction system would reduce the danger of fire and explosions during the extraction process. When extracting rhenium from industrial solutions with a melt of higher paraffins containing TAA and alcohols, the extraction system can be continuously heated in heat exchangers through which the washing sulfuric acid passes before entering the extractor. This allows the heat to be utilized and lowers the temperature of the solutions to the optimum extraction temperatures. Extraction of rhenium with a melt of trioctylamine in paraffin obeys the same mechanisms as high-temperature extraction of ruthenium (IV) by amines in kerosene and aromatic hydrocarbons.

  13. Level Sets and Voronoi based Feature Extraction from any Imagery

    DEFF Research Database (Denmark)

    Sharma, O.; Anton, François; Mioc, Darka

    2012-01-01

    Polygon features are of interest in many GEOProcessing applications like shoreline mapping, boundary delineation, change detection, etc. This paper presents a unique new GPU-based methodology to automate feature extraction combining level sets, or mean shift based segmentation together with Voron...

  14. High-level nuclear-waste disposal: information exchange and conflict resolution

    International Nuclear Information System (INIS)

    Hadden, S.G.; Chiles, J.R.; Anaejionu, P.; Cerny, K.J.

    1981-07-01

    The research presented here was conceived as an exploration of the interactions among parties involved in the resolution of the high-level radioactive waste (HLW) disposal issue. Because of the major differences in the nature of the interactions between levels of government, on the one hand, and between government and the public, on the other hand, this study is divided into two primary areas - public participation and intergovernmental relations. These areas are further divided into theoretical and practical considerations. The format of the paper reflects the divisions explained above as well as the interaction of the various authors. Public participation is addressed from a theoretical perspective in Part 2. In Part 3 an essentially pragmatic approach is taken drawing on experiences from similar exercises. These two aspects of the study are presented in separate parts because the authors worked largely independently. Intergovernmental relations is treated in Part 4. The treatment is organized as two Sections of Part 4 to reflect the authors' close interaction which yielded a more integrated treatment of the theoretical and practical aspects of intergovernmental relations. Detailed recommendations and conclusions appear in the final subsections of Parts 2, 3, and 4. Part 5, Summary and Conclusions, does not reiterate the detailed conclusions and recommendations presented in previous parts but rather expresses some general perceptions with respect to the high-level waste disposal issue. A brief review of the Table of Contents will assist in visualizing the detailed format of this study and in identifying the portions of greatest relevance to specific questions. A detailed Subject Index and an Acronym Index have been included for the reader's convenience

  15. Cortisol level and hemodynamic changes during tooth extraction at hypertensive and normotensive patients.

    Science.gov (United States)

    Agani, Zana Bajrami; Benedetti, Alberto; Krasniqi, Vjosa Hamiti; Ahmedi, Jehona; Sejfija, Zana; Loxha, Mergime Prekazi; Murtezani, Arben; Rexhepi, Aida Namani; Ibraimi, Zana

    2015-04-01

    Patients undergoing oral surgical interventions produce larger amounts of steroids than healthy subjects not undergoing any dental intervention. The aim of this research was to determine the serum level of the stress hormone cortisol, arterial blood pressure and pulse rate, and to compare the effectiveness of lidocaine with adrenaline versus lidocaine without adrenaline during tooth extraction. This clinical study included patients with an indication for tooth extraction, divided into hypertensive and normotensive groups. There was no statistically significant difference between the groups in cortisol levels before, during and after tooth extraction, regardless of the type of anesthetic used, while higher systolic and diastolic values were recorded in hypertensive patients regardless of the anesthetic. There was a significant rise in systolic and diastolic blood pressure in both hypertensive and normotensive patients who underwent tooth extraction, regardless of whether the anesthetic contained a vasoconstrictor, and these changes were more pronounced in hypertensive patients. For cortisol level and pulse rate, the results indicate no statistically significant difference between the groups.

  16. The ATLAS high level trigger region of interest builder

    International Nuclear Information System (INIS)

    Blair, R.; Dawson, J.; Drake, G.; Haberichter, W.; Schlereth, J.; Zhang, J.; Ermoline, Y.; Pope, B.; Aboline, M.; High Energy Physics; Michigan State Univ.

    2008-01-01

    This article describes the design, testing and production of the ATLAS Region of Interest Builder (RoIB). This device acts as an interface between the Level 1 trigger and the high level trigger (HLT) farm for the ATLAS LHC detector. It distributes all of the Level 1 data for a subset of events to a small number of (16 or less) individual commodity processors. These processors in turn provide this information to the HLT. This allows the HLT to use the Level 1 information to narrow data requests to areas of the detector where Level 1 has identified interesting objects

  17. INDIVIDUAL TREE OF URBAN FOREST EXTRACTION FROM VERY HIGH DENSITY LIDAR DATA

    Directory of Open Access Journals (Sweden)

    A. Moradi

    2016-06-01

    Airborne LiDAR (Light Detection and Ranging) data have a high potential to provide 3D information on trees. Most proposed methods for extracting individual trees first detect tree top or bottom points and then use them as starting points in a segmentation algorithm. Hence, in these methods, the number and locations of the detected peak points heavily affect the process of detecting individual trees. In this study, a new method is presented to extract individual tree segments using LiDAR points with 10 cm point density. In this method, a two-step strategy is performed for the extraction of individual tree LiDAR points: finding deterministic segments of individual tree points, and allocating the remaining LiDAR points based on these segments. The research was performed on two study areas in Zeebrugge, Bruges, Belgium (51.33° N, 3.20° E). The accuracy assessment showed that the method correctly classified 74.51% of the trees, with 21.57% under-segmentation and 3.92% over-segmentation errors, respectively.

  18. Nutrient extraction and exportation by common bean cultivars under different fertilization levels: I - macronutrients

    Directory of Open Access Journals (Sweden)

    Rogério Peres Soratto

    2013-08-01

    The use of cultivars with higher yield potential and the adoption of new technology have led to high grain yields in common bean, which has probably changed the crop's demand for nutrients. However, there is almost no information about the periods of the cycle in which nutrients are most demanded, and in which quantities, by the main cultivars. The objective of this study was to evaluate macronutrient extraction and exportation by the common bean cultivars Pérola and IAC Alvorada under different levels of NPK fertilization, on a dystroferric Red Nitosol in Botucatu, São Paulo State, Brazil. The experiment was arranged in a randomized complete block (split plot) design with four replications. The plots consisted of six treatments based on a 2 x 3 factorial model, represented by two cultivars and three NPK levels (PD0 - 'Pérola' without fertilization, PD1 - 'Pérola' with 50 % of recommended fertilization, PD2 - 'Pérola' with 100 % of recommended fertilization, AD0 - 'IAC Alvorada' without fertilization, AD1 - 'IAC Alvorada' with 50 % of recommended fertilization, and AD2 - 'IAC Alvorada' with 100 % of recommended fertilization), with the subplots sampled seven times during the cycle. At higher levels of NPK fertilization, the grain yield and macronutrient extraction and exportation of both cultivars were higher, but without statistical differences. Macronutrient absorption was highest in the treatments with 100 % of the recommended NPK fertilization (average amounts per hectare: 140 kg N, 16.5 kg P, 120 kg K, 69 kg Ca, 17.9 kg Mg, and 16.3 kg S). Regardless of the treatment, the demand for N, P, K, Ca, and Mg was highest from 45 to 55 days after emergence (DAE), i.e., in the R7 stage (pod formation), while the highest S absorption rates were concentrated between 55 and 65 DAE. More than 70 % of P, between 58 and 69 % of N, 40 and 52 % of S, 40 and 48 % of K, and 35 and 45 % of Mg absorbed during the cycle was exported with grains, whereas less than 15

  19. Evolving spectral transformations for multitemporal information extraction using evolutionary computation

    Science.gov (United States)

    Momm, Henrique; Easson, Greg

    2011-01-01

    Remote sensing plays an important role in assessing temporal changes in land features. The challenge often resides in the conversion of large quantities of raw data into actionable information in a timely and cost-effective fashion. To address this issue, research was undertaken to develop an innovative methodology integrating biologically-inspired algorithms with standard image classification algorithms to improve information extraction from multitemporal imagery. Genetic programming was used as the optimization engine to evolve feature-specific candidate solutions in the form of nonlinear mathematical expressions of the image spectral channels (spectral indices). The temporal generalization capability of the proposed system was evaluated by addressing the task of building rooftop identification from a set of images acquired at different dates in a cross-validation approach. The proposed system generates robust solutions (kappa values > 0.75 for stage 1 and > 0.4 for stage 2) despite the statistical differences between the scenes caused by land use and land cover changes coupled with variable environmental conditions, and the lack of radiometric calibration between images. Based on our results, the use of nonlinear spectral indices enhanced the spectral differences between features improving the clustering capability of standard classifiers and providing an alternative solution for multitemporal information extraction.
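
    The paper evolves spectral indices with genetic programming; the sketch below shows only the evaluation side of such a system, scoring hand-written placeholder index expressions (standing in for evolved individuals) against a reference rooftop mask with Cohen's kappa. Four-band (R, G, B, NIR) imagery and the scikit-learn metric are assumptions, not the authors' setup.

    ```python
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Candidate nonlinear spectral indices, expressed as functions of the image
    # bands. In the paper these expressions are evolved by genetic programming;
    # the two below are hand-written placeholders standing in for evolved
    # individuals.
    CANDIDATES = {
        "ndvi_like": lambda r, g, b, nir: (nir - r) / (nir + r + 1e-6),
        "nonlinear": lambda r, g, b, nir: np.tanh((nir * g - r * b) / (nir + b + 1e-6)),
    }

    def score_candidates(bands, reference_mask, threshold=0.0):
        """Rate each candidate index by the kappa of a simple thresholding of
        the index image against a reference rooftop mask.

        `bands` is assumed to be an (H, W, 4) array ordered R, G, B, NIR.
        """
        r, g, b, nir = (bands[..., i].astype(np.float32) for i in range(4))
        scores = {}
        for name, fn in CANDIDATES.items():
            index_image = fn(r, g, b, nir)
            predicted = (index_image > threshold).ravel().astype(int)
            scores[name] = cohen_kappa_score(reference_mask.ravel().astype(int), predicted)
        return scores
    ```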

  20. Extraction of Graph Information Based on Image Contents and the Use of Ontology

    Science.gov (United States)

    Kanjanawattana, Sarunya; Kimura, Masaomi

    2016-01-01

    A graph is an effective form of data representation used to summarize complex information. Explicit information such as the relationship between the X- and Y-axes can be easily extracted from a graph by applying human intelligence. However, implicit knowledge such as information obtained from other related concepts in an ontology also resides in…

  1. Capillary electrophoresis coupled with chloroform-acetonitrile extraction for rapid and highly selective determination of cysteine and homocysteine levels in human blood plasma and urine.

    Science.gov (United States)

    Ivanov, Alexander Vladimirovich; Bulgakova, Polina Olegovna; Virus, Edward Danielevich; Kruglova, Maria Petrovna; Alexandrin, Valery Vasil'evich; Gadieva, Viktoriya Aleksandrovna; Luzyanin, Boris Petrovich; Kushlinskii, Nikolai Evgen'evich; Fedoseev, Anatolij Nikolaevich; Kubatiev, Aslan Amirkhanovich

    2017-10-01

    A rapid and selective method has been developed for highly sensitive determination of total cysteine and homocysteine levels in human blood plasma and urine by capillary electrophoresis (CE) coupled with liquid-liquid extraction. Analytes were first derivatized with 1,1'-thiocarbonyldiimidazole and then samples were purified by chloroform-ACN extraction. Electrophoretic separation was performed using 0.1 M phosphate with 30 mM triethanolamine, pH 2, containing 25 μM CTAB, 2.5 μM SDS, and 2.5% polyethylene glycol 600. Samples were injected into the capillary (with total length 32 cm and 50 μm id) at 2250 mbar*s and subsequent injection was performed for 30 s with 0.5 M KOH. The total analysis time was less than 9 min, accuracy was 98%, and precision was <2.6%. The LOD was 0.2 μM for homocysteine and 0.5 μM for cysteine. The use of liquid-liquid extraction allowed the precision and sensitivity of the CE method to be significantly increased. The validated method was applied to determine total cysteine and homocysteine content in human blood plasma and urine samples obtained from healthy volunteers and patients with kidney disorders. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Multi-scale Analysis of High Resolution Topography: Feature Extraction and Identification of Landscape Characteristic Scales

    Science.gov (United States)

    Passalacqua, P.; Sangireddy, H.; Stark, C. P.

    2015-12-01

    With the advent of digital terrain data, detailed information on terrain characteristics and on the scale and location of geomorphic features is available over extended areas. Our ability to observe landscapes and quantify topographic patterns has greatly improved, including the estimation of fluxes of mass and energy across landscapes. Challenges still remain in the analysis of high resolution topography data: the presence of features such as roads, for example, challenges classic methods for feature extraction, and large data volumes require computationally efficient extraction and analysis methods. Moreover, opportunities exist to define new robust metrics of landscape characterization for landscape comparison and model validation. In this presentation we cover recent research in multi-scale and objective analysis of high resolution topography data. We show how the probability density function of topographic attributes such as slope, curvature, and topographic index contains useful information for feature localization and extraction. The analysis of how these distributions change across scales, quantified by the behavior of modal values and interquartile range, allows the identification of landscape characteristic scales, such as terrain roughness. The methods are introduced on synthetic signals in one and two dimensions and then applied to a variety of landscapes with different characteristics. Validation of the methods includes the analysis of modeled landscapes where the noise distribution is known and features of interest are easily measured.
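
    A minimal sketch of the distribution-across-scales idea, assuming a gridded DEM held as a NumPy array: for each smoothing scale it computes the slope distribution and tracks its modal value and interquartile range, whose behaviour across scales is what the abstract uses to identify characteristic scales. Scale values and bin counts are illustrative, not the authors' settings.

    ```python
    import numpy as np
    from scipy import ndimage

    def slope_statistics(dem: np.ndarray, cell_size: float, scales=(1, 2, 4, 8)):
        """For each smoothing scale, return the modal slope and interquartile
        range of the slope distribution; a scale at which these statistics
        stabilise can be read as a characteristic scale (e.g. of roughness).
        """
        stats = {}
        for sigma in scales:
            smoothed = ndimage.gaussian_filter(dem.astype(np.float64), sigma=sigma)
            dzdy, dzdx = np.gradient(smoothed, cell_size)
            slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

            hist, edges = np.histogram(slope, bins=100)
            modal_slope = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
            q25, q75 = np.percentile(slope, [25, 75])
            stats[sigma] = {"mode": modal_slope, "iqr": q75 - q25}
        return stats
    ```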

  3. Potential of hot water extraction of birch wood to produce high-purity dissolving pulp after alkaline pulping.

    Science.gov (United States)

    Borrega, Marc; Tolonen, Lasse K; Bardot, Fanny; Testova, Lidia; Sixta, Herbert

    2013-05-01

    The potential of hot water extraction of birch wood to produce highly purified dissolving pulp in a subsequent soda-anthraquinone pulping process was evaluated. After intermediate extraction intensities, pulps with low xylan content (3-5%) and high cellulose yield were successfully produced. Increasing extraction intensity further decreased the xylan content in pulp. However, below a xylan content of 3%, the cellulose yield dramatically decreased. This is believed to be due to cleavage of glycosidic bonds in cellulose during severe hot water extractions, followed by peeling reactions during alkaline pulping. Addition of sodium borohydride as well as increased anthraquinone concentration in the pulping liquor increased the cellulose yield, but had no clear effects on pulp purity and viscosity. The low intrinsic viscosity of pulps produced after severe extraction intensities and soda-anthraquinone pulping corresponded to the viscosity at the leveling-off degree of polymerization, suggesting that nearly all amorphous cellulose had been degraded. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Vochysia rufa Stem Bark Extract Protects Endothelial Cells against High Glucose Damage

    Directory of Open Access Journals (Sweden)

    Neire Moura de Gouveia

    2017-02-01

    Background: Increased oxidative stress caused by persistent hyperglycemia is a widely accepted factor in the vascular damage responsible for type 2 diabetes complications. The plant Vochysia rufa (Vr) has been used in folk medicine in Brazil for the treatment of diabetes. Thus, the protective effect of a Vr stem bark extract against a challenge by a high glucose concentration on EA.hy926 (EA) endothelial cells is evaluated. Methods: Plant material was extracted with distilled water by maceration and evaporated to dryness under vacuum; its components were then identified by capillary electrophoresis-tandem mass spectrometry. Cell viability was evaluated on EA cells treated with 0.5-100 µg/mL of the Vr extract for 24 h. The extract was diluted at concentrations of 5, 10 and 25 µg/mL and maintained for 24 h along with 30 mM glucose to evaluate its protective effect on reduced glutathione (GSH), glutathione peroxidase (GPx) and reductase (GR), and protein carbonyl groups. Results: V. rufa stem bark is composed mainly of sugars, such as inositol, galactose, glucose, mannose, sucrose, arabinose and ribose. Treatment with Vr up to 100 µg/mL for 24 h did not affect cell viability. Treatment of EA cells with 30 mM glucose for 24 h significantly increased cell damage. EA cells treated with 30 mM glucose showed a decrease in GSH concentration and increases in reactive oxygen species (ROS), antioxidant enzyme activity and protein carbonyl levels, compared to control. Co-treatment of EA cells with 30 mM glucose plus 1-10 μg/mL Vr significantly reduced cell damage, while 5-25 μg/mL Vr evoked significant protection against the glucose insult, recovering ROS, GSH, antioxidant enzymes and carbonyls to baseline levels. Conclusion: V. rufa extract protects endothelial cells against oxidative damage by modulating ROS, GSH concentration, antioxidant enzyme activity and protein carbonyl levels.

  5. High temperature solvent extraction of oil shale and bituminous coal using binary solvent mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Goetz, G.K.E. [Lehrstuhl fuer Geologie, Geochemie und Lagerstaetten des Erdoels und der Kohle, RWTH Aachen (Germany)

    1997-12-31

    A high volatile bituminous coal from the Saar Basin and an oil shale from the Messel deposit, both in Germany, were extracted with binary solvent mixtures using the Advanced Solvent Extraction method (ASE). Extraction temperatures were kept at 100 C and 150 C, respectively, at a pressure of 20.7 MPa. After the heating phase (5 min), static extractions were performed for a further 5 min with mixtures (v:v, 1:3) of methanol with toluene or trichloromethane, respectively. Extract yields were the same as or higher than those from classical Soxhlet extractions (3 days) using the same solvents at 60 C, and similar to those from supercritical fluid extraction (SFE). Increasing the temperature in ASE releases more soluble organic matter from geological samples, because compounds with higher molecular weight and especially more polar substances are solubilized; the extraction efficiency also increases for the aliphatic and aromatic hydrocarbons used as biomarkers in organic geochemistry. Application of thermochemolysis with tetraethylammonium hydroxide (TEAH) using pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) on the extraction residues shows clearly that at higher extraction temperatures minor amounts of free fatty acids or their methyl esters (original or produced by ASE) were trapped inside the pore systems of the oil shale or the bituminous coal. ASE offers a rapid and very efficient extraction method for geological samples, reducing analysis time and solvent costs. (orig.)

  6. Inhibitory Effects of Ecklonia cava Extract on High Glucose-Induced Hepatic Stellate Cell Activation

    Directory of Open Access Journals (Sweden)

    Akiko Kojima-Yuasa

    2011-12-01

    Nonalcoholic steatohepatitis (NASH) is a disease closely associated with obesity and diabetes. The prevalence of type 2 diabetes and a high body mass index in cryptogenic cirrhosis may imply that obesity leads to cirrhosis. Here, we examined the effects of an extract of Ecklonia cava, a brown alga, on the activation of high glucose-induced hepatic stellate cells (HSCs), key players in hepatic fibrosis. Isolated HSCs were incubated with or without a high glucose concentration. Ecklonia cava extract (ECE) was added to the culture simultaneously with the high glucose. Treatment with high glucose stimulated expression of type I collagen and α-smooth muscle actin, which are markers of activation in HSCs, in a dose-dependent manner. The activation of high glucose-treated HSCs was suppressed by ECE. An increase in the formation of intracellular reactive oxygen species (ROS) and a decrease in intracellular glutathione levels were observed soon after treatment with high glucose, and these changes were suppressed by the simultaneous addition of ECE. High glucose levels stimulated the secretion of bioactive transforming growth factor-β (TGF-β) from the cells, and this stimulation was also suppressed by treating the HSCs with ECE. These results suggest that the suppression of high glucose-induced HSC activation by ECE is mediated through the inhibition of ROS and/or GSH changes and the downregulation of TGF-β secretion. ECE is useful for preventing the development of diabetic liver fibrosis.

  7. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    After summarizing the advantages and disadvantages of current integration methods, a novel vibration signal integration method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering and waveform-correction properties of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. It combines the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction, merging the values of these four indexes into a feature vector. The characteristic components contained in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integral signals are precisely reconstructed. With this method, the interference from invalid signal content such as trend items and noise, which plagues traditional methods, is effectively removed; the large cumulative error of traditional time-domain integration is overcome; and the large low-frequency error of traditional frequency-domain integration is avoided. Compared with traditional integration methods, this method excels at removing noise while retaining useful feature information, and shows higher accuracy.
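
    Assuming the signal has already been decomposed into components (e.g. by EEMD, as the abstract describes), the sketch below computes the four-element feature vector named in the abstract, selects components by Euclidean distance to a reference feature vector, and integrates their sum once. The sliding-window length, the choice of reference vector, and the number of kept components are assumptions, not the authors' settings.

    ```python
    import numpy as np
    from scipy.stats import kurtosis
    from scipy.integrate import cumulative_trapezoid

    def feature_vector(component: np.ndarray, raw: np.ndarray) -> np.ndarray:
        """Four indexes named in the abstract: kurtosis, mean square error
        (here taken against the raw signal), energy, and the largest singular
        value of a sliding-window trajectory matrix (window length assumed)."""
        mse = float(np.mean((component - raw) ** 2))
        energy = float(np.sum(component ** 2))
        traj = np.lib.stride_tricks.sliding_window_view(component, 32)
        sv_max = float(np.linalg.svd(traj, compute_uv=False)[0])
        return np.array([kurtosis(component), mse, energy, sv_max])

    def integrate_selected(components, raw, reference_fv, fs, keep=3):
        """Keep the components whose feature vectors are closest (Euclidean
        distance) to a reference feature vector, then integrate their sum once
        with the trapezoid rule, e.g. from acceleration to velocity."""
        dists = [np.linalg.norm(feature_vector(c, raw) - reference_fv) for c in components]
        order = np.argsort(dists)[:keep]
        accel = np.sum([components[i] for i in order], axis=0)
        return cumulative_trapezoid(accel, dx=1.0 / fs, initial=0.0)
    ```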

  8. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Directory of Open Access Journals (Sweden)

    Koon-Kiu Yan

    The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.
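
    A small simulation of the random multiplicative process invoked above, followed by a lognormal fit of the resulting cumulative view counts; the growth rates, paper count and time span are invented for illustration and are not the PLoS data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Random multiplicative growth: each day the cumulative view count is
    # multiplied by a random factor, so log(views) performs a random walk and
    # the cross-paper distribution tends toward a lognormal.
    n_papers, n_days = 5000, 365
    daily_log_rate = rng.normal(loc=0.01, scale=0.05, size=(n_papers, n_days))
    views = 100.0 * np.exp(daily_log_rate.cumsum(axis=1))[:, -1]

    # Fit a lognormal to the simulated cumulative views, mirroring the fit
    # applied to the empirical HTML-view counts in the study.
    shape, loc, scale = stats.lognorm.fit(views, floc=0)
    print(f"fitted sigma={shape:.3f}, median={scale:.1f} views")
    ```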

  9. The spread of scientific information: insights from the web usage statistics in PLoS article-level metrics.

    Science.gov (United States)

    Yan, Koon-Kiu; Gerstein, Mark

    2011-01-01

    The presence of web-based communities is a distinctive signature of Web 2.0. The web-based feature means that information propagation within each community is highly facilitated, promoting complex collective dynamics in view of information exchange. In this work, we focus on a community of scientists and study, in particular, how the awareness of a scientific paper is spread. Our work is based on the web usage statistics obtained from the PLoS Article Level Metrics dataset compiled by PLoS. The cumulative number of HTML views was found to follow a long tail distribution which is reasonably well-fitted by a lognormal one. We modeled the diffusion of information by a random multiplicative process, and thus extracted the rates of information spread at different stages after the publication of a paper. We found that the spread of information displays two distinct decay regimes: a rapid downfall in the first month after publication, and a gradual power law decay afterwards. We identified these two regimes with two distinct driving processes: a short-term behavior driven by the fame of a paper, and a long-term behavior consistent with citation statistics. The patterns of information spread were found to be remarkably similar in data from different journals, but there are intrinsic differences for different types of web usage (HTML views and PDF downloads versus XML). These similarities and differences shed light on the theoretical understanding of different complex systems, as well as a better design of the corresponding web applications that is of high potential marketing impact.

  10. Impact of natural gas extraction on PAH levels in ambient air.

    Science.gov (United States)

    Paulik, L Blair; Donald, Carey E; Smith, Brian W; Tidwell, Lane G; Hobbie, Kevin A; Kincl, Laurel; Haynes, Erin N; Anderson, Kim A

    2015-04-21

    Natural gas extraction, often referred to as "fracking," has increased rapidly in the U.S. in recent years. To address potential health impacts, passive air samplers were deployed in a rural community heavily affected by the natural gas boom. Samplers were analyzed for 62 polycyclic aromatic hydrocarbons (PAHs). Results were grouped based on distance from each sampler to the nearest active well. PAH levels were highest when samplers were closest to active wells. Additionally, PAH levels closest to natural gas activity were an order of magnitude higher than levels previously reported in rural areas. Sourcing ratios indicate that PAHs were predominantly petrogenic, suggesting that elevated PAH levels were influenced by direct releases from the earth. Quantitative human health risk assessment estimated the excess lifetime cancer risks associated with exposure to the measured PAHs. Closest to active wells, the risk estimated for maximum residential exposure was 2.9 in 10 000, which is above the U.S. EPA's acceptable risk level. Overall, risk estimates decreased 30% when comparing results from samplers closest to active wells to those farthest. This work suggests that natural gas extraction may be contributing significantly to PAHs in air, at levels that are relevant to human health.
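
    For readers who want to see the arithmetic behind an estimate such as "2.9 in 10,000", the sketch below applies the generic U.S. EPA inhalation risk framework (excess lifetime cancer risk = unit risk x time-weighted exposure concentration). All numerical inputs are placeholders and the unit risk value is an assumption, not data from this study.

    ```python
    # Hedged sketch of an excess lifetime cancer risk (ELCR) calculation for
    # inhaled PAHs: ELCR = IUR * EC, with EC a time-weighted exposure
    # concentration. Every number below is an assumed placeholder.

    iur_bap = 6.0e-4            # inhalation unit risk for benzo[a]pyrene, per (ug/m3) -- assumed
    ca_bap_eq = 1.0e-3          # air concentration in BaP equivalents, ug/m3 -- assumed
    et, ef, ed = 24.0, 350.0, 70.0   # hours/day, days/year, years of exposure -- assumed
    at_hours = 70.0 * 365.0 * 24.0   # averaging time for a 70-year lifetime, in hours

    exposure_concentration = ca_bap_eq * et * ef * ed / at_hours
    elcr = iur_bap * exposure_concentration
    print(f"excess lifetime cancer risk ~ {elcr:.1e}")   # compare to the 1e-4 benchmark
    ```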

  11. Impact of natural gas extraction on PAH levels in ambient air

    Science.gov (United States)

    Paulik, L. Blair; Donald, Carey E.; Smith, Brian W.; Tidwell, Lane G.; Hobbie, Kevin A.; Kincl, Laurel; Haynes, Erin N.; Anderson, Kim A.

    2015-01-01

    Natural gas extraction, often referred to as “fracking,” has increased rapidly in the U.S. in recent years. To address potential health impacts, passive air samplers were deployed in a rural community heavily affected by the natural gas boom. Samplers were analyzed for 62 polycyclic aromatic hydrocarbons (PAHs). Results were grouped based on distance from each sampler to the nearest active well. PAH levels were highest when samplers were closest to active wells. Additionally, PAH levels closest to natural gas activity were an order of magnitude higher than levels previously reported in rural areas. Sourcing ratios indicate that PAHs were predominantly petrogenic, suggesting that elevated PAH levels were influenced by direct releases from the earth. Quantitative human health risk assessment estimated the excess lifetime cancer risks associated with exposure to the measured PAHs. Closest to active wells, the risk estimated for maximum residential exposure was 2.9 in 10,000, which is above the U.S. EPA's acceptable risk level. Overall, risk estimates decreased 30% when comparing results from samplers closest to active wells to those farthest. This work suggests that natural gas extraction may be contributing significantly to PAHs in air, at levels that are relevant to human health. PMID:25810398

  12. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources, there is a vital need to apply advanced computer technologies to support all-source analysis. Techniques of data warehousing, data mining, and data analysis can provide analysts with tools that expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  13. SPMK AND GRABCUT BASED TARGET EXTRACTION FROM HIGH RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    W. Cui

    2016-06-01

    Target detection and extraction from high resolution remote sensing images is a basic and widely needed application. In this paper, to improve the efficiency of image interpretation, we propose a combined detection and segmentation method to realize semi-automatic target extraction. We introduce the dense transformed color scale invariant feature transform (TC-SIFT) descriptor and the histogram of oriented gradients and HSV (HOG & HSV) descriptor to characterize the spatial structure and color information of the targets. With the k-means clustering method, we build the bag of visual words and then adopt a three-level spatial pyramid (SP) to represent each target patch. After gathering many different kinds of target image patches from high resolution UAV images, and using the TC-SIFT-SP and multi-scale HOG & HSV features, we construct an SVM classifier to detect the targets. In this paper, buildings are taken as the targets, and experimental results show that the building detection accuracy reaches above 90%. The detection results are a series of rectangular target regions; we select these rectangles as foreground candidates and adopt a GrabCut-based, boundary-regularized, semi-automatic interactive segmentation algorithm to obtain the accurate boundary of each target. Experimental results show the accuracy and efficiency of the approach, which can be an effective way to extract such special targets.
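
    The detection-then-GrabCut refinement step can be sketched with OpenCV as below, using a detector's bounding rectangle to initialise GrabCut and returning a binary building mask. This is an illustrative stand-in that omits the paper's boundary-regularisation step; the function name and iteration count are assumptions.

    ```python
    import cv2
    import numpy as np

    def refine_with_grabcut(image_bgr: np.ndarray, detection_rect, iterations=5):
        """Use a detector's bounding rectangle (x, y, w, h) as the GrabCut
        initialisation and return a binary building mask."""
        mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
        bgd_model = np.zeros((1, 65), dtype=np.float64)   # required scratch buffers
        fgd_model = np.zeros((1, 65), dtype=np.float64)

        cv2.grabCut(image_bgr, mask, detection_rect, bgd_model, fgd_model,
                    iterations, cv2.GC_INIT_WITH_RECT)

        # Pixels marked as certain or probable foreground become the target.
        return np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD),
                        255, 0).astype(np.uint8)
    ```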

  14. Actinide partitioning from high level liquid waste using the Diamex process

    International Nuclear Information System (INIS)

    Madic, C.; Blanc, P.; Condamines, N.; Baron, P.; Berthon, L.; Nicol, C.; Pozo, C.; Lecomte, M.; Philippe, M.; Masson, M.; Hequet, C.

    1994-01-01

    The removal of the long-lived radionuclides belonging to the so-called minor actinide elements (neptunium, americium and curium) from the high level nuclear wastes separated during the reprocessing of irradiated nuclear fuels, in order to transmute them into short-lived nuclides, can substantially decrease the potential hazards associated with the management of these nuclear wastes. In order to separate minor actinides from high-level liquid wastes (HLLW), a liquid-liquid extraction process was considered, based on the use of diamide molecules, which have the property of being totally burnable and thus do not generate secondary solid wastes. The main extracting properties of dimethyldibutyltetradecylmalonamide (DMDBTDMA), the diamide selected for the development of the DIAMEX process, are briefly described in this paper. Hot tests of the DIAMEX process (using DMDBTDMA) for the treatment of a mixed oxide fuel (MOX) type HLLW were successfully performed, and the minor actinide decontamination factors obtained for the HLLW were encouraging. The main results of these tests are presented and discussed in this paper. (authors). 9 refs., 2 figs., 7 tabs

  15. Plantago maxima leaves extract inhibits adipogenic action of a high-fat diet in female Wistar rats.

    Science.gov (United States)

    Tinkov, Alexey A; Nemereshina, Olga N; Popova, Elizaveta V; Polyakova, Valentina S; Gritsenko, Viktor A; Nikonorov, Alexandr A

    2014-04-01

    The primary objective of this study was to investigate the content of biologically active antioxidant compounds in Plantago maxima and their influence on the main mechanisms of dietary obesity development. Biologically active compounds in P. maxima were tested using paper chromatography. In the in vivo experiment, high-fat-fed Wistar rats received P. maxima water extract for 3 months. Morphometric parameters, weight gain, serum adipokines and cytokines, as well as oxidative stress biomarkers in the rats' tissues were evaluated. Gut microflora was also examined. The Plantago maxima leaves used in the experiment contained significant amounts of flavonoids, iridoids, phenol carboxylic acids, tannins and ascorbic acid. The in vivo data demonstrate that P. maxima water extract prevents excessive adiposity in a diet-induced model. P. maxima consumption reduced serum leptin (twofold), macrophage chemoattractant protein-1 (sevenfold), tumor necrosis factor-α (25%), and interleukin-6 (26%) levels. The extract also decreased adipose tissue oxidative stress biomarkers in rats fed a high-fat diet. In addition, the increased bacterial growth seen in the diet-induced obesity model was reversed by the P. maxima extract treatment. Plantago maxima water extract thus possessed antiadipogenic, antidiabetic, anti-inflammatory and antioxidant activity, and normalized gut microflora in a rat model of diet-induced excessive adiposity, owing to its high content of biologically active compounds.

  16. High-Q Bandpass Comb Filter for Mains Interference Extraction

    Directory of Open Access Journals (Sweden)

    Neycheva T.

    2009-12-01

    This paper presents a simple digital high-Q bandpass comb filter for extracting power-line (PL) or other periodic interference. The filter concept relies on correlated signal averaging, which produces alternately constructive and destructive spectral interference, i.e. the so-called comb frequency response. The filter is evaluated by Matlab simulations with a real ECG signal contaminated with low-amplitude PL interference. The simulations show that the filter accurately extracts the PL interference. It has high-Q notches only at the PL odd harmonics and is appropriate for extracting any kind of odd-harmonic interference, including rectangular waveforms. The filter is suitable for real-time operation with popular low-cost microcontrollers.
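
    One possible digital realisation of the 'correlated average' comb idea is the recursive resonator sketched below, which has unity-gain, high-Q responses at the odd harmonics of the power-line frequency, so the interference component is passed while in-band signal is strongly attenuated. It is a sketch under stated assumptions (sampling rate, alpha, structure), not necessarily the authors' exact filter.

    ```python
    import numpy as np

    def comb_bandpass(x: np.ndarray, fs: float, f0: float = 50.0, alpha: float = 0.05):
        """Recursive comb resonator with unity-gain peaks at the odd harmonics
        of f0, used here to extract a power-line-like interference component.
        alpha sets the Q (smaller alpha -> sharper, slower-settling peaks).
        """
        m = int(round(fs / (2.0 * f0)))     # half the power-line period, in samples
        y = np.zeros_like(x, dtype=float)
        for n in range(len(x)):
            feedback = y[n - m] if n >= m else 0.0
            y[n] = alpha * x[n] - (1.0 - alpha) * feedback
        return y

    # Example: a 10 Hz 'ECG-like' wave plus 50 Hz interference, sampled at 1 kHz.
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 50 * t)
    interference_estimate = comb_bandpass(signal, fs)
    ```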

  17. Coastal barrier stratigraphy for Holocene high-resolution sea-level reconstruction.

    Science.gov (United States)

    Costas, Susana; Ferreira, Óscar; Plomaritis, Theocharis A; Leorri, Eduardo

    2016-12-08

    The uncertainties surrounding present and future sea-level rise have revived the debate around sea-level changes through the deglaciation and mid- to late Holocene, from which arises a need for high-quality reconstructions of regional sea level. Here, we explore the stratigraphy of a sandy barrier to identify the best sea-level indicators and provide a new sea-level reconstruction for the central Portuguese coast over the past 6.5 ka. The selected indicators represent morphological features extracted from coastal barrier stratigraphy, beach berm and dune-beach contact. These features were mapped from high-resolution ground penetrating radar images of the subsurface and transformed into sea-level indicators through comparison with modern analogs and a chronology based on optically stimulated luminescence ages. Our reconstructions document a continuous but slow sea-level rise after 6.5 ka with an accumulated change in elevation of about 2 m. In the context of SW Europe, our results show good agreement with previous studies, including the Tagus isostatic model, with minor discrepancies that demand further improvement of regional models. This work reinforces the potential of barrier indicators to accurately reconstruct high-resolution mid- to late Holocene sea-level changes through simple approaches.

  18. INFORMATION EXTRACTION IN TOMB PIT USING HYPERSPECTRAL DATA

    Directory of Open Access Journals (Sweden)

    X. Yang

    2018-04-01

    Hyperspectral data are characterized by many continuous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging technology is adopted to recognize the bottom images of an ancient tomb located in Shanxi province. There are many black remains on the bottom surface of the tomb, which are suspected to be meaningful texts or paintings. Firstly, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest; for convenience of computation and storage, the original reflectance values are multiplied by 10000. Secondly, three methods are used to extract the symbols at the bottom of the ancient tomb. Finally, morphological operations are used to connect the symbols, and fifteen reference images are given. The results show that information extraction based on hyperspectral data can provide a better visual result, which is beneficial to the study of ancient tombs by researchers and provides references for archaeological research.

  19. Information Extraction in Tomb Pit Using Hyperspectral Data

    Science.gov (United States)

    Yang, X.; Hou, M.; Lyu, S.; Ma, S.; Gao, Z.; Bai, S.; Gu, M.; Liu, Y.

    2018-04-01

    Hyperspectral data are characterized by many contiguous bands, large data volume, redundancy, and non-destructive acquisition. These characteristics make it possible to use hyperspectral data to study cultural relics. In this paper, hyperspectral imaging is used to recognize images on the bottom of an ancient tomb located in Shanxi province. The bottom surface of the tomb carries many black remains that are suspected to be meaningful texts or paintings. First, the hyperspectral data are preprocessed to obtain the reflectance of the region of interest; for convenience of computation and storage, the reflectance values are multiplied by 10000. Second, three methods are used to extract the symbols on the tomb bottom. Finally, morphological operations are applied to connect the symbols, and fifteen reference images are produced. The results show that information extraction based on hyperspectral data yields a better visual result, which benefits researchers studying ancient tombs and provides references for archaeological research.

  20. The activity of Stichopus hermanii extract on triglyceride serum level in periodontitis

    Directory of Open Access Journals (Sweden)

    Rima Parwati Sari

    2011-06-01

    Full Text Available Background: The triglyceride level can be used as a parameter of hypercholesterolemia, and periodontitis can worsen a hypercholesterolemic condition. Stichopus hermanii extract is a source of fatty acids, including omega-3, which can decrease the blood triglyceride level. Purpose: The aim of this research was to investigate the effect of Stichopus hermanii extract on the blood triglyceride level of Wistar rats with periodontitis. Methods: Thirty rats were divided into 5 groups: group K(–) as the negative control (no treatment), group K(+) as the positive control (induced with a periodontopathogen mix), group P1 (periodontopathogen mix plus Stichopus hermanii extract, 0.09 ml/kg BW), group P2 (periodontopathogen mix plus extract, 0.18 ml/kg BW), and group P3 (periodontopathogen mix plus extract, 0.36 ml/kg BW). All rats were then sacrificed and serum triglyceride levels were measured. Results: ANOVA showed a significant overall effect. The LSD test showed significant differences between groups K(–) and K(+), and between group K(+) and groups P2 and P3. Conclusion: Stichopus hermanii extract can decrease the blood triglyceride level in Wistar rats with periodontitis.

  1. [Evaluation of the results of high-speed handpiece and minimally invasive extraction in impacted mandibular third molar extraction].

    Science.gov (United States)

    Yang, Ying-yang; DU, Sheng-nan; Lv, Zong-kai

    2015-08-01

    To compare the outcomes of high-speed handpiece and minimally invasive extraction with those of the traditional method in impacted mandibular third molar extraction. From May 2011 to May 2014, 83 patients undergoing impacted mandibular third molar extraction were enrolled in the study and randomly divided into 2 groups: 42 patients in group A (experimental group) and 41 patients in group B (control group). Group B underwent extraction with the traditional method, while group A underwent high-speed handpiece and minimally invasive extraction of the impacted mandibular third molar. The occurrences of root fracture, gingival laceration, tooth mobility, lingual bone plate fracture, jaw fracture and dislocation of the temporomandibular joint during operation, and of lower lip numbness, dry socket, facial swelling and limitation of mouth opening after operation, were observed and compared between the 2 groups. The operation time, integrity of the extraction sockets, VAS pain scores and patient satisfaction were recorded and compared. SPSS 19.0 was used for statistical analysis. The occurrences of root fracture, gingival laceration, tooth mobility, lingual bone plate fracture, jaw fracture, and dislocation of the temporomandibular joint during operation in group A decreased significantly compared with group B (P<0.05). The integrity of the extraction sockets, VAS pain scores and satisfaction scores in group A improved significantly compared with group B (P<0.05). High-speed handpiece and minimally invasive extraction should be widely used for impacted mandibular third molars, owing to its simple operation, high efficiency, minimal trauma, and few perioperative complications.

  2. Urban Boundary Extraction and Urban Sprawl Measurement Using High-Resolution Remote Sensing Images: a Case Study of China's Provincial Capitals

    Science.gov (United States)

    Wang, H.; Ning, X.; Zhang, H.; Liu, Y.; Yu, F.

    2018-04-01

    Urban boundary is an important indicator for urban sprawl analysis. However, methods of urban boundary extraction have been inconsistent, and construction land or urban impervious surfaces derived from coarse-resolution images were usually used to represent urban areas, resulting in lower precision and incomparable urban boundary products. To solve these problems, a semi-automatic method of urban boundary extraction was proposed using high-resolution imagery and geographic information data. Urban landscape and form characteristics and geographical knowledge were combined to generate a series of standardized rules for urban boundary extraction. Urban boundaries of China's 31 provincial capitals in the years 2000, 2005, 2010 and 2015 were extracted with this method. Compared with two other open urban boundary products, the boundaries produced in this study had the highest accuracy. The urban boundaries, together with other thematic data, were integrated to measure and analyse urban sprawl. Results showed that China's provincial capitals underwent rapid urbanization from 2000 to 2015, with the total area growing from 6520 square kilometres to 12398 square kilometres. Urban area differed markedly among the provincial capitals and showed a high degree of concentration. Urban land became more intensively used in general, and the urban sprawl rate was not in harmony with the population growth rate. About sixty percent of the new urban areas came from cultivated land. The paper provides a consistent method of urban boundary extraction and urban sprawl measurement using high-resolution remote sensing images, and the urban sprawl results for China's provincial capitals provide valuable urbanization information for government and the public.
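    As a worked example of the sprawl measurement reported above, the snippet below computes overall and compound annual expansion rates from the aggregate areas quoted in the abstract (6520 km² in 2000 to 12398 km² in 2015); the choice of a compound-growth formula is an assumption, since the abstract does not state its rate definition.

    ```python
    # total urban area of China's provincial capitals (km^2), from the abstract
    area_2000, area_2015 = 6520.0, 12398.0
    years = 2015 - 2000

    total_growth = (area_2015 - area_2000) / area_2000        # ~0.90, i.e. ~90 %
    annual_rate = (area_2015 / area_2000) ** (1 / years) - 1   # ~0.044, i.e. ~4.4 %/yr

    print(f"total expansion: {total_growth:.1%}, compound annual rate: {annual_rate:.1%}")
    ```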

  3. Investigation of prostaglandin levels in human milk after high performance liquid chromatography purification

    International Nuclear Information System (INIS)

    Wu-Wang, C.Y.; Neu, J.

    1986-01-01

    This study was conducted to investigate five prostaglandins (PGs), i.e. PGE2, PGF2α, 13,14-dihydro-15-keto-PGF2α (DHKF2α), thromboxane B2 (TXB2) and 6-keto-PGF1α, measured by radioimmunoassay (RIA) after C18 Sep-Pak extraction and reverse-phase high performance liquid chromatography (HPLC). Two trials were performed. In each trial, 3-5 mature human milk samples were pooled, acidified and extracted for PGs. The separation of PGs by HPLC was achieved using an isocratic solvent system of acetonitrile/water (pH 3.0) (32/68, v/v). The PG levels from the two trials were determined and averaged after monitoring the recoveries. The results indicate that PGE2 and DHKF2α are the two major PGs found in extracted human milk. However, after HPLC purification no single PG predominates, and the levels of all five PGs are much lower than in the extracted sample. Since immunoreactive material was also detected in HPLC fractions outside the PG peaks, the low PG levels found in human milk after HPLC are likely due to the purification step removing the bulk of nonspecific immunoreactive substances present in the sample.

  4. Strategies for the extraction and analysis of non-extractable polyphenols from plants.

    Science.gov (United States)

    Domínguez-Rodríguez, Gloria; Marina, María Luisa; Plaza, Merichel

    2017-09-08

    The majority of studies of phenolic compounds from plants focus on the extractable fraction derived from an aqueous or aqueous-organic extraction. However, an important fraction of polyphenols is ignored because it remains retained in the extraction residue. These are the so-called non-extractable polyphenols (NEPs), which are high molecular weight polymeric polyphenols or individual low molecular weight phenolics associated with macromolecules. The scarce information available about NEPs shows that these compounds possess interesting biological activities, which is why interest in them has been increasing in recent years. Furthermore, the extraction and characterization of NEPs are considered a challenge because the analytical methodologies developed so far have some limitations. Thus, the present literature review summarizes current knowledge of NEPs and the different methodologies for their extraction, with a particular focus on hydrolysis treatments. In addition, this review provides information on the most recent developments in the purification, separation, identification and quantification of NEPs from plants. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. High-level radioactive wastes. Supplement 1

    International Nuclear Information System (INIS)

    McLaren, L.H.

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations

  6. Effect of cinnamon extract on blood glucose level and lipid profile in alloxan induced diabetic rats

    International Nuclear Information System (INIS)

    Mahmood, S.; Khurshid, R.

    2011-01-01

    Background: Cinnamon has been shown to potentiate the hypoglycaemic effect of insulin through up-regulation of glucose uptake in cultured rat adipocytes. This study tried to determine the effect of cinnamon alone or in combination with insulin in diabetic albino rats. Methods: Thirty rats were divided into three groups. Group A rats were given cinnamon extract at 200 mg/kg body weight daily orally and group B rats were given cinnamon extract at 400 mg/kg body weight daily. After six weeks, blood glucose and lipid profile levels were evaluated in all groups. Results: The group of rats given 200 mg/kg cinnamon extract showed a significant decrease in blood glucose concentration but little or no change in lipid parameters, including serum cholesterol, triglyceride and lipoproteins (HDL, LDL-chol). The group of rats given 400 mg/kg cinnamon extract showed a better, though non-significant, change in lipid-related parameters, while the blood glucose level was again significantly decreased. Conclusion: Cinnamon at a dose of 400 mg/kg showed the same effect on blood glucose but a better effect on the lipid profile, especially serum cholesterol, compared with 200 mg/kg of cinnamon extract. Cinnamon may be recommended as a hypoglycaemic herb but not as a hypolipidaemic herb. (author)

  7. FPGA based compute nodes for high level triggering in PANDA

    International Nuclear Information System (INIS)

    Kuehn, W; Gilardi, C; Kirschner, D; Lang, J; Lange, S; Liu, M; Perez, T; Yang, S; Schmitt, L; Jin, D; Li, L; Liu, Z; Lu, Y; Wang, Q; Wei, S; Xu, H; Zhao, D; Korcyl, K; Otwinowski, J T; Salabura, P

    2008-01-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10^7/s and data rates of several hundred Gb/s. FPGA-based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high-level C-like hardware description languages will be considered to implement the firmware.

  8. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    Science.gov (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.

  9. Phase extracting algorithms analysis in the white-light spectral interferometry

    Science.gov (United States)

    Guo, Tong; Li, Bingtong; Li, Minghui; Chen, Jinping; Fu, Xing; Hu, Xiaotang

    2018-01-01

    As an optical testing method, white-light spectral interferometry is non-contact and highly precise. The phase information can be obtained by analyzing the spectral interference signal of the tested sample, and the absolute distance is then calculated. The Fourier transform method, the temporal phase-shifting method, the spatial phase-shifting method and the envelope method can be used to extract the phase information of the spectral interference signal. In this paper, the performance of the four phase-extraction methods is simulated and analyzed using an ideal spectral interference signal. It turns out that the temporal phase-shifting method offers high precision, the results of the Fourier transform and envelope methods are distorted at the edges of the signal, and the spatial phase-shifting method has the worst precision. When different levels of white noise are added to the ideal signal, the temporal phase-shifting method remains the most accurate, while the Fourier transform and envelope methods perform relatively poorly. Finally, an absolute distance measurement experiment was carried out on the constructed test system, and the results are consistent with the simulations.
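    A minimal sketch of the temporal phase-shifting approach evaluated above, using the standard four-step (π/2) algorithm; the cosine signal model and the π/2 step size are textbook assumptions rather than details taken from this paper.

    ```python
    import numpy as np

    def four_step_phase(i0, i1, i2, i3):
        """Recover the wrapped phase from four interference records taken with
        phase shifts of 0, pi/2, pi and 3*pi/2 (standard 4-step algorithm).
        Returns the phase wrapped to (-pi, pi]; unwrapping and conversion to
        absolute distance are separate steps not shown here."""
        return np.arctan2(i3 - i1, i0 - i2)

    # quick check with a simulated signal I_k = a + b*cos(phi + k*pi/2)
    phi = 1.2
    frames = [3.0 + 2.0 * np.cos(phi + k * np.pi / 2) for k in range(4)]
    print(four_step_phase(*frames))   # ~1.2
    ```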

  10. Validation and extraction of molecular-geometry information from small-molecule databases.

    Science.gov (United States)

    Long, Fei; Nicholls, Robert A; Emsley, Paul; Gražulis, Saulius; Merkys, Andrius; Vaitkus, Antanas; Murshudov, Garib N

    2017-02-01

    A freely available small-molecule structure database, the Crystallography Open Database (COD), is used for the extraction of molecular-geometry information on small-molecule compounds. The results are used for the generation of new ligand descriptions, which are subsequently used by macromolecular model-building and structure-refinement software. To increase the reliability of the derived data, and therefore the new ligand descriptions, the entries from this database were subjected to very strict validation. The selection criteria made sure that the crystal structures used to derive atom types, bond and angle classes are of sufficiently high quality. Any suspicious entries at a crystal or molecular level were removed from further consideration. The selection criteria included (i) the resolution of the data used for refinement (entries solved at 0.84 Å resolution or higher) and (ii) the structure-solution method (structures must be from a single-crystal experiment and all atoms of generated molecules must have full occupancies), as well as basic sanity checks such as (iii) consistency between the valences and the number of connections between atoms, (iv) acceptable bond-length deviations from the expected values and (v) detection of atomic collisions. The derived atom types and bond classes were then validated using high-order moment-based statistical techniques. The results of the statistical analyses were fed back to fine-tune the atom typing. The developed procedure was repeated four times, resulting in fine-grained atom typing, bond and angle classes. The procedure will be repeated in the future as and when new entries are deposited in the COD. The whole procedure can also be applied to any source of small-molecule structures, including the Cambridge Structural Database and the ZINC database.
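    A minimal sketch of the kind of entry-level screening described above; the record fields, the example bond-length table, and the 0.1 Å tolerance are illustrative assumptions, and only the resolution, occupancy, and bond-length checks are shown.

    ```python
    # a hypothetical, much simplified entry; real COD records carry far more metadata
    entry = {
        "resolution": 0.78,                       # Angstrom
        "single_crystal": True,
        "occupancies": [1.0, 1.0, 1.0],
        "bonds": [("C", "C", 1.54), ("C", "O", 1.43)],
    }

    EXPECTED = {("C", "C"): 1.54, ("C", "O"): 1.43}   # illustrative bond classes (Angstrom)
    MAX_DEV = 0.10                                    # assumed acceptable deviation

    def passes_screening(e):
        """Apply simplified versions of criteria (i), (ii) and (iv) from the abstract."""
        if e["resolution"] > 0.84:                      # criterion (i)
            return False
        if not e["single_crystal"]:                     # part of criterion (ii)
            return False
        if any(occ < 1.0 for occ in e["occupancies"]):  # part of criterion (ii)
            return False
        for a, b, length in e["bonds"]:                 # criterion (iv)
            expected = EXPECTED.get((a, b)) or EXPECTED.get((b, a))
            if expected is not None and abs(length - expected) > MAX_DEV:
                return False
        return True

    print(passes_screening(entry))   # True for this toy record
    ```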

  11. A sediment extraction and cleanup method for wide-scope multitarget screening by liquid chromatography-high-resolution mass spectrometry.

    Science.gov (United States)

    Massei, Riccardo; Byers, Harry; Beckers, Liza-Marie; Prothmann, Jens; Brack, Werner; Schulze, Tobias; Krauss, Martin

    2018-01-01

    Previous studies on organic sediment contaminants focused mainly on a limited number of highly hydrophobic micropollutants accessible to gas chromatography using nonpolar, aprotic extraction solvents. The development of liquid chromatography-high-resolution mass spectrometry (LC-HRMS) permits the spectrum of analysis to be expanded to a wider range of more polar and ionic compounds present in sediments and allows target, suspect, and nontarget screening to be conducted with high sensitivity and selectivity. In this study, we propose a comprehensive multitarget extraction and sample preparation method for characterization of sediment pollution covering a broad range of physicochemical properties that is suitable for LC-HRMS screening analysis. We optimized pressurized liquid extraction, cleanup, and sample dilution for a target list of 310 compounds. Finally, the method was tested on sediment samples from a small river and its tributaries. The results show that the combination of 100 °C for ethyl acetate-acetone (50:50, neutral extract) followed by 80 °C for acetone-formic acid (100:1, acidic extract) and methanol-10 mM sodium tetraborate in water (90:10, basic extract) offered the best extraction recoveries for 287 of the 310 compounds. At a spiking level of 1 μg mL-1, we obtained satisfactory cleanup recoveries of (93 ± 23)% for the neutral extract and (42 ± 16)% for the combined acidic/basic extracts after solvent exchange. Among the 69 compounds detected in environmental samples, we successfully quantified several pharmaceuticals and polar pesticides.

  12. Handling and storage of conditioned high-level wastes

    International Nuclear Information System (INIS)

    1983-01-01

    This report deals with certain aspects of the management of one of the most important wastes, i.e. the handling and storage of conditioned (immobilized and packaged) high-level waste from the reprocessing of spent nuclear fuel and, although much of the material presented here is based on information concerning high-level waste from reprocessing LWR fuel, the principles, as well as many of the details involved, are applicable to all fuel types. The report provides illustrative background material on the arising and characteristics of high-level wastes and, qualitatively, their requirements for conditioning. The report introduces the principles important in conditioned high-level waste storage and describes the types of equipment and facilities, used or studied, for handling and storage of such waste. Finally, it discusses the safety and economic aspects that are considered in the design and operation of handling and storage facilities

  13. Handling and storage of conditioned high-level wastes

    International Nuclear Information System (INIS)

    Heafield, W.

    1984-01-01

    This paper deals with certain aspects of the management of one of the most important radioactive wastes arising from the nuclear fuel cycle, i.e. the handling and storage of conditioned high-level wastes. The paper is based on an IAEA report of the same title published during 1983 in the Technical Reports Series. The paper provides illustrative background material on the characteristics of high-level wastes and, qualitatively, their requirements for conditioning. The principles important in the storage of high-level wastes are reviewed in conjunction with the radiological and socio-political considerations involved. Four fundamentally different storage concepts are described with reference to published information and the safety aspects of particular storage concepts are discussed. Finally, overall conclusions are presented which confirm the availability of technology for constructing and operating conditioned high-level waste storage facilities for periods of at least several decades. (author)

  14. Extraction of Terraces on the Loess Plateau from High-Resolution DEMs and Imagery Utilizing Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Hanqing Zhao

    2017-05-01

    Full Text Available Abstract: Terraces are typical artificial landforms on the Loess Plateau, with ecological functions in water and soil conservation, agricultural production, and biodiversity. Recording the spatial distribution of terraces is the basis for monitoring their extent and understanding their ecological effects. Current terrace extraction methods rely mainly on high-resolution imagery, but their accuracy is limited because vegetation cover distorts the appearance of terraces in imagery; high-resolution topographic data reflecting the morphology of the true terrace surfaces are therefore needed. Terrace extraction on the Loess Plateau is challenging because of the complex terrain and the diverse vegetation established after the implementation of "vegetation recovery". This study presents an automatic method of extracting terraces based on 1 m resolution digital elevation models (DEMs), with 0.3 m resolution Worldview-3 imagery as auxiliary information, using object-based image analysis (OBIA). A multi-resolution segmentation method was used in which slope, positive and negative terrain index (PN), accumulative curvature slope (AC), and slope of slope (SOS) were selected as input layers for image segmentation through correlation analysis and the Sheffield entropy method. The main DEM-based classification features were chosen from terrain features derived from terrain factors and from texture features obtained by gray-level co-occurrence matrix (GLCM) analysis; these features were then ranked by importance analysis using classification and regression trees (CART). Extraction rules based on the DEMs were generated from the classification features, giving a total classification accuracy of 89.96%. The red and near-infrared bands of the imagery were used to exclude construction land, which is easily confused with small terraces. As a result, the total classification accuracy increased to 94%. The proposed method ensures comprehensive consideration of terrain, texture, shape, and
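    A minimal sketch of the DEM-derivative and CART steps described above for a small in-memory DEM; the toy data, the two derivatives shown (slope and slope of slope), and the use of scikit-learn's DecisionTreeClassifier in place of the OBIA software are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def terrain_features(dem, cell=1.0):
        """Derive simple per-pixel terrain factors from a DEM array."""
        dzdy, dzdx = np.gradient(dem, cell)
        slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))     # slope
        sy, sx = np.gradient(slope, cell)
        sos = np.hypot(sx, sy)                                   # slope of slope (SOS)
        return np.stack([slope.ravel(), sos.ravel()], axis=1)

    # toy example: a stand-in DEM and stand-in terrace labels (1 = terrace)
    rng = np.random.default_rng(0)
    dem = rng.random((50, 50)).cumsum(axis=0)
    labels = (dem > np.median(dem)).astype(int).ravel()

    X = terrain_features(dem)
    cart = DecisionTreeClassifier(max_depth=5).fit(X, labels)

    # feature importances indicate which layers to keep as classification rules
    print(dict(zip(["slope", "sos"], cart.feature_importances_.round(3))))
    ```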

  15. Extraction of drainage networks from large terrain datasets using high throughput computing

    Science.gov (United States)

    Gong, Jianya; Xie, Jibo

    2009-02-01

    Advanced digital photogrammetry and remote sensing technology produce large terrain datasets (LTD). How to process and use these LTD has become a big challenge for GIS users. Extracting drainage networks, which are fundamental to hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the GB size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage network extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units using natural watershed boundaries instead of regular 1-dimensional (strip-wise) or 2-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method for extracting natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. An HTC environment is employed to test the proposed methods with real datasets.
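    A minimal sketch of the decompose/process/merge pattern described above, using Python's multiprocessing pool as a stand-in for a high throughput computing scheduler; the watershed partitioning and the per-unit drainage extraction are only stubbed out, since they depend on the DEM toolchain.

    ```python
    from multiprocessing import Pool

    def split_by_watersheds(dem_path):
        """Stub: partition a large DEM into independent watershed units.
        The paper derives natural watershed boundaries from multi-scale DEMs;
        here we simply pretend three units were produced."""
        return [f"{dem_path}#unit{i}" for i in range(3)]

    def extract_drainage(unit):
        """Stub: extract the drainage network of one watershed unit."""
        return {"unit": unit, "streams": []}

    def merge(results):
        """Watersheds share no flow paths, so merging is essentially a
        concatenation plus boundary bookkeeping."""
        return list(results)

    if __name__ == "__main__":
        units = split_by_watersheds("large_terrain.dem")
        with Pool() as pool:                 # independent computing units in parallel
            networks = pool.map(extract_drainage, units)
        print(merge(networks))
    ```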

  16. An automatic extraction algorithm of three dimensional shape of brain parenchyma from MR images

    International Nuclear Information System (INIS)

    Matozaki, Takeshi

    2000-01-01

    For the simulation of surgical operations, extraction of a selected region from MR images is useful. However, this segmentation requires a high level of skill and experience from technicians. We have developed a unique automatic algorithm for extracting the three-dimensional brain parenchyma from MR head images, named the "three dimensional gray scale clumsy painter method". In this method, a template having the shape of a pseudo-circle, the so-called clumsy painter (CP), moves along the contour of the selected region and extracts the region surrounded by that contour. The method has advantages over morphological filtering and the region growing method. Previously, the method was applied to binary images, but the extraction results varied with the chosen threshold level. We therefore introduced the gray-level information of the images to decide the threshold, relying on the change in image density between the brain parenchyma and the CSF: the threshold level is decided from a map of templates, and the map is changed according to the change in image density. As a result, the over-extraction ratio was improved by 36%, and the under-extraction ratio by 20%. (author)

  17. Pressurized Hot Water Extraction of anthocyanins from red onion: A study on extraction and degradation rates

    Energy Technology Data Exchange (ETDEWEB)

    Petersson, Erik V.; Liu Jiayin; Sjoeberg, Per J.R.; Danielsson, Rolf [Uppsala University, Department of Physical and Analytical Chemistry, P.O. Box 599, SE-751 24, Uppsala (Sweden); Turner, Charlotta, E-mail: Charlotta.Turner@kemi.uu.se [Uppsala University, Department of Physical and Analytical Chemistry, P.O. Box 599, SE-751 24, Uppsala (Sweden)

    2010-03-17

    Pressurized Hot Water Extraction (PHWE) is a quick, efficient and environmentally friendly technique for extractions. However, when using PHWE to extract thermally unstable analytes, extraction and degradation effects occur at the same time, and thereby compete. At first, the extraction effect dominates, but degradation effects soon take over. In this paper, extraction and degradation rates of anthocyanins from red onion were studied with experiments in a static batch reactor at 110 deg. C. A total extraction curve was calculated with data from the actual extraction and degradation curves, showing that more anthocyanins, 21-36% depending on the species, could be extracted if no degradation occurred, but then longer extraction times would be required than those needed to reach the peak level in the apparent extraction curves. The results give information about the different kinetic processes competing during an extraction procedure.
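    A minimal sketch of the competing-kinetics picture described above, assuming first-order extraction and first-order degradation with arbitrary rate constants; this is not the kinetic model fitted in the paper, only an illustration of why the apparent curve peaks below the total extractable amount.

    ```python
    import numpy as np

    def apparent_extraction_curve(t, c0=1.0, k_ext=0.15, k_deg=0.05):
        """Analyte recovered in the extract when first-order extraction from a
        pool of size c0 (rate k_ext) competes with first-order degradation of
        the extracted analyte (rate k_deg).  Solving
            dE/dt = k_ext*c0*exp(-k_ext*t) - k_deg*E
        gives the expression below; with k_deg = 0 it reduces to the total
        curve c0*(1 - exp(-k_ext*t))."""
        return c0 * k_ext / (k_ext - k_deg) * (np.exp(-k_deg * t) - np.exp(-k_ext * t))

    t = np.linspace(0, 60, 7)                     # minutes (illustrative)
    total = 1.0 * (1 - np.exp(-0.15 * t))         # extractable without degradation
    print(np.round(total, 3))
    print(np.round(apparent_extraction_curve(t), 3))   # rises, peaks, then declines
    ```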

  18. INTEC High-Level Waste Studies Universal Solvent Extraction Feasibility Study

    International Nuclear Information System (INIS)

    Banaee, J.; Barnes, C.M.; Battisti, T.; Herrmann, S.; Losinski, S.J.; McBride, S.

    2000-01-01

    This report summarizes a feasibility study of the Universal Solvent Extraction (UNEX) process for treatment and disposal of 4.3 million liters of INEEL sodium-bearing waste (SBW) located at the Idaho Nuclear Technology and Engineering Center. The feasibility study covers two treatment scenarios. The first, the UNEX process, partitions Cs/Sr from the SBW and creates remote-handled LLW and contact-handled TRU waste forms. Phase one of this study, covered in the 30% review documents, dealt with defining the processes and the major unit operations. The second phase of the project, contained in the 60% review, expanded on the application of the UNEX processes and included facility requirements and definitions. Two facility options were investigated for the UNEX process, resulting in a 2 x 2 matrix of process/facility scenarios: Option A, UNEX at a Greenfield Facility; Option B, Modified UNEX at a Greenfield Facility; Option C, UNEX at NWCF. This document covers life-cycle costs for all options presented, along with results and conclusions determined from the study.

  19. Influences of groundwater extraction on flow dynamics and arsenic levels in the western Hetao Basin, Inner Mongolia, China

    Science.gov (United States)

    Zhang, Zhuo; Guo, Huaming; Zhao, Weiguang; Liu, Shuai; Cao, Yongsheng; Jia, Yongfeng

    2018-04-01

    Data on spatiotemporal variations in groundwater levels are crucial for understanding arsenic (As) behavior and dynamics in groundwater systems. Little is known about the influences of groundwater extraction on the transport and mobilization of As in the Hetao Basin, Inner Mongolia (China), so groundwater levels were recorded in five monitoring wells from 2011 to 2016 and in 57 irrigation wells and two multilevel wells in 2016. Results showed that groundwater level in the groundwater irrigation area had two troughs each year, induced by extensive groundwater extraction, while groundwater levels in the river-diverted (Yellow River) water irrigation area had two peaks each year, resulting from surface-water irrigation. From 2011 to 2016, groundwater levels in the groundwater irrigation area presented a decreasing trend due to the overextraction. Groundwater samples were taken for geochemical analysis each year in July from 2011 to 2016. Increasing trends were observed in groundwater total dissolved solids (TDS) and As. Owing to the reverse groundwater flow direction, the Shahai Lake acts as a new groundwater recharge source. Lake water had flushed the near-surface sediments, which contain abundant soluble components, and increased groundwater salinity. In addition, groundwater extraction induced strong downward hydraulic gradients, which led to leakage recharge from shallow high-TDS groundwater to the deep semiconfined aquifer. The most plausible explanation for similar variations among As, Fe(II) and total organic carbon (TOC) concentrations is the expected dissimilatory reduction of Fe(III) oxyhydroxides.

  20. EFFECT OF SAPPAN WOOD (Caesalpinia sappan L.) EXTRACT ON BLOOD GLUCOSE LEVEL IN WHITE RATS

    Directory of Open Access Journals (Sweden)

    Saefudin Saefudin

    2016-05-01

    Full Text Available Sappan wood or kayu secang (Caesalpinia sappan L.) has been reported to have medicinal properties, for example as a natural antioxidant, for relieving vomiting of blood, and as an ingredient in malaria drugs. This research was conducted to study the influence of an ethanol extract of sappan wood on the blood glucose level of white rats. The blood glucose study in rats was carried out using a glucose tolerance method, measured with a Refloluxs (Accutrend GC) meter, with chlorpropamide 50 mg/200 g BW (body weight) as the positive control. The ethanol extract was given orally at doses of 10, 20, 30, 40 and 50 mg/200 g BW, and blood glucose was observed every hour, from one hour before to 7 hours after the extract was administered. The results showed that the administered doses of sappan wood ethanol extract had a marked effect on the blood glucose level of white rats, reducing the blood glucose level compared with the negative and positive controls. The 30 mg/200 g BW dose gave an effect similar to the positive control, while the 50 mg/200 g BW dose gave a lower blood glucose level (93 mg/dl) than the positive control.

  1. The information protection level assessment system implementation

    Science.gov (United States)

    Trapeznikov, E. V.

    2018-04-01

    Currently, the threat of various attacks increases significantly as automated systems become more widespread. On the basis of the analysis conducted, the objective of establishing an information protection level assessment system was identified. The paper presents a software implementation of information protection level assessment in an information system, written in the programming language C#. In the conclusions, the software features are identified and experimental results are presented.

  2. Extracting information from two-dimensional electrophoresis gels by partial least squares regression

    DEFF Research Database (Denmark)

    Jessen, Flemming; Lametsch, R.; Bendixen, E.

    2002-01-01

    Two-dimensional gel electrophoresis (2-DE) produces large amounts of data and extraction of relevant information from these data demands a cautious and time consuming process of spot pattern matching between gels. The classical approach of data analysis is to detect protein markers that appear or disappear depending on the experimental conditions. Such biomarkers are found by comparing the relative volumes of individual spots in the individual gels. Multivariate statistical analysis and modelling of 2-DE data for comparison and classification is an alternative approach utilising the combination of all proteins/spots in the gels. In the present study it is demonstrated how information can be extracted by multivariate data analysis. The strategy is based on partial least squares regression followed by variable selection to find proteins that individually or in combination with other proteins vary ...
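    A minimal sketch of the strategy described above using scikit-learn's PLSRegression on simulated spot volumes; the data, the response variable, and the use of first-component weights for variable selection are illustrative assumptions, not the published workflow.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 300))     # 60 gels x 300 spot volumes (toy data)
    y = X[:, 5] - 0.8 * X[:, 42] + rng.normal(scale=0.1, size=60)   # response

    pls = PLSRegression(n_components=2).fit(X, y)

    # crude variable selection: rank spots by the magnitude of their weights
    # on the first PLS component and keep the strongest candidates
    w = np.abs(pls.x_weights_[:, 0])
    selected = np.argsort(w)[::-1][:10]
    print("candidate spots:", selected)   # spots 5 and 42 should rank near the top
    ```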

  3. Improvement of Control Infrastructure and High Level Application for KOMAC LINAC

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young-Gi; Kim, Jae-Ha; Ahn, Tae-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub [Korea Atomic Energy Research Institute, Gyeongju (Korea, Republic of)

    2015-10-15

    The Korea multi-purpose accelerator complex (KOMAC) has two beam extraction points, at 20 and 100 MeV, for proton beam utilization. About 70 control systems control the KOMAC subsystems, such as the ion source, the radio frequency systems, the diagnostic devices, the magnet power supplies, and the cooling system. The infrastructure, which includes the network, local controllers, and the control system environment, had to be changed to handle a growing number of process variables without failure. The Experimental Physics and Industrial Control System (EPICS) based high-level control environment, which includes alarm handling and data archiving, was updated to support the improved infrastructure of the KOMAC control system. In this paper, we describe the improvement of the infrastructure for the KOMAC control system and the EPICS-based high-level applications. We improved the control network environment and the EPICS-based high-level applications to enhance the KOMAC control system.

  4. 76 FR 35137 - Vulnerability and Threat Information for Facilities Storing Spent Nuclear Fuel and High-Level...

    Science.gov (United States)

    2011-06-16

    ... High-Level Radioactive Waste AGENCY: U.S. Nuclear Regulatory Commission. ACTION: Public meeting... Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste,'' and 73... Spent Nuclear Fuel (SNF) and High-Level Radioactive Waste (HLW) storage facilities. The draft regulatory...

  5. The future of event-level information repositories, indexing, and selection in ATLAS

    International Nuclear Information System (INIS)

    Barberis, D; Cranshaw, J; Malon, D; Gemmeren, P Van; Zhang, Q; Dimitrov, G; Nairz, A; Sorokoletov, R; Doherty, T; Quilty, D; Gallas, E J; Hrivnac, J; Nowak, M

    2014-01-01

    ATLAS maintains a rich corpus of event-by-event information that provides a global view of the billions of events the collaboration has measured or simulated, along with sufficient auxiliary information to navigate to and retrieve data for any event at any production processing stage. This unique resource has been employed for a range of purposes, from monitoring, statistics, anomaly detection, and integrity checking, to event picking, subset selection, and sample extraction. Recent years of data-taking provide a foundation for assessment of how this resource has and has not been used in practice, of the uses for which it should be optimized, of how it should be deployed and provisioned for scalability to future data volumes, and of the areas in which enhancements to functionality would be most valuable. This paper describes how ATLAS event-level information repositories and selection infrastructure are evolving in light of this experience, and in view of their expected roles both in wide-area event delivery services and in an evolving ATLAS analysis model in which the importance of efficient selective access to data can only grow.

  6. High-level radioactive wastes. Supplement 1

    Energy Technology Data Exchange (ETDEWEB)

    McLaren, L.H. (ed.)

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations.

  7. High maysin corn silk extract reduces body weight and fat deposition in C57BL/6J mice fed high-fat diets.

    Science.gov (United States)

    Lee, Eun Young; Kim, Sun Lim; Kang, Hyeon Jung; Kim, Myung Hwan; Ha, Ae Wha; Kim, Woo Kyoung

    2016-12-01

    The study was performed to investigate the effects and mechanisms of action of high maysin corn silk extract on body weight and fat deposition in experimental animals. A total of 30 male C57BL/6J mice, 4 weeks old, were purchased and divided into three groups by weight using a randomized block design. The normal-fat (NF) group received 7% fat (diet weight basis), the high-fat (HF) group received 25% fat and 0.5% cholesterol, and the high-fat corn silk (HFCS) group received the high-fat diet plus high maysin corn silk extract at 100 mg/kg body weight by daily oral administration. Body weight and body fat were measured, and mRNA expression levels of proteins involved in adipocyte differentiation, fat accumulation, fat synthesis, lipolysis, and fat oxidation in adipose tissue and the liver were measured. After 8 weeks on the experimental diets, body weight was significantly lower in the HFCS group than in the HF group (P < 0.05). High maysin corn silk extract inhibits expression of genes involved in adipocyte differentiation, fat accumulation, and fat synthesis, and promotes expression of genes involved in lipolysis and fat oxidation, thereby inhibiting body fat accumulation and body weight gain in experimental animals.

  8. A high throughput DNA extraction method with high yield and quality

    Directory of Open Access Journals (Sweden)

    Xin Zhanguo

    2012-07-01

    Full Text Available Abstract. Background: Preparation of large quantities of high-quality genomic DNA from a large number of plant samples is a major bottleneck for most genetic and genomic analyses, such as genetic mapping, TILLING (Targeting Induced Local Lesions IN Genomes), and next-generation sequencing directly from sheared genomic DNA. A variety of DNA preparation methods and commercial kits are available. However, they are either low throughput, low yield, or costly. Here, we describe a method for high throughput genomic DNA isolation from sorghum [Sorghum bicolor (L.) Moench] leaves and dry seeds with high yield, high quality, and affordable cost. Results: We developed a high throughput DNA isolation method by combining a high-yield CTAB extraction with an improved cleanup procedure based on the MagAttract kit. The method yielded large quantities of high-quality DNA from both lyophilized sorghum leaves and dry seeds. The DNA yield was improved by nearly 30-fold with 4 times less consumption of MagAttract beads. The method can also be used on other plant species, including cotton leaves and pine needles. Conclusion: A high throughput system for DNA extraction from sorghum leaves and seeds was developed and validated. The main advantages of the method are low cost, high yield, high quality, and high throughput. One person can process two 96-well plates in a working day at a cost of $0.10 per sample for magnetic beads plus the other consumables that other methods also need.

  9. Information Extraction and Interpretation Analysis of Mineral Potential Targets Based on ETM+ Data and GIS technology: A Case Study of Copper and Gold Mineralization in Burma

    International Nuclear Information System (INIS)

    Wenhui, Du; Yongqing, Chen; Nana, Guo; Yinglong, Hao; Pengfei, Zhao; Gongwen, Wang

    2014-01-01

    Extraction of mineralization-alteration and structural information plays an important role in mineral resource prospecting and assessment using remote sensing data and Geographical Information System (GIS) technology. Taking copper and gold mines in Burma as an example, the authors adopt band ratios, threshold segmentation and principal component analysis (PCA) to extract hydroxyl alteration information from ETM+ remote sensing images. A digital elevation model (DEM, 30 m spatial resolution) and ETM+ data were used to extract linear and circular faults associated with copper and gold mineralization. Combining geological data with the above information, the weights-of-evidence method and the C-A fractal model were used to integrate the evidence and identify favourable ore-forming zones in the area. The results show that the high-grade potential targets coincide with the known copper and gold deposits, and the integrated information can support decision-making in the next stage of mineral resource exploration.
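    A minimal sketch of the band-ratio and PCA steps described above for an ETM+ scene held as NumPy arrays; the choice of bands 1, 4, 5 and 7, the band 5/7 hydroxyl ratio, the Crosta-style component selection, and the two-standard-deviation threshold are common conventions stated here as assumptions, not the paper's exact settings.

    ```python
    import numpy as np

    def hydroxyl_alteration_mask(bands, n_sigma=2.0):
        """Flag likely hydroxyl-alteration pixels in an ETM+ scene.

        bands : dict of 2-D reflectance arrays keyed by ETM+ band number
        """
        ratio = bands[5] / (bands[7] + 1e-6)             # hydroxyl ratio image

        stack = np.stack([bands[b].ravel() for b in (1, 4, 5, 7)], axis=1)
        stack = (stack - stack.mean(axis=0)) / stack.std(axis=0)
        vals, vecs = np.linalg.eigh(np.cov(stack, rowvar=False))

        # pick the component with the strongest band 5 vs band 7 contrast
        contrast = np.abs(vecs[2, :] - vecs[3, :])       # rows correspond to bands 1, 4, 5, 7
        pc = (stack @ vecs[:, int(np.argmax(contrast))]).reshape(ratio.shape)

        def above(img):
            return img > img.mean() + n_sigma * img.std()

        return above(ratio) & above(np.abs(pc))
    ```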

  10. Timing of High-level Waste Disposal

    International Nuclear Information System (INIS)

    2008-01-01

    This study identifies key factors influencing the timing of high-level waste (HLW) disposal and examines how social acceptability, technical soundness, environmental responsibility and economic feasibility impact on national strategies for HLW management and disposal. Based on case study analyses, it also presents the strategic approaches adopted in a number of national policies to address public concerns and civil society requirements regarding long-term stewardship of high-level radioactive waste. The findings and conclusions of the study confirm the importance of informing all stakeholders and involving them in the decision-making process in order to implement HLW disposal strategies successfully. This study will be of considerable interest to nuclear energy policy makers and analysts as well as to experts in the area of radioactive waste management and disposal. (author)

  11. Outline of facility for studying high level radioactive materials (CPF) and study programmes

    International Nuclear Information System (INIS)

    Sakamoto, Motoi

    1983-01-01

    The Chemical Processing Facility (CPF) for studying high-level radioactive materials at the Tokai Works of the Power Reactor and Nuclear Fuel Development Corp. is a facility for fundamental studies, centered on hot cells, needed for the development of fuel recycle techniques for fast breeder reactors, an important part of the nuclear fuel cycle, and of techniques for processing and disposing of high-level radioactive liquid wastes. Operation of the facility started in 1982, for both system A (tests of fuel recycle for fast breeder reactors) and system B (tests of vitrification of high-level liquid wastes). This report describes the outline of the facility, the contents of the tests, and how the results will be applied. For the fuel recycle tests, hot testing of spent fuel pins from the JOYO MK-1 core has started, and the uranium and plutonium extraction test is now underway. The scheduled tests cover fuel solubility, confirmation of residual properties in fuel melting, confirmation of extraction conditions, electrolytic reduction of plutonium, off-gas behaviour, and material reliability. For the vitrification tests on high-level liquid wastes, fundamental testing of solidification techniques for actual high-level wastes from the Tokai reprocessing plant has started, and the following tests are planned: assessment of the properties of actual liquid wastes, denitration and concentration tests, vitrification tests, off-gas treatment tests, evaluation of the solidified wastes, and storage of the solidified wastes. These test results are planned to feed into the safety review and the demonstration operation of a vitrification pilot plant. (Wakatsuki, Y.)

  12. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    Science.gov (United States)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for scenic appreciation, scientific research and public education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional interpretation methods, which made some geological heritage difficult to interpret and led to the omission of information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing imagery to extract information on geological heritage. The Skyline software system is applied to fuse 0.36 m aerial images with a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, covering geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that features interpreted with this method are highly recognizable, making the interpretation more accurate and comprehensive.

  13. Automated Extraction of Substance Use Information from Clinical Texts.

    Science.gov (United States)

    Wang, Yan; Chen, Elizabeth S; Pakhomov, Serguei; Arsoniadis, Elliot; Carter, Elizabeth W; Lindemann, Elizabeth; Sarkar, Indra Neil; Melton, Genevieve B

    2015-01-01

    Within clinical discourse, social history (SH) includes important information about substance use (alcohol, drug, and nicotine use) as key risk factors for disease, disability, and mortality. In this study, we developed and evaluated a natural language processing (NLP) system for automated detection of substance use statements and extraction of substance use attributes (e.g., temporal and status) based on Stanford Typed Dependencies. The developed NLP system leveraged linguistic resources and domain knowledge from a multi-site social history study, Propbank and the MiPACQ corpus. The system attained F-scores of 89.8, 84.6 and 89.4 respectively for alcohol, drug, and nicotine use statement detection, as well as average F-scores of 82.1, 90.3, 80.8, 88.7, 96.6, and 74.5 respectively for extraction of attributes. Our results suggest that NLP systems can achieve good performance when augmented with linguistic resources and domain knowledge when applied to a wide breadth of substance use free text clinical notes.
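    A minimal sketch of dependency-based attribute extraction in the spirit of the system above, using spaCy's dependency parse as a stand-in for Stanford Typed Dependencies; the keyword lists, the negation heuristic, and the example sentence are illustrative assumptions, not the published rules.

    ```python
    import spacy                      # requires: python -m spacy download en_core_web_sm

    nlp = spacy.load("en_core_web_sm")

    SUBSTANCES = {"alcohol", "tobacco", "nicotine", "cigarette", "cocaine", "marijuana"}
    NEGATIONS = {"no", "not", "never", "deny", "quit"}

    def substance_use_mentions(text):
        """Detect substance mentions and a crude status from the dependency parse."""
        doc = nlp(text)
        mentions = []
        for tok in doc:
            if tok.lemma_.lower() in SUBSTANCES:
                head = tok.head          # usually the governing verb or noun
                cues = {c.lemma_.lower() for c in head.children} | {head.lemma_.lower()}
                status = "denied/none" if cues & NEGATIONS else "current/unknown"
                mentions.append((tok.text, head.lemma_, status))
        return mentions

    # alcohol should come out negated and cigarettes as current (parser permitting)
    print(substance_use_mentions("Patient does not drink alcohol but smokes cigarettes daily."))
    ```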

  14. ALICE High Level Trigger

    CERN Multimedia

    Alt, T

    2013-01-01

    The ALICE High Level Trigger (HLT) is a computing farm designed and build for the real-time, online processing of the raw data produced by the ALICE detectors. Events are fully reconstructed from the raw data, analyzed and compressed. The analysis summary together with the compressed data and a trigger decision is sent to the DAQ. In addition the reconstruction of the events allows for on-line monitoring of physical observables and this information is provided to the Data Quality Monitor (DQM). The HLT can process event rates of up to 2 kHz for proton-proton and 200 Hz for Pb-Pb central collisions.

  15. Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) final report

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, Philip W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shead, Timothy M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dunlavy, Daniel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-09-01

    This SAND report summarizes the activities and outcomes of the Network and Ensemble Enabled Entity Extraction in Informal Text (NEEEEIT) LDRD project, which addressed improving the accuracy of conditional random fields for named entity recognition through the use of ensemble methods.
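    A minimal sketch of one simple ensemble scheme for token-level named entity tags, majority voting across taggers; the toy tag sequences and the first-seen tie-break are assumptions, and the project's actual ensemble methods may differ.

    ```python
    from collections import Counter

    def majority_vote(predictions):
        """Combine per-token tag sequences from several taggers.

        predictions : list of tag sequences, one per tagger, all the same length
        Returns one tag per token, chosen by majority vote; ties go to the tag
        encountered first among the taggers (Counter preserves that order).
        """
        combined = []
        for token_tags in zip(*predictions):
            combined.append(Counter(token_tags).most_common(1)[0][0])
        return combined

    # three hypothetical CRF outputs for the tokens "Sandia National Laboratories ."
    preds = [
        ["B-ORG", "I-ORG", "I-ORG", "O"],
        ["B-ORG", "I-ORG", "O",     "O"],
        ["B-ORG", "I-ORG", "I-ORG", "O"],
    ]
    print(majority_vote(preds))   # ['B-ORG', 'I-ORG', 'I-ORG', 'O']
    ```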

  16. Synthesis of High-Frequency Ground Motion Using Information Extracted from Low-Frequency Ground Motion

    Science.gov (United States)

    Iwaki, A.; Fujiwara, H.

    2012-12-01

    Broadband ground motion computations for scenario earthquakes are often based on hybrid methods that combine a deterministic approach in the lower frequency band with a stochastic approach in the higher frequency band. Typical computation methods for low-frequency and high-frequency (LF and HF, respectively) ground motions are numerical simulations, such as finite-difference and finite-element methods based on a three-dimensional velocity structure model, and the stochastic Green's function method, respectively. In such hybrid methods, the LF and HF wave fields are generated by two different methods that are completely independent of each other and are combined at the matching frequency. However, LF and HF wave fields are essentially not independent as long as they are from the same event. In this study, we focus on the relation among acceleration envelopes in different frequency bands and attempt to synthesize HF ground motion using information extracted from LF ground motion, aiming to propose a new method for broadband strong motion prediction. Our study area is the Kanto area, Japan. We use the K-NET and KiK-net surface acceleration data and compute RMS envelopes in five frequency bands: 0.5-1.0 Hz, 1.0-2.0 Hz, 2.0-4.0 Hz, 4.0-8.0 Hz, and 8.0-16.0 Hz. Taking the ratio of the envelopes of adjacent bands, we find that the envelope ratios have stable shapes at each site. The empirical envelope-ratio characteristics are combined with the low-frequency envelope of the target earthquake to synthesize the HF ground motion. We have applied the method to M5-class earthquakes and an M7 target earthquake that occurred in the vicinity of the Kanto area, and successfully reproduced the observed HF ground motion of the target earthquake. The method can be applied to broadband ground motion simulation for a scenario earthquake by combining numerically computed low-frequency (~1 Hz) ground motion with the empirical envelope-ratio characteristics to generate broadband ground motion.
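    A minimal sketch of the envelope-ratio idea described above: band-pass the LF record, take its RMS envelope, scale it by an empirical band-to-band ratio, and use it to modulate band-limited noise. The filter design, window length, and constant ratio value are illustrative assumptions; the paper derives empirical, site-dependent ratio shapes.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def rms_envelope(x, fs, win=1.0):
        """RMS envelope computed with a moving window of `win` seconds."""
        n = max(1, int(win * fs))
        return np.sqrt(np.convolve(x**2, np.ones(n) / n, mode="same"))

    def synthesize_hf(lf_motion, fs, ratio=1.5, lf_band=(0.5, 1.0), hf_band=(4.0, 8.0)):
        """Build an HF trace whose envelope follows the LF envelope scaled by an
        empirical ratio (a constant here, for brevity)."""
        b, a = butter(4, lf_band, btype="bandpass", fs=fs)
        lf = filtfilt(b, a, lf_motion)
        hf_env = ratio * rms_envelope(lf, fs)

        noise = np.random.default_rng(0).standard_normal(len(lf_motion))
        b, a = butter(4, hf_band, btype="bandpass", fs=fs)
        carrier = filtfilt(b, a, noise)
        carrier /= rms_envelope(carrier, fs) + 1e-12    # flatten the carrier's own envelope
        return hf_env * carrier
    ```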

  17. Optimization of flavanones extraction by modulating differential solvent densities and centrifuge temperatures.

    Science.gov (United States)

    Chebrolu, Kranthi K; Jayaprakasha, G K; Jifon, J; Patil, Bhimanagouda S

    2011-07-15

    Understanding the factors influencing flavanone extraction is critical knowledge for sample preparation. The present study focused on extraction parameters such as solvent, heat, centrifugal speed, centrifuge temperature, sample-to-solvent ratio, extraction cycles, sonication time, microwave time and their interactions during sample preparation. Flavanones were analyzed by high performance liquid chromatography (HPLC) and later identified by liquid chromatography-mass spectrometry (LC-MS). The five flavanones were eluted with a binary mobile phase of 0.03% phosphoric acid and acetonitrile in 20 min, detected at 280 nm, and later identified by mass spectral analysis. Dimethylsulfoxide (DMSO) and dimethylformamide (DMF) gave optimum extraction levels of narirutin, naringin, neohesperidin, didymin and poncirin compared to methanol (MeOH), ethanol (EtOH) and acetonitrile (ACN). Centrifuge temperature had a significant effect on flavanone distribution in the extracts. The DMSO and DMF extracts had a homogeneous distribution of flavanones after centrifugation compared to MeOH, EtOH and ACN; furthermore, ACN showed clear phase separation due to differential densities in the extracts after centrifugation. The number of extraction cycles significantly increased flavanone levels during extraction, and modulating the sample-to-solvent ratio increased the naringin quantity in the extracts. The current research provides critical information on the role of centrifuge temperature, extraction solvent and their interactions in flavanone distribution in extracts. Published by Elsevier B.V.

  18. Effects of different nitrogen levels and plant density on flower, essential oils and extract production and nitrogen use efficiency of Marigold (Calendula officinalis).

    Directory of Open Access Journals (Sweden)

    Ali Akbar Ameri

    2009-06-01

    Full Text Available Efficient use of nitrogen for medicinal plant production might increase flower dry matter, essential oil and extract yields and reduce production costs. A two-year (2005 and 2006) field study was conducted in the Torogh region (36.10° N, 59.33° E, 1300 m altitude) of Mashhad, Iran, to observe the effects of different nitrogen levels and plant densities on flower dry matter, essential oil and extract production and nitrogen use efficiency (NUE) in multi-harvested marigold (Calendula officinalis). The nitrogen fertilizer (N) levels were 0, 50, 100 and 150 kg ha-1 and the density levels were 20, 40, 60 and 80 plants m-2. The combined analysis revealed significant effects of N and density levels on flower dry matter, essential oil and extract production and NUE of marigold. The highest dry flower production was obtained with 150 kg ha-1 N and a plant population of 80 plants m-2 (102.86 g m-2). The higher flower dry matter production led to more essential oil and extract production at high nitrogen and density levels. Agronomic N-use efficiency (kg flower dry matter yield per kg N applied), physiological efficiency (kg flower dry matter yield per kg N absorbed) and fertilizer N-recovery efficiency (kg N absorbed per kg N applied, expressed as %) for marigold across treatments ranged from 6.8 to 14.9, 12.3 to 33.6 and 55.5 to 77.6, respectively, and all were greater for N application at 50 compared with 150 kg N ha-1, and under high density compared with low density. The amounts of essential oil and extract per 100 g flower dry matter decreased during the flower harvesting period; the highest amounts of essential oil and extract were obtained early in the flowering season. The essential oil and extract ranged from 0.22 to 0.12 ml per 100 g flower dry matter and 2.74 to 2.13 g per 100 g flower dry matter, respectively.

  19. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2016-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  20. System and method for extracting physiological information from remotely detected electromagnetic radiation

    NARCIS (Netherlands)

    2015-01-01

    The present invention relates to a device and a method for extracting physiological information indicative of at least one health symptom from remotely detected electromagnetic radiation. The device comprises an interface (20) for receiving a data stream comprising remotely detected image data

  1. Removal of actinide elements from high level radioactive waste by trialkylphosphine oxide (TRPO)

    International Nuclear Information System (INIS)

    Song Chongli; Yang Dazhu; He Longhai; Xu Jingming; Zhu Yongjun

    1992-03-01

    The modified TRPO process for removing actinide elements from a synthetic solution, representative of power reactor nuclear fuel reprocessing waste, was verified by cascade experiments. The neptunium valence was adjusted in the process to improve neptunium removal efficiency. At a feed solution HNO3 concentration of 1 mol/L and after a few stages of extraction with 30% TRPO in kerosene, over 99.9% of Am, Pu, Np and U could be removed from the HAW (high level radioactive waste) solution. The stripping of the actinides loaded in TRPO is accomplished by high-concentration nitric acid, oxalic acid and sodium carbonate instead of the amino carboxylic complexing agents used in the previous process. The stripped actinides were divided into three groups, Am + RE, Np + Pu, and U, and the cross contamination between them is small. The behaviours of fission product (F.P.) elements fall into three types: not extracted, slightly extracted and extracted. The extracted elements are the rare earths and Pd, Zr and Mo, which are co-extracted with the actinides. The separation factor between the actinides and the other two types of F.P. elements will increase if more scrubbing sections are added to the process. The relative concentration profiles of the actinide elements and Tc in the various stages, as well as the distribution of actinides and F.P. elements in the process stream solutions, are also presented

  2. Videomicroscopic extraction of specific information on cell proliferation and migration in vitro

    International Nuclear Information System (INIS)

    Debeir, Olivier; Megalizzi, Veronique; Warzee, Nadine; Kiss, Robert; Decaestecker, Christine

    2008-01-01

    In vitro cell imaging is a useful exploratory tool for cell behavior monitoring with a wide range of applications in cell biology and pharmacology. Combined with appropriate image analysis techniques, this approach has been shown to provide useful information on the detection and dynamic analysis of cell events. In this context, numerous efforts have been focused on cell migration analysis. In contrast, the cell division process has been the subject of fewer investigations. The present work focuses on this latter aspect and shows that, in complement to cell migration data, interesting information related to cell division can be extracted from phase-contrast time-lapse image series, in particular cell division duration, which is not provided by standard cell assays using endpoint analyses. We illustrate our approach by analyzing the effects induced by two sigma-1 receptor ligands (haloperidol and 4-IBP) on the behavior of two glioma cell lines using two in vitro cell models, i.e., the low-density individual cell model and the high-density scratch wound model. This illustration also shows that the data provided by our approach are suggestive as to the mechanism of action of compounds, and are thus capable of informing the appropriate selection of further time-consuming and more expensive biological evaluations required to elucidate a mechanism

  3. The function of androgen/androgen receptor and insulin growth factor‑1/insulin growth factor‑1 receptor on the effects of Tribulus terrestris extracts in rats undergoing high intensity exercise.

    Science.gov (United States)

    Wu, Yin; Yang, Hongfang; Wang, Xiaohui

    2017-09-01

    Our previous study demonstrated that treatment with Tribulus terrestris (TT) extracts (120 mg/kg) promoted the muscle weight gain and performance of rats undergoing high intensity exercise. The present study was designed to explore the mechanisms underlying the effect of treatment with TT extracts and the involvement of androgens, the androgen receptor (AR), insulin growth factor‑1 (IGF‑1) and the IGF‑1 receptor (IGF‑1R). A total of 32 Sprague‑Dawley rats were randomly divided into groups as follows: Control; TT (treated with TT extracts); E (high intensity exercise); and E+TT (high intensity exercise plus TT treatment). The rats of the E and E+TT groups underwent high intensity exercise with a progressively increasing load for 5 weeks, and TT extracts were intragastrically administered to the TT and E+TT rats 30 min prior to training. TT extract composition was analyzed using ultra‑high performance liquid chromatography‑quadrupole‑time of flight mass spectrometry. Testosterone and IGF‑1 plasma levels and AR, IGF‑1R and myosin heavy chain (MHC) protein levels in muscles were determined by ELISA and western blotting, respectively. The saponins tigogenin and diosgenin comprised ~71.35% of the total peak area. Compared with the E group, TT extracts increased the testosterone and IGF‑1 plasma levels, and AR, IGF‑1R and MHC protein levels in the gastrocnemius of rats undergoing high intensity exercise, accompanied by increased body weight and gastrocnemius weight. In conclusion, the effect of TT extracts on the performance of high intensity exercise rats may be attributed to increased levels of circulating testosterone and IGF‑1 and increased AR and IGF‑1R protein expression levels in the gastrocnemius, resulting in increased muscle weight and increased MHC in the gastrocnemius. The present study provided preliminary evidence supporting the use of TT extracts as a dietary supplement for the promotion of skeletal muscle mass increase and the

  4. Improving extraction efficiency of the third integer resonant extraction using higher order multipoles

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K. A. [Brookhaven National Lab. (BNL), Upton, NY (United States); Schoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tomizawa, M. [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan)

    2017-03-09

    The new accelerator complex at J-PARC will operate with both high energy and very high intensity proton beams. With a design slow extraction efficiency of greater than 99% this facility will still be depositing significant beam power onto accelerator components [2]. To achieve even higher efficiencies requires some new ideas. The design of the extraction system and the accelerator lattice structure leaves little room for improvement using conventional techniques. In this report we will present one method for improving the slow extraction efficiency at J-PARC by adding duodecapoles or octupoles to the slow extraction system. We will review the theory of resonant extraction, describe simulation methods, and present the results of detailed simulations. From our investigations we find that we can improve extraction efficiency and thereby reduce the level of residual activation in the accelerator components and surrounding shielding.
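
    To make the idea of a multipole-assisted third-integer extraction concrete, the following toy tracking sketch (not the authors' J-PARC simulation; the tune, kick strengths and septum position are invented for illustration) iterates a linear one-turn rotation near the one-third resonance plus thin sextupole and octupole kicks, and counts particles that jump past an ideal septum.

        import numpy as np

        def track(n_turns=3000, n_part=2000, q=0.3337, k2=1.2, k3=0.0, x_septum=0.06, seed=0):
            """Toy third-integer resonance tracking in normalized phase space.
            q: betatron tune near 1/3, k2: thin sextupole strength (drives the resonance),
            k3: thin octupole strength (amplitude-dependent detuning), x_septum: septum position."""
            rng = np.random.default_rng(seed)
            x = rng.normal(0.0, 0.01, n_part)
            p = rng.normal(0.0, 0.01, n_part)
            alive = np.ones(n_part, dtype=bool)
            extracted_turn = np.full(n_part, -1)
            c, s = np.cos(2 * np.pi * q), np.sin(2 * np.pi * q)
            for turn in range(n_turns):
                x, p = c * x + s * p, -s * x + c * p      # linear one-turn rotation
                p = p + k2 * x**2 + k3 * x**3             # thin nonlinear kick
                hit = alive & (x > x_septum)              # crossing the septum = extracted
                extracted_turn[hit] = turn
                alive &= ~hit
                x[~alive] = 0.0                           # park extracted particles to avoid overflow
                p[~alive] = 0.0
            return extracted_turn

        print("extracted fraction without octupole:", np.mean(track(k3=0.0) >= 0))
        print("extracted fraction with octupole   :", np.mean(track(k3=8.0) >= 0))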

  5. Extracting of implicit information in English advertising texts with phonetic and lexical-morphological means

    Directory of Open Access Journals (Sweden)

    Traikovskaya Natalya Petrovna

    2015-12-01

    Full Text Available The article deals with the phonetic and lexical-morphological language means participating in the process of extracting implicit information in English-language advertising texts aimed at men and women. The functioning of phonetic means of the English language is not, by itself, the basis for the implication of information in advertising texts. Lexical and morphological means act as markers of relevant information and as activators of implicit information in advertising texts.

  6. Intake of Moringa oleifera Leaf Extract Decreases IL-1 and TNF-α Levels in Dyslipidemic Wistar Rat Model

    Directory of Open Access Journals (Sweden)

    Sri Wahyuni

    2017-05-01

    Full Text Available Changes in consumption behavior towards instant food cause various health problems, such as obesity, dyslipidemia, and atherosclerosis. A study was conducted to investigate Moringa oleifera extract as an anti-inflammation product that decreases the levels of the biochemical markers IL-1 and TNF-α. The experiment used a randomized pre- and posttest control-group design, employing 40 Wistar rats separated into five groups: a control group with 0% M. oleifera leaf extract (P0), treatment group 1 with 10% M. oleifera leaf extract (P1), treatment group 2 with 15% M. oleifera leaf extract (P2), treatment group 3 with 20% M. oleifera leaf extract (P3), and treatment group 4 with 25% M. oleifera leaf extract (P4). This research observed that intake of 20% M. oleifera leaf extract resulted in the largest significant decrease in IL-1 level, of 15.42% (134.64 ± 1.98 to 113.87 ± 4.30 pg/mL), and a decrease of 45.63% in TNF-α level (28.62 ± 1.25 to 15.56 ± 7.20 pg/mL). Therefore, it can be concluded that intake of M. oleifera leaf extract by Wistar rats has anti-inflammatory effects on chronic dyslipidemia through decreases in IL-1 and TNF-α levels and histopathology profile. Further research is required to determine whether the application of M. oleifera leaf extract (daun kelor) in humans will have similar anti-inflammation effects.

  7. Effect of Vaccinium bracteatum Thunb. leaves extract on blood glucose and plasma lipid levels in streptozotocin-induced diabetic mice.

    Science.gov (United States)

    Wang, Li; Zhang, Xue Tong; Zhang, Hai Yan; Yao, Hui Yuan; Zhang, Hui

    2010-08-09

    To investigate the hypoglycemic effects of Vaccinium bracteatum Thunb. leaves (VBTL) extract in streptozotocin-induced diabetic mice. After administration of VBTL extract for 4 weeks, the body weight, organ weight, blood glucose (BG), insulin and plasma lipid levels of streptozotocin-induced diabetic mice were measured. Body weights of diabetic mice treated with VBTL extract were partly recovered. The BG levels of AEG (diabetic mice treated with VBTL aqueous extract) were reduced to 91.52 and 85.82% at week 2 and week 4, respectively (P0.05). The insulin levels of AEG and EEG (diabetic mice treated with VBTL ethanolic extract) were obviously higher (P<0.05) than those of MC (diabetic mice in the model control group). Compared with MC, AEG and EEG had significantly lower (P<0.05) TC or TG levels and similar HDL-cholesterol or LDL-cholesterol levels. In comparison with non-diabetic control mice, AEG had similar plasma lipid levels except for a higher LDL-cholesterol level, while EEG had higher TC, TG and LDL-cholesterol levels and lower HDL-cholesterol levels. Both the aqueous and the ethanolic extract of VBTL possess a potential hypoglycemic effect in streptozotocin-induced diabetic mice. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  8. High-level intuitive features (HLIFs) for intuitive skin lesion description.

    Science.gov (United States)

    Amelard, Robert; Glaister, Jeffrey; Wong, Alexander; Clausi, David A

    2015-03-01

    A set of high-level intuitive features (HLIFs) is proposed to quantitatively describe melanoma in standard camera images. Melanoma is the deadliest form of skin cancer. With rising incidence rates and subjectivity in current clinical detection methods, there is a need for melanoma decision support systems. Feature extraction is a critical step in melanoma decision support systems. Existing feature sets for analyzing standard camera images are comprised of low-level features, which exist in high-dimensional feature spaces and limit the system's ability to convey intuitive diagnostic rationale. The proposed HLIFs were designed to model the ABCD criteria commonly used by dermatologists such that each HLIF represents a human-observable characteristic. As such, intuitive diagnostic rationale can be conveyed to the user. Experimental results show that concatenating the proposed HLIFs with a full low-level feature set increased classification accuracy, and that HLIFs were able to separate the data better than low-level features with statistical significance. An example of a graphical interface for providing intuitive rationale is given.
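
    As a rough sketch of the kind of comparison reported here (feature values, dimensions and the classifier are placeholders, not the authors' data or implementation), the snippet below contrasts cross-validated accuracy for a low-level feature set alone against the same set concatenated with a handful of HLIF-style scores.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Placeholder arrays: rows are lesion images, columns are features.
        rng = np.random.default_rng(0)
        low_level = rng.normal(size=(200, 400))   # e.g. colour/texture descriptors
        hlifs     = rng.normal(size=(200, 10))    # e.g. asymmetry/border/colour scores
        labels    = rng.integers(0, 2, size=200)  # 0 = benign, 1 = melanoma

        def accuracy(features):
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            return cross_val_score(clf, features, labels, cv=5).mean()

        print("low-level only    :", accuracy(low_level))
        print("low-level + HLIFs :", accuracy(np.hstack([low_level, hlifs])))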

  9. Affinity Crystallography: A New Approach to Extracting High-Affinity Enzyme Inhibitors from Natural Extracts.

    Science.gov (United States)

    Aguda, Adeleke H; Lavallee, Vincent; Cheng, Ping; Bott, Tina M; Meimetis, Labros G; Law, Simon; Nguyen, Nham T; Williams, David E; Kaleta, Jadwiga; Villanueva, Ivan; Davies, Julian; Andersen, Raymond J; Brayer, Gary D; Brömme, Dieter

    2016-08-26

    Natural products are an important source of novel drug scaffolds. The highly variable and unpredictable timelines associated with isolating novel compounds and elucidating their structures have led to the demise of exploring natural product extract libraries in drug discovery programs. Here we introduce affinity crystallography as a new methodology that significantly shortens the time of the hit to active structure cycle in bioactive natural product discovery research. This affinity crystallography approach is illustrated by using semipure fractions of an actinomycetes culture extract to isolate and identify a cathepsin K inhibitor and to compare the outcome with the traditional assay-guided purification/structural analysis approach. The traditional approach resulted in the identification of the known inhibitor antipain (1) and its new but lower potency dehydration product 2, while the affinity crystallography approach led to the identification of a new high-affinity inhibitor named lichostatinal (3). The structure and potency of lichostatinal (3) was verified by total synthesis and kinetic characterization. To the best of our knowledge, this is the first example of isolating and characterizing a potent enzyme inhibitor from a partially purified crude natural product extract using a protein crystallographic approach.

  10. Methods from Information Extraction from LIDAR Intensity Data and Multispectral LIDAR Technology

    Science.gov (United States)

    Scaioni, M.; Höfle, B.; Baungarten Kersting, A. P.; Barazzetti, L.; Previtali, M.; Wujanz, D.

    2018-04-01

    LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, which are thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on 'Information Extraction from LiDAR Intensity Data' has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  11. METHODS FROM INFORMATION EXTRACTION FROM LIDAR INTENSITY DATA AND MULTISPECTRAL LIDAR TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2018-04-01

    Full Text Available LiDAR is a consolidated technology for topographic mapping and 3D reconstruction, which is implemented in several platforms. On the other hand, the exploitation of the geometric information has been coupled with the use of laser intensity, which may provide additional data for multiple purposes. This option has been emphasized by the availability of sensors working at different wavelengths, which are thus able to provide additional information for the classification of surfaces and objects. Several applications of monochromatic and multi-spectral LiDAR data have already been developed in different fields: geosciences, agriculture, forestry, building and cultural heritage. The use of intensity data to extract measures of point cloud quality has also been developed. The paper gives an overview of the state-of-the-art of these techniques and presents the modern technologies for the acquisition of multispectral LiDAR data. In addition, the ISPRS WG III/5 on ‘Information Extraction from LiDAR Intensity Data’ has collected and made available a few open data sets to support scholars doing research in this field. This service is presented and the data sets delivered so far are described.

  12. Ultra-trace levels analysis of microcystins and nodularin in surface water by on-line solid-phase extraction with high-performance liquid chromatography tandem mass spectrometry.

    Science.gov (United States)

    Balest, Lydia; Murgolo, Sapia; Sciancalepore, Lucia; Montemurro, Patrizia; Abis, Pier Paolo; Pastore, Carlo; Mascolo, Giuseppe

    2016-06-01

    An on-line solid phase extraction coupled with high-performance liquid chromatography in tandem with mass spectrometry (on-line SPE/HPLC/MS-MS) method for the determination of five microcystins and nodularin in surface waters at submicrogram per liter concentrations has been optimized. Maximum recoveries were achieved by carefully optimizing the extraction sample volume, loading solvent, wash solvent, and pH of the sample. The developed method was also validated according to both UNI EN ISO IEC 17025 and UNICHIM guidelines. Specifically, ten analytical runs were performed at three different concentration levels using a reference mix solution containing the six analytes. The method was applied for monitoring the concentrations of microcystins and nodularin in real surface water during a sampling campaign of 9 months in which the ELISA method was used as standard official method. The results of the two methods were compared showing good agreement when the highest concentration values of MCs were found. Graphical abstract An on-line SPE/HPLC/MS-MS method for the determination of five microcystins and nodularin in surface waters at sub μg L(-1) was optimized and compared with ELISA assay method for real samples.

  13. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms

    Directory of Open Access Journals (Sweden)

    Dashan Zhang

    2016-04-01

    Full Text Available The development of image sensor and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measuring and does not bring any additional mass to the measuring object compared with traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD sensor can reach up to 1000 Hz. Two efficient subpixel level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both of the two modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.
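
    The general flavor of sub-pixel motion extraction can be sketched as follows: an integer-pixel displacement from cross-correlation of two patches is refined by a parabolic fit around the correlation peak. This is a hedged, generic illustration of sub-pixel refinement, not the authors' modified Taylor approximation or localization refinement algorithms.

        import numpy as np
        from scipy.signal import fftconvolve

        def subpixel_shift(ref, cur):
            """Estimate the (dy, dx) shift of patch 'cur' relative to 'ref':
            integer peak from cross-correlation, then a 1D parabolic fit
            along each axis for sub-pixel refinement."""
            ref0 = ref - ref.mean()
            cur0 = cur - cur.mean()
            corr = fftconvolve(cur0, ref0[::-1, ::-1], mode="same")   # cross-correlation
            iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

            def parabolic(c_m, c_0, c_p):
                denom = c_m - 2 * c_0 + c_p
                return 0.0 if denom == 0 else 0.5 * (c_m - c_p) / denom

            dy = parabolic(corr[iy - 1, ix], corr[iy, ix], corr[iy + 1, ix]) if 0 < iy < corr.shape[0] - 1 else 0.0
            dx = parabolic(corr[iy, ix - 1], corr[iy, ix], corr[iy, ix + 1]) if 0 < ix < corr.shape[1] - 1 else 0.0
            # zero lag sits at the array centre for 'same'-mode correlation
            return (iy - ref.shape[0] // 2 + dy, ix - ref.shape[1] // 2 + dx)

        # Example: a noise patch shifted by (2, 3) pixels
        rng = np.random.default_rng(1)
        ref = rng.normal(size=(64, 64))
        cur = np.roll(ref, (2, 3), axis=(0, 1))
        print(subpixel_shift(ref, cur))   # approximately (2.0, 3.0)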

  14. Determination of total ribonucleotide pool in plant materials by high-pH anion-exchange high-performance liquid chromatography following extraction with potassium hydroxide.

    Science.gov (United States)

    Riondet, Christophe; Morel, Sylvain; Alcaraz, Gérard

    2005-06-10

    A new, improved method that only requires a potassium hydroxide extraction procedure is presented for the analysis of a full nucleotide pool in plant materials. Quantification was performed by high-pH anion-exchange chromatography (HPAEC) with UV detection after a potassium hydroxide extraction, and allowed the quantification of 13 linear ribonucleotides in a single run. The method has been validated by comparison of six extraction methods and also by measurement of the intracellular nucleotide levels of three plant species (cell cultures and leaves). The evolution of the nucleotide pool of Nicotiana tabacum cell culture during growth has also been measured, and showed an increase in the pool until the fifth day, where the growth rate reaches a maximum, after which a decrease was observed.

  15. Extraction method for high free radical scavenging activity of Siamese neem tree flowers

    Directory of Open Access Journals (Sweden)

    Worarat Chaisawangwong

    2009-10-01

    Full Text Available Siamese neem tree (Azadirachta indica A. Juss. var. siamensis Valeton) is a medicinal plant found in Thailand. Young leaves and young flowers of this plant are commonly consumed as a bitter tonic vegetable. The flowers are also used for treatment of fever. The flower extract has been reported to exhibit in vitro free radical scavenging activity and can inhibit lipid peroxidation of a bronchogenic cancer cell line. Active compounds in the flowers are flavonoids such as rutin and quercetin. The content of these compounds in the crude extract depends on the method of extraction. Therefore, the appropriate extraction method promoting a high yield of total flavonoids and high free radical scavenging activity was investigated in this study. Six different extraction methods, i.e. maceration, percolation, decoction, soxhlet extraction, ultrasonic extraction (UE), and microwave-assisted extraction (MA), were carried out on dried powder of Siamese neem tree young flowers. The solvent used for maceration, percolation, and soxhlet extraction was 50% ethanol, while distilled water was used for decoction and MA, and both solvents were used for UE. The content of crude extract, free radical scavenging activity, and total flavonoid content of each extract were investigated and compared. Comparing the various extraction methods, decoction provided an extract containing a high amount of total flavonoids (17.54 mg RE/g extract) and promoting the highest scavenging activity, with an EC50 of 11.36 μg/ml. Decoction is also simple, cheap, and convenient and could be used in developing countries. Thus, it should be the recommended extraction method for the flowers of Siamese neem tree for further development of antioxidant pharmaceutical preparations.

  16. Antiobesity and Hypoglycaemic Effects of Aqueous Extract of Ibervillea sonorae in Mice Fed a High-Fat Diet with Fructose

    Science.gov (United States)

    Rivera-Ramírez, Fabiola; Escalona-Cardoso, Gerardo N.; Garduño-Siciliano, Leticia; Galaviz-Hernández, Carlos; Paniagua-Castro, Norma

    2011-01-01

    Obesity, type II diabetes, and hyperlipidaemia, which frequently coexist and are strongly associated with oxidative stress, increase the risk of cardiovascular disease. An increase in carbohydrate intake, especially of fructose, and a high-fat diet are both factors that contribute to the development of these metabolic disorders. In recent studies carried out in diabetic rats, authors reported that Ibervillea sonorae had hypoglycaemic activity; saponins and monoglycerides present in the plant could be responsible for the effects observed. In the present study, we determined the effects of an aqueous I. sonorae extract on a murine model of obesity and hyperglycaemia, induced by a high-calorie diet, and the relationship of these effects with hepatic oxidation. A high-fat diet over a period of 8 weeks induced weight gain in the mice and increased triglycerides and blood glucose levels. Simultaneous treatment with I. sonorae aqueous extracts, at doses of 100, 200, and 400 mg/kg, decreased triglycerides and glycaemia levels, prevented an increase in body weight in a dose-dependent manner, and decreased hepatic lipid oxidation at a dose of 200 mg/kg. These data suggest that the aqueous extract from I. sonorae root prevents obesity, dyslipidaemia, and hyperglycaemia induced by a hypercaloric diet; however, high doses may induce toxicity. PMID:22174560

  17. Antiobesity and hypoglycaemic effects of aqueous extract of Ibervillea sonorae in mice fed a high-fat diet with fructose.

    Science.gov (United States)

    Rivera-Ramírez, Fabiola; Escalona-Cardoso, Gerardo N; Garduño-Siciliano, Leticia; Galaviz-Hernández, Carlos; Paniagua-Castro, Norma

    2011-01-01

    Obesity, type II diabetes, and hyperlipidaemia, which frequently coexist and are strongly associated with oxidative stress, increase the risk of cardiovascular disease. An increase in carbohydrate intake, especially of fructose, and a high-fat diet are both factors that contribute to the development of these metabolic disorders. In recent studies carried out in diabetic rats, authors reported that Ibervillea sonorae had hypoglycaemic activity; saponins and monoglycerides present in the plant could be responsible for the effects observed. In the present study, we determined the effects of an aqueous I. sonorae extract on a murine model of obesity and hyperglycaemia, induced by a high-calorie diet, and the relationship of these effects with hepatic oxidation. A high-fat diet over a period of 8 weeks induced weight gain in the mice and increased triglycerides and blood glucose levels. Simultaneous treatment with I. sonorae aqueous extracts, at doses of 100, 200, and 400 mg/kg, decreased triglycerides and glycaemia levels, prevented an increase in body weight in a dose-dependent manner, and decreased hepatic lipid oxidation at a dose of 200 mg/kg. These data suggest that the aqueous extract from I. sonorae root prevents obesity, dyslipidaemia, and hyperglycaemia induced by a hypercaloric diet; however, high doses may induce toxicity.

  18. Antiobesity and Hypoglycaemic Effects of Aqueous Extract of Ibervillea sonorae in Mice Fed a High-Fat Diet with Fructose

    Directory of Open Access Journals (Sweden)

    Fabiola Rivera-Ramírez

    2011-01-01

    Full Text Available Obesity, type II diabetes, and hyperlipidaemia, which frequently coexist and are strongly associated with oxidative stress, increase the risk of cardiovascular disease. An increase in carbohydrate intake, especially of fructose, and a high-fat diet are both factors that contribute to the development of these metabolic disorders. In recent studies carried out in diabetic rats, authors reported that Ibervillea sonorae had hypoglycaemic activity; saponins and monoglycerides present in the plant could be responsible for the effects observed. In the present study, we determined the effects of an aqueous I. sonorae extract on a murine model of obesity and hyperglycaemia, induced by a high-calorie diet, and the relationship of these effects with hepatic oxidation. A high-fat diet over a period of 8 weeks induced weight gain in the mice and increased triglycerides and blood glucose levels. Simultaneous treatment with I. sonorae aqueous extracts, at doses of 100, 200, and 400 mg/kg, decreased triglycerides and glycaemia levels, prevented an increase in body weight in a dose-dependent manner, and decreased hepatic lipid oxidation at a dose of 200 mg/kg. These data suggest that the aqueous extract from I. sonorae root prevents obesity, dyslipidaemia, and hyperglycaemia induced by a hypercaloric diet; however, high doses may induce toxicity.

  19. Academic Activities Transaction Extraction Based on Deep Belief Network

    Directory of Open Access Journals (Sweden)

    Xiangqian Wang

    2017-01-01

    Full Text Available Extracting information about academic activity transactions from unstructured documents is a key problem in the analysis of the academic behavior of researchers. An academic activity transaction includes five elements: person, activity, object, attribute, and time phrase. The traditional method of information extraction is to extract shallow text features and then recognize advanced features from the text with supervision. Because the processing at different levels is completed in separate steps, the errors generated at each step accumulate and affect the accuracy of the final results. However, because the Deep Belief Network (DBN) model has the ability to learn advanced features from shallow text features automatically and without supervision, the model is employed to extract academic activity transactions. In addition, we use character-based features to describe the raw features of the named entities of academic activities, so as to improve the accuracy of named entity recognition. In this paper, the accuracy of academic activity extraction is compared using character-based and word-based feature vectors to represent the text features, and against traditional text information extraction based on Conditional Random Fields. The results show that the DBN model is more effective for the extraction of academic activity transaction information.
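
    A minimal sketch of the general pipeline (character-level features, one layer of unsupervised feature learning, then a classifier) is given below; it uses scikit-learn's BernoulliRBM as a stand-in for a full multi-layer DBN, and the toy snippets and labels are invented for illustration.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.neural_network import BernoulliRBM
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import Pipeline

        # Toy labelled snippets: 1 = mentions an academic activity, 0 = does not.
        texts = ["Prof. Li attended the ACL 2016 conference in Berlin",
                 "The weather in Berlin was mild that week",
                 "Dr. Wang gave an invited talk at Tsinghua University",
                 "The cafeteria menu changed on Monday"]
        labels = [1, 0, 1, 0]

        pipeline = Pipeline([
            # character n-grams as raw shallow features (binary presence)
            ("chars", CountVectorizer(analyzer="char_wb", ngram_range=(2, 3), binary=True)),
            # a single RBM layer standing in for the stacked layers of a DBN
            ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        pipeline.fit(texts, labels)
        print(pipeline.predict(["She presented a poster at the KDD workshop"]))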

  20. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  1. High Pressure Extraction of Antioxidants from Solanum stenotomun Peel

    Directory of Open Access Journals (Sweden)

    Enrique J. Martínez de la Ossa

    2013-03-01

    Full Text Available In the work described here, two techniques for the recovery of anthocyanins from potato peel were studied and compared. One technique was supercritical fluid extraction (SFE) with pure CO2 or with CO2 and ethanol as cosolvent; the other was pressurized liquid extraction (PLE), where the solvent used was ethanol in water acidified to pH 2.6. The effects of pressure and temperature were studied and the anthocyanin contents obtained were statistically analyzed. In SFE, the use of low pressure (100 bar) and high temperature (65 °C) was desirable for anthocyanin extraction. With PLE the anthocyanin contents increased considerably, and the best yields were obtained at 100 bar and 80 °C. This result corresponds with the antioxidant activity index value (1.66) obtained in a DPPH antioxidant activity assay. In the extracts obtained with PLE the phenolic compounds were also determined, but the main compounds present in the extract are anthocyanins.

  2. On Robust Information Extraction from High-Dimensional Data

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2014-01-01

    Roč. 9, č. 1 (2014), s. 131-144 ISSN 1452-4864 Grant - others:GA ČR(CZ) GA13-01930S Institutional support: RVO:67985807 Keywords : data mining * high-dimensional data * robust econometrics * outliers * machine learning Subject RIV: IN - Informatics, Computer Science

  3. Studies on Am(III) separation from simulated high-level waste using cobalt bis(dicarbollide) (1(-)) ion derivative covalently bound to N,N'-di-n-octyl diglycol diamide as extractant and DTPA as stripping agent

    Czech Academy of Sciences Publication Activity Database

    Bubeníková, M.; Selucký, P.; Rais, J.; Grüner, Bohumír; Švec, Petr

    2012-01-01

    Roč. 293, č. 1 (2012), s. 403-408 ISSN 0236-5731 R&D Projects: GA ČR GA104/09/0668 Institutional research plan: CEZ:AV0Z40320502 Keywords: Solvent extraction * actinides * high-level liquid waste * dicarbollide derivatives * carboranes * TODGA * DTPA Subject RIV: CA - Inorganic Chemistry Impact factor: 1.467, year: 2012

  4. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  5. Simultaneous Effect of High-Intensity Interval Training (HIIT) and Consumption of Flaxseed on Serum Levels of TNF-α and IL-1β in Rats

    Directory of Open Access Journals (Sweden)

    Khademi Y.

    2017-12-01

    Full Text Available Aims The high concentrations of Interleukin-1 beta (IL-1β) and Tumor Necrosis Factor-α (TNF-α) are important risk factors for developing cardiovascular disease. The purpose of this study was to investigate the simultaneous effect of High-Intensity Interval Training (HIIT) and the use of flaxseed oil at different doses on the serum levels of TNF-α and IL-1β in rats. Materials & Methods In this experimental study, 30 Wistar rats were randomly divided into six groups: control, training, 10 mg/kg supplement, 30 mg/kg supplement, training with 10 mg/kg supplement and training with 30 mg/kg supplement. The groups performed High-Intensity Interval Training (HIIT) for 10 weeks and received flaxseed oil extract. Data were analyzed by one-way ANOVA and the LSD post hoc test. Findings Serum levels of IL-1β in the training group and the training groups with doses of 10 and 30 mg/kg of extract were significantly lower than in the control group. Serum levels of IL-1β in the training group with 30 mg/kg of extract were significantly lower than in the group with 10 mg/kg of extract. Also, serum levels of TNF-α in the training group, the training groups with doses of 10 and 30 mg/kg of extract and the group with 30 mg/kg of extract were significantly lower than in the control group. Serum levels of TNF-α in the training group with 30 mg/kg of extract were significantly lower than in the other groups (p<0.05). Conclusion High-Intensity Interval Training (HIIT) and consumption of flaxseed oil for 10 weeks have interactive effects on the reduction of serum levels of TNF-α and IL-1β in rats.

  6. Determination of the Antibiotic Oxytetracycline in Commercial Milk by Solid-Phase Extraction: A High-Performance Liquid Chromatography (HPLC) Experiment for Quantitative Instrumental Analysis

    Science.gov (United States)

    Mei-Ratliff, Yuan

    2012-01-01

    Trace levels of oxytetracycline spiked into commercial milk samples are extracted, cleaned up, and preconcentrated using a C[subscript 18] solid-phase extraction column. The extract is then analyzed by a high-performance liquid chromatography (HPLC) instrument equipped with a UV detector and a C[subscript 18] column (150 mm x 4.6 mm x 3.5 [mu]m).…
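
    For the quantitation step, the arithmetic a student would typically apply after the HPLC run is a simple external calibration; the sketch below uses invented peak areas and an assumed 10:1 preconcentration factor purely for illustration.

        import numpy as np

        # Hypothetical calibration: peak areas measured for oxytetracycline standards.
        std_conc = np.array([0.05, 0.10, 0.25, 0.50, 1.00])      # ug/mL
        peak_area = np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5, 2.15e5])

        slope, intercept = np.polyfit(std_conc, peak_area, 1)    # linear fit: area = m*conc + b

        # Concentration in the SPE extract, corrected back to the original milk sample.
        sample_area = 7.6e4
        conc_extract = (sample_area - intercept) / slope         # ug/mL in the extract
        preconcentration = 10.0 / 1.0                            # e.g. 10 mL milk eluted into 1 mL
        print("conc in milk ~ %.3f ug/mL" % (conc_extract / preconcentration))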

  7. Steam stripping of polycyclic aromatics from simulated high-level radioactive waste

    International Nuclear Information System (INIS)

    Lambert, D.P.; Shah, H.B.; Young, S.R.; Edwards, R.E.; Carter, J.T.

    1992-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be the United States' first facility to process High Level radioactive Waste (HLW) into a borosilicate glass matrix. The removal of aromatic precipitates by hydrolysis, evaporation, liquid-liquid extraction and decantation will be a key step in the processing of the HLW. This step, titled the Precipitate Hydrolysis Process, has been demonstrated by the Savannah River Technology Center in the Precipitate Hydrolysis Experimental Facility (PHEF). The mission of the PHEF is to demonstrate processing of simulated high level radioactive waste which contains tetraphenylborate precipitates and nitrite. Aqueous washing or nitrite destruction is used to reduce nitrite. Formic acid with a copper catalyst is used to hydrolyze tetraphenylborate (TPB). The primary offgases are benzene, carbon dioxide, nitrous oxide, and nitric oxide. Hydrolysis of TPB in the presence of nitrite results in the production of polycyclic aromatics and aromatic amines (referred to as high-boiling organics) such as biphenyl, diphenylamine, terphenyls, etc. The decanter separates the organic (benzene) and aqueous phases, but separation of the high-boiling organics is difficult. This paper focuses on the evaluation of operating strategies, including steam stripping, to maximize the removal of the high-boiling organics from the aqueous stream. Two areas were investigated: (1) a steam stripping comparison of the late-wash flowsheet with the HAN flowsheet, and (2) a comparison of the extraction performance of the original decanter with the new decanter. The focus of both studies was to minimize the high-boiling organic content of the Precipitate Hydrolysis Aqueous (PHA) product in order to minimize downstream impacts caused by organic deposition

  8. Advanced integrated solvent extraction systems

    Energy Technology Data Exchange (ETDEWEB)

    Horwitz, E.P.; Dietz, M.L.; Leonard, R.A. [Argonne National Lab., IL (United States)

    1997-10-01

    Advanced integrated solvent extraction systems are a series of novel solvent extraction (SX) processes that will remove and recover all of the major radioisotopes from acidic-dissolved sludge or other acidic high-level wastes. The major focus of this effort during the last 2 years has been the development of a combined cesium-strontium extraction/recovery process, the Combined CSEX-SREX Process. The Combined CSEX-SREX Process relies on a mixture of a strontium-selective macrocyclic polyether and a novel cesium-selective extractant based on dibenzo 18-crown-6. The process offers several potential advantages over possible alternatives in a chemical processing scheme for high-level waste treatment. First, if the process is applied as the first step in chemical pretreatment, the radiation level for all subsequent processing steps (e.g., transuranic extraction/recovery, or TRUEX) will be significantly reduced. Thus, less costly shielding would be required. The second advantage of the Combined CSEX-SREX Process is that the recovered Cs-Sr fraction is non-transuranic, and therefore will decay to low-level waste after only a few hundred years. Finally, combining individual processes into a single process will reduce the amount of equipment required to pretreat the waste and therefore reduce the size and cost of the waste processing facility. In an ongoing collaboration with Lockheed Martin Idaho Technology Company (LMITCO), the authors have successfully tested various segments of the Advanced Integrated Solvent Extraction Systems. Eichrom Industries, Inc. (Darien, IL) synthesizes and markets the Sr extractant and can supply the Cs extractant on a limited basis. Plans are under way to perform a test of the Combined CSEX-SREX Process with real waste at LMITCO in the near future.

  9. Electronics, information, Communication and high technology

    International Nuclear Information System (INIS)

    1999-11-01

    This book summarizes the survey, the survey system, the purpose and characteristics of the survey, important research and development fields, a comparison of research and development levels, policy, the characteristics of the respondents, a future outlook for 2025, causes hindering its realization, methods for promoting research and development, and the predicted period of realization, together with the results of the survey in electronics, information and communication, and high technology.

  10. Improving mental task classification by adding high frequency band information.

    Science.gov (United States)

    Zhang, Li; He, Wei; He, Chuanhong; Wang, Ping

    2010-02-01

    Features extracted from the delta, theta, alpha, beta and gamma bands spanning the low frequency range are commonly used to classify scalp-recorded electroencephalogram (EEG) signals for designing brain-computer interfaces (BCIs), and higher frequencies are often neglected as noise. In this paper, we implemented an experimental validation to demonstrate that high frequency components could provide helpful information for improving the performance of a mental task based BCI. Electromyography (EMG) and electrooculography (EOG) artifacts were removed by using blind source separation (BSS) techniques. Frequency band powers and asymmetry ratios from the high frequency band (40-100 Hz), together with those from the lower frequency bands, were used to represent EEG features. Finally, Fisher discriminant analysis (FDA) combined with the Mahalanobis distance was used as the classifier. In this study, four types of classifications were performed using EEG signals recorded from four subjects during five mental tasks. We obtained significantly higher classification accuracy by adding the high frequency band features compared to using the low frequency bands alone, which demonstrated that the information in high frequency components from scalp-recorded EEG is valuable for the mental task based BCI.
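
    A schematic version of this feature pipeline (band powers plus an asymmetry ratio, followed by a linear discriminant) might look like the sketch below; the sampling rate, channel pairing and data are placeholders, and scikit-learn's LinearDiscriminantAnalysis stands in for the paper's FDA-plus-Mahalanobis classifier.

        import numpy as np
        from scipy.signal import welch
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        FS = 250          # sampling rate (Hz), assumed
        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 40), "high": (40, 100)}

        def band_powers(epoch):
            """epoch: (n_channels, n_samples) artifact-corrected EEG segment."""
            freqs, psd = welch(epoch, fs=FS, nperseg=FS)
            feats = []
            for lo, hi in BANDS.values():
                idx = (freqs >= lo) & (freqs < hi)
                power = psd[:, idx].mean(axis=1)          # mean band power per channel
                feats.append(power)
                # simple inter-hemispheric asymmetry ratio between channels 0 and 1
                feats.append([(power[0] - power[1]) / (power[0] + power[1] + 1e-12)])
            return np.concatenate(feats)

        # X: epochs, y: mental-task labels (random placeholders for illustration)
        rng = np.random.default_rng(0)
        X = np.array([band_powers(rng.normal(size=(2, 2 * FS))) for _ in range(60)])
        y = rng.integers(0, 5, size=60)
        clf = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy:", clf.score(X, y))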

  11. High-throughput analysis of sulfatides in cerebrospinal fluid using automated extraction and UPLC-MS/MS.

    Science.gov (United States)

    Blomqvist, Maria; Borén, Jan; Zetterberg, Henrik; Blennow, Kaj; Månsson, Jan-Eric; Ståhlman, Marcus

    2017-07-01

    Sulfatides (STs) are a group of glycosphingolipids that are highly expressed in brain. Due to their importance for normal brain function and their potential involvement in neurological diseases, development of accurate and sensitive methods for their determination is needed. Here we describe a high-throughput oriented and quantitative method for the determination of STs in cerebrospinal fluid (CSF). The STs were extracted using a fully automated liquid/liquid extraction method and quantified using ultra-performance liquid chromatography coupled to tandem mass spectrometry. With the high sensitivity of the developed method, quantification of 20 ST species from only 100 μl of CSF was performed. Validation of the method showed that the STs were extracted with high recovery (90%) and could be determined with low inter- and intra-day variation. Our method was applied to a patient cohort of subjects with an Alzheimer's disease biomarker profile. Although the total ST levels were unaltered compared with an age-matched control group, we show that the ratio of hydroxylated/nonhydroxylated STs was increased in the patient cohort. In conclusion, we believe that the fast, sensitive, and accurate method described in this study is a powerful new tool for the determination of STs in clinical as well as preclinical settings. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.

  12. CLASSIFICATION OF INFORMAL SETTLEMENTS THROUGH THE INTEGRATION OF 2D AND 3D FEATURES EXTRACTED FROM UAV DATA

    Directory of Open Access Journals (Sweden)

    C. M. Gevaert

    2016-06-01

    Full Text Available Unmanned Aerial Vehicles (UAVs are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
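
    The integration step can be sketched as a simple feature-level fusion followed by a supervised classifier; in the illustrative snippet below the per-segment feature values and class labels are random placeholders, and a random forest stands in for whichever classifier is used.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_seg = 500
        # Placeholder per-segment features; real values would come from the
        # orthomosaic, DSM and point cloud of the survey area.
        radiometric = rng.normal(size=(n_seg, 3))    # e.g. mean R, G, B
        textural    = rng.normal(size=(n_seg, 4))    # e.g. GLCM statistics
        topographic = rng.normal(size=(n_seg, 2))    # e.g. nDSM height, slope
        geometric   = rng.normal(size=(n_seg, 3))    # e.g. planarity, verticality, roughness
        labels      = rng.integers(0, 3, size=n_seg) # building / terrain / clutter

        sets = {"2D only": np.hstack([radiometric, textural]),
                "2D + 2.5D + 3D": np.hstack([radiometric, textural, topographic, geometric])}
        for name, X in sets.items():
            acc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                                  X, labels, cv=5).mean()
            print(name, "accuracy:", round(acc, 3))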

  13. Progress in evaluation of radionuclide geochemical information developed by DOE high-level nuclear waste repository site projects

    International Nuclear Information System (INIS)

    Meyer, R.E.; Arnold, W.D.; O'Kelley, G.D.; Case, F.I.; Land, J.F.

    1989-08-01

    Information that is being developed by projects within the Department of Energy (DOE) pertinent to the potential geochemical behavior of radionuclides at candidate sites for a high-level radioactive waste repository is being evaluated by Oak Ridge National Laboratory (ORNL) for the Nuclear Regulatory Commission (NRC). During this report period, all experiments were conducted with tuff from the proposed high-level nuclear waste site at Yucca Mountain, Nevada. The principal emphasis in this report period was on column studies of migration of uranium and technetium in water from well J-13 at the Yucca Mountain site. Columns 1 cm in diameter and about 5 cm long were constructed and carefully packed with ground tuff. The characteristics of the columns were tested by determination of elution curves of tritium and TcO 4 - . Elution peaks obtained in past studies with uranium were asymmetrical and the shapes were often complex, observations that suggested irreversibilities in the sorption reaction. To try to understand these observations, the effects of flow rate and temperature on uranium migration were studied in detail. Sorption ratios calculated from the elution peaks became larger as the flow rate decreased and as the temperature increased. These observations support the conclusion that the sorption of uranium is kinetically hindered. To confirm this, batch sorption ratio experiments were completed for uranium as a function of time for a variety of conditions

  14. Betel Leaf Extract (Piper betle L.) Antihyperuricemia Effect Decreases Oxidative Stress by Reducing the Level of MDA and Increasing Blood SOD Levels of Hyperuricemia Wistar Rats (Rattus norvegicus)

    Directory of Open Access Journals (Sweden)

    I Made Sumarya

    2016-06-01

    Full Text Available Background: Betel leaf extract (Piper betle L.) has antioxidant activity and inhibits the enzyme xanthine oxidase (XO). Hyperuricemia causes oxidative stress by increasing the formation of reactive oxygen species (ROS), which cause lipid peroxidation and oxidation of low-density lipoprotein cholesterol (LDLc). Objective: The aim of this research was to determine whether betel leaf extract acts as an antihyperuricemic agent that lowers blood uric acid levels and oxidative stress by lowering MDA levels and increasing SOD levels in the blood of hyperuricemic rats. Method: An experimental study with a randomized post-test-only control group design was conducted on normal Wistar rats (Rattus norvegicus), rats administered potassium oxonate (hyperuricemic), and hyperuricemic rats given either betel leaf extract or allopurinol. After the experiment, uric acid, MDA and SOD levels in rat blood were determined. Results: The results showed that the betel leaf extract significantly (p < 0.05) lowered uric acid and MDA levels and increased SOD levels in rat blood. There was a positive correlation between uric acid and MDA levels, and a negative, though not significant, correlation with SOD (p > 0.05). Conclusion: It can be concluded that betel leaf extract, as an antihyperuricemic agent, lowers uric acid levels and decreases oxidative stress by lowering MDA levels and increasing SOD levels.

  15. [The comparison of blood levels between peripheral vein and tooth extraction wound after the oral administration of antibiotics (author's transl)].

    Science.gov (United States)

    Hashimoto, T; Ookawa, H; Morishita, M; Takeyasu, K; Shiiki, K; Imoto, T

    1981-06-01

    The oral administration of 300 mg of clindamycin was undertaken in 23 patients, of 500 mg of cefadroxil in 11 patients, and of 250 mg of talampicillin in 12 patients, and tooth extraction was then performed under local anesthesia. Blood samples were taken from the extraction wound and the peripheral vein at the same time and assayed by the bioassay method. The blood levels of clindamycin and cefadroxil showed a similar pattern between the extraction wound and the peripheral vein, but the blood level of talampicillin reached its peak more rapidly than those of clindamycin and cefadroxil. The blood levels in the extraction wound were 60 - 80% of the venous blood levels for each antimicrobial agent.

  16. Interpersonal Movement Synchrony Responds to High- and Low-Level Conversational Constraints

    Directory of Open Access Journals (Sweden)

    Alexandra Paxton

    2017-07-01

    Full Text Available Much work on communication and joint action conceptualizes interaction as a dynamical system. Under this view, dynamic properties of interaction should be shaped by the context in which the interaction is taking place. Here we explore interpersonal movement coordination or synchrony—the degree to which individuals move in similar ways over time—as one such context-sensitive property. Studies of coordination have typically investigated how these dynamics are influenced by either high-level constraints (i.e., slow-changing factors or low-level constraints (i.e., fast-changing factors like movement. Focusing on nonverbal communication behaviors during naturalistic conversation, we analyzed how interacting participants' head movement dynamics were shaped simultaneously by high-level constraints (i.e., conversation type; friendly conversations vs. arguments and low-level constraints (i.e., perceptual stimuli; non-informative visual stimuli vs. informative visual stimuli. We found that high- and low-level constraints interacted non-additively to affect interpersonal movement dynamics, highlighting the context sensitivity of interaction and supporting the view of joint action as a complex adaptive system.

  17. Effects of aqueous extract of Portulaca oleracea L. on oxidative stress and liver, spleen leptin, PARα and FAS mRNA expression in high-fat diet induced mice.

    Science.gov (United States)

    Chen, Bendong; Zhou, Haining; Zhao, Wenchao; Zhou, Wenyan; Yuan, Quan; Yang, Guangshun

    2012-08-01

    We report that an aqueous extract of Portulaca oleracea L. inhibited high-fat-diet-induced oxidative injury in a dose-dependent manner. Male Kunming mice (5 weeks old, 24 g) were used in this experiment. After a 4-day adaptation period, animals were randomly divided into four groups (n = 10 in each group); Group 1: animals received a normal powdered rodent diet; Group 2: animals received a high-fat diet; Groups 3 and 4: animals received a high-fat diet and were given the aqueous extract by gavage once a day at doses of 100 and 200 mg/kg body weight, respectively. In mice fed the high-fat diet, blood and liver lipid peroxidation levels were significantly increased, whereas antioxidant enzyme activities were markedly decreased compared to normal control mice. Administration of the aqueous extract of P. oleracea L. significantly and dose-dependently reduced blood and liver lipid peroxidation levels and increased blood and liver antioxidant enzyme activities in high-fat mice. Moreover, administration of the aqueous extract of P. oleracea L. significantly and dose-dependently increased liver leptin/β-actin and liver PPARα/β-actin, and decreased liver and spleen FAS mRNA, p-PERK and p-PERK/PERK protein expression levels. Taken together, these data demonstrate that the aqueous extract of P. oleracea L. can markedly alleviate high-fat-diet-induced oxidative injury by enhancing blood and liver antioxidant enzyme activities, modulating leptin/β-actin and liver PPARα/β-actin, and decreasing liver and spleen FAS mRNA, p-PERK and p-PERK/PERK protein expression levels in mice.

  18. Green Tea Extract Supplementation Induces the Lipolytic Pathway, Attenuates Obesity, and Reduces Low-Grade Inflammation in Mice Fed a High-Fat Diet

    Directory of Open Access Journals (Sweden)

    Cláudio A. Cunha

    2013-01-01

    Full Text Available The aim of this study was to evaluate the effects of green tea (Camellia sinensis) extract on proinflammatory molecules and lipolytic protein levels in the adipose tissue of diet-induced obese mice. Animals were randomized into four groups: CW (chow diet and water); CG (chow diet and water + green tea extract); HW (high-fat diet and water); HG (high-fat diet and water + green tea extract). The mice were fed ad libitum with chow or high-fat diet and concomitantly supplemented (oral gavage) with 400 mg/kg body weight/day of green tea extract (CG and HG, respectively). The treatments were performed for eight weeks. UPLC showed that in 10 mg/mL green tea extract, there were 15 μg/mg epigallocatechin, 95 μg/mg epigallocatechin gallate, 20.8 μg/mg epicatechin gallate, and 4.9 μg/mg gallocatechin gallate. Green tea administered concomitantly with a high-fat diet increased HSL, ABHD5, and perilipin in mesenteric adipose tissue, and this was associated with reduced body weight and adipose tissue gain. Further, we observed that green tea supplementation reduced levels of the inflammatory cytokine TNFα, as well as TLR4, MYD88, and TRAF6 proinflammatory signalling. Our results show that green tea increases the lipolytic pathway and reduces adipose tissue, and this may explain the attenuation of low-grade inflammation in obese mice.

  19. Text mining analysis of public comments regarding high-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Kugo, Akihide; Yoshikawa, Hidekazu; Shimoda, Hiroshi; Wakabayashi, Yasunaga

    2005-01-01

    In order to narrow the risk perception gap, seen in social surveys, between the general public and people who are involved in the nuclear industry, public comments on high-level radioactive waste (HLW) disposal were analyzed to find the significant talking points with the general public for constructing an effective risk communication model of social risk information regarding HLW disposal. Text mining was introduced to examine the public comments and identify the core public interests underlying them. The text mining method used here clusters specific groups of words with negative meanings and then analyzes public understanding by employing text structural analysis to extract words from subjective expressions. Using these procedures, it was found that the public does not trust the nuclear fuel cycle promotion policy and shows signs of anxiety about the long-lasting technological reliability of waste storage. To develop effective social risk communication on HLW issues, these findings are expected to help experts in the nuclear industry communicate with the general public more effectively and obtain their trust. (author)
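
    As an illustrative stand-in for the word-clustering step (not the authors' Japanese-language pipeline; the comments and cluster count are invented), a generic bag-of-words clustering of comment text could look like this:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans

        comments = [
            "I do not trust the fuel cycle policy at all",
            "The long-term safety of the storage site is doubtful",
            "Who guarantees the waste canisters for ten thousand years?",
            "Information from the operator has been insufficient",
        ]
        vec = TfidfVectorizer(stop_words="english")
        X = vec.fit_transform(comments)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

        # print the most representative terms of each cluster
        terms = vec.get_feature_names_out()
        for c in range(km.n_clusters):
            top = km.cluster_centers_[c].argsort()[::-1][:5]
            print("cluster", c, "->", [terms[i] for i in top])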

  20. Acute and Subchronic Oral Toxicity of Aqueous Extract of Ageratum ...

    African Journals Online (AJOL)

    However, histological studies revealed that the extract caused dose-dependent lesions, resulting in hepatorenal changes correlated with high transaminase activity and hyperleukocytosis at the 800 mg/kg dose level. The hemoglobin and hematocrit concentrations were also high in all groups treated with the extract.

  1. ONTOGRABBING: Extracting Information from Texts Using Generative Ontologies

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer; Szymczak, Bartlomiej Antoni; Jensen, P.A.

    2009-01-01

    We describe principles for extracting information from texts using a so-called generative ontology in combination with syntactic analysis. Generative ontologies are introduced as semantic domains for natural language phrases. Generative ontologies extend ordinary finite ontologies with rules for producing recursively shaped terms representing the ontological content (ontological semantics) of NL noun phrases and other phrases. We focus here on achieving a robust, often only partial, ontology-driven parsing of and ascription of semantics to a sentence in the text corpus. The aim of the ontological analysis is primarily to identify paraphrases, thereby achieving a search functionality beyond mere keyword search with synsets. We further envisage use of the generative ontology as a phrase-based rather than word-based browser into text corpora.

  2. MedEx: a medication information extraction system for clinical narratives

    Science.gov (United States)

    Stenner, Shane P; Doan, Son; Johnson, Kevin B; Waitman, Lemuel R; Denny, Joshua C

    2010-01-01

    Medication information is one of the most important types of clinical data in electronic medical records. It is critical for healthcare safety and quality, as well as for clinical research that uses electronic medical record data. However, medication data are often recorded in clinical notes as free-text. As such, they are not accessible to other computerized applications that rely on coded data. We describe a new natural language processing system (MedEx), which extracts medication information from clinical notes. MedEx was initially developed using discharge summaries. An evaluation using a data set of 50 discharge summaries showed it performed well on identifying not only drug names (F-measure 93.2%), but also signature information, such as strength, route, and frequency, with F-measures of 94.5%, 93.9%, and 96.0% respectively. We then applied MedEx unchanged to outpatient clinic visit notes. It performed similarly with F-measures over 90% on a set of 25 clinic visit notes. PMID:20064797
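
    For context, the F-measures quoted above are harmonic means of precision and recall; a minimal illustration follows, with made-up extraction counts rather than MedEx's actual evaluation numbers.

    ```python
    # F-measure (F1) as the harmonic mean of precision and recall.
    def f_measure(tp: int, fp: int, fn: int) -> float:
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    # Hypothetical counts for a drug-name extraction run (not from the paper):
    print(round(f_measure(tp=90, fp=10, fn=6), 3))  # 0.918
    ```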

  3. Ge extraction from gasification fly ash

    Energy Technology Data Exchange (ETDEWEB)

    Oriol Font; Xavier Querol; Angel Lopez-Soler; Jose M. Chimenos; Ana I. Fernandez; Silvia Burgos; Francisco Garcia Pena [Institute of Earth Sciences 'Jaume Almera', Barcelona (Spain)]

    2005-08-01

    Water-soluble germanium species (GeS2, GeS and hexagonal GeO2) are generated during coal gasification and retained in fly ash. This fact, together with the high market value of this element and its relatively high contents in the fly ashes of the Puertollano Integrated Gasification Combined Cycle (IGCC) plant, directed our research towards the development of an extraction process for this element. A major objective of this research was to find a low-cost and environmentally suitable process. Several water-based extraction tests were carried out using different Puertollano IGCC fly ash samples under different temperatures, water/fly ash ratios, and extraction times. High Ge extraction yields (up to 84%) were obtained at room temperature (25°C), but high proportions of other trace elements (impurities) were simultaneously extracted. Increasing the extraction temperature to 50, 90 and 150°C kept Ge extraction yields at similar levels while reducing the content of impurities, the water/fly ash ratio and the extraction time. The experimental data point out the influence of chloride, calcium and sulphide dissolution on the Ge extraction. 16 refs., 9 figs., 6 tabs.

  4. Methods to extract information on the atomic and molecular states from scientific abstracts

    International Nuclear Information System (INIS)

    Sasaki, Akira; Ueshima, Yutaka; Yamagiwa, Mitsuru; Murata, Masaki; Kanamaru, Toshiyuki; Shirado, Tamotsu; Isahara, Hitoshi

    2005-01-01

    We propose a new application of information technology to recognize and extract expressions of atomic and molecular states from electronic forms of scientific abstracts. The present results will help scientists to understand atomic states as well as the physics discussed in the articles. Combined with internet search engines, it will make it possible to collect not only atomic and molecular data but also broader scientific information over a wide range of research fields. (author)

  5. THE COMBINATION OF MANGOSTEEN PEEL EXTRACT WITH ROSELLA FLOWER PETALS EXTRACT AND ANTHILL PLANT EXTRACT AS CHOLESTEROL AND TRIGLYCERIDES REDUCER ON MALE WHITE RATS

    Directory of Open Access Journals (Sweden)

    Anjar Mahardian Kusuma

    2016-12-01

    Full Text Available Hypercholesterolemia is a disease associated with high levels of cholesterol and LDL in the blood. Commercial drugs can be used; however, apart from their expensive price, adverse side effects might occur. This makes people choose alternative medication with herbal medicine through the use of natural materials. This study aimed to determine the effect of the combinations of mangosteen peel extract with roselle calyx extract and of mangosteen peel extract with anthill plant extract in lowering cholesterol and triglyceride levels in male rats. The method used in this study was a laboratory experimental method with a posttest-only control group design (simple experimental design). This study used 25 male Wistar rats, divided into 5 groups: Group I, without treatment; Group II, solvent control (NaCMC 1%); Group III, positive control (simvastatin); Group IV, combination of mangosteen peel extract (200 mg/kg) and roselle calyx extract (250 mg/kg); Group V, combination of mangosteen peel extract (200 mg/kg) and anthill plant extract (270 mg/kg). Cholesterol was induced in the rats using quail egg yolk (10 mL/kg). The results showed that there was no significant difference in cholesterol and triglyceride levels between the combinations of mangosteen peel extract with either extract and the positive control (p < 0.05).

  6. Strong coupling constant extraction from high-multiplicity Z+jets observables

    Science.gov (United States)

    Johnson, Mark; Maître, Daniel

    2018-03-01

    We present a strong coupling constant extraction at next-to-leading order QCD accuracy using ATLAS Z+2,3,4 jets data. This is the first extraction using processes with a dependency on high powers of the coupling constant. We obtain values of the strong coupling constant at the Z mass compatible with the world average and with uncertainties commensurate with other next-to-leading order extractions at hadron colliders. Our most conservative result for the strong coupling constant is αS(MZ) = 0.1178 +0.0051/−0.0043.

  7. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...

  8. Partitioning and recovery of neptunium from high level waste streams of PUREX origin using 30% TBP

    International Nuclear Information System (INIS)

    Mathur, J.N.; Murali, M.S.; Balarama Krishna, M.V.; Iyer, R.H.; Chitnis, R.R.; Wattal, P.K.; Theyyunni, T.K.; Ramanujam, A.; Dhami, P.S.; Gopalakrishnan, V.

    1995-01-01

    237Np is one of the longest-lived nuclides among the actinides present in high-level waste solutions of reprocessing origin. Its separation, recovery and transmutation can greatly reduce the problem of long-term storage of the vitrified waste. With this objective, the present work was initiated to study the extraction of neptunium into TBP under conditions relevant to high-level waste, along with uranium and plutonium, by oxidising it to the hexavalent state using potassium dichromate and subsequently recovering it by selective stripping. Three types of simulated HLW solutions were employed in these studies: a sulphate-bearing (SB) waste with an acidity of ∼0.3 M, and non-sulphate wastes originating from the reprocessing of fuels from pressurised heavy water reactors (PHWR) and fast breeder reactors (FBR) with acidities of 3.0 M HNO3. The extraction of U(VI), Np(VI) and Pu(VI) was very high for the PHWR- and FBR-HLW solutions, whereas for the SB-HLW solution these values were lower but still reasonably high. Quantitative recovery of neptunium and plutonium was achieved using a stripping solution containing 0.1 M H2O2 and 0.01 M ascorbic acid at an acidity of 2.0 M. Since cerium present in the waste solutions is expected to undergo oxidation in the presence of K2Cr2O7, its extraction behaviour was also studied under similar conditions. Based on the results, a scheme was formulated for the recovery of neptunium along with plutonium and was successfully applied to an actual high-level waste solution originating from the reprocessing of research reactor fuels. (author). 19 refs., 2 figs., 17 tabs

  9. Chemopreventive and Antiproliferative Effect of Andrographis Paniculata Extract

    Directory of Open Access Journals (Sweden)

    Agrawal RC

    2017-06-01

    Full Text Available An Andrographis paniculata leaf and stem extract was studied in HeLa cell lines by in vitro methods, and its anti-promoting effect was studied in a skin tumour model. Dose-dependent cytotoxicity in HeLa cell lines was observed for both the stem and leaf extracts of Andrographis paniculata. Prevention of bone marrow micronucleus formation by the Andrographis paniculata leaf and stem extracts was also observed, and reductions in tumour numbers were recorded. The glutathione level was increased in the liver of animals which received the Andrographis extract along with DMBA + croton oil. These observations provide information about the anticancer, antiproliferative and antimutagenic effects of the Andrographis paniculata extract.

  10. In Vivo Hypocholesterolemic Effect of MARDI Fermented Red Yeast Rice Water Extract in High Cholesterol Diet Fed Mice

    Directory of Open Access Journals (Sweden)

    Swee Keong Yeap

    2014-01-01

    Full Text Available Fermented red yeast rice has been traditionally consumed as medication in Asian cuisine. This study aimed to determine the in vivo hypocholesterolemic and antioxidant effects of fermented red yeast rice water extract produced using Malaysian Agricultural Research and Development Institute (MARDI) Monascus purpureus strains in mice fed with high cholesterol diet. Absence of monacolin-k, lower level of γ-aminobutyric acid (GABA), higher content of total amino acids, and antioxidant activities were detected in MARDI fermented red yeast rice water extract (MFRYR). In vivo MFRYR treatment on hypercholesterolemic mice recorded similar lipid lowering effect as commercial red yeast rice extract (CRYR) as it helps to reduce the elevated serum liver enzyme and increased the antioxidant levels in liver. This effect was also associated with the upregulation of apolipoproteins-E and inhibition of Von Willebrand factor expression. In summary, MFRYR enriched in antioxidant and amino acid without monacolin-k showed similar hypocholesterolemic effect as CRYR that was rich in monacolin-k and GABA.

  11. In Vivo Hypocholesterolemic Effect of MARDI Fermented Red Yeast Rice Water Extract in High Cholesterol Diet Fed Mice

    Science.gov (United States)

    Beh, Boon Kee; Kong, Joan; Ho, Wan Yong; Mohd Yusof, Hamidah; Hussin, Aminuddin bin; Jaganath, Indu Bala; Alitheen, Noorjahan Banu; Jamaluddin, Anisah

    2014-01-01

    Fermented red yeast rice has been traditionally consumed as medication in Asian cuisine. This study aimed to determine the in vivo hypocholesterolemic and antioxidant effects of fermented red yeast rice water extract produced using Malaysian Agricultural Research and Development Institute (MARDI) Monascus purpureus strains in mice fed with high cholesterol diet. Absence of monacolin-k, lower level of γ-aminobutyric acid (GABA), higher content of total amino acids, and antioxidant activities were detected in MARDI fermented red yeast rice water extract (MFRYR). In vivo MFRYR treatment on hypercholesterolemic mice recorded similar lipid lowering effect as commercial red yeast rice extract (CRYR) as it helps to reduce the elevated serum liver enzyme and increased the antioxidant levels in liver. This effect was also associated with the upregulation of apolipoproteins-E and inhibition of Von Willebrand factor expression. In summary, MFRYR enriched in antioxidant and amino acid without monacolin-k showed similar hypocholesterolemic effect as CRYR that was rich in monacolin-k and GABA. PMID:25031606

  12. Transferring knowledge about high-level waste repositories: An ethical consideration

    International Nuclear Information System (INIS)

    Berndes, S.; Kornwachs, K.

    1996-01-01

    The purpose of this paper is to present requirements for Information and Documentation Systems for high-level waste repositories from an ethical point of view. A structured synopsis of ethical arguments used by experts from Europe and America is presented. On the one hand, the review suggests reinforcing the obligation to transfer knowledge about high-level waste repositories. On the other hand, this obligation is weakened by the objection that ethical obligations depend on the differences between our civilization and future civilizations. This reflection results in proposing a list of well-balanced ethical arguments. A method is then presented which shows how scenarios of possible future civilizations for different time horizons, and the related ethical arguments, are used to justify requirements for the Information and Documentation System.

  13. An investigation of children's levels of inquiry in an informal science setting

    Science.gov (United States)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon students' willingness and ability to delve into such higher level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children when they are engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparent purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, in this study adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical-feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators.

  14. Anti-hyperglycemic effect of Aloe vera peel extract on blood sugar level of alloxan-induced Wistar rats

    Science.gov (United States)

    Peniati, E.; Setiadi, E.; Susanti, R.; Iswari, R. S.

    2018-03-01

    Aloe vera peel contains flavonoids, alkaloids, tannins, saponins, and sterols as its secondary metabolites. This research explored the effect of Aloe vera peel extract on the blood glucose levels of alloxan-induced Wistar rats at a laboratory experimental scale. Blood glucose examination was performed using the GOD-PAP method. Twenty-five healthy two-month-old male white rats (Rattus norvegicus, Wistar strain) weighing 150-200 grams were randomly divided into five groups: a negative control group (K-), a positive control group (K+), and treatment groups 1, 2 and 3 (P1, P2, P3). Each group was fed a standard diet with drinking water ad libitum. Treatments were given for 28 days. On day 29, the blood glucose levels of all groups were analyzed. The results showed that the highest blood glucose level was found in the positive control group (191.2 mg/dl). The Aloe vera extract was able to decrease the blood sugar level to 104.6 mg/dl in the P3 treatment group (given Aloe vera extract at 350 mg/kg BW/day). It is concluded that giving Aloe vera peel extract for 28 days decreases the blood sugar level of hyperglycemic rats.

  15. A Four-Level Hierarchy for Organizing Wildland Stream Resource Information

    Science.gov (United States)

    Harry Parrott; Daniel A. Marion; R. Douglas Perkinson

    1989-01-01

    An analysis of current USDA Forest Service methods of collecting and using wildland stream resource data indicates that required information can be organized into a four-level hierarchy. Information at each level is tiered with information at the preceding level. Level 1 is the ASSOCIATION, which is differentiated by stream size and flow regime. Level 2, STREAM TYPE,...

  16. A Concept of Constructing a Common Information Space for High Tech Programs Using Information Analytical Systems

    Science.gov (United States)

    Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.

    2016-04-01

    The paper deals with the issues in program management used for engineering innovative products. The existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management used for high-tech products: research intensity, a high level of technical risks, unpredictable results due to the impact of various external factors, and the availability of several implementing agencies. The need for involving experts and using intelligent techniques for information processing is demonstrated. A conceptual model of a common information space to support communication between members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated with the purpose of implementing the conceptual model of a common information space in the program “Development and production of new class mining equipment - “Geokhod”.

  17. The technology of uranium extraction from the brine with high chlorine-ion content

    International Nuclear Information System (INIS)

    Khakimov, N.; Nazarov, Kh.M.; Mirsaidov, I.U.; Negmatov, Sh.I.; Barotov, B.B.

    2010-01-01

    The present article is devoted to the technology of uranium extraction from brine with a high chlorine-ion content. The research results on uranium extraction from the brine of Sasik-Kul Lake by means of a sorption method were considered. The chemical composition of the salt was determined. The process of uranium sorption was described and analyzed. A technology for uranium extraction from brine with a high chlorine-ion content was proposed.

  18. Building Extraction in Very High Resolution Remote Sensing Imagery Using Deep Learning and Guided Filters

    Directory of Open Access Journals (Sweden)

    Yongyang Xu

    2018-01-01

    Full Text Available Very high resolution (VHR) remote sensing imagery has been used for land cover classification, and the task is shifting from land-use classification to pixel-level semantic segmentation. Inspired by the recent success of deep learning and of filtering methods in computer vision, this work provides a segmentation model which designs an image segmentation neural network based on deep residual networks and uses a guided filter to extract buildings from remote sensing imagery. Our method includes the following steps: first, the VHR remote sensing imagery is preprocessed and some hand-crafted features are calculated. Second, a designed deep network architecture is trained with the urban district remote sensing image to extract buildings at the pixel level. Third, a guided filter is employed to optimize the classification map produced by deep learning; at the same time, some salt-and-pepper noise is removed. Experimental results based on the Vaihingen and Potsdam datasets demonstrate that our method, which benefits from neural networks and guided filtering, achieves a higher overall accuracy when compared with other machine learning and deep learning methods. The proposed method shows outstanding performance in terms of building extraction from the diversified objects in the urban district.
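
    As a rough illustration of the guided-filter refinement step, the sketch below applies a standard single-band guided filter (per-pixel linear model in a local window) to a building probability map; the window radius, regularization value, and synthetic inputs are assumptions, not the authors' settings.

    ```python
    # Minimal guided filter: smooth a building probability map p using an
    # image band I as guidance; r and eps are assumed, not tuned values.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(I, p, r=8, eps=1e-3):
        """Edge-preserving smoothing of p guided by I (both 2-D float arrays)."""
        mean = lambda x: uniform_filter(x, size=2 * r + 1)
        mean_I, mean_p = mean(I), mean(p)
        var_I = mean(I * I) - mean_I * mean_I      # local variance of the guidance
        cov_Ip = mean(I * p) - mean_I * mean_p     # local covariance of I and p
        a = cov_Ip / (var_I + eps)                 # per-pixel linear coefficients
        b = mean_p - a * mean_I
        return mean(a) * I + mean(b)               # filtered output q

    # Example: refine a noisy probability map with a synthetic guidance band.
    I = np.random.rand(128, 128)
    p = (I > 0.5).astype(float) + 0.1 * np.random.randn(128, 128)
    q = guided_filter(I, p)
    ```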

  19. Helichrysum and grapefruit extracts inhibit carbohydrate digestion and absorption, improving postprandial glucose levels and hyperinsulinemia in rats.

    Science.gov (United States)

    de la Garza, Ana Laura; Etxeberria, Usune; Lostao, María Pilar; San Román, Belén; Barrenetxe, Jaione; Martínez, J Alfredo; Milagro, Fermín I

    2013-12-11

    Several plant extracts rich in flavonoids have been reported to improve hyperglycemia by inhibiting digestive enzyme activities and SGLT1-mediated glucose uptake. In this study, helichrysum (Helichrysum italicum) and grapefruit (Citrus × paradisi) extracts inhibited in vitro enzyme activities. The helichrysum extract showed higher inhibitory activity of α-glucosidase (IC50 = 0.19 mg/mL) than α-amylase (IC50 = 0.83 mg/mL), whereas the grapefruit extract presented similar α-amylase and α-glucosidase inhibitory activities (IC50 = 0.42 mg/mL and IC50 = 0.41 mg/mL, respectively). Both extracts reduced maltose digestion in noneverted intestinal sacs (57% with helichrysum and 46% with grapefruit). Likewise, both extracts inhibited SGLT1-mediated methylglucoside uptake in Caco-2 cells in the presence of Na(+) (56% of inhibition with helichrysum and 54% with grapefruit). In vivo studies demonstrated that helichrysum decreased blood glucose levels after an oral maltose tolerance test (OMTT), and both extracts reduced postprandial glucose levels after the oral starch tolerance test (OSTT). Finally, both extracts improved hyperinsulinemia (31% with helichrysum and 50% with grapefruit) and HOMA index (47% with helichrysum and 54% with grapefruit) in a dietary model of insulin resistance in rats. In summary, helichrysum and grapefruit extracts improve postprandial glycemic control in rats, possibly by inhibiting α-glucosidase and α-amylase enzyme activities and decreasing SGLT1-mediated glucose uptake.

  20. Domain-independent information extraction in unstructured text

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H. [Sandia National Labs., Albuquerque, NM (United States). Software Surety Dept.]

    1996-09-01

    Extracting information from unstructured text has become an important research area in recent years due to the large amount of text now electronically available. This status report describes the findings and work done during the second year of a two-year Laboratory Directed Research and Development Project. Building on the first year's work of identifying important entities, this report details techniques used to group words into semantic categories and to output templates containing selective document content. Using word profiles and category clustering derived during a training run, the time-consuming knowledge-building task can be avoided. Though the output still lacks completeness when compared to systems with domain-specific knowledge bases, the results do look promising. The two approaches are compatible and could complement each other within the same system. Domain-independent approaches retain appeal, as a system that adapts and learns will soon outpace a system with any amount of a priori knowledge.

  1. Data base system for research and development of high-level waste conditioning

    International Nuclear Information System (INIS)

    Masaki, Toshio; Igarashi, Hiroshi; Ohuchi, Jin; Miyauchi, Tomoko.

    1992-01-01

    Results of research and development on high-level waste conditioning are accumulated in a large number of documents. The Data Base System for Research and Development of High-Level Waste Conditioning has been developed since 1987 to search for necessary information correctly and rapidly, with the intention of offering and transferring the results to organizations inside and outside of PNC. This database system has ensured that technical information can be searched correctly and rapidly. Designing of devices and preparation of reports have become easier, and the work has been accomplished efficiently and rationally. (author)

  2. Breast ultrasound image segmentation: an optimization approach based on super-pixels and high-level descriptors

    Science.gov (United States)

    Massich, Joan; Lemaître, Guillaume; Martí, Joan; Mériaudeau, Fabrice

    2015-04-01

    Breast cancer is the second most common cancer and the leading cause of cancer death among women. Medical imaging has become an indispensable tool for its diagnosis and follow-up. During the last decade, the medical community has promoted the incorporation of Ultra-Sound (US) screening as part of the standard routine. The main reason for using US imaging is its capability to differentiate benign from malignant masses when compared to other imaging techniques. The increasing usage of US imaging encourages the development of Computer Aided Diagnosis (CAD) systems applied to Breast Ultra-Sound (BUS) images. However, accurate delineations of the lesions and structures of the breast are essential for CAD systems in order to extract the information needed to perform diagnosis. This article proposes a highly modular and flexible framework for segmenting lesions and tissues present in BUS images. The proposal takes advantage of optimization strategies using super-pixels and high-level descriptors, which are analogous to the visual cues used by radiologists. Qualitative and quantitative results are provided, showing a performance within the range of the state of the art.

  3. Progress in radiation chemistry of crown ether extractants used for the solvent extraction of 90Sr

    International Nuclear Information System (INIS)

    Peng Jing; Yu Chuhong; Cui Zhenpeng; Zhai Maolin

    2011-01-01

    The separation of long-lived fission products from dissolved nuclear fuel could improve the safe disposal of high-level nuclear wastes and reduce their threat to human beings and the environment. Since the extractant system will be exposed to a high radiation environment during the solvent extraction of long-lived fission products, an understanding of the radiation chemistry of extractants is very important for the practical design of extractant systems. The radiation chemistry of crown ether systems proposed for use in the solvent extraction of the fission product 90Sr is reviewed, based on studies of the radiation stability and radiolysis mechanisms of crown ether systems. Finally, some challenges are suggested. (authors)

  4. DECISION LEVEL FUSION OF ORTHOPHOTO AND LIDAR DATA USING CONFUSION MATRIX INFORMATION FOR LAND COVER CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    S. Daneshtalab

    2017-09-01

    Full Text Available Automatic extraction of urban objects from airborne remote sensing data is essential to process and efficiently interpret the vast amount of airborne imagery and Lidar data available today. The aim of this study is to propose a new approach for the integration of high-resolution aerial imagery and Lidar data to improve classification accuracy in complex urban areas. In the proposed method, each dataset is first classified separately using the Support Vector Machine algorithm. The extracted normalized Digital Surface Model (nDSM) and pulse intensity are used for the classification of the LiDAR data, and the three visible spectral bands (red, green, blue) are taken as the feature vector for the orthoimage classification. Moreover, combining the extracted features of the image and the Lidar data, another classification is performed using all the features. The outputs of these classifications are integrated in a decision-level fusion system according to their confusion matrices to find the final classification result. The proposed method was evaluated using an urban area of Zeebruges, Belgium. The obtained results demonstrate several advantages of image fusion with respect to a single dataset. With the capabilities of the proposed decision-level fusion method, most of the object extraction difficulties and uncertainty were decreased, and the overall accuracy and kappa values were improved by 7% and 10%, respectively.
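
    A minimal sketch of the confusion-matrix-driven fusion idea follows. The weighting rule used here (each classifier's vote weighted by its per-class precision estimated from its confusion matrix) is one plausible reading of the approach, not necessarily the exact rule used in the paper, and the matrices are made up.

    ```python
    # Decision-level fusion of two classifiers, weighting each vote by that
    # classifier's per-class precision derived from its confusion matrix.
    import numpy as np

    def class_weights(cm: np.ndarray) -> np.ndarray:
        """Per-class precision from a confusion matrix (rows=true, cols=predicted)."""
        return np.diag(cm) / cm.sum(axis=0).clip(min=1)

    def fuse(pred_a, pred_b, cm_a, cm_b):
        w_a, w_b = class_weights(cm_a), class_weights(cm_b)
        # keep the label whose source classifier is more reliable for that label
        return np.array([a if w_a[a] >= w_b[b] else b for a, b in zip(pred_a, pred_b)])

    # Toy example with 3 land-cover classes (confusion matrices are hypothetical):
    cm_image = np.array([[80, 10, 10], [5, 70, 25], [10, 20, 70]])
    cm_lidar = np.array([[60, 20, 20], [5, 90, 5], [15, 5, 80]])
    print(fuse([0, 1, 2], [1, 1, 2], cm_image, cm_lidar))
    ```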

  5. The Hydrometallurgical Extraction and Recovery of High-Purity Silver

    Science.gov (United States)

    Hoffmann, James E.

    2012-06-01

    -bearing inputs, will be described in detail to demonstrate how typical chemical engineering unit process and unit operations have supplanted classic smelting and fire refining techniques. The Kennecott Copper Company, which has operated a hydrometallurgical circuit successfully for the recovery of high-purity silver from the slimes wet chlorination residue, has permitted me to provide some operation information and results using the technology. Both Phelps Dodge and Kennecott should be recognized for their forward-looking attitude in undertaking the conversion of conceptual chemistry into successful, full-scale plants. The process as employed at Phelps Dodge is discussed at length in reference (J.E. Hoffmann and B. Wesstrom: Hydrometallurgy, 1994, vol. 94, pp. 69-105).

  6. Hidden discriminative features extraction for supervised high-order time series modeling.

    Science.gov (United States)

    Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee

    2016-11-01

    In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named as Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) reduces dimensionality while robustly mining the underlying discriminative features, ii) results in effective interpretable features that lead to an improved classification and visualization, and iii) reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue issue at each alternation step. Two real third-order tensor-structures of time series datasets (an epilepsy electroencephalogram (EEG) that is modeled as channel×frequency bin×time frame and a microarray data that is modeled as gene×sample×time) were used for the evaluation of the TDFE. The experiment results corroborate the advantages of the proposed method with averages of 98.26% and 89.63% for the classification accuracies of the epilepsy dataset and the microarray dataset, respectively. These performance averages represent an improvement on those of the matrix-based algorithms and recent tensor-based, discriminant-decomposition approaches; this is especially the case considering the small number of samples that are used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
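
    For background, the multilinear projection underlying a Tucker-style decomposition can be sketched as a plain higher-order SVD; the example below is unsupervised and therefore omits the paper's between-/within-class scatter criterion, and the tensor sizes and ranks are arbitrary assumptions.

    ```python
    # Unsupervised HOSVD sketch: project a 3rd-order tensor onto the leading
    # left singular vectors of each mode unfolding.
    import numpy as np

    def unfold(T, mode):
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def hosvd(T, ranks):
        factors = []
        for mode, r in enumerate(ranks):
            U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
            factors.append(U[:, :r])              # mode-wise orthogonal basis
        core = T
        for mode, U in enumerate(factors):        # contract each mode with U^T
            core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
        return core, factors

    # Toy EEG-like tensor: 8 channels x 16 frequency bins x 32 time frames.
    T = np.random.randn(8, 16, 32)
    core, factors = hosvd(T, ranks=(4, 6, 8))
    print(core.shape)   # (4, 6, 8) core of reduced features
    ```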

  7. extraction of high quality dna from polysaccharides-secreting ...

    African Journals Online (AJOL)

    A DNA extraction method using CTAB was used for the isolation of genomic DNA from ten Xanthomonas campestris pathovars, ten isolates of Xanthomonas albilineans and one isolate of Pseudomonas rubrisubalbicans. High quality DNA was obtained that was ideal for molecular analyses. Extracellular polysaccharides ...

  8. Extraction of indirectly captured information for use in a comparison of offline pH measurement technologies.

    Science.gov (United States)

    Ritchie, Elspeth K; Martin, Elaine B; Racher, Andy; Jaques, Colin

    2017-06-10

    Understanding the causes of discrepancies in pH readings of a sample can allow more robust pH control strategies to be implemented. It was found that 59.4% of differences between two offline pH measurement technologies for an historical dataset lay outside an expected instrument error range of ±0.02 pH. A new variable, OsmoRes, was created using multiple linear regression (MLR) to extract information indirectly captured in the recorded measurements for osmolality. Principal component analysis and time series analysis were used to validate the expansion of the historical dataset with the new variable OsmoRes. MLR was used to identify variables strongly correlated (p<0.05) with differences in pH readings by the two offline pH measurement technologies. These included concentrations of specific chemicals (e.g. glucose) and OsmoRes, indicating culture medium and bolus feed additions as possible causes of discrepancies between the offline pH measurement technologies. Temperature was also identified as statistically significant. It is suggested that this was a result of differences in pH-temperature compensations employed by the pH measurement technologies. In summary, a method for extracting indirectly captured information has been demonstrated, and it has been shown that competing pH measurement technologies were not necessarily interchangeable at the desired level of control (±0.02 pH). Copyright © 2017 Elsevier B.V. All rights reserved.
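
    One plausible way to construct such a derived variable (an assumption about the general approach, not the authors' exact procedure or variable names) is to regress osmolality on the other recorded process measurements and keep the residual as the indirectly captured information.

    ```python
    # Sketch: build an "Osmo_Res"-style variable as the residual of osmolality
    # after a multiple linear regression on other recorded measurements.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 200
    glucose = rng.normal(5.0, 1.0, n)        # hypothetical recorded variables
    lactate = rng.normal(2.0, 0.5, n)
    feed_volume = rng.normal(10.0, 2.0, n)
    osmolality = 20 * glucose + 5 * feed_volume + rng.normal(0.0, 3.0, n)

    X = np.column_stack([glucose, lactate, feed_volume])
    model = LinearRegression().fit(X, osmolality)
    osmo_res = osmolality - model.predict(X)  # information not explained by X
    print(round(float(osmo_res.std()), 2))
    ```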

  9. Extraction of level density and γ strength function from primary γ spectra

    International Nuclear Information System (INIS)

    Schiller, A.; Bergholt, L.; Guttormsen, M.; Melby, E.; Rekstad, J.; Siem, S.

    2000-01-01

    We present a new iterative procedure to extract the level density and the γ strength function from primary γ spectra for energies close up to the neutron binding energy. The procedure is tested on simulated spectra and on data from the 173Yb(3He,α)172Yb reaction.

  10. High-performance scalable Information Service for the ATLAS experiment

    CERN Document Server

    Kolos, S; The ATLAS collaboration; Hauser, R

    2012-01-01

    The ATLAS experiment is operated by a highly distributed computing system which constantly produces a large amount of status information used to monitor the experiment's operational conditions as well as to assess the quality of the physics data being taken. For example, the ATLAS High Level Trigger (HLT) algorithms are executed on the online computing farm consisting of about 1500 nodes. Each HLT algorithm produces a few thousand histograms, which have to be integrated over the whole farm and carefully analyzed in order to properly tune the event rejection. In order to handle such non-physics data, the Information Service (IS) facility has been developed in the scope of the ATLAS TDAQ project. The IS provides a high-performance scalable solution for information exchange in a distributed environment. In the course of an ATLAS data-taking session the IS handles about a hundred gigabytes of information which is constantly updated, with the update interval varying from a second to a few tens of seconds. IS ...

  11. Estrogen hormone level of prepubertal female rat treated with Calliandra calothyrsus ethanolic leaf extract

    Science.gov (United States)

    Setyawati, I.; Wiratmini, N. I.; Narayani, I.

    2018-03-01

    This research examined the phytoestrogen potential of Calliandra calothyrsus leaf extract in prepubertal female rats (Rattus norvegicus). Sixty weaned female rats (21 days old) were divided into five groups: a control group (K), a negative control group given 0.5% Na-CMC suspension (KN), and treatment groups given C. calothyrsus ethanolic leaf extract at doses of 25 mg/kg bw (P1), 50 mg/kg bw (P2) and 75 mg/kg bw (P3). The treatment suspension was administered at 0.5 mL/rat/day by gavage for 28 days, starting at the age of 21 days. The rats were sacrificed and blood samples were collected from 4 rats per group at the ages of 28, 42 and 56 days. Estrogen hormone levels were measured from blood serum with an ELISA kit and read at a 450 nm wavelength with an ELISA spectrophotometer. Data were analyzed statistically by a General Linear Model at a 95% confidence level. The results showed that the rats' body weight decreased significantly with higher doses and longer treatment with C. calothyrsus leaf extract, due to the anti-nutritive activity of calliandra tannins. The estrogen hormone level was significantly increased at the highest dose. The highest estrogen levels were found in the group of female rats given the extract at 75 mg/kg bw until the age of 42 days. These results indicate a phytoestrogen potential of the C. calothyrsus leaf extract.

  12. Extraction of high quality DNA from seized Moroccan cannabis resin (Hashish).

    Directory of Open Access Journals (Sweden)

    Moulay Abdelaziz El Alaoui

    Full Text Available The extraction and purification of nucleic acids is the first step in most molecular biology analysis techniques. The objective of this work was to obtain highly purified nucleic acids from Cannabis sativa resin seizures in order to conduct a DNA typing method for the individualization of cannabis resin samples. To obtain highly purified nucleic acids from cannabis resin (hashish) free from contaminants that cause inhibition of the PCR reaction, we tested two protocols: the CTAB protocol of Wagner and a CTAB protocol described by Somma (2004) adapted for difficult matrices. We obtained high quality genomic DNA from 8 cannabis resin seizures using the adapted protocol. DNA extracted by the Wagner CTAB protocol failed to give polymerase chain reaction (PCR) amplification of the tetrahydrocannabinolic acid (THCA) synthase coding gene. However, the DNA extracted by the second protocol permits amplification of the THCA synthase coding gene using different sets of primers, as assessed by PCR. We describe here for the first time the possibility of DNA extraction from hashish resin derived from Cannabis sativa. This allows the use of DNA molecular tests under special forensic circumstances.

  13. Prunus mume leaf extract lowers blood glucose level in diabetic mice.

    Science.gov (United States)

    Lee, Min Woo; Kwon, Jung Eun; Lee, Young-Jong; Jeong, Yong Joon; Kim, Inhye; Cho, Young Mi; Kim, Yong-Min; Kang, Se Chan

    2016-10-01

    Context Diabetes is a common metabolic disease with long-term complications. Prunus mume Sieb. et Zucc. (Rosaceae) fruits have been shown to ameliorate glucose intolerance. However, the antidiabetic effects of P. mume leaves have not been investigated. Objective This study evaluated the effects of P. mume leaf 70% ethanol extract (PMLE) on alleviating diabetes in vivo and in vitro. Materials and methods PMLE was fractionated into n-hexane, dichloromethane (CH2Cl2), ethyl acetate (EtOAc), n-butanol (BuOH) and water. Polyphenol and flavonoid contents in PMLE fractions were determined using Folin-Ciocalteu reagent and the aluminium chloride colorimetric method, respectively. We evaluated α-glucosidase inhibition using a microplate reader at 400 nm. Adipocyte differentiation was measured by lipid accumulation using Nile Red staining. Male imprinting control region (ICR) mice were injected with streptozotocin (STZ, 100 mg/kg, i.p.). High-fat diets were provided for three weeks prior to PMLE treatments to induce type 2 diabetes. PMLE (0, 5, 25 or 50 mg/kg) was administered for four weeks with high-fat diets. Results The EtOAc fraction of PMLE inhibited α-glucosidase activity (IC50 = 68.2 μg/mL) and contained 883.5 ± 14.9 mg/g of polyphenols and 820.1 ± 7.7 mg/g of flavonoids. The 50 mg/kg PMLE supplement reduced blood glucose levels by 40% compared with untreated obese/diabetic mice. Obese/diabetic mice treated with 50 mg/kg PMLE showed a lower level of triacylglycerol (320.7 ± 20.73 mg/dL) compared with untreated obese/diabetic mice (494.9 ± 14.80 mg/dL). Conclusion The data demonstrate that P. mume leaves exert antidiabetic effects that may be attributable to high concentrations of polyphenols and flavonoids.

  14. Shadow Analysis Technique for Extraction of Building Height using High Resolution Satellite Single Image and Accuracy Assessment

    Science.gov (United States)

    Raju, P. L. N.; Chaudhary, H.; Jha, A. K.

    2014-11-01

    High resolution satellite data with metadata information are used to extract building heights from shadows. The proposed approach is divided into two phases: 1) rooftop and shadow extraction, and 2) height estimation. First, the rooftop and shadow regions were extracted by manual/automatic methods using Example-Based and Rule-Based approaches. After feature extraction, the next step is estimating the height of the building from the rooftop in association with its shadow, using the Ratio Method and the relation between sun and satellite geometry. The performance analysis shows a total mean height error of 0.67 m for the Ratio Method, 1.51 m for the Example-Based approach and 0.96 m for the Rule-Based approach. The analysis concluded that the Ratio Method, i.e. the manual method, is best for height estimation but is time consuming, so the automatic Rule-Based approach is preferable to the Example-Based approach, because the latter requires more knowledge and the selection of more training samples, and also slows the processing rate of the method.
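
    The geometry behind this kind of height estimation reduces, in the simplest flat-terrain, near-nadir case, to the standard shadow-length relation; the sketch below uses that simplification and illustrative numbers, whereas the paper also accounts for the full sun-satellite geometry from the image metadata.

    ```python
    # Building height from shadow length and solar elevation
    # (flat terrain, near-nadir view): H = L_shadow * tan(solar_elevation).
    import math

    def building_height(shadow_length_m: float, sun_elevation_deg: float) -> float:
        return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

    # Example: a 20 m shadow with the sun 35 degrees above the horizon.
    print(round(building_height(20.0, 35.0), 2))   # ~14.0 m
    ```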

  15. Extracting breathing rate information from a wearable reflectance pulse oximeter sensor.

    Science.gov (United States)

    Johnston, W S; Mendelson, Y

    2004-01-01

    The integration of multiple vital physiological measurements could help combat medics and field commanders to better predict a soldier's health condition and enhance their ability to perform remote triage procedures. In this paper we demonstrate the feasibility of extracting accurate breathing rate information from a photoplethysmographic signal that was recorded by a reflectance pulse oximeter sensor mounted on the forehead and subsequently processed by a simple time domain filtering and frequency domain Fourier analysis.
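
    A rough illustration of the frequency-domain step is given below on a synthetic signal: the photoplethysmogram is restricted to a plausible respiratory band and the dominant spectral peak is taken as the breathing rate. The sampling rate, band limits, and signal model are assumptions, not the authors' settings.

    ```python
    # Estimate breathing rate from a PPG-like signal: keep the spectrum within a
    # plausible respiratory band (0.1-0.5 Hz) and pick the dominant peak.
    import numpy as np

    fs = 50.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 60, 1 / fs)
    cardiac = np.sin(2 * np.pi * 1.2 * t)        # ~72 bpm pulse component
    resp = 0.3 * np.sin(2 * np.pi * 0.25 * t)    # ~15 breaths/min baseline wander
    ppg = cardiac + resp + 0.05 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(ppg - ppg.mean()))
    freqs = np.fft.rfftfreq(ppg.size, d=1 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    breathing_hz = freqs[band][np.argmax(spectrum[band])]
    print(f"estimated breathing rate: {breathing_hz * 60:.1f} breaths/min")
    ```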

  16. Separation of actinides and long-lived fission products from high-level radioactive wastes (a review)

    International Nuclear Information System (INIS)

    Kolarik, Z.

    1991-11-01

    The management of high-level radioactive wastes is facilitated if long-lived and radiotoxic actinides and fission products are separated before final disposal. Especially important is the separation of americium, curium, plutonium, neptunium, strontium, cesium and technetium. The separated nuclides can be deposited separately from the bulk of the high-level waste, but their transmutation to short-lived nuclides is a much more favourable option. This report reviews the chemistry of the separation of actinides and fission products from radioactive wastes. The composition, nature and conditioning of the wastes are described. The main attention is paid to the solvent extraction chemistry of the elements and to the application of solvent extraction in unit operations of potential partitioning processes. Also reviewed is the behaviour of the elements in ion exchange chromatography, precipitation, electrolysis from aqueous solutions and melts, and the distribution between molten salts and metals. Flowsheets of selected partitioning processes are shown and general aspects of waste partitioning are shortly discussed. (orig.)

  17. High-level manpower movement and Japan's foreign aid.

    Science.gov (United States)

    Furuya, K

    1992-01-01

    "Japan's technical assistance programs to Asian countries are summarized. Movements of high-level manpower accompanying direct foreign investments by private enterprise are also reviewed. Proposals for increased human resources development include education and training of foreigners in Japan as well as the training of Japanese aid experts and the development of networks for information exchange." excerpt

  18. Development of a geoscience database for preselecting China's high level radioactive waste disposal sites

    International Nuclear Information System (INIS)

    Li Jun; Fan Ai; Huang Shutao; Wang Ju

    1998-01-01

    Taking the development of a geoscience database for China's high level waste disposal sites (Yumen Town, Gansu Province, northwest China) as an example, the author introduces in detail the application of Geographical Information Systems (GIS) to high level waste disposal and analyses their application prospects in other fields. The development of GIS provides brand-new thinking for administrators and technicians at all levels. At the same time, the author also introduces the administration of maps and materials using a Geographical Information System.

  19. Extraction of three bioactive diterpenoids from Andrographis paniculata: effect of the extraction techniques on extract composition and quantification of three andrographolides using high-performance liquid chromatography.

    Science.gov (United States)

    Kumar, Satyanshu; Dhanani, Tushar; Shah, Sonal

    2014-10-01

    Andrographis paniculata (Burm.f.) wall.ex Nees (Acanthaceae) or Kalmegh is an important medicinal plant finding uses in many Ayurvedic formulations. Diterpenoid compounds andrographolides (APs) are the main bioactive phytochemicals present in leaves and herbage of A. paniculata. The efficiency of supercritical fluid extraction (SFE) using carbon dioxide was compared with the solid-liquid extraction techniques such as solvent extraction, ultrasound-assisted solvent extraction and microwave-assisted solvent extraction with methanol, water and methanol-water as solvents. Also a rapid and validated reverse-phase high-performance liquid chromatography-diode array detection method was developed for the simultaneous determination of the three biologically active compounds, AP, neoandrographolide and andrograpanin, in the extracts of A. paniculata. Under the best SFE conditions tested for diterpenoids, which involved extraction at 60°C and 100 bar, the extractive efficiencies were 132 and 22 µg/g for AP and neoandrographolide, respectively. The modifier percentage significantly affected the extraction efficiency. © The Author [2013]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Information Superiority and Game Theory: The Value of Varying Levels of Information

    National Research Council Canada - National Science Library

    McIntosh, Gary

    2002-01-01

    .... This thesis examines how various levels of information and information superiority affect strategy choices and decision-making in determining the payoff value for opposing forces in a classic zero-sum two-sided contest...

  1. Triple aldose reductase/α-glucosidase/radical scavenging high-resolution profiling combined with high-performance liquid chromatography – high-resolution mass spectrometry – solid-phase extraction – nuclear magnetic resonance spectroscopy for identification of antidiabetic constituents in crude extract of Radix Scutellariae

    DEFF Research Database (Denmark)

    Tahtah, Yousof; Kongstad, Kenneth Thermann; Wubshet, Sileshi Gizachew

    2015-01-01

    ...aldose reductase/α-glucosidase/radical scavenging high-resolution inhibition profile - allowing proof of concept with Radix Scutellariae crude extract as a polypharmacological herbal drug. The triple bioactivity high-resolution profiles were used to pinpoint bioactive compounds, and subsequent structure elucidation was performed with hyphenated high-performance liquid chromatography – high-resolution mass spectrometry – solid-phase extraction – nuclear magnetic resonance spectroscopy. The only α-glucosidase inhibitor was baicalein, whereas main aldose reductase inhibitors in the crude extract were baicalein and skullcapflavone II, and main...

  2. Solvent extraction of cerium (III) with high molecular weight amines

    International Nuclear Information System (INIS)

    Chatterjee, A.; Basu, S.

    1992-01-01

    The use of high molecular weight amines in the extraction of cerium(III) as an EDTA complex from neutral aqueous medium is reported. The extraction conditions were optimised from a study of the effects of several variables, such as the concentrations of amine and EDTA, pH, and the nature of the diluents. The method has been applied to the determination of cerium in a few mineral samples. (author). 7 refs., 5 tabs.

  3. Effects of Fortunella margarita fruit extract on metabolic disorders in high-fat diet-induced obese C57BL/6 mice.

    Science.gov (United States)

    Tan, Si; Li, Mingxia; Ding, Xiaobo; Fan, Shengjie; Guo, Lu; Gu, Ming; Zhang, Yu; Feng, Li; Jiang, Dong; Li, Yiming; Xi, Wanpeng; Huang, Cheng; Zhou, Zhiqin

    2014-01-01

    Obesity is a nutritional disorder associated with many health problems such as dyslipidemia, type 2 diabetes and cardiovascular diseases. In the present study, we investigated the anti-metabolic disorder effects of kumquat (Fortunella margarita Swingle) fruit extract (FME) on high-fat diet-induced C57BL/6 obese mice. The kumquat fruit was extracted with ethanol and the main flavonoids of this extract were analyzed by HPLC. For the preventive experiment, female C57BL/6 mice were fed with a normal diet (Chow), high-fat diet (HF), and high-fat diet with 1% (w/w) extract of kumquat (HF+FME) for 8 weeks. For the therapeutic experiment, female C57BL/6 mice were fed with high-fat diet for 3 months to induce obesity. Then the obese mice were divided into two groups randomly, and fed with HF or HF+FME for another 2 weeks. Body weight and daily food intake amounts were recorded. Fasting blood glucose, glucose tolerance test, insulin tolerance test, serum and liver lipid levels were assayed and the white adipose tissues were imaged. The gene expression in mice liver and brown adipose tissues were analyzed with a quantitative PCR assay. In the preventive treatment, FME controlled the body weight gain and the size of white adipocytes, lowered the fasting blood glucose, serum total cholesterol (TC), serum low density lipoprotein cholesterol (LDL-c) levels as well as liver lipid contents in high-fat diet-fed C57BL/6 mice. In the therapeutic treatment, FME decreased the serum triglyceride (TG), serum TC, serum LDL-c, fasting blood glucose levels and liver lipid contents, improved glucose tolerance and insulin tolerance. Compared with the HF group, FME significantly increased the mRNA expression of PPARα and its target genes. Our study suggests that FME may be a potential dietary supplement for preventing and ameliorating the obesity and obesity-related metabolic disturbances.

  4. Effects of Fortunella margarita fruit extract on metabolic disorders in high-fat diet-induced obese C57BL/6 mice.

    Directory of Open Access Journals (Sweden)

    Si Tan

    Full Text Available INTRODUCTION: Obesity is a nutritional disorder associated with many health problems such as dyslipidemia, type 2 diabetes and cardiovascular diseases. In the present study, we investigated the anti-metabolic disorder effects of kumquat (Fortunella margarita Swingle) fruit extract (FME) on high-fat diet-induced C57BL/6 obese mice. METHODS: The kumquat fruit was extracted with ethanol and the main flavonoids of this extract were analyzed by HPLC. For the preventive experiment, female C57BL/6 mice were fed with a normal diet (Chow), high-fat diet (HF), and high-fat diet with 1% (w/w) extract of kumquat (HF+FME) for 8 weeks. For the therapeutic experiment, female C57BL/6 mice were fed with high-fat diet for 3 months to induce obesity. Then the obese mice were divided into two groups randomly, and fed with HF or HF+FME for another 2 weeks. Body weight and daily food intake amounts were recorded. Fasting blood glucose, glucose tolerance test, insulin tolerance test, serum and liver lipid levels were assayed and the white adipose tissues were imaged. The gene expression in mice liver and brown adipose tissues were analyzed with a quantitative PCR assay. RESULTS: In the preventive treatment, FME controlled the body weight gain and the size of white adipocytes, lowered the fasting blood glucose, serum total cholesterol (TC), serum low density lipoprotein cholesterol (LDL-c) levels as well as liver lipid contents in high-fat diet-fed C57BL/6 mice. In the therapeutic treatment, FME decreased the serum triglyceride (TG), serum TC, serum LDL-c, fasting blood glucose levels and liver lipid contents, improved glucose tolerance and insulin tolerance. Compared with the HF group, FME significantly increased the mRNA expression of PPARα and its target genes. CONCLUSION: Our study suggests that FME may be a potential dietary supplement for preventing and ameliorating the obesity and obesity-related metabolic disturbances.

  5. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    Science.gov (United States)

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

    Marine microalga, Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with high-pressure homogenization pretreatment process at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid can be obtained by following a conventional lipid extraction procedure using the solvent, chloroform : methanol (2 : 1, v/v). Present approach requires 30 min process time and a moderate working temperature of 35°C as compared to the conventional extraction method which usually requires >5 hrs and 65°C temperature. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during initial periods of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs by following first-order kinetics. Confocal and scanning electron microscopy revealed altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily destruct the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270
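
    For reference, the first- and second-order extraction kinetics mentioned above are commonly written as follows (a standard formulation from the solid-liquid extraction literature, not equations quoted from this paper), where C_t is the lipid concentration extracted at time t, C_s the saturation concentration, and k_1, k_2 the rate constants:

    ```latex
    % First-order model: rate proportional to the remaining extractable lipid.
    \frac{dC_t}{dt} = k_1 (C_s - C_t)
    \quad\Longrightarrow\quad
    C_t = C_s \left(1 - e^{-k_1 t}\right)

    % Second-order model: most of the lipid is released in the initial period.
    \frac{dC_t}{dt} = k_2 (C_s - C_t)^2
    \quad\Longrightarrow\quad
    C_t = \frac{C_s^2 k_2 t}{1 + C_s k_2 t}
    ```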

  6. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    Directory of Open Access Journals (Sweden)

    Seok-Cheol Cho

    2012-01-01

    Full Text Available Marine microalga, Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with high-pressure homogenization pretreatment process at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid can be obtained by following a conventional lipid extraction procedure using the solvent, chloroform : methanol (2 : 1, v/v). Present approach requires 30 min process time and a moderate working temperature of 35°C as compared to the conventional extraction method which usually requires >5 hrs and 65°C temperature. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during initial periods of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted for >5 hrs by following first-order kinetics. Confocal and scanning electron microscopy revealed altered texture of algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily destruct the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production.
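
    The kinetic contrast described above can be reproduced numerically. The following is a minimal sketch (Python, with invented yield data rather than the paper's measurements) that fits the standard integrated first-order and second-order extraction rate laws to a yield-versus-time series and compares the residuals; whichever model gives the lower residual sum of squares better describes the extraction.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical time/yield data (minutes, % lipid); not the paper's measurements.
        t = np.array([5.0, 10.0, 20.0, 30.0, 60.0, 120.0, 300.0])
        y = np.array([9.0, 14.5, 20.0, 23.0, 24.0, 24.5, 24.9])

        def first_order(t, y_eq, k1):
            # Integrated first-order law: slow, continuous approach to the equilibrium yield.
            return y_eq * (1.0 - np.exp(-k1 * t))

        def second_order(t, y_eq, k2):
            # Integrated second-order law: most of the yield appears early in the extraction.
            return (y_eq ** 2 * k2 * t) / (1.0 + y_eq * k2 * t)

        for name, model, p0 in [("first-order", first_order, (25.0, 0.05)),
                                ("second-order", second_order, (25.0, 0.01))]:
            params, _ = curve_fit(model, t, y, p0=p0)
            rss = float(np.sum((y - model(t, *params)) ** 2))
            print(f"{name}: y_eq = {params[0]:.1f} %, k = {params[1]:.4f}, RSS = {rss:.3f}")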

  7. Selective extraction of cesium: from compound to process

    International Nuclear Information System (INIS)

    Simon, N.; Eymard, S.; Tournois, B.; Dozol, J.F.

    2000-01-01

    Under the French law of 30 December 1991 on nuclear waste management, research is conducted to recover long-lived fission products from high-level radioactive effluents generated by spent fuel reprocessing, in order to destroy them by transmutation or encapsulate them in specific matrices. Cesium extraction with mono and bis-crown calix(4)arenes (Frame 1) is a candidate for process development. These extractants remove cesium from highly acidic or basic pH media even with high salinity. A real raffinate was treated in 1994 in a hot cell to extract cesium with a calix-crown extractant. The success of this one batch experiment confirmed the feasibility of cesium decontamination from high-level liquid waste. It was then decided to develop a process flowchart to extract cesium selectively from high-level raffinate, to be included in the general scheme of long-lived radionuclide partitioning. It was accordingly decided to develop a process based on liquid-liquid extraction and hence optimize a calixarene/diluent solvent according to: - hydraulic properties: density, viscosity, interfacial tension, - chemical criteria: sufficient cesium extraction (depending on the diluent), kinetics, third phase elimination... New mono-crown-calixarenes branched with long aliphatic groups (Frame 2) were designed to be soluble in aliphatic diluents. To prevent third phase formation associated with nitric acid extraction, the addition of modifiers (alcohol, phosphate and amide) in the organic phase was tested (Frame 3). Table 1 shows examples of calixarene/diluent systems suitable for a process flowchart, and Figure 2 provides data on cesium extraction with these new systems. Alongside these improvements, a system based on a modified 1,3-di(n-octyl-oxy)2,4-calix[4]arene crown and a modified diluent was also developed, considering a mixed TPH/NPHE system as the diluent, where TPH (hydrogenated tetra propylene) is a common aliphatic industrial solvent and NPHE is nitrophenyl

  8. Information Fusion for High Level Situation Assessment and Prediction

    National Research Council Canada - National Science Library

    Ji, Qiang

    2007-01-01

    .... In addition, we developed algorithms for performing active information fusion to improve both fusion accuracy and efficiency so that decision making and situation assessment can be made in a timely and efficient manner...

  9. Effects of Ferulago angulata Extract on Serum Lipids and Lipid Peroxidation

    Directory of Open Access Journals (Sweden)

    Mahmoud Rafieian-kopaei

    2014-01-01

    Full Text Available Background. Nowadays, herbs are considered to be the main source of effective drugs for lowering serum lipids and lipid peroxidation. The present experimental animal study aimed to assess the impact of Ferulago angulata on serum lipid profiles, and on levels of lipid peroxidation. Methods. Fifty male Wistar rats, weighing 250–300 g, were randomly divided into five equal groups (ten rats in each). The rat groups received different diets as follows: Group I: fat-rich diet; Group II: fat-rich diet plus hydroalcoholic extracts of Ferulago angulata at a dose of 400 mg/kg; Group III: fat-rich diet plus hydroalcoholic extracts of Ferulago angulata at a dose of 600 mg/kg; Group IV: fat-rich diet plus atorvastatin; Group V: common stock diet. The levels of serum glucose and lipids and the atherogenic index were measured. In addition, malondialdehyde (MDA), thiol oxidation, carbonyl concentrations, C-reactive proteins, and antioxidant capacity were evaluated in each group of rats. Results. Interestingly, by adding a hydroalcoholic extract of Ferulago angulata to the high-fat diet, the levels of total cholesterol and low-density lipoproteins (LDL) in the high-fat diet rats were both significantly reduced. This effect was considerably greater than when atorvastatin was added as an antilipid drug. The beneficial effects of the Ferulago angulata extract on lowering the level of triglycerides were observed only when a high dosage of this plant extract was added to a high-fat diet. Furthermore, the level of malondialdehyde was significantly affected by the use of the plant extract in a high-fat diet, compared with a normal regimen or high-fat diet alone. Conclusion. Administration of a hydroalcoholic extract of Ferulago angulata can reduce serum levels of total cholesterol, triglycerides, and LDL. It can also inhibit lipid peroxidation.

  10. Influence of the extraction method and storage time on the physicochemical properties and carotenoid levels of pequi (Caryocar brasiliense Camb.) oil

    Directory of Open Access Journals (Sweden)

    Milton Cosme Ribeiro

    2012-06-01

    Full Text Available The objective of this study was to analyze the physicochemical properties and carotenoid levels of pequi oil obtained by different extraction methods and to evaluate the preservation of these properties and pigments during storage time. The pequi oil was obtained by solvent extraction, mechanical extraction, and hot water flotation. It was stored for over 180 days in an amber bottle at ambient conditions. Analyses for the determination of the acidity, peroxide, saponification and iodine values, coloration, total carotenoids, and β-carotene levels were conducted. The oil extraction with solvents produced the best yield and carotenoid levels. The oil obtained by mechanical extraction presented higher acidity (5.44 mg KOH.g-1) and peroxide values (1.07 mEq.kg-1). During the storage of pequi oil, there was an increase in the acidity and the peroxide values, darkening of the oil coloration, and a reduction of the carotenoid levels. Mechanical extraction is the least advantageous method for the conservation of the physicochemical properties and carotenoid levels in pequi oil.

  11. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information and aid in the organizing and integrating of it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information extraction. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  12. Systematically extracting metal- and solvent-related occupational information from free-text responses to lifetime occupational history questionnaires.

    Science.gov (United States)

    Friesen, Melissa C; Locke, Sarah J; Tornow, Carina; Chen, Yu-Cheng; Koh, Dong-Hee; Stewart, Patricia A; Purdue, Mark; Colt, Joanne S

    2014-06-01

    Lifetime occupational history (OH) questionnaires often use open-ended questions to capture detailed information about study participants' jobs. Exposure assessors use this information, along with responses to job- and industry-specific questionnaires, to assign exposure estimates on a job-by-job basis. An alternative approach is to use information from the OH responses and the job- and industry-specific questionnaires to develop programmable decision rules for assigning exposures. As a first step in this process, we developed a systematic approach to extract the free-text OH responses and convert them into standardized variables that represented exposure scenarios. Our study population comprised 2408 subjects, reporting 11991 jobs, from a case-control study of renal cell carcinoma. Each subject completed a lifetime OH questionnaire that included verbatim responses, for each job, to open-ended questions including job title, main tasks and activities (task), tools and equipment used (tools), and chemicals and materials handled (chemicals). Based on a review of the literature, we identified exposure scenarios (occupations, industries, tasks/tools/chemicals) expected to involve possible exposure to chlorinated solvents, trichloroethylene (TCE) in particular, lead, and cadmium. We then used a SAS macro to review the information reported by study participants to identify jobs associated with each exposure scenario; this was done using previously coded standardized occupation and industry classification codes, and a priori lists of associated key words and phrases related to possibly exposed tasks, tools, and chemicals. Exposure variables representing the occupation, industry, and task/tool/chemicals exposure scenarios were added to the work history records of the study respondents. Our identification of possibly TCE-exposed scenarios in the OH responses was compared to an expert's independently assigned probability ratings to evaluate whether we missed identifying
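
    The decision rules described above were implemented by the authors as a SAS macro over coded occupation/industry fields and a priori keyword lists. Purely as an illustration of that matching step, the sketch below (Python, with hypothetical keyword lists and a made-up job record) flags a job when any of its free-text fields matches a pattern for a trichloroethylene (TCE) exposure scenario.

        import re

        # Hypothetical keyword lists for one exposure scenario (the study derived its
        # lists from a literature review; these are illustrative only).
        TCE_PATTERNS = {
            "task": [r"\bdegreas\w*", r"\bmetal clean\w*"],
            "tools": [r"\bdegreaser\b", r"\bsolvent tank\b"],
            "chemicals": [r"\btrichloroethylene\b", r"\btce\b", r"\bchlorinated solvent\w*"],
        }

        def flag_tce_job(job):
            """Return True if any free-text field matches any TCE-scenario pattern."""
            for field, patterns in TCE_PATTERNS.items():
                text = job.get(field, "").lower()
                if any(re.search(p, text) for p in patterns):
                    return True
            return False

        job = {"task": "vapor degreasing of machined parts",
               "tools": "solvent tank, rags",
               "chemicals": "TCE, cutting oil"}
        print(flag_tce_job(job))  # True: the record would be assigned this exposure scenario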

  13. The Feature Extraction Based on Texture Image Information for Emotion Sensing in Speech

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2014-09-01

    Full Text Available In this paper, we present a novel texture image feature for Emotion Sensing in Speech (ESS). This idea is based on the fact that texture images carry emotion-related information. The feature extraction is derived from the time-frequency representation of spectrogram images. First, we transform the spectrogram into a recognizable image. Next, we use a cubic curve to enhance the image contrast. Then, the texture image information (TII) derived from the spectrogram image can be extracted by using Laws' masks to characterize emotional state. In order to evaluate the effectiveness of the proposed emotion recognition in different languages, we use two open emotional databases, the Berlin Emotional Speech Database (EMO-DB) and the eNTERFACE corpus, and one self-recorded database (KHUSC-EmoDB), to evaluate the performance cross-corpora. The results of the proposed ESS system are presented using a support vector machine (SVM) as a classifier. Experimental results show that the proposed TII-based feature extraction inspired by visual perception can provide significant classification for ESS systems. The two-dimensional (2-D) TII feature can provide discrimination between different emotions in visual expressions beyond the conventional pitch and formant tracks. In addition, de-noising in 2-D images can be completed more easily than de-noising in 1-D speech.
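
    As a rough, non-authoritative illustration of the texture step, the sketch below (Python; a random array stands in for a real spectrogram image) builds the sixteen 5x5 Laws masks from the standard 1-D level/edge/spot/ripple vectors, filters the image with each mask, and averages the local texture energy into a feature vector. The cubic-curve contrast enhancement and the SVM classification stages of the paper are omitted.

        import numpy as np
        from scipy.signal import convolve2d
        from scipy.ndimage import uniform_filter

        # 1-D Laws vectors: Level, Edge, Spot, Ripple.
        L5 = np.array([ 1.0,  4.0, 6.0,  4.0,  1.0])
        E5 = np.array([-1.0, -2.0, 0.0,  2.0,  1.0])
        S5 = np.array([-1.0,  0.0, 2.0,  0.0, -1.0])
        R5 = np.array([ 1.0, -4.0, 6.0, -4.0,  1.0])

        def laws_texture_features(image, window=15):
            """Mean texture energy for each 2-D Laws mask (outer products of 1-D vectors)."""
            image = image - image.mean()  # remove the overall brightness offset
            features = []
            for a in (L5, E5, S5, R5):
                for b in (L5, E5, S5, R5):
                    mask = np.outer(a, b)
                    filtered = convolve2d(image, mask, mode="same", boundary="symm")
                    energy = uniform_filter(np.abs(filtered), size=window)
                    features.append(float(energy.mean()))
            return features

        spectrogram = np.random.rand(128, 256)  # stand-in for a real spectrogram image
        print(len(laws_texture_features(spectrogram)))  # 16 texture-energy features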

  14. High linear energy transfer degradation studies simulating alpha radiolysis of TRU solvent extraction processes

    Energy Technology Data Exchange (ETDEWEB)

    Pearson, Jeremy [Department of Chemical Engineering and Materials Science - University of California Irvine, 916 Engineering Tower, Irvine, CA, 92697 (United States); Miller, George [Department of Chemistry- University of California Irvine, 2046D PS II, Irvine, CA, 92697 (United States); Nilsson, Mikael [Department of Chemical Engineering and Materials Science - University of California Irvine, 916 Engineering Tower, Irvine, CA, 92697 (United States)

    2013-07-01

    Treatment of used nuclear fuel through solvent extraction separation processes is hindered by radiolytic damage from radioactive isotopes present in used fuel. The nature of the damage caused by the radiation may depend on the radiation type, whether it be low linear energy transfer (LET) such as gamma radiation or high LET such as alpha radiation. Used nuclear fuel contains beta/gamma emitting isotopes but also a significant amount of transuranics which are generally alpha emitters. Studying the respective effects on matter of both of these types of radiation will allow for accurate prediction and modeling of process performance losses with respect to dose. Current studies show that alpha radiation has milder effects than gamma radiation. This is important to know because it means that solvent extraction solutions exposed to alpha radiation may last longer than expected and need less repair and replacement. These models are important for creating robust, predictable, and economical processes that have strong potential for mainstream adoption on the commercial level. The effects of gamma radiation on solvent extraction ligands have been more extensively studied than the effects of alpha radiation. This is due to the inherent difficulty in producing a sufficient and confluent dose of alpha particles within a sample without leaving the sample contaminated with long lived radioactive isotopes. Helium ion beam and radioactive isotope sources have been studied in the literature. We have developed a method for studying the effects of high LET radiation in situ via 10B activation and the high LET particles that result from the 10B(n,α)7Li reaction which follows. Our model for dose involves solving a partial differential equation representing absorption by 10B of an isotropic field of neutrons penetrating a sample. This method has been applied to organic solutions of TBP and CMPO, two ligands common in TRU solvent extraction treatment processes. Rates

  15. Optimized extraction conditions from high power-ECRIS by dedicated dielectric structures

    International Nuclear Information System (INIS)

    Schachter, L.; Dobrescu, S.; Stiebing, K.E.

    2012-01-01

    The MD-method of enhancing the ion output from ECR ion sources is well established and basically works via two mechanisms, the regenerative injection of cold electrons from an emissive dielectric layer on the plasma chamber walls and via the cutting of compensating wall currents, which results in an improved ion extraction from the plasma. As this extraction from the plasma becomes a more and more challenging issue for modern ECRIS installations with high microwave power input, a series of experiments was carried out at the 14 GHz ECRIS of the Institut fuer Kernphysik in Frankfurt/Main, Germany (IKF). In contrast to our earlier work, in these experiments emphasis was put on the second of the above mechanisms namely to influence the sheath potential at the extraction by structures with special dielectric properties. Two different types of dielectric structures, Tantalum-oxide and Aluminium oxide (the latter also being used for the MD-method) with dramatically different electrical properties were mounted on the extraction electrode of the IKF-ECRIS, facing the plasma. For both structures an increase of the extracted ion beam currents for middle and high charge states by 60-80 % was observed. The method can also be applied to other ECR ion sources for increasing the extracted ion beam performances. The paper is followed by the slides of the presentation. (authors)

  16. Research on Methods of High Coherent Target Extraction in Urban Area Based on Psinsar Technology

    Science.gov (United States)

    Li, N.; Wu, J.

    2018-04-01

    PSInSAR technology has been widely applied in ground deformation monitoring. Accurate identification of Persistent Scatterers (PS) is key to the success of PSInSAR data processing. In this paper, the theoretic models and specific algorithms of PS point extraction methods are summarized and the characteristics and applicable conditions of each method, such as Coherence Coefficient Threshold method, Amplitude Threshold method, Dispersion of Amplitude method, Dispersion of Intensity method, are analyzed. Based on the merits and demerits of different methods, an improved method for PS point extraction in urban area is proposed, that uses simultaneously backscattering characteristic, amplitude and phase stability to find PS point in all pixels. Shanghai city is chosen as an example area for checking the improvements of the new method. The results show that the PS points extracted by the new method have high quality, high stability and meet the strong scattering characteristics. Based on these high quality PS points, the deformation rate along the line-of-sight (LOS) in the central urban area of Shanghai is obtained by using 35 COSMO-SkyMed X-band SAR images acquired from 2008 to 2010 and it varies from -14.6 mm/year to 4.9 mm/year. There is a large sedimentation funnel in the cross boundary of Hongkou and Yangpu district with a maximum sedimentation rate of more than 14 mm per year. The obtained ground subsidence rates are also compared with the result of spirit leveling and show good consistent. Our new method for PS point extraction is more reasonable, and can improve the accuracy of the obtained deformation results.
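
    Of the PS selection criteria listed above, the amplitude dispersion index is the simplest to sketch. The fragment below (Python, with simulated amplitudes rather than real COSMO-SkyMed data) flags pixels whose amplitude dispersion D_A = sigma/mean over the image stack falls below a commonly used threshold of about 0.25; the proposed method additionally combines backscattering and phase-stability criteria, which are not reproduced here.

        import numpy as np

        def ps_candidates(amplitude_stack, threshold=0.25):
            """amplitude_stack: array of shape (n_images, rows, cols) with SAR amplitudes.
            Returns a boolean mask of pixels whose amplitude dispersion is below the threshold."""
            mean_amp = amplitude_stack.mean(axis=0)
            std_amp = amplitude_stack.std(axis=0)
            dispersion = np.divide(std_amp, mean_amp,
                                   out=np.full_like(mean_amp, np.inf), where=mean_amp > 0)
            return dispersion < threshold

        stack = np.random.rayleigh(scale=1.0, size=(35, 100, 100))  # 35 simulated scenes
        mask = ps_candidates(stack)
        print(int(mask.sum()), "candidate PS pixels")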

  17. Application of mercapto-silica polymerized high internal phase emulsions for the solid-phase extraction and preconcentration of trace lead(II).

    Science.gov (United States)

    Su, Rihui; Ruan, Guihua; Chen, Zhengyi; Du, Fuyou; Li, Jianping

    2015-12-01

    A new class of solid-phase extraction column prepared with grafted mercapto-silica polymerized high internal phase emulsion particles was used for the preconcentration of trace lead. First, mercapto-silica polymerized high internal phase emulsion particles were synthesized by using high internal phase emulsion polymerization and carefully assembled in a polyethylene syringe column. The influences of various parameters including adsorption pH value, adsorption and desorption solvents, flow rate of the adsorption and desorption procedure were optimized, respectively, and the suitable uploading sample volumes, adsorption capacity, and reusability of solid phase extraction column were also investigated. Under the optimum conditions, Pb(2+) could be preconcentrated quantitatively over a wide pH range (2.0-5.0). In the presence of foreign ions, such as Na(+) , K(+) , Ca(2+) , Zn(2+) , Mg(2+) , Cu(2+) , Fe(2+) , Cd(2+) , Cl(-) and NO3 (-) , Pb(2+) could be recovered successfully. The prepared solid-phase extraction column performed with high stability and desirable durability, which allowed more than 100 replicate extractions without measurable changes of performance. The feasibility of the developed method was further validated by the extraction of Pb(2+) in rice samples. At three spiked levels of 40.0, 200 and 800 μg/kg, the average recoveries for Pb(2+) in rice samples ranged from 87.3 to 105.2%. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Feature extraction from high resolution satellite imagery as an input to the development and rapid update of a METRANS geographic information system (GIS).

    Science.gov (United States)

    2011-06-01

    This report describes an accuracy assessment of extracted features derived from three subsets of a Quickbird pan-sharpened high resolution satellite image for the area of the Port of Los Angeles, CA. Visual Learning Systems Feature Analyst and D...

  19. Advanced integrated solvent extraction and ion exchange systems

    International Nuclear Information System (INIS)

    Horwitz, P.

    1996-01-01

    Advanced integrated solvent extraction (SX) and ion exchange (IX) systems are a series of novel SX and IX processes that extract and recover uranium and transuranics (TRUs) (neptunium, plutonium, americium) and fission products 90Sr, 99Tc, and 137Cs from acidic high-level liquid waste and that sorb and recover 90Sr, 99Tc, and 137Cs from alkaline supernatant high-level waste. Each system is based on the use of new selective liquid extractants or chromatographic materials. The purpose of the integrated SX and IX processes is to minimize the quantity of waste that must be vitrified and buried in a deep geologic repository by producing raffinates (from SX) and effluent streams (from IX) that will meet the specifications of Class A low-level waste.

  20. About increasing informativity of diagnostic system of asynchronous electric motor by extracting additional information from values of consumed current parameter

    Science.gov (United States)

    Zhukovskiy, Y.; Korolev, N.; Koteleva, N.

    2018-05-01

    This article is devoted to expanding the possibilities of assessing the technical state of asynchronous electric drives from their current consumption, as well as increasing the information capacity of diagnostic methods, under conditions of limited access to equipment and incomplete information. The method of spectral analysis of the electric drive current can be supplemented by an analysis of the components of the Park's vector of the current. The evolution of the hodograph at the moment of appearance and during the development of defects was studied using the example of current asymmetry in the phases of an induction motor. The result of the study is a set of new diagnostic parameters for the asynchronous electric drive. During the research, it was shown that the proposed diagnostic parameters allow the type and level of the defect to be determined. At the same time, there is no need to stop the equipment and take it out of service for repair. Modern digital control and monitoring systems can use the proposed parameters based on the stator current of an electrical machine to improve the accuracy and reliability of obtaining diagnostic patterns and predicting their changes in order to improve equipment maintenance systems. This approach can also be used in systems and objects where there are significant parasitic vibrations and unsteady loads. Useful information can be extracted in electric drive systems whose structure includes a power electric converter.
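
    A minimal numerical sketch of the Park's vector construction referred to above is given below (Python, with synthetic phase currents; the 10 % amplitude asymmetry is an assumption introduced only to make the effect visible). A balanced machine traces a circular hodograph of the (i_d, i_q) components, while phase asymmetry distorts it into an ellipse, which is the kind of signature the proposed diagnostic parameters build on.

        import numpy as np

        def park_vector(ia, ib, ic):
            """Park's vector components from three-phase stator currents."""
            i_d = np.sqrt(2.0 / 3.0) * ia - (1.0 / np.sqrt(6.0)) * ib - (1.0 / np.sqrt(6.0)) * ic
            i_q = (1.0 / np.sqrt(2.0)) * (ib - ic)
            return i_d, i_q

        t = np.linspace(0.0, 0.1, 1000)                          # 0.1 s of a 50 Hz supply
        ia = np.cos(2 * np.pi * 50 * t)
        ib = 0.9 * np.cos(2 * np.pi * 50 * t - 2 * np.pi / 3)    # assumed 10 % phase asymmetry
        ic = np.cos(2 * np.pi * 50 * t + 2 * np.pi / 3)
        i_d, i_q = park_vector(ia, ib, ic)
        magnitude = i_d ** 2 + i_q ** 2
        print(f"hodograph distortion ratio: {magnitude.max() / magnitude.min():.2f}")  # ~1.0 if healthy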

  1. Vegetation extraction from high-resolution satellite imagery using the Normalized Difference Vegetation Index (NDVI)

    Science.gov (United States)

    AlShamsi, Meera R.

    2016-10-01

    Over the past years, there has been various urban development all over the UAE. Dubai is one of the cities that experienced rapid growth in both development and population. That growth can have a negative effect on the surrounding environment. Hence, there has been a necessity to protect the environment from these fast pace changes. One of the major impacts this growth can have is on vegetation. As technology is evolving day by day, there is a possibility to monitor changes that are happening on different areas in the world using satellite imagery. The data from these imageries can be utilized to identify vegetation in different areas of an image through a process called vegetation detection. Being able to detect and monitor vegetation is very beneficial for municipal planning and management, and environment authorities. Through this, analysts can monitor vegetation growth in various areas and analyze these changes. By utilizing satellite imagery with the necessary data, different types of vegetation can be studied and analyzed, such as parks, farms, and artificial grass in sports fields. In this paper, vegetation features are detected and extracted through SAFIY system (i.e. the Smart Application for Feature extraction and 3D modeling using high resolution satellite ImagerY) by using high-resolution satellite imagery from DubaiSat-2 and DEIMOS-2 satellites, which provide panchromatic images of 1m resolution and spectral bands (red, green, blue and near infrared) of 4m resolution. SAFIY system is a joint collaboration between MBRSC and DEIMOS Space UK. It uses image-processing algorithms to extract different features (roads, water, vegetation, and buildings) to generate vector maps data. The process to extract green areas (vegetation) utilize spectral information (such as, the red and near infrared bands) from the satellite images. These detected vegetation features will be extracted as vector data in SAFIY system and can be updated and edited by end-users, such as
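
    The core of this kind of vegetation detection is the NDVI itself. The sketch below (Python, with random arrays standing in for the red and near-infrared bands and an assumed threshold of 0.3, which in practice is scene dependent) shows the computation and thresholding step in its simplest form.

        import numpy as np

        def ndvi(red, nir):
            """NDVI = (NIR - Red) / (NIR + Red), bounded to [-1, 1]."""
            red = red.astype(float)
            nir = nir.astype(float)
            denom = nir + red
            return np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

        def vegetation_mask(red, nir, threshold=0.3):
            """Pixels whose NDVI exceeds the (assumed) threshold are flagged as vegetation."""
            return ndvi(red, nir) > threshold

        red = np.random.randint(0, 1024, (512, 512))   # stand-in for the red band
        nir = np.random.randint(0, 1024, (512, 512))   # stand-in for the near-infrared band
        print(f"{vegetation_mask(red, nir).mean():.2%} of pixels flagged as vegetation")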

  2. The influences of scientific information on the growing in opinion for high level waste repository. Focusing on education in civil engineering course

    International Nuclear Information System (INIS)

    Amemiya, Kiyoshi; Chijimatsu, Masakazu

    2002-01-01

    In this research, a survey of awareness of and attitudes toward high level radioactive waste (HLW) disposal was conducted on 36 students of a postgraduate course. They had been studying civil and rock engineering, so they belong to 'the Group' that has the education, culture and faculty to understand the science behind geological disposal of HLW. First, the awareness of the danger or safety of HLW disposal was examined. Some 23% regard HLW disposal as safe, whereas 60% feel it is dangerous; this is similar to the awareness of the general public. Some 72% think that HLW should be disposed of, but only 6% would accept a repository in their town. This shows that the highly educated Group tends to calmly understand the necessity of disposal, but also has a 'not in my back yard (NIMBY)' attitude. After that, the students were divided into two groups: one group received information from the promoter, and the other received information from opponents. The results of the second questionnaire show that the awareness of danger is strongly affected by the given information even in this Group, but the students become more thoughtful and prudent in their opinions and decision-making as information increases. Finally, this paper considers 'what is the role of education in civil engineering?' and 'what is the key issue in R and D of HLW disposal?' with regard to public acceptance. (author)

  3. Development of a geoscience database for preselecting China's high level radioactive waste disposal sites

    International Nuclear Information System (INIS)

    Li Jun; Fan Ai; Huang Shutao; Wang Ju

    2004-01-01

    Taking the development of a geoscience database for China's high level waste disposal sites (Yumen Town, Gansu province, in northwest China) as an example, this paper introduces in detail the application of a Geographical Information System (GIS) to high level waste disposal and analyses its application prospects in other fields. The development of GIS provides brand-new thinking for administrators and technicians at all levels. At the same time, this paper also introduces the administration of maps and materials by using a Geographical Information System. (author)

  4. The CMS High Level Trigger System

    CERN Document Server

    Afaq, A; Bauer, G; Biery, K; Boyer, V; Branson, J; Brett, A; Cano, E; Carboni, A; Cheung, H; Ciganek, M; Cittolin, S; Dagenhart, W; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Kowalkowski, J; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sexton-Kennedy, E; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition (DAQ) System relies on a purely software driven High Level Trigger (HLT) to reduce the full Level-1 accept rate of 100 kHz to approximately 100 Hz for archiving and later offline analysis. The HLT operates on the full information of events assembled by an event builder collecting detector data from the CMS front-end systems. The HLT software consists of a sequence of reconstruction and filtering modules executed on a farm of O(1000) CPUs built from commodity hardware. This paper presents the architecture of the CMS HLT, which integrates the CMS reconstruction framework in the online environment. The mechanisms to configure, control, and monitor the Filter Farm and the procedures to validate the filtering code within the DAQ environment are described.

  5. Robust rooftop extraction from visible band images using higher order CRF

    KAUST Repository

    Li, Er

    2015-08-01

    In this paper, we propose a robust framework for building extraction in visible band images. We first get an initial classification of the pixels based on an unsupervised presegmentation. Then, we develop a novel conditional random field (CRF) formulation to achieve accurate rooftop extraction, which incorporates pixel-level information and segment-level information for the identification of rooftops. Compared with the commonly used CRF model, a higher order potential defined on segments is added to our model, exploiting region consistency and shape features at the segment level. Our experiments show that the proposed higher order CRF model outperforms the state-of-the-art methods at both the pixel and object levels on rooftops with complex structures and sizes in challenging environments. © 1980-2012 IEEE.
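
    In generic terms, the model described above is a CRF energy with unary, pairwise, and segment-level higher-order terms. The expression below is the standard textbook form of such an energy, written out only for orientation; it is not the paper's exact choice of potentials.

        E(\mathbf{x}) = \sum_{i \in \mathcal{V}} \psi_i(x_i)
                      + \sum_{(i,j) \in \mathcal{E}} \psi_{ij}(x_i, x_j)
                      + \sum_{c \in \mathcal{S}} \psi_c(\mathbf{x}_c)

    Here the first two sums encode pixel-level evidence and pairwise smoothness, and the third sum, taken over the presegmented regions c, penalizes labelings that violate region consistency or the segment's shape feature.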

  6. Changing stress levels through gaining information on stress

    Directory of Open Access Journals (Sweden)

    S.N. Madu

    2002-09-01

    Full Text Available Objective: The aim of this research was to find out the effect of the Information Phase of a Stress Management Program (SMP) on the perceptions of participants about their stress levels. Method: A total sample of 100 workers (nursing staff, private business men and women, laboratory assistants, the protective services [foreman and security staff], as well as people in human resources departments) took part in this study. All the participants were from the Northern and Gauteng Provinces in South Africa. The Combined Hassles and Uplifts Scale (Folkman & Lazarus, 1989) was used as an instrument to measure the perceived stress level of participants in an SMP. Result: A significant reduction in stress levels was achieved among those who received the Information Phase of the SMP only, as well as among those who received the full set of stress management techniques. There was no significant difference between the amount of reduction in perceived stress levels achieved by those who received the Information Phase of the SMP only and that achieved by those who received the full set of techniques. Conclusion: The authors conclude that where resources are limited, only the Information Phase of an SMP may be given to clients who want it. That should help to save time and money spent on participating in SMPs. This should, however, not discourage the use of the whole SMP, where affordable. Keywords: Stress Management Programs, Information Phase, Perception, Stress Level.

  7. Overview of high-level waste management accomplishments

    International Nuclear Information System (INIS)

    Lawroski, H.; Berreth, J.R.; Freeby, W.A.

    1980-01-01

    Storage of power reactor spent fuel is necessary at present because of the lack of reprocessing operations particularly in the U.S. By considering the above solidification and storage scenario, there is more than reasonable assurance that acceptable, stable, low heat generation rate, solidified waste can be produced, and safely disposed. The public perception of no waste disposal solutions is being exploited by detractors of nuclear power application. The inability to even point to one overall system demonstration lends credibility to the negative assertions. By delaying the gathering of on-line information to qualify repository sites, and to implement a demonstration, the actions of the nuclear power detractors are self serving in that they can continue to point out there is no demonstration of satisfactory high-level waste disposal. By maintaining the liquid and solidified high-level waste in secure above ground storage until acceptable decay heat generation rates are achieved, by producing a compatible, high integrity, solid waste form, by providing a second or even third barrier as a compound container and by inserting the enclosed waste form in a qualified repository with spacing to assure moderately low temperature disposal conditions, there appears to be no technical reason for not progressing further with the disposal of high-level wastes and needed implementation of the complete nuclear power fuel cycle

  8. High-mesembrine Sceletium extract (Trimesemine™) is a monoamine releasing agent, rather than only a selective serotonin reuptake inhibitor.

    Science.gov (United States)

    Coetzee, Dirk D; López, Víctor; Smith, Carine

    2016-01-11

    Extracts from and alkaloids contained in plants in the genus Sceletium have been reported to inhibit ligand binding to the serotonin transporter. From this, the conclusion was made that Sceletium products act as selective serotonin-reuptake inhibitors. However, other mechanisms which may similarly result in the anxiolytic or anti-depressant effect ascribed to Sceletium, such as monoamine release, have not been investigated. The current study investigated simultaneously, and at two consecutive time points, the effect of high-mesembrine Sceletium extract on both monoamine release and serotonin reuptake into both human astrocytes and mouse hippocampal neurons, as well as potential inhibitory effects on relevant enzyme activities. Human astrocytes and mouse hippocampal cells were treated with citalopram or Sceletium extract for 15 and 30 min, after which protein expression levels of the serotonin transporter (SERT) and vesicular monoamine transporter-2 (VMAT-2) were assessed using fluorescent immunocytochemistry and digital image analysis. Efficacy of inhibition of acetylcholinesterase (AChE) and monoamine oxidase-A (MAO-A) activity was assessed using the Ellman and Olsen methods (and appropriate controls), respectively. We report the first investigation of the mechanism of action of Sceletium extract in the context of serotonin transport, release and reuptake in a cellular model. Cell viability was not affected by Sceletium treatment. High-mesembrine Sceletium extract down-regulated SERT expression similarly to citalopram. In addition, VMAT-2 was upregulated significantly in response to Sceletium treatment. The extract showed only relatively mild inhibition of AChE and MAO-A. We conclude that the serotonin reuptake inhibition activity ascribed to the Sceletium plant is secondary to the monoamine-releasing activity of high-mesembrine Sceletium extract (Trimesemine(TM)). Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. An Overview of Biomolecular Event Extraction from Scientific Documents.

    Science.gov (United States)

    Vanegas, Jorge A; Matos, Sérgio; González, Fabio; Oliveira, José L

    2015-01-01

    This paper presents a review of state-of-the-art approaches to automatic extraction of biomolecular events from scientific texts. Events involving biomolecules such as genes, transcription factors, or enzymes, for example, have a central role in biological processes and functions and provide valuable information for describing physiological and pathogenesis mechanisms. Event extraction from biomedical literature has a broad range of applications, including support for information retrieval, knowledge summarization, and information extraction and discovery. However, automatic event extraction is a challenging task due to the ambiguity and diversity of natural language and higher-level linguistic phenomena, such as speculations and negations, which occur in biological texts and can lead to misunderstanding or incorrect interpretation. Many strategies have been proposed in the last decade, originating from different research areas such as natural language processing, machine learning, and statistics. This review summarizes the most representative approaches in biomolecular event extraction and presents an analysis of the current state of the art and of commonly used methods, features, and tools. Finally, current research trends and future perspectives are also discussed.

  10. An Overview of Biomolecular Event Extraction from Scientific Documents

    Directory of Open Access Journals (Sweden)

    Jorge A. Vanegas

    2015-01-01

    Full Text Available This paper presents a review of state-of-the-art approaches to automatic extraction of biomolecular events from scientific texts. Events involving biomolecules such as genes, transcription factors, or enzymes, for example, have a central role in biological processes and functions and provide valuable information for describing physiological and pathogenesis mechanisms. Event extraction from biomedical literature has a broad range of applications, including support for information retrieval, knowledge summarization, and information extraction and discovery. However, automatic event extraction is a challenging task due to the ambiguity and diversity of natural language and higher-level linguistic phenomena, such as speculations and negations, which occur in biological texts and can lead to misunderstanding or incorrect interpretation. Many strategies have been proposed in the last decade, originating from different research areas such as natural language processing, machine learning, and statistics. This review summarizes the most representative approaches in biomolecular event extraction and presents an analysis of the current state of the art and of commonly used methods, features, and tools. Finally, current research trends and future perspectives are also discussed.

  11. Antidiabetic and Hypolipidemic Activities of Curculigo latifolia Fruit:Root Extract in High Fat Fed Diet and Low Dose STZ Induced Diabetic Rats

    Directory of Open Access Journals (Sweden)

    Nur Akmal Ishak

    2013-01-01

    Full Text Available Curculigo latifolia fruit is used as an alternative sweetener, while the root is used as an alternative treatment for diuretic and urinary problems. The antidiabetic and hypolipidemic activities of C. latifolia fruit:root aqueous extract in high fat diet (HFD) and 40 mg streptozotocin (STZ) induced diabetic rats, through expression of genes involved in glucose and lipid metabolism, were investigated. Diabetic rats were treated with C. latifolia fruit:root extract for 4 weeks. Plasma glucose, insulin, adiponectin, lipid profiles, alanine aminotransferase (ALT), gamma glutamyltransferase (GGT), urea, and creatinine levels were measured before and after treatments. Regulation of selected genes involved in glucose and lipid metabolism was determined. Results showed a significant (P<0.05) increase in body weight, high density lipoprotein (HDL), insulin, and adiponectin levels and decreased glucose, total cholesterol (TC), triglycerides (TG), low density lipoprotein (LDL), urea, creatinine, ALT, and GGT levels in diabetic rats after 4 weeks of treatment. Furthermore, C. latifolia fruit:root extract significantly increased the expression of IRS-1, IGF-1, GLUT4, PPARα, PPARγ, AdipoR1, AdipoR2, leptin, LPL, and lipase genes in adipose and muscle tissues in diabetic rats. These results suggest that C. latifolia fruit:root extract exerts antidiabetic and hypolipidemic effects by altering the regulation of genes involved in glucose and lipid metabolism in diabetic rats.

  12. YAdumper: extracting and translating large information volumes from relational databases to structured flat files.

    Science.gov (United States)

    Fernández, José M; Valencia, Alfonso

    2004-10-12

    Downloading the information stored in relational databases into XML and other flat formats is a common task in bioinformatics. This periodical dumping of information requires considerable CPU time, disk and memory resources. YAdumper has been developed as a purpose-specific tool to deal with the integral structured information download of relational databases. YAdumper is a Java application that organizes database extraction following an XML template based on an external Document Type Declaration. Compared with other non-native alternatives, YAdumper substantially reduces memory requirements and considerably improves writing performance.
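
    YAdumper itself is driven by an XML template and an external Document Type Declaration. The snippet below is not YAdumper but a bare-bones Python illustration of the underlying task, streaming rows from a relational table into structured XML; the table and element names are invented for the example.

        import sqlite3
        import xml.etree.ElementTree as ET

        # Toy in-memory database standing in for a real relational source.
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE protein (id TEXT, name TEXT)")
        conn.executemany("INSERT INTO protein VALUES (?, ?)",
                         [("P1", "kinase A"), ("P2", "phosphatase B")])

        # Build the XML document row by row.
        root = ET.Element("proteins")
        for row_id, name in conn.execute("SELECT id, name FROM protein"):
            entry = ET.SubElement(root, "protein", attrib={"id": row_id})
            ET.SubElement(entry, "name").text = name

        print(ET.tostring(root, encoding="unicode"))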

  13. High hydrostatic pressure extraction of phenolic compounds from ...

    African Journals Online (AJOL)

    High hydrostatic pressure processing (HHPP) is a food processing method in which food is subjected to elevated pressure, mostly between 100 and 800 MPa. HHPP is used not only in food engineering but also has other application areas, such as the extraction of active ingredients from natural biomaterials.

  14. Smart Extraction and Analysis System for Clinical Research.

    Science.gov (United States)

    Afzal, Muhammad; Hussain, Maqbool; Khan, Wajahat Ali; Ali, Taqdir; Jamshed, Arif; Lee, Sungyoung

    2017-05-01

    With the increasing use of electronic health records (EHRs), there is a growing need to expand the utilization of EHR data to support clinical research. The key challenge in achieving this goal is the unavailability of smart systems and methods to overcome the issue of data preparation, structuring, and sharing for smooth clinical research. We developed a robust analysis system called the smart extraction and analysis system (SEAS) that consists of two subsystems: (1) the information extraction system (IES), for extracting information from clinical documents, and (2) the survival analysis system (SAS), for a descriptive and predictive analysis to compile the survival statistics and predict the future chance of survivability. The IES subsystem is based on a novel permutation-based pattern recognition method that extracts information from unstructured clinical documents. Similarly, the SAS subsystem is based on a classification and regression tree (CART)-based prediction model for survival analysis. SEAS is evaluated and validated on a real-world case study of head and neck cancer. The overall information extraction accuracy of the system for semistructured text is recorded at 99%, while that for unstructured text is 97%. Furthermore, the automated, unstructured information extraction has reduced the average time spent on manual data entry by 75%, without compromising the accuracy of the system. Moreover, around 88% of patients are found in a terminal or dead state for the highest clinical stage of disease (level IV). Similarly, there is an ∼36% probability of a patient being alive if at least one of the lifestyle risk factors was positive. We presented our work on the development of SEAS to replace costly and time-consuming manual methods with smart automatic extraction of information and survival prediction methods. SEAS has reduced the time and energy of human resources spent unnecessarily on manual tasks.
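
    The prediction side of SEAS is described as a CART model over fields pulled from the clinical documents. The toy sketch below (Python/scikit-learn, with invented feature encodings and labels) shows only the general shape of such a model, not the system's actual features, data, or tree.

        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical extracted features: [clinical_stage (1-4), lifestyle_risk_factor (0/1)].
        X = [[1, 0], [2, 0], [2, 1], [3, 1], [4, 0], [4, 1], [3, 0], [1, 1]]
        y = [1, 1, 1, 0, 0, 0, 1, 1]  # 1 = alive at follow-up, 0 = deceased (toy labels)

        model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
        print(model.predict([[4, 1], [1, 0]]))  # predicted survival status for two new cases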

  15. STUDY ON BUILDING EXTRACTION FROM HIGH-RESOLUTION IMAGES USING MBI

    Directory of Open Access Journals (Sweden)

    Z. Ding

    2018-04-01

    Full Text Available Building extraction from high resolution remote sensing images is a hot research topic in the field of photogrammetry and remote sensing. However, the diversity and complexity of buildings mean that building extraction methods still face challenges in terms of accuracy and efficiency. In this study, a new building extraction framework based on the MBI (morphological building index), combined with image segmentation techniques and spectral, shadow, and shape constraints, is proposed. To verify the proposed method, WorldView-2, GF-2, and GF-1 remote sensing images covering Xiamen Software Park were used for building extraction experiments. Experimental results indicate that the proposed method improves on the original MBI significantly, and the correct rate is over 86%.
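
    The MBI is built from directional morphological profiles of a brightness image. The fragment below (Python/SciPy, applied to a random array) is a heavily simplified sketch of that idea: it averages white top-hat responses over a few line lengths and directions and thresholds the result, rather than computing the full differential profile used in the published index.

        import numpy as np
        from scipy.ndimage import white_tophat

        def linear_footprint(length, direction):
            """Line-shaped structuring element: horizontal, vertical, or one of the diagonals."""
            if direction == "h":
                return np.ones((1, length), dtype=bool)
            if direction == "v":
                return np.ones((length, 1), dtype=bool)
            fp = np.eye(length, dtype=bool)
            return fp if direction == "d1" else np.fliplr(fp)

        def simple_mbi(brightness, scales=(5, 9, 13, 17)):
            """Average white top-hat response over scales and directions (simplified MBI)."""
            profiles = [white_tophat(brightness, footprint=linear_footprint(s, d))
                        for s in scales for d in ("h", "v", "d1", "d2")]
            return np.mean(profiles, axis=0)

        brightness = np.random.rand(200, 200)        # stand-in for the maximum of the visible bands
        candidates = simple_mbi(brightness) > 0.05   # threshold is scene dependent (assumed here)
        print(int(candidates.sum()), "candidate building pixels")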

  16. From Specific Information Extraction to Inferences: A Hierarchical Framework of Graph Comprehension

    Science.gov (United States)

    2004-09-01

    The skill to interpret the information displayed in graphs is so important to have that the National Council of Teachers of Mathematics has created... guidelines to ensure that students learn these skills (NCTM: Standards for Mathematics, 2003). These guidelines are based primarily on the extraction of... graphical perception.

  17. Image understanding systems based on the unifying representation of perceptual and conceptual information and the solution of mid-level and high-level vision problems

    Science.gov (United States)

    Kuvychko, Igor

    2001-10-01

    Vision is a part of a larger information system that converts visual information into knowledge structures. These structures drive vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, that is an interpretation of visual information in terms of such knowledge models. A computer vision system based on such principles requires unifying representation of perceptual and conceptual information. Computer simulation models are built on the basis of graphs/networks. The ability of human brain to emulate similar graph/networks models is found. That means a very important shift of paradigm in our knowledge about brain from neural networks to the cortical software. Starting from the primary visual areas, brain analyzes an image as a graph-type spatial structure. Primary areas provide active fusion of image features on a spatial grid-like structure, where nodes are cortical columns. The spatial combination of different neighbor features cannot be described as a statistical/integral characteristic of the analyzed region, but uniquely characterizes such region itself. Spatial logic and topology naturally present in such structures. Mid-level vision processes like clustering, perceptual grouping, multilevel hierarchical compression, separation of figure from ground, etc. are special kinds of graph/network transformations. They convert low-level image structure into the set of more abstract ones, which represent objects and visual scene, making them easy for analysis by higher-level knowledge structures. Higher-level vision phenomena like shape from shading, occlusion, etc. are results of such analysis. Such approach gives opportunity not only to explain frequently unexplainable results of the cognitive science, but also to create intelligent computer vision systems that simulate perceptional processes in both what and where visual pathways. Such systems can open new horizons for robotic and computer vision industries.

  18. Influence of a highly purified senna extract on colonic epithelium

    NARCIS (Netherlands)

    van Gorkom, B A; Karrenbeld, A; van Der Sluis, T; Koudstaal, J; de Vries, E G; Kleibeuker, J H

    2000-01-01

    BACKGROUND: Chronic use of sennoside laxatives often causes pseudomelanosis coli. A recent study suggested that pseudomelanosis coli is associated with an increased colorectal cancer risk. A single high dose of highly purified senna extract increased proliferation rate and reduced crypt length in

  19. Simplification improves understanding of informed consent information in clinical trials regardless of health literacy level.

    Science.gov (United States)

    Kim, Eun Jin; Kim, Su Hyun

    2015-06-01

    This study evaluated the effect of a simplified informed consent form for clinical trials on the understanding and efficacy of informed consent information across health literacy levels. A total of 150 participants were randomly assigned to one of two groups and provided with either standard or simplified consent forms for a cancer clinical trial. The features of the simplified informed consent form included plain language, short sentences, diagrams, pictures, and bullet points. Levels of objective and subjective understanding were significantly higher in participants provided with simplified informed consent forms relative to those provided with standard informed consent forms. The interaction effects between type of consent form and health literacy level on objective and subjective understanding were nonsignificant. Simplified informed consent was effective in enhancing participant's subjective and objective understanding regardless of health literacy. © The Author(s) 2015.

  20. Elicited soybean (Glycine max) extract effect on improving levels of Ter-119+Cd59+ in a mouse model fed a high fat-fructose diet

    Science.gov (United States)

    Safitri, Yunita Diyah; Widyarti, Sri; Rifa'i, Muhaimin

    2017-05-01

    People who have unbalanced lifestyles and habits such as consuming high fat and sugar foods, as well as the lack of physical activity, have an increased risk of obesity and related metabolic diseases. The condition of obesity occurs due to an excess of nutrients which leads to low-grade inflammation. Inflammation induced by obesity causes unstable bone marrow homeostasis which is associated with proliferation and differentiation of Hematopoietic Stem Cells (HSCs). This study aimed to observe the erythroid progenitor (TER-119) and complement regulator (CD59) on bone marrow cells in mouse models fed a high fat-fructose diet (HFFD). This research was conducted by modeling obese mice using high fat and fructose food for 20 weeks, and then treating them with elicited soybean extract (ESE) for four weeks with several doses: low dose (78 mg/kgBB), moderate dose (104 mg/kgBB) and high dose (130 mg/kgBB). Cell TER119+CD59+ expression decreased in the HFFD group compared to the normal group. In the low, moderate and high dose group, TER119+CD59+ expression significantly increased compared to the HFFD group. These results demonstrate that soybean elicited extract can improve the hematopoietic system by increasing TER119+CD59+ expression in a high fat and fructose diet mouse model.

  1. Work Environment Factors and Their Influence on Urinary Chromium Levels in Informal Electroplating Workers

    Science.gov (United States)

    Setyaningsih, Yuliani; Husodo, Adi Heru; Astuti, Indwiani

    2018-02-01

    One of the informal sectors that absorbs labor is the electroplating business. This sector uses chromium as a coating material because it is strong and corrosion resistant. Nonetheless, hexavalent chromium is highly toxic if inhaled, swallowed, or in contact with skin. Poor hygiene, poor work environment factors, and poor sanitation conditions can increase the levels of chromium in the body. The aim of this study was to analyze the association between work environment factors and levels of urinary chromium in informal electroplating workers. A purposive study was conducted in Tegal, Central Java. The research subjects were 66 male workers. Chi-square analysis was used to establish an association between work environment factors and the level of urinary chromium. There is a relationship between heat stress and wind direction and the chromium levels in urine (p < 0.05). This indicates that work environment factors can increase chromium levels in the urine of informal electroplating workers.

  2. Work Environment Factors and Their Influence on Urinary Chromium Levels in Informal Electroplating Workers

    Directory of Open Access Journals (Sweden)

    Setyaningsih Yuliani

    2018-01-01

    Full Text Available One of the informal sectors that absorbs labor is the electroplating business. This sector uses chromium as a coating material because it is strong and corrosion resistant. Nonetheless, hexavalent chromium is highly toxic if inhaled, swallowed, or in contact with skin. Poor hygiene, poor work environment factors, and poor sanitation conditions can increase the levels of chromium in the body. The aim of this study was to analyze the association between work environment factors and levels of urinary chromium in informal electroplating workers. A purposive study was conducted in Tegal, Central Java. The research subjects were 66 male workers. Chi-square analysis was used to establish an association between work environment factors and the level of urinary chromium. There is a relationship between heat stress and wind direction and the chromium levels in urine (p < 0.05). This indicates that work environment factors can increase chromium levels in the urine of informal electroplating workers.

  3. Surfactant-enhanced liquid-liquid microextraction coupled to micro-solid phase extraction onto highly hydrophobic magnetic nanoparticles

    International Nuclear Information System (INIS)

    Giannoulis, Kiriakos M.; Giokas, Dimosthenis L.; Tsogas, George Z.; Vlessidis, Athanasios G.; Zhu, Qing; Pan, Qinmin

    2013-01-01

    We are presenting a simplified alternative method for dispersive liquid-liquid microextraction (DLLME) by resorting to the use of surfactants as emulsifiers and micro solid-phase extraction (μ-SPE). In this combined procedure, DLLME of hydrophobic components is initially accomplished in a mixed micellar/microemulsion extractant phase that is prepared by rapidly mixing a non-ionic surfactant and 1-octanol in aqueous medium. Then, and in contrast to classic DLLME, the extractant phase is collected by highly hydrophobic polysiloxane-coated core-shell Fe2O3@C magnetic nanoparticles. Hence, the sample components are the target analyte in the DLLME which, in turn, becomes the target analyte of the μ-SPE step. This 2-step approach represents a new and simple DLLME procedure that lacks tedious steps such as centrifugation, thawing, or delicate collection of the extractant phase. As a result, the analytical process is accelerated and the volume of the collected phase does not depend on the volume of the extraction solvent. The method was applied to extract cadmium in the form of its pyrrolidine dithiocarbamate chelate from spiked water samples prior to its determination by FAAS. Detection limits were brought down to the low μg L-1 levels by preconcentrating 10 mL samples with satisfactory recoveries (96.0–108.0 %). (author)

  4. Answers to your questions on high-level nuclear waste

    International Nuclear Information System (INIS)

    1987-11-01

    This booklet contains answers to frequently asked questions about high-level nuclear wastes. Written for the layperson, the document contains basic information on the hazards of radiation, the Nuclear Waste Management Program, the proposed geologic repository, the proposed monitored retrievable storage facility, risk assessment, and public participation in the program

  5. High-efficient extraction of principal medicinal components from fresh Phellodendron bark (cortex phellodendri

    Directory of Open Access Journals (Sweden)

    Keqin Xu

    2018-05-01

    Full Text Available There are three key medicinal components (phellodendrine, berberine and palmatine) in the extracts of Phellodendron bark, one of the fundamental herbs of traditional Chinese medicine. Different extraction methods and solvent combinations were investigated to obtain the optimal technologies for highly efficient extraction of these medicinal components. Results: The results showed that combined solvents extract phellodendrine, berberine and palmatine more effectively than a single solvent, and that the effect of ultrasonic extraction is distinctly better than those of distillation and Soxhlet extraction. Conclusion: The hydrochloric acid/methanol-ultrasonic extraction has the best effect for the three medicinal components of fresh Phellodendron bark, providing extraction yields of 103.12 mg/g berberine, 24.41 mg/g phellodendrine, and 1.25 mg/g palmatine. Keywords: Phellodendron, Cortex phellodendri, Extraction methods, Medicinal components

  6. Automatic Extraction of High-Resolution Rainfall Series from Rainfall Strip Charts

    Science.gov (United States)

    Saa-Requejo, Antonio; Valencia, Jose Luis; Garrido, Alberto; Tarquis, Ana M.

    2015-04-01

    Soil erosion is a complex phenomenon involving the detachment and transport of soil particles, storage and runoff of rainwater, and infiltration. The relative magnitude and importance of these processes depend on a host of factors, including climate, soil, topography, cropping and land management practices, among others. Most models of soil erosion or hydrological processes need an accurate storm characterization. However, these data are not always available, and in some cases indirect models are generated to fill this gap. In Spain, rain intensity records for periods shorter than 24 hours go back to 1924, and many studies are limited by their availability. In many cases these data are stored on rainfall strip charts at the meteorological stations but have not been transferred into numerical form. To overcome this deficiency in the raw data, a process of information extraction from large numbers of rainfall strip charts has been implemented by means of computer software. A method has been developed that largely automates the labour-intensive extraction work, based on van Piggelen et al. (2011). The method consists of the following five basic steps: 1) scanning the charts to high-resolution digital images, 2) manually and visually registering relevant meta information from charts and pre-processing, 3) applying automatic curve extraction software in a batch process to determine the coordinates of cumulative rainfall lines on the images (main step), 4) post-processing the curves that were not correctly determined in step 3, and 5) aggregating the cumulative rainfall in pixel coordinates to the desired time resolution. A colour detection procedure is introduced that automatically separates the background of the charts and rolls from the grid and subsequently the rainfall curve. The rainfall curve is detected by minimization of a cost function. Some utilities have been added to improve the previous work and automate some auxiliary processes: readjusting the bands properly, merging bands when
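
    The colour-separation and aggregation steps (steps 3 and 5 above) can be illustrated with a minimal Python sketch. It assumes the chart has already been scanned to an RGB array, that the rainfall trace is drawn in a distinguishable ink colour, and that the pixel-to-millimetre and pixel-to-minute scales are known; the function names, the synthetic chart, and the simple per-column averaging are stand-ins for the cost-function-based curve detection of the actual method.

```python
import numpy as np

def extract_cumulative_curve(rgb, ink_rgb=(0, 0, 255), tol=60):
    """Per-column vertical position (in pixels) of the rainfall trace,
    assuming the trace is drawn in a distinct ink colour."""
    # Distance of every pixel from the assumed ink colour.
    dist = np.linalg.norm(rgb.astype(float) - np.array(ink_rgb, dtype=float), axis=-1)
    mask = dist < tol                              # pixels belonging to the curve
    rows = np.arange(rgb.shape[0])[:, None]
    counts = mask.sum(axis=0)
    # Mean row index of curve pixels per column (NaN where no curve pixel).
    y = (mask * rows).sum(axis=0) / np.where(counts == 0, np.nan, counts)
    return y                                       # lower row = more cumulative rain

def aggregate_to_resolution(y_pixels, mm_per_pixel, minutes_per_pixel, step_min=10):
    """Convert the pixel trace to cumulative rainfall (mm) and resample it
    to the desired time resolution, keeping the largest value in each bin."""
    cum_mm = (np.nanmax(y_pixels) - y_pixels) * mm_per_pixel
    cols_per_bin = max(1, int(round(step_min / minutes_per_pixel)))
    bins = [cum_mm[i:i + cols_per_bin] for i in range(0, len(cum_mm), cols_per_bin)]
    return np.array([np.nanmax(b) if np.isfinite(b).any() else np.nan for b in bins])

# Tiny synthetic example: a 100x200 white chart with a blue cumulative trace.
chart = np.full((100, 200, 3), 255, dtype=np.uint8)
cols = np.arange(200)
trace_rows = 90 - (cols // 4)                      # monotonically rising curve
chart[trace_rows, cols] = (0, 0, 255)
y = extract_cumulative_curve(chart)
series = aggregate_to_resolution(y, mm_per_pixel=0.1, minutes_per_pixel=1.0)
print(series[:5])
```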

  7. White LED with High Package Extraction Efficiency

    International Nuclear Information System (INIS)

    Yi Zheng; Stough, Matthew

    2008-01-01

    The goal of this project is to develop a high efficiency phosphor converting (white) Light Emitting Diode (pcLED) 1-Watt package through an increase in package extraction efficiency. A transparent/translucent monolithic phosphor is proposed to replace the powdered phosphor to reduce the scattering caused by phosphor particles. Additionally, a multi-layer thin film selectively reflecting filter is proposed between the blue LED die and the phosphor layer to recover inward yellow emission. At the end of the project we expect to recycle approximately 50% of the unrecovered backward light in current package construction, and develop a pcLED device with 80 lm/We using our technology improvements and a commercially available chip/package source. The success of the project will benefit the luminous efficacy of white LEDs by increasing package extraction efficiency. In most phosphor-converting white LEDs, the white color is obtained by combining a blue LED die (or chip) with a powdered phosphor layer. The phosphor partially absorbs the blue light from the LED die and converts it into a broad green-yellow emission. The mixture of the transmitted blue light and the emerging green-yellow light gives white light. There are two major drawbacks of current pcLEDs in terms of package extraction efficiency. The first is light scattering caused by phosphor particles. When the blue photons from the chip strike the phosphor particles, some blue light is scattered by the particles. Converted yellow emission photons are also scattered. A portion of the scattered light is in the backward direction, toward the die. The amount of this backward light varies and depends in part on the particle size of the phosphors. The other drawback is that yellow emission from phosphor powders is isotropic. Although some backward light can be recovered by the reflector in current LED packages, there is still a portion of backward light that will be absorbed inside the package and further converted to heat. Heat generated

  8. High-efficient extraction of principal medicinal components from fresh Phellodendron bark (cortex phellodendri).

    Science.gov (United States)

    Xu, Keqin; He, Gongxiu; Qin, Jieming; Cheng, Xuexiang; He, Hanjie; Zhang, Dangquan; Peng, Wanxi

    2018-05-01

    There are three key medicinal components (phellodendrine, berberine and palmatine) in the extracts of Phellodendron bark, one of the fundamental herbs of traditional Chinese medicine. Different extraction methods and solvent combinations were investigated to obtain the optimal technologies for highly efficient extraction of these medicinal components. The results showed that combined solvents extract phellodendrine, berberine and palmatine more effectively than a single solvent, and that the effect of ultrasonic extraction is distinctly better than those of distillation and Soxhlet extraction. The hydrochloric acid/methanol-ultrasonic extraction has the best effect for the three medicinal components of fresh Phellodendron bark, providing extraction yields of 103.12 mg/g berberine, 24.41 mg/g phellodendrine, and 1.25 mg/g palmatine.

  9. The extraction of lifetimes of weakly-populated nuclear levels in recoil distance method experiments

    International Nuclear Information System (INIS)

    Kennedy, D.L.; Stuchbery, A.E.; Bolotin, H.H.

    1979-01-01

    Two analytic techniques are described which extend the conventional analysis of recoil-distance method (RDM) data. The first technique utilizes the enhanced counting statistics of the composite spectrum formed by the addition of all γ-ray spectra recorded at the different target-to-stopper distances employed, in order to extract the lifetimes of levels whose observed depopulating γ-ray transitions have insufficient statistics to permit conventional analysis. The second technique analyses peak centroids rather than peak areas to account for contamination by flight-distance-dependent background. The results from a recent study of the low-lying excited states in 196,198 Pt for those levels whose lifetimes could be extracted by conventional RDM analysis are shown to be in good agreement with those obtained using the new methods of analysis
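
    For orientation, the elementary recoil-distance relation underlying such analyses gives the unshifted-peak fraction of a single level with no side-feeding as exp(-D/(v·τ)). The sketch below fits that relation to hypothetical data with SciPy; it is not the composite-spectrum or centroid technique of the paper, and the distances, recoil velocity, and noise level are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def unshifted_fraction(distance_um, tau_ps, v_um_per_ps):
    # Fraction of decays occurring after the recoil reaches the stopper,
    # for a single level with no side-feeding.
    return np.exp(-distance_um / (v_um_per_ps * tau_ps))

# Hypothetical target-to-stopper distances (um) and measured unshifted fractions.
d = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
v = 10.0                                    # assumed recoil velocity, um/ps
true_tau = 12.0                             # lifetime used to generate the fake data
rng = np.random.default_rng(0)
frac = unshifted_fraction(d, true_tau, v) + rng.normal(0, 0.01, d.size)

popt, pcov = curve_fit(lambda dd, tau: unshifted_fraction(dd, tau, v), d, frac, p0=[10.0])
print(f"fitted lifetime: {popt[0]:.1f} ps +/- {np.sqrt(pcov[0, 0]):.1f} ps")
```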

  10. Text mining facilitates database curation - extraction of mutation-disease associations from Bio-medical literature.

    Science.gov (United States)

    Ravikumar, Komandur Elayavilli; Wagholikar, Kavishwar B; Li, Dingcheng; Kocher, Jean-Pierre; Liu, Hongfang

    2015-06-06

    Advances in next-generation sequencing technology have accelerated the pace of individualized medicine (IM), which aims to incorporate genetic/genomic information into medicine. One immediate need in interpreting sequencing data is the assembly of information about genetic variants and their corresponding associations with other entities (e.g., diseases or medications). Even with dedicated effort to capture such information in biological databases, much of this information remains 'locked' in the unstructured text of biomedical publications. There is a substantial lag between the publication and the subsequent abstraction of such information into databases. Multiple text mining systems have been developed, but most of them focus on sentence-level association extraction with performance evaluation based on gold standard text annotations specifically prepared for text mining systems. We developed and evaluated a text mining system, MutD, which extracts protein mutation-disease associations from MEDLINE abstracts by incorporating discourse-level analysis, using a benchmark data set extracted from curated database records. MutD achieves an F-measure of 64.3% for reconstructing protein mutation disease associations in curated database records. The discourse-level analysis component of MutD contributed a gain of more than 10% in F-measure compared against sentence-level association extraction. Our error analysis indicates that 23 of the 64 precision errors are true associations that were not captured by database curators and 68 of the 113 recall errors are caused by the absence of associated disease entities in the abstract. After adjusting for the defects in the curated database, the revised F-measure of MutD in association detection reaches 81.5%. Our quantitative analysis reveals that MutD can effectively extract protein mutation disease associations when benchmarking based on curated database records. The analysis also demonstrates that incorporating
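
    The reported F-measure adjustment can be reproduced in outline with a few lines of Python. The true-positive count below is hypothetical (the abstract reports only the error tallies), so the numbers are illustrative rather than a reconstruction of the paper's exact figures.

```python
def f_measure(tp, fp, fn):
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return p, r, 2 * p * r / (p + r)

# Hypothetical raw counts for an association-extraction run.
tp, fp, fn = 160, 64, 113
print("raw  P=%.3f R=%.3f F=%.3f" % f_measure(tp, fp, fn))

# Suppose error analysis shows some "false positives" are real associations
# missing from the curated database, and some "false negatives" cannot be
# recovered because the disease entity never appears in the abstract.
fp_actually_true = 23
fn_not_in_abstract = 68
print("adj. P=%.3f R=%.3f F=%.3f" % f_measure(tp + fp_actually_true,
                                              fp - fp_actually_true,
                                              fn - fn_not_in_abstract))
```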

  11. Effects of herbal mixture extracts on obesity in rats fed a high-fat diet

    Directory of Open Access Journals (Sweden)

    Mei-Yin Chien

    2016-07-01

    Full Text Available The aim of this study was to investigate and compare the effects of three herbal mixture extracts on obesity induced by high-fat diet (HFD in rats. The prescriptions—Pericarpium citri reticulatae and Fructus crataegi—were used as matrix components and mixed with Ampelopsis grossedentata, Salvia miltiorrhiza, and epigallocatechin-3-gallate (EGCG to form T1, T2, and T3 complexes, respectively. Results revealed that HFD feeding significantly increased body weight gain, fat deposition, plasma lipid profiles, hepatic lipid accumulation, and hepatic vacuoles formation, but decreased plasma levels of adiponectin in rats. Only the T1 complex showed the tendency, although not significantly so, for decreased HFD-induced body weight gain. T1 and T3 complexes significantly reduced HFD-induced fat deposition, and plasma levels of triglyceride, total cholesterol, and low-density lipoprotein cholesterol. Only the T1 complex significantly increased HFD-reduced adiponectin levels in plasma, but decreased HFD-increased triglyceride content in liver tissues. All complexes effectively inhibited HFD-induced vacuoles formation. The content of dihydromyricetin, salvianolic acid B, and EGCG in T1, T2, and T3 complexes was 18.25 ± 0.07%, 22.20 ± 0.10%, and 18.86 ± 0.04%, respectively. In summary, we demonstrated that herbal mixture extracts, especially T1 complex, exhibit antiobesity activity in HFD-fed rats.

  12. Patients subject to high levels of coercion: staff's understanding.

    Science.gov (United States)

    Bowers, Len; Wright, Steve; Stewart, Duncan

    2014-05-01

    Measures to keep staff and patients safe (containment) frequently involve coercion. A small proportion of patients is subject to a large proportion of containment use. To reduce the use of containment, we need a better understanding of the circumstances in which it is used and the understandings of patients and staff. Two sweeps were made of all the wards, spread over four hospital sites, in one large London mental health organization to identify patients who had been subject to high levels of containment in the previous two weeks. Data were then extracted from their case notes about their past history, current problem behaviours, and how they were understood by the patients involved and the staff. Nurses and consultant psychiatrists were interviewed to supplement the information from the case records. Twenty-six heterogeneous patients were identified, with many ages, genders, diagnoses, and psychiatric specialities represented. The main problem behaviours giving rise to containment use were violence and self-harm. The roots of the problem behaviours were to be found in severe psychiatric symptoms, cognitive difficulties, personality traits, and the implementation of the internal structure of the ward by staff. Staff's range and depth of understandings was limited and did not include functional analysis, defence mechanisms, specific cognitive assessment, and other potential frameworks. There is a need for more in-depth assessment and understanding of patients' problems, which may lead to additional ways to reduce containment use.

  13. Triangle network motifs predict complexes by complementing high-error interactomes with structural information.

    Science.gov (United States)

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-06-27

    A lot of high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from Munich Information center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient
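
    A minimal sketch of forming PPI-SDDI-PPI triangles from toy edge lists follows; the protein-to-domain annotations, the tiny networks, and the data structures are invented for illustration and do not correspond to the MIPS or SDDI datasets used in the study.

```python
from itertools import combinations

# Toy inputs: undirected PPI edges, protein -> domain annotations,
# and structurally observed domain-domain interactions (SDDIs).
ppi = {frozenset(e) for e in [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C")]}
domains = {"A": {"d1"}, "B": {"d2"}, "C": {"d3"}, "D": {"d2"}}
sddi = {frozenset(("d1", "d3"))}

def has_sddi(p, q):
    return any(frozenset((x, y)) in sddi for x in domains[p] for y in domains[q])

def ppi_sddi_ppi_triangles():
    """A-C pairs linked by an SDDI that also share at least one PPI partner B."""
    proteins = sorted(domains)
    neighbours = {p: {q for e in ppi if p in e for q in e if q != p} for p in proteins}
    triangles = []
    for a, c in combinations(proteins, 2):
        if not has_sddi(a, c):
            continue
        for b in neighbours[a] & neighbours[c]:
            triangles.append((a, b, c))
    return triangles

print(ppi_sddi_ppi_triangles())   # e.g. [('A', 'B', 'C'), ('A', 'D', 'C')]
```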

  14. Effect of repeated administration of cinnamon aqueous extract on body weight, glucose levels and lipid profile on over weight rats

    International Nuclear Information System (INIS)

    Bano, F.; Akhtar, N.

    2012-01-01

    Plants have been a source of both traditional and modern medicines for the treatment of diseases in recent years. Plant extracts contain several active constituents, which often work together synergistically. The study was designed to investigate the effect of CNAE (cinnamon aqueous extract) on lipid profile and glucose levels in overweight albino Wistar rats. Animals were divided into two groups: group 1 received CNAE and group 2 received an equal volume of tap water. The extract was given once a day at a dose of 2 mL/animal. After a 17% reduction in body weight, treatment was terminated and blood samples were collected for biochemical estimation. The results show a significant decrease in body weight, total cholesterol, triglycerides and low-density lipoprotein cholesterol and a significant increase in high-density lipoprotein cholesterol, while no significant effect was observed on electrolyte levels. The data of the present research demonstrate that CNAE not only possesses hypoglycemic and hypolipidemic properties but could also be used for body weight reduction. (author)

  15. Low-Level Radioactive Waste siting simulation information package

    International Nuclear Information System (INIS)

    1985-12-01

    The Department of Energy's National Low-Level Radioactive Waste Management Program has developed a simulation exercise designed to facilitate the process of siting and licensing disposal facilities for low-level radioactive waste. The siting simulation can be conducted at a workshop or conference, can involve 14-70 participants (or more), and requires approximately eight hours to complete. The exercise is available for use by states, regional compacts, or other organizations for use as part of the planning process for low-level waste disposal facilities. This information package describes the development, content, and use of the Low-Level Radioactive Waste Siting Simulation. Information is provided on how to organize a workshop for conducting the simulation. 1 ref., 1 fig

  16. Effects of increased low-level diode laser irradiation time on extraction socket healing in rats.

    Science.gov (United States)

    Park, Joon Bong; Ahn, Su-Jin; Kang, Yoon-Goo; Kim, Eun-Cheol; Heo, Jung Sun; Kang, Kyung Lhi

    2015-02-01

    In our previous studies, we confirmed that low-level laser therapy (LLLT) with a 980-nm gallium-aluminum-arsenide diode laser was beneficial for the healing of the alveolar bone in rats with systemic disease. However, many factors can affect the biostimulatory effects of LLLT. Thus, we attempted to investigate the effects of irradiation time on the healing of extraction sockets by evaluating the expressions of genes and proteins related to bone healing. The left and right first maxillary molars of 24 rats were extracted. Rats were randomly divided into four groups in which extraction sockets were irradiated for 0, 1, 2, or 5 min each day for 3 or 7 days. Specimens containing the sockets were examined using quantitative real-time reverse transcription polymerase chain reaction and western blotting. LLLT increased the expressions of all tested genes, Runx2, collagen type 1, osteocalcin, platelet-derived growth factor-B, and vascular endothelial growth factor, in a time-dependent manner. The highest levels of gene expressions were in the 5-min group after 7 days. Five minutes of irradiation caused prominent increases of the expression of all tested proteins after both 3 and 7 days. The expression level of each protein in group 4 was higher by almost twofold compared with group 1 after 7 days. Laser irradiation for 5 min caused the highest expressions of genes and proteins related to bone healing. In conclusion, LLLT had positive effects on the early stages of bone healing of extraction sockets in rats, which were irradiation time-dependent.

  17. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    Science.gov (United States)

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L(-1) range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine) have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L(-1) in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques HS-SPME (headspace-solid phase microextraction), SBSE (stirbar sorptive extraction), and HSSE (headspace sorptive extraction) were validated and compared. A 30 min extraction time was used for HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L(-1) for all four methoxypyrazines analyzed, i.e., LOQ's at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015

  18. Analysis of Technique to Extract Data from the Web for Improved Performance

    Science.gov (United States)

    Gupta, Neena; Singh, Manish

    2010-11-01

    The World Wide Web is rapidly guiding the world into an amazing new electronic world, where everyone can publish anything in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts the records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts the query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.

  19. "Legal highs" on the net-Evaluation of UK-based Websites, products and product information.

    Science.gov (United States)

    Schmidt, Martin M; Sharma, Akhilesh; Schifano, Fabrizio; Feinmann, Charlotte

    2011-03-20

    A vast array of substances are marketed as "legal highs" in the UK. These products are mainly marketed online and are packaged and produced to mimic illicit drugs. Little is known about the full range of products available at present and no studies have evaluated the product information provided to consumers. AIMS & HYPOTHESIS: To describe the available legal high products marketed by UK-based Internet retailers and evaluate the product information provided to consumers. Websites were identified using the terms "buy legal highs+UK" and two search engines. The first 100 hits and a random sample of 5% of the remaining results were screened. Websites based in the UK were included and all products were entered on a database. Information on product name, list price, claimed effects, side effects, contraindications and interactions was extracted. A descriptive analysis was conducted using SPSS v14. 115 Websites met the inclusion criteria but due to duplicate listings this was reduced to 39 unique Websites. 1308 products were found and evaluated. The average product price was 9.69 British pounds. Products took the form of pills (46.6%), smoking material (29.7%) and single plant material/extract (18.1%). Most products claimed to be stimulants (41.7%), sedatives (32.3%), or hallucinogens (12.9%). 40.1% of products failed to list ingredients, 91.9% failed to list side effects, 81.9% failed to list contraindications and 86.3% failed to list drug interactions. Top 5 products (with active ingredients in brackets) by frequency were Salvia divinorum (Salivinorin A), Kratom (Mitragynine), Hawaiian Baby Woodrose Seeds (Lysergic Acid Amide), Fly Agaric (Ibotenic Acid, Muscimol) and Genie (JWH018, CP47497). Products marketed as "legal highs" are easily available from UK-based Internet retailers and are reasonably affordable. Safety information provided to consumers is poor. Uninformed users risk serious adverse effects. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Efficient management of high level XMM-Newton science data products

    Science.gov (United States)

    Zolotukhin, Ivan

    2015-12-01

    As is the case for many large projects, XMM-Newton data have been used by the community to produce many valuable higher-level data products. However, even after 15 years of successful mission operation, the potential of these data is not yet fully uncovered, mostly due to logistical and data management issues. We present a web application, http://xmm-catalog.irap.omp.eu, to highlight the idea that existing public high-level data collections generate significant added research value when organized and exposed properly. Several application features, such as access to the all-time XMM-Newton photon database and online fitting of extracted source spectra, were never available before. In this talk we share best practices we worked out during the development of this website and discuss their potential use for other large projects generating astrophysical data.

  1. A Robust Level-Set Algorithm for Centerline Extraction

    NARCIS (Netherlands)

    Telea, Alexandru; Vilanova, Anna

    2003-01-01

    We present a robust method for extracting 3D centerlines from volumetric datasets. We start from a 2D skeletonization method to locate voxels centered with respect to three orthogonal slicing directions. Next, we introduce a new detection criterion to extract the centerline voxels from the above

  2. High-level language computer architecture

    CERN Document Server

    Chu, Yaohan

    1975-01-01

    High-Level Language Computer Architecture offers a tutorial on high-level language computer architecture, including von Neumann architecture and syntax-oriented architecture as well as direct and indirect execution architecture. Design concepts of Japanese-language data processing systems are discussed, along with the architecture of stack machines and the SYMBOL computer system. The conceptual design of a direct high-level language processor is also described.Comprised of seven chapters, this book first presents a classification of high-level language computer architecture according to the pr

  3. Other-than-high-level waste

    International Nuclear Information System (INIS)

    Bray, G.R.

    1976-01-01

    The main emphasis of the work in the area of partitioning transuranic elements from waste has been in the area of high-level liquid waste. But there are ''other-than-high-level wastes'' generated by the back end of the nuclear fuel cycle that are both large in volume and contaminated with significant quantities of transuranic elements. The combined volume of these other wastes is approximately 50 times that of the solidified high-level waste. These other wastes also contain up to 75% of the transuranic elements associated with waste generated by the back end of the fuel cycle. Therefore, any detailed evaluation of partitioning as a viable waste management option must address both high-level wastes and ''other-than-high-level wastes.''

  4. Criteria to Extract High-Quality Protein Data Bank Subsets for Structure Users.

    Science.gov (United States)

    Carugo, Oliviero; Djinović-Carugo, Kristina

    2016-01-01

    It is often necessary to build subsets of the Protein Data Bank to extract structural trends and average values. For this purpose it is mandatory that the subsets are non-redundant and of high quality. The first problem can be solved relatively easily at the sequence level or at the structural level. The second, on the contrary, needs special attention. It is not sufficient, in fact, to consider only the crystallographic resolution; other features must be taken into account: the absence of strings of residues from the electron density maps and from the files deposited in the Protein Data Bank; the B-factor values; the appropriate validation of the structural models; the quality of the electron density maps, which is not uniform; and the temperature of the diffraction experiments. More stringent criteria produce smaller subsets, which can be enlarged with more tolerant selection criteria. The incessant growth of the Protein Data Bank, and especially of the number of high-resolution structures, is allowing the use of more stringent selection criteria, with a consequent improvement of the quality of the subsets of the Protein Data Bank.
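
    A small sketch of applying this kind of stringency filter to a list of entry records is shown below; the field names, thresholds, and example entries are illustrative assumptions, not a fixed community standard.

```python
# Hypothetical per-entry records; field names and cut-offs are assumptions.
entries = [
    {"id": "1ABC", "resolution": 1.4, "mean_b": 18.0, "missing_residues": 0, "temp_K": 100},
    {"id": "2DEF", "resolution": 2.6, "mean_b": 45.0, "missing_residues": 12, "temp_K": 293},
    {"id": "3GHI", "resolution": 1.1, "mean_b": 12.0, "missing_residues": 2, "temp_K": 100},
]

def high_quality(e, max_res=1.5, max_b=30.0, max_missing=5, cryo_only=True):
    """True if the entry passes all (assumed) stringency criteria."""
    if e["resolution"] > max_res or e["mean_b"] > max_b:
        return False
    if e["missing_residues"] > max_missing:
        return False
    if cryo_only and e["temp_K"] > 150:
        return False
    return True

subset = [e["id"] for e in entries if high_quality(e)]
print(subset)   # ['1ABC', '3GHI']
```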

  5. Clustering with Instance and Attribute Level Side Information

    Directory of Open Access Journals (Sweden)

    Jinlong Wang

    2010-12-01

    Full Text Available Selecting a suitable proximity measure is one of the fundamental tasks in clustering. How to effectively utilize all available side information, including instance-level information in the form of pair-wise constraints and attribute-level information in the form of attribute order preferences, is an essential problem in metric learning. In this paper, we propose a learning framework in which both the pair-wise constraints and the attribute order preferences can be incorporated simultaneously. The theory behind it and the related parameter-adjusting technique are described in detail. Experimental results on benchmark data sets demonstrate the effectiveness of the proposed method.
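
    As a rough illustration of using instance-level side information, the sketch below derives per-attribute weights that pull must-link pairs together and push cannot-link pairs apart. It is a simple heuristic, not the learning framework proposed in the paper, and the attribute order preferences (which could enter as additional ordering constraints on the weights) are left out.

```python
import numpy as np

# Toy data: 4 points in 2-D, with instance-level side information.
X = np.array([[0.0, 0.0], [0.1, 5.0], [5.0, 0.2], [5.1, 5.1]])
must_link = [(0, 1)]      # should end up close under the learned metric
cannot_link = [(0, 2)]    # should end up far apart

def diagonal_metric(X, must_link, cannot_link, eps=1e-9):
    """Heuristic per-attribute weights: favour attributes on which
    must-link pairs agree and cannot-link pairs differ."""
    ml = np.array([(X[i] - X[j]) ** 2 for i, j in must_link]).mean(axis=0)
    cl = np.array([(X[i] - X[j]) ** 2 for i, j in cannot_link]).mean(axis=0)
    w = cl / (ml + eps)
    return w / w.sum()

w = diagonal_metric(X, must_link, cannot_link)

def dist(a, b):
    return np.sqrt(np.sum(w * (a - b) ** 2))

# Under the learned weights the must-link pair is close, the cannot-link pair far.
print(w, dist(X[0], X[1]), dist(X[0], X[2]))
```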

  6. 5W1H Information Extraction with CNN-Bidirectional LSTM

    Science.gov (United States)

    Nurdin, A.; Maulidevi, N. U.

    2018-03-01

    In this work, information about who did what, when, where, why, and how was extracted from Indonesian news articles by combining a Convolutional Neural Network with a Bidirectional Long Short-Term Memory network. A Convolutional Neural Network can learn semantically meaningful representations of sentences, and a Bidirectional LSTM can analyze the relations among words in a sequence. We also use word2vec word embeddings for word representation. By combining these algorithms, we obtained an F-measure of 0.808. Our experiments show that CNN-BLSTM outperforms shallower methods, namely IBk, C4.5, and Naïve Bayes, whose F-measures are 0.655, 0.645, and 0.595, respectively.
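
    A generic CNN-BiLSTM token tagger of this kind can be sketched in PyTorch as below; the layer sizes, the tag set, and the random inputs are assumptions, and the paper's word2vec embeddings and exact architecture are not reproduced.

```python
import torch
import torch.nn as nn

class CNNBiLSTMTagger(nn.Module):
    """Per-token tagger: embeddings -> 1-D convolution -> BiLSTM -> tag scores."""
    def __init__(self, vocab_size, n_tags, emb_dim=100, conv_channels=64,
                 kernel_size=3, lstm_hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, conv_channels, kernel_size,
                              padding=kernel_size // 2)
        self.lstm = nn.LSTM(conv_channels, lstm_hidden, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * lstm_hidden, n_tags)

    def forward(self, token_ids):                     # (batch, seq_len)
        x = self.emb(token_ids)                       # (batch, seq_len, emb_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, channels, seq_len)
        x, _ = self.lstm(x.transpose(1, 2))           # (batch, seq_len, 2*hidden)
        return self.out(x)                            # (batch, seq_len, n_tags)

# Smoke test with random token ids; tags could be {O, WHO, WHAT, WHEN, WHERE, WHY, HOW}.
model = CNNBiLSTMTagger(vocab_size=5000, n_tags=7)
scores = model(torch.randint(0, 5000, (2, 20)))
print(scores.shape)   # torch.Size([2, 20, 7])
```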

  7. Peculiarities of the High-Level Concrete-Encased Radwaste Repository Disposition at the Radwaste Disposal Site of the Russian Research Center 'Kurchatov Institute'

    International Nuclear Information System (INIS)

    Volkov, V.G.; Ponomarev-Stepnoi, N.N.; Gorodetsky, G.G.; Zverkov, Yu.A.; Ivanov, O.P.; Lemus, A.V.; Semenov, S.G.; Stepanov, V.E.; Chesnokov, A.V.; Shisha, A.D.

    2006-01-01

    The paper presents peculiarities of organization and performance of activities on disposition of the old repository that contained high-level waste and located at the radwaste disposal site of the Russian Research Center 'Kurchatov Institute' in Moscow. The repository was constructed in the late 1950's. A large number of cases with high-level waste were placed in the repository along with low- and intermediate-level waste. When the repository was filled in 1973, the entire radwaste mass was encased in concrete matrix which caused difficulties with the radwaste extraction and made the work on the repository disposition highly hazardous in terms of radiation conditions. Based on results of the preliminary radiation survey of the repository, technologies and equipment to be used in disposition works were selected, and a decision on construction of external radiation shielding around the repository to maintain normal radiation conditions during these works was made. Specific features of the selected radiation shielding design constructed around the repository and of a technology used for the radwaste extraction from the repository are provided. According to the technology, conventional construction machines equipped with a hydraulic hammer or a clamshell were used for destruction of the concrete-encased radwaste mass and extraction of low-level waste. Intermediate- and high-level waste was extracted by remotely controlled robots operating inside the radiation shielding structure. Video cameras and a gamma imager were used for detection of high-level waste or fragments of such radwaste in the mass concrete being destroyed and for guiding remotely controlled robots. Peculiarities of rapid control of changes in radiation conditions in the working areas are presented. This control was performed using a gamma locator with on-line transmission of its data to a PC for their processing. With disposition of this not easily accessible repository, the stage of remediation of old

  8. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics

  9. Respiratory Information Extraction from Electrocardiogram Signals

    KAUST Repository

    Amin, Gamal El Din Fathy

    2010-12-01

    The Electrocardiogram (ECG) is a tool measuring the electrical activity of the heart, and it is extensively used for diagnosis and monitoring of heart diseases. The ECG signal reflects not only the heart activity but also many other physiological processes. The respiratory activity is a prominent process that affects the ECG signal due to the close proximity of the heart and the lungs. In this thesis, several methods for the extraction of respiratory process information from the ECG signal are presented. These methods allow an estimation of the lung volume and the lung pressure from the ECG signal. The potential benefit of this is to eliminate the corresponding sensors used to measure the respiration activity. A reduction of the number of sensors connected to patients will increase patients’ comfort and reduce the costs associated with healthcare. As a further result, the efficiency of diagnosing respirational disorders will increase since the respiration activity can be monitored with a common, widely available method. The developed methods can also improve the detection of respirational disorders that occur while patients are sleeping. Such disorders are commonly diagnosed in sleeping laboratories where the patients are connected to a number of different sensors. Any reduction of these sensors will result in a more natural sleeping environment for the patients and hence a higher sensitivity of the diagnosis.
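
    One common ECG-derived-respiration idea, modulation of the R-peak amplitude by breathing, can be sketched as follows. This is offered only as an illustration of recovering a respiratory surrogate from the ECG, not as the specific methods of the thesis, and the synthetic signal and parameters are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250                                     # sampling rate, Hz (assumed)
t = np.arange(0, 30, 1 / fs)

# Synthetic ECG-like signal: sharp "R peaks" at ~1.2 Hz whose amplitude is
# modulated by a 0.25 Hz "respiratory" oscillation, plus a little noise.
beat_phase = (t * 1.2) % 1.0
r_wave = np.exp(-((beat_phase - 0.5) ** 2) / (2 * 0.01 ** 2))
respiration = 1.0 + 0.3 * np.sin(2 * np.pi * 0.25 * t)
ecg = r_wave * respiration + 0.02 * np.random.default_rng(0).normal(size=t.size)

# ECG-derived respiration: locate R peaks, take their amplitudes, and
# interpolate the amplitude series back onto the original time axis.
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.5 * fs))
edr = np.interp(t, t[peaks], ecg[peaks])

freqs = np.fft.rfftfreq(edr.size, d=1 / fs)
spectrum = np.abs(np.fft.rfft(edr - edr.mean()))
print("estimated respiratory rate: %.2f Hz" % freqs[1:][np.argmax(spectrum[1:])])  # ~0.25 Hz
```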

  10. High-Level waste process and product data annotated bibliography

    International Nuclear Information System (INIS)

    Stegen, G.E.

    1996-01-01

    The objective of this document is to provide information on available issued documents that will assist interested parties in finding available data on high-level waste and transuranic waste feed compositions, properties, behavior in candidate processing operations, and behavior on candidate product glasses made from those wastes. This initial compilation is only a partial list of available references

  11. Biosynthesis of spherical and highly stable gold nanoparticles using Ferulago Angulata aqueous extract: dual role of extract

    Science.gov (United States)

    Alizadeh, A.; Parsafar, S.; Khodaei, M. M.

    2017-03-01

    A biocompatible method for synthesizing highly dispersed gold nanoparticles using Ferulago Angulata leaf extract has been developed. It has been shown that the leaf extract acts as both reducing and coating agent. Various spectroscopic and electron microscopic techniques were employed for the structural characterization of the prepared nanoparticles. The biosynthesized particles were identified as elemental gold with spherical morphology, a narrow size distribution (ranging from 9.2 to 17.5 nm) and high stability. Also, the effect of the initial ratio of precursors, temperature and time of reaction on the size and morphology of the nanoparticles was studied in more detail. It was observed that varying these parameters provides accessible remote control over the size and morphology of the nanoparticles. The uniqueness of this procedure lies in its cleanliness, using no extra surfactant, reducing agent or any capping agent.

  12. Information extraction and knowledge graph construction from geoscience literature

    Science.gov (United States)

    Wang, Chengbin; Ma, Xiaogang; Chen, Jianguo; Chen, Jingwen

    2018-03-01

    Geoscience literature published online is an important part of open data, and brings both challenges and opportunities for data analysis. Compared with studies of numerical geoscience data, there are limited works on information extraction and knowledge discovery from textual geoscience data. This paper presents a workflow and a few empirical case studies for that topic, with a focus on documents written in Chinese. First, we set up a hybrid corpus combining the generic and geology terms from geology dictionaries to train Chinese word segmentation rules of the Conditional Random Fields model. Second, we used the word segmentation rules to parse documents into individual words, and removed the stop-words from the segmentation results to get a corpus constituted of content-words. Third, we used a statistical method to analyze the semantic links between content-words, and we selected the chord and bigram graphs to visualize the content-words and their links as nodes and edges in a knowledge graph, respectively. The resulting graph presents a clear overview of key information in an unstructured document. This study proves the usefulness of the designed workflow, and shows the potential of leveraging natural language processing and knowledge graph technologies for geoscience.
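
    The bigram co-occurrence step can be sketched with a few lines of Python, assuming the content words have already been produced upstream by the CRF segmentation and stop-word removal; the example sentences are English stand-ins, and the resulting weighted edges could then be handed to a graph library for chord or network visualization.

```python
from collections import Counter

# Content words per sentence, assumed to come from upstream segmentation
# and stop-word removal (English stand-ins for illustration).
sentences = [
    ["granite", "intrusion", "contact", "metamorphism"],
    ["granite", "intrusion", "copper", "mineralization"],
    ["copper", "mineralization", "fault", "zone"],
]

# Bigram graph: nodes are content words, edge weights are adjacency counts.
edges = Counter()
for words in sentences:
    for a, b in zip(words, words[1:]):
        edges[tuple(sorted((a, b)))] += 1

for (a, b), w in edges.most_common():
    print(f"{a} -- {b}  weight={w}")
```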

  13. Metaproteomics: extracting and mining proteome information to characterize metabolic activities in microbial communities.

    Science.gov (United States)

    Abraham, Paul E; Giannone, Richard J; Xiong, Weili; Hettich, Robert L

    2014-06-17

    Contemporary microbial ecology studies usually employ one or more "omics" approaches to investigate the structure and function of microbial communities. Among these, metaproteomics aims to characterize the metabolic activities of the microbial membership, providing a direct link between the genetic potential and functional metabolism. The successful deployment of metaproteomics research depends on the integration of high-quality experimental and bioinformatic techniques for uncovering the metabolic activities of a microbial community in a way that is complementary to other "meta-omic" approaches. The essential, quality-defining informatics steps in metaproteomics investigations are: (1) construction of the metagenome, (2) functional annotation of predicted protein-coding genes, (3) protein database searching, (4) protein inference, and (5) extraction of metabolic information. In this article, we provide an overview of current bioinformatic approaches and software implementations in metaproteome studies in order to highlight the key considerations needed for successful implementation of this powerful community-biology tool. Copyright © 2014 John Wiley & Sons, Inc.

  14. Determination of ppq-levels of alkylmethoxypyrazines in wine by stirbar sorptive extraction combined with multidimensional gas chromatography-mass spectrometry.

    Science.gov (United States)

    Wen, Yan; Ontañon, Ignacio; Ferreira, Vicente; Lopez, Ricardo

    2018-07-30

    Alkylmethoxypyrazines are powerful odorants in many food products. A new method for analysing 3-isopropyl-2-methoxypyrazine, 3-s-butyl-2-methoxypyrazine and 3-isobutyl-2-methoxypyrazine has been developed and applied to wine. The analytes were extracted from 5 mL of wine using stirbar sorptive extraction followed by thermal desorption and multidimensional gas chromatography-mass spectrometry analysis in a single oven. The extraction conditions were optimized in order to obtain a high recovery of the 3-alkyl-2-methoxypyrazines (MP). The detection limits of the method in all cases were under 0.08 ng/L, well below the olfactory thresholds of these compounds in wine. The reproducibility of the method was adequate (below 10%), the linearity satisfactory and the recoveries in all cases close to 100%. The method has been applied to the analysis of 111 Spanish and French wine samples. The levels found suggest that MP have a low direct impact on the aroma properties of wines from the regions around the Pyrenean massif. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Effects of P-Zn interaction and lime on plant growth in the presence of high levels of extractable zinc

    Energy Technology Data Exchange (ETDEWEB)

    Koukoulakis, P

    1973-01-01

    Six glasshouse experiments were conducted in order to study (a) the effect of P and lime on dry matter yield and mineral composition of tomato, cotton, maize and sudan grass grown on a Zn polluted soil (containing 170 ppM of 2.5% acetic acid extractable Zn), (b) the effect of residual P on dry matter yield and mineral composition of beans, lettuce, and maize grown on a similar soil, and (c) the effect of various Zn treatments on the availability of indigenous and added P of a soil low in Zn (11 ppM). It was found that the yield response to applied P of maize and sudan grass was independent of lime, while cotton, tomato and beans failed almost completely to respond to the absence of lime. The crops responded differently to the excess soil Zn and the dry matter yields were related to the ability to accumulate Zn. High Zn accumulator plants failed to respond to applied P in the absence of lime, while low Zn accumulating plants responded positively. The positive and highly significant effect of P on total Zn uptake of plants, masked the depressive effect of P on Zn concentration. However, the results indicated that the P-Zn interrelationship is far more complicated than a dilution effect caused by the promotive effect of applied P. Studies of the effect of applied Zn levels on available soil P and conversely, indicated that a strong mutual fixation, probably coprecipitation takes place in the soil, which may account for a considerable part of the depressive effect of P on plant Zn, in addition to the effects like coprecipitation in roots and dilution, reported in the literature. Finally, the residual effect of P varied with the plant species, and the plant Zn concentration was found to be a determinant factor in controlling dry matter yields. 58 references, 13 figures, 24 tables.

  16. Information extraction from FN plots of tungsten microemitters

    Energy Technology Data Exchange (ETDEWEB)

    Mussa, Khalil O. [Department of Physics, Mu' tah University, Al-Karak (Jordan); Mousa, Marwan S., E-mail: mmousa@mutah.edu.jo [Department of Physics, Mu' tah University, Al-Karak (Jordan); Fischer, Andreas, E-mail: andreas.fischer@physik.tu-chemnitz.de [Institut für Physik, Technische Universität Chemnitz, Chemnitz (Germany)

    2013-09-15

    Tungsten based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied ranging from 25 to 500 nm. These tips were manufactured by electrochemical etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of two molar NaOH solution. Composite micro-emitters considered here are consisting of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worthwhile noting here, that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10 −8 mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphorus screen (the anode). Mechanical characterization has been performed through a FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in
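
    The conventional Fowler-Nordheim plot analysis referred to above amounts to fitting a straight line to ln(I/V²) versus 1/V. The sketch below does this for synthetic data obeying the elementary FN form I = a·V²·exp(-b/V); the constants and noise are assumptions, and the refinements discussed in the paper (and the extraction of apex radii) are not reproduced.

```python
import numpy as np

# Synthetic field-emission data obeying the elementary FN form
# I = a * V**2 * exp(-b / V), with assumed constants a and b.
a_true, b_true = 1e-9, 4000.0
V = np.linspace(800.0, 2000.0, 25)
rng = np.random.default_rng(1)
I = a_true * V**2 * np.exp(-b_true / V) * rng.lognormal(0.0, 0.02, V.size)

# FN plot: ln(I/V^2) versus 1/V should be a straight line with slope -b
# and intercept ln(a).
x = 1.0 / V
y = np.log(I / V**2)
slope, intercept = np.polyfit(x, y, 1)

print(f"fitted slope = {slope:.1f}  (expected about {-b_true:.1f})")
print(f"fitted prefactor a = {np.exp(intercept):.2e}  (expected {a_true:.1e})")
```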

  17. Information extraction from FN plots of tungsten microemitters

    International Nuclear Information System (INIS)

    Mussa, Khalil O.; Mousa, Marwan S.; Fischer, Andreas

    2013-01-01

    Tungsten based microemitter tips have been prepared both clean and coated with dielectric materials. For clean tungsten tips, apex radii have been varied ranging from 25 to 500 nm. These tips were manufactured by electrochemical etching a 0.1 mm diameter high purity (99.95%) tungsten wire at the meniscus of two molar NaOH solution. Composite micro-emitters considered here are consisting of a tungsten core coated with different dielectric materials—such as magnesium oxide (MgO), sodium hydroxide (NaOH), tetracyanoethylene (TCNE), and zinc oxide (ZnO). It is worthwhile noting here, that the rather unconventional NaOH coating has shown several interesting properties. Various properties of these emitters were measured including current–voltage (IV) characteristics and the physical shape of the tips. A conventional field emission microscope (FEM) with a tip (cathode)–screen (anode) separation standardized at 10 mm was used to electrically characterize the electron emitters. The system was evacuated down to a base pressure of ∼10 −8 mbar when baked at up to ∼180°C overnight. This allowed measurements of typical field electron emission (FE) characteristics, namely the IV characteristics and the emission images on a conductive phosphorus screen (the anode). Mechanical characterization has been performed through a FEI scanning electron microscope (SEM). Within this work, the mentioned experimental results are connected to the theory for analyzing Fowler–Nordheim (FN) plots. We compared and evaluated the data extracted from clean tungsten tips of different radii and determined deviations between the results of different extraction methods applied. In particular, we derived the apex radii of several clean and coated tungsten tips by both SEM imaging and analyzing FN plots. The aim of this analysis is to support the ongoing discussion on recently developed improvements of the theory for analyzing FN plots related to metal field electron emitters, which in

  18. Extraction of neutron spectral information from Bonner-Sphere data

    CERN Document Server

    Haney, J H; Zaidins, C S

    1999-01-01

    We have extended a least-squares method of extracting neutron spectral information from Bonner-Sphere data which was previously developed by Zaidins et al. (Med. Phys. 5 (1978) 42). A pulse-height analysis with background stripping is employed which provided a more accurate count rate for each sphere. Newer response curves by Mares and Schraube (Nucl. Instr. and Meth. A 366 (1994) 461) were included for the moderating spheres and the bare detector which comprise the Bonner spectrometer system. Finally, the neutron energy spectrum of interest was divided using the philosophy of fuzzy logic into three trapezoidal regimes corresponding to slow, moderate, and fast neutrons. Spectral data was taken using a PuBe source in two different environments and the analyzed data is presented for these cases as slow, moderate, and fast neutron fluences. (author)
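
    The least-squares unfolding idea with a coarse slow/moderate/fast grouping can be sketched with non-negative least squares; the response matrix and count rates below are made up for illustration and are not the Mares and Schraube response data.

```python
import numpy as np
from scipy.optimize import nnls

# Assumed response matrix: rows = spheres (bare detector + 3 moderator sizes),
# columns = coarse energy groups (slow, moderate, fast); counts per unit fluence.
R = np.array([
    [0.80, 0.10, 0.02],   # bare detector
    [0.40, 0.50, 0.15],   # small sphere
    [0.15, 0.55, 0.45],   # medium sphere
    [0.05, 0.30, 0.60],   # large sphere
])

true_fluence = np.array([2.0, 5.0, 3.0])            # hypothetical group fluences
rates = R @ true_fluence
rates *= np.random.default_rng(2).normal(1.0, 0.02, rates.size)  # counting noise

fluence, residual = nnls(R, rates)                  # non-negative least squares
print("unfolded slow/moderate/fast fluences:", np.round(fluence, 2))
```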

  19. A fast, simple and green method for the extraction of carbamate pesticides from rice by microwave assisted steam extraction coupled with solid phase extraction.

    Science.gov (United States)

    Song, Weitao; Zhang, Yiqun; Li, Guijie; Chen, Haiyan; Wang, Hui; Zhao, Qi; He, Dong; Zhao, Chun; Ding, Lan

    2014-01-15

    This paper presents a fast, simple and green sample pretreatment method for the extraction of 8 carbamate pesticides in rice. The carbamate pesticides were extracted by a microwave assisted water steam extraction method, and the extract obtained was immediately applied to a C18 solid phase extraction cartridge for clean-up and concentration. The eluate containing the target compounds was finally analysed by high performance liquid chromatography with mass spectrometry. The parameters affecting extraction efficiency were investigated and optimised. Limits of detection ranging from 1.1 to 4.2 ng g(-1) were obtained. The recoveries of the 8 carbamate pesticides ranged from 66% to 117% at three spiked levels, and the inter- and intra-day relative standard deviation values were less than 9.1%. Compared with traditional methods, the proposed method required less extraction time and organic solvent. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Diluent effects in solvent extraction. The Effects of Diluents in Solvent Extraction - a literature study

    International Nuclear Information System (INIS)

    Loefstroem-Engdahl, Elin; Aneheim, Emma; Ekberg, Christian; Foreman, Mark; Skarnemark, Gunnar

    2010-01-01

    The fact that the choice of organic diluent is important for a solvent extraction process goes without saying. Several factors, such as price, flash point, viscosity, polarity etc., each have their place in the planning of a solvent extraction system. This high number of variables makes the lack of compilations concerning diluent effects an interesting topic. Often the research interest concerning a specific extraction system focuses on the extractant used and the complexes formed during an extraction. The diluents used are often classical ones, even though it has been shown that the choice of diluent can affect extraction as well as separation in an extraction system. An attempt to point out important steps in the understanding of diluent effects in solvent extraction is presented here. This large field is, of course, not summarized in this article, but an attempt is made to present the important steps in the understanding of diluent effects in solvent extraction. In trying to make the information concerning diluent effects and applications more easily accessible, this review offers a selective summary of the literature concerning diluent effects in solvent extraction. (authors)

  1. Developing an Approach to Prioritize River Restoration using Data Extracted from Flood Risk Information System Databases.

    Science.gov (United States)

    Vimal, S.; Tarboton, D. G.; Band, L. E.; Duncan, J. M.; Lovette, J. P.; Corzo, G.; Miles, B.

    2015-12-01

    Prioritizing river restoration requires information on river geometry. In many states in the US, detailed river geometry has been collected for floodplain mapping and is available in Flood Risk Information Systems (FRIS). In particular, North Carolina has, for its 100 counties, developed a database of numerous HEC-RAS models which are available through its Flood Risk Information System (FRIS). These models, which include over 260 variables, were developed and updated by numerous contractors. They contain detailed surveyed or LiDAR-derived cross-sections and modeled flood extents for different extreme event return periods. In this work, data from over 4700 HEC-RAS models were integrated and upscaled to utilize detailed cross-section information and 100-year modelled flood extent information, enabling river restoration prioritization for the entire state of North Carolina. We developed procedures to extract geomorphic properties such as entrenchment ratio, incision ratio, etc. from these models. Entrenchment ratio quantifies the vertical containment of rivers and thereby their vulnerability to flooding, and incision ratio quantifies the depth per unit width. A map of entrenchment ratio for the whole state was derived by linking these model results to a geodatabase. A ranking of highly entrenched counties, enabling prioritization for flood allowance and mitigation, was obtained. The results were shared through HydroShare, and web maps were developed for their visualization using the Google Maps Engine API.
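
    A small sketch of turning per-cross-section quantities into the two ratios follows. The field names, the Rosgen-style definition of entrenchment ratio (flood-prone width over bankfull width), the depth-per-width incision ratio, and the 1.4 threshold are assumptions consistent with the description above, not the project's exact formulas.

```python
# Hypothetical per-cross-section values that could be pulled from HEC-RAS output.
cross_sections = [
    {"id": "XS-1", "bankfull_width_ft": 120.0, "bankfull_depth_ft": 6.0,
     "flood_width_100yr_ft": 900.0},
    {"id": "XS-2", "bankfull_width_ft": 80.0, "bankfull_depth_ft": 9.0,
     "flood_width_100yr_ft": 110.0},
]

def ratios(xs):
    entrenchment = xs["flood_width_100yr_ft"] / xs["bankfull_width_ft"]
    incision = xs["bankfull_depth_ft"] / xs["bankfull_width_ft"]
    return entrenchment, incision

for xs in cross_sections:
    e, i = ratios(xs)
    entrenched = e < 1.4   # low entrenchment ratio -> vertically contained reach
    print(f"{xs['id']}: entrenchment={e:.2f} incision={i:.3f} entrenched={entrenched}")
```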

  2. High resolution reservoir geological modelling using outcrop information

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Changmin; Lin Kexiang; Liu Huaibo [Jianghan Petroleum Institute, Hubei (China)] [and others

    1997-08-01

    This is China's first case study of high resolution reservoir geological modelling using outcrop information. The key of the modelling process is to build a prototype model and to use the model as a geological knowledge bank. Outcrop information used in geological modelling includes seven aspects: (1) Determining the reservoir framework pattern by sedimentary depositional system and facies analysis; (2) Horizontal correlation based on the lower and higher stand duration of the paleo-lake level; (3) Determining the model's direction based on the paleocurrent statistics; (4) Estimating the sandbody communication by photomosaic and profiles; (6) Estimating reservoir properties distribution within sandbody by lithofacies analysis; and (7) Building the reservoir model in sandbody scale by architectural element analysis and 3-D sampling. A high resolution reservoir geological model of Youshashan oil field has been built by using this method.

  3. Effects of Urtica dioica extract on lipid profile in hypercholesterolemic rats.

    Science.gov (United States)

    Nassiri-Asl, Marjan; Zamansoltani, Farzaneh; Abbasi, Esmail; Daneshi, Mohammad-Mehdi; Zangivand, Amir-Abdollah

    2009-05-01

    To investigate the effects of an extract of Urtica dioica, a perennial herb in Iran, on the lipid profile in hypercholesterolemic rats, the extract was tested as a supplement in a high-cholesterol diet. Male rats were fed a high-cholesterol diet (10 mL/kg) for 4 weeks with Urtica dioica extract (100 or 300 mg/kg) or 10 mg/kg lovastatin supplementation to study the hypocholesterolemic effects of Urtica dioica on plasma lipid levels, hepatic enzyme activities, and liver histopathological changes. Urtica dioica extract at 100 and 300 mg/kg significantly reduced the levels of total cholesterol (TC) and low-density lipoprotein cholesterol (LDL-C) and also markedly decreased liver enzyme levels and liver weight in animals on the high-cholesterol diet. Hematoxylin and eosin staining showed that in the 100 mg/kg Urtica dioica extract group, the appearance of the liver cells was similar to the control group, and steatosis and inflammation were not found. In the 300 mg/kg extract group, mild steatosis was observed but mononuclear inflammatory infiltration was not found. The hepatic histopathological results reflect the correlation of Urtica dioica extract with both liver weight and the plasma levels of TC and LDL-C. These results indicate that Urtica dioica extract has hypocholesterolemic effects in this animal model.

  4. Combining extractant systems for the simultaneous extraction of transuranic elements and selected fission products

    International Nuclear Information System (INIS)

    Horwitz, E.P.

    1993-01-01

    The popularity of solvent extraction (SX) stems from its ability to operate in a continuous mode, to achieve high throughputs and high decontamination factors of product streams, and to utilize relatively small quantities of very selective chemical compounds as metal ion complexants. The chemical pretreatment of nuclear waste for the purpose of waste minimization will probably utilize one or more SX processes. Because of the diversity and complexity of nuclear waste, perhaps the greatest difficulty for the separation chemist is to develop processes that remove not only actinides but also selected fission products in a single process. A stand-alone acid-side SX process (TRUEX) for removal of uranium and transuranic elements (Np, Pu, Am) from nuclear waste has been widely reported. Recently, an acid-side SX process (SREX) to extract and recover 90Sr from high-level nuclear waste has also been reported. Both the TRUEX and SREX processes extract Tc to a significant extent, although not as efficiently as they extract the transuranics and Sr. Ideally, one would like a process that can extract and recover all actinides as well as 99Tc, 90Sr, and 137Cs. A possible solution to multielement extraction is to mix two extractants with totally different properties into a single process solvent formulation. For this approach to be successful, both extractants must be of essentially the same type: neutral, liquid cationic, or liquid anionic. Experimental work on mixed TRUEX and SREX processes, carried out with synthetic waste, demonstrates that the combined solvent formulation is effective at extracting the actinides and Tc as well as Sr. There is no evidence for either synergistic or antagonistic effects between the two extractants. This demonstrates the feasibility of at least part of a combined solvent extraction scheme.

  5. Stable Isolation of Phycocyanin from Spirulina platensis Associated with High-Pressure Extraction Process

    Directory of Open Access Journals (Sweden)

    Kyung-Hwan Jung

    2013-01-01

    A method for stably purifying a functional dye, phycocyanin, from Spirulina platensis was developed using a hexane extraction process combined with high pressure. This was necessary because this dye is known to be very unstable during normal extraction processes. The purification yield of this method was estimated at 10.2%, which is 3%–5% higher than that of a conventional separation method using phosphate buffer. The phycocyanin isolated by this process also showed the highest purity, 0.909, based on absorbances of 2.104 at 280 nm and 1.912 at 620 nm. Two subunits of phycocyanin, namely α-phycocyanin (18.4 kDa) and β-phycocyanin (21.3 kDa), were found to remain intact after extraction, based on SDS-PAGE analysis, clearly demonstrating that this process can stably extract phycocyanin without being affected by extraction solvent, temperature, etc. The stability of the extracted phycocyanin was also confirmed by its DPPH (α,α-diphenyl-β-picrylhydrazyl) scavenging activity, showing 83% removal of oxygen free radicals. This activity was about 15% higher than that of a commercially available phycocyanin standard, which implies that the combined extraction method can yield relatively intact chromoprotein with little degradation. These results were achieved because the low-temperature, high-pressure extraction effectively disrupted the cell membrane of Spirulina platensis, degraded the polypeptide subunits of phycocyanin (a temperature/pH-sensitive chromoprotein) less, and increased the extraction yield.
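
    The quoted purity value is consistent with the absorbance ratio conventionally used as a phycocyanin purity index; a quick check, assuming that convention, is:

```latex
% Phycocyanin purity index, assuming the conventional A620/A280 ratio
\[
  P = \frac{A_{620}}{A_{280}} = \frac{1.912}{2.104} \approx 0.909
\]
```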

  6. Orthodontic Extraction of High-Risk Impacted Mandibular Third Molars in Close Proximity to the Mandibular Canal: A Systematic Review.

    Science.gov (United States)

    Kalantar Motamedi, Mahmood Reza; Heidarpour, Majid; Siadat, Sara; Kalantar Motamedi, Alimohammad; Bahreman, Ali Akbar

    2015-09-01

    Extraction of mandibular third molars (M3s) in close proximity to the mandibular canal has some inherent risks to adjacent structures, such as neurologic damage to teeth, bone defects distal to the mandibular second molar (M2), or pathologic fractures in association with enlarged dentigerous cysts. The procedure for extrusion and subsequent extraction of high-risk M3s is called orthodontic extraction. This is a systematic review of the available approaches for orthodontic extraction of impacted mandibular M3s in close proximity to the mandibular canal and their outcomes. The PubMed, Scopus, Cochrane Central Register of Controlled Trials (CENTRAL), DOAJ, Google Scholar, OpenGrey, Iranian Science Information Database (SID), Iranmedex, and Irandoc databases were searched using specific keywords up to June 2, 2014. Studies were evaluated based on predetermined eligibility criteria, treatment approaches, and their outcomes. Thirteen articles met the inclusion criteria. A total of 123 impacted teeth were extracted by orthodontic extraction and 2 cases were complicated by transient paresthesia. Three types of biomechanical approaches were used: 1) using the posterior maxillary region as the anchor for orthodontic extrusion of lower M3s, 2) simple cantilever springs attached to the M3 buttonhole, and 3) cantilever springs tied to a bonded orthodontic bracket on the M3 plus multiple-loop spring wire for distal movement of the M3. Osteo-periodontal status of M2s also improved uneventfully. Despite the drawbacks of orthodontic extraction, removal of deeply impacted M3s using the described techniques is safe with regard to mandibular nerve injury and neurologic damage. Orthodontic extraction is recommended for extraction of impacted M3s that present a high risk of postoperative osteo-periodontal defects on the distal surface of the adjacent M2 and those associated with dentigerous cysts. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by

  7. A study of inoculation route and dosage levels on embryonated chicken eggs as media for testing tea mistletoe (Scurrula oortiana) extract activity

    Directory of Open Access Journals (Sweden)

    Sri Murtini

    2006-06-01

    Tea mistletoe (Scurrula oortiana) extract has cytotoxic activity and is potentially useful in preventing virally induced chicken tumors. The following study was designed to evaluate the effects of different inoculation routes, dosage levels, and strains of embryonated chicken eggs as media for testing the antiviral activity of tea mistletoe extract. The proper inoculation route was examined by inoculating the extract at a dose of 0.2 mg/egg into embryonated layer eggs via the allantoic cavity, the chorio-allantoic membrane, or the yolk sac. The effect of the dose level of tea mistletoe extract on embryo development was examined in groups of embryonated broiler eggs inoculated with the extract at 0.02, 0.2, 2, 20, or 200 mg/egg. Inoculation of tea mistletoe extract into the allantoic cavity was the safest procedure, as indicated by the absence of embryo mortality and faster embryo growth compared with the chorio-allantoic membrane and yolk sac routes. The extract induced different growth effects when inoculated into embryonated layer or broiler eggs. Administration of the extract at dose levels between 0.02 and 200 mg/egg significantly reduced the weight of broiler embryos, but not the relative weights of the liver, heart, and spleen. Administration of similar dosages in layer embryos did not cause any significant difference in embryo weight. This study suggests that antiviral activity of tea mistletoe extract in embryonated chicken eggs should be assessed in eggs of layer breeds, with the extract inoculated via the allantoic cavity.

  8. RF extraction issues in the relativistic klystron amplifiers

    Science.gov (United States)

    Serlin, Victor; Friedman, Moshe; Lampe, Martin; Hubbard, Richard F.

    1994-05-01

    Relativistic klystron amplifiers (RKAs) were successfully operated at NRL in several frequency regimes and power levels. In particular, an L-band RKA was optimized for high- power rf extraction into the atmosphere and an S-band RKA was operated, both in a two-beam and a single-beam configuration. At L-band the rf extraction at maximum power levels (>= 15 GW) was hindered by pulse shortening and poor repeatability. Preliminary investigation showed electron emission in the radiating horn, due to very high voltages associated with the multi-gigawatt rf power levels. This electron current constituted an electric load in parallel with the radiating antenna, and precipitated the rf pulse collapse. At S-band the peak extracted power reached 1.7 GW with power efficiency approximately 50%. However, pulse shortening limited the duration to approximately 50 nanoseconds. The new triaxial RKA promises to solve many of the existing problems.

  9. Separation of Molybdenum from Acidic High-Phosphorus Tungsten Solution by Solvent Extraction

    Science.gov (United States)

    Li, Yongli; Zhao, Zhongwei

    2017-10-01

    A solvent-extraction process for deep separation of molybdenum from an acidic high-phosphate tungsten solution was developed using tributyl phosphate (TBP) as the extractant and hydrogen peroxide (H2O2) as a complexing agent. The common aqueous complexes of tungsten and molybdenum (PMoxW12-xO40^(3-), x = 0-12) are depolymerized to {PO4[Mo(O)2(O-O)]4}^(3-) and {PO4[W(O)2(O-O)]4}^(3-) by H2O2. The former can be preferentially extracted by TBP. The extractant concentration, phase contact time, H2O2 dosage, and H2SO4 concentration were optimized. By employing 80% by volume TBP, O:A = 1:1, 1.0 mol/L H2SO4, 1.0 mol/L H3PO4, a contact time of 2 min, and a molar ratio of H2O2/(W + Mo) equal to 1.5, 60.2% molybdenum was extracted in a single stage, while limiting tungsten co-extraction to 3.2%. An extraction isotherm indicated that the raffinate could be reduced to <0.1 g/L Mo in six stages of continuous counter-current extraction.
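
    As a rough check on the selectivity implied by these figures, single-stage distribution ratios and a Mo/W separation factor can be back-calculated from the quoted extraction percentages, assuming the standard relation between percent extraction and distribution ratio at the stated O:A = 1:1 phase ratio (an inference, not a value reported in the record):

```latex
% Single-stage distribution ratios and separation factor implied by the
% quoted extraction percentages (assuming equal phase volumes, O:A = 1:1)
\[
  E = \frac{D}{D + V_{aq}/V_{org}} \;\Rightarrow\; D = \frac{E}{1 - E}
  \quad (V_{aq} = V_{org})
\]
\[
  D_{\mathrm{Mo}} = \frac{0.602}{0.398} \approx 1.5, \qquad
  D_{\mathrm{W}} = \frac{0.032}{0.968} \approx 0.033, \qquad
  \beta_{\mathrm{Mo/W}} = \frac{D_{\mathrm{Mo}}}{D_{\mathrm{W}}} \approx 46
\]
```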

  10. Discovering highly informative feature set over high dimensions

    KAUST Repository

    Zhang, Chongsheng; Masseglia, Florent; Zhang, Xiangliang

    2012-01-01

    For many textual collections, the number of features is overly large, and these features can be very redundant. It is therefore desirable to have a small, succinct, yet highly informative collection of features that describes the key characteristics of a dataset, and information theory is one tool for obtaining such a feature collection. This paper mainly contributes to improving the efficiency of selecting the most informative feature set over high-dimensional unlabeled data. We propose a heuristic theory for informative feature set selection from high-dimensional data. Moreover, we design data structures that enable us to compute the entropies of candidate feature sets efficiently, and we develop a simple pruning strategy that eliminates hopeless candidates at each forward selection step. We test our method through experiments on real-world data sets, showing that our proposal is very efficient. © 2012 IEEE.
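
    A minimal sketch of the general idea, greedy forward selection of the feature subset with the highest joint entropy over unlabeled binary data, is shown below. It illustrates the entropy criterion only; it is not the paper's specific data structures or pruning bounds.

```python
# Illustrative sketch (not the authors' algorithm): greedy forward selection
# of a small feature set that maximizes joint entropy over unlabeled binary data.
from collections import Counter
from math import log2
import numpy as np

def joint_entropy(X: np.ndarray, features: list) -> float:
    """Shannon entropy (bits) of the joint distribution of the chosen columns."""
    counts = Counter(map(tuple, X[:, features]))
    n = X.shape[0]
    return -sum((c / n) * log2(c / n) for c in counts.values())

def select_features(X: np.ndarray, k: int) -> list:
    """Greedily add the feature that most increases joint entropy."""
    selected = []
    remaining = set(range(X.shape[1]))
    for _ in range(k):
        best = max(remaining, key=lambda f: joint_entropy(X, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(200, 12))     # toy binary document-feature matrix
    print(select_features(X, k=3))
```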

  12. Isolation of transplutonium elements from high-level radioactive wastes using diphenyl(dibutylcarbamoylmethyl)phosphine oxide

    International Nuclear Information System (INIS)

    Chmutova, M.K.; Litvina, M.N.; Pribylova, G.A.; Ivanova, L.A.; Myasoedov, B.F.; Smirnov, I.V.; Shadrin, A.Yu.

    1999-01-01

    The successive stages in developing a principal technological scheme for the extractive separation of transplutonium elements from the high-level radioactive wastes of spent fuel reprocessing are presented. The approach to selecting a reagent from the series of carbamoylmethylphosphine oxides is substantiated. The distribution of transplutonium elements and accompanying elements between a model solution of high-level radioactive waste and a solution of the reagent in an organic solvent is investigated. Methods for the separation of transplutonium elements and for their re-extraction together with rare earth elements are developed. A principal technological scheme for separating transplutonium elements from non-evaporated raffinates of WWER-type spent fuel is proposed, together with a method for separating transplutonium and rare earth elements in the weakly acidic re-extract using liquid chromatography with a free stationary phase.

  13. Fully Convolutional Network Based Shadow Extraction from GF-2 Imagery

    Science.gov (United States)

    Li, Z.; Cai, G.; Ren, H.

    2018-04-01

    High spatial resolution satellite images, especially of urban areas, contain many shadows. Although shadows severely affect the extraction of land cover or land use information, they provide auxiliary information for building extraction, which is hard to achieve with satisfactory accuracy through image classification alone. This paper focuses on building shadow extraction, using a fully convolutional network trained on samples collected from GF-2 satellite imagery over the urban region of Changchun city. By means of spatial filtering and the calculation of adjacency relationships along the sunlight direction, small patches caused by vegetation or bridges were eliminated from the preliminarily extracted shadows, and the building shadows were then separated; a minimal sketch of this kind of post-processing is given below. The building shadow information extracted by the proposed method was compared with results from traditional object-oriented supervised classification algorithms, showing that the deep learning approach can improve accuracy to a large extent.
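
    The paper's exact post-processing rules are not given in the abstract; the sketch below shows one plausible version of the two steps it names, removing small connected components and keeping only shadow patches adjacent to candidate building pixels along the sunlight direction, using scipy. The size threshold, shift values, and masks are illustrative.

```python
# Hypothetical post-processing sketch: clean a binary shadow mask and keep
# patches that touch candidate building pixels along the sun direction.
import numpy as np
from scipy import ndimage

def clean_shadow_mask(shadow: np.ndarray, min_pixels: int = 50) -> np.ndarray:
    """Remove small connected components (e.g. tree or bridge shadows)."""
    labels, n = ndimage.label(shadow)
    sizes = ndimage.sum(shadow, labels, index=np.arange(1, n + 1))
    return np.isin(labels, np.where(sizes >= min_pixels)[0] + 1)

def building_shadows(shadow: np.ndarray, buildings: np.ndarray,
                     sun_dx: int, sun_dy: int) -> np.ndarray:
    """Keep shadow components whose up-sun neighbour pixels contain buildings."""
    shifted = np.roll(buildings, shift=(sun_dy, sun_dx), axis=(0, 1))
    labels, n = ndimage.label(shadow)
    hits = ndimage.sum(shifted, labels, index=np.arange(1, n + 1))
    return np.isin(labels, np.where(hits > 0)[0] + 1)

if __name__ == "__main__":
    shadow = np.zeros((100, 100), dtype=bool)
    shadow[40:60, 30:50] = True                       # toy shadow patch
    buildings = np.zeros_like(shadow)
    buildings[40:60, 50:70] = True                    # toy building mask
    mask = clean_shadow_mask(shadow)
    print(building_shadows(mask, buildings, sun_dx=-5, sun_dy=0).sum())
```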

  14. Study of Foeniculum vulgare (Fennel) Seed Extract Effects on Serum Level of Oxidative Stress

    Directory of Open Access Journals (Sweden)

    Sadeghpour Nahid

    2015-04-01

    Objective: Foeniculum vulgare (FVE), known as fennel, has a long history of herbal use as both food and medicine. The seed of this plant has been used to promote menstruation, alleviate the symptoms of the female climacteric, and increase the number of ovarian follicles. The aim of this study was to evaluate the effects of fennel extract on the serum level of oxidative stress in female mice. Materials and Methods: In total, 28 virgin female albino mice were divided into four groups (n = 7). Groups 1 and 2 (experimental groups) were administered FVE at concentrations of 100 and 200 mg/kg, respectively, for 5 days, intraperitoneally. Group 3 (negative control) received ethanol and Group 4 (positive control) received normal saline. Animals were sacrificed on the 6th day, sera were collected, and the level of oxidative stress was determined using a total antioxidant status kit. Results: Data analysis revealed a significant difference in the mean level of serum oxidative stress between the four groups; the P value for the experimental groups compared with the control group was P < 0.0001. Conclusion: Fennel extract can decrease the serum level of oxidative factors in female mice and can be introduced as a novel medicine for the treatment of infertility.

  15. Solvent extraction of uranium from high acid leach solution

    International Nuclear Information System (INIS)

    Ramadevi, G.; Sreenivas, T.; Navale, A.S.; Padmanabhan, N.P.H.

    2010-01-01

    A significant part of the total uranium reserves all over the world is contributed by refractory uranium minerals. The refractory oxides are highly stable and inert to attack by most of the commonly used acids under normal conditions of acid strength, pressure and temperature. Quantitative dissolution of uranium from such ores containing refractory uranium minerals requires drastic operating conditions during chemical leaching like high acid strength, elevated pressures and temperatures. The leach liquors produced under these conditions normally have high free acidity, which affects the downstream operations like ion exchange and solvent extraction

  16. Essential oils (EOs), pressurized liquid extracts (PLE) and carbon dioxide supercritical fluid extracts (SFE-CO2) from Algerian Thymus munbyanus as valuable sources of antioxidants to be used on an industrial level.

    Science.gov (United States)

    Bendif, Hamdi; Adouni, Khaoula; Miara, Mohamed Djamel; Baranauskienė, Renata; Kraujalis, Paulius; Venskutonis, Petras Rimantas; Nabavi, Seyed Mohammad; Maggi, Filippo

    2018-09-15

    The aim of this study was to demonstrate the potential of extracts from Algerian Thymus munbyanus as a valuable source of antioxidants for use on an industrial level. To this end, a study was conducted on the composition and antioxidant activities of essential oils (EOs), pressurized liquid extracts (PLE) and supercritical fluid extracts (SFE-CO2) obtained from Thymus munbyanus subsp. coloratus (TMC) and subsp. munbyanus (TMM). EOs and SFE-CO2 extracts were analysed by GC-FID and GC×GC-TOFMS, revealing significant differences. A successive extraction of the solid SFE-CO2 residue by PLE with solvents of increasing polarity (acetone, ethanol and water) was carried out. The extracts were evaluated for total phenolic content by the Folin-Ciocalteu assay, while the antioxidant power was assessed by DPPH, FRAP, and ORAC assays. SFE-CO2 extracts were also analysed for their tocopherol content. The antioxidant activity of the PLE extracts was found to be higher than that of the SFE-CO2 extracts, and it increased with solvent polarity (water > ethanol > acetone). Overall, these results support the use of T. munbyanus as a valuable source of substances to be used on an industrial level as preservative agents. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Triangle network motifs predict complexes by complementing high-error interactomes with structural information

    Directory of Open Access Journals (Sweden)

    Labudde Dirk

    2009-06-01

    Background: Many high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often low overlap between PPINs produced by different studies. Second-level neighbours separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second-level neighbours in PPINs and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. Results: We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from the Munich Information center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than second-level neighbours in PPINs without SDDIs. The biological interpretation of the triangles is that an SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Conclusion: Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbours in PPINs. We estimate that
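
    A minimal sketch of the triangle idea, under the reading given above (two proteins linked by an SDDI that share a common PPI partner form a PPI-SDDI-PPI triangle), is shown below; the edge lists and protein names are toy placeholders, not data from the study.

```python
# Illustrative sketch: enumerate PPI-SDDI-PPI triangles, i.e. protein pairs
# (a, b) linked by a structural domain-domain interaction (SDDI) that share
# at least one common PPI partner c.
from collections import defaultdict

ppi_edges = [("A", "C"), ("B", "C"), ("B", "D")]   # toy PPIN
sddi_edges = [("A", "B"), ("B", "E")]              # toy SDDIs

def ppi_neighbours(edges):
    nbrs = defaultdict(set)
    for u, v in edges:
        nbrs[u].add(v)
        nbrs[v].add(u)
    return nbrs

def triangles(ppi, sddi):
    """Yield (a, c, b) where a-c and b-c are PPIs and a-b is an SDDI."""
    nbrs = ppi_neighbours(ppi)
    for a, b in sddi:
        for c in nbrs[a] & nbrs[b]:
            yield (a, c, b)

if __name__ == "__main__":
    print(list(triangles(ppi_edges, sddi_edges)))
    # [('A', 'C', 'B')] -- the B-E SDDI has no common PPI partner, so no triangle
```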

  18. Acceleration of Extractability from Seaweed, Kombu, by Irradiation

    International Nuclear Information System (INIS)

    Suzuki, T.; Yoshie, Y.; Shirai, T.; Hirano, T.

    1993-01-01

    Stock made from the brown alga kombu is used in Japanese dishes. Makombu (Laminaria japonica) cut into 1 × 1 cm squares was irradiated at dose levels of 1, 10, and 100 kGy at moisture contents of 10 or 40%, and then boiled for up to 30 min. Nitrogen, glutamic acid, and sugars in the kombu extracts were determined. When kombu was irradiated with doses of 10 and 100 kGy at 10% moisture, the amounts of extractive nitrogen after boiling for 2 and 5 min were about 1.4 times those of the non-irradiated control. Glutamic acid, one of the main taste-active constituents of kombu, was also extracted in higher amounts during 2 and 5 min of boiling in comparison with the non-irradiated kombu. However, there were no significant differences in the amounts of either extractive nitrogen or glutamic acid after 30 min of boiling in any of the samples. Sugars, which deteriorate the quality of the kombu stock owing to their viscosity, were dissolved in high amounts at a dose level of 100 kGy. The amount of solubilized sugars from irradiated samples at 40% moisture was higher than at 10% moisture, but no significant change was observed in other extracted substances during heating. From these results, an irradiation dose level of 10 kGy was appropriate to obtain good kombu stock as a result of increased extractability.

  19. Effect of Luffa aegyptiaca (seeds) and Carissa edulis (leaves) extracts on blood glucose level of normal and streptozotocin diabetic rats.

    Science.gov (United States)

    El-Fiky, F K; Abou-Karam, M A; Afify, E A

    1996-01-01

    The present study investigates the effect of oral administration of the ethanolic extracts of Luffa aegyptiaca (seeds) and Carissa edulis (leaves) on blood glucose levels both in normal and streptozotocin (STZ) diabetic rats. Treatment with both extracts significantly reduced the blood glucose level in STZ diabetic rats during the first three hours of treatment. L. aegyptiaca extract decreased blood glucose level with a potency similar to that of the biguanide, metformin. The total glycaemic areas were 589.61 +/- 45.62 mg/dl/3 h and 660.38 +/- 64.44 mg/dl/3 h for L. aegyptiaca and metformin, respectively, vs. 816.73 +/- 43.21 mg/dl/3 h for the control (P < 0.05). On the other hand, in normal rats, both treatments produced insignificant changes in blood glucose levels compared to glibenclamide treatment.

  20. High-level waste management technology program plan

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, H.D.

    1995-01-01

    The purpose of this plan is to document the integrated technology program plan for the Savannah River Site (SRS) High-Level Waste (HLW) Management System. The mission of the SRS HLW System is to receive and store SRS high-level wastes in a safe and environmentally sound manner, and to convert these wastes into forms suitable for final disposal. These final disposal forms are borosilicate glass to be sent to the Federal Repository, Saltstone grout to be disposed of on site, and treated waste water to be released to the environment via a permitted outfall. Thus, the technology development activities described herein are those required to enable successful accomplishment of this mission. The technology program is based on the specific needs of the SRS HLW System and is organized following the systems engineering level 3 functions. Technology needs for each level 3 function are listed as reference, enhancements, and alternatives. Finally, FY-95 funding, deliverables, and schedules are summarized in Chapter IV, with details on the specific tasks funded in FY-95 provided in Appendix A. The information in this report represents the vision of activities as defined at the beginning of the fiscal year. Depending on emergent issues, funding changes, and other factors, programs and milestones may be adjusted during the fiscal year. The FY-95 SRS HLW technology program strongly emphasizes startup support for the Defense Waste Processing Facility and In-Tank Precipitation. Closure of technical issues associated with these operations has been given highest priority. Consequently, efforts on longer term enhancements and alternatives are receiving minimal funding. However, High-Level Waste Management is committed to participation in the national Radioactive Waste Tank Remediation Technology Focus Area. 4 refs., 5 figs., 9 tabs.

  2. Effect of Syzygium Aromaticum (CLOVE) Extract on Blood Glucose Level in Streptozotocin induced Diabetic Rats

    International Nuclear Information System (INIS)

    Chaudhry, Z. R.; Chaudhry, S. R.; Naseer, A.; Chaudhry, F. R.

    2013-01-01

    Objective: To evaluate the glucose-lowering effect of a 50% ethanol extract of Syzygium aromaticum in comparison with standard insulin in streptozotocin-induced diabetic rats. Study Design: Randomized controlled trial. Place and Duration of Study: National Institute of Health, Islamabad, Jul 2011 - Dec 2011. Material and Methods: The study was carried out on 48 adult Sprague Dawley rats. Rats were equally divided into 6 groups (I-VI). Group I served as the control. Diabetes was induced in Groups II to VI by a single intraperitoneal injection of STZ. Group II served as the diabetic control, while Groups III, IV, V and VI served as experimental groups. Groups III, IV and V received the 50% ethanol extract of Syzygium aromaticum at doses of 250, 500 and 750 mg/kg body weight, respectively, for sixty days. Group VI (standard) received humulin insulin 70/30 at a dose of 0.6 units/kg body weight subcutaneously bid for sixty days. Fasting blood samples were taken on day zero and at 15, 30 and 60 days after the STZ injection. Although Syzygium aromaticum at doses of 250, 500 and 750 mg/kg body weight and insulin all reduced glucose levels in the rats, the 750 mg/kg dose of Syzygium aromaticum reduced glucose more effectively than the 250 and 500 mg/kg doses, while in Groups III and IV blood glucose levels remained above normal. In Group VI, receiving insulin, glucose levels remained close to those of Group IV rats. Regarding body weight, there was an initial reduction in all the experimental groups after receiving STZ, but improvement was seen after administration of the plant extract, and the weight of Group V, receiving 750 mg/kg body weight of Syzygium aromaticum, became close to that of the control group. Conclusion: Syzygium aromaticum extract has a glucose-lowering effect in STZ-induced diabetic rats; this effect is dose related, with the 750 mg/kg body weight dose producing the maximum effect. (author)

  3. Andrographis paniculata extract attenuates pathological cardiac hypertrophy and apoptosis in high-fat diet fed mice.

    Science.gov (United States)

    Hsieh, You-Liang; Shibu, Marthandam Asokan; Lii, Chong-Kuei; Viswanadha, Vijaya Padma; Lin, Yi-Lin; Lai, Chao-Hung; Chen, Yu-Feng; Lin, Kuan-Ho; Kuo, Wei-Wen; Huang, Chih-Yang

    2016-11-04

    Andrographis paniculata (Burm. f.) Nees (Acanthaceae) has a considerable medicinal reputation in most parts of Asia as a potent remedy for endocrine disorders, inflammation and hypertension. The water extract of A. paniculata and its active constituent andrographolide are known to possess anti-inflammatory and anti-apoptotic effects. Our aim was to identify whether A. paniculata extract could protect against myocardial damage in high-fat-diet-induced obese mice. The test mice were divided into three groups fed either normal chow, a high-fat diet (obese), or a high-fat diet with A. paniculata extract treatment (2 g/kg/day, by gavage, for a week). We found that proteins related to the myocardial inflammation pathway were increased in the obese mice, potentially contributing to cardiac hypertrophy and myocardial apoptosis, whereas feeding with A. paniculata extract significantly inhibited the effects of the high-fat diet. Our study strongly suggests that supplementation with A. paniculata extract can be used for prevention and treatment of cardiovascular disease in obese patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Effects of aqueous extract from Asparagus officinalis L. roots on hypothalamic-pituitary-gonadal axis hormone levels and the number of ovarian follicles in adult rats

    Directory of Open Access Journals (Sweden)

    Hojatollah Karimi Jashni

    2016-02-01

    Background: Asparagus is a plant with high nutritional, pharmaceutical, and industrial value. Objective: The present study aimed to evaluate the effect of an aqueous extract of asparagus roots on hypothalamic-pituitary-gonadal axis hormones and oogenesis in female rats. Materials and Methods: In this experimental study, 40 adult female Wistar rats were divided into five groups of eight rats each. The groups comprised control, sham, and three experimental groups receiving different doses (100, 200, and 400 mg/kg bw) of the aqueous extract of asparagus roots. All doses were administered orally for 28 days. Blood samples were taken to evaluate serum levels of gonadotropin-releasing hormone (GnRH), follicle-stimulating hormone (FSH), luteinizing hormone (LH), estrogen, and progesterone. The ovaries were removed, weighed, sectioned, and examined by light microscopy. Results: The aqueous extract of asparagus roots dose-dependently and significantly increased serum levels of GnRH, FSH, LH, estrogen, and progesterone compared with the control and sham groups. An increase in the number of ovarian follicles and corpora lutea in the groups treated with asparagus root extract was also observed (p<0.05). Conclusion: Asparagus root extract stimulates the secretion of hypothalamic-pituitary-gonadal axis hormones and positively affects oogenesis in female rats.

  5. Molecular design of highly efficient extractants for separation of lanthanides and actinides by computational chemistry

    International Nuclear Information System (INIS)

    Uezu, Kazuya; Yamagawa, Jun-ichiro; Goto, Masahiro

    2006-01-01

    Novel organophosphorus extractants, which have two functional moieties in the molecular structure, were developed for a recycling system for transuranium elements based on liquid-liquid extraction. The synthesized extractants showed extremely high extractability for lanthanide elements compared with commercially available extractants. The extraction equilibrium results suggest that the structural effect of the extractants is one of the key factors enhancing selectivity and extractability in lanthanide extraction. Furthermore, molecular modeling was carried out to evaluate the extraction properties of the synthesized extractants for lanthanides, and it proved very useful for designing new extractants. The new concept of connecting functional moieties with a spacer is a promising approach to developing novel extractants for the treatment of nuclear fuel. (author)

  6. Geo-metadata design for the GIS of the pre-selected site for China's high-level radioactive waste repository

    International Nuclear Information System (INIS)

    Zhong Xia; Wang Ju; Huang Shutao; Wang Shuhong; Gao Min

    2008-01-01

    The information system for the geological disposal of high-level radioactive waste aims at the integrated management and full application of multi-source information in research on the geological disposal of high-level radioactive waste. The establishment and operation of the system require geo-metadata support for this multi-source information. In this paper, on the basis of an analysis of the geo-data for the pre-selected site for high-level radioactive waste disposal, we apply existing metadata standards and design the content, management pattern, and application of geo-metadata for the multi-source information. (authors)

  7. Sequential high pressure extractions applied to recover piceatannol and scirpusin B from passion fruit bagasse.

    Science.gov (United States)

    Viganó, Juliane; Aguiar, Ana C; Moraes, Damila R; Jara, José L P; Eberlin, Marcos N; Cazarin, Cinthia B B; Maróstica, Mário R; Martínez, Julian

    2016-07-01

    Passion fruit seeds are currently discarded during pulp processing but are known for their high piceatannol and scirpusin B contents. Using pressurized liquid extraction (PLE), these highly valuable phenolic compounds were efficiently extracted from defatted passion fruit bagasse (DPFB). PLE was performed using mixtures of ethanol and water (50 to 100% ethanol, w/w) as solvent, temperatures from 50 to 70°C, and a pressure of 10 MPa. The extraction methods were compared in terms of global yield, total phenolic content (TPC), piceatannol content, and the antioxidant capacity of the extracts. The DPFB extracts were also compared with those from non-defatted passion fruit bagasse (nDPFB). Identification and quantification of piceatannol were performed using UHPLC-MS/MS. The results showed that high TPC and piceatannol contents were achieved for the extracts obtained from DPFB through PLE at 70°C using 50 and 75% ethanol as the solvent. The best PLE condition for TPC (70°C, 75% ethanol) yielded 55.237 mg GAE/g dried and defatted bagasse, whereas PLE at 70°C and 50% ethanol achieved 18.590 mg of piceatannol/g dried and defatted bagasse; both yields were significantly higher than those obtained using conventional extraction techniques. The antioxidant capacity assays showed high correlation with TPC (r > 0.886) and piceatannol (r > 0.772). Passion fruit bagasse has therefore proved to be a rich source of piceatannol, and PLE showed high efficiency in recovering phenolic compounds from defatted passion fruit bagasse. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Extracting Low-Frequency Information from Time Attenuation in Elastic Waveform Inversion

    Science.gov (United States)

    Guo, Xuebao; Liu, Hong; Shi, Ying; Wang, Weihong

    2017-03-01

    Low-frequency information is crucial for recovering background velocity, but the lack of low-frequency information in field data makes inversion impractical without accurate initial models. Laplace-Fourier domain waveform inversion can recover a smooth model from real data without low-frequency information, which can be used as an ideal starting model for subsequent inversion. In general, it also starts with low frequencies and includes higher frequencies at later inversion stages; the difference is that its ultralow-frequency information comes from the Laplace-Fourier domain. Meanwhile, a direct implementation of the Laplace-transformed wavefield using frequency-domain inversion is also very convenient. However, because broad frequency bands are often used in pure time-domain waveform inversion, it is difficult to extract the wavefields dominated by low frequencies in this case. In this paper, low-frequency components are constructed by introducing time attenuation into the recorded residuals, and the rest of the method is identical to traditional time-domain inversion. Time windowing and frequency filtering are also applied to mitigate the ambiguity of the inverse problem. Therefore, we can start at low frequencies and then move to higher frequencies. The experiment shows that the proposed method can achieve a good inversion result with a linear initial model and records lacking low-frequency information.
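
    A minimal sketch of the damping step described above, applying an exponential time attenuation exp(-sigma*t) to the recorded residual traces so that early arrivals (and effectively lower frequencies) dominate the misfit, is shown below; the damping constants and trace layout are illustrative, not the paper's parameters.

```python
# Illustrative sketch: apply exponential time attenuation to residual traces,
# the Laplace-damping-like preprocessing described in the abstract.
import numpy as np

def damp_residuals(residuals: np.ndarray, dt: float, sigma: float) -> np.ndarray:
    """Multiply each trace by exp(-sigma * t); residuals has shape (n_traces, n_t)."""
    t = np.arange(residuals.shape[1]) * dt
    return residuals * np.exp(-sigma * t)[None, :]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dt, n_traces, n_t = 0.004, 8, 1000          # 4 ms sampling, toy sizes
    residuals = rng.standard_normal((n_traces, n_t))
    for sigma in (0.0, 2.0, 5.0):                # sigma = 0 recovers the usual misfit
        damped = damp_residuals(residuals, dt, sigma)
        print(f"sigma={sigma}: misfit = {0.5 * np.sum(damped**2):.1f}")
```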

  9. Impact of high hydrostatic pressure and pasteurization on the structure and the extractability of bioactive compounds of persimmon “Rojo Brillante”.

    Science.gov (United States)

    Hernández-Carrión, M; Vázquez-Gutiérrez, J L; Hernando, I; Quiles, A

    2014-01-01

    Rojo Brillante is an astringent oriental persimmon variety with high levels of bioactive compounds such as soluble tannins, carotenoids, phenolic acids, and dietary fiber. The purpose of this study was to investigate the effects of high hydrostatic pressure (HHP) and pasteurization on the structure of the fruit and on the extractability of certain bioactive compounds. The microstructure was studied using light microscopy, transmission electron microscopy, and low-temperature scanning electron microscopy, and certain physicochemical properties (carotenoid and total soluble tannin content, antioxidant activity, fiber content, color, and texture properties) were measured. The structural changes induced by HHP caused a rise in solute circulation in the tissues that could be responsible for the increased carotenoid level and the unchanged antioxidant activity in comparison with the untreated persimmon. In contrast, the changes that took place during pasteurization lowered the tannin content and antioxidant activity. Consequently, HHP treatment could improve the extraction of potentially bioactive compounds from persimmons. A high-nutritional-value ingredient to be used when formulating new functional foods could be obtained using HHP. © 2013 Institute of Food Technologists®

  10. Parity dependence of the nuclear level density at high excitation

    International Nuclear Information System (INIS)

    Rao, B.V.; Agrawal, H.M.

    1995-01-01

    The basic underlying assumption ρ(l+1, J)=ρ(l, J) in the level density function ρ(U, J, π) has been checked on the basis of high quality data available on individual resonance parameters (E0, Γn, Jπ) for s- and p-wave neutrons, in contrast to the earlier analysis where information about p-wave resonance parameters was meagre. The missing level estimator based on the partial integration over a Porter-Thomas distribution of neutron reduced widths and the Dyson-Mehta Δ3 statistic for the level spacing have been used to ascertain that the s- and p-wave resonance level spacings D(0) and D(1) are not in error because of spurious and missing levels. The present work does not validate the tacit assumption ρ(l+1, J)=ρ(l, J) and confirms that the level density depends upon parity at high excitation. The possible implications of the parity dependence of the level density on the results of statistical model calculations of nuclear reaction cross sections as well as on pre-compound emission have been emphasized. (orig.)
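
    For reference, the Porter-Thomas distribution mentioned above is the chi-squared distribution with one degree of freedom for the normalized reduced widths; a standard statement of it (general background, not taken from this record) is:

```latex
% Porter-Thomas distribution of normalized reduced neutron widths
% (chi-squared with one degree of freedom), y = Gamma_n^0 / <Gamma_n^0>
\[
  P(y)\,dy = \frac{1}{\sqrt{2\pi y}}\, e^{-y/2}\, dy , \qquad y \ge 0
\]
```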

  11. Novel extractants with high selectivity for valuable metals in seawater. Calixarene derivatives

    International Nuclear Information System (INIS)

    Kakoi, Takahiko; Goto, Masahiro

    1997-01-01

    Seawater contains various valuable metals such as uranium and lithium. Attempts are therefore being made to develop highly selective extractants that recognize target metal ions in seawater. This review focuses on the application of extractants based on the novel cyclic host compound calixarene. Calixarenes, cyclic compounds built from linked phenol rings, can be prepared with several different ring sizes and functionalized with various groups to target metal ions in seawater. Calixarene derivatives are therefore capable of selectively extracting valuable metals such as uranium, alkali metals, heavy metals, rare earth metals and noble metals by varying the ring size and the functional groups. The promising results obtained so far establish calixarene as a potential extractant for the separation of valuable metal ions from seawater. (author)

  12. The management of high-level radioactive wastes

    International Nuclear Information System (INIS)

    Lennemann, Wm.L.

    1979-01-01

    The definition of high-level radioactive wastes is given. The following aspects of high-level radioactive wastes' management are discussed: fuel reprocessing and high-level waste; storage of high-level liquid waste; solidification of high-level waste; interim storage of solidified high-level waste; disposal of high-level waste; disposal of irradiated fuel elements as a waste

  13. Critical parameters in cost-effective alkaline extraction for high protein yield from leaves

    NARCIS (Netherlands)

    Zhang, C.; Sanders, J.P.M.; Bruins, M.E.

    2014-01-01

    Leaves are potential resources for feed or food, but their applications are limited due to a high proportion of insoluble protein and inefficient processing. To overcome these problems, parameters of alkaline extraction were evaluated using green tea residue (GTR). Protein extraction could be

  14. Extracting the Beat: An Experience-dependent Complex Integration of Multisensory Information Involving Multiple Levels of the Nervous System

    Directory of Open Access Journals (Sweden)

    Laurel J. Trainor

    2009-04-01

    In a series of studies we have shown that movement (or vestibular stimulation) synchronized to every second or every third beat of a metrically ambiguous rhythm pattern biases people to perceive the meter as a march or as a waltz, respectively. Riggle (this volume) claims that we postulate an "innate", "specialized brain unit" for beat perception that is "directly" influenced by vestibular input. In fact, to the contrary, we argue that experience likely plays a large role in the development of rhythmic auditory-movement interactions, and that rhythmic processing in the brain is widely distributed and includes subcortical and cortical areas involved in sound processing and movement. Further, we argue that vestibular and auditory information are integrated at various subcortical and cortical levels along with input from other sensory modalities, and it is not clear which levels are most important for rhythm processing or, indeed, what a "direct" influence of vestibular input would mean. Finally, we argue that vestibular input to sound location mechanisms may be involved, but likely cannot explain the influence of vestibular input on the perception of auditory rhythm. This remains an empirical question for future research.

  15. The Effects of Root Extract Ruellia tuberosa L on Histopathology and Malondialdehyde Levels on the Liver of Diabetic Rats

    Science.gov (United States)

    Nur Laily Kurniawati, Alfin; Aulanni'am; Srihardyastutie, Arie; Safitri, Anna

    2018-01-01

    The aim of this research was to study the antidiabetic activity of a root extract of Ruellia tuberosa L in rats (Rattus norvegicus) induced with multiple low-dose streptozotocin as an animal model of diabetes. The parameters investigated were blood glucose levels, free radical (malondialdehyde, MDA) levels, and hepatic histopathology. The main material used was the n-hexane root extract of Ruellia tuberosa L. Three groups of rats were used: a control group (group I), a diabetic group (group II), and a therapy group treated with Ruellia tuberosa L (group III). Streptozotocin was given at a multiple low dose of 20 mg/kg body weight, five times over 5 consecutive days, i.p., to rats in groups II and III. The Ruellia tuberosa L extract was then given orally to group III at a dose of 250 mg/kg body weight per day for 3 weeks. The results showed that the root extract of Ruellia tuberosa L lowered blood glucose levels in group III by 60.3%, from 299.7 ± 24.7 mg/dL to 119.0 ± 26.6 mg/dL. Moreover, the antidiabetic activity of the Ruellia tuberosa L extract was also deduced from the decrease in MDA levels in group III, from 3.5 ± 0.3 μg/mL to 1.7 ± 0.4 μg/mL. Recovery of the liver in the treatment group was also evident from its histological profiles stained with hematoxylin-eosin.

  16. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    Directory of Open Access Journals (Sweden)

    Jong-Myon Bae

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the "highest versus lowest intake" method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer, in which the SES was calculated using the HLM, was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES and narrower confidence intervals than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
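
    The fixed-effect pooling step mentioned above is standard inverse-variance weighting of log effect sizes; a minimal sketch (with made-up effect sizes, not the data from the cited QSR) is shown below.

```python
# Illustrative sketch: fixed-effect (inverse-variance) meta-analysis of
# relative risks, the pooling model mentioned in the abstract.
# The example effect sizes and confidence intervals are made up.
import numpy as np

def fixed_effect_pool(rr, ci_low, ci_high):
    """Pool relative risks on the log scale with inverse-variance weights."""
    log_rr = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from 95% CI width
    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp(pooled), (np.exp(lo), np.exp(hi))

if __name__ == "__main__":
    rr      = np.array([0.85, 0.92, 0.78])     # per-study relative risks (toy)
    ci_low  = np.array([0.70, 0.80, 0.60])
    ci_high = np.array([1.03, 1.06, 1.01])
    ses, ci = fixed_effect_pool(rr, ci_low, ci_high)
    print(f"summary RR = {ses:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```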

  17. A COMPARATIVE ANALYSIS OF WEB INFORMATION EXTRACTION TECHNIQUES DEEP LEARNING vs. NAÏVE BAYES vs. BACK PROPAGATION NEURAL NETWORKS IN WEB DOCUMENT EXTRACTION

    Directory of Open Access Journals (Sweden)

    J. Sharmila

    2016-01-01

    Research related to web mining is becoming more important these days because a large amount of information is managed through the web. Web usage is expanding in an uncontrolled way, and a framework is required for managing such a large amount of information in the web space. Web mining is classified into three major divisions: web content mining, web usage mining and web structure mining. Tak-Lam Wong proposed a web content mining methodology with the aid of Bayesian Networks (BN), learning to extract web data and discover characteristics based on the Bayesian approach. Motivated by that investigation, we propose a web content mining methodology based on a deep learning algorithm. Deep learning offers an advantage over BN in that BN does not incorporate the kind of learning architecture design used in the proposed system. The main objective of this investigation is web document extraction using different classification algorithms and their analysis. This work extracts data from web URLs and compares three classification algorithms: a deep learning algorithm, a Bayesian algorithm and a BPNN algorithm. Deep learning is a powerful set of techniques for learning in neural networks, applied in areas such as computer vision, speech recognition, natural language processing and biometrics; it is a comparatively simple classification technique, is used across a broad range of fields, and requires less time for classification. Naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong independence assumptions between the features. The BPNN algorithm is then used for classification. Initially, the training and testing datasets contain many URLs, from which the content is extracted. The
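
    As a concrete illustration of one of the three classifiers compared above, a minimal Naive Bayes text-classification sketch using scikit-learn is shown below; the documents and labels are toy placeholders, and this is not the authors' pipeline.

```python
# Illustrative sketch (not the authors' pipeline): Naive Bayes classification
# of web documents represented as TF-IDF bag-of-words vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_docs = [
    "cheap flights and hotel deals book now",
    "latest football scores and match reports",
    "compare credit cards and personal loans",
    "transfer news injury updates league table",
]
train_labels = ["commerce", "sports", "commerce", "sports"]

model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(train_docs, train_labels)

test_docs = ["discount hotel booking", "cup final match report"]
print(model.predict(test_docs))   # expected: ['commerce' 'sports']
```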

  18. Automatic Centerline Extraction of Covered Roads by Surrounding Objects from High Resolution Satellite Images

    Science.gov (United States)

    Kamangir, H.; Momeni, M.; Satari, M.

    2017-09-01

    This paper presents an automatic method to extract road centerline networks from high and very high resolution satellite images. It addresses the automated extraction of roads covered by multiple natural and artificial objects such as trees, vehicles, and shadows of buildings or trees. To achieve precise road extraction, the method implements three stages: classification of the images with a maximum likelihood algorithm to categorize them into the classes of interest; modification of the classified images using connected-component and morphological operators to retain the pixels of the desired objects and remove undesirable pixels from each class; and, finally, line extraction based on the RANSAC algorithm. To evaluate the performance of the proposed method, the generated results are compared with a ground-truth road map as a reference. The evaluation on representative test images shows completeness values ranging between 77% and 93%.
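
    A minimal sketch of the final stage, RANSAC line fitting on the 2-D coordinates of candidate road pixels, is shown below; the sampling parameters and synthetic points are illustrative, not the paper's settings.

```python
# Illustrative sketch: fit a dominant line to candidate road pixels with RANSAC.
import numpy as np

def ransac_line(points, n_iter=500, tol=1.5, seed=0):
    """Return (point_on_line, unit_direction, inlier_mask) for the best 2-point model."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
        d = p2 - p1
        norm = np.linalg.norm(d)
        if norm == 0:
            continue
        d = d / norm
        # perpendicular distance of every point to the candidate line through p1
        rel = points - p1
        dist = np.abs(rel[:, 0] * d[1] - rel[:, 1] * d[0])
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (p1, d)
    return best_model[0], best_model[1], best_inliers

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    t = rng.uniform(0, 100, size=200)
    road = np.c_[t, 0.5 * t + 10 + rng.normal(0, 1, size=200)]   # noisy road pixels
    clutter = rng.uniform(0, 120, size=(60, 2))                  # trees, vehicles, shadows
    pts = np.vstack([road, clutter])
    p0, direction, inliers = ransac_line(pts)
    print(f"direction ~ {np.round(direction, 2)}, inliers: {inliers.sum()} of {len(pts)}")
```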

  19. Effects of Solanum torvum fruit water extract on hyperlipidemia and sex hormones in high-fat fed male rats

    Directory of Open Access Journals (Sweden)

    Supaporn Wannasiri

    2017-05-01

    Conclusions: S. torvum extract can restore sex hormone levels to normal and reduce serum cholesterol in HFD-induced obese male rats. Furthermore, long-term oral administration of S. torvum extract is harmless.

  20. Effect of aqueous and hydro-alcoholic extracts of lettuce (Lactuca sativa) seed on testosterone level and spermatogenesis in NMRI mice.

    Science.gov (United States)

    Ahangarpour, Akram; Oroojan, Ali Akbar; Radan, Maryam

    2014-01-01

    One of the notable uses of lettuce (Lactuca sativa) seed in traditional medicine has been to reduce semen, sperm, and sexual desire. The aim of this study was to investigate the effects of aqueous and hydro-alcoholic extracts of lettuce seed on testosterone levels and spermatogenesis. In this experimental study, 24 adult male NMRI mice weighing 20-25 g were purchased. The animals were randomly divided into 4 groups: control, hydro-alcoholic extract (200 mg/kg), and aqueous extract (50 and 100 mg/kg) groups. The extracts were injected intraperitoneally once a day for 10 consecutive days. Two weeks after the last injection, the mice were anaesthetized with ether, and after laparotomy, blood was collected from the heart to determine testosterone with an ELISA assay kit. The testes and cauda epididymis of all animals were then removed for analysis of testis morphology, sperm count, and sperm viability. Testis weight was increased in the hydro-alcoholic and 100 mg/kg aqueous extract groups (p=0.001) and in the 50 mg/kg aqueous extract group (p=0.008). Sperm viability was decreased in the hydro-alcoholic (p=0.001) and the 50 mg/kg (p=0.026) and 100 mg/kg (p=0.045) aqueous extract groups, and sperm count was significantly decreased in the hydro-alcoholic (p=0.035) and 50 mg/kg aqueous extract (p=0.006) groups in comparison with the control group. There was also a significant increase in the serum level of testosterone in the 50 mg/kg aqueous extract group in comparison with the control (p=0.002), hydro-alcoholic (p=0.001), and 100 mg/kg aqueous extract (p=0.003) groups. The present results demonstrate that the hydro-alcoholic and 50 mg/kg aqueous extracts of lettuce seed have antispermatogenic effects, and that the 50 mg/kg aqueous extract increased the serum level of testosterone in mice. We therefore suggest that lettuce seed could be a potential contraceptive agent. This article was extracted from an M.Sc. student research project (Ali Akbar Oroojan).